Online Safety Bill: Four in five adults want social media bosses held legally responsible if children hurt by content

An overwhelming majority of UK adults want tech giants to employ senior managers who are held legally responsible for children harmed by social media, according to new polling.

Four in five surveyed by YouGov would back the requirement being added to the government’s Online Safety Bill, which aims to regulate internet content to help keep users safe.

It comes as a cross-party group of MPs backs an amendment to the legislation that would see tech bosses held to account should their platforms have contributed to the serious harm, abuse, or death of a child.

Last year, a coroner ruled that schoolgirl Molly Russell had died from “an act of self-harm while suffering from depression and the negative effects of online content”.

The MPs, including the Labour shadow cabinet and Conservatives Bill Cash and Miriam Cates, want social media companies to be made liable for such incidents, and are calling on the government to amend the Online Safety Bill.

In its current form, the bill would only hold managers responsible for failing to give information to regulator Ofcom, rather than for corporate decisions that result in preventable harm or sexual abuse.

The chief executive of the NSPCC, which commissioned the YouGov research, said the legislation should provide “bold, world-leading regulation that ensures the buck stops with senior management”.


Video: Ian Russell 'worried' about revised Online Safety Bill

Culture Secretary Michelle Donelan wrote an open letter to parents before Christmas, promising that social media companies would be held responsible not just for illegal content on their platforms, but also for any material that can "cause serious trauma" to children.

The letter outlined six measures the bill will take to crack down on social media platforms:

• Removing illegal content, including child sexual abuse and terrorist content

• Protecting children from harmful and inappropriate content, such as cyberbullying or promoting eating disorders

• Putting legal duties on companies to enforce their own age limits, which for most are 13

• Requiring companies to use age-checking measures to protect children from inappropriate content, similar to a recent crackdown on porn sites in Louisiana

• Making posts that encourage self-harm illegal

• Requiring companies to publish risk assessments on the potential dangers posed to children on their sites

Ms Donelan said companies found to be falling short would face fines of up to £1bn and could see their sites blocked in the UK.

Video: What is in the Online Safety Bill?

Her letter came after the legislation returned to parliament following multiple delays, having found itself in the crosshairs of free speech campaigners concerned that far-reaching regulation could amount to censorship.

The bill is scheduled to return to parliament on 16 January.