Social media companies including Facebook and Twitter will be legally required to protect their users under government plans to introduce a regulator.
The long-awaited government proposal, which could also make company bosses personally liable for harmful content on their platforms, will ensure internet firms meet their responsibilities, which will be outlined by a mandatory duty of care.
“The era of self-regulation for online companies is over,” culture secretary Jeremy Wright said.
“Voluntary actions from industry to tackle online harms have not been applied consistently or gone far enough.”
The government is currently consulting on whether a new regulator is needed to enforce the rules or whether an existing one, such as Ofcom, should take on the role.
The measures come amid concerns about the growth of violent content, disinformation and inappropriate material online.
In March, the father of Molly Russell urged the government to introduce regulation on social media platforms in response to the suicide of his 14-year-old daughter, who was found to have viewed content related to depression and suicide on Instagram before her death.
Prime minister Theresa May said the proposals were a result of social media companies’ failure to self-regulate.
“The internet can be brilliant at connecting people across the world – but for too long these companies have not done enough to protect users, especially children and young people, from harmful content,” she said.
“That is not good enough, and it is time to do things differently. We have listened to campaigners and parents, and are putting a legal duty of care on internet companies to keep people safe.
“Online companies must start taking responsibility for their platforms, and help restore public trust in this technology.”
The proposed new laws will apply to any company that allows users to share or discover user-generated content or interact with each other online, the government said.
The regulation will be applicable to companies of all sizes – from social media platforms to file hosting sites, forums, messaging services and search engines.
The proposal also calls for powers to force internet firms to publish annual transparency reports about the harmful content on their platforms and how they are addressing it.
Companies including Facebook and Twitter already publish reports of this nature.
In March, Facebook boss Mark Zuckerberg wrote an op-ed calling for governments to play a more active role in establishing regulation for the internet.
The home secretary, Sajid Javid, said tech firms had a “moral duty” to protect the young people they “profit from”.
“Despite our repeated calls to action, harmful and illegal content – including child abuse and terrorism – is still too readily available online,” he said.
“That is why we are forcing these firms to clean up their act once and for all. I made it my mission to protect our young people – and we are now delivering on that promise.”
However, former culture secretary John Whittingdale warned that ministers risked creating a “draconian censorship regime” in their attempt to regulate internet firms.
He feared the plans would send the wrong message to other countries which censor their people.
“Countries such as China, Russia and North Korea, which allow no political dissent and deny their people freedom of speech, are also keen to impose censorship online, just as they already do on traditional media,” he said.
He added that the UK regulator “must not give the despots an excuse to claim that they are simply following an example set by Britain”.
Daniel Dyball, UK executive director at the Internet Association, criticised the current scope of the proposals for being “extremely wide”, which could hinder their implementation.
“The internet industry is committed to working together with government and civil society to ensure the UK is a safe place to be online. But to do this we need proposals that are targeted and practical to implement for platforms both big and small,” he said.
“We also need to protect freedom of speech and the services consumers love. The scope of the recommendations is extremely wide, and decisions about how we regulate what is and is not allowed online should be made by parliament.”
The proposal received a positive response from children’s charities which have campaigned for regulation.
Peter Wanless, chief executive of the NSPCC, said it would make the UK a “world pioneer” in protecting children online, and Barnardo’s chief executive Javed Khan said the announcement was “an important step in the right direction”.
Anne Longfield, the Children’s Commissioner for England, welcomed the proposed legislation but added that it needed to be backed up by robust penalties.
“The social media companies have spent too long ducking responsibility for the content they host online and for ensuring those using their apps are an appropriate age,” she said.
“Any new regulator must have bite. Companies that fail in their responsibilities must face both significant financial penalties and a duty to publicly apologise for their actions, and set out how they will prevent mistakes happening in the future.”
In response to the proposals, Facebook’s UK head of public policy Rebecca Stimson said the company was looking forward to working with the government to ensure the regulation is effective.
A 12-week consultation will now take place before the government publishes its final proposals for legislation.