More than seven in 10 Brits are against Meta removing less harmful content from its platforms to allow more "free expression".
Meta, which owns Facebook and Instagram, will stop proactively scanning for harmful content in some instances in order to boost free speech and reduce "censorship". It would instead rely on users reporting harmful content to the firm, in a model more similar to Elon Musk's X.
Polling on Wednesday shows 86% of adults believe social media platforms should be required by law to proactively search for harmful content. Asked specifically about Meta's change to reduce the amount of content it automatically removed, 71% said they opposed it.
The survey, of more than 2,000 people, was conducted by suicide prevention charity The Molly Rose Foundation, named after Molly Russell, a teenager who took her own life after viewing harmful material online. It comes amid concerns that the Government will give concessions to tech firms to secure an economic deal with the US, amid tariffs expected to come into force on Wednesday.
The MRF has previously warned that Meta's changes could place young people at greater risk of encountering harmful content online, and has urged the Government to strengthen the Online Safety Act with measures to stop social media firms from making policy changes similar to Meta's.
Andy Burrows, chief executive of the MRF, said: “Decisions on how we protect our children must be taken by our Prime Minister and democratically elected government, not determined by tech oligarchs and the demands of the White House.”
He added: "Mark Zuckerberg's reckless changes pose a fundamental risk to children and young people. The Online Safety Act is the best vehicle we have to protect young people and society from harm, but Meta knows only too well that unless the legislation is strengthened there is nothing to stop them lighting the touchpaper on a disturbing bonfire of safety measures. The public want and expect urgent action to stop us going backwards on online safety.”
A DSIT spokesman said: "The law is clear - all social media companies which operate in the UK must remove illegal content - including content which encourages people to self-harm or take their own lives - and from this summer, they will have to protect children from being exposed to harmful content under the Online Safety Act. These laws are the foundation for safer experiences online - we will be monitoring their impact closely and will not hesitate to strengthen protections to keep children safe."