Facebook and X must comply with UK online safety laws
Sandy Verma January 12, 2025 11:24 PM

Social media sites such as Facebook and X will still have to comply with UK law, Science Secretary Peter Kyle has said, following a decision by tech giant Meta to change rules on fact-checkers.

Mark Zuckerberg, whose company Meta includes Facebook and Instagram, said earlier this week that the shift – which only applies in the US – would mean content moderators will “catch less bad stuff” but would also reduce the number of “innocent” posts being removed.

Kyle told the BBC’s Sunday with Laura Kuenssberg show the announcement was “an American statement for American service users”.

“If you come and operate in this country you abide by the law, and the law says illegal content must be taken down,” he added.

On Saturday Ian Russell, the father of Molly Russell, who took her own life at 14 after seeing harmful content online, urged the prime minister to tighten internet safety rules, saying the UK was “going backwards” on the issue.

He said Zuckerberg and X boss Elon Musk were moving away from safety towards a “laissez-faire, anything-goes model”.

He said the companies were moving “back towards the harmful content that Molly was exposed to”.

A Meta spokesperson told the BBC there was “no change to how we treat content that encourages suicide, self-injury, and eating disorders” and said the company would “continue to use our automated systems to scan for that high-severity content”.

Internet safety campaigners complain that there are gaps in the UK’s laws including a lack of specific rules covering live streaming or content that promotes suicide and self-harm.

Kyle said current laws on online safety were “very uneven” and “unsatisfactory”.

The Online Safety Act, passed in 2023 by the previous government, had originally included plans to compel social media companies to remove some “legal-but-harmful” content such as posts promoting eating disorders.

However, the proposal triggered a backlash from critics, including the current Conservative leader Kemi Badenoch, who were concerned it could lead to censorship.

In July 2022, Badenoch, who was not then a minister, said the bill was in “no fit state to become law”, adding: “We should not be legislating for hurt feelings.”

Another Conservative MP, David Davis, said it risked “the biggest accidental curtailment of free speech in modern history”.

The plan was dropped for adult social media users and instead companies were required to give users more control to filter out content they did not want to see. The law still expects companies to protect children from legal-but-harmful content.

Kyle expressed frustration over the change but did not say if he would be reintroducing the proposal.

He said the act contained some “very good powers” he was using to “assertively” tackle new safety concerns and that in the coming months ministers would get the powers to make sure online platforms were providing age-appropriate content.

Companies that did not comply with the law would face “very strident” sanctions, he said.

He also said Parliament needed to get faster at updating the law to adapt to new technologies and that he was “very open-minded” about introducing new legislation.

Rules in the Online Safety Act, due to come into force later this year, compel social media firms to show that they are removing illegal content – such as child sexual abuse, material inciting violence and posts promoting or facilitating suicide.

It also says companies have to protect children from harmful material including pornography, material promoting self-harm, bullying and content encouraging dangerous stunts.

Platforms will be expected to adopt “age assurance technologies” to prevent children from seeing harmful content.

The law also requires companies to take action against illegal, state-sponsored disinformation. If their services are likely to be accessed by children they should also take steps to protect users against misinformation.

In 2016, Meta established a fact-checking programme whereby third-party moderators would check posts on Facebook and Instagram that appeared to be false or misleading.

Content flagged as inaccurate would be moved lower in users’ feeds and accompanied by labels offering viewers more information on the subject.

However, on Tuesday, Zuckerberg said Meta would be replacing the fact checkers, and instead adopt a system – introduced by X – of allowing users to add “community notes” to posts they deemed to be untrue.

Defending the change, Zuckerberg said moderators were “too politically biased” and it was “time to get back to our roots around free expression”.

The step comes as Meta seeks to improve relations with incoming US President Donald Trump who has previously accused the company of censoring right-wing voices.
