Tech companies can now face fines and other measures for failing to remove illegal content under the UK’s Online Safety Act, regulator Ofcom said on Monday.
The law, passed in October 2023, introduces stricter measures against social media platforms, search engines, messaging services, gaming and dating apps, and pornography and file-sharing sites that fail to remove illegal content.
Content targeted by the law includes material related to militant groups, hate, fraud and child sexual abuse, as well as content encouraging or assisting suicide.
Content rules
In December, communications regulator Ofcom published its initial codes of practice for tech firms under the new legislation and gave them until 16 March to assess the risks that illegal online content poses to their users.
“Platforms must now act quickly to come into compliance with their legal duties, and our codes are designed to help them do that,” said Ofcom enforcement director Suzanne Cater.
Platforms targeted by the legislation include Facebook, TikTok, YouTube and others, which Ofcom said must now ensure they have better moderation, reporting and built-in safety tests to root out criminal activity.
Ofcom can levy fines of up to £18m or 10 percent of a company’s annual global turnover, whichever is greater, for failure to comply.
In the most severe cases the regulator may seek a court order blocking a service from access in the UK.
File-sharing and file-storage services are especially vulnerable to being misused for sharing child sexual abuse material, Ofcom said. On Monday the regulator launched a separate enforcement programme to assess the safety measures of such platforms, saying it had asked several file-storage firms to submit risk assessments by 31 March.
Effectiveness
Iona Silverman, a partner at UK law firm Freeths, said that for the Online Safety Act to make a difference Ofcom must adopt a robust approach to ensure platforms, especially the largest ones, make concrete compliance efforts.
Silverman noted that some of the biggest platforms covered by the legislation have recently shown signs of non-compliance, citing Meta’s decision in January to discontinue third-party fact-checking in the US and implement a community-driven model.
Some US-based organisations have criticised the Act, with the Electronic Frontier Foundation saying at the time of its passage that it would lead to a “much more censored, locked-down internet for British users” and could undermine the privacy and security of “internet users worldwide”.
Similarly, the chairman of the US Federal Communications Commission, Brendan Carr, earlier this month called the EU’s Digital Services Act, which also regulates illegal material on online platforms, “incompatible” with US free speech laws.