Concern Over New Powers Handed to Ofcom to Regulate Online Harm

Fears remain about the unprecedented powers the Online Safety Bill hands to Ofcom, the UK communications regulator, to protect “online users from harm” now that the law has passed.

On Tuesday, the government announced that the Online Safety Bill, legislation to regulate online spaces, will become law.

The government has granted Ofcom new responsibilities and powers, with a wide range of compliance tools, fines, and sanctions, to become the regulator for online harm.

As the watchdog prepares to enforce the new online safety rules, it has made a string of Silicon Valley appointments and now, under the law, it must appoint a “committee on disinformation and misinformation.”

Ofcom told The Epoch Times that it will “flag any potential conflicts of interest” on this committee.

The watchdog already has experience in managing such conflicts through its role in overseeing telecommunications and broadcasting.

If social media platforms do not comply with the rules, Ofcom can now fine them up to £18 million or 10 percent of their global annual revenue, whichever is greater, meaning fines handed down to the largest platforms could reach billions of pounds.

Ofcom Chief Executive Dame Melanie Dawes welcomed the Online Safety Bill passing its final stage in Parliament, saying it was a “major milestone in the mission to create a safer life online for people.”

‘Strong Incentive … to Remove Content’

Last year Matthew Lesh, head of public policy at the free-market think tank the Institute of Economic Affairs, warned in a report that the secretary of state and Ofcom would have unprecedented powers to define and limit speech, with “limited parliamentary or judicial oversight.”

Mr. Lesh told The Epoch Times that he thinks over time “we’re going to become a less free place in the UK.”

“Ofcom are going to have powers to, and this was added to the bill quite last minute, to be able to have remote access into the company’s systems, which obviously raises some quite serious privacy and security concerns,” he added.

“They’re meant to have that access to be able to monitor compliance with online safety, but really, they could use it for anything they want,” he added.

Mr. Lesh said there is now a “pretty strong incentive on the companies to remove more content.”

“So people are going to see their posts, even potentially innocent posts and speech removed on a far greater scale,” he added.

He said that there is a risk over time of non-UK platforms opting to leave the UK and block British users because “the compliance costs with the bill are so high.”

Disinformation Board

Under the law, Ofcom must appoint an “advisory committee on disinformation and misinformation.”

It is also written into law that Ofcom, as an organisation, must understand “the nature and impact of disinformation and misinformation, and reduce their and others’ exposure to it.”

The Epoch Times has previously highlighted that Ofcom works closely with partners such as Full Fact, which says it is a “team of independent fact checkers.”

Full Fact is a member of Ofcom’s Making Sense of Media Advisory Panel, alongside Google, Facebook, and the BBC, and a member of the Counter-Disinformation Policy Forum, which is convened by the Department for Digital, Culture, Media and Sport (DCMS).

The DCMS has worked with the for-profit company NewsGuard Technologies, which audits online publishers for accuracy and issues “green” and “red” trust ratings and “nutrition labels” on news sites.

Anna-Sophie Harling, former managing director for Europe at NewsGuard Technologies, is Ofcom’s online safety principal and is in charge of “tackling disinformation on digital platforms.”

Ofcom also made a series of Big Tech hires recently.

After overseeing the development of the company’s voice-activated services at Amazon, Sachin Jogia now serves as Ofcom’s chief technology officer.

And former Google executive Gill Whitehead will lead Ofcom’s new online safety work.

‘Open Process’

Regarding the disinformation committee, an Ofcom spokesman told The Epoch Times by email that “Ofcom is preparing to regulate online safety by ensuring that tech companies have effective measures in place to protect their users, particularly children.”

The act also includes rules to prevent children from viewing pornography and content that promotes suicide, self-harm, or eating disorders.

“The new laws will require us to set up an advisory committee on mis/disinformation, who will advise Ofcom, including on how we should use our transparency powers and media literacy duties in relation to mis/disinformation across in-scope platforms.”

He added that the committee “will need to be made up of experts in mis/disinformation, and those who can represent UK users and in-scope platforms.”

He said that when establishing the committee, they will “set out publicly what its statutory functions and remit will be, as well as how it will operate.”

“All advisory committee members follow a code of conduct that includes an ongoing requirement to flag any potential conflicts of interest,” he added.

‘Draconian Measures’

Alan Miller, co-founder of the Together Declaration, told The Epoch Times, “The public rallied against the Online Safety Bill and changed some of the very, very draconian measures within it and put pressure on the government.”

Together was formed in August 2021 “to unite people from all walks of life to push back against the rapidly growing infringements on our rights and freedoms.”

This includes campaigns against the Online Safety Bill, as well as against COVID-19 lockdowns and ULEZ.

“Unfortunately, this is still a censor’s charter and outsources it to Big Tech, unelected, undemocratic, [who can] decide upon who can say what and when. This is unacceptable, we need to insist on free speech, no ifs or buts, in Britain,” added Mr. Miller.