
Senior European Union officials were furious over the weekend after Twitter owner Elon Musk pulled the social media platform out of the bloc’s “Code of Practice,” which critics say is tantamount to a censorship regime.

The EU’s internal market commissioner, Thierry Breton, confirmed that Twitter had left the bloc’s Code of Practice after reports said the platform planned to do so, and warned that the company would still face legal obligations under EU law.

“Twitter leaves EU voluntary Code of Practice against disinformation. But obligations remain. You can run, but you can’t hide,” Breton wrote. “Beyond voluntary commitments, fighting disinformation will be legal obligation under #DSA as of August 25. Our teams will be ready for enforcement.”

An EU official also told Euractiv that the bloc had been “waiting for this” and that “it was purely a matter of time” before the platform withdrew.

The rules, known as the Digital Services Act (DSA), require companies to carry out risk management, undergo external and independent audits, share data with authorities and researchers, and adopt a code of conduct by August.

The 19 companies subject to the rules include Alphabet’s Google Maps, Google Play, Google Search, Google Shopping, and YouTube; Meta’s Facebook and Instagram; Amazon’s Marketplace; Apple’s App Store; and Twitter. The others are Microsoft’s LinkedIn and Bing, Booking.com, Pinterest, Snap Inc.’s Snapchat, TikTok, Wikipedia, Zalando, and Alibaba’s AliExpress.

Breton said he was checking whether another four to five companies fall under the DSA, with a decision expected in the next few weeks. He singled out Facebook’s content moderation system for criticism because of its role in shaping opinion on key issues.

“Now that Facebook has been designated as a very large online platform, Meta needs to carefully investigate the system and fix it where needed ASAP,” he said, adding: “We are also committed to a stress test with TikTok which has expressed also interest. So I look forward to an invitation to Bytedance’s headquarters to understand better the origin of Tiktok.”

Twitter had agreed earlier to a stress test, and Breton said he and his team would travel to the company’s headquarters in San Francisco at the end of June to carry out the voluntary mock exercise. Breton didn’t detail what the test would entail.

Content generated by artificial intelligence, such as deepfake videos and synthetic images, will have to be clearly labeled when it comes up in search results, Breton said. He has also said that under the Digital Services Act, violations could be punished with hefty fines of up to 6 percent of a company’s annual revenue.

Platforms will have to “completely redesign” their systems to ensure a high level of privacy and safety for children, including verifying users’ ages, Breton said.

Big Tech companies also will have to revamp their systems to “prevent algorithmic amplification of disinformation,” he said, adding that he was particularly concerned about Facebook’s content moderation systems ahead of September elections in Slovakia.

Facebook’s parent company said it supports the EU’s new Digital Services Act. “We take significant steps to combat the spread of harmful content on Facebook and Instagram across the EU,” Meta said several weeks ago. “While we do this all year round, we recognize it’s particularly important during elections and times of crisis, such as the ongoing war in Ukraine.”

Criticism

Jacob Mchangama, a Danish historian, sounded the alarm about the Digital Services Act in late 2022, writing in an opinion article that the plan would be a case of the “cure” being “worse than the disease.”

“But when it comes to regulating speech, good intentions do not necessarily result in desirable outcomes,” he wrote for the Los Angeles Times. “In fact, there are strong reasons to believe that the law is a cure worse than the disease, likely to result in serious collateral damage to free expression across the EU and anywhere else legislators try to emulate it.”

Although “removing illegal content sounds innocent enough,” he wrote that “it’s not.” That term—”illegal content”—is “defined very differently across Europe,” he said. “In France, protesters have been fined for depicting President Macron as Hitler, and illegal hate speech may encompass offensive humor” while “Austria and Finland criminalize blasphemy.”

An Epoch Times email that was sent to Twitter for comment was returned with an automated response that included a poop emoji. Musk announced earlier this year that the emoji would be sent automatically when journalists sent requests for comment.
