May 9, 2024

The Commission has given the firms until October 25 to provide information about measures they’ve taken to curb the spread of illegal content.

After X, Meta’s Facebook and Instagram, as well as TikTok, are under close scrutiny for potentially breaching the European Union’s content moderation law, the Digital Services Act (DSA), over content related to the Israel-Hamas conflict.

The European Commission on Thursday sent formal requests for information to Meta and TikTok, giving the companies until October 25 to detail the measures they’ve taken to curb the spread of illegal content and disinformation on their popular platforms in the wake of Hamas’ attack on Israel and the war that followed.

X, formerly known as Twitter, received a similar request last week and had to respond by October 18 about the measures it had taken in connection with the ongoing Middle East crisis.

A formal request for information is a preliminary step that could result in the European Commission opening an official investigation into non-compliance. Firms face fines of up to 6 percent of their annual global revenue if they are found to have violated the DSA.

The European Commission also gave both companies until November 8 to lay out how they are complying on other potential issues, including how they have protected the integrity of elections. TikTok is also required to clarify how it has been safeguarding minors on its video-sharing app, which is used by tens of millions of European teenagers.

Meta’s and TikTok’s chief executive officers, Mark Zuckerberg and Shou Zi Chew, were warned by Internal Market Commissioner Thierry Breton last week to step up their efforts to stem falsehoods and illegal posts such as terrorist propaganda.

“We’re happy to provide further details of this work, beyond what we have already shared, and will respond to the European Commission,” said Ben Walters, a spokesperson for Meta. “We have a well-established process for identifying and mitigating risks during a crisis while also protecting expression.”

He said that following the Hamas attack, Meta set up a “special operations center” with teams, including Hebrew and Arabic speakers, working to moderate content that violates Meta’s policies or local laws and to coordinate with third-party fact-checkers.

“We just heard from the European Commission this morning and our team is currently reviewing the [request for information],” said James Lyons, spokesperson for TikTok. “We’ll publish our first transparency report under the DSA next week, where we’ll include more information about our ongoing work to keep our European community safe.”

Source: Politico