Meta, the American tech giant, is being investigated by European Union regulators over the spread of disinformation on its Facebook and Instagram platforms, poor oversight of misleading ads, and a potential failure to protect the integrity of elections.
On Tuesday, European Union officials said Meta did not appear to have sufficient safeguards against deceptive ads, deepfakes and other misleading information that is maliciously spread online to amplify political divisions and influence elections.
The announcement appears aimed at pressuring Meta to do more ahead of elections this summer, when voters across the 27 EU countries will choose new members of the European Parliament. The vote, which will be held June 6-9, is being closely watched for signs of foreign interference, particularly from Russia, which has sought to weaken European support for the war in Ukraine.
The Meta investigation shows how European regulators are taking a more aggressive approach to regulating online content than authorities in the United States, where free speech and other legal protections limit the government's role in policing online speech. An EU law that came into force last year, the Digital Services Act, gives regulators broad authority to monitor Meta and other large online platforms over content shared through their services.
“Large digital platforms must meet their obligations to dedicate sufficient resources to this, and today's decision shows that we are serious about compliance,” Ursula von der Leyen, the president of the European Commission, the executive branch of the European Union, said in a statement.
European officials said Meta must address weaknesses in its content moderation system to better identify malicious actors and remove related content. They cited a recent report by AI Forensics, a civil society group in Europe, which identified a Russian information network that purchased misleading ads through fake accounts and other methods.
European officials also said Meta appeared to be reducing the visibility of political content, with potentially damaging effects on the electoral process. Officials said the company should provide more transparency about how such content spreads.
Meta defended its policies and said it acted aggressively to identify and block the spread of disinformation.
“We have a well-established process to identify and mitigate risks on our platforms,” the company said in a statement. “We look forward to continuing our cooperation with the European Commission and providing them with more details of this work.”
The Meta investigation is the latest announced by EU regulators under the Digital Services Act. The content moderation practices of TikTok and X, formerly known as Twitter, are also being investigated.
The European Commission can fine companies up to 6 percent of global revenue under the law. Regulators can also raid a company's offices, interview company officials and gather other evidence. The commission did not say when the investigation would end.
Social media platforms are under immense pressure this year as billions of people around the world vote in elections. The techniques used to spread false information and conspiracy theories have grown more sophisticated, including new artificial intelligence tools to produce text, video and audio, but many companies have cut back their content moderation and election teams.
European officials also noted that Meta had reduced access to its CrowdTangle service, which governments, civil society groups and journalists use to monitor disinformation on its platforms.