European Union regulators on Thursday opened investigations into the American tech giant Meta over the potentially addictive effects that Instagram and Facebook have on children, a move with far-reaching implications because it goes to the core of how the company's products are designed.
Meta's products can “exploit the weaknesses and inexperience of minors” to create behavioral dependencies that threaten their mental well-being, the European Commission, the executive branch of the 27-member bloc, said in a statement. EU regulators could ultimately fine Meta up to 6 percent of its global revenue, which was $135 billion last year, in addition to forcing other product changes.
The investigations are part of a growing effort by governments around the world to monitor services like Instagram and TikTok to protect minors. Meta has faced criticism for years that its products and recommendation algorithms are tuned to hook children. In October, three dozen US states sued Meta for using “psychologically manipulative product features” to attract children, in violation of consumer protection laws.
EU regulators said they had been in contact with their US counterparts about the investigations announced on Thursday. Regulators said Meta could be violating the Digital Services Act, a law passed in 2022 that requires large online services to more aggressively monitor their platforms for illicit content and have policies in place to mitigate risks to children. People under 13 are not supposed to register for an account, but EU regulators said they would also examine the effectiveness of the company's age verification tools.
“We will now investigate in depth the potential addictive effects and ‘rabbit holes’ of the platforms, the effectiveness of their age verification tools and the level of privacy offered to minors in the operation of recommendation systems,” Thierry Breton, the European Union's internal market commissioner, who is overseeing the investigations, said in a statement. “We spare no effort to protect our children.”
On Thursday, Meta said its social media services were safe for young people, highlighting features that allow parents and children to set time limits on how much they use Instagram or Facebook. Teenagers are also placed in more restrictive content and recommendation settings by default, and advertisers are not able to show targeted ads to underage users based on their activity on Meta's apps.
“We want young people to have safe, age-appropriate online experiences and have spent a decade developing more than 50 tools and policies designed to protect them,” Meta said in a statement. “This is a challenge facing the entire industry and we look forward to sharing details of our work with the European Commission.”
EU officials did not give a timeline for how long the investigations would last. But the opening of a formal investigation Thursday gives regulators broad authority to gather evidence from Meta, including sending legal requests for information, interviewing company executives and conducting inspections of corporate offices. The Instagram and Facebook investigations will be carried out separately.
EU regulators have targeted several companies since the Digital Services Act came into force. Last month, TikTok suspended a version of its app in the European Union after authorities raised questions about an “addictive” feature that allows users to earn rewards such as gift cards for watching videos, liking content and following certain creators.
Meta faces another investigation related to political advertising, while X, the social media site owned by Elon Musk, faces an inquiry related to content moderation, risk management and advertising transparency.