Meta has received more than 1.1 million reports of users under the age of 13 on its Instagram platform since early 2019 but “disabled only a fraction” of those accounts, according to a newly unsealed legal complaint against the company filed by the attorneys general of 33 states.
Instead, the social media giant “routinely continued to collect” children’s personal information, such as their locations and email addresses, without parental permission, in violation of a federal children’s privacy law, according to the court filing. Meta could face hundreds of millions of dollars, or more, in civil penalties if the states prove the allegations.
“Within the company, Meta’s actual knowledge that millions of Instagram users are under the age of 13 is an open secret that is routinely documented, rigorously analyzed and confirmed,” the complaint said, “and zealously protected from disclosure to the public.”
The privacy charges are part of a broader federal lawsuit, filed last month by California, Colorado and 31 other states in the U.S. District Court for the Northern District of California. The lawsuit accuses Meta of unfairly trapping young people on its Instagram and Facebook platforms while hiding internal studies that show harm to users. And it seeks to force Meta to stop using certain features that states say have harmed young users.
But much of the evidence cited by the states was blacked out by redactions in the initial filing.
Now the unsealed complaint, filed Wednesday night, provides new details from the states’ lawsuit. Using snippets of internal emails, employee chats and company presentations, the complaint alleges that Instagram for years “coveted and pursued” underage users, even as the company “failed” to comply with the children’s privacy law.
The unsealed filing said Meta “continually failed” to make effective age-verification systems a priority and instead used approaches that enabled users under 13 to lie about their age to set up Instagram accounts. It also accused Meta executives of publicly stating in congressional testimony that the company’s age-verification process was effective and that the company removed underage accounts when it learned of them, even as the executives knew there were millions of underage users on Instagram.
“Tweens want access to Instagram and they are lying about their age to get it now,” Adam Mosseri, head of Instagram, said in an internal company chat in November 2021, according to the court filing.
In Senate testimony the following month, Mr. Mosseri said: “If a child is under the age of 13, they are not permitted on Instagram.”
In a statement on Saturday, Meta said it had spent a decade working to make online experiences safe and age-appropriate for teenagers and that the states’ complaint “mischaracterizes our work using selective quotes and cherry-picked documents.”
The statement also noted that Instagram’s terms of use prohibit users under the age of 13 in the United States. And it said the company had “measures in place to remove these accounts when we identify them.”
The company added that verifying people’s ages was a “complex” challenge for online services, especially with younger users who may not have a school ID or driver’s license. Meta said it would like to see federal legislation requiring “app stores to get parents’ approval whenever their teens under 16 download apps” rather than requiring teenagers or their parents to provide personal data, such as birth dates, to many different apps.
The privacy charges in the case center on a 1998 federal law, the Children’s Online Privacy Protection Act. That law requires online services with content aimed at children to obtain verifiable permission from a parent before collecting personal data (such as names, email addresses or selfies) from users under 13. Fines for violating the law can amount to more than $50,000 per violation.
The lawsuit argues that Meta decided not to create systems to effectively detect and exclude such underage users because it viewed children as a crucial demographic (the next generation of users) that the company needed to capture to ensure continued growth.
Meta had many indicators of underage users, according to Wednesday’s filing. An internal company chart displayed in the unsealed material, for example, showed how Meta tracked the percentage of 11- and 12-year-olds who used Instagram daily, according to the complaint.
Meta was also aware of accounts belonging to specific underage Instagram users through the company’s reporting channels. But it “automatically” ignored certain reports of users under 13 and allowed them to continue using their accounts, the complaint said, as long as the accounts did not contain a user biography or photographs.
In one case in 2019, Meta employees discussed in emails why the company had not deleted four accounts belonging to a 12-year-old, despite requests and “complaints from the girl’s mother stating that her daughter was 12 years old,” according to the complaint. The employees concluded that the accounts were “ignored” in part because Meta representatives “could not tell for sure the user was underage,” the legal filing said.
This is not the first time the social media giant has faced accusations of privacy violations. In 2019, the company agreed to pay a record $5 billion and change its data practices to resolve Federal Trade Commission charges that it misled users about its ability to control their privacy.
It may be easier for the states to pursue Meta for children’s privacy violations than to prove that the company encouraged compulsive social media use, a relatively new phenomenon, among young people. Since 2019, the FTC has successfully brought similar children’s privacy complaints against tech giants including Google and its YouTube platform, Amazon, Microsoft and Epic Games, the creator of Fortnite.