Britain’s data protection authority on Tuesday fined TikTok, the popular video-sharing app, $15.9 million, saying the platform failed to comply with data protection rules aimed at protecting children online.
The Information Commissioner’s Office said TikTok had improperly allowed up to 1.4 million children under the age of 13 to use the service in 2020, breaching British data protection rules that require parental consent for organizations to use children’s personal information. TikTok did not obtain that consent, regulators said, even though it should have known younger children were using the service.
The British investigation found that the video-sharing app did not do enough to identify underage users or remove them from the platform, even though TikTok’s own rules prohibit children under 13 from creating accounts. TikTok failed to take appropriate action, regulators said, even after some senior employees of the video-sharing platform raised concerns internally about the app’s use by underage children.
TikTok, owned by Chinese internet giant ByteDance, has also faced scrutiny in the United States. Last month, members of Congress questioned its chief executive, Shou Chew, about the potential national security risks posed by the platform.
TikTok’s privacy fine underscores growing public concern about the mental health and safety risks popular social networks can pose to some children and teens. Last year, researchers reported that TikTok began recommending content related to eating disorders and self-harm to users as young as 13 within 30 minutes of joining the platform.
In a statement, John Edwards, Britain’s information commissioner, said TikTok’s practices could have put children at risk.
“An estimated one million children under the age of 13 were granted inappropriate access to the platform, and TikTok collected and used their personal data,” Edwards said in the statement. “That means their data may have been used to track and profile them, which could lead to harmful and inappropriate content on their next scroll.”
In a statement, TikTok said it disagreed with the regulators’ findings and was reviewing the case and considering next steps.
“TikTok is a platform for users aged 13 and over,” the company said in the statement. “We invest heavily in helping keep children under 13 off the platform, and our 40,000-strong safety team works around the clock to help keep the platform safe for our community.”
This is not the first time regulators have taken aim at the popular video-sharing app over children’s privacy concerns. In 2019, Musical.ly, the operator of the platform now known as TikTok, agreed to pay $5.7 million to settle Federal Trade Commission charges that it had violated children’s online privacy rules in the United States.
Since then, lawmakers in the United States and Europe have implemented new rules to try to strengthen the protection of children online.
In March, Utah passed a sweeping law that would prohibit social media platforms like TikTok and Instagram from allowing minors in the state to have accounts without parental consent. Last fall, California passed a law that will require many social networks, video games and other apps to turn on the highest privacy settings for minors by default, and to turn off potentially risky features like friend finders that allow adult strangers to communicate with children.