TikTok has been fined £12.7m for multiple breaches of data protection law, including using the personal data of children under 13 without parental consent, Britain’s data watchdog said.
The Chinese-owned video app had not done enough to verify who was using the platform and remove underage children, the Information Commissioner’s Office (ICO) said on Tuesday.
Failure to enforce age limits led to “up to 1.4 million UK children” under the age of 13 using the platform as of 2020, the ICO estimated, despite the company’s own rules prohibiting the practice. UK data protection law does not strictly prohibit children’s use of the internet, but it requires that organizations using children’s personal data obtain the consent of their parents or carers.
In a statement, Information Commissioner John Edwards said: “Laws are in place to ensure our children are as safe in the digital world as they are in the physical world. TikTok did not comply with those laws.
“As a consequence, approximately 1 million children under the age of 13 were inappropriately granted access to the platform, with TikTok collecting and using their personal data. That means their data may have been used to track and profile them, which could lead to harmful and inappropriate content on their very next scroll.”
“TikTok should have known better,” Edwards added. “TikTok should have done better. Our £12.7 million fine reflects the serious impact their failures may have had. They did not do enough to verify who was using their platform or take enough steps to remove underage children who were using their platform.”
The ICO investigation found that concerns had been raised internally but that TikTok did not respond “adequately.”
In a statement, a TikTok spokesperson said: “TikTok is a platform for users 13 years of age and older. We invest heavily in helping keep children under 13 off the platform, and our 40,000-strong safety team works around the clock to help keep the platform safe for our community.
“While we disagree with the ICO’s decision, which relates to May 2018 to July 2020, we are pleased that the fine announced today has been reduced to less than half the amount proposed last year. We will continue to review the decision and are considering next steps.”
TikTok stressed that it had changed its practices since the period the ICO investigated. Now, like its social media peers, the site uses more signals than users’ self-reported age when trying to determine their age, including training its moderators to identify underage accounts and providing tools for parents to request the deletion of their underage children’s accounts.
The allegations also predate the ICO’s introduction of the “age-appropriate design code,” which specifies an even stricter set of rules that platforms are expected to follow when handling children’s personal data. That code also makes it more explicit that platforms cannot plead ignorance of younger users’ ages as a defense for not treating their personal data with care.
In 2019, TikTok was fined $5.7 million by the US Federal Trade Commission for similar practices. That fine, a record at the time, was also imposed on TikTok for inappropriately collecting data from children under 13. That year, the company pledged to improve its practices, saying it would start keeping younger users in “age-appropriate TikTok environments”, where those under 13 would be pushed into a more passive role, able to watch videos but not post or comment on the platform.