It is home to dance tutorial videos and viral comedy skits. But it also hosts self-harm and disordered eating content, with an algorithm that has been called the “crack cocaine of social media.”
Now the information commissioner has concluded that as many as 1.4 million children under the age of 13 may have been using TikTok, and the watchdog accuses the Chinese-owned company of not doing enough to keep underage users off the app.
“There is self-harm content, there is nonsense content about cures for mental health [conditions],” said Imran Ahmed of the Center for Countering Digital Hate, which produced a report last December suggesting that TikTok’s algorithm was sending harmful content to some users within minutes of signing up.
He said the most dangerous aspect was the sheer prevalence of such content on social media platforms. “The truth is that they are being inundated with content that gives them extremely distorted views of themselves, their bodies, their mental health, and how they compare to other people,” he added.
Ahmed said his organization’s research suggested that changing a user’s name from “Sarah” to “Sarah Lose Weight” on TikTok could result in its algorithm serving 12 times more self-harm content.
“The algorithm recognizes vulnerability and, instead of seeing it as something to be careful about, sees it as a potential point of addiction – a way of maximizing time on the platform for that child by offering them content that could trigger some of their pre-existing concerns.”
Ahmed said the center’s research, which was based on 13-year-old users in the UK, US, Canada and Australia, suggested that young people could be shown self-harm content about two and a half minutes after setting up a TikTok account – and eating disorder content within eight minutes.
More recent research, he said, suggested that a 14-year-old boy on the platform was likely to be shown content from the virulently misogynistic influencer Andrew Tate within three minutes of setting up an account.
“That is deeply concerning [because] what underlies both, of course, are incredibly dangerous views of how a girl should be and how a young man should behave and view young women. There is a fundamental misogyny that underpins both.”
According to the information commissioner, more than 1 million underage children may have had their personal data collected and used by the platform. “That means your data may have been used to track and profile you, which could lead to harmful and inappropriate content on your next scroll,” he said.
A TikTok spokesperson said: “Many people struggling with their mental health or on a recovery journey come to TikTok for support, and our goal is to help them do so safely.
“Our community guidelines are clear that we do not allow the promotion, normalization, or glorification of eating disorders, suicide, or self-harm. We regularly consult with health experts, remove content that violates our policies, and provide access to support resources for anyone who needs them.
“We are open to feedback and scrutiny, and look to engage constructively with partners who have experience with these complex issues, just as we do with NGOs in the US and UK.”
Ahmed added that TikTok was by no means the only social media platform he had seen do too little to police harmful content, describing an arms race among platforms to come up with increasingly effective ways to keep users engaged, even if it means putting them at risk.