TikTok executives and employees were well aware that the app's features encourage compulsive use, along with the corresponding negative effects on mental health, according to NPR. The broadcaster reviewed unredacted documents from the lawsuit filed by the Kentucky Attorney General's Office, which were released by Kentucky Public Radio. More than a dozen states sued TikTok a few days ago, accusing it of “falsely claiming [that it is] safe for young people.” Kentucky Attorney General Russell Coleman said the app was “specifically designed to be an addiction machine, targeting children who are still in the process of developing adequate self-control.”
Most of the documents submitted for the lawsuits had their sensitive information redacted, but the redactions in Kentucky's filing were faulty, leaving the underlying text readable. TikTok's own research apparently found that “compulsive use is correlated with a number of negative mental health effects, such as loss of analytical skills, memory formation, contextual thinking, depth of conversation, empathy, and increased anxiety.” TikTok executives also knew that compulsive use can interfere with sleep, with work and school responsibilities, and even with “connecting with loved ones.”
They also reportedly knew that the app's time-management tool does little to keep young users off the app. While the tool sets the default limit at 60 minutes of use per day, teens still spent 107 minutes on the app even with the tool enabled, just 1.5 minutes less than the average of 108.5 minutes per day before its launch. According to the internal documents, TikTok measured the tool's success by how it “improved public trust in the TikTok platform through media coverage.” The company knew the tool wasn't going to be effective, with one document stating that “minors do not have executive function to control their screen time, while young adults do.” Another document reportedly said that “in most engagement metrics, the younger the user, the better the performance.”
Additionally, TikTok reportedly knows that “filter bubbles” exist and understands how dangerous they can potentially be. According to the documents, employees conducted internal studies in which they found themselves pulled into negative filter bubbles shortly after following certain accounts, such as those focused on painful (“painhub”) and sad (“sadnotes”) content. They were also aware of content and accounts promoting “thinspiration,” which is associated with eating disorders. Because of the way TikTok's algorithm works, its researchers found that users can be placed in a filter bubble after just 30 minutes of use in a single session.
TikTok is also struggling with moderation, according to the documents. An internal investigation found that underage girls on the app were receiving “gifts” and “coins” in exchange for stripping live on camera. And the company's top brass reportedly instructed moderators not to remove users reported to be under 13 unless their accounts actually stated that they were under 13. NPR says TikTok also acknowledged that a substantial amount of content violating its rules slips past its moderation systems, including videos that normalize pedophilia and that glorify sexual assault of minors and physical abuse.
TikTok spokesperson Alex Haurek defended the company, telling NPR that the Kentucky AG's complaint “cherry-picks misleading quotes and takes outdated documents out of context to misrepresent our commitment to community safety.” He also said that TikTok has “robust protections, including proactively removing suspected underage users” and that it has “voluntarily rolled out safety features such as default screen time limits, family pairing, and default privacy for under 16s.”