TikTok may have had a stellar year, which it proudly reported here, but the platform has a very dark side. This troubling side was uncovered by the Wall Street Journal, whose reporters contacted the video-sharing platform for comment. TikTok responded by rapidly changing some of its policies.
The WSJ investigation centered on the experiences of young girls thrown into the dizzying and scary world of weight loss, purging methods, and extremely harmful diets on TikTok. After conducting its own experiments, the WSJ concluded that TikTok content can lead to eating disorders and worsen the situation of young girls already struggling with them.
To the surprise of the investigators, TikTok suddenly introduced changes to its video recommendation engine. While the investigators were no doubt pleased, the speed of the response left them wondering why TikTok had not implemented the changes earlier if they were that easy to make.
The WSJ created over 100 automated accounts that used the app without human oversight, 12 of them registered as 13-year-olds. The profiles were steered in a particular direction by watching videos on weight loss, alcohol, and gambling. Next, the accounts stopped consuming gambling videos and focused only on weight loss. TikTok's recommendation engine swiftly increased the number of weight loss videos it served to the accounts, despite their underage status.
Of the 255,000 videos the bots watched during the experiment, 32,700 had descriptions using hundreds of weight-loss-related keywords. Of those, 11,615 had tags related to eating disorders, and 4,402 contained tags that normalized eating disorders. The WSJ discovered that some keywords were deliberately misspelled to evade TikTok's flagging system.
The investigation team reported 2,960 videos related to eating disorders, and 1,778 of them disappeared from the platform. It is not confirmed whether the offending videos were deleted by TikTok or by their creators.
As the WSJ prepared to publish the report — in fact, just a day before it went live — TikTok announced that it was working on ways to combat dangerous rabbit holes on its platform. This came just days after the WSJ had requested comment from TikTok for the upcoming story, leading the researchers to conclude that TikTok moved swiftly to preempt the report.
In the announcement, TikTok admitted that some of its videos could be harmful. It said, “At TikTok, we recognize that too much of anything, whether it’s animals, fitness tips, or personal well-being journeys, doesn’t fit with the diverse discovery experience we aim to create.” The announcement then covered how TikTok would change its For You Page algorithm to prevent users from seeing too many harmful videos in clusters, saying it wanted to guide users away from rabbit holes of content that reinforce negative personal experiences.
In a statement obtained by The Verge, TikTok spokesperson Jamie Favazza acknowledged that the scenario the WSJ simulated could happen to real users: “While this experiment does not reflect the experience most people have on TikTok, even one person having that experience is one too many. We allow educational or recovery-oriented content because we understand it can help people see there’s hope, but content that promotes, normalizes, or glorifies disordered eating is prohibited.”
Favazza noted that TikTok points users towards the National Eating Disorder Association Hotline within the app.
TikTok is not the only social media platform facing scrutiny over negative effects on users, especially underage ones. A similar investigation into Instagram showed how easily children's accounts could find drugs and connect with drug dealers on the app simply by following hashtags. This was preceded by a whistleblower leaking internal documents showing that Meta knew about Instagram's negative effects on teenage girls but did nothing about it. The fallout culminated in an appearance before Congress by Instagram head Adam Mosseri.