TikTok’s recommendation algorithm has often been praised for its ability to start serving relevant content sooner than other social media platforms. TikTok attributes some of its rapid growth to this algorithm, among other features that make the app so popular. But a recent study suggests the algorithm may also be promoting harmful content, especially to younger users.
TikTok has become the fastest-growing social media platform to date, rising immensely in popularity in the few short years since its release. It has already dethroned YouTube and is now closing in on Facebook. Much of this success is attributed to its recommendation algorithm, which draws users into the app immediately.
When a user first signs up, TikTok populates the ‘For You’ page with a few broad recommendations based on the user’s location, language, device and what’s currently popular. As the user interacts with the platform, the suggestions become more specific. A recent study, however, has shown how dangerous some of these recommendations can be.
The Center for Countering Digital Hate (CCDH) recently investigated the recommendation algorithm, and the results were disturbing: certain accounts were repeatedly served content related to eating disorders, self-harm and other harmful topics shortly after joining the platform.
To test this, the CCDH created two accounts each in the US, UK, Canada and Australia, all posing as 13-year-olds. One account in each pair was given only a female name; the other was given a female name plus a reference to losing weight. The researchers then recorded the content served to each account during its first 30 minutes on the app.
As part of the methodology, each account reacted to content about body image, mental health and eating disorders by pausing on related videos or liking them. The researchers did not classify videos by intent (positive or negative), as intent is often ambiguous.
The results were alarming. One account was shown content referencing suicide within three minutes of joining; another was shown eating disorder content within eight minutes. On average, the accounts were served videos about mental health or body image every 39 seconds.
The research also showed that the more vulnerable accounts, i.e. those with a reference to losing weight in their usernames, were shown three times as much harmful content as the control accounts, and 12 times as much self-harm and suicide-related content.
The study has raised serious concerns about the platform’s recommendation algorithm. Many worry that it promotes body hatred, extreme suggestions of self-harm and disordered attitudes toward food. TikTok has responded that the study does not reflect real user behavior and that the platform is already doing what it can to remain a safe environment for users.
Some of TikTok’s practices are indeed questionable, but could the company be giving this issue too little attention?