TikTok promotes harmful content, including eating disorder and self-harm material, to teens' feeds, according to a new NGO report.
A new report shows that TikTok pushes harmful content promoting eating disorders and self-harm to teens every 39 seconds.
According to the CCDH report, the harmful content includes material about eating disorders, self-harm, and sexual assault.
Research shows that harmful TikTok content slips past moderators because it uses coded hashtags.
The CCDH has shared four recommendations with TikTok to help make the app safer for teens.
The Center for Countering Digital Hate's (CCDH) report says that harmful content includes information about eating disorders, self-harm, and sexual assault.
The CCDH is a U.S.-based international nonprofit and NGO that works to counter the factors driving digital hate and misinformation.
The CCDH researchers tested TikTok's algorithm by creating new accounts posing as 13-year-olds in the U.S., U.K., Australia, and Canada. One of these accounts had a username suggesting someone worried about their appearance.
During the study, the researchers recorded and liked any videos about body image, mental health, or eating disorders that the app's algorithm suggested on the "For You" feed within the first 30 minutes. They then reviewed the recordings to count how often the algorithm suggested content about body image, mental health, self-harm, and eating disorders.
The CCDH study created "standard teen" and "vulnerable teen" accounts to produce more specific results. The "vulnerable teen" accounts were modeled on research from the tech reform group Reset.
This research showed that young Instagram users who support eating disorders usually choose usernames that include related words such as "anorexia." Teens with eating disorders are more likely to see content about eating disorders on social media because they are looking for information that confirms their beliefs about the subject.
Similarly, there is evidence that people vulnerable to content about depression, self-harm, and suicide tend to choose usernames containing words related to these topics.
The study found that TikTok showed teens content about body image and mental health every 39 seconds. New TikTok accounts in the study were recommended self-harm and eating disorder content within minutes of scrolling the app's "For You" feed.
The study found that some pro-eating disorder content avoided moderation by using coded hashtags, like co-opting the name of singer Ed Sheeran.
It also showed that TikTok targets vulnerable teens with more harmful content than standard teens, not less. Vulnerable teen accounts were shown three times as many harmful videos, and 12 times as many self-harm videos, as standard teen accounts.
Since the app launched in 2017, TikTok has become the social media platform with the fastest growth, reaching a billion users. Today, two-thirds of U.S. teens use the app, and the average American user watches 80 minutes of TikTok videos daily.
An ongoing study has found a steady increase in eating disorders over the last 50 years in young women and girls ages 15 to 24.
Currently, suicide is the second leading cause of death for those between the ages of 15 and 24 in the U.S. About 20% of American teens say they have had serious thoughts of suicide.
In September 2022, a British coroner ruled that Instagram posts contributed to the suicide of 14-year-old Molly Russell, further raising concern about social media's effects on teen mental health.
The CCDH says that TikTok has a responsibility to be a safer app for teens, especially the most vulnerable. The study offers four recommendations to make TikTok safer for teenagers:
TikTok should be transparent about how its algorithms work and how its rules are enforced; if it will not, governments should compel it to do so.
TikTok should seek help from public health, civil society, and academic professionals with expertise in eating disorders to improve how it handles eating disorder content.
Lawmakers should encourage TikTok to use the CCDH STAR Framework for social media, which includes: Safety by Design, Transparency, Accountability, and Responsibility.
"Until social media companies are liable for negligence in the coding of their algorithms, instead of hiding behind the Section 230 liability shield, they will continue to behave in a negligent manner that puts children and adults at risk," the CCDH said.
The CCDH encourages parents, along with government officials, to read its study to help hold TikTok accountable.
“Despite the promises made by Big Tech, this report shows that self-regulation has failed. Legislators, parents, and young people should be concerned that TikTok is far from being the friendly dance app it promotes itself to be to the public,” said the CCDH report.