TikTok bombards teenagers with self-harm content within minutes of logging on to the platform, according to a new study.
The Center for Countering Digital Hate (CCDH) created eight fake profiles posing as 13-year-old girls, and researchers were “extremely disturbed” by their findings.
On average, videos about mental health and body image were shown to the accounts every 39 seconds.
Content referencing suicide was shown to one account in just 2.6 minutes, while content related to eating disorders was shown to another in eight minutes.
Profiles designed to mimic vulnerable teens concerned about their weight saw three times more harmful content than standard accounts.
They were exposed to 12 times more self-harm and suicide videos.
The report also discovered that TikTok hosts an eating disorder community that uses encrypted and open hashtags to share content with more than 13.2 billion views.
TikTok’s policies prohibit eating disorder content on its platform, but users employ coded search terms to evade the app’s moderation.
For example, “pro-ana” was used as a coded search term for pro-anorexia content that gives people tips and encouragement to starve themselves.
That phrase and others like it were banned from TikTok, but the CCDH found many are slipping through.
TikTok has insisted that the study and its research methods “do not reflect the behavior or real viewing experiences of real people.”
CCDH researchers set up two accounts in each of the US, UK, Australia and Canada – one standard profile and one “vulnerable” profile, both with traditionally female names.
The vulnerable accounts had the characters “loseweight” in their usernames, as studies show that those with body dysmorphic issues often express it through their username.
They then recorded the first 30 minutes of content TikTok automatically recommended for those accounts in their For You feed.
Whenever a video promoted potentially dangerous content about mental health, eating disorders, or self-harm, the researchers paused on it and liked it, to simulate the behavior of a young person who might be vulnerable to such content.
Examples of the resulting videos and images include a young person referring to suicide, with a caption on her clip reading, “Go to school after trying.”
Another user wrote, “If you make everyone think you’re ok, you can try private.”
Some self-harm content featured a teen flushing blades down the toilet to stop themselves from using them.
Others shared “tips” for eating disorders, such as “chewing gum makes you less hungry,” with the hashtag “imnothungry.”
However, some self-harm and eating disorder content is educational or discusses recovery from these issues.
It is not clear how much the CCDH distinguished between these types of clips and harmful ones.
CCDH CEO Imran Ahmed said: “TikTok was designed to entice young users to give up their time and attention, but our research proves that its algorithms not only entertain children but also poison their minds.
“It promotes in children hatred of their own bodies and extreme suggestions of self-harm and disturbed, potentially lethal attitudes towards food.
“Parents will be shocked to learn the truth and furious that lawmakers are not protecting young people from big tech billionaires, their inexplicable social media apps and increasingly aggressive algorithms.”
TikTok said, “We regularly consult with health professionals, eliminate violations of our policies, and provide access to supportive resources for all those in need.
“Recognizing that triggering content is unique to each individual, we remain focused on creating a safe and comfortable space for everyone, including people who choose to share their recovery journeys or educate others about these important issues.”
https://metro.co.uk/2022/12/15/tiktok-shows-teens-eating-disorder-and-self-harm-content-in-minutes-17941433/ TikTok shows teens 'eating disorder and self-harm content in minutes'