TikTok suggests eating disorder and self-harm content to new teen accounts within minutes, study finds


A study of TikTok’s video recommendation algorithm found that it suggests eating disorder and self-harm content to some new teen accounts within minutes.

Research by the Center for Countering Digital Hate (CCDH) found that one account was shown suicide-related content within 2.6 minutes and another was shown eating disorder content within eight minutes.

Further research by Sky News found that harmful eating disorder content is also recommended through TikTok’s suggested search feature, even when users do not search for harmful content.

British eating disorders charity BEAT said the findings were “extremely alarming” and called on TikTok to take “urgent action to protect vulnerable users”.

Content Warning: This article contains references to eating disorders and self-harm

TikTok’s For You page offers a stream of videos that are suggested to the user based on the type of content they are engaging with on the app.

According to the social media company, recommendations are based on a number of factors, including likes, follows, shares and device settings such as language preferences.


However, some have raised concerns about how this algorithm behaves when it comes to recommending malicious content.

Picture:
This is one of the videos suggested during the study. The lyrics read “she’s thinner” and the music playing over the video says “I starved myself for you”. Image: Center for Countering Digital Hate via TikTok

The CCDH set up two new accounts in each of the UK, US, Canada and Australia. Each was given a traditionally female username and had its age set at 13.

Each country’s second account also included the phrase “loseweight” in its username, which separate research has shown to be characteristic of accounts belonging to vulnerable users.

CCDH researchers analyzed video content shown on the For You page of each new account over a 30-minute period and only interacted with videos related to body image and mental health.

It found that the standard teen accounts were served mental health and body image videos every 39 seconds.

Not all of the content recommended at this frequency was harmful, and the study did not distinguish between positive content and negative content.

However, all of the accounts were shown content about eating disorders and suicide, sometimes very quickly.

CCDH’s research also found that the vulnerable accounts were shown this type of content three times as often as the standard accounts, and that the content they were shown was more extreme.

Picture:
The Center for Countering Digital Hate found 56 hashtags associated with eating disorder content. 35 of these contained a high concentration of content advocating an eating disorder. Image: TikTok

It follows CCDH’s findings that TikTok hosts an eating disorder content community that has garnered over 13.2 billion views across 56 different hashtags.

About 59.9 million of those views came from hashtags that contained a high concentration of eating disorder videos.

However, TikTok says the activities captured in the study and the resulting experience “do not reflect genuine behavior or viewing experiences of real people”.

Picture:
Eating disorder content is banned on TikTok, and the company says it removes content that violates its Terms of Service. Image: REUTERS/Dado Ruvic/File Photo

Kelly Macarthur began suffering from an eating disorder at the age of 14. She has since recovered from her illness, but as a content creator on TikTok, she is concerned about the impact some of her content might have on people who are suffering.

“When I was feeling uncomfortable, I thought social media was a really healthy place to vent my problems. But in reality it was full of pro-anorexia material giving you different tips and triggers,” she told Sky News.

“I see the same thing happening to young people on TikTok.”

Further investigation by Sky News also revealed that TikTok suggests harmful eating disorder content in other areas of the app, even when it is not specifically searched for.

Sky News conducted its own research into TikTok’s recommendation algorithm using several different accounts. But instead of analyzing the For You page, we entered safe terms such as “weight loss” and “diet” into TikTok’s search bar.

Picture:
Sky News found that searches for terms like “diet” returned suggested searches related to eating disorder content. Image: TikTok

A search for the term “diet” on one account brought up the suggested search “pr0 a4a”.

That’s code for “pro ana,” which refers to pro-anorexia content.

TikTok’s Community Guidelines prohibit eating disorder content on the platform, including searches for terms explicitly related to it.

However, users often make subtle changes to the terminology, allowing them to post about these topics without being detected by TikTok’s moderators.

While the term “pro ana” is banned on TikTok, variations on it still crop up.

Picture:
The left screenshot shows the suggested results for the term “weight loss”. The right screenshot shows the suggested results when the first suggestion is clicked. Image: TikTok

Sky News also found that eating disorder content is easily accessible through TikTok’s user search function, even when it is not explicitly searched for.

A search for the term “weight loss” returned at least one account in the top 10 results that appeared to be an eating disorder account.

Sky News reported this account to TikTok, and it has since been removed.

“It’s alarming that TikTok’s algorithm is actively pushing users towards harmful videos that can have devastating effects on vulnerable individuals,” said Tom Quinn, BEAT’s director of external affairs.

“TikTok and other social media platforms urgently need to take action to protect vulnerable users from malicious content.”

In response to the findings, a TikTok spokesperson said: “We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need.

“We’re mindful that triggering content is unique to each individual and remain focused on fostering a safe and comfortable space for everyone, including people who choose to share their recovery journeys or educate others about these important issues.”

The Data and Forensics team is a multi-skilled unit dedicated to providing transparent journalism from Sky News. We gather, analyze and visualize data to tell data-driven stories. We combine traditional reporting skills with advanced analysis of satellite imagery, social media and other open source information. Through multimedia storytelling, we aim to better explain the world while also showing how our journalism is done.

Why data journalism matters to Sky News



