TikTok Shows Teens Harmful Content Within Minutes of Joining, Study Says
As if parenting isn't hard enough already, here's another thing to worry about. If your teens join TikTok, they could be served damaging content about self-harm, suicidal ideation and eating disorders within minutes, according to a report released Wednesday.
Researchers working for the Center for Countering Digital Hate posed as 13-year-olds establishing new accounts on the platform and recorded the first 30 minutes of content that TikTok's For You page -- a home feed that quickly learns and adapts to a user's content preferences -- served up to them.
One new account was served content relating to suicide within 2.6 minutes, and another was served eating disorder content within eight minutes, researchers said. On average, the accounts were served content relating to mental health and body image every 39 seconds, according to the study, which hasn't been peer reviewed.
"Parents worry about who their kids are spending time with, what sort of things they're listening to, but actually because of these apps, which they have no insight into, it could be anyone," CCDH CEO Imran Ahmed told journalists during a briefing Tuesday. "Frankly, having TikTok babysit your kids is a terrible idea because of the way that these algorithms work."
When setting up the accounts, the researchers interacted with harmful content they encountered by liking any videos relating to self-harm, eating disorders or suicide, signaling to TikTok's algorithm that these were subjects the user was interested in.
In a statement, a spokesperson for TikTok said the researchers' activity and resulting experiences don't "reflect genuine behavior or viewing experiences of real people."
The company said it's removing all content highlighted by the report that violates its community guidelines. TikTok bans all content depicting, promoting, normalizing or glorifying activities that could lead to self-harm, as well as content that promotes eating behaviors or habits likely to cause adverse health outcomes.
Growing scrutiny for TikTok
TikTok's popularity, particularly among teens, has skyrocketed over the past few years, and the platform now boasts 111 million monthly active users in the US, according to analytics firm data.ai. But its success has also attracted the attention of child safety specialists, lawmakers and regulators.
On Tuesday, three members of Congress introduced a bill that would ban the Chinese-owned platform in the US due to concerns about spying and propaganda. The research published by the CCDH highlights additional safety concerns around the platform, many of which involve problems shared by multiple social media sites.
The research into TikTok comes a little more than a year after Meta whistleblower Frances Haugen revealed that the company's own internal research had shown Instagram was having a negative impact on the mental health of teens, especially girls with eating disorders. As a result of the revelations, Meta made a number of changes to Instagram, including adding stronger parental controls and nudging teens away from content that might not promote their well-being.
The CCDH said it used the same methodology to conduct its research into TikTok as Meta used in its own internal studies (though it should be noted that the two platforms have very different algorithms for recommending content).
Researchers set up eight accounts in total (an admittedly small sample size): two based in the US, two in the UK, two in Canada and two in Australia. Half had generic usernames, and half had usernames that included the term "loseweight," to see whether TikTok would serve different content to young people who identified themselves as interested in weight loss.
They discovered that these "vulnerable" accounts were served three times as much harmful content as the standard accounts, and 12 times as many self-harm and suicide videos. "What we found really worrying is that TikTok recognizes vulnerability," said Ahmed.
Protecting the vulnerable
During the course of its research, the CCDH said, it found eating disorder content that had amassed more than 13.2 billion views in total. People often evaded TikTok's filters by using coded hashtags to post "pro-ana" or "thinspiration" content (that is, content that promotes anorexia), including hashtags that co-opted the name of singer Ed Sheeran.
Keeping up with the evolving keywords people use to dodge its content filters is a never-ending challenge for TikTok, and whenever the company does find them, it redirects anyone clicking on them to eating disorder helplines and charities. Between April and June, the company said, 75.2% of the problematic eating disorder videos it identified were removed before they received a single view, 82.5% were removed within 24 hours of being posted, and 86.7% were removed before any user reports.
TikTok acknowledged it doesn't catch every piece of content that violates its guidelines but said it's continually investing in improving trust and safety and added that it has 40,000 people currently working on keeping the platform safe.
"We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need," said the company spokesperson. "We're mindful that triggering content is unique to each individual and remain focused on fostering a safe and comfortable space for everyone, including people who choose to share their recovery journeys or educate others on these important topics."