A recent study from the CCDH found that TikTok was likely to expose kids to content about eating disorders and suicide within minutes of account creation.
The Center for Countering Digital Hate (CCDH) released the study on Wednesday. Its findings suggest that teens can be exposed to dangerous content on TikTok within roughly three minutes of creating an account, and that a large amount of material on the platform promotes eating disorders and suicidal thoughts. The CCDH said it uncovered the problem after setting up accounts registered to 13-year-olds, the platform's minimum age, in Canada, the United Kingdom, the United States, and Australia.
While the accounts did receive videos on body positivity and mental wellness, they were also served content promoting harmful behaviors. The report arrived amid contentious exchanges between TikTok and US legislators who want the app banned, with lawmakers focusing on national security and arguing that China could exploit user data for its own purposes.
“Two-thirds of American teenagers use TikTok, and the average viewer spends 80 minutes daily on the application. The app, owned by the Chinese company ByteDance, rapidly delivers a series of short videos to users and has overtaken Instagram, Facebook, and YouTube in the bid for young people’s hearts, minds, and screen time,” said CCDH CEO Imran Ahmed.
“What we found was deeply disturbing. Within 2.6 minutes, TikTok recommended suicide content. Within 8 minutes, TikTok served content related to eating disorders. Every 39 seconds, TikTok recommended videos about body image and mental health to teens. The results are every parent’s nightmare: young people’s feeds are bombarded with harmful, harrowing content that can have a significant cumulative impact on their understanding of the world around them and their physical and mental health,” Ahmed continued.
TikTok criticized the study
A TikTok representative responded to the report, claiming it contained errors. According to TikTok, the study's sample size, the length of the test, and the method of liking and scrolling through videos are all problematic. The spokesperson also said the CCDH generalized by topic rather than examining the specifics of each video; a video about eating disorders, for instance, may tell the story of someone who has overcome the condition.
“This activity and resulting experience do not reflect genuine behavior or viewing experiences of real people. We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need,” explained the spokesperson.
“We’re mindful that triggering content is unique to each individual and remain focused on fostering a safe and comfortable space for everyone, including people who choose to share their recovery journeys or educate others on these important topics.”
TikTok has consistently said it will comply with US government cyber regulations. The spokesperson added that the company would implement updated security measures to protect users from hazardous content detected on the platform, and that it is developing filters to better handle material promoting suicide and other dangerous subjects.
Recommendations from the study
Following its investigation, the CCDH developed several recommendations for TikTok to address the problems it believes are harming users. According to the CCDH, some eating disorder-related videos on TikTok have received more than 13 billion views, such content surfaces soon after a new account is opened, and users' feeds now carry material on various mental health issues.
“TikTok must provide full transparency of its algorithms and rules enforcement, or regulators should step in and compel the platform to do so,” the CCDH report said.
“Dealing with eating disorder content within videos or using coded hashtags requires proactive, informed enforcement that leverages the experience of public health, civil society and academic bodies with expertise in this issue,” it continued.
“Legislators can change the incentives that shape TikTok’s business model by implementing our STAR Framework for social media regulation, which includes: Safety by Design, Transparency, Accountability, and Responsibility.”
“Until social media companies are liable for negligence in coding their algorithms, instead of hiding behind the Section 230 liability shield, they will continue to behave in a negligent manner that puts children and adults at risk.”
TikTok content risks mental health
TikTok, already one of the most popular social media platforms, grew even more popular during the pandemic. The site is loaded with all kinds of material, including family, mental health, movies, travel, hobbies, and cuisine, with millions of videos posted every day by its billions of users. However, the CCDH found that while TikTok's algorithm serves users content relevant to their interests and pastimes, it also surfaces videos focused on mental health in users' For You pages.
“What makes TikTok unique is that the For You feed and its algorithmic recommendations are designed, in the words of its maker, to be ‘central to the TikTok experience.’ But this also introduces unique dangers, as that same algorithm can recommend harmful content. It seeks to keep users viewing its content and the ads that earn the platform money,” explained the CCDH report.
“Content recommended by TikTok in our study shows that teens with an interest in body image and mental health may face similar harms from a high rate of recommendations for body image and mental health content.”
“Most mental health videos recommended to Standard Teen Accounts in our study consisted of users sharing their anxieties and insecurities. In contrast, body image content appeared to be more harmful, with videos advertising weight-loss drinks and ‘tummy tuck’ surgeries to accounts with a stated age of 13.”