Demystifying Harmful Content on YouTube and TikTok

Truescope, in collaboration with the Civic Tech Lab at the National University of Singapore Faculty of Arts and Social Sciences (NUS FASS), has published a report following an in-depth study on the portrayal of harmful content on popular video-based platforms YouTube and TikTok.

The study analysed 610 YouTube videos and 508 TikTok videos between 9 March and 9 April 2023. Close to 40% of the videos surfaced by these searches contained harmful content pertaining to eating disorders, self-harm and suicide. More concerning still is how skilfully the users behind these videos mask harmful content from platform moderation. Techniques such as hashtag creation and purposeful content curation were identified as strategies used to evade platform detection, with subtle differences in approach between YouTube and TikTok.

Additional Key Findings

  • Over 80% of the videos containing harmful content on YouTube and TikTok were published by female users
  • Of the users who publicly posted their age on their accounts, close to 70% on both YouTube and TikTok were below the age of 18
  • We observed differences in the proportion of harmful content on YouTube vs. TikTok, particularly for eating disorders and self-harm: eating disorders (23% vs. 65%), self-harm (70% vs. 31%) and suicide (10% vs. 6%)
  • Hashtags were used to classify and propagate harmful content across both platforms. Commonly used hashtags comprised abbreviations and acronyms (“#sh” for self-harm), deliberate misspelling of terms (“#suwerslide” to imply suicide), euphemisms (“#thinspo” for “thinspiration”), and hijacking of famous subjects (“#ednotsheeran” as a reference to eating disorders)

See how you can get more signals and insights from your media intelligence platform
