
TikTok’s ban on #SkinnyTok fails to stop the spread of dangerous dieting content as creators find clever workarounds, endangering vulnerable young users despite regulatory pressure.
Key Takeaways
- TikTok banned the hashtag #SkinnyTok following European regulatory pressure, but harmful content promoting extreme weight loss continues to circulate through alternative means.
- Research shows social media content glorifying unhealthy thinness increases the risk of eating disorders, with anorexia having the highest mortality rate among psychiatric disorders.
- Content creators easily circumvent platform regulations using coded language and alternative hashtags to continue spreading potentially harmful messaging.
- TikTok has implemented safety measures, including redirecting #SkinnyTok searches to eating disorder resources, but these efforts fall short of addressing the underlying problem.
- Body-positive content struggles to gain the same algorithmic traction as content promoting unrealistic beauty standards, reflecting a concerning shift back toward extreme-thinness ideals.
TikTok’s Regulatory Response Falls Short
Under pressure from European regulators, TikTok recently banned the hashtag #SkinnyTok in an attempt to combat content promoting unrealistic body images and dangerous weight loss methods. The platform now redirects searches for the hashtag to resources from the National Alliance for Eating Disorders. However, this surface-level solution fails to address the deeper problem, as content creators quickly adapt by using alternative terminology, coded language, and new hashtags to continue sharing potentially harmful messaging about extreme dieting and unhealthy body standards.
“You have many kinds of content in the gray zones,” said Brooke Erin Duffy, an associate professor at Cornell University who studies social media.
The Dangerous Reality of Pro-Anorexia Content
The persistence of harmful content on TikTok presents a genuine threat to vulnerable users, particularly young women and girls. Research consistently links consumption of such content to higher risks of disordered eating behaviors. With anorexia having the highest mortality rate among all psychiatric disorders, the stakes couldn’t be higher. Some content creators like Kate Glavan have attempted to raise awareness about these dangers, but their efforts are often overshadowed by the algorithm’s preference for content that generates higher engagement—often the very content promoting unrealistic standards.
“A lot of creators are explicitly promoting anorexia to their audience,” said Kate Glavan, a TikTok creator concerned about harmful content.
The Failure of Body Positivity in Algorithm-Driven Spaces
The body positivity movement faces significant challenges in gaining traction on platforms like TikTok. Content promoting unrealistic beauty standards consistently outperforms body-positive messaging in engagement and reach. This reflects a troubling trend in which societal beauty standards are shifting back toward extreme thinness after brief progress toward more diverse representation. Content creators like Megan Jayne Crabbe and Nyome Nicholas-Williams have voiced concerns about widespread fatphobia and biased content moderation that disproportionately targets plus-size creators, particularly Black women.
“Negative images that are unrealistic or show really thin people or really muscular people tend to have a more lasting impact than body-positive content,” notes Amanda Raffoul, research scientist at Boston Children’s Hospital.
Platform Responsibility and Conservative Values
While TikTok didn’t create these harmful beauty standards, the platform bears responsibility for how its algorithm amplifies and targets such messaging. This situation highlights the broader issue of corporate responsibility in the social media landscape, where profit motives often conflict with user welfare. For conservatives who value personal responsibility and family well-being, the failure of platforms like TikTok to effectively protect vulnerable young users from harmful content represents yet another example of corporate entities placing profits above traditional values and community standards.
The ongoing struggle to regulate harmful content on TikTok underscores the need for more effective solutions that respect free speech while protecting impressionable users. President Trump’s administration has consistently emphasized the importance of holding social media companies accountable, and this latest failure demonstrates why that approach is essential. Until platforms implement truly effective content moderation or face meaningful consequences, harmful messaging will continue to reach vulnerable Americans, undermining parental authority and traditional values around health and self-acceptance.