TikTok investigating videos promoting starvation and anorexia
Guardian found potentially harmful pro-weight-loss accounts were still available in search results
TikTok has launched an investigation and banned some search terms after the Guardian found harmful pro-anorexia content was still easily searchable despite measures taken by the social media company to prohibit the advertising of weight-loss products.
The video app – one of the most popular in the world with more than 800 million users, almost half of whom are between the ages of 16 and 24 – has imposed new restrictions on weight-loss ads after criticism for promoting dangerous diets.
But harmful accounts that promote potentially life-threatening eating disorders were still easy to find. While the company had blocked some hashtags, putting the same words into a search for profiles brought up dozens of accounts promoting eating disorders.
Those searching for content via hashtags can also get around restrictions by using slight misspellings or variants on common terms.
After being presented with the findings, TikTok launched an investigation and said it had taken action to ban harmful phrases across all search verticals, including when searching for users.
One account showed messages from a girl saying she wanted tips on losing a lot of weight, in a healthy or unhealthy way. Another account said: “This is a warning if you don’t like stuff about starving leave please.”
Another user asked people to follow for “low calorie” safe food when you don’t want to purge, a form of eating disorder behaviour that involves self-induced vomiting or the misuse of laxatives or medications.

TikTok said it had banned six accounts flagged to it for violating the community guidelines on posting content that promotes eating habits that are likely to cause health problems.
Dr Jon Goldin, vice-chair of the child and adolescent faculty at the Royal College of Psychiatrists, described the findings as “deeply disturbing”. He urged social media companies to do more and said regulators needed strong powers to sanction inaction.
Ysabel Gerrard, a lecturer in digital media and society at the University of Sheffield, said: “It takes little more than 30 seconds to find a pro-eating disorder account on TikTok and, once a user is following the right people, their For You page will quickly be flooded with content from similar users. This is because TikTok is essentially designed to show you what it thinks you want to see.”
TikTok’s For You page is a feed of videos – not always from people you follow – recommended by an algorithm based on your history. People have reported being served up accounts that regularly post about eating disorders, weight loss or diets.
Gerrard said that since the first wave of press coverage about pro-eating disorder content on TikTok, the company had taken steps to address the issue by banning ads for fasting apps and weight-loss supplements. “I applaud the company for making it. However, there are some more things that TikTok urgently needs to do to make the platform even safer,” she said, adding that restricting “results for hashtag searches is not enough, and hashtag searches might not even be the way users find new content anyway”.
At present, TikTok doesn’t send resources to people in the UK searching for pro-eating disorder terms. “It simply says ‘no results found’ or directs you to the platform’s community guidelines – their rulebook for user behaviour,” Gerrard said. She acknowledged that removing content was tricky. “In particular, TikTok would need to be careful when limiting search results for usernames because some accounts might be pro-recovery, and there’s plenty of evidence to tell us how helpful social media can be for people with eating disorders.”
Tom Quinn, director of external affairs for Beat, the UK’s eating disorder charity, said: “So-called ‘pro-ana’ or ‘pro-mia’ content can be very attractive to people affected by eating disorders and has the potential to be devastating.”
Quinn said they had shared their concerns with TikTok, and the company had been receptive to hearing from people with experience of eating disorders in order to make their platform safer. “We welcome the steps they have taken against advertising weight-loss products, and we urge them to take further action against harmful content,” he said.
The Conservative MP Damian Collins, the former chair of a parliamentary committee charged with investigating social platforms, said it was not clear how TikTok’s algorithm worked. “It’s amazing how fast TikTok has grown … I would like for them to tackle this [pro-eating disorder content] and explain what policies they will put in place to more effectively spot and not promote harmful content.”
A spokesperson for TikTok said: “As soon as this issue was brought to our attention, we took action banning the accounts and removing the content that violated those guidelines, as well as banning particular search terms. As content changes, we continue to work with expert partners, update our technology and review our processes to ensure we can respond to emerging and new harmful activities.”
TikTok presented new policies against ads for weight loss and diet products, but triggering content still exists on the app.
TikTok is looking into new ways to keep harmful pro-starvation and anorexia videos off its platform after reporting revealed how prevalent they were on the short-form video app.
TikTok made a commitment to support body-positive content on its platform with a September statement announcing efforts to crack down on ads promoting weight-loss and dieting products, as well as partnerships with the National Eating Disorder Association (NEDA) and other advocates using the app. Still, a piece in the Guardian showed just this week how easy it is for users to find harmful videos through loopholes, such as using slight misspellings for search terms commonly associated with the pro-eating-disorder community that TikTok blocks.
The pro-eating-disorder community (which also dubs itself pro-ED and pro-Ana, for anorexia) is a harmful, web-based subculture in which “people with anorexia, bulimia, or other eating disorders support practices involved with anorexia or weight loss,” according to the American Addiction Centers website. “These sites may strengthen the disease of anorexia, as people involved in the discussions often praise weight loss and discourage healthy body shape and size.”
The videos the Guardian recently uncovered have users encouraging viewers to, for example, “flood these comments with ways to loose alot [sic] of weight in 3 days, healthy or unhealthy.” According to TikTok, efforts have already been made to ensure that these videos aren’t allowed on the platform.
“As soon as this issue was brought to our attention, we took action banning the accounts and removing the content that violated those guidelines, as well as banning particular search terms. As content changes, we continue to work with expert partners, update our technology and review our processes to ensure we can respond to emerging and new harmful activities,” TikTok U.K. said in a statement to the Guardian.
A representative for TikTok U.S. assures Yahoo Life that the team behind the U.S. platform has taken similar action. “We recently introduced new ad policies that ban ads for fasting apps and weight loss supplements and place stronger restrictions on weight loss claims and references to body image. These types of ads do not support the positive, inclusive and safe experience we strive for on TikTok,” the representative says. “In addition, we do not show search results for terms related to eating disorders, and we continually update our safeguards to account for intentional misspellings and as terms/phrases evolve.”
TikTok’s September statement also made mention of steps users can take to ensure that they’re not being served certain content by reporting videos, blocking users, filtering comments and using features to tell TikTok that they’re “not interested” in seeing related content.
When it comes to ensuring that harmful content doesn’t make its way onto the app to begin with, however, Clara Guillem, a 24-year-old content creator who focuses on mental health and eating-disorder recovery, tells Yahoo Life that TikTok’s proactive measures aren’t enough. “The truth of the matter is that there are two different ways that pro-ED content makes its way to TikTok. One is the obvious: Using weight loss hashtags, and different misspellings of proana search terms (pr04n4, edthings, thinsparation),” Guillem says via email. “The other is not so obvious: Masking content as pro-recovery.”
Guillem explains that popular hashtags like #edrecovery will often be used to post content that might not be intended to harm viewers but does, by way of putting an eating disorder or behavior associated with it on display. “Before-and-after recovery photos, where the before photos can be used as ED inspiration,” she cites as an example. “Other content could include ‘bragging’ about eating-disorder symptoms or making ‘relatable’ posts like ‘you can only recognize these images if you’ve had an eating disorder’ followed by photos of crushed ice, apple cider vinegar, mint gum, fitness apps and others things that in a way end up teaching kids how to successfully starve themselves.”
Coming across this content, whether intentionally or not, can be extremely harmful, NEDA communications manager Chelsea Kronengold tells Yahoo Life. “With social media, it’s known that people are more likely to follow advice or follow trends from peers or people they perceive to be peers, so micro-influencers, or even your everyday influencers, more so than celebrities,” she says. “And so, these influencers that are health and wellness and diet, fitness, that have a large following, can cause harm and damage, because people are taking their likely nonmedical, nontraining advice at face value. So there are these wild fad diets and trends that people are taking as medical advice when it’s not.”
The power this content can hold over users is evident in the volume of responses it draws and the harmful communities it forms in turn.
It’s also not new to TikTok, as a number of social media platforms before it have worked to block similar dangerous content and communities. Previously popular hashtags like #thinspo found homes on Tumblr, Pinterest and Instagram before people advocated against them. “A lot of what I ‘learned’ to do actually came from Tumblr,” Guillem says of her own eating disorder, which she developed at age 14. “Eating disorders are so competitive, and on that website (and between my sick friends and I) there was always an invisible push to be the ‘sickest’ one. People would post their sickly bodies and habits under several different ‘pro-ana’ hashtags that kids like me would easily get sucked into. Even after the competition and Tumblr took absence from my life, the eating disorder stayed.”
And while Guillem thinks that TikTok should “completely ban any tags that can be used in this way,” TikTok’s safety policy manager, Tara Wadhwa, tells Yahoo Life that the app is wary of removing certain hashtags that also provide support.
“TikTok supports those who want to share their story and use their voice to raise awareness for eating disorders,” Wadhwa says. “Our policies aim to enable people to find support within communities on TikTok while also addressing and removing content that promotes eating habits that are likely to cause health issues.”
While the balance between enabling supportive communities on the app and identifying harmful content within them is a difficult one to strike, TikTok’s partnership with NEDA is meant to keep it on the right track. The short-form video app has even earned praise for elevating creators who post videos of themselves eating food and offer users a safe space to virtually join them, in an effort to provide meal support to young people with, or recovering from, an eating disorder. The app hopes not only to eliminate harmful content but also to use the partnership with NEDA to present users with helpful resources instead.
“We’ll soon begin redirecting searches and hashtags — for terms provided to us by NEDA, or associated with unsafe content we’ve removed from our platform — to the NEDA Helpline, where NEDA can then provide confidential support, tools and resources,” the TikTok rep says. “TikTok recently supported Weight Stigma Awareness Week by launching a dedicated page in our app to support NEDA’s #EndWeightHate campaign. This page was featured in our Discover page and educated our community about what weight stigma is, why it should matter to everyone and how someone can find support or support others who may be struggling.”
If you or someone you know is struggling with body image or eating concerns, NEDA’s toll-free, confidential helpline is available to help by phone (800-931-2237) and click-to-chat message. Crisis support is also available via text message by texting ‘NEDA’ to 741741.
TikTok Has a Pro-Ana Problem
The go-to social media platform for teens needs to improve its recommendation algorithm and partner with eating disorder experts.
“Pro-ana” communities—websites, blogs, forums, and social media spaces dedicated to promoting the worsening of eating disorders like anorexia—have been a fixture of the web more or less since its inception. So it’s no surprise that, as BuzzFeed reported last month, some TikTok users have found disturbing pro-ana content on their For You page, a personalized section of the platform that displays videos users are likely to enjoy.
Discovering, damage-controlling, and deleting pro-ana content has become a rite of passage for web companies. In 2001, Yahoo removed 113 pro-ana websites from its servers. MySpace, Tumblr, Instagram, Pinterest, Reddit, and many other social media platforms have faced pro-ana problems. This well-publicized history makes it frustrating that TikTok wasn’t better prepared, beyond claiming it doesn’t allow “content that promotes eating habits that are likely to cause health issues.” But now that TikTok’s policies are under a microscope, what guidance will the company take from a longer history of regulating online pro-ana communities, and exactly how worried should its users be?
Dr. Ysabel Gerrard is a lecturer in digital media and society at the University of Sheffield. Her research on social media content moderation has been featured in venues like The Guardian and The Washington Post. She also consults for social media companies, including Instagram.
The problem TikTok has right now is that its For You page is working exactly as it should: It gives users a personalized and therefore pleasurable experience by showing them what they likely want to see. I’ve previously written about the same problem playing out on Instagram, Pinterest, and Tumblr. Recommendation algorithms like this are the bread and butter of social media platforms. The happier you are on a platform, the likelier you are to stay, and if you stay, the company can retain your profitable data-generation.
But the problem—a problem most major social media companies have faced—is that recommendation algorithms aren’t really trained to make moral and health-related judgements about the kinds of content they recommend. Do you like cats? TikTok thinks you do, based on what you’re liking and searching for, so its algorithm will show you more cats. Yay cats! But the exact same formula applies to potentially harmful forms of content. Do you have anorexia? TikTok thinks you do, so here’s a bunch of triggering videos. Have at it!
In a recent BuzzFeed article, some TikTok users shared anecdotes of randomly receiving recommendations for pro-ana videos through their For You page. It is difficult to describe pro-ana behaviors without triggering readers, but they might involve sharing diet tips and purging methods, writing personal stories, and pairing up with a “buddy” to further encourage weight loss. We know from charities like Beat that eating disorder patients often report feeling “triggered” by certain images or words. If a TikTok user continuously sees triggering posts on their For You page, this could very well harm them. But one of the frustrations social media researchers have is that the inner workings of recommendation systems like the For You page are notoriously opaque, making it difficult to figure out why particular users see certain recommendations while others don’t. A recent New Media & Society article notes how social media users often create elaborate theories for figuring out how recommender systems work, what the author calls “algorithmic gossip.”
Without dismissing anyone’s claims about their For You recommendations, readers should know that users who are not engaging with videos related to eating disorders are highly unlikely to have them randomly recommended. A TikTok spokesperson explained that users can also adjust the content they see by, for example, “hearting” videos, clicking “not interested,” and following users. “In doing so, through time users will see more of the content they prefer.”
Whenever stories like BuzzFeed’s appear, I always worry that social media companies will respond by panicking and prohibiting all content relating to eating disorders, even if it’s about recovery or support.
Researchers have long known that social media and older online communities can offer support for people with stigmatized conditions like eating disorders. For example, Reddit’s decision to remove the r/proED sub in 2018 was met with outcry from community members who explained that, despite its name, the sub wasn’t actually used as a space to promote eating disorders and functioned more like a support group.
When moderated more appropriately, there’s no reason TikTok can’t offer an extra space for people to express their feelings and share their experiences in a highly creative way. TikTok could also become a helpful resource for people struggling with eating disorders. Secrecy is one of the hallmarks of an eating disorder, meaning social media sometimes exists as a sufferer’s only form of support. With this in mind, TikTok could develop genuinely useful eating disorder resources beyond sending users a list of contact details for local charities, “the 2020 equivalent of handing a teen a tri-fold brochure,” as psychiatrists Neha Chaudhary and Nina Vasan recently wrote in WIRED. Pinterest, for example, has pioneered a series of wellbeing exercises that it recommends to users searching for self-harm-related Pins.
The Paradox of Tik Tok Anti-Pro-Anorexia Videos
The literature shows that social pressure promotes non-suicidal self-injury (NSSI). Eating disorders, along with self-injury, are also fostered by underregulated social media. TikTok is one of the most used social media platforms among adolescents. It has been shown that the time young children spend …