Social media algorithms mean viewing benign content related to fitness or healthy eating can escalate to something more sinister. Photo / Getty Images
Young women are being exposed to dangerous diet and exercise advice – even when actively trying to avoid it.
I did not go looking for Liv Schmidt’s videos on Instagram, but they soon found me. Schmidt, a lithe, blonde 23-year-old based in New York, is an influencer who shares weight-loss tips with an audience of hundreds of thousands of followers.
The fact that Schmidt’s videos appeared in my feed in the first place is testament to how, through the mysterious workings of social media algorithms, viewing benign content related to fitness – as I was doing – or to weight loss and healthy eating can escalate into being served something more sinister.
On TikTok, as I started viewing fitness videos, the results that began showing up on my “For You” page (FYP) – a feed that TikTok populates with videos its algorithm thinks you will be interested in – were more concerning still. Almost instantly, it became filled with “model secrets to stay thin”, food diaries that amounted to less than 800 calories per day and, most disturbingly, painfully thin young women showcasing hip bones, collarbones and ribs.
Welcome to “SkinnyTok”, where users post extreme weight-loss tips and “thinspiration” (images of very thin women meant to be aspirational). This type of content isn’t new, of course, nor is it limited to TikTok. But on that platform – widely considered to be the most popular app among young people, with over 1.5 billion monthly users – these influencers have experienced a particularly meteoric rise in popularity and reach. And it is Schmidt, with her 326,000 Instagram followers, who is the unofficial leader of the movement.
As well as walking daily and eating in a calorie deficit, Schmidt’s accounts have shared (and in some cases since removed) controversial dieting advice, such as drinking water or tea to suppress appetite, eating meals from side plates and following something called the “three-bite rule”: eating just three bites of something you fancy, then leaving the rest. (In a restaurant, she says she “tastes everything and finishes nothing”.) It is the kind of weight-loss talk that would be more at home in 2005, when the aesthetic ideal was still a hangover from the super-thin models of the 1990s, than in 2025.
Schmidt’s TikTok account, which had amassed more than 670,000 followers, was banned for violating the platform’s community guidelines in September last year (the hashtag SkinnyTok has also since been blocked, and if a user searches it, they are directed to “expert resources”). In an interview at the time, she said that “weight is a touchy topic, but that’s what the viewers want”. And despite the ban, her content soon reappeared. It was re-shared by other accounts on TikTok and posted to Instagram and her fledgling YouTube channel, where she has active accounts with 325,000 and 100,000 followers respectively.
“Being skinny is literally a status symbol,” she said, in a now-deleted video that is still doing the rounds online. “You’re living life on hard mode being fat… you’re wondering why the bouncer won’t let you in? Check your stomach. You’re wondering why… this job isn’t taking you? Look at yourself.” On Instagram, she captioned a recent photo of her in a bikini with the phrase, “nothing tastes as good as being this effortless feels” – seemingly a direct reference to Kate Moss’s now-infamous mantra, “nothing tastes as good as skinny feels”.
It was there on Instagram, the photo-sharing app owned by Mark Zuckerberg’s Meta, that Schmidt’s videos appeared in my feed. (Instagram has now banned Schmidt’s account from using monetisation tools and it is hidden from users under 18.) This type of content isn’t new; for as long as there has been social media, there have been weight-loss and even “pro-ana” – pro-anorexia – communities within its ecosystem. But they were just that, hidden: on the blogging site Tumblr and in obscure forums and chatrooms. What has changed in the era of GLP-1 weight-loss drugs is that thin is back in fashion, and with it comes a new wave of pro-anorexia content. This time, it is hidden in plain sight.
Schmidt’s intentions may well be nakedly commercial. You can buy her “New Me” diet tracker for US$50 ($85), and her “skinny essentials”, which include resistance bands and fat-free salad dressing, from Amazon. She also runs a members-only group chat, which you can join for a fee, called “the Skinni Société”. Or her tough-love rhetoric may be a cynical ploy to farm engagement – as she has said herself, videos that merely mention her name get “millions” of views as a result. But, judging by the comments, some of her followers take her advice as gospel. And on TikTok, there is no shortage of other creators like her.
The app’s powerful algorithm can send users down a rabbit hole of content within a niche, meaning videos promoting extreme dieting techniques can end up being fed to teenagers. Regulators are taking note – in fact, one French government minister is seeking to ban such content once and for all. Clara Chappaz is the Minister Delegate for Artificial Intelligence and Digital Technologies in the Government of Prime Minister François Bayrou. In April, she reported “#SkinnyTok” to France’s audiovisual and digital watchdog, and to the EU, over concerns that it is promoting anorexia.
“These videos promoting extreme thinness are revolting and absolutely unacceptable,” she said. “Digital tools are marvellous in terms of progress and freedom, but badly used they can shatter lives… the social networks cannot escape their responsibility.” The European Commission opened a probe into TikTok’s algorithm and how it affects minors last February, under the bloc’s content moderation rulebook, the Digital Services Act. As part of it, it began investigating how the platform promotes content relating to eating disorders.
French politician Clara Chappaz said ‘social networks cannot escape their responsibility’ to crack down on eating disorder content. Photo / Getty Images
Point de Contact, an organisation recently named by regulator Arcom as a “trusted flagger” of harmful digital content, also confirmed their teams are looking into the matter in co-ordination with authorities, as reported by Politico. “The difficulty is to prove that the content is illegal, and that the message is directly targeted at minors,” a Point de Contact spokesperson says. “But it’s certain that TikTok isn’t scanning this hashtag fast enough.”
Experts are clear that eating disorders have no single cause, but there is a growing body of research that suggests – perhaps unsurprisingly – that exposure to this kind of online content could be a factor in fuelling or exacerbating disordered eating. Researchers have studied the impact it can have on young women’s body image and concluded that it can cause “psychological harm even when explicit pro-ana content is not sought out and even when their TikTok use is time-limited in nature”.
Dr Victoria Chapman is a child and adolescent psychiatrist at the Royal Free Hospital in London who specialises in eating disorders. She says that weight loss social media content often comes up in her clinical practice. “When we meet patients, when we do assessments, we quite often ask what they’re doing on social media,” she says. “My view – and I think there’s increasing evidence for this – is that these platforms that focus on image [such as TikTok and Instagram] are associated with the risk factors that make someone vulnerable to an eating disorder.”
More worrying still is the rise in severe mental illness among children and young people. The number of children admitted to acute hospital wards in England because of serious concerns over their mental health has risen by 65% in a decade, according to a study published in the Lancet Child & Adolescent Health journal. Over half of those admissions – 53.4% – were for self-harm, but the number of annual admissions for eating disorders surged over the same period, from 478 to 2938.
“We know that a combination of genetics, biological factors and sociocultural factors contribute to the development of an eating disorder,” says Umairah Malik, the clinical manager for Beat, an eating disorder charity. “Some of these sociocultural factors include low self-esteem, body dissatisfaction… alongside things like anxiety, depression and perfectionist traits. If someone is already vulnerable to developing an eating disorder, [social media] has the potential to be really harmful and damaging.”
Chappaz may be fighting a losing battle – it seems that, even when social media platforms attempt to crack down on pro-anorexia content, it is impossible to stem the flow. When I first started researching this not-so-hidden online world, I found it wasn’t difficult to access. If you search an obvious term on TikTok, such as “skinny” or “anorexia”, a cartoon heart and a support message appear, with links to mental health and eating disorder resources. TikTok does not allow content showing or promoting disordered eating or dangerous weight-loss behaviours, and age-restricts content that idealises certain (thin) body types.
But despite these safety measures, users may circumvent content filters by using covert hashtags – misspelt words, for instance, or abbreviations – and by speaking in code. Moreover, some concerning videos seem to be hiding in plain sight: under the seemingly benign “weight loss” tag, videos promoting extreme dieting and unhealthy body weights appear.
Amy Glover, a 29-year-old writer in recovery from an eating disorder, discovered that it was almost impossible to avoid this kind of content on TikTok, no matter how hard she tried. “It feels to me like any food or exercise-related search I make [on social media] eventually leads to weight-loss content, and [it’s] quite often not what I would consider healthy advice,” she says. “I wonder if the algorithm is simply ‘testing’ more controversial content on me. I find that frustrating and worrying… Eating disorders can be competitive and involve a lot of negative self-talk, which I feel a lot of these videos encourage.”
Research conducted by Beat has shown that, even if harmful social media content doesn’t directly cause eating disorders, it can easily exacerbate them. “It goes beyond young people,” Malik says. “We did a survey in 2022, looking at online platforms – of the people who answered, over 90% of those with experience of having an eating disorder had encountered content online that was harmful in the context of that eating disorder. People talked about it being addictive, and not having control over the content that was being displayed.
“That kind of content could be actively encouraging, promoting or glamourising an eating disorder, but then you also have things like diet culture, fitness and weight-loss content that can [also] be really harmful for people,” she adds.
A TikTok spokesperson says: “We regularly review our safety measures to address evolving risks and have blocked search results for #SkinnyTok since it has become linked to unhealthy weight loss content. We continue to restrict videos from teen accounts and provide health experts and information in TikTok Search.”
Glover is four years into her recovery, and now in her late 20s, which she says makes it easier. But these algorithms – which seem to be fine-tuned to pick up on the slightest hint of body insecurity – could be force-feeding these videos to girls far younger than her. TikTok, of course, is where they spend much of their time: Ofcom research earlier this year found 96% of 13- to 17-year-olds in the UK are on social media.
While the app takes measures to shut down dangerous hashtags (as it did with #legginglegs, another tag related to disordered eating, earlier this year), it is like playing whack-a-mole, as more content springs up to evade the platform’s safety features. What is clear, though, is that its preternatural algorithm can make this worse, serving up potentially dangerous content to those who aren’t even looking for it.
“Even after reporting harmful content or attempting to avoid it, users often still see more being recommended to them, or popping up without warning. We’d like to know what platforms plan to do about recommended content and algorithms,” says Tom Quinn, Beat’s director of external affairs.
“We know that people who create and share this kind of content are often unwell themselves... but we’d like to see more proactivity and extensive bans on damaging content being uploaded or shared. Alongside this, we want to see platforms working with eating disorder experts to improve moderation efforts and ensure that recovery-positive, support-based content is widely available,” he adds.
In her own defence, Schmidt has said, “We all have the option to follow and block any content we want.” But when you’re a teenager, and potentially a vulnerable one, should the social media platforms be doing more to block it for you? Some lawmakers now certainly think so – and few parents would disagree.