Spring of 2009, New York. At that time, the city's beauty standards were inescapable. As a young female student, I constantly felt pressure to conform. I worked out excessively and strictly monitored my diet. One Saturday morning, as I did every week, I went to the gym. But that day, something felt different. I wasn't myself. Despite the discomfort, I pushed on—missing a session felt unthinkable—until I nearly collapsed. My instructor stopped me, looked me in the eye, and told me plainly: "You need to take care of yourself first."
Struggling with an eating disorder is difficult enough offline. In today's digital world, the triggers are nearly impossible to escape. According to a recent survey, 70% of girls aged 12 to 19 report that social media harms their mental health. On top of that, young people are bombarded with 6,000 to 10,000 ads per day, many promoting unattainable beauty ideals.
The business model of social media doesn’t merely mirror societal pressures—it amplifies them. Predictive algorithms lock users into feedback loops, curating content that reinforces harmful norms. While the EU’s Digital Services Act (DSA) has introduced a strong legal framework requiring large online platforms to operate responsibly within the single market, one question remains: are they complying?
Most platforms are designed to keep users engaged, whatever the cost. Their algorithms analyze user behavior and serve up content accordingly. A single video about dieting can lead to a homepage brimming with extreme fitness videos, pro-anorexia content, and unhealthy messaging. The Center for Countering Digital Hate (CCDH) found that YouTube recommends harmful eating disorder-related videos to simulated 13-year-old users after just one initial search. This is no coincidence: it is the result of deliberate algorithmic design.
Curious and concerned, I ran a small experiment. I followed a popular social media account with over 200,000 followers that subtly promotes eating disorders. It used coded language ("skinni" instead of "skinny") and glorified unhealthy weight loss. Despite multiple reports, the platform refused to take it down, maintaining that the account did not violate its Community Standards on eating disorders. Technically, it may be right. But technicalities offer no protection to the vulnerable teens who see that content and relapse.
The DSA was supposed to mark a turning point. It recognized that platforms carry the responsibility of ensuring recommender systems—those algorithms that control content visibility—do not impair users’ autonomy. It also emphasized the necessity of protecting minors from harm. Yet, addictive design features continue to manipulate users, using toxic content and rabbit-hole mechanics to keep them engaged and vulnerable.
The European Commission is currently investigating TikTok’s role in digital addiction, and France has launched a parliamentary inquiry into whether the platform promotes self-harm and hypersexualized content to youth. TikTok is only one example of a much larger issue.
So, what's still missing? Lax enforcement is part of the problem, but the push for safer digital environments must go further. In July 2024, European Commission President Ursula von der Leyen announced the first EU-wide inquiry into the impact of social media on youth well-being, with a particular focus on addictive platform designs. I have since asked the Commission for more information. Whatever the results, one thing is certain: this inquiry must directly inform the forthcoming Digital Fairness Act (DFA).
The EU leads global efforts to regulate the digital space, with landmark laws like the DSA and the Artificial Intelligence Act. As the DFA takes shape, our top priority should be the mental health and well-being of children and young people. The legislation must tackle addictive platform design head-on, establishing enforceable rules against malicious algorithmic practices and digital dependence, including a clear list of prohibited techniques and mechanisms. The DFA should also demand transparency and accountability in content prioritization, ensuring young users are not manipulated into consuming harmful or exploitative content.
Sixteen years ago, someone had the courage to intervene when I was pushing myself too far. Back then, stepping away was as simple as not walking through the gym's doors. Today, the promotion of hazardous eating habits happens online, where there are no physical doors to close. Harmful content follows users everywhere. That is why we urgently need to regulate the algorithmic forces that shape how children perceive themselves and their worth.