Thinness as a Trend and the Role of the Algorithm: How Far Does the Responsibility Go?

A reflection on youth, thinness, and the influence of algorithms on social media and beauty standards.

The first version of this post is what motivated me to create this blog. You can check out a shortened version of it here (in pt-BR). When I published the first entry on the blog, I decided this would be the second one, especially because, in the meantime, I watched a movie that connects closely with the theme.

A few weeks ago, I watched Death Becomes Her, a film that tells the story of two women willing to do anything to reclaim the beauty and youth they once had. Recently, I saw The Substance, which follows the journey of an actress whose career crumbles as aging sets in. Desperate, she takes a mysterious substance that transforms her into a new version of herself.

These films made me think of a trend that has been circulating on TikTok. In this trend, girls, teenagers, and women share extreme practices, like spending the day only eating gelatin or even celebrating stomach aches, because they believe it will help them lose weight.

What do these three things have in common?

The endless pursuit of youth and thinness. In both the movies and the trend, it’s clear that many women are willing to do almost anything to become younger, thinner, and therefore “more beautiful and desirable.”

This thought leads me to question: how much of this is the algorithm’s fault? Should it be held responsible? And more importantly, can it be held accountable?

According to the platform's guidelines, the minimum age for using TikTok is 13 (or 14 in South Korea, Indonesia, and Quebec). Even so, it's common to find much younger children on the platform, exposed to harmful content and beauty standards. And the impact of this content isn't limited to children; it also affects teenagers and adult women who feel constant pressure to meet the thin, beautiful standards promoted on social media.

Although TikTok’s content guidelines prohibit the display of harmful weight control practices (like low-calorie diets and the use of weight-loss drugs), this content spreads widely and uncontrollably. In the U.S., for example, there is a version of TikTok for users under 13, which includes additional protections such as interaction restrictions and content evaluations for that age group. However, these protections don’t prevent harmful content from reaching other age groups, including teenagers and adults.

The Responsibility of the Algorithm

Like many other social networks, TikTok uses recommendation algorithms designed to keep users engaged by delivering content based on their interests and browsing behavior. In other words, the more you interact with beauty and diet content, the more the algorithm delivers similar posts.
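To make this feedback loop concrete, here is a minimal toy sketch of engagement-weighted recommendation. This is a hypothetical simplification for illustration only, not TikTok's actual system; the topic names, weights, and functions are all invented. It shows how a user who engages mostly with diet and beauty content ends up with a feed dominated by that same content.

```python
# Toy sketch of an engagement-driven feed (hypothetical, illustrative only).
# Each interaction with a topic increases that topic's weight in future feeds.
from collections import Counter
import random

def update_interests(interests: Counter, watched_topic: str, engaged: bool) -> None:
    """Reward topics the user engaged with; interest compounds over time."""
    if engaged:
        interests[watched_topic] += 2  # likes/shares/rewatches boost the topic

def next_feed(interests: Counter, catalog: list[str], k: int = 10) -> list[str]:
    """Sample the next k posts, weighted by accumulated interest."""
    # The "+1" keeps a small chance of showing unexplored topics.
    weights = [1 + interests[topic] for topic in catalog]
    return random.choices(catalog, weights=weights, k=k)

catalog = ["diet", "beauty", "sports", "music", "cooking"]
interests = Counter()

# Simulate a user who engages only with diet and beauty content.
for _ in range(50):
    update_interests(interests, random.choice(["diet", "beauty"]), engaged=True)

feed = next_feed(interests, catalog)
# The sampled feed now skews heavily toward "diet" and "beauty".
```

The loop is self-reinforcing by design: every click shifts the weights, and the shifted weights produce more of the same clicks. Nothing in this sketch evaluates whether the content is healthy; engagement is the only signal.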

How Can We Improve This Scenario?

It's important for social networks to invest in more responsible algorithms that filter out harmful content to protect the mental and physical health of users, especially younger ones. Additionally, educating users to recognize and question these standards is essential. Partnerships with mental health organizations and regulatory bodies to strengthen safety and digital health could be key steps.

As a society, we can promote a culture of acceptance and well-being by creating positive content and pushing for stricter and more responsible moderation on social media platforms. Society and platforms must work together to reduce the aesthetic pressure, prioritizing well-being and self-acceptance over unattainable standards.


What do you think about this topic? Have you ever stopped to think about the impact of the algorithm on our lives? Leave a comment with your opinion or share an experience you’ve had about this. Let’s talk and think together about how we can create a healthier and more inclusive internet!
