Death by algorithm

Last weekend I made cookies for the first time in as long as I can remember. I'm not a cook, far from it. And I don't have much of a sweet tooth. But I made them with my daughter. A new activity to fill the weekend, because we have already done plenty of play dough, watercolors, puzzles, yoga... and everything else I can think of to entertain and stimulate her.
We also watch movies, yes. We sit together on the sofa and comment on what's happening on the screen. Alas, the dreaded screens. I see families with children eating in restaurants, and there are the kids, with their iPhone or tablet, watching cartoons or playing video games. I also see them sitting with their phones on the swings in the park, rocking slowly without taking their eyes off the screen. Or in groups, quietly and endlessly scrolling, each on his or her own device. And however many new activities I invent, I think someday that will be us too. Because it seems increasingly far-fetched to keep them isolated from an inevitable reality. The digital world is not evil per se, just as the industrial revolution was not. But like any social change, we must try to understand it in order to use it as safely as possible. And that is our responsibility as mothers and fathers: to know where our children are going, and how and for what they are using these new technologies.
Meta (the company that owns Facebook, Instagram and WhatsApp) has just been sued by 41 states in the United States for building its products with addictive elements that intentionally harm younger Instagram and Facebook users, and for designing apps that could affect children's mental health. At this point, it would be a no-brainer to explain the responsibility of these companies for our addiction to the internet in general and social networks in particular. As bad as it may sound, in the dominant system the main objective of capitalist companies is to generate profits, and although we wish they also had a sense of social responsibility, in most cases it takes limits imposed by the public administrations of the day – or the courts, in this case – for such measures to be taken. Does Instagram have an algorithm designed to make us use its application for as long as possible? Does it use specific strategies to generate addiction in the youngest users, without considering the possible repercussions on their health? That is what we will have to regulate.

In 2017, a 14-year-old girl named Molly took her own life after months of consuming content on Instagram and Pinterest that encouraged self-harm and suicide. Her parents discovered that she had liked thousands of those posts only after her death. The algorithm worked like a charm, and Molly gained access to some of the so-called outlaw online communities. In the six months prior to her death, Molly shared, saved or liked 16,300 posts on Instagram. Of those, 2,100 (about 12 a day) were related to suicide, self-harm and depression, according to information shared by Meta with the family after years of requests that had gone unanswered on grounds of privacy and data protection. Even her farewell letter contained excerpts almost identical to quotes found in those posts. Did social media act as a catalyst in this case? Of course it did. Do the parents share responsibility? Unfortunately, they do. Could it be that the parents were not sufficiently equipped to understand the risks of social media use among children and teens? Let's not be hypocrites; let's not judge. It could have happened to us. The key is to understand that we do not understand. That we are missing pieces to complete such a complex puzzle. And that technology companies clearly do not want us to have them.
In September 2021, a Wall Street Journal report titled The Facebook Files leaked an internal study conducted by Facebook researchers. In it, the researchers confirmed that Instagram can make teens feel worse about their bodies. This shouldn't come as a surprise either. I imagine that, like me, many of you reading this page can put yourselves in their shoes and think about how having social networks would have affected us at that turbulent time in our lives, when you don't quite know who you are, who they expect you to be, or who you want to be. It was already such a vulnerable time, with only a handful of references from the immediate environment, movies, music and literature, that I pity the child I would have been with so many external stimuli constantly questioning you in every area of your life and your very self. Thank goodness we didn't have social networks. But we have to understand that society has changed drastically. The report in question concluded that 32% of teenage girls said that when they felt bad about their bodies, Instagram made them feel worse. It also stated that 13.5% of teenage girls said Instagram increased their suicidal ideation, and 17% said it made their eating disorders worse. And I cite a lot of data because it seems necessary in order to open our eyes. Because I have been shocked reading so many horrors while writing this article. Because the situation is critical, and that is what we in the media have to convey. And because the companies in question continue to publicly deny both their intentions and the devastating effects of their products. Molly's case, after years of legal battle by her family, was closed with the verdict that she had died from "an act of self-harm under depression and the negative effects of online content". The judge ruled that the internet "adversely affected her mental health and contributed to her death significantly." No financial compensation was sought or awarded.
Her parents only wanted to expose the dangers of social networks.
And the king of social networks among children and teenagers is undoubtedly TikTok. According to the report From Alpha to Zeta: Educating the Digital Generations, by the digital security company Qustodio, in 2022 minors in Spain spent an average of four hours a day connected to screens outside the classroom, with 96 minutes a day spent on their favorite network, the Chinese-owned TikTok; almost twice as much as they devoted to Instagram. I know next to nothing about that social network. As the Millennial that (I think) I am, I don't understand it much. But I try to inform myself so it doesn't catch me by surprise. And what I read scares me. Not only because the short-video format shortens attention spans, or because of the absolute nonsense that racks up millions of views. Or because of the new 'stars' with tens of millions of followers who are role models for little or nothing. It is, above all, because of the brazenness with which they target the most vulnerable population: children, pre-teens and teenagers searching for references to place themselves in the world. In theory, children under 13 cannot use the app (in the United States there is a version for children with controlled content). TikTok states bluntly: "We have made a firm commitment to ensure that TikTok provides a safe and positive experience for users under the age of 18 (hereafter, 'minors'). This starts with making sure they are old enough to use TikTok. To have an account, you must be at least 13 years old. The safety of minors is our priority. We do not allow content that may expose underage users to a risk of exploitation or psychological, physical or developmental harm.
This includes material that depicts child sexual abuse, child molestation, harassment, dangerous activities and challenges, exposure to overtly adult themes, and use of alcohol, tobacco, drugs or other controlled substances." However, one of TikTok's infamous viral challenges, the Blackout Challenge, which involved choking oneself or another person to reach a brief state of euphoria and posting it online for other users to see, has been linked to the deaths of at least 15 children aged 12 or younger. One of them, 10-year-old Italian Antonella Sicomero, hanged herself from a towel rack at home with the belt of a bathrobe. Reports said the girl, whose school was closed because of covid-19, spent up to ten hours a day on the app and had claimed to be 13 in order to create an account. One thing I fail to understand is how, knowing our children are not old enough to be on these networks, we can see that they have an account. Or do we not know? Because, first, they have lied to us, and second, we are putting their safety at risk. Granted, they spend a lot of time on their phones, but we should at least know what they are consuming. Not policing them, but not closing our eyes or looking the other way either. And I won't even get into the number of hours that particular girl spent online, because I would like to think it is an exception. But it is unacceptable. An executive who worked for the Chinese company says that while TikTok's official message is that user safety is its biggest concern, the truth is that its priority is growing the app: "Making more money is the number one priority." Are we surprised? I don't think so. Just as it is not surprising, though it is shocking, that the top executives of Silicon Valley tech companies strictly forbid their own children to use phones, tablets and social networks. Back in 2011, Steve Jobs stated that his children could not use his company's recent creation, the iPad.
And Bill Gates did not allow his children to have a phone until they turned 14. In recent years, members of the tech elite have drawn up contracts for caregivers prohibiting them from exposing children to screens and, of course, from using electronic devices themselves during working hours. I imagine it is something like what the top executives of food companies do: while they flood us with all kinds of products full of fats, toxins and preservatives bearing the smiling face of Mickey Mouse or the stars of PAW Patrol, they feed their own children a plant-based diet that protects their microbiota. Or the executives of Philip Morris who, while producing cigarettes with addictive components that kill people the world over, warn their own children of the terrible risks of tobacco. They do have all the pieces of the puzzle. And if they ban it or limit it, there's a reason.
In my case, we have time. We can keep baking cookies, learning the alphabet and painting watercolors for several more years. And in the meantime, we will be educating ourselves and trying to find strategies to deal with the risks that the online society poses for the youngest. So that it doesn't catch us off guard. But it will. It is already catching up with us, and it will get worse if we don't do something. First, demand accountability. But also take responsibility ourselves. A study by the University of Pennsylvania concluded that excessive use of Facebook, Snapchat and Instagram increased feelings of loneliness, and that reducing time on social networks to 30 minutes a day could produce a very significant improvement in people's well-being. So says the University of Pennsylvania, and so says common sense: don't overdo it. Not social networks, not sugar, not anything. Neither the kids nor us. As my father always tells me, everything is good in its right measure. And since the kids cannot grasp what that measure is, because they are still forming their personality and their sense of right and wrong, and since there are those who take advantage of that, we have to be there, accompanying them and understanding that the same thing would have happened to us (and may still happen to us, even as adults). What I have found over the years almost always works is leading by example. And we can do that. Because it doesn't take much knowledge of technology or social networks.
Article published in the opinion section of El Diario Vasco on November 5, 2023.