Last week, New York State legislators put a stop to personalized social media feeds for the under-18 set, citing mental health harms. The law they passed takes an unusual approach to the challenging task of regulating social media: it focuses on algorithms, rather than platforms or specific content. Lawmakers say the algorithms are addictive.

It's a much more specific argument than the one made by Surgeon General Vivek Murthy, who recently called for a warning label to be applied more broadly to social platforms. (Spoiler alert: he won't be able to do anything without Congress.) Meta, TechNet and NetChoice did not immediately respond to a request for comment on the new law.

So what does the science actually say about all this? Research mostly agrees. But the way the research explains addictive algorithms and their harms is more of a constellation than a straight line. Social media companies aren't transparent about how their algorithms work, so researchers have had to study them indirectly.

Let's unpack how addiction is defined: it's when someone uses something, whether a substance or the internet, in a way that interferes with their normal daily life. Rates of teen social media addiction vary greatly by country. A study out of Turkey found that a quarter of teens meet this description, and another found that in Southeast Asian countries the rate can be as high as 50 percent of teens. Researchers estimate that in the U.S. and Europe, around 5 percent of teens may be addicted to social media.

So what's keeping kids scrolling? Testimony from former Meta employees tells us that content algorithms are designed to keep kids glued to their phones, and past research shows that negative content is often the most engaging. More time on social media is correlated with worse mental health. Worryingly, research shows that part of what teens like about the algorithms is that they mirror their sense of self.
While kids think they scroll past content that doesn't jibe with their sense of self, researchers find that they not only watch that content, they start to question whether it might actually be reflective of them.

Those algorithms appear to be succeeding in keeping usage high. On average, teens spend 4.8 hours a day on social media, according to a 2023 Gallup poll. Researchers agree that teens who spend more than three hours a day online are twice as likely to experience depression or anxiety.

“We know the notifications, and we know the endless scroll and we know the recommended content — all of those are very much designed to keep people engaged for longer than they would have otherwise,” said Mitchell Prinstein, chief science officer at the American Psychological Association, which has been advocating for laws that curb the ill effects of social media on kids.

As Murthy noted in his widely read opinion column calling for warning labels on social media, kids' brains, with their underdeveloped impulse control, are no match for these systems.

What makes feeds tailored to an individual's preferences so toxic has less to do with personalization and much more to do with how companies hack engagement. Not only are social media algorithms keeping people on longer, platforms may also be exposing teens to certain types of negative content that keep their attention.

But, but, but: We don't know as much about the science of social media algorithms as we would like, Prinstein said, because social media companies do not share information about how they work. To better understand how to regulate social media, we need to know more, he said. Still, he said, researchers have come to believe that algorithms push fearful or angry content as a way of increasing engagement.

“If they were recommending content about how to solve conflicts and study well, it would be a very different situation,” said Prinstein.
“But we know that that's not the content that's being recommended.”

A U.K. study on the effects of algorithms on wellbeing found that one in five 11- to 16-year-olds had been exposed to content showing them how to cut themselves, eat small meals to be thinner, or engage in bingeing and purging behaviors. Another study found that seeing self-harm content leads to self-harm and to increased connection with others who self-harm.

Whether or not the science is firm, legal experts aren't sure a ban on algorithms will hold up in court. “The bill effectively imposes blanket parental consent requirements for minors to access constitutionally protected speech across many websites and apps and courts have invalidated similar efforts to limit speech by minors and speech to minors,” said attorney Mark Brennan at Hogan Lovells.

The industry hasn't challenged New York's law yet, but don't expect companies to take this idea lying down. Algorithms, really effective ones, are central to platforms' business models. Any law cutting off recommendations stands to have a big impact on social media companies' bottom lines.

“The longer someone is on, the more ads they can show them,” said Vineet Kumar, an associate professor of marketing at Yale University who focuses on digital technologies. Hypothetically speaking, he said, if the new law cuts engagement from an hour down to five minutes, that could have a huge impact on how much time kids spend online.

New York's law also stops platforms from sending kids notifications about happenings on the platform between midnight and 6 a.m. Kumar liked this idea as well, since limiting notifications during certain hours could reduce kids' fear of missing a post, perhaps solving another great affliction of being young: FOMO.