Social Media Bans Could Deny Teenagers Mental Health Help

Social media's effects on the mental health of young people are not well understood. That hasn't stopped Congress, state legislatures, and the U.S. surgeon general from moving ahead with age bans and warning labels for YouTube, TikTok, and Instagram.


But the emphasis on fears about social media may cause policymakers to miss the mental health benefits it provides teenagers, say researchers, pediatricians, and the National Academies of Sciences, Engineering, and Medicine.

In June, Surgeon General Vivek Murthy, the nation's top doctor, called for warning labels on social media platforms. The Senate approved the bipartisan Kids Online Safety Act and a companion bill, the Children and Teens Online Privacy Protection Act, on July 30. And at least 30 states have pending legislation relating to children and social media, ranging from age bans and parental consent requirements to new digital and media literacy courses for K-12 students.

Most research suggests that some features of social media can be harmful: Algorithmically driven content can distort reality and spread misinformation; incessant notifications distract attention and disrupt sleep; and the anonymity that sites offer can embolden cyberbullies.

But social media can also be helpful for some young people, said Linda Charmaraman, a research scientist and director of the Youth, Media & Wellbeing Research Lab at Wellesley Centers for Women.

For children of color, LGBTQ+ young people, and others who may not see themselves represented broadly in society, social media can reduce isolation, according to Charmaraman's research, which was published in the Handbook of Adolescent Digital Media Use and Mental Health. Age bans, she said, could disproportionately affect these marginalized groups, who also spend more time on the platforms.

"You think at first, 'That's terrible. We need to get them off it,'" she said. "But when you find out why they're doing it, it's because it helps bring them a sense of identity affirmation when there's something lacking in real life."

Arianne McCullough, 17, said she uses Instagram to connect with Black students like herself at Willamette University, where about 2% of students are Black.

"I know how isolating it can be feeling like you're the only Black person, or any minority, in one space," said McCullough, a freshman from Sacramento, California. "So, having someone I can text real quick and just say, 'Let's go hang out,' is important."


After about a month at Willamette, which is in Salem, Oregon, McCullough assembled a social network with other Black students. "We're all in a little group chat," she said. "We talk and make plans."

Social media hasn't always been this useful for McCullough. After California schools closed during the pandemic, McCullough said, she stopped competing in soccer and track. She gained weight, she said, and her social media feed was constantly promoting at-home workouts and fasting diets.

"That's where the body comparisons came in," McCullough said, noting that she felt more irritable, distracted, and sad. "I was comparing myself to other people and things that I wasn't self-conscious of before."

When her mother tried to take away the smartphone, McCullough responded with an emotional outburst. "It was definitely addictive," said her mother, Rayvn McCullough, 38, of Sacramento.

Arianne said she eventually felt happier and more like herself once she cut back on her use of social media.

But the fear of missing out eventually crept back in, Arianne said. "I missed seeing what my friends were doing and having easy, fast communication with them."

[Photo caption: Arianne McCullough (left) and her mother, Rayvn, of Sacramento, California, support social media legislation that would require platforms like YouTube, Instagram, and TikTok to be more transparent about the effects of their products on adolescent mental health. (Rayvn McCullough)]

For a decade before the covid-19 pandemic triggered what the American Academy of Pediatrics and other medical groups declared a national emergency in child and adolescent mental health, greater numbers of young people had been struggling with their mental health.

More young people were reporting feelings of hopelessness and sadness, as well as suicidal thoughts and behavior, according to behavioral surveys of students in grades nine through 12 conducted by the Centers for Disease Control and Prevention.

The greater use of immersive social media, like the never-ending scroll of videos on YouTube, TikTok, and Instagram, has been blamed for contributing to the crisis. But a committee of the national academies found that the relationship between social media and youth mental health is complex, with potential benefits as well as harms. Evidence of social media's effect on child well-being remains limited, the committee reported this year, while calling on the National Institutes of Health and other research groups to prioritize funding such studies.

In its report, the committee cited legislation in Utah last year that places age and time limits on young people's use of social media and warned that the policy could backfire.

"The legislators' intent to protect time for sleep and schoolwork and to prevent at least some compulsive use could just as easily have unintended consequences, perhaps isolating young people from their support systems when they need them," the report said.

Some states have considered policies that echo the national academies' recommendations. For instance, Virginia and Maryland have adopted legislation that prohibits social media companies from selling or disclosing children's personal data and requires platforms to default to privacy settings. Other states, including Colorado, Georgia, and West Virginia, have created curricula about the mental health effects of using social media for students in public schools, which the national academies also recommended.

The Kids Online Safety Act, which is now before the House of Representatives, would require parental consent for social media users younger than 13 and impose on companies a duty of care to protect users younger than 17 from harm, including anxiety, depression, and suicidal behavior. The second bill, the Children and Teens Online Privacy Protection Act, would ban platforms from targeting ads toward minors and collecting personal data on young people.

Attorneys general in California, Louisiana, Minnesota, and dozens of other states have filed lawsuits in federal and state courts alleging that Meta, the parent company of Facebook and Instagram, misled the public about the dangers of social media for young people and ignored the potential damage to their mental health.

Most social media companies require users to be at least 13, and the sites often include safety features, like blocking adults from messaging minors and defaulting minors' accounts to privacy settings.

Despite existing policies, the Department of Justice says some social media companies don’t follow their own rules. On Aug. 2, it sued the parent company of TikTok for allegedly violating child privacy laws, saying the company knowingly let children younger than 13 on the platform, and collected data on their use.

Surveys show that age restrictions and parental consent requirements have popular support among adults.

NetChoice, an industry group whose members include Meta and Alphabet, which owns Google and YouTube, has filed lawsuits against at least eight states, seeking to stop or overturn laws that impose age limits, verification requirements, and other policies aimed at protecting children.

Much of social medias effect can depend on the content children consume and the features that keep them engaged with a platform, said Jenny Radesky, a physician and a co-director of the American Academy of Pediatrics Center of Excellence on Social Media and Youth Mental Health.

Age bans, parental consent requirements, and other proposals may be well-meaning, she said, but they do not address what she considers to be the real mechanism of harm: business models that aim to keep young people posting, scrolling, and purchasing.

"We've kind of created this system that's not well designed to promote youth mental health," Radesky said. "It's designed to make lots of money for these platforms."

Chaseedaw Giles, KFF Health News digital strategy & audience engagement editor, contributed to this report.

Daniel Chang: dchang@kff.org, @dchangmiami