By Jean Twenge, Professor of Psychology, San Diego State University
Right at the time social media became popular, teen mental health began to falter. Between 2010 and 2019, rates of depression and loneliness doubled in the U.S. and globally, teen suicide rates in the U.S. soared, and emergency room admissions for self-harm tripled among 10- to 14-year-old girls.
Social scientists like me have been warning for years that the ubiquity of social media might be at the root of the growing mental health crisis for teens. Yet when Facebook CEO Mark Zuckerberg was asked during a congressional hearing in March to acknowledge the connection between social media and these troubling mental health trends, he replied, “I don’t think that the research is conclusive on that.”
Just six months later, The Wall Street Journal reported that Facebook had been doing its own research for years on the negative effects of Instagram, the company’s photo-sharing app popular with teens and young adults. Six internal documents summarizing the research, leaked by a whistle-blower, were posted in full on September 29, 2021.
The details in the 209 pages are revealing. They suggest not only that Facebook knew how Instagram could be harmful, but that the company also was aware of possible solutions to mitigate those harms. Facebook’s own research strongly suggests that social media should be subject to more stringent regulation and include more guardrails to protect the mental health of its users. There are two primary ways the company can do this: enforcing time limits and increasing the minimum age of users.
A ticking time bomb for mental health
Academic research shows that the more hours a day a teen spends on social media, the more likely she or he is to be depressed or to self-harm. That is important because many teens, especially girls, spend large amounts of time on social media.
One study in the U.K. found that one-quarter of 15-year-old girls spent more than five hours a day using social media – and 38% of those girls were clinically depressed. Comparatively, among girls who used social media less than one hour a day, only 15% were depressed.
Although the internal Facebook research did not examine links between time on Instagram and mental health, the researchers did ask teens what they considered the worst aspects of Instagram. One of the things teens disliked most about the app was how much time they spent on it.
Teens, the report said, had “an addict’s narrative about their use. … They wish they could spend less time caring about it but they cannot help themselves.”
They knew they were spending too much time online, but had a hard time controlling how much time they spent. One-third of teens suggested Instagram should remind them to take a break or encourage them to get off the app.
That would be a step in the right direction, but simple nudges might not be enough to get teens to close the app and keep it closed. And while parents can already set time limits using the parental controls included on most smartphones, many of them do not know how to use these controls or are unaware of how much time their teens are spending on social media.
So better regulations might need to put teeth into time limits, such as limiting the number of hours teens under 18 can spend on social media apps. A blackout period overnight might also be useful, as many teens use their smartphones at night when they should be sleeping.
ID, please
One internal Facebook study of more than 50,000 people from 10 countries found that half of teen girls compare their appearance to others’ on Instagram. Those appearance-based comparisons, the study found, peaked when users were 13 to 18 and were much less common among adult women.
This is key, as body image issues seem to be one of the biggest reasons why social media use is linked to depression among teen girls. It also dovetails with research I reported in my book, “iGen,” finding that social media use is more strongly linked to unhappiness among younger teens than older ones.
This suggests another avenue for regulation: age minimums. A 1998 law, the Children’s Online Privacy Protection Act, effectively sets the minimum age for social media accounts at 13. That limit is problematic for two reasons. First, 13 is a developmentally challenging age, right as boys and girls are going through puberty and bullying is at its peak.
Second, the age minimum is not regularly enforced. Kids 12 and under can simply lie about their age to sign up for an account, and they are rarely kicked off the platform for being underage. During a Facebook event with Instagram head Adam Mosseri, the young celebrity JoJo Siwa noted she had been using Instagram since she was 8 years old, forcing Mosseri to acknowledge that it is easy to lie about your age.
The problem is how to enforce an age limit online for a population that is too young for IDs. Raising the minimum age to create a social media account to 16, 17 or 18 could solve two problems at once: It would prevent kids from signing up until they are a bit more developed and mature, and it would be easier to enforce. For example, potential users might be asked to submit a photo of their state-issued ID, which most teens have by 16.
Verifying age would also make it easier to construct a safer app for younger users that might, say, hide follower counts or restrict access to celebrity accounts, both of which Facebook’s research found negatively affected girls’ body image.
Curtailing the fear of missing out
It is tempting to think regulations like these would cause teens to riot in the streets – after all, they love keeping up with their friends on social media. But the teens interviewed by Facebook for its internal research were well aware of social media’s downsides.
“The reason why our generation is so messed up and has higher anxiety and depression than our parents’ is because we have to deal with social media. Everyone feels like they have to be perfect,” one teen girl told the researchers. Other teens have spoken publicly about the negative effects of social media.
More stringent regulation would help with another issue teens know all too well: the unwritten mandate to use social media or be left out.
“Young people are acutely aware that Instagram can be bad for their mental health yet are compelled to spend time on the app for fear of missing out,” Facebook’s internal research concluded.
If age limits were enforced, much of the peer pressure to be on social media would vanish: few if any classmates would be there. Regulating time on the app could also help if teens knew their friends would not constantly be online.
Facebook’s research demonstrates something else: The company was aware of the issues with Instagram but chose not to set these limits itself. Congress is now considering taking action. Until it does, it will be up to parents and teens themselves to set limits. That will not be easy, but teens will be safer for it.
Originally published on The Conversation as “Facebook’s own internal documents offer a blueprint for making social media safer for teens.”