By Katie Joseff
At Common Sense Media, our research has long shown that social media can harm kids’ mental health, especially for kids already struggling with depression or other mental health issues. We have been sounding the alarm for years, calling on social media platforms to take action to protect kids. It turns out that Facebook has long known that Instagram is harming adolescent and teen users, and the company has lied about those harms instead of making substantive changes.
Internal research documents recently released by a whistleblower reveal just how much Facebook knew. Below are some of the findings:
- “We make body image issues worse for one in three teen girls.”
- “One in five teens say that Instagram makes them feel worse about themselves.”
- Teens “often feel ‘addicted’ and know that what they’re seeing is bad for their mental health but feel unable to stop themselves.”
- 32% of teens feel they “don’t have enough friends” because of Instagram, and 22% feel “alone or lonely.”
Mark Zuckerberg, CEO of Instagram’s parent company, Facebook, and Adam Mosseri, head of Instagram, learned of these findings in 2020, yet they continued to mislead the public about Instagram’s impact. During a March 2021 congressional hearing, when asked whether he agreed that “too much time in front of screens passively consuming content” harms children’s mental health, Zuckerberg said, “I don’t think that the research is conclusive on that.”
As for Mosseri, this past May he told reporters that Instagram’s impact on teens’ mental health is likely “quite small.” And in August, when Senators Blackburn and Blumenthal called on Facebook to release its internal research on youth mental health, the company declined. Instead, it claimed that conducting research on the topic is challenging and that it is “not aware of a consensus among studies or experts about how much screen time is ‘too much.’”
The scope of Instagram’s harmful impact is massive. According to Instagram’s internal research materials, 22 million American teens use Instagram every day, and 40% of Instagram’s users are 22 years old or younger.
Facebook cannot be trusted to govern itself. We need to cut through the lies that minimize its impact, reject its empty proclamations of being a force for good, and act to protect kids and teens on Instagram. Fortunately, two essential pieces of legislation are on the horizon: the CAMRA Act and the KIDS Act.
The Children and Media Research Advancement (CAMRA) Act has strong bipartisan, industry, and consumer group support, and it is vital to breaking the stranglehold on research that lets Facebook largely evade evidence-based scrutiny. CAMRA would direct the National Institutes of Health to study the health and developmental effects of media, including social media, on infants, kids, and teens. Long overdue, it would strengthen the independent research field, so Zuckerberg could no longer claim ignorance about how much screen time is “too much” or dodge questions about Instagram’s impact on kids’ mental health by declaring such research too challenging.
Companies like Facebook also clearly need more guardrails. The Kids Internet Design and Safety (KIDS) Act would create strong rules for how online platforms place ads alongside children’s content and would curb manipulative design that pushes inappropriate content on kids. It adapts the Children’s Television Act, which has successfully regulated kids’ programming for the last 30 years, to the online world. The KIDS Act is also long overdue: online videos now make up the largest share of total video viewing for children ages 0 to 8, and that viewing time keeps growing.
Among other mandates, the KIDS Act would limit the methods and content of online advertising that targets kids, address algorithms that push extreme content on kids, require platforms to provide clear guidance and labeling on kid-healthy content, and stop manipulative design aimed at kids. Manipulative design includes autoplay on video streaming and social media; badges and rewards for elevated levels of engagement; unlabeled product placements and influencer endorsements, which children have particular difficulty discerning; and frictionless access to hundreds of thousands of often-unvetted apps in app stores, which increases exposure to inappropriate content.
The CAMRA Act and the KIDS Act are common-sense solutions that are sorely needed to protect children and teens online. Depression, self-harm, and suicide attempts have risen substantially among U.S. youth since social media became ubiquitous in 2011, and while debates about correlation and causation continue, the revelations from Instagram’s internal research make the evidence all the more concerning.
How will a generation raised in such a hostile information environment, one intentionally designed to undermine their feelings of well-being and self-confidence, act as adults? And now that we know they are being willfully harmed, and that we have legislative solutions to stop this harm, how can we continue to allow it?
Katie Joseff is the misinformation and disinformation specialist at Common Sense. Her work focuses on platform accountability.