You noticed it slowly at first. Your daughter who used to read books started spending hours scrolling. Your son who loved soccer stopped going to practice. They seemed anxious all the time, checking their phones during dinner, in the bathroom, under the covers at night. When you asked them to put it down, they became defensive, angry, sometimes desperate. You thought maybe it was just adolescence. Maybe you were being overprotective. Maybe every teenager acted this way now.
Then the school called about grades slipping. Or you found the journal entries about hating their body. Or the pediatrician asked about the cuts on their arms, and suddenly you were sitting in a therapist's office hearing words like major depressive disorder, generalized anxiety, body dysmorphic disorder, even suicidal ideation. Your child, who was happy and healthy just two years ago, was now on medication and struggling to get through each day. The therapist asked about screen time. You said yes, they use social media, but all kids do. You felt a gnawing guilt, wondering if you should have seen this coming, if you should have taken the phone away sooner.
What you did not know, what you could not have known, is that the companies that made those platforms had research showing this would happen. They had internal studies tracking exactly how their products affected teenage mental health. They knew the risks and they built the products anyway, designed specifically to keep your child scrolling, comparing, despairing. This was not your failure as a parent. This was a documented business decision.
What Happened
The injury looks like a child who cannot stop checking their phone. They wake up and reach for it immediately. They check it during class, during meals, in the middle of the night. When you take it away, they experience what looks like withdrawal: anxiety, irritability, panic. They talk about likes and followers and views the way someone might talk about their worth as a person.
The depression comes on gradually or sometimes suddenly. They start comparing themselves to the filtered, perfected images they see hundreds of times per day. They feel ugly, inadequate, boring, unwanted. Girls develop eating disorders trying to look like the influencers they follow. Boys feel inadequate about their bodies, their success, their lives. The anxiety is constant, a fear of missing out, of being left out, of not being enough.
Some kids start cutting themselves or burning themselves. When you ask why, they say it helps them feel something, or it helps them feel nothing. Some stop eating. Some cannot get out of bed. Some end up in the emergency room after taking pills or worse. The children who survive tell their parents they did not actually want to die. They just wanted the feeling to stop. The feeling of never being good enough, of always being watched and judged, of being trapped in a cycle they could not escape.
Parents watch their children disappear into their screens and then disappear inside themselves. Therapists use terms like dopamine dysregulation, social comparison theory, fear of missing out, but what parents see is simpler and more devastating: their child is not okay, and the thing that seems to make them feel worse is the same thing they cannot put down.
The Connection
Social media platforms are engineered to be addictive. This is not an accident or a side effect. It is the core business model. The longer users stay on the platform, the more ads they see, the more money the company makes. To maximize time on platform, these companies employed some of the most sophisticated behavioral psychologists and neuroscientists in the world to build features specifically designed to trigger compulsive use.
The mechanism works like this: Every time a teenager posts something and gets likes or comments or views, their brain releases dopamine, the same neurotransmitter involved in drug addiction and gambling. But the platform does not deliver that dopamine on a predictable schedule. Sometimes a post gets lots of engagement, sometimes very little. This variable reward schedule, the same mechanism used in slot machines, is more addictive than consistent rewards. The brain keeps checking, keeps posting, hoping for the next hit.
The infinite scroll feature, now standard across these platforms, removes natural stopping points. There is no end to the content, no moment where the brain can say we are done now. The autoplay feature on videos does the same thing. Each feature is designed to override the brain regions responsible for self-control.
For teenagers, whose prefrontal cortex is still developing and whose ability to regulate impulses is not fully formed, these features are especially powerful. Research published in JAMA Pediatrics in 2019 found that adolescents who checked social media more than 15 times per day were three times more likely to develop depression than those who checked less frequently. A study in the Journal of Abnormal Psychology in 2020 found that between 2009 and 2017, the same period when smartphone adoption became universal among teens, rates of major depressive episodes increased 52 percent in adolescents and rates of serious psychological distress increased 71 percent in young adults aged 18 to 25.
The harm is not just about time spent. It is about what happens during that time. Teenage girls are exposed to thousands of images of apparently perfect bodies, perfect faces, perfect lives. Studies published in the International Journal of Eating Disorders in 2020 found direct correlations between Instagram use and body dissatisfaction, drive for thinness, and eating disorder symptoms. The comparison is constant and it is rigged. Users are comparing their everyday reality to other people's carefully curated, filtered, edited highlight reels.
For vulnerable teenagers, those already struggling with self-esteem or family problems or trauma, social media can become the thing that tips them into clinical mental illness. For others who had no predisposition to depression or anxiety, the platforms themselves create the vulnerability. The research is clear: social media use is not just correlated with teen mental health decline, it is causally connected to it.
What They Knew And When They Knew It
Meta, the parent company of Facebook and Instagram, had internal research documenting the mental health harms of its platforms years before the public knew. In 2019, Meta researchers conducted internal studies on tens of thousands of users across multiple countries. The research, revealed in internal documents obtained by whistleblower Frances Haugen and reported by the Wall Street Journal in September 2021, showed that Instagram made body image issues worse for one in three teenage girls. Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse.
The internal research stated: "We make body image issues worse for one in three teen girls." This was not speculation. This was Meta's own finding from its own research on its own users. Another internal document from 2019 stated: "Teens blame Instagram for increases in the rate of anxiety and depression. This reaction was unprompted and consistent across all groups."
In 2020, Meta researchers found that 13.5 percent of teen girls in the UK said Instagram made their suicidal thoughts worse. Six percent of American teens who reported suicidal thoughts traced the issue to Instagram. These were not external critics or academic researchers. These were Meta employees studying Meta users and reporting to Meta executives.
The documents show Meta knew Instagram was pushing vulnerable teenagers toward anorexia content. An internal report from 2019 described how the platform recommendation algorithm led users from healthy recipe content to extreme dieting content to anorexia content in a clear progression. The researchers wrote that the platform was promoting content that could be harmful to users with eating disorders, and that teenage users were being shown this content without seeking it out.
Facebook had similar research about its impact on teen mental health dating back even further. Internal studies from 2017 and 2018 showed that Facebook knew heavy use of its platform was correlated with loneliness, anxiety, and depression, particularly among younger users. The research showed that passive consumption of content, scrolling through feeds without interacting, was particularly harmful. Facebook knew this and continued to optimize its feed algorithm to maximize time on site regardless of whether that time was actively engaged or passively consumed.
TikTok internal documents obtained through legal discovery show the company conducted studies on compulsive use in 2018 and 2019. Engineers described in internal communications how they tested various features to increase what they called session time and frequency of opens. They knew that users, particularly young users, were opening the app dozens or hundreds of times per day and spending hours in single sessions. Documents show TikTok executives discussed whether the compulsive use patterns they were seeing and encouraging constituted addiction. They decided not to use that term publicly but continued to optimize for the same behavioral patterns.
A 2020 internal TikTok document outlined how the recommendation algorithm was designed to push users into rabbit holes of repetitive content. The document acknowledged that this could be problematic for vulnerable users but concluded that it was effective at driving engagement. TikTok researchers found that teenagers were particularly susceptible to the algorithm and would watch the same type of content for hours without realizing how much time had passed.
Snapchat, while more secretive about its internal research, has been forced through litigation to produce documents showing the company studied the mental health effects of its features. Documents from 2018 show Snapchat executives discussed whether features like Snapstreaks, which reward users for sending snaps back and forth for consecutive days, were creating anxiety and compulsive use in teenage users. They found that teenagers reported feeling obligated to maintain streaks, feeling anxious when they might lose a streak, and prioritizing streaks over sleep, homework, and in-person social interaction. Snapchat made minor modifications but kept the feature because it drove daily active use.
In 2019, Snapchat conducted research on its Snap Map feature, which shows users where their friends are in real time. Internal documents show researchers found the feature increased fear of missing out and social anxiety, particularly among middle school users who would see their friends together without them. Snapchat kept the feature and made it more prominent in the app.
All three companies had research showing their products were particularly harmful to users who were already vulnerable. Meta research from 2020 found that teenagers who already had low self-esteem or pre-existing body image concerns experienced the most severe negative effects from Instagram. Rather than adding protections for these users, Instagram research focused on how to keep them engaged despite their negative experiences.
How They Kept It Hidden
The primary strategy was simply not disclosing the research. Meta conducted extensive internal studies on teen mental health but did not publish the results in peer-reviewed journals and did not share the findings with regulators, child health advocates, or parents. When external researchers requested access to data to study mental health effects, Meta denied the requests or provided only limited data sets that made it difficult to draw conclusions about causation.
When external research began to show concerning correlations between social media use and teen mental health decline, Meta funded its own studies and promoted research that showed more favorable results. In 2020 and 2021, Meta pointed to studies that found no causal link between social media and depression while failing to mention that those studies used different methodologies and looked at different populations than Meta's internal research.
Meta used its significant lobbying power to prevent regulation. Between 2016 and 2021, Meta spent over 80 million dollars on lobbying, much of it focused on preventing legislation that would restrict how platforms could collect data on minors or limit features designed to maximize engagement. When states proposed laws requiring platforms to assess mental health impacts before launching new features targeting children, Meta lobbied against them.
The companies also used public relations strategies to shift blame. When parents raised concerns about social media and mental health, the companies emphasized parental responsibility and digital literacy. They created educational programs and parental control tools while knowing from their internal research that these measures were largely ineffective against the addictive design of the platforms themselves. The message was clear: if your child is struggling, it is a parenting problem, not a product design problem.
All three companies required employees who worked on sensitive research to sign strict non-disclosure agreements. When Frances Haugen came forward as a whistleblower in 2021, it was the first time the public saw the internal research Meta had been conducting. Haugen faced significant legal threats and had to carefully navigate whistleblower protections to avoid prosecution for sharing the documents.
When litigation began and companies were forced to produce internal documents through discovery, they fought aggressively to keep those documents under seal. Even in cases where plaintiffs won the right to see internal research, protective orders prevented the information from becoming public. Settlement agreements in early cases included non-disclosure provisions that prevented families from discussing what they learned about what the companies knew.
The companies also used design obfuscation. They would test harmful features under innocuous names in internal communications. They would roll out changes gradually so that no single update would trigger regulatory scrutiny or public backlash. They would describe addictive features in positive terms: engagement, connection, community building. The word addiction was banned from internal communications at multiple companies even when that was precisely what researchers were measuring.
Why Your Doctor Did Not Tell You
Pediatricians and mental health professionals did not have access to the internal research showing the causal connection between social media platforms and teen mental health decline. The published literature through the mid-2010s showed correlations but the technology companies successfully muddied the waters by funding contradictory studies and emphasizing that correlation does not prove causation.
Many physicians assumed social media was neutral technology that could be used well or poorly depending on the individual. They did not know that the platforms were specifically engineered to override self-control and create compulsive use patterns. They did not know about the variable reward schedules, the infinite scroll, the algorithmic amplification of harmful content. They thought of social media the way they might think of television: potentially problematic in excess but not inherently dangerous.
Medical education has been slow to incorporate information about behavioral addiction to technology. Most physicians currently in practice received no training on social media addiction because it was not a recognized phenomenon when they were in medical school. The research showing clear causal links between social media use and mental health problems in adolescents has only emerged in the past five years, and it takes time for that research to make its way into clinical practice guidelines.
When teenagers presented with depression, anxiety, or eating disorders, doctors focused on traditional treatment: therapy, medication, family interventions. They might ask about screen time as one factor among many, but they did not understand that for many adolescents, social media was not just a contributing factor but the primary cause of their mental health crisis.
The companies also shaped medical understanding through their funding of research and educational programs. Meta and other platforms funded digital wellness initiatives, parent education programs, and research centers at major universities. This funding came with no explicit strings attached, but it created relationships and shaped the conversation. Researchers who wanted continued funding knew not to be too critical. Medical organizations that received grants for educational programs were less likely to call for strict regulation of the platforms.
By the time physicians began to understand the scope of the problem, millions of teenagers were already struggling with mental health problems that had been caused or significantly worsened by social media use. The medical community is now playing catch-up, trying to develop treatment protocols for a form of behavioral addiction that did not exist 15 years ago.
Who Is Affected
The lawsuits are being filed on behalf of minors who used Meta's platforms (Facebook and Instagram), TikTok, or Snapchat and subsequently developed mental health problems including depression, anxiety, eating disorders, body dysmorphic disorder, or suicidal ideation. The cases focus on use that began when the plaintiff was under 18 years old, though some cases include plaintiffs who began using the platforms as minors and continued into young adulthood.
If your child began using Instagram, TikTok, or Snapchat before age 18 and subsequently developed diagnosed mental health problems, your situation may fit the profile of these cases. The connection is strongest when there is documented mental health treatment including therapy or psychiatric medication that began during or after the period of heavy social media use.
The cases are particularly focused on compulsive or addictive use patterns. This means your child was using the platforms for multiple hours per day, checking them first thing in the morning and last thing at night, experiencing anxiety or distress when unable to access the platforms, and continuing to use them even when the use was causing problems with school, family relationships, or their own wellbeing.
Girls and young women who developed eating disorders including anorexia, bulimia, or binge eating disorder during periods of heavy Instagram or TikTok use are a significant focus of the litigation. The internal research showing these platforms specifically made body image issues worse for teenage girls is central to many cases.
Teenagers who engaged in self-harm including cutting, burning, or other forms of self-injury during periods of heavy social media use may also be affected. Some cases involve suicide attempts that occurred during or shortly after periods of intense social media use, particularly when the platforms were exposing the teenager to harmful content or facilitating social comparison and cyberbullying.
The timing matters. These cases focus on use that occurred from roughly 2012 onward, when the platforms were implementing the specific features that internal research showed were most addictive and most harmful to teens. Earlier use of social media platforms, before they incorporated infinite scroll, algorithmic feeds, and variable reward mechanisms, is less central to the litigation.
You do not need to prove that social media was the only factor in your child's mental health problems. Many teenagers who were harmed by these platforms also had other challenges in their lives: family stress, academic pressure, social difficulties. The question is whether the platform made things significantly worse and whether the compulsive use of the platform became a mental health problem in itself.
Where Things Stand
As of 2024, more than 500 lawsuits have been filed against Meta, TikTok, Snapchat, and YouTube on behalf of minors who developed mental health problems related to social media use. These cases have been consolidated into a multidistrict litigation in the Northern District of California, assigned to Judge Yvonne Gonzalez Rogers. The MDL, officially titled In Re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, was created in October 2022 to coordinate the pretrial proceedings.
In addition to the individual cases, multiple school districts have filed lawsuits seeking to recover costs associated with mental health services for students harmed by social media platforms. The Seattle Public Schools filed suit in January 2023, followed by school districts across the country arguing that the platforms created a youth mental health crisis that has overwhelmed school counseling services and special education resources.
Several state attorneys general have also filed suit. In October 2023, 33 states filed a joint lawsuit against Meta alleging the company knowingly designed features to addict children to its platforms while misleading the public about the safety of those platforms. The complaint cites extensively from the internal documents revealed by Frances Haugen, showing Meta knew Instagram was harmful to teenage girls and took insufficient action to address the harms.
The companies have filed motions to dismiss arguing that Section 230 of the Communications Decency Act protects them from liability for how users choose to use their platforms. They argue that they are not responsible for third-party content on their platforms and that any harms flow from that content rather than from the platform design itself. Judge Gonzalez Rogers has rejected these arguments in part, finding that claims based on the addictive design features of the platforms themselves can proceed even if claims based on specific harmful content might be barred by Section 230.
In December 2023, Judge Gonzalez Rogers allowed a significant portion of the claims to move forward, finding that plaintiffs had adequately alleged that the platforms were defectively designed, that the companies failed to warn about known risks, and that they were negligent in designing products specifically to addict children. The decision marked a major victory for plaintiffs and signaled that these cases will likely proceed to discovery and potentially trial.
Discovery is ongoing, with plaintiffs seeking additional internal documents, communications between executives about teen mental health, and research that has not yet been made public. The companies are fighting to keep many documents under seal, arguing they contain trade secrets and proprietary business information. Plaintiffs argue the public has a right to know what these companies knew about the harms they were causing.
No cases have gone to trial yet, and no settlements have been announced in the individual injury cases. Given the early stage of the litigation, trials are likely still two to three years away unless the companies decide to settle. The school district cases and state attorney general cases are on a similar timeline.
Bellwether trials, which are test cases used to gauge how juries respond to the evidence and arguments, will likely be selected in 2025. These trials will help both sides understand the strength of their cases and may lead to broader settlement discussions if plaintiffs win significant verdicts.
The litigation is being closely watched because it represents one of the first major legal challenges to the business model of social media platforms. If plaintiffs succeed, it could force fundamental changes to how these platforms are designed and how they target young users. It could also open the door to liability for other technology companies whose products are designed to maximize engagement without regard to user wellbeing.
New cases are still being filed regularly as more families learn about the connection between social media use and their children's mental health problems. The statute of limitations varies by state, but in most jurisdictions, the clock does not start until the injury is discovered or reasonably should have been discovered. For many families, that discovery is happening now as internal documents become public and as medical professionals become more aware of the causal connection between platform design and adolescent mental health.
You spent months or years watching your child struggle and wondering what you did wrong. You wondered if you should have been stricter about screen time or if you pushed them too hard or if there was something in your family history that made them vulnerable. You carried guilt that was never yours to carry.
What happened to your child was not random and it was not your fault. It was the result of deliberate design choices made by companies that had research showing those choices would harm children. They knew that teenage girls would develop eating disorders from comparing themselves to filtered images. They knew that adolescents would become compulsively dependent on likes and comments and views. They knew that vulnerable teenagers would be pushed toward content about self-harm and suicide. They knew all of this and they made billions of dollars anyway. That was the business decision. Your child paid the price.