You noticed it first in small ways. Your daughter stopped coming to dinner without being called three times. Your son began sleeping with his phone under his pillow, waking throughout the night to check notifications. The grades slipped. The friendships changed. Then came the harder things: the refusal to eat in front of others, the marks on their arms they tried to hide, the withdrawal into a sadness so deep it seemed to have no bottom. When you finally got them to a therapist, the diagnosis felt like both relief and devastation. Major depressive disorder. Anxiety disorder. In some cases, diagnoses you never imagined could touch your child: body dysmorphia, suicidal ideation, active self-harm. The doctor asked about screen time, about social media use, and you felt a spike of something—guilt, maybe, or recognition—but the doctor moved on quickly. Mental health is complicated, they said. Adolescence is hard. There are many factors.
You thought about taking the phone away a hundred times. You made rules about dinner, about bedtime, about homework. But the meltdowns that followed felt out of proportion to the request. Your child became a different person without the device—angry, desperate, panicked. You told yourself it was just their generation, just how kids communicate now. You worried you were being too controlling, too out of touch. When things got worse, you looked inward. What had you missed? What had you done wrong? The guilt was enormous. You replayed years of parenting decisions, wondering which moment was the one where you lost them. The possibility that this was not your fault, that this was something done to your child by design, probably never entered your mind. Why would it?
But there is a documented record. Internal research. Company emails. Presentations to executives who then chose not to act. Studies that showed harm and were buried. Algorithms that were tweaked specifically to increase the amount of time young users spent on platforms, even when engineers knew that time was causing psychological damage. This was not an accident. This was not an unfortunate side effect discovered too late. The companies knew. They had the data. They ran the numbers. And they made a business decision.
What Happened
The injury is not something you can see on an X-ray or measure with a blood test, but it is no less real. Young people—mostly between the ages of 10 and 25—began showing patterns of psychological distress that correlated directly with their social media use. Depression that went beyond normal teenage mood swings. Anxiety that became so severe it interfered with their ability to go to school, see friends, or function in daily life. Eating disorders triggered by endless exposure to filtered, perfected bodies and the metrics of likes and comments that turned self-worth into a number. Self-harm that started as a way to manage unbearable feelings and sometimes progressed to suicidal thoughts or attempts.
These young people describe feeling trapped. They know the apps make them feel worse. They delete them and reinstall them hours later. They set time limits and blow past them. They feel anxious when they are on the platforms and anxious when they are off them, wondering what they are missing, whether anyone has noticed their absence, whether their social standing is slipping while they are away. The compulsion is not a matter of willpower. It feels neurological, automatic. They reach for their phones without thinking. They scroll without seeing. Hours disappear.
Parents describe children who were outgoing and confident becoming withdrawn and self-critical. Kids who loved their bodies suddenly refusing to be seen in swimsuits or photos. Teens who had friends and hobbies becoming isolated, their entire social life mediated through a screen. The change often happens gradually, then suddenly. A tipping point where the child you knew seems unreachable. For some families, this ends in hospitalization. For others, in tragedy.
The Connection
Social media platforms are engineered to be addictive. This is not metaphor. The features that define these apps—infinite scroll, autoplay, push notifications, likes, streaks, algorithmic feeds—were designed using research into human psychology and neuroscience specifically to maximize the time users spend on the platform. The business model depends on attention. The more time a user spends, the more ads can be served. For adult users, this creates problems. For adolescent users, whose brains are still developing, the effects are more severe.
The adolescent brain is uniquely vulnerable to the mechanisms of social media addiction. The prefrontal cortex, which governs decision-making and impulse control, is not fully developed until the mid-twenties. Meanwhile, the limbic system, which processes reward and emotion, is hyperactive during adolescence. This creates a neurological imbalance. Adolescents are wired to seek social feedback and novelty, and they are less equipped to regulate those impulses. Social media platforms exploit this vulnerability. Every like is a hit of dopamine. Every notification triggers the same neural pathways as gambling or drug use. The intermittent reinforcement—not knowing when the next like or comment will come—is the same mechanism that makes slot machines addictive.
A 2017 study published in the journal Psychological Science showed that adolescents who spent more time on social media had significantly higher rates of depression. The researchers controlled for baseline mental health, meaning the depression followed the social media use, not the reverse. A 2019 study in JAMA Psychiatry followed over 6,000 adolescents and found that increased social media use predicted increases in depressive symptoms over time. A 2020 study in the Journal of Abnormal Psychology documented a sharp rise in major depressive episodes, self-harm, and suicide among adolescents between 2009 and 2017—the years when smartphone use and social media became ubiquitous—with the steepest increases among girls.
The harm is not just about time spent. It is about what happens during that time. Social comparison is a feature, not a bug. Platforms surface content designed to provoke engagement, which often means content that provokes envy, inadequacy, or outrage. For young people, especially girls, this means constant exposure to images of physical perfection, curated lives, and social hierarchies made visible and quantifiable. The feedback loop is immediate and public. A photo that gets few likes becomes evidence of social failure. A comment thread can turn into public humiliation. The pressure to perform a perfect life becomes overwhelming. For many young users, the result is anxiety, depression, disordered eating, and self-harm.
What They Knew and When They Knew It
In 2017, Facebook—now Meta—conducted internal research that directly examined the mental health effects of Instagram on teenage users. The research was never published. It was revealed in 2021 when whistleblower Frances Haugen leaked thousands of internal documents to the Wall Street Journal and the Securities and Exchange Commission. Those documents showed that Facebook knew Instagram was harmful to a significant percentage of teenage users, particularly girls. One internal presentation stated that 32 percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse. Another slide noted that among teens who reported suicidal thoughts, 13 percent of British users and 6 percent of American users traced the issue to Instagram.
The documents showed that Facebook researchers knew Instagram exploited young users in specific ways. One internal report from 2019 stated that social comparison was worse on Instagram than on other platforms, and that it was a feature of the design. The report noted that Instagram focuses on bodies and lifestyles, which are domains where social comparison is particularly damaging. Another document showed that Facebook knew its algorithms were directing users, including minors, toward harmful content. Researchers found that accounts interested in extreme dieting were quickly pushed toward anorexia content. The company knew this and did not change it.
Facebook also conducted research on what it called "problematic use"—the internal term for addiction. A 2018 internal study found that 12.5 percent of teen Instagram users and 13.5 percent of young adult users described their use as "compulsive" and said the platform was harming their sleep, work, relationships, or parenting. The researchers concluded that the platform created "addiction-like behaviors" and that these behaviors were not accidental but were the result of design choices intended to maximize engagement. Despite this knowledge, the company not only failed to address the problem but continued to develop features like infinite scroll and algorithmic recommendations that made the problem worse.
TikTok has been similarly aware of the risks its platform poses to minors. Internal documents obtained during litigation show that TikTok executives were briefed on the addictive nature of the app and its potential to harm young users. A 2018 internal memo described the optimal session length and daily usage time needed to form a "habit loop" in users. Company researchers calculated that users who watched 260 videos were likely to become habitual users, a threshold a new user can cross in less than 35 minutes. The company tracked this metric and designed features to push users toward it.
In 2020, TikTok commissioned research on compulsive use and mental health risks. The findings were shared with executives. The research showed that extended use, particularly among teenagers, was associated with anxiety, depression, and body image issues. Despite this, TikTok continued to market the app to children and teenagers, and continued to refine the algorithm to maximize watch time. Internal communications show that when engineers raised concerns about the potential for harm, they were told that engagement was the priority.
Snapchat has also been the subject of internal scrutiny. Documents from litigation reveal that Snap Inc. studied the addictive potential of its features, particularly streaks—the feature that rewards users for sending snaps to the same person on consecutive days. Internal research showed that streaks created anxiety in users, particularly young users, who felt compelled to maintain them even when it interfered with sleep, school, or other activities. Snap knew that losing a streak caused distress, and the company used that distress to drive engagement. A 2019 memo noted that streaks were a "key driver of daily active use" among teens and that the feature was particularly effective at creating habitual behavior.
These companies did not just passively observe harm. They studied it, quantified it, and then made a business decision. The decision was to prioritize growth and engagement over the well-being of young users. Executives were presented with data showing that their platforms were causing psychological harm to children and teenagers, and they chose not to act. In some cases, they doubled down, refining the very features that were causing harm.
How They Kept It Hidden
The strategy for concealing harm was multipronged. First, the companies did not publish their internal research. Studies that showed negative mental health effects were kept confidential, shared only with executives and select teams. When external researchers requested data to study the platforms, the companies denied access or imposed restrictive terms that made independent research nearly impossible. This allowed the companies to control the narrative. Public statements emphasized the positive aspects of connection and community, while the data showing harm remained internal.
Second, the companies funded external research that supported their preferred narrative. Meta, TikTok, and Snapchat have all provided grants to academic researchers studying social media and mental health. The terms of these grants often included provisions that gave the companies influence over publication decisions or advance access to findings. This created a body of industry-friendly research that could be cited in public statements and regulatory proceedings. When independent researchers published findings that contradicted this narrative, the companies issued public rebuttals and pointed to the studies they had funded as evidence that their platforms were safe.
Third, the companies lobbied aggressively against regulation. Meta spent over $20 million on federal lobbying in 2021 alone. Much of that spending was directed at efforts to block legislation that would restrict data collection on minors, require transparency in algorithmic recommendations, or impose safety standards for features aimed at young users. TikTok and Snapchat also increased their lobbying expenditures significantly between 2018 and 2022, focusing on the same issues. The companies hired former government officials, built relationships with key legislators, and framed regulation as a threat to innovation and free speech.
Fourth, the companies settled early cases quietly. When parents sued over harms to their children, the companies offered settlements with strict nondisclosure agreements. These agreements prevented plaintiffs from discussing the terms of the settlement or the evidence that had been disclosed during litigation. This kept damaging internal documents out of the public eye and prevented other potential plaintiffs from learning about the scope of the companies' knowledge. It was only when whistleblowers came forward and when coordinated litigation efforts forced mass document disclosure that the full picture began to emerge.
Why Your Doctor Did Not Tell You
Most pediatricians and mental health professionals were not aware of the documented internal research showing that social media platforms were designed to be addictive and were causing harm to adolescents. The companies did not share that research with the medical community. What doctors saw was an increase in adolescent depression, anxiety, eating disorders, and self-harm beginning around 2010 and accelerating through the 2010s. They hypothesized about causes—academic pressure, helicopter parenting, decreased outdoor play—but they did not have access to the data showing that social media companies had intentionally designed their platforms to maximize compulsive use in vulnerable populations.
Medical literature on social media and mental health was mixed. Some studies showed correlations between social media use and poor mental health outcomes, but correlation is not causation. Other studies, including some funded by the platforms themselves, showed neutral or even positive effects. Without access to the internal research, doctors could not tell their patients with certainty that the platforms were causing harm. They could suggest moderation, recommend screen time limits, and treat the symptoms, but they could not give parents the full picture: that their children were the targets of a sophisticated, intentional effort to create compulsive behavior for profit.
The medical profession also tends to individualize mental health problems. Doctors are trained to look for risk factors in the patient—genetics, family history, trauma—not to question whether an entire industry might be causing harm. This is not a failing of individual doctors. It is a structural issue. The information asymmetry was deliberate. The companies had the data and chose not to share it. By the time independent research and leaked documents made the connection clear, millions of young people had already been harmed.
Who Is Affected
If your child, or you yourself as a young person, used Instagram, TikTok, Snapchat, or similar platforms regularly during adolescence, and subsequently developed depression, anxiety, or an eating disorder, engaged in self-harm, or experienced suicidal thoughts, you may be among those affected. The pattern typically involves use that began between the ages of 10 and 18, continued over months or years, and coincided with or preceded the onset of mental health symptoms.
The harm is not limited to those who used the platforms for extreme amounts of time, though heavier use is associated with worse outcomes. Even moderate use—an hour or two a day—has been linked to increased mental health risks in adolescents. What matters is not just the quantity of time but the nature of the use: passive scrolling through feeds, engagement with appearance-focused content, social comparison, and exposure to cyberbullying or harassment.
Girls and young women appear to be disproportionately affected, particularly with regard to body image issues and eating disorders. Instagram and TikTok both surface large amounts of content related to appearance, weight, and beauty standards, and both platforms have been shown in internal and external research to increase body dissatisfaction and disordered eating behaviors in teenage girls. However, boys and young men are also affected, particularly with regard to compulsive use, anxiety, and depression.
If your child was hospitalized for mental health reasons, if they have been in treatment for depression or anxiety that began in adolescence, if they developed an eating disorder or engaged in self-harm, and if they were regular users of these platforms during the relevant period, the connection is worth examining. Many families have been told that the mental health crisis was just adolescence, just genetics, just bad luck. The internal documents tell a different story.
Where Things Stand
As of 2024, hundreds of lawsuits have been filed against Meta, TikTok, Snapchat, and other social media companies on behalf of minors and young adults who suffered mental health harm. These cases have been consolidated into multidistrict litigation in the Northern District of California. The litigation is ongoing. Plaintiffs include individual families as well as school districts, which have sued to recover costs associated with the adolescent mental health crisis.
The cases are in the discovery phase, which means that internal documents are being disclosed and depositions are being taken. This process has already yielded significant revelations, including the internal research described earlier. The companies are fighting to keep many documents sealed, arguing that they contain trade secrets or proprietary information. Plaintiffs are pushing for full transparency.
In October 2023, dozens of state attorneys general filed lawsuits against Meta, alleging that the company knowingly designed Instagram to be addictive to children and misled the public about the safety of the platform. Similar actions have been filed against TikTok in multiple states. These government cases add significant pressure and resources to the overall litigation effort. They also increase the likelihood that damaging internal documents will become part of the public record.
No global settlement has been reached. Some individual cases have settled under confidential terms, but the vast majority of plaintiffs remain in active litigation. The timeline for resolution is uncertain. Multidistrict litigation of this scale typically takes years. However, the legal landscape is increasingly unfavorable to the defendants. The internal documents are clear. The harm is documented. The question is no longer whether the companies knew, but what consequences they will face.
New cases are still being filed. If you believe your child or you yourself were harmed by social media platform design, there is still time to pursue a claim. The statute of limitations varies by state and by the age of the person affected, but in many jurisdictions, the clock does not start until the harm is discovered or until the plaintiff learns that the harm was caused by corporate conduct rather than by individual factors. Given that the internal documents only became public in 2021 and later, many potential plaintiffs are still within the limitations period.
What happened to your child was not your fault. It was not their fault. It was not bad parenting or weak character or adolescent drama. It was the result of a business model that required compulsive use to generate profit, and a series of corporate decisions to pursue that profit even after the harm became clear. You are not imagining the change you saw in your child. You are not overreacting. The internal documents show that what you observed in your home was happening in millions of homes, and the companies knew it.
The path forward is not about blame. It is about accountability. These platforms can be designed differently. Features that exploit adolescent psychology can be removed. Algorithms can be tuned to reduce harmful content rather than amplify it. Age verification can be implemented. Transparency can be required. But none of those changes will happen without pressure. The lawsuits are part of that pressure. They force disclosure. They create consequences. They say that the business decision to prioritize profit over the mental health of children was unacceptable. Your story matters. What happened to your family matters. And you are not alone.