Meta Under Fire After Project MYST Reveals Teen Social Media Risks

Basil Igwe
Meta Project MYST study on teen social media risks discussed in court.

In a courtroom in Los Angeles, the debate over teenage screen time has moved beyond family living rooms and into the legal system. What was once framed as a parenting challenge is now being tested as a question of corporate responsibility, internal research, and product design.

At the centre of the case is an internal research effort at Meta known as “Project MYST,” conducted in partnership with the University of Chicago. The study examined how teenagers engage with social media and whether parental supervision meaningfully reduces compulsive use. According to testimony presented in court, the findings suggested that common forms of parental control, such as time limits, supervision, and restricted access, had little measurable impact on whether teens used social platforms compulsively.

The study surfaced during a social media addiction trial underway in Los Angeles County Superior Court. The plaintiff, identified as “KGM” or Kaley, alleges that major platforms created products that were “addictive and dangerous,” contributing to anxiety, depression, body dysmorphia, eating disorders and self-harm. The lawsuit originally named Meta, YouTube, ByteDance, and Snap, though ByteDance and Snap settled prior to trial.

What distinguishes this case from earlier public criticism is the use of internal research to question how much companies understood about user behaviour. Project MYST, short for Meta and Youth Social Emotional Trends, surveyed around 1,000 teens and their parents. According to testimony, researchers concluded that “parental and household factors have little association with teens’ reported levels of attentiveness to their social media use.”

In practical terms, this suggests that even when parents attempt to manage usage through rules or in-app controls, those efforts may not significantly change a teenager’s ability to moderate consumption. The study reportedly found that parents’ and teens’ responses aligned, with neither group’s answers showing a clear correlation between supervision and reduced compulsive behaviour.

The economic and structural implications extend beyond one family’s experience. Social media platforms operate on engagement-based models. Time spent on an app translates into advertising inventory, data collection, and revenue stability. Algorithmic feeds, notifications, and design features that encourage repeated interaction are not incidental; they are tied to business performance.

Plaintiffs argue that these systems are engineered to sustain attention through behavioural reinforcement mechanisms. Defense attorneys counter that high engagement does not equal addiction and that user well-being cannot be separated from broader life conditions. During testimony, Instagram head Adam Mosseri said he was not familiar with the details of Project MYST, though court documents suggested he had approved moving forward with it. He noted that the company uses the term “problematic use,” defined as spending more time on Instagram than one feels comfortable with, rather than labelling behaviour as addiction.

Another key finding introduced during the trial concerned teenagers facing stressful or adverse life experiences. According to the study, teens dealing with issues such as family instability, harassment, or other trauma were more likely to struggle with moderating their social media use. That dynamic complicates the question of causation. If social platforms become a form of escape from difficult circumstances, the platform is both a coping mechanism and a potential amplifier of harm.

For regulators and policymakers, this distinction matters. If compulsive usage is primarily driven by external stressors, regulatory focus may lean toward mental health infrastructure and parental education. If product architecture materially worsens compulsive behaviour, scrutiny could shift toward design standards, age-based restrictions, or algorithmic transparency.

The trial unfolds against a broader wave of litigation targeting social media companies over alleged harms to minors. Outcomes this year could shape how platforms design youth experiences, how parental controls are marketed, and how internal research is disclosed. The fact that Project MYST was not publicly published, and that no formal warnings emerged from its findings, has become part of the legal argument about transparency.

For companies, the stakes are structural. Youth engagement represents a significant portion of long-term user acquisition. At the same time, heightened regulatory oversight could impose compliance costs, product redesign requirements, and reputational risk. For families, the case challenges the assumption that stricter rules at home alone can offset digital incentives engineered at scale.

As the jury weighs testimony and internal documents, the courtroom becomes a testing ground for a larger question: where does responsibility sit in an economy built on attention? Whether the verdict places greater weight on corporate design or parental oversight, the implications will likely extend beyond this single case, influencing platform governance, investor expectations, and the evolving architecture of youth digital life.
