Meta Accused of Burying Internal Research Showing Instagram and Facebook Harm Mental Health

Staff @Villpress
The Villpress Insider team is a collective of seasoned editors and industry experts dedicated to delivering high-quality content on the latest trends and innovations in business, technology, artificial intelligence, advertising, and more.

Newly unsealed court filings reveal that Meta shut down internal research after discovering strong evidence that Facebook and Instagram could be harming users’ mental health. The findings, part of an internal initiative known as Project Mercury, showed that people who deactivated Facebook for just one week reported lower levels of depression, anxiety, loneliness, and social comparison. Instead of publishing the results, Meta allegedly halted the project and dismissed the data as biased, despite staff privately affirming its accuracy.

Project Mercury, conducted in partnership with Nielsen, aimed to measure the impact of stepping away from major Meta platforms. According to internal documents obtained during litigation, the results consistently showed measurable improvement in participants’ well-being when they stopped using Facebook. One researcher even warned that suppressing the findings would mirror how the tobacco industry withheld evidence about cigarettes harming consumers.

The allegations surfaced in a filing by law firm Motley Rice, representing more than 1,800 plaintiffs, including schools, parents, children, and several state attorneys general. The suit forms part of a large multidistrict litigation in the Northern District of California against Meta, Google, Snapchat, and TikTok, accusing the companies of concealing known dangers from the public.

Internal communications included in the filing show that senior Meta executives, including Nick Clegg, were aware of the research outcomes. Employees reportedly assured Clegg that the findings were valid, contradicting Meta’s later public claims that the results were compromised by “existing media narratives” about the company’s harmful effects.

Other internal documents included in the filings paint a broader picture of Meta prioritizing growth and engagement over safety. Vaishnavi Jayakumar, the former head of Instagram safety and well-being, testified that she was shocked to learn Meta maintained a “17x strike policy” for accounts involved in sex trafficking. Under this system, users could violate rules related to prostitution and sexual solicitation sixteen times before suspension, a threshold Jayakumar described as extraordinarily lenient and unlike anything across the industry.

Additional allegations claim Meta knowingly designed youth safety tools that were ineffective, rarely used, and unable to protect minors from predators. The company recognized that optimizing algorithms for teen engagement increased exposure to harmful content but continued anyway. According to the filings, Meta also delayed rolling out features to protect children from unwanted adult contact due to concerns about slowing growth.


A 2021 text message from CEO Mark Zuckerberg, cited in the documents, allegedly signaled that child safety was not his highest priority, with Zuckerberg stating he was more focused on building the metaverse.

Despite internal findings showing a causal link between platform use and mental health harm, Meta publicly told Congress that it could not determine whether its services harmed teenage girls. The newly revealed documents contradict that statement.

Meta spokesperson Andy Stone defended the decision to halt Project Mercury, asserting that the research had methodological flaws. He said Meta has spent more than a decade improving safety features, consulting parents, and addressing significant issues. Stone rejected the lawsuit’s claims and maintained that Meta’s safety measures are “broadly effective.”

The latest filing intensifies legal pressure on Meta as courts prepare to evaluate the company’s responsibility for potential harms to youth mental health and online safety. A hearing on the matter is set for January 26 in the Northern District of California, where plaintiffs will argue that Meta knowingly withheld critical research and prioritized growth over user well-being.
