Mark Zuckerberg testified in a Los Angeles jury trial over claims Meta’s platforms harm children.
Internal documents highlighted teen engagement goals while Meta denied social media addiction.
The case could influence thousands of lawsuits and future youth safety regulations online.
For the first time, Meta CEO Mark Zuckerberg appeared before a jury in court on Wednesday, 18 February 2026, in Los Angeles, defending Meta Platforms in a lawsuit that alleges the company’s algorithms are addictive and harmful to young users.
The high-profile trial accuses major social media companies of harming the mental health of the minors who use them. The defendants include big names such as YouTube, Instagram, TikTok and Snapchat. Thousands of similar lawsuits across the United States could be influenced by this “bellwether” case.
Citing internal communications, the lead plaintiff’s lawyers argued that the companies struggled to enforce age limits across social platforms such as Instagram, Facebook and WhatsApp. An email from former Meta global affairs head Nick Clegg suggested the company could not openly claim it was doing what was necessary to protect minors, raising concerns that the under-13 age restriction went “unenforced”.
Official policies bar under-13 users from the platforms, but the presentation suggested the companies were making efforts to retain “tweens”. Zuckerberg said he believed his company would eventually reach the “right place over time”, expressing regret for not improving age restrictions earlier. He cited his own family as an example, noting that his wife, Priscilla Chan, uses child-focused services, and clarified that the company is considering regulating products for younger users, dismissing the plaintiff’s attorney’s argument as “mischaracterising” internal discussions.
The plaintiffs also alleged that the company intentionally used algorithms to increase engagement among teenagers, citing emails from 2015 and 2017 in which Zuckerberg set goals to increase time spent on the platform, along with internal notes ranking teens as the company’s top priority. Zuckerberg did not deny the earlier growth-focused targets but clarified that the company has since changed how it operates, testifying, “If something is of value, people tend to use it more.”
Earlier in the trial, Instagram head Adam Mosseri also defended the platform, challenging the concept of addiction by describing heavy use as “problematic” rather than clinical addiction. Much research has found links between compulsive use and negative mental-health outcomes, but psychologists have yet to classify social media addiction as a formal diagnosis.
The first case centres on a 20-year-old woman identified as K.G.M., who alleges that compulsive use of Instagram and YouTube worsened her depression and suicidal thoughts. The plaintiffs claim the companies are aware of these harmful effects yet design their platforms to maximise engagement.
Company attorney Paul Schmidt countered that Meta’s services were not a substantial cause of the plaintiff’s mental-health struggles. The case also turns on whether platforms can be held liable for their design choices rather than for user-generated content, for which they have traditionally been shielded by legal protections.
The case comes amid growing global scrutiny of youth social media use. Australia has introduced a law banning social media accounts for those under 16, while several other countries, including the United Kingdom, Denmark, France and Spain, are considering similar restrictions. In addition, 29 U.S. state attorneys general have asked a federal court in California to order the removal of accounts belonging to children under 13 from social media platforms ahead of trial.
The case will continue for some time, with testimony planned from Meta employees and industry executives that could shed light on the measures platforms have taken to safeguard minors. Its outcome could shape liability standards, result in large financial settlements, and force platforms to change their designs as part of the broader litigation.