“What I thought was that their app was safe for children, because that’s what it says in the app store,” one California parent said of popular social media platforms. “They’re children using these products, so there should be safeguards in place to protect them.”
That concern has now become the centerpiece of a landmark legal battle in California courts, where families are suing major social media companies over claims that their platforms were deliberately designed to hook children and teens—contributing to addiction, depression, anxiety, and other harms.

President Donald J. Trump welcomes Facebook CEO Mark Zuckerberg to the Oval Office of the White House, Thursday, Sept. 19, 2019. (Official White House Photo by Joyce N. Boghosian)
Judges in California have allowed the lawsuits to proceed, making them among the most closely watched cases nationwide. Plaintiffs argue that companies such as Meta (Instagram and Facebook), Google’s YouTube, TikTok, and Snapchat knowingly engineered features—including infinite scroll, autoplay, personalized recommendations, and frequent notifications—to keep young users engaged, despite knowing the risks.
For decades, social media platforms have invoked Section 230 of the Communications Decency Act, a federal law that shields tech companies from liability for content posted by third-party users. But California judges are now signaling that claims focused on how platforms are designed, rather than solely on the speech they host, deserve to be heard by juries instead of being dismissed on procedural grounds.
Late last year, Los Angeles Superior Court Judge Carolyn Kuhl ruled that at least one case filed by a California plaintiff known as K.G.M., along with related lawsuits, can move forward. The plaintiff—now 19—contends that more than a decade of compulsive social media use left her addicted and caused lasting mental health effects. The lawsuit seeks unspecified monetary damages and asserts that the companies intentionally designed products that harm young users.
As the cases advanced toward trial, some defendants opted to settle rather than face a jury. TikTok reached a settlement, the terms of which have not been disclosed, just hours before jury selection was set to begin, according to people familiar with the negotiations. Snapchat's parent company, Snap Inc., reached its own undisclosed settlement last week.
The trial—which begins with jury selection Tuesday in Los Angeles Superior Court—is considered a bellwether for hundreds of similar lawsuits pending in California and federal courts. A verdict in favor of the plaintiff could require companies to pay damages or alter their platform designs and could put pressure on other social media firms to settle thousands of related cases nationwide.
Internal company documents cited in the lawsuits have further intensified scrutiny. In one internal memo, a Meta researcher expressed alarm about Instagram’s effects on users:
“Oh my gosh yall IG is a drug,” the user experience specialist allegedly wrote to a colleague, referring to the social media platform Instagram. “We’re basically pushers… We are causing Reward Deficit Disorder bc people are binging on IG so much they can’t feel reward anymore.”
The researcher concluded that users’ addiction was both “biological and psychological,” adding that company leadership understood and exploited the dynamic. “The top down directives drive it all towards making sure people keep coming back for more.”
The lawsuits have drawn parallels to past product liability cases—including litigation against tobacco companies—with plaintiffs arguing that design choices, not just content posted by users, are at the root of the harm alleged.
Meta’s Response and Safety Measures
Meta, which owns Instagram and Facebook, has denied claims that its platforms are purposefully harmful. In a January blog post titled "Meta's Record of Protecting Teens and Supporting Parents," the company defended its efforts to keep young users safe, saying it has listened to parents and researchers and made "concrete changes" to protect teen users over the past decade. Meta said recent litigation "oversimplifies" the complex factors behind youth mental health and emphasized its ongoing work on safety tools and controls.
Meta pointed to features such as teen accounts with built-in protections, parental supervision tools, content filters, and other safeguards intended to reduce excessive use and help parents manage their children’s experiences online. The company maintains that youth mental health is influenced by many factors beyond social media, including school, family environments, and offline stressors.
Still, plaintiffs argue that internal research and product decisions suggest engagement and growth repeatedly took priority over user well-being—a contention underscored by the internal memo quoted above.
For families involved in the lawsuits, the moment is significant regardless of the eventual verdicts.
“This is the first time families have ever had their right to a day in court,” said Matthew Bergman, an attorney with the Social Media Victims Law Center who represents plaintiffs in the cases. “This is a historic point.”
Whether juries ultimately hold social media companies liable remains uncertain. But as courtroom testimony unfolds and internal records become public, the trials are already reshaping the legal and public conversation about youth, technology, and accountability—challenging long-standing protections that have shielded Big Tech for decades.
This is a developing story, and SW Newsmagazine will continue to provide updates as new details emerge.