Meta and Google Ordered to Pay Millions in Damages in Social Media Addiction and Explicit Content Cases

29 March, 2026

Written by: Asaf Tenenbaum

Is the US Moving Away from Traditional Section 230 Protections Toward a European-Style DSA Liability Model?

On March 26, 2026, a jury in P.F., et al. (K.G.M.) v. Meta Platforms, Inc., et al. found Meta and Google liable and ordered them to pay 6 million USD over claims that their platforms’ addictive design features caused harm. The plaintiff, a woman known as Kaley or KGM, now 20, sued the two tech conglomerates, arguing that excessive social media use on the defendants’ platforms, beginning at age 10, caused her to develop dysmorphia and to suffer anxiety, depression and suicidal ideation.

That verdict follows closely on the heels of a related jury decision just two days earlier (March 24), in New Mexico v. Meta Platforms, Inc., which found Meta liable and ordered it to pay penalties and compensation totaling 375 million USD for the way its platforms endangered children, exposing them to sexually explicit material and to contact with sexual predators, while misleading consumers into believing the platforms were safe for children.

These two cases are considered bellwethers for thousands of similar lawsuits currently pending in US courts. They mark a departure from previous litigation against major social media services, which have for years enjoyed broad protection under Section 230 of the Communications Decency Act (1996) against liability for third-party user content. These new lawsuits advanced a different theory: rather than challenging specific protected third-party content featured on the platforms, plaintiffs pointed to platform product design (e.g. infinite scroll, recommendation algorithms and push notifications) and other non-speech issues, such as internal decisions about content moderation and curation, as the cause of harm.

These decisions may indicate a shift in the US online liability framework toward the EU’s Digital Services Act (DSA) model, which expects Very Large Online Platforms (VLOPs) to take a proactive approach to mitigating harm to users, including by identifying and reducing systemic risks linked to how their services operate, such as disinformation, illegal content, and cyber violence.
