
It’s one thing to talk about platform harm in panels,
reports, and headlines. It’s another to sit in a courtroom and watch the story of social media’s impact on young people get translated into legal arguments, evidentiary rules, and sworn
testimony.
This week, Nicki Petrossi, host of the “Scrolling 2 Death” podcast, held one of the few available seats in a Los Angeles courtroom as the first of the social media
harms cases moved forward.
She wasn’t there for the optics. She has been tracking this space for years, following the shift from research and advocacy into litigation.
As
Petrossi put it, when earlier policy efforts stalled, her focus changed: “I quickly realized, like, no matter what we do… we needed laws. We need help from our lawmakers to help protect
our children from predatory companies.”
When that route failed to move quickly enough, she described a pivot toward the courts: “When that failed -- for many different
reasons… I realized, maybe lawsuits are where it’s at. Maybe some of these big legal tactics can make a difference.”
That is the context for why being in the courtroom now
matters. This is not theoretical. This is the arena where product design choices, internal knowledge, and claims of responsibility get tested under oath. As Petrossi put it bluntly, “This is the
first time that Mark Zuckerberg was under oath in a court of law having to answer questions about his company and what it does to kids. That is groundbreaking history.”
What kinds of cases are actually moving forward
This is not a single lawsuit. It is the leading edge of a much larger wave of litigation aimed at social media companies.
Petrossi put the scale and
structure plainly: “We have thousands of families in this consolidated grouping of cases. This is not class action. So the cases are going to be tried one by one.”
What is
unfolding now includes:
Individual youth harm cases
These are cases brought by young plaintiffs who allege that platform design contributed to measurable harm, including anxiety,
body image issues, depression, and suicidal ideation. The argument is not about one viral post. It is about cumulative exposure and product design.
Coordinated state court cases (JCCP in
California)
California has consolidated many youth social media harm cases into a Judicial Council Coordinated Proceeding. The case now in trial is one of several “bellwether”
cases meant to test arguments, evidence, and legal theories that will shape how hundreds of other cases proceed.
Federal cases
Separate from the California state actions, there
are federal cases moving through the courts that raise overlapping questions about product liability, duty of care, and corporate responsibility.
Actions by attorneys general and school
districts
States and school systems are bringing their own suits, arguing that platforms have contributed to widespread harm that now shows up in public health systems, schools, and
community services.
Taken together, this is not a symbolic fight. It is an attempt to build legal pressure from multiple directions at once, rather than betting everything on one plaintiff or one venue.
What the legal strategy is actually targeting
One of the most important shifts in this litigation is what is being put on trial. The focus is not on individual pieces of
content, but on product design and corporate decision-making.
Petrossi described this directly in terms of intent and concealment: “The decisions they were making… to addict
children, to harm children, not telling the public about it.” In her framing, this was not accidental. “The intention of the company was to prey on teens… exploit them so they can
make greater profits. And that was done intentionally, not by accident.”
That framing is central to the legal strategy being advanced by teams working with the Social Media Victims Law
Center (SMVLC) and others. In practice, that strategy includes:
Moving away from “bad content” arguments
The cases are not about whether a specific video or post
was harmful. They are about whether the architecture of the platforms themselves creates foreseeable risk.
Centering design features
Infinite scroll, autoplay, algorithmic
recommendation systems, notifications, streaks, and engagement loops are being scrutinized as design choices that shape behavior. This keeps the legal focus on product liability and duty of care, not
on speech.
Establishing foreseeability and knowledge
A key question is what companies knew about the risks to young users, when they knew it, and what they chose to do with that
information.
Using early trials to set patterns
The initial cases serve as testing grounds for what evidence lands, how juries respond to arguments about design versus content,
and where companies are most legally vulnerable.
This is slow, technical work. It does not look like cultural reckoning. It looks like motions, objections, expert testimony, and document
review. But this is how accountability gets built in practice.
What being in the courtroom changes
Petrossi’s presence in the room matters because so much of this story is
flattened when it is filtered through headlines. In the room, you see how much time is spent arguing over what the jury is even allowed to hear, how platform representatives and expert witnesses lean
on neutral product language to describe systems that many families experience as harmful, how young people’s lived experiences get translated into clinical and legal categories, and how much of
the fight is not about whether harm exists at all, but about whether it can be legally attributed to design choices.
Petrossi also made clear that the most painful phase of the trial is still
ahead. As the proceedings move from plaintiff witnesses to the defense phase, the tone in the courtroom is expected to shift sharply. As she put it, “What people can expect to see is finalizing
of the plaintiff witnesses through the next week or two, and then we’re gonna transfer over to defendants calling witnesses. And then it’s gonna get ugly…”
Why this moment matters beyond this one case
We should not pretend that one trial will reform the social media industry. Even a plaintiff victory does not undo business models built around attention
and growth. But something meaningful is happening here.
For the first time at this scale, platform design is being interrogated in a setting where internal documents, testimony, and expert
analysis become part of a public record, with legal consequences. Discovery opens files that PR strategies cannot close. Testimony locks narratives into sworn statements. The story of how these
systems were built, and what their creators understood about the risks, becomes harder to rewrite after the fact.
For families and young people, this matters because it reframes harm as
something that warrants formal accountability, not just cultural debate. It does not resolve the problem. But it moves responsibility out of the realm of unintended side effects and into the realm of
foreseeable outcomes of deliberate design.
For those of us working in media, tech accountability, and youth safety, this is a reminder that change rarely arrives as a single dramatic moment.
It comes through long, often tedious pressure across courts, research, organizing, and public narrative. Petrossi being in that courtroom is one small but real piece of that broader ecosystem of
accountability.
Petrossi also put it plainly when thinking beyond this single verdict: “To me, we will win in the court of public opinion around this case regardless of the
outcome.”
You can watch a full interview with Petrossi here.