Tech Platforms Must Face Claims Over Racially Motivated Mass Shooting

Victims of a racially motivated mass shooting in Buffalo can proceed with a lawsuit alleging that YouTube, Reddit and other platforms indoctrinated the shooter by showing him posts about the “replacement theory,” a state judge has ruled.

In two decisions issued this week, Erie County Supreme Court Justice Paula Feroleto rejected the tech companies' arguments that the case should be dismissed at an early stage under both Section 230 of the Communications Decency Act -- which broadly protects companies from lawsuits over users' content -- and the First Amendment.

Instead, Feroleto said the victims could attempt to argue that the companies are liable on the theory that their products are dangerously defective.

The rulings come as hundreds of social media users and their families throughout the country are suing tech platforms over the platforms' supposedly addictive qualities. The plaintiffs in those cases also allege that tech platforms are dangerously defective -- a legal theory traditionally associated with injuries caused by physical products such as cars or pharmaceuticals. Meta, X (formerly Twitter), TikTok and Snap are among the other companies currently facing lawsuits under that theory.

Feroleto's ruling grows out of litigation over the May 2022 mass shooting at Tops Friendly Markets in Buffalo by teenage white supremacist Payton Gendron.

Victims and family members alleged in a complaint filed last year that platforms including YouTube and Reddit were “instrumental” in the shooting because they radicalized Gendron and allowed him to learn about weaponry.

“Their unreasonably dangerous and negligent design choices resulted in the shooter’s addiction to their products, and caused him to develop the mentality required to target and kill Black people who were innocently shopping at their local market,” victims alleged in a complaint filed last August. “In addition to facilitating the shooter’s radicalization, the design of these social media platforms provided the shooter with knowledge regarding the tools, products, and skills he needed to commit the mass shooting at Tops.”

The plaintiffs added that Gendron's “near-constant use of social media” led him to believe in the racist “great replacement” conspiracy -- which includes the theory that white people in the United States are being replaced by non-white immigrants.

A separate complaint, also filed last year, alleges that Gendron became a “problematic user” of social media due to “dangerously defective and unreasonably dangerous algorithms powering Instagram, YouTube, and Snapchat.”

The tech companies urged Feroleto to throw out the case at an early stage for numerous reasons.

Among others, they said the First Amendment protects their right to host content, no matter how objectionable.

“Similar cases alleging that the dissemination of harmful speech -- including television programming, movies, video games, and online content -- helped cause violent crimes have been uniformly dismissed,” Google argued in papers filed last year.

“Even ideologically odious speech like what Gendron allegedly viewed is constitutionally protected, as are videos that provide information about topics like firearms and body armor,” the company added.

The tech companies also urged the judge to dismiss the case under Section 230 of the Communications Decency Act -- a 1996 law that immunizes tech platforms from lawsuits over users' speech.

“No matter the label plaintiffs attach to their claims, they seek to do exactly what Section 230 forbids: hold internet service providers like YouTube liable for publishing allegedly objectionable or injurious content created by third parties,” Google argued.

Feroleto discounted that argument for now, writing that the plaintiffs contend the platforms “are more than just message boards containing third-party content.”

“Plaintiff alleges they are sophisticated products designed to be addictive to young users and they specifically directed Gendron to further platforms or postings that indoctrinated him with 'white replacement theory,'” she wrote.

Other judges have reached contrary results. For instance, in 2009 a federal judge in New York dismissed a lawsuit against Craigslist by a shooting victim who claimed Craigslist was negligent for facilitating a gun sale to someone with a long history of mental illness. In that case, the judge said Craigslist was protected by Section 230.

A YouTube spokesperson said the company plans to appeal Feroleto's ruling. The spokesperson added that the company “has invested in technology, teams, and policies to identify and remove extremist content,” and “will continue to work with law enforcement, other platforms, and civil society to share intelligence and best practices.”

Reddit also plans to appeal, a spokesperson said. The spokesperson added that the platform prohibits content that promotes “hate based on identity or vulnerability, as well as content that encourages, glorifies, incites, or calls for violence or physical harm against an individual or group of people,” and is “constantly evaluating ways to improve our detection and removal of this content.”