Meta Warns It May Exit New Mexico

Meta Platforms is threatening to stop offering Facebook, Instagram and WhatsApp in New Mexico if state Attorney General Raúl Torrez convinces a judge to issue an injunction that would force the company to significantly revise its apps.

Among other proposals, Torrez wants a judge to order Meta to deploy "effective" age verification, restrict encryption for minors, post prominent warning labels and revise its recommendation algorithms.

Meta said in court papers filed this week that many of Torrez's proposed terms "are so hopelessly vague or ambiguous" that enforcing them would violate its right to due process.

The company added: "Many requests are technologically or practically infeasible and would essentially force Meta to build entirely separate apps for use only in New Mexico. Therefore, granting this onerous relief could compel Meta to entirely withdraw Facebook, Instagram and WhatsApp from the state as the only feasible means of compliance."

Meta also said many of Torrez's requests would violate "basic principles" of Section 230 of the Communications Decency Act (which protects companies from liability over content posted by users) and the First Amendment.

Torrez on Thursday called Meta's threat to leave the state a "PR stunt."

"We know Meta has the ability to make these changes," Torrez stated. "This is not about technological capability. Meta simply refuses to place the safety of children ahead of engagement, advertising revenue, and profit."

The arguments come in a lawsuit brought by Torrez in 2023, when he alleged that Meta "knowingly exposes children to the twin dangers of sexual exploitation and mental health harm."

His complaint included allegations that Facebook and Instagram "are a breeding ground for predators who target children for human trafficking, the distribution of sexual images, grooming, and solicitation."

He added that Meta allows adults to groom underage users by giving adults "unfettered access" to children.

The complaint also alleged that Meta deployed design features such as automatically playing videos even though the company "knew that design features fostered addiction, anxiety, depression, self-harm, and suicide among teens and preteens."

In March, a Santa Fe jury found Meta liable for violating a state consumer protection law, and ordered the company to pay $375 million in fines. Meta has said it plans to appeal the jury verdict.

The presiding judge is scheduled to commence a hearing Monday on matters including whether to declare Meta a "public nuisance," and whether to issue an injunction that would force the company to revise its apps.

The proceeding in New Mexico is just one of numerous cases under way against social platforms. Meta, YouTube, TikTok and other platforms currently face complaints in federal and state courts throughout the country over allegations that they addict young users and then serve them with harmful content. 

The tech companies typically argue they're protected by Section 230 of the Communications Decency Act as well as the First Amendment. But the plaintiffs counter that many of their claims focus on design features such as algorithmic recommendations and automatically playing videos, not the content itself.

The Supreme Court hasn't yet ruled on whether Section 230 protects publishers' choices about recommendations to users and other design features.
