Louisiana this week urged a federal appellate court to allow officials to enforce a law requiring social platforms to verify users' ages and prohibiting platforms from allowing
minors under 16 to create or maintain accounts without parental permission.
U.S. District Court Judge John deGravelles in the Middle District of Louisiana ruled last year that
the statute violates the First Amendment, and barred enforcement against 10 members of the tech organization NetChoice -- Meta Platforms, Nextdoor, Pinterest, Reddit, Snapchat, X, YouTube, Tumblr,
Discord and Amazon's Twitch.
His ruling came in a lawsuit brought by NetChoice, which argued the law is unconstitutional.
The organization has brought similar
challenges to social media restrictions in at least eight states: Arkansas, Georgia, Florida, Mississippi, Utah, Ohio, Texas and Tennessee. District court judges have blocked most of those
measures, but state attorneys general have appealed. So far, at least two appellate courts -- the 5th Circuit Court of Appeals and 11th Circuit Court of Appeals -- have sided with state officials and
allowed restrictions in Mississippi and Florida to take effect, at least temporarily.
DeGravelles said in his ruling that Louisiana officials failed to establish that social media causes health harms to minors. He also said that even if the state had proven a causal
connection, the law would still violate the First Amendment for several reasons -- including that the law was both overinclusive and underinclusive.
"The state seeks to regulate
minors’ access to speech on social media platforms," he wrote.
"Overwhelmingly, such speech qualifies for First Amendment protection," he continued, adding that
"identical speech" on websites other than social platforms "remains easily accessible to minors."
Louisiana Attorney General Liz Murrill and the state Department of Justice are
now asking the 5th Circuit Court of Appeals to reverse that ruling, arguing that the restrictions are comparable to laws regulating minors' consumption of alcohol or cigarettes.
"Social media has deteriorated adolescent mental health," the officials write.
"Age verification and parental consent are longstanding safeguards to protect
minors in contexts involving heightened risks," the state adds. "Such measures are commonly required for activities like purchasing alcohol or tobacco, gambling, getting tattoos, and accessing
pornography."
They argue that evidence in the case showed social platforms carry potentially harmful content despite their efforts to remove such material.
"In just six months of 2023, Facebook alone saw at least 2.7 billion spam posts, 28 million posts containing antisemitic, racist, or other derogatory content, and 27 million violent or
graphic posts," the state officials wrote.
"The record ... reflects that post hoc content moderation cannot fully protect minors from being exposed to harmful content or
contacted by bad actors on large social-media platforms," the government adds.
NetChoice is expected to file a response with the 5th Circuit next month.