Supreme Court weighs big tech content moderation policies in hearing today over Florida, Texas social media laws

The case examines how the First Amendment applies to the modern Big 5 internet era.

Marlon Ettinger

The Supreme Court heard arguments in two cases today brought by NetChoice, an industry lobbying group representing big tech companies like TikTok, Facebook, and X, against laws passed by Florida and Texas, whose governments allege that the companies discriminate against conservative political viewpoints in their moderation decisions.

The arguments looked at laws passed by Florida and Texas in early 2021 after platforms like Facebook and Twitter booted former President Donald Trump following the Jan. 6 Capitol riot, reported the Associated Press. Those laws essentially limited how much social media companies could make independent decisions to moderate user content, reported the Huffington Post.

After the laws were passed by the Texas and Florida state legislatures, NetChoice sued Florida Attorney General Ashley Moody and Texas Attorney General Ken Paxton.

The Texas law bars social media platforms with 50 million or more active users from blocking, removing, or demonetizing content based on the views of the posters, reported SCOTUSblog, while the Florida law, dubbed the Stop Social Media Censorship Act, blocked social media companies from booting political candidates or journalistic enterprises.

Those cases, designated Moody v. NetChoice, LLC, and NetChoice, LLC v. Paxton, are basically First Amendment cases and look at whether social media companies should be allowed to make their own editorial decisions on content they host, or whether they have the duties of a “common carrier,” like the U.S. Post Office, to facilitate the exchange of any sort of speech on their platforms.

Some commentators picked up on this line of thinking, which holds that big social media companies have obligations beyond making their own private editorial decisions because of the scale of their platforms.

“In Moody v. NetChoice this morning, I thought a key exchange was between Neil Gorsuch and [Solicitor General] Elizabeth Prelogar,” Open Markets Institute Legal Director Sandeep Vaheesan said on X. “He asked whether Western Union could have escaped common carrier status and [also] obtained strong First Amendment protection by blocking certain telegrams based on content.”

“That argument didn’t work a century ago,” Vaheesan continued, “and it shouldn’t work now. Western Union provided a service generally available to all comers, much like Twitter, Facebook, and YouTube do today. Government should have the authority to enact common carrier duties on these platforms.”

Other analysts took an opposing tack—that giving social media companies those obligations would require government regulation and moderation decisions on speech, which itself would be a violation of the First Amendment.

“This [case] is critical for the First Amendment,” said Robert Corn-Revere, the chief counsel for FIRE, a free speech advocacy group, in a video posted to X. “The question is how the First Amendment treats new communications technologies.” According to Corn-Revere, the Supreme Court historically hasn’t been very protective of the First Amendment as it applies to new technologies, but that changed when the internet came around.

“Today the court is going to reexamine that question and determine whether or not the First Amendment still keeps the government from sticking its nose into private moderation business,” Corn-Revere said. “A private editorial choice on a social media platform is just that—private!”

Justices on the court grappled with that question by asking whether the laws passed by the state governments are too broad.

“It covers almost everything,” Justice Sonia Sotomayor said. “But one thing I know about the internet is that its variety is infinite.”

Justice Samuel Alito, a court conservative, picked up on claims by conservatives that content moderation was basically just a euphemism for censorship, reported Digiday reporter Marty Swant. 

“If the government’s doing it, then content moderation might be a euphemism for censorship,” argued NetChoice lawyer Paul Clement, disputing that. “If a private party is doing it, content moderation is a euphemism for editorial discretion.”

Observers noted the court seemed wary of the laws, with Justice Brett Kavanaugh stating that, “When I think of Orwellian, I think of the state, not the private sector, not private individuals.”

Alito also asked if email providers could delete accounts of people whose opinions they found objectionable.

“Does Gmail have a First Amendment right to delete, let’s say Tucker Carlson’s or Rachel Maddow’s Gmail accounts if they don’t agree with his or her viewpoints?” Alito asked.

“They might be able to do that, your honor,” Clement answered. “I mean, that’s obviously not been something that’s been the … focus of this litigation.”

The Supreme Court ruled in favor of tech companies last year in a case brought by the family of a victim of a 2017 ISIS terrorist attack, reported SCOTUSblog. The family’s lawsuit sought to hold Twitter, Facebook, and Google liable for aiding and abetting international terrorism, but the court ruled that the case couldn’t go forward. The court also declined to rule on another case that tried to hold the platforms accountable for content published by their users, liability from which they’ve traditionally been shielded under Section 230 of the Communications Decency Act of 1996. That history perhaps signals a bent in this case toward big tech in matters of content moderation.

The Daily Dot