How much can states regulate social media? The Supreme Court hears cases for and against

Can states limit how social media companies moderate content? That’s once again up for consideration before the U.S. Supreme Court.

On Monday, the nation’s high court spent hours hearing oral arguments over state laws in Florida and Texas, both passed in 2021 after former President Donald Trump was kicked off major social platforms in the aftermath of the Capitol attack on Jan. 6. Both laws were challenged by NetChoice — a tech industry group representing giants including Google, Meta, TikTok and Snap. Lawyers for NetChoice say the state laws violate companies’ First Amendment rights by forcing online platforms to carry content that violates their policies. However, Florida and Texas say their laws protect free speech.

Although the two laws overlap, there are key differences. Florida’s law protects political candidates from being permanently banned by online platforms. In Texas, social media companies with more than 50 million monthly users are barred from banning users over political posts. Texas’s law also exempts certain websites, in particular those focused on news, sports and entertainment.

The cases come just a year after the Supreme Court heard arguments in two other free-speech cases. However, arguments in February 2023 addressed whether Google and Twitter should be liable for harmful content — specifically terrorist content. The Supreme Court unanimously sided with the tech companies, but chose not to address the scope of Section 230 of the Communications Decency Act, which also came up frequently during Monday’s arguments over the state laws.

The case against Texas and Florida

Representing the tech companies was former U.S. Solicitor General Paul Clement, who gave NetChoice’s oral arguments in both the Florida-related Moody v. NetChoice and Texas-related NetChoice v. Paxton. He said platforms’ content moderation efforts represent “editorial discretion in order to make it less offensive to users and advertisers.” When Justice Samuel Alito asked if content moderation is “anything more than a euphemism for censorship,” Clement said only “if the government is doing it,” but otherwise it’s just editorial discretion.

“Given the vast amount of material on the internet in general, and on these websites in particular, exercising editorial discretion is absolutely necessary to make the websites useful for users and advertisers,” Clement said. “And the closer you look at Florida’s law, the more problematic the First Amendment problems become.”

Beyond major social networks like Facebook and YouTube, justices asked about a range of companies — including Etsy, LinkedIn and Dropbox — and how they might be impacted by the state laws. Would Florida’s law apply to Uber drivers discriminating on the basis of who they’ll pick up? Would Venmo be forced to let people transact on the platform even if the company objected to their viewpoints? Would Google have the right to ban political personalities from using Gmail?

Both cases also prompted dozens of companies, think tanks, politicians and individuals to submit briefs to the court, including companies like Yelp, Reddit and Discord. Others cited new laws related to data privacy and limits on ads targeting minors. In a submission on behalf of Etsy and eBay, the Marketplace Industry Association noted that marketing partners and investors “are wary of potential content” and that “uncertain legal environments” could deter future investments.

“Laws like [Florida’s and Texas’s] would require social media companies to disseminate viewpoints whether or not companies support those views — even if those views conflict with their established norms and alienate users and advertisers,” according to a brief filed by the Interactive Advertising Bureau.

U.S. Solicitor General Elizabeth Prelogar was also in court on the side of NetChoice. In her remarks on Monday, Prelogar noted that online platforms aren’t the same as telephone wires or mail trucks. She added that the First Amendment protects entities that “curate, arrange and present other people’s words and images in expressive compilations.”

“There is a big difference between a pure conduit, the kind of company that is quite literally engaged in carrying speech, transmitting it,” Prelogar said. “… [Platforms are] not just literally facilitating users’ ability to communicate with other users. Instead, they’re taking that and arranging it and excluding it.”

High court concerns

At Monday’s hearing, Supreme Court members wondered whether the laws might lead to unintended consequences for free speech more broadly. While Justice Sonia Sotomayor noted Florida’s law seems “so broad” and “almost covers everything,” Justice Ketanji Brown Jackson asked if there’s a need to “drill down more in order to really figure out whether or not things are protected.” Others, including Justice Amy Coney Barrett, also worried that muddling the case’s focus on First Amendment issues with debates about Section 230 protections could lead to potential “land mines.”

“If what we say about this is that this is speech that’s entitled to First Amendment protections, I do think then that has Section 230 implications for another case,” Barrett said. “And so it’s always tricky to write an opinion when you know there might be landmines that would affect things later.”

Beyond addressing content moderation, the justices also asked about content curation. Justice Elena Kagan mentioned the example of Twitter rebranding as X and changing the content people saw in their timeline.

“All of a sudden they were getting a different online newspaper, so to speak, in a metaphorical sense every morning,” Kagan said. “And a lot of Twitter users thought that was great, and a lot of Twitter users thought that was horrible because, in fact, there were different content judgments being made that was very much affecting the speech environment that they entered every time they opened their app.” 

Advertising, algorithms and other questions

Although advertising wasn’t the main topic of Monday’s oral arguments, it still came up within the broader cases. One submission noted YouTube lost millions in ad revenue in 2017 when companies removed their ads “after seeing them distributed next to videos containing extremist content and hate speech.” Others cited past cases addressing the role of advertising, including a 1985 case that determined requiring disclosures from advertisers doesn’t violate their First Amendment rights.

“Internet platforms today control the way millions of Americans communicate with each other and with the world,” Henry Whitaker, Florida’s solicitor general, said during Monday’s hearing while arguing in favor of the state’s law. “The platforms achieved that success by marketing themselves as neutral forums for free speech. Now that they host the communications of billions of users, they sing a very different tune.”

Supreme Court members also addressed the role of algorithms in the context of content moderation. Justice Neil Gorsuch asked Clement whether platforms could use algorithms to sway teens toward mental health problems. Others including Justice Clarence Thomas wondered whether non-human means of moderating content still counted as free speech. Giving an example of a deep-learning algorithm that “teaches itself and has very little human intervention,” Thomas asked: “Who’s speaking then, the algorithm or the person?”

According to Texas Solicitor General Aaron Nielson — who argued in favor of the state — the court’s record doesn’t have enough evidence to know what platforms’ algorithms are doing. He said an algorithm “could be expressive,” but it might also be made neutral to reflect user choice.

A factor in both cases is whether social media companies are more like newspapers, which are allowed to make their own editorial decisions, or more like “common carriers” such as phone providers or post offices, which are required to give everyone equal access regardless of what they say. NetChoice argued that social platforms shouldn’t be in the same bucket as phone companies because they do more than just carry content.

Lawyers defending the state laws said companies like Google and Meta have massive market power and should be treated as public squares — something Supreme Court members including Chief Justice John Roberts also raised. Others, including Justice Samuel Alito, noted services like Gmail might be more like common carriers since they don’t curate messages.

The outlook

Neither state law has yet taken effect, and NetChoice won preliminary injunctions against both. At most, the court could uphold those injunctions. It might also decide to send the cases back to the lower courts to further develop the factual record. Experts also note the outcome won’t stop states from pursuing new laws regulating the internet.

During an online discussion after the hearing, Lawrence G. Walters, general counsel of the Woodhull Freedom Foundation, said leaving the state laws in place could lead to a “hodgepodge of inconsistent, burdensome regulations” that could be damaging for companies and users. At the same forum, ACLU privacy and tech attorney Vera Eidelman said it’s a choice between moderation by private companies and censorship by the government. “There are ways in which both are unappealing,” she said. “But the government option is surely worse and it’s also unconstitutional.”

Texas and Florida have “starkly outlined their vision for the internet,” according to Jess Miers, senior counsel at the tech-affiliated Chamber of Progress. 

“They are looking for a space that is dominated by the loudest voices, fostering content that endangers marginalized groups and younger audiences,” Miers, a professor at Santa Clara University, told Digiday. “They’ve sent a clear message to their constituents that they’d prefer to see these platforms leave their states entirely rather than just allow them the freedom of editorial discretion.”
