Social media companies are preparing for Supreme Court arguments on Monday that could fundamentally alter the way they police their sites.
After Facebook, Twitter and YouTube banned President Donald J. Trump in the wake of the January 6, 2021, riot at the U.S. Capitol, Florida made it illegal for tech companies to ban a candidate for office in the state from their sites. Texas later passed its own law prohibiting platforms from removing political content.
Two tech industry groups, NetChoice and the Computer & Communications Industry Association, sued to block the laws from taking effect. They argued that companies have the right to make decisions about their own platforms under the First Amendment, in the same way that a newspaper decides what is published in its pages.
So what is at stake?
The Supreme Court's decision in those cases (Moody v. NetChoice and NetChoice v. Paxton) is a major test of the power of social media companies, potentially reshaping millions of social media feeds by giving the government influence over how and what stays online.
“What's at stake is whether they can be forced to carry content they don't want to,” said Daphne Keller, a Stanford Law School professor who filed a brief with the Supreme Court supporting the tech groups' challenge to the Texas and Florida laws. “And, perhaps more to the point, whether the government can compel them to.”
If the Supreme Court says the Texas and Florida laws are constitutional and lets them take effect, some legal experts speculate that the companies could create versions of their feeds specifically for those states. But such a ruling could prompt similar laws in other states, and precisely restricting access to a website based on location is technically complicated.
Critics of the laws say streams to the two states could include extremist content (from neo-Nazis, for example) that the platforms would previously have removed for violating their standards. Or, critics say, platforms could prohibit discussion of any remotely political topic by banning posts on many controversial topics.
What are the social media laws in Florida and Texas?
The Texas law prohibits social media platforms from removing content based on the “viewpoint” of the user or the viewpoint expressed in the post. It gives individuals and the state attorney general the right to file lawsuits against platforms for violations.
The Florida law fines platforms that permanently bar a candidate for office in the state from their sites. It also prohibits platforms from removing content from a “journalistic enterprise” and requires companies to be upfront about their rules for moderating content.
Supporters of the Texas and Florida laws, which passed in 2021, say they will protect conservatives from the liberal bias they say permeates California-based platforms.
“People around the world use Facebook, YouTube and X (the social media platform formerly known as Twitter) to communicate with friends, family, politicians, journalists and the public at large,” Ken Paxton, the Texas attorney general, said in a legal brief. “And like the telegraph companies of yore, today's social media giants use their control over the mechanics of this 'modern public square' to direct, and often stifle, public discourse.”
Chase Sizemore, a spokesman for Florida's attorney general, said the state was “looking forward to defending our social media law that protects Floridians.” A spokeswoman for the Texas attorney general had no comment.
What are the current rights of social media platforms?
Right now, they decide what stays online and what doesn't.
Companies like Meta's Facebook and Instagram, TikTok, Snap, YouTube and X have long policed themselves, setting their own rules about what users can say, while the government has taken a hands-off stance.
In 1997, the Supreme Court ruled that a law regulating indecent speech online was unconstitutional, differentiating the Internet from media where the government regulates content. The government, for example, enforces standards of decency on radio and television.
For years, bad actors have flooded social media with misleading information, hate speech and harassment, prompting the companies to adopt new rules over the past decade, including bans on false information about elections and the pandemic. Platforms have barred figures such as the influencer Andrew Tate for violating their rules, including those against hate speech.
But there has been a right-wing backlash to these measures, with some conservatives accusing the platforms of censoring their views. It even prompted Elon Musk to say he wanted to buy Twitter in 2022 to help ensure free speech for users.
Thanks to a law known as Section 230 of the Communications Decency Act, social media platforms are not responsible for most content posted on their sites. Therefore, they face little legal pressure to remove problematic posts and users who violate their rules.
What are social media platforms discussing?
The tech groups say the First Amendment gives companies the right to remove content as they see fit because it protects their ability to make editorial decisions about the content of their products.
In their lawsuit against the Texas law, the groups said that, like a magazine's publishing decisions, “a platform's decision about what content to host and what to exclude is intended to convey a message about the type of community the platform hopes to foster.”
Still, some legal scholars are concerned about the implications of allowing social media companies unlimited power under the First Amendment, which is intended to protect free speech and freedom of the press.
“I worry about a world in which these companies invoke the First Amendment to protect what many of us believe are commercial activities and conduct that are not expressive,” said Olivier Sylvain, a professor at Fordham Law School who until recently was a senior adviser to Lina Khan, the chair of the Federal Trade Commission.
What comes next?
The court will hear arguments from both sides on Monday. A decision is expected in June.
Legal experts say the court could rule that the laws are unconstitutional but provide a road map for how to fix them. Or it could uphold the companies' First Amendment rights entirely.
Carl Szabo, the general counsel of NetChoice, which represents companies like Google and Meta and lobbies against tech regulations, said that if the group's challenge to the laws fails, “Americans across the country would be forced to see lawful but awful content” that could be construed as political and therefore covered by the laws.
“There is a lot of stuff that gets couched as political content,” he said. “Terrorist recruitment is arguably political content.”
But if the Supreme Court rules that the laws violate the Constitution, it will cement the status quo: The platforms, and no one else, will determine what speech stays online.
Adam Liptak contributed reporting.