Established rules currently govern sites' ability to police their own platforms, but the Supreme Court may soon change that. For now, Twitter, Facebook, and other social media companies largely decide for themselves what content to allow.
"It's a moment when everything might change," said former Google attorney and program director at Stanford University's Cyber Policy Center, Daphne Keller.
On January 20, the Supreme Court is scheduled to discuss whether to hear cases alleging censorship on social media, and a case set to be heard next month concerns platforms' legal responsibility for what users post. That case challenges the scope of Section 230 of the Communications Decency Act of 1996, which shields platforms from liability for user-generated content.
To this point, the United States has largely left it up to companies to police speech online, but the Court's decisions in these cases could drastically change that practice. This could significantly impact the business models of virtually every social media company.
Compared to the U.S., other nations have been more hands-on with social media regulation. In the E.U., internet companies are required to have procedures in place to remove illicit content while maintaining transparency about how their content recommendations are generated.
Direct regulation of harmful or illicit content has gained little traction in U.S. politics, yet lawmakers have repeatedly pressed tech executives to explain why they remove content from their platforms. Republicans have decried the alleged censorship of conservative voices on social media, while Democrats have called on companies to do more to combat misinformation.
Section 230 has long stood as a powerful tool for social media companies in the courts. Countless cases against Facebook, Twitter, and YouTube have been dismissed under its protections.
"If they don't have any liability at the back end for any of the harms that are facilitated, they have basically a mandate to be as reckless as possible," University of Miami law professor Mary Anne Franks told The New York Times.
In the past, the Supreme Court has declined to hear cases from individuals blaming platforms for harm done by users' posts, including allegations that Facebook promoted extremist content. Now, however, the Court will hear a case brought by the family of an American woman killed by Islamic State extremists in Paris. The family argues that YouTube should be held liable for promoting extremist content to its users.
"Any negative ruling in this case, narrow or otherwise, is going to fundamentally change how the internet works," said Google general counsel Halimah DeLaine Prado.
On the censorship side of the issue, Texas and Florida both recently passed laws restricting platforms' ability to remove content posted by users. The laws let users sue platforms that remove posts because of the "viewpoint" the content expresses. In Florida, sites could be fined for banning the accounts of political candidates.
The laws are being challenged by industry groups representing the social media companies, and the courts have split over them. The U.S. Court of Appeals for the 11th Circuit upheld most of a federal judge's ruling blocking the Florida law, but the 5th Circuit sided against the platforms and upheld the Texas law, a disagreement that opens the door to Supreme Court review.
"I think we're, right now, in a place where the court is being positioned to make a new judgment on the internet," said an associate professor of cybersecurity law at the U.S. Naval Academy, Jeff Kosseff.