The Supreme Court is set to hear back-to-back oral arguments this week in two cases that could significantly reshape online speech and content moderation.
The outcome of the cases, which will be argued Tuesday and Wednesday, could determine whether tech platforms and social media companies can be sued for recommending content to their users or for aiding and abetting acts of international terrorism by hosting terrorist content. The arguments mark the Court’s first-ever review of a hot-button federal law that largely shields websites from lawsuits over user-generated content.
The closely watched cases, known as Gonzalez v. Google and Twitter v. Taamneh, carry significant stakes for the wider internet. An expansion of apps’ and websites’ legal risk for hosting or promoting content could lead to major changes at sites including Facebook, Wikipedia and YouTube.
The litigation has produced some of the most intense rhetoric in years from the tech sector about the potential impact on the internet’s future. US lawmakers, civil society groups and more than two dozen states have also jumped into the debate with filings at the Court.
At the heart of the legal battle is Section 230 of the Communications Decency Act, a nearly 30-year-old federal law that courts have repeatedly said provides broad protections to tech platforms, but that has come under growing scrutiny amid criticism of Big Tech’s content moderation decisions.
The law has critics on both sides of the aisle. Many Republican officials allege that Section 230 gives social media platforms a license to censor conservative viewpoints. Prominent Democrats, including President Joe Biden, have argued Section 230 prevents tech giants from being held accountable for spreading misinformation and hate speech.
In recent years, some in Congress have pushed for changes to Section 230 that could expose tech platforms to more liability, along with proposals to amend US antitrust rules and other bills aimed at reining in dominant tech platforms. But those efforts have largely stalled, leaving the Supreme Court as the likeliest near-term source of change in how the United States regulates digital services.
Rulings in the cases are expected by the end of June.
Gonzalez v. Google
The case involving Google zeroes in on whether the company can be sued over its subsidiary YouTube’s algorithmic promotion of terrorist videos.
According to the plaintiffs in the case — the family of Nohemi Gonzalez, who was killed in a 2015 ISIS attack in Paris — YouTube’s targeted recommendations violated a US antiterrorism law by helping to radicalize viewers and promote ISIS’s worldview.
The plaintiffs’ theory seeks to carve content recommendations out of Section 230’s protections, potentially exposing tech platforms to more liability for how they run their services.
Google and other tech companies have said that that interpretation of Section 230 would increase the legal risks associated with ranking, sorting and curating online content, a basic feature of the modern internet. Google has claimed that in such a scenario, websites would seek to play it safe either by removing far more content than is necessary or by giving up on content moderation altogether and allowing even more harmful material onto their platforms.
Friend-of-the-court filings by Craigslist, Microsoft, Yelp and others have suggested that the stakes are not limited to algorithms and could end up affecting virtually anything on the web that might be construed as making a recommendation. Even average internet users who volunteer as moderators on various sites could face legal risks, according to a filing by Reddit and several volunteer Reddit moderators.

Oregon Democratic Sen. Ron Wyden and former California Republican Rep. Chris Cox, the original co-authors of Section 230, argued to the Court that Congress’ intent in passing the law was to give websites broad discretion to moderate content as they saw fit.
The Biden administration has also weighed in on the case. In a brief filed in December, it argued that Section 230 does protect Google and YouTube from lawsuits “for failing to remove third-party content, including the content it has recommended.” But, the government’s brief argued, those protections do not extend to Google’s algorithms because they represent the company’s own speech, not that of others.
Twitter v. Taamneh
The second case, Twitter v. Taamneh, will decide whether social media companies can be sued for aiding and abetting a specific act of international terrorism when the platforms have hosted user content that expresses general support for the group behind the violence without referring to the specific terrorist act in question.
The plaintiffs in the case — the family of Nawras Alassaf, who was killed in an ISIS attack in Istanbul in 2017 — have alleged that social media companies including Twitter had knowingly aided ISIS in violation of a US antiterrorism law by allowing some of the group’s content to persist on their platforms despite policies intended to limit that type of content.
Twitter has argued that the mere fact that ISIS used the company’s platform to promote itself does not amount to Twitter’s “knowing” assistance to the terrorist group, and that in any case the company cannot be held liable under the antiterrorism law because the content at issue was not specific to the attack that killed Alassaf. The Biden administration, in its brief, agreed with that view.
Twitter had also previously argued that it was immune from the suit thanks to Section 230.
Other tech platforms, including Meta and Google, have argued in the case that if the Court finds the companies cannot be sued under US antiterrorism law, at least in these circumstances, it could sidestep the Section 230 debate in both cases, because the claims at issue would be dismissed.
In recent years, however, several Supreme Court justices have shown an active interest in Section 230 and have appeared to invite opportunities to hear cases related to the law. Last year, Justices Samuel Alito, Clarence Thomas and Neil Gorsuch wrote that new state laws, such as a Texas law that would force social media platforms to host content they would rather remove, raise questions of “great importance” about “the power of dominant social media corporations to shape public discussion of the important issues of the day.”
A number of petitions are currently pending asking the Court to review the Texas law and a similar law passed by Florida. The Court last month delayed a decision on whether to hear those cases, asking instead for the Biden administration to submit its views.