US Supreme Court weighs online platforms’ liability in Google recommendations case

The US Supreme Court heard oral arguments Tuesday in Gonzalez v. Google, a case that could upend how social media companies handle content distribution. At the heart of the case is the question of whether tech giants like Google can face liability when their algorithms recommend ISIS recruitment videos. The case also has broader civil rights implications because, as one amicus brief filed in the case pointed out, "As society has moved online, so too have discrimination, redlining, voter suppression, and harassment."

The statute at issue before the court is 47 U.S.C. § 230(c), originally enacted as section 230 of the Communications Decency Act. The statute broadly shields online platforms from liability for content posted to a platform by its users. As Justice Elena Kagan described it, the issue before the court Tuesday concerned figuring out "how this statute applies, the statute which was a pre-algorithm statute, applies in a post-algorithm world."

In oral arguments, attorney for the plaintiffs Eric Schnapper claimed that YouTube's conduct fell outside the structure of section 230(c) and that YouTube should therefore face liability for allegedly aiding and abetting ISIS in its recruitment efforts. Section 230(c) ordinarily bars lawsuits against social media companies over content moderation and distribution decisions. Schnapper argued this case is different because YouTube's algorithm was "affirmatively recommending or suggesting ISIS content," which he distinguished from "mere inaction."

Arguing on behalf of the US, Malcolm Stewart agreed in part and disagreed in part with Schnapper. Stewart did not adopt the plaintiffs' view of section 230(c); instead, he urged the court to "distinguish carefully between liability for the content itself, [and] liability for statements about the content." Stewart conceded, however, that a company's organizational decisions may still be subject to suit outside section 230(c)'s coverage.

Lisa Blatt argued on behalf of Google and stood in opposition to Schnapper. Blatt argued that "[s]ection 230(c)(1)'s 26 words created today's internet." Under section 230(c)(1), Blatt argued, websites cannot be treated as the "publisher or speaker of any information provided by another." Following this reasoning, Blatt said that when websites like Google communicate third-party information, such as ISIS recruitment videos, and "the plaintiff's harm flows from that information," section 230(c)(1) protects Google from liability.

The justices, particularly Justice Clarence Thomas, seemed skeptical of Schnapper's arguments. They repeatedly raised concerns about the flood of litigation that could follow if the court changed its interpretation of section 230(c). Thomas asked whether YouTube used a different algorithm to recommend ISIS recruitment videos than it used to recommend cooking videos. The question captured a concern the court returned to repeatedly: if YouTube's algorithm is neutral as to the content it recommends, how is the plaintiffs' claim any different from previous claims courts have rejected?

In an amicus brief to the court, David Brody, Managing Attorney of the Digital Justice Initiative at the Lawyers' Committee for Civil Rights Under Law, emphasized that the court has the potential to shape online civil rights law. "If platforms are more concerned with liability, they are likely to turn up the dial and be even less permissive of controversial speech," Brody said. "The goal is to thread the needle." Brody argued that the way to do this is to adopt the "consensus test," a two-part test that lower courts have largely adopted to date. The test asks courts to consider: (1) does the claim against the platform seek to treat it as a publisher; and (2) if so, is the platform materially contributing to the illegality? Brody emphasized:

What's important for the court to do here is to put a little meat on the bones about what it means for a claim to treat someone as a publisher, or what it means for someone to materially contribute [to] illegality…. Because when these tests are properly applied, they don't sweep in everything under the sun and give platforms blank check immunity, but they also don't open the floodgates to litigation.

Through its ruling, the court has the opportunity to clarify the issue and protect against potential civil rights violations, a task that has grown more complicated in recent years as the internet has expanded beyond the confines originally contemplated by section 230(c).