Supreme Court Wrangles with Lawsuit Shield for Social Media

In its first case about the federal law that helped create the modern internet, the Supreme Court of the United States (SCOTUS) seemed unlikely to side with a family seeking to hold Google liable for the death of their daughter in a terrorist attack.

At the same time, the justices signaled in arguments that lasted two and a half hours that they are skeptical of Google’s claim that Section 230 of the Communications Decency Act, part of a 1996 law, gives Facebook, Twitter, and other social media companies far-reaching immunity from lawsuits over their targeted recommendations of videos, documents, and other content.

The case highlights the growing tension between technology policy devised a generation ago and the reach of today’s social media, where billions of posts appear daily.

“We really don’t know about these things. You know, these are not like the nine greatest experts on the internet,” said Justice Elena Kagan of her colleagues and herself.

Kagan suggested that Congress, not the court, should make any changes to the law, which was passed early in the internet age.

Justice Brett Kavanaugh, one of the court’s six conservatives, appeared to agree with Kagan, one of its liberals, in a case that cut across the usual ideological lines.

“Isn’t it better,” asked Kavanaugh, to leave things the way they currently are and “put the burden on Congress to change that?”

The case stems from an American college student’s death in the 2015 Paris terrorist attacks

The case before the court stems from American college student Nohemi Gonzalez’s death in the 2015 Paris terrorist attacks.

Family members were in the courtroom to hear arguments over whether they can sue YouTube, which Google owns, for helping the Islamic State group attract new recruits and spread its message in violation of the Anti-Terrorism Act. Lower courts sided with Google.

Justices used various examples to probe what YouTube does when its algorithms recommend videos to viewers, whether those videos are about cats or terrorism. Chief Justice John Roberts suggested that YouTube isn’t “pitching something in particular to the person who’s made the request” but is merely offering a “21st-century version” of what has been taking place for a long time.

Justice Clarence Thomas asked whether YouTube uses the same algorithm to recommend rice pilaf recipes and terrorist content. He was told that it does.

The court will also hear arguments in a related case stemming from a 2017 terrorist attack at an Istanbul nightclub that killed 39 people and prompted a lawsuit against Google, Twitter, and Facebook.

Separate challenges to Republican-backed social media laws in Texas and Florida are also pending before the high court, but they would not be argued before fall or decided until the first half of 2024.