
DOJ Undermines Google in Supreme Court Case Over Who's Responsible for Social Media Posts

People walk past a billboard advertisement for YouTube on September 27, 2019 in Berlin, Germany. (Sean Gallup | Getty Images)
  • The Department of Justice warned the Supreme Court against an overly broad interpretation of a law shielding social media companies from liability for what users post on their sites in a case involving Google.
  • The position undermines Google's defense in a case that could reshape the role of content moderation on digital platforms.
  • The case, Gonzalez v. Google, was brought by family members of a U.S. citizen who was killed in a 2015 terrorist attack for which ISIS claimed responsibility. The suit alleges Google's YouTube failed to adequately stop ISIS from distributing content on the site.

The Department of Justice warned the Supreme Court against an overly broad interpretation of a law shielding social media companies from liability for what users post on their platforms, a position that undermines Google's defense in a case that could reshape the role of content moderation on digital platforms.

In a brief filed Wednesday led by DOJ Acting Solicitor General Brian Fletcher, the agency said the Supreme Court should vacate an appeals court ruling that found Section 230 of the Communications Decency Act protected Google from being liable under U.S. antiterrorism law.

Section 230 allows online platforms to engage in good-faith content moderation while shielding them from being held responsible for their users' posts. Tech platforms argue it's a critical protection, especially for smaller companies that could otherwise face costly legal battles, since the nature of social media makes it difficult to quickly catch every harmful post.

The law has become a hot-button issue in Congress, with lawmakers on both sides of the aisle arguing that the liability shield should be drastically limited. But while many Republicans believe the content moderation allowances of the law should be trimmed down to reduce what they allege is censorship of conservative voices, many Democrats instead take issue with how the law can protect platforms that host misinformation and hate speech.

The Supreme Court case known as Gonzalez v. Google was brought by family members of American citizen Nohemi Gonzalez, who was killed in a 2015 terrorist attack for which ISIS claimed responsibility. The suit alleges Google's YouTube did not adequately stop ISIS from distributing content on the video-sharing site to aid its propaganda and recruitment efforts.

The plaintiffs brought claims against Google under the Antiterrorism Act of 1990, which allows U.S. nationals injured by terrorism to seek damages. The law was updated in 2016 to extend secondary civil liability to "any person who aids and abets, by knowingly providing substantial assistance" to "an act of international terrorism."

Gonzalez's family claims YouTube did not do enough to prevent ISIS from using its platform to spread its message. They allege that even though YouTube has policies against terrorist content, it failed to adequately monitor the platform or block ISIS from using it.

Both the district and appeals courts agreed that Section 230 protects Google from liability for hosting the content.

Though it did not take a position on whether Google should ultimately be found liable, the DOJ recommended the appeals court ruling be vacated and the case returned to the lower court for further review. The agency argued that while Section 230 would bar the plaintiffs' claims based on YouTube's alleged failure to block ISIS videos from its site, "the statute does not bar claims based on YouTube's alleged targeted recommendations of ISIS content."

The DOJ argued the appeals court was correct to find that Section 230 shielded YouTube from liability for allowing ISIS-affiliated users to post videos, since YouTube did not act as a publisher by editing or creating the videos. But, it said, the claims about "YouTube's use of algorithms and related features to recommend ISIS content require a different analysis." The DOJ said the appeals court did not adequately consider whether the plaintiffs' claims could merit liability under that theory, and as a result the Supreme Court should return the case to the appeals court so it can do so.

"Through the years, YouTube has invested in technology, teams, and policies to identify and remove extremist content," Google spokesperson José Castañeda said in a statement. "We regularly work with law enforcement, other platforms, and civil society to share intelligence and best practices. Undercutting Section 230 would make it harder, not easier, to combat harmful content — making the internet less safe and less helpful for all of us."

Chamber of Progress, an industry group that counts Google as one of its corporate partners, warned that the DOJ's position could set a dangerous precedent.

"The Solicitor General's stance would hinder platforms' ability to recommend facts over lies, help over harm, and empathy over hate," Chamber of Progress CEO Adam Kovacevich said in a statement. "If the Supreme Court rules for Gonzalez, platforms wouldn't be able to recommend help for those considering self-harm, reproductive health information for women considering abortions, and accurate election information for people who want to vote. This would unleash a flood of lawsuits from trolls and haters unhappy about the platforms' efforts to create safe, healthy online communities."

