‘A disastrous outcome’: Landmark Supreme Court case could silence internet, experts warn

  • The Supreme Court is hearing a case about Section 230, the law that has shielded internet platforms from liability, and experts warn the ruling could completely change the internet.
  • After the 2015 terrorist attack in Paris that killed 130 people, several of the victims’ relatives sued Google, alleging that its algorithms contributed to the radicalization of the terrorists responsible.
  • “If the court rules in Gonzalez’s favor, you’re going to face this really weird legal situation where technically if you do any kind of moderation, that means you know there’s bad content on your platform and therefore could be held responsible for it,” one specialist told the DCNF.

A Supreme Court case challenging Section 230, which has broadly shielded tech companies from legal action, could lead to “disastrous” changes in how companies recommend content to their users, according to some experts.

In 2015, 12 gunmen killed 130 people in Paris in a coordinated terror attack, and several of the victims’ relatives filed a lawsuit against Google alleging that the company’s algorithms created a rabbit hole that helped radicalize the terrorists, according to Lawfare. If the Supreme Court rules in favor of Reynaldo Gonzalez, one of the victims’ family members, experts told the Daily Caller News Foundation, there would be far less speech online and the internet as we traditionally understand it would change completely.

Chris Marchese, counsel for NetChoice, a trade association that advocates for free expression online and counts Amazon, TikTok and Lyft among its members, and which is facing similar cases in Texas and Florida, told the DCNF that “the implications of this case for the internet are really serious.” (RELATED: ‘Take aim’: Adam Schiff threatens big tech unless it censors more content)

“First and foremost, I think everyone agrees that the events underlying Gonzalez are horrific, that innocent American lives were lost, and that international terrorism is to blame,” Marchese said. “If the Supreme Court were to agree with the Department of Justice, as well as with Gonzalez, and hold that Section 230 does not apply to algorithms [then] … [y]ou really can’t have a website of any kind without some kind of algorithm that organizes the content.”

Marchese explained that social media and even the Google search bar would change radically, since the platforms are largely built on algorithms. Without the ability to suggest content, technology platforms would be unable to perform basic functions such as displaying trending topics on Twitter or creating playlists for music and videos.

Section 230 was enacted by Congress in 1996 to allow individuals to use a platform and express First Amendment-protected speech without companies being held liable for speech that could be hateful, violent or dangerous, according to the Bipartisan Policy Center. Section 230(c)(1), a short provision of just 26 words, has been called the “words that created the internet.”

“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider,” the section states.

Rachel Bovard, senior policy director at the Conservative Partnership Institute, told the DCNF that “Gonzalez’s case is unique because of the terrorism aspect.”

“This case examines whether algorithmic content amplification is still protected under Section 230,” Bovard said. “This case is trying to say, when it comes to this case involving terrorist content, that the action that YouTube/Google took to act on that content by promoting it is not protected, so they are in fact liable.”

Gonzalez claims that before the Paris attack, Google promoted content through its algorithms and “knowingly” allowed terrorist groups like ISIS to use the platform to recruit and “incite violence,” in violation of the Anti-Terrorism Act, which gave the government special oversight of terrorism crimes, according to the ACLU. As a result, Gonzalez argues, Google falls outside Section 230’s protection and is partially responsible for the Paris terror attack.

Marchese said there would be “a lot less speech on the internet” if Section 230 were “vacated” by the Supreme Court. Adam Candeub, director of the Intellectual Property, Information and Communications Law program at Michigan State University and a former telecommunications official at the Department of Commerce under the Trump administration, echoed similar sentiments but said it is the platforms that would have to be “much more careful in what they say.”

City workers clean the sidewalk and street in front of the Bataclan concert hall in Paris on December 22, 2015, after the sidewalk in front of the venue became accessible to pedestrians again. (FRANCOIS GUILLOT/AFP via Getty Images)

A potentially bigger concern is whether a ruling in favor of Gonzalez would prevent tech companies from freely removing content that is harmful and violent, such as child abuse material. Eric Schnapper, a law professor at the University of Washington who is representing Gonzalez, argued that this is a “separate issue.”

“We are in favor of social media sites being able to remove content that is offensive or dangerous,” Schnapper told the DCNF. “Our concern is that they are suggesting things and promoting things that are dangerous.”

Candeub appeared to agree with Schnapper, explaining that Section 230(c)(2) addresses such concerns. Section (c)(2) protects any “provider” or “user of an interactive computer service” from repercussions for restricting or removing content that is “obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable,” according to the Bipartisan Policy Center.

“Platforms prefer section (c)(1) because the immunity is absolute; under (c)(2), however, they can only get it if they show good faith and that the content they take down fits that category, so they have a lot more to prove,” said Candeub. “What [Google] will argue is that (c)(1) covers everything, and therefore if (c)(1) is cut then they will not be able to protect children, and that is not true. If (c)(1) is cut, they will have to rely on (c)(2), and they don’t like (c)(2) as much.”

Marchese disagreed, arguing that limiting the scope of Section 230(c)(1) would have a “disastrous effect.”

“If the court rules in Gonzalez’s favor, you’re going to be faced with this really weird legal situation where technically if you do any kind of moderation, that means you know there’s bad content on your platform and therefore could be held responsible for it,” Marchese pointed out. “That’s exactly what Section 230 was meant to do away with.”

The Supreme Court is scheduled to hear Gonzalez v. Google LLC on February 21. Candeub said he would be “surprised if the court rules in favor of Gonzalez.”

“There just isn’t enough evidence in this case for the Supreme Court to say anything useful,” Candeub noted. “We don’t know how these algorithms work. You can’t just wave your hands and expect immunity; Google can’t do that. The question is whether it is a speaker: does it transmit its own message through its algorithms?”

Google LLC counsel Lisa Blatt did not respond to The Daily Caller News Foundation’s request for comment.

All content created by the Daily Caller News Foundation, an independent and nonpartisan news service, is available free of charge to any legitimate news publisher that can deliver a large audience. All republished articles must include our logo, our reporter’s byline and their affiliation with DCNF. For any questions about our guidelines or working with us, please contact [email protected]
