The Supreme Court will hear cases on Section 230: here’s what you need to know

The future of the federal law that shields online platforms from liability for content uploaded to their sites is up in the air as the Supreme Court is set to hear two potentially internet-changing cases this week.

The first case, Gonzalez v. Google, to be heard on Tuesday, argues that YouTube’s algorithm helped ISIS post videos and recruit members, making online platforms directly and secondarily liable for the 2015 Paris attacks that killed 130 people, among them Nohemi Gonzalez, a 23-year-old American student. Gonzalez’s parents and other families of victims are seeking damages under anti-terrorism law.

Oral arguments in Twitter v. Taamneh – a case that presents similar arguments against Google, Twitter, and Facebook, centered on another ISIS terror attack that killed 39 people in Istanbul, Turkey – will be heard on Wednesday.

The cases will decide whether online platforms can be held liable for targeted advertisements or algorithmically recommended content served on their platforms.

Tech companies say Section 230 protects them from these types of lawsuits because it grants them legal immunity from liability for third-party content posted on their platforms. The cases will decide whether platforms can be held liable for delivering harmful content to users through their algorithms.

Here’s what you need to know about Section 230.

What is Section 230?

Section 230, passed in 1996, is part of the Communications Decency Act.

The law explicitly states that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider,” meaning that online platforms are not responsible for content their users post.

The law allows tech companies to moderate or remove content deemed egregious. Section 230, however, does not protect sites that violate federal criminal law or intellectual property law. It also does not protect platforms that create illegal or harmful content.

Because popular sites like Facebook, Twitter, and YouTube rely on user-generated content, many people have credited Section 230 with creating the Internet we now know and love.

As the scale of online platforms has grown dramatically over time, with as many as 368 million monthly active users on Twitter alone, experts say Section 230 helps protect companies that struggle to keep track of the content posted on their platforms from lawsuits over what users say or do.

What are these cases about?

The Gonzalez family first filed a lawsuit in 2016, alleging that because Google, which owns YouTube, matches and suggests content to users based on their viewing habits, the platform recommended ISIS content to users and allowed them to find other ISIS videos and accounts.

The plaintiffs also argued that Google placed paid ads on ISIS videos, meaning it shared ad revenue with the terrorist organization. The lawsuit argues that Google therefore did not take sufficient steps to keep ISIS off the platform. For this reason, the plaintiffs allege that these technology companies are directly liable for having “committed acts of international terrorism” and secondarily liable for having “conspired with, and aided and abetted, the Islamic State’s acts of international terrorism.”

A federal district court in California dismissed the suit, saying Google could not be held responsible for content produced by the Islamic State. The United States Court of Appeals for the 9th Circuit sided with the district court, but in October the Supreme Court agreed to hear the case.

In an opposition brief filed with the Supreme Court, Google argued that review of the case was not warranted because websites like YouTube cannot be held liable as the “publisher or speaker” of content created by users. Google added that it does not have the ability to filter “all third-party content for illegal or tortious material” and that “the threat of liability could lead to sweeping restrictions on online activities.”

Big tech companies like Twitter and Meta, which have voiced their support for Google in this case, say algorithm-based recommendations allow them to “organize, categorize and display” user content in a way that improves users’ experience on their platforms, and they have called the ability to do so “essential.”

What is the future of Section 230?

If the court rules in favor of Gonzalez, the decision would set a precedent for holding tech companies liable for targeted ads or recommendations.

The effects this could have on the internet are not fully known, although many warn that tech companies would face a slew of lawsuits, according to the Associated Press. Review sites could even be held liable for defamation if a particular restaurant receives low ratings.

Even dating apps like Tinder and Match have called Section 230 essential to the user experience, as they hope to continue providing matchmaking recommendations “without having to worry about crippling litigation,” according to SCS.

What do legislators think of Section 230?

Conservatives have long criticized Section 230, alleging it allows social media platforms to censor right-wing content.

This scrutiny has focused on platforms like Twitter, which came under fire after it blocked a New York Post article about Hunter Biden’s laptop from being shared. Twitter executives later called the action a mistake during a House committee hearing, but many conservatives claimed it was evidence of bias. Lawmakers have also criticized social platforms for banning conspiracy theorist Alex Jones’ Infowars pages from their sites in 2018.

Former President Donald Trump has made calls to repeal the law, even prompting the Justice Department to release proposed amendments to Section 230 in 2020.

“I’m going to cut to the chase: Big Tech is out to get conservatives,” Rep. Jim Jordan said during a House Judiciary Committee hearing in July 2020. “It’s not a hunch, it’s not a suspicion, it’s a fact.”

Democrats also argued against Section 230, saying it prevents platforms from being held liable for hate speech and misinformation posted on their sites.

In July 2021, Senators Amy Klobuchar and Ben Ray Lujan introduced a bill that would remove tech companies’ immunity from lawsuits if their algorithms promote health misinformation.

The White House then called on Congress to revoke Section 230 during a September “listening session” on tech platform accountability. And in January, President Joe Biden published an op-ed in the Wall Street Journal, calling for bipartisan legislation that would hold tech companies accountable.

“The American tech industry is the most innovative in the world… But like many Americans, I’m concerned about how some in the industry collect, share and exploit our most personal data, deepen extremism and polarization in our country, tilt our economy’s playing field, violate the civil rights of women and minorities, and even put our children at risk,” Biden wrote.
