The YouTube logo can be seen on a laptop screen. Photo credit – Fabian Sommer—Picture Alliance/Getty Images
The future of the federal law that protects online platforms from liability for content uploaded to their websites is up in the air, as the Supreme Court this week will hear two cases that could change the internet.
The first case, Gonzalez v. Google, due to be heard Tuesday, argues that YouTube’s algorithm helped ISIS post videos and recruit members, making online platforms directly and secondarily liable for the 2015 Paris attacks, which killed 130 people, including 23-year-old American college student Nohemi Gonzalez. Gonzalez’s parents and the families of other deceased victims are seeking damages under the Anti-Terrorism Act.
Oral arguments in Twitter v. Taamneh, a case bringing similar arguments against Google, Twitter, and Facebook, focus on another ISIS terror attack, one that killed 39 people at an Istanbul, Turkey, nightclub, and will be heard on Wednesday.
The cases will decide whether online platforms can be held liable for targeted advertising or algorithmically recommended content distributed on their platforms.
Tech companies argue that Section 230 protects them from these types of lawsuits because it gives them legal immunity from liability for third-party content posted on their platforms.
Here’s what you should know about Section 230.
What is Section 230?
Section 230, passed in 1996, is part of the Communications Decency Act.
The law specifically states: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider,” meaning that online platforms are not legally responsible for the content their users publish.
The law also allows tech companies to moderate or remove content they deem objectionable. However, Section 230 does not protect websites that violate federal criminal statutes or intellectual property laws, nor does it protect platforms that themselves create illegal or harmful content.
Because popular websites like Facebook, Twitter, and YouTube rely on user-generated content, many people have credited Section 230 with creating the Internet we know and love today.
As online platforms have grown drastically over time, with up to 368 million monthly active users on Twitter alone, experts argue that Section 230 helps businesses that cannot keep up with the volume of content published on their platforms by protecting them from lawsuits over what their users say or do.
What are these cases about?
The Gonzalez family first filed a lawsuit in 2016, alleging that the platform recommended ISIS content to users and allowed them to find other ISIS videos and accounts because Google, which owns YouTube, compares users’ viewing histories and suggests similar content to them.
The plaintiffs also argued that Google placed paid ads in ISIS videos, meaning they shared ad revenue with the terrorist organization. The lawsuit argues that this means Google hasn’t taken enough action to ensure ISIS stays off the platform. For this reason, plaintiffs allege that these tech companies are directly liable for “perpetrating acts of international terrorism” and secondarily for “conspiring with and aiding and abetting in the acts of international terrorism of ISIS.”
A federal district court in California dismissed the lawsuit, ruling that Google could not be held responsible for content produced by ISIS. The U.S. Court of Appeals for the Ninth Circuit sided with the district court, but in October the Supreme Court agreed to hear the case.
In a brief filed with the Supreme Court opposing review, Google argued that the case did not warrant a hearing because sites like YouTube cannot be held liable as “publishers or speakers” of user-generated content. Google added that it is unable to screen “all third-party content for illegal or tortious material” and that the company is concerned that “the threat of liability could result in wholesale restrictions on online activity.”
Big tech companies like Twitter and Meta, which have expressed support for Google in the case, say algorithmic recommendations allow them to “organize, rank, and display” user content in a way that improves the user experience on their platforms, and they have called the ability to do so “essential.”
What is the future of Section 230?
If the court rules in Gonzalez’s favor, the decision will set a precedent for holding tech companies liable for targeted ads or recommendations.
The full impact this could have on the internet is not known, although many warn that tech companies would face a multitude of lawsuits. Corporate giants like Yelp, Reddit, Microsoft, Craigslist, Twitter, and Facebook say job and restaurant searches could be restricted if platforms could be sued over users’ posts, according to the Associated Press. Review sites could even be held liable for defamation if a particular restaurant gets bad reviews.
Even dating apps like Tinder and Match cited Section 230 as essential to the user experience, saying they hope to continue providing match recommendations “without fear of overwhelming litigation,” according to CBS.
How do lawmakers feel about Section 230?
Conservatives have long criticized Section 230, claiming it allows social media platforms to censor right-wing content.
This scrutiny has fallen on platforms like Twitter, which came under fire after it removed a New York Post story about Hunter Biden’s laptop. Twitter executives later called the action a mistake at a House committee hearing, but many conservatives have cited it as evidence of bias. Lawmakers also criticized platforms for banning conspiracy theorist Alex Jones’ Infowars pages from their websites in 2018.
Former President Donald Trump called for the law to be repealed, even urging the Justice Department to release proposed changes to Section 230 in 2020.
“I’ll just get to the point, big tech is out to get conservatives,” Rep. Jim Jordan said in a House Judiciary Committee hearing in July 2020. “That’s not conjecture, that’s not suspicion, that’s fact.”
Democrats have similarly opposed Section 230, saying it prevents platforms from being held liable for hate speech and misinformation disseminated on their websites.
In July 2021, Senators Amy Klobuchar and Ben Ray Lujan introduced legislation that would remove tech companies’ immunity from lawsuits if their algorithms promote health misinformation.
The White House later asked Congress to repeal Section 230 during a September “listening session” on tech company accountability. And in January, President Joe Biden published an op-ed in the Wall Street Journal calling for bipartisan legislation that would hold tech companies accountable.
“The American tech industry is the most innovative in the world… But like many Americans, I worry about how some in the industry collect, share, and use our most personal data, deepening extremism and polarization in our country, affecting our economy, violating the civil rights of women and minorities, and even endangering our children,” Biden wrote.