What are the two lawsuits in the U.S. Supreme Court which could change the internet as we know it?

Source: The post is based on the article “What are the two lawsuits in the U.S. Supreme Court which could change the internet as we know it?” published in The Hindu on 7th March 2023.

What is the News?

The Supreme Court of the United States (SCOTUS) has begun hearing two pivotal lawsuits that will for the first time ask it to interpret Section 230 of the U.S. Communications Decency Act of 1996.

What is Section 230?

According to Section 230 of the U.S. Communications Decency Act, if a person posts on Facebook that a certain individual is a fraud, the individual cannot sue the platform but only the person who posted it.

It is essentially a “safe harbour” or “liability shield” for social media platforms or any website on the internet that hosts user-generated content, such as Reddit, Wikipedia, or Yelp.

This section has been described, both by a recent book and by Lisa Blatt, Google’s lawyer in the Gonzalez case, as the “26 words that created the internet”.

What are the two lawsuits filed in the US Supreme Court?

Both lawsuits have been brought by the families of people killed in Islamic State (ISIS) terror attacks.

Gonzalez versus Google Lawsuit: It has been filed by the family of Nohemi Gonzalez, an American student killed in the ISIS terror attacks in Paris in 2015.

– The family is suing YouTube’s parent company, Google, for affirmatively recommending ISIS videos to users through its recommendation algorithm.

Twitter, Inc versus Taamneh Lawsuit: It pertains to a lawsuit filed by the family of a Jordanian citizen killed in an ISIS attack in Turkey. The lawsuit relies on the Anti-Terrorism Act, which allows U.S. nationals to sue anyone who “aids and abets” international terrorism “by knowingly providing substantial assistance”.

– The family argues that despite knowing that their platforms played an important role in ISIS’s terrorism efforts, the companies failed to take adequate action against such content.

What have Tech companies said about Section 230?

Twitter argued that Section 230 has allowed platforms to moderate huge volumes of content and present the “most relevant” information to users.

Digital rights activists pointed out that holding platforms liable for what their recommendation algorithms present could lead to the suppression of legitimate third-party content of political or social importance, such as that created by minority rights groups.
