Two Supreme Court Cases May Bring Major Changes in Social Media Content Moderation

Online rules and regulations have become an integral part of the Internet in recent years, which is no surprise given how large a role moderation plays on social networking sites. How it is implemented, or whether it is implemented at all, shapes the information users take in, so these two cases may signify a big change.

Gonzalez v. Google

This Supreme Court case calls Section 230 into question, the statute that shields social media giants from liability for what their users post online. The case asks whether the companies should answer for the ramifications of the content circulating on their platforms.

The case was prompted by the death of Nohemi Gonzalez, a 23-year-old California college student who was killed in the coordinated series of terrorist attacks in Paris that left 130 people dead, as mentioned in The New York Times.

Her father, and eventually the rest of her family, accuses Google and other companies of spreading content that shaped its users into terrorists, which they argue makes the companies legally responsible for the outcome and aftermath of the attacks.

Although the family blames several tech companies for the tragedy, the case before the Supreme Court names Google as the defendant. The plaintiffs claim that the company violated anti-terrorism laws by surfacing ISIS videos in YouTube's recommendations.

Even though the recommendations are merely the output of YouTube's algorithms, Daphne Keller, Director of the Program on Platform Regulation at Stanford's Cyber Policy Center, pointed out that YouTube could still be held liable for the algorithms that promoted the content in the first place.

Should the court side with the plaintiffs at the hearing on February 21 and rule that the companies must take responsibility, platforms will likely change their moderation practices and take down more content to avoid further lawsuits. This could end up hurting users' freedom of speech.


Twitter v. Taamneh

Google is not the only company that will be placed under the Supreme Court's spotlight; the following day, the Court will take up a case against Twitter. Similar to the Gonzalez case, it also centers on whether the company violated US anti-terrorism law.

Jordanian citizen Nawras Alassaf was killed in a terrorist attack in Istanbul that was allegedly aided by promoted ISIS posts. The plaintiffs accuse the platforms of failing to control the content that spread on their sites, according to the Bipartisan Policy Center.

The plaintiffs argue that doing nothing to moderate posts provided an infrastructure for the terrorists and thereby supported their operations. They also said the platforms benefited by profiting from the promoted posts.

They added that if the Court finds the companies liable, greater content moderation policies and restrictions would have to be implemented. If not, the ruling could "incentivize platforms to apply no content moderation to avoid awareness."

As mentioned in Engadget, Congress might determine that Section 230 has caused too much harm and amend the statute to increase liability for content promoted through algorithms.


