YouTube is currently embroiled in a landmark Supreme Court case that could change how platforms handle user-generated content. The case centers on whether YouTube can be held liable for hosting videos produced by a terrorist organization.
The family of an American killed in an ISIS attack is suing YouTube for allowing ISIS to use the platform to spread extremist propaganda. Google, which owns YouTube, argues that Section 230 of the Communications Decency Act shields it from liability for user-generated content.
During a hearing on Tuesday, Google’s counsel warned that if YouTube loses, the internet could turn into a “horror show,” contending that holding platforms liable for user-generated content could chill free speech and innovation online. To avoid being sued, the argument went, platforms would be forced to remove any content that could be considered controversial or offensive, leaving a “dumbed-down” internet devoid of diverse viewpoints and creative expression.
In contrast, the family’s lawyer argued that holding YouTube accountable would not stifle free speech, noting that other laws, such as copyright law, already hold platforms liable for hosting infringing content. He also contended that YouTube should be held to a higher standard because it has a responsibility to protect users and promote safety on its platform.
The Supreme Court’s ruling in this case could have far-reaching consequences for the future of online content moderation. However the court decides, the outcome is likely to have a significant impact on how platforms approach user-generated content.
The sources for this piece include an article in Deadline.