With March almost upon us come spring storms, and one is brewing over the role and liability of social media companies when their platforms are linked to the death or injury of third parties. The Supreme Court recently heard two cases that could change the way social media companies censor, moderate, or even promote content.
Understanding the Legal Protections in Place with Section 230
To set the stage for the impact of the cases being heard, you have to understand the legal protections that Section 230 of the Communications Decency Act, specifically Section 230(c)(1), affords social media companies and internet service providers for much of the content on their platforms. The law has two critical aspects: the first is the provision that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider,” and the second is the liberal definition of what an interactive computer service actually is. Currently, it seems to encompass almost any online platform that publishes third-party content.

While Section 230 has been a political soccer ball over the last few years, it has not always been a partisan one. Some want it repealed altogether; others want it amended so that liability extends to interactive computer services for what they do, or do not, remove. It is a balancing act among free speech (hence the constitutional issue), censoring content, being socially responsible, and, to a large extent, self-governing. Task forces, congressional committees, working groups, and lobbyists have tried to come up with a solution that everyone can live with, with almost no success. Thus, the cases argued this week before the Supreme Court could mean changes in the law, depending on the rulings.
Social Media Responsibility
Nohemi Gonzalez, a U.S. citizen, was tragically killed in 2015 in a terrorist attack in Paris. The day after the attack, ISIS claimed responsibility in a statement and video posted on YouTube, which is owned by Google. Her family filed a lawsuit against several social media companies, claiming that Google aided and abetted ISIS operations because it knew its algorithms would generate more widespread viewing of the group’s content and recruiting propaganda by those sympathetic to its cause. The Ninth Circuit upheld the lower court’s dismissal of the case. The Gonzalez appeal marks the first time the Supreme Court has examined Section 230, which is now almost 30 years old. A companion case, Twitter v. Taamneh, is also before the Court; however, the issue there is not so much the application of Section 230 to social media as it is the scope of the Anti-Terrorism Act (ATA) and the Justice Against Sponsors of Terrorism Act (JASTA), which allow victims of terrorism to pursue liability claims against any entity that assisted with an act of terrorism. In Taamneh, where an individual was also killed in an ISIS terrorist attack, the Ninth Circuit actually overturned the lower court’s ruling on the issue of whether the plaintiff had an actionable claim.
Algorithms, Social Media, and Legal Obligations
Multiple special interest groups and other parties filed amici curiae briefs in the Gonzalez case, including the Integrity Institute and AlgoTransparency (which I didn’t even know was a thing). While algorithms seem to be the tool of the devil in many lawsuits against social media, in everything from suicides to inciting group violence to exacerbating addictive behaviors, the underlying question remains whether moderation undertaken from a moral and ethical standpoint, stopping just short of a legal obligation, is enough to carve out a different definition of publisher or speaker in the future. Look for a written decision in this case later this spring or early summer.