Mass shootings and hate speech: what can the government do?


Federal law clearly states that social media companies generally cannot be held liable for speech posted on their platforms. Despite this, New York Attorney General Letitia James (D), at the request of New York Governor Kathy Hochul (D), launched an investigation into the role of social media companies in the tragic Buffalo shooting, in which 13 people were shot, 10 of them fatally.

The suspected shooter used social media platforms, including Twitch, 4chan, and Discord, to plan and livestream the mass shooting, according to federal law enforcement sources. The investigation aims to examine the role of these companies and the other online resources and platforms the alleged shooter used to express white supremacist views and his desire for violence.

It is unclear what James or Hochul expect the investigation to produce. The clear implication is that social media platforms share responsibility for the tragic shooting, and there are calls for them to do more to prevent social media from being used to spread hate speech.

Government cannot force platforms to curb despicable but lawful speech

There is nothing wrong with asking social media companies to do a better job of excluding ugly but legal speech, such as expressions of hate and white supremacy. The internet is full of vile and hateful material. It would be a better world if no one had these opinions or voiced them.

But the government cannot force platforms to moderate lawful speech. Social media companies are private entities and, unlike the government, they are not bound by the First Amendment. They decide what content to include on their sites and what to exclude.

Although there are some limited categories, such as incitement to unlawful activity or true threats, in which speech is not constitutionally protected and the government may prohibit it, the U.S. Supreme Court has consistently held that hate speech is protected by the First Amendment.

Statements of intent by the alleged shooter do not constitute incitement under current law. The Supreme Court has held that incitement requires that the speaker intend to induce imminent unlawful action in others and be likely to do so (Brandenburg v. Ohio (1969)). Some of his published statements may violate New York's law prohibiting terrorist threats, N.Y. Penal Law §490.20. But even if they do, it is not clear that they meet the constitutional test for a true threat or fall under any other category of unprotected speech.

The government cannot punish hateful speech or the platforms used to spread it. Holding social media companies liable because the Buffalo shooter used them to express his message would violate the First Amendment.

Section 230

Additionally, a federal statute, 47 U.S.C. § 230, explicitly states that social media companies cannot, with rare exceptions, be held liable for what is posted on their platforms. The statute expressly preempts state laws that would impose liability for such posts.

In fact, in part because this federal law immunizes platforms for their content moderation decisions, social media companies already perform an enormous amount of content moderation. For example, from October to December 2021, Facebook, now known as Meta Platforms, reports that it took action against terrorist content 7.7 million times, bullying and harassment 8.2 million times, and child sexual abuse material 19.8 million times.

And when the alleged shooter attempted to livestream an act of mass murder, Twitch was able to remove the video and suspend his account in less than two minutes. Unfortunately, as with the shooting in Christchurch, New Zealand, copies of the recording remain online.

Moderating this kind of content is necessary for the internet to be usable by most people, and the platforms know it. The enormous amount of content moderation performed by platforms such as Facebook demonstrates that social pressure and market forces can encourage conscientious content moderation without unconstitutional government coercion.

But it is also worth remembering that content moderation happens at such a vast scale that it is impossible for platforms to get it right 100% of the time. It is entirely reasonable to ask social media platforms to do better.

The government cannot force platforms to censor

Attorney General James, of course, is free to investigate social media's role in the Buffalo shooting. But if the investigation finds that there has been too much or too little content moderation, New York cannot force the platforms to change their moderation practices, because those practices are protected by the First Amendment and immunized by Section 230.

The best we can hope for is that social media companies improve their content moderation practices by more accurately and quickly excluding or limiting access to objectionable material that users don’t want to see and platforms don’t want to host.

The internet and social media are extremely powerful tools for freedom of expression. It is no exaggeration to say that they represent the most significant development for expression since the invention of the printing press.

They have enormously improved people’s ability to reach mass audiences and access information. But such tools, and speech itself, can be used for better or for worse.

James can investigate, but the law is clear: social media companies cannot be punished for being the sites where racist speech was expressed by a deeply disturbed and violent individual.

This article does not necessarily reflect the views of the Bureau of National Affairs, Inc., publisher of Bloomberg Law and Bloomberg Tax, or its owners.


Author Information

Erwin Chemerinsky is Dean of UC Berkeley Law School and Jesse H. Choper Professor Emeritus of Law. Previously, he was Founding Dean and Distinguished Professor of Law, and Raymond Pryke Professor of First Amendment Law at the University of California, Irvine School of Law.

Alex Chemerinsky is a federal judicial law clerk in the District of Arizona.
