The online spread of Saturday’s horrific series of murders only underscores the huge stakes underlying the Supreme Court’s impending decision. And it puts into perspective the political battle being played out at state, national and global levels over how – or whether – social media companies should moderate their platforms.
After HB 20 went into effect last week, it raised a host of questions about how social media will work in Texas in the future. Could tech platforms offer Texas-specific versions of their sites? Will some platforms completely stop providing services in Texas? What could social media content look like in Texas, without content moderation? The answers are still unclear.
What seemed hypothetical on Wednesday suddenly became painfully real on Saturday as social media companies rushed to respond to the shooting, which was originally broadcast live on the video platform Twitch. Although Twitch said it took down the live stream within two minutes, that didn’t stop the video from being copied and shared on other platforms. Social media companies including Facebook parent Meta, Twitter, YouTube and Reddit have banned the video from their sites and are working to remove copies. But under the Texas law, taking these steps could expose tech companies to costly litigation.
The shooting offers a horrifying example of the dilemma facing tech platforms in Texas, and potentially across the country, if the Supreme Court upholds the state’s content moderation law. A decision siding with Texas would also likely embolden Florida lawmakers, who, spurred by their stated belief that tech companies politically discriminate against conservatives, passed similar legislation that is also stalled in court. And it would give a roadmap to other states wishing to enact their own moderation bans. The result is a confusing regulatory environment in which some state governments demand lax moderation even as policymakers elsewhere, notably in Europe, seem poised to impose tougher moderation standards.
New uncertainties for technology platforms under Texas law
If the Texas law is upheld, social media companies will face greater restrictions on how they moderate content. As HB 20 is written, platforms could be held liable in the state for taking action to “block, ban, remove, deplatform, demonetize, de-boost, restrict, deny equal access or visibility to, or otherwise discriminate against expression.”
The law is so new that no lawsuits have yet been filed for acts of alleged censorship.
According to Evelyn Douek, a platform moderation expert at Columbia University’s Knight First Amendment Institute, platforms might try to justify removing something like the Buffalo video under HB 20 on the grounds that they are not censoring a viewpoint, merely taking down the content itself.
But Jeff Kosseff, a law professor and platform moderation expert at the US Naval Academy, said the law is still ambiguous enough to create huge uncertainty for social media companies. Platforms, he said, would likely face immense legal pressure not to remove graphically violent content, including material like the Buffalo video, because plaintiffs could always claim that removing the videos is itself a suppression of viewpoints under HB 20. Even the threat of such suits could be a drag on moderation.
“Even if you just delete the video and say, ‘This video violates our policy,’ you’re still going to open the door to claims that it was deleted because it was posted by someone who has a particular perspective on things,” Kosseff told CNN.
Whether tech platforms are sued for removing the video or for suppressing a user’s perspective surrounding the video, the result would be the same – a law that effectively floods digital spaces with violent content, according to Steve Vladeck, a law professor at the University of Texas and a CNN legal analyst.
“There’s no doubt that the Buffalo shooting video conveys both the stakes of the HB 20 dispute and what’s wrong with HB 20 itself,” Vladeck told CNN. “If a Texas-based account shared or reposted the Twitch stream, deleting it would clearly, in my view, violate HB 20. When you deprive social media platforms of the ability to moderate content, you almost guarantee that they will be inundated with violent, inappropriate and otherwise objectionable messages.”
Beyond the graphic video itself, the Buffalo shooting also involves the online spread of hate speech, such as the racist conspiracy theories found in the shooting suspect’s 180-page document. Legal experts agreed that this type of content would clearly have to remain up under HB 20, because it expresses a clear viewpoint.
“My biggest concern is really restricting the platforms’ ability to suppress these theories that fuel violence,” Kosseff said.
A boost to rethink content moderation
Beyond what HB 20 demands of tech platforms, the Buffalo video also raises questions about some voluntary proposals to relax content moderation standards, like what billionaire Elon Musk has in mind for Twitter.
Musk is currently seeking to buy Twitter in a $44 billion deal and has said he intends to bring more free speech to the platform by loosening Twitter’s enforcement of its content rules. How a Musk-owned Twitter would handle the Buffalo video is not immediately clear. Twitter declined to comment, and Musk did not immediately respond to a request for comment.
“If there are tweets that are false and bad, those should either be deleted or made invisible, and a suspension, a temporary suspension, is appropriate,” Musk said. He added: “If they say something that’s illegal or just destructive to the world… maybe a timeout or a temporary suspension or that particular tweet should be made invisible or have very limited traction.”
Musk didn’t say whether Twitter should consider something like the Buffalo video to be “false and bad” or “destructive to the world.” If he concluded that it is, then his stance on the video could also risk running afoul of HB 20. The result would be a clash between two parties – Musk and the Texas government – that ostensibly share the same goal: permitting more of the content that social media platforms, at least today, widely agree is harmful.
The immediate reaction from major social platforms to remove the Buffalo video reflects an established consensus on how to handle live-streamed videos of violence, informed by years of painful experience. But rather than affirm that consensus, recent developments could now fracture and blur it, with significant ramifications for all social media users.