Confused NY Court Says That Section 230 Doesn’t Block Ridiculous Lawsuit Blaming Social Media For Buffalo Shooter

from the the-blame-game dept

Can you imagine what kind of world we’d live in if you could blame random media companies for tangential relationships they had with anyone who ever did anything bad? What would happen if we could blame newspapers for inspiring crime? Or television shows for inspiring terrorism? The world would be a much duller place.

We’ve talked a lot about how the entire purpose of Section 230 of the Communications Decency Act is to place liability on the right party. That is, it’s entirely about making sure the correct party gets sued, and about avoiding the waste of everyone’s time that comes from suing some random party, especially in “Steve Dallas”-type lawsuits, where you sue some random company that is only tangentially connected to a legal violation because it has the deepest pockets.


Unfortunately, a judge on the NY Supreme Court (which, bizarrely, is New York’s lowest-level trial court) has allowed just such a lawsuit to move forward. It was filed by the son of a victim of the racist dipshit who went into a Buffalo supermarket and shot and killed a bunch of people a couple of years ago. That is, obviously, a very tragic situation. And I can certainly understand the search for someone to blame. But blaming “social media” because someone shot up a supermarket is ridiculous.

It’s exactly the kind of thing that Section 230 was designed to get tossed out of court quickly.

Of course, NY officials spent months passing the blame and pointing at social media companies. NY’s Governor and Attorney General wanted to deflect blame from the state’s massive failings in handling the situation. I mean, the shooter had made previous threats, and law enforcement in NY had been alerted to those threats and failed to stop him. Also, he used a high-capacity magazine that was illegal in NY and law enforcement failed to stop him. Also, when people in the store called 911, the dispatcher didn’t believe them and hung up on them.

The government had lots of failings that aren’t being investigated, and lawsuits aren’t being filed over those. But, because the shooter also happened to be a racist piece of shit on social media, people want to believe we should be able to magically sue social media.

And the court is allowing this based on a very incorrect understanding of Section 230. Specifically, the court has bought into the trendy but ridiculous “product liability” theory that is now letting frivolous and vexatious plaintiffs across the country use this “one weird trick” to get around Section 230. Just claim “the product was defective” and, boom, the court lets the case go forward.

That’s what happened here:

The social media/internet defendants may still prove their platforms were mere message boards and/or do not contain sophisticated algorithms thereby providing them with the protections of the CDA and/or First Amendment. In addition, they may yet establish their platforms are not products or that the negligent design features plaintiff has alleged are not part of their platforms. However, at this stage of the litigation the Court must base its ruling on the allegations of the complaint and not “facts” asserted by the defendants in their briefs or during oral argument and those allegations allege viable causes of action under a products liability theory.

Now, some might say “no big deal, the court says they can raise this issue again later,” but nearly all of the benefit of Section 230 is in how it gets these frivolous cases tossed early. Otherwise, the expense of these cases adds up and creates a real mess (along with the pressure to settle).

Also, the judge here seems very confused. Section 230 does not protect “mere message boards.” It protects any interactive computer service from being held liable for third-party speech. And whether or not they “contain sophisticated algorithms” should have zero bearing on whether or not Section 230 applies.

The test for whether Section 230 applies is quite simple: (1) Is the defendant an interactive computer service? (2) Would holding it liable in this scenario mean holding it liable for the speech of someone else? (3) Does the claim fall outside the limited exceptions to Section 230 (i.e., intellectual property law, sex trafficking, and federal criminal law)? That’s it. Whether you’re a message board, or whether you use algorithms, has nothing to do with it.

Again, this kind of ruling only encourages more such vexatious litigating.

Outside of Section 230, the social media defendants sought to go to the heart of the matter by making it clear that there’s no causal link between “dipshit being a dipshit on social media” and “dipshit going on a murderous rampage.”

And, again, the judge doesn’t seem to much care, saying that a jury can figure that out:

As a general proposition the issue of proximate cause between the defendants’ alleged negligence and a plaintiff’s injuries is a question of fact for a jury to determine. Oishei v. Gebura 2023 NY Slip Op 05868, 221 AD3d 1529 (4th Dept 2023). Part of the argument is that the criminal acts of the third party, break any causal connection, and therefore causation can be decided as a matter of law. There are limited situations in which the New York Court of Appeals has found intervening third party acts to break the causal link between parties. These instances are where “only one conclusion may be drawn from the established facts and where the question of legal cause may be decided as a matter of law.” Derdiarian v Felix Contr. Corp., 51 NY2d 308 at 315 (1980). These exceptions involve independent intervening acts that do not flow from the original alleged negligence.

Again, though, getting this kind of case to a jury would be crazy. It would be a massive waste of everyone’s resources. By any objective standard, anyone looking at this case would recognize that it is not just frivolous and vexatious, but that it creates really terrible incentives all around.

If these kinds of cases are allowed to continue, you will get more such frivolous lawsuits over anything bad that happens. Worse, you will get much less overall speech online, as websites will have every incentive to take down or block any speech that isn’t Sesame Street-level wholesome. Any expression of anger, complaint, or unhappiness could otherwise be treated as proof that the social media platform was “defective in its design” for not magically connecting that expression to future violence.

That would basically be the end of any sort of forum for mental health. It would be the end of review sites. It would be the end of all sorts of useful websites, because the liability that could accrue from just one user on those forums saying something negative would be too much. If just one person in those forums then took violent action in the real world, people could blame the site and sue it for not magically stopping the real-world violence.

This would be a disaster for the world of online speech.

As Eric Goldman notes, it seems unlikely that this ruling will survive on appeal, but it’s still greatly problematic:

I am skeptical this opinion will survive an appeal. The court disregards multiple legal principles to reach an obviously results-driven decision from a judge based in the emotionally distraught community.

The court doesn’t cite other cases involving similar facts, including Gibson v. Craigslist and Godwin v. Facebook. One of the ways judges can reach the results they want is by selectively ignoring the precedent, but that approach doesn’t comply with the rule of law.

This opinion reinforces how the “negligent design” workaround to Section 230 will functionally eliminate Section 230 if courts allow plaintiffs to sue over third-party content by just relabeling their claims.

Separately, I will note my profound disappointment in seeing a variety of folks cheering on this obviously problematic ruling. Chief among them: Free Press. I don’t always agree with Free Press on policy prescriptions, but generally, their heart is in the right place on core internet freedom issues. Yet, they put out a press release cheering on this ruling.

In the press release, they claim (ridiculously) that letting this case move forward is a form of “accountability” for those killed in the horrific shooting in Buffalo. Anyone should recognize that it is not. There are people to hold liable for what happened there, most obviously the shooter himself. But trying to hold random social media sites liable for not somehow preventing future real-world violence is beyond silly. As described above, it’s also extremely harmful to the free speech causes that Free Press claims as part of its mission.

I’m surprised and disappointed to see them take such a silly stance, one that undermines their credibility. But I’m even more disappointed in the court for ruling this way.

Companies: google, reddit, youtube


Comments on “Confused NY Court Says That Section 230 Doesn’t Block Ridiculous Lawsuit Blaming Social Media For Buffalo Shooter”

41 Comments
Anonymous Coward says:

Confused NY Court Says That Section 230 Doesn’t Block Ridiculous Lawsuit Blaming Social Media For Buffalo Shooter

This is… good? We could launch a class action suit against the KOSA, EARN IT, STOP CSAM, and RESTRICT bills blaming them for every mass shooting event that has occurred ever since they were first announced.

Anonymous Coward says:

Wasn’t the Buffalo shooter an active /pol/ user who was inspired by the Christchurch shooter (specifically his manifesto that he distributed on various chan boards)? Why on earth is Facebook being sued? He didn’t even stream on Facebook like the Christchurch shooter did, he did it on Twitch. Am I missing something?


Anonymous Coward says:

Re: Re: Re:

You’ll forever be unloved and unhappy, trapped in circles of unproductive and insightful losers who have literally nothing going for them but their skin color.

And that’s why you’re angry. It doesn’t get you as far as it used to. You’re finding you have to actually apply yourself and provide something of value, and you can’t do it.

You’re every bit as redundant as you imagine those of other races to be.


That One Guy (profile) says:

If you only stick to your principles when you like the result you don't have any

Separately, I will note my profound disappointment in seeing a variety of folks cheering on this obviously problematic ruling. Chief among them: Free Press. I don’t always agree with Free Press on policy prescriptions, but generally, their heart is in the right place on core internet freedom issues. Yet, they put out a press release cheering on this ruling.

The true test of a person/group’s principles is whether they’ll stick to them even when they might not like the result and/or who it’ll require them to side with.

Anonymous Coward says:

Just Go Away

The problem here is that many people just want social media to go away. They don’t want facebook, twitter, reddit, or pretty much every forum on the internet to exist anymore. It’s like how the media companies acted back in the 90s when people first started sharing mp3 and other content. They wanted the internet to just go away. Because that’s the only solution here. Even with the most draconian moderation, you’re still going to have people who try to get around it. And they will. It’s behavior we see all the time. Tell people they cannot discuss something, and they’ll invent codewords to get around it, or they will talk about peripheral things.

And the sad thing is, even if social media was outlawed, people still wouldn’t be happy. Because they’d wonder why they can no longer share pictures with their family, or discuss how they hate certain people.

Anonymous Coward says:

Re:

And the sad thing is, even if social media was outlawed, people still wouldn’t be happy. Because they’d wonder why they can no longer share pictures with their family, or discuss how they hate certain people.

Precisely this.

The Dumbpublicans gleefully rubbing their hands at the idea of GenZs getting kicked off social media because those platforms get banned don’t have the brainpower to realize that if this happens, Alex Jones gets banned too.

Then again that is precisely the demographic fucked in the head enough to pay Jones for supplements.


MrWilson (profile) says:

Re:

Humans can’t agree on semantic meaning, so it would be impossible to train an AI to understand the nuances in a manner that was fair without significant improvements to the technology. You’d just get AI that was biased towards the data it was trained on, the way chatbots start to get racist when you feed them racist content.

There’s also a certain amount of subjective judgment that goes into justice and we can’t even agree if every violation of law should be considered a crime.

Do you think you should receive a ticket for every time you’ve ever committed a traffic violation? You’d be in jail or debt for the rest of your life.


Anonymous Coward says:

Re: Re: Re:

I remember those threats, Jhon. When you used to go by “horse with no name”. They were as toothless and spineless and pathetic as you in your nursing home ward back then and it’s the exact same thing now.

The best thing you have going for you is shaking your fist and making vague threatening promises just to feel like a top again.

andrea iravani says:

So what. Who cares?! That is now going to be my new attitude, since that is the attitude that I have encountered from everyone to whom I have reported the ten-year-long premeditated, cold-blooded, sadistic organized crime spree that has been perpetrated against me.

The FBI, police, sheriffs, and government tell people reporting organized crime and terrorism to talk to crisis workers, because they are basically telling the victims reporting the crime, or witnesses reporting the crime, “So what, we don’t care,” and it is just a lazy cop-out to evade all responsibility for actually doing their jobs.

That is what happens when the illegal Orwellian terrorist network took over every sector of government, legal sector, academia, healthcare, financial sector, and every other sector of the economy.

So what who cares.
