
High Court Will Hear Social Media Terrorism Lawsuits (apnews.com) 78

The Supreme Court said Monday it will hear two cases seeking to hold social media companies financially responsible for terrorist attacks. From a report: Relatives of people killed in terrorist attacks in France and Turkey had sued Google, Twitter, and Facebook. They accused the companies of helping terrorists spread their message and radicalize new recruits. The court will hear the cases this term, which began Monday, with a decision expected before the court recesses for the summer, usually in late June. The court did not say when it would hear arguments, but the court has already filled its argument calendar for October and November.

One of the cases the justices will hear involves Nohemi Gonzalez, a 23-year-old U.S. citizen studying in Paris. The Cal State Long Beach student was one of 130 people killed in Islamic State group attacks in November 2015. The attackers struck cafes, outside the French national stadium and inside the Bataclan theater. Gonzalez died in an attack at La Belle Equipe bistro. Gonzalez's relatives sued Google, which owns YouTube, saying the platform had helped the Islamic State group by allowing it to post hundreds of videos that helped incite violence and recruit potential supporters. Gonzalez's relatives said that the company's computer algorithms recommended those videos to viewers most likely to be interested in them.



  • If it doesn't happen it will only be because it might^Wwill affect Truth Social sooner or later. The Conservatives would absolutely love to pass some laws to let them punish social media for anything and everything, and this would be a good start in their eyes.

    • Cry about censorship, then try to censor everything instead of arguing against it like you insist we do for you even when you are literal Nazis.

      You cowards are pathetic.

      You know what else occurred to me today? For all the crying that you wankers do about social media, you are literally making slashdot worse than literally any real social media site with your mod bombings and other horseshit. I can't remember the last time I saw a swastika on Faceboot or Reddit, but I see one here two or three times a week.

      • Cry about censorship, then try to censor everything instead of arguing against it like you insist we do for you even when you are literal Nazis.

        Conservatives aren't the people whining and pooping their pants until people are "cancelled" because they said something the right didn't agree with. But regardless of that, this is a question of civil liability, not political ideology. Ideally the verdict would stand regardless of whether the perpetrator were a petty religious zealot, a goose-stepping fascist, or a condom-snorting anarchist. Does the inaction of the platform rise to the level of negligence? Does the platform have any obligation to act on or repo

    • Truth Social is already exempt from the Texas laws. Their law only applies to companies with 25 million users or more.

      • by Tablizer ( 95088 )

        > Truth Social is already exempt...[law] only applies to companies with 25 million users or more.

        Don's on the Loser List, that'll make him steam, ha ha.

  • Will agree social media is responsible. Justice Thomas will Ctrl+F the Constitution for "Facebook" and find no mentions.

    • Will agree social media is responsible.

      LOL. No.

      Know who will want to blame the social media companies for not censoring them? The Wise Latina and Crew. Thomas, Alito, etc, will default to their normal position that saying mean things isn't violence, and your politics doesn't magically make it so.

      This thing is most likely doomed:

      But a judge dismissed the case and a federal appeals court upheld the ruling. Under U.S. law — specifically Section 230 of the Communications Decency Act — internet companies are generally exempt from liability for the material users post on their networks.

    • by splutty ( 43475 )

      Conservative court. Considering how much like "terrorism" a lot of statements made by conservatives (politicians and civilians alike) sound to much of the world, there might be some severe unintended consequences if the ruling is that "social media is responsible".

  • A commonly accepted definition of "reasonable steps" should be established, because there is no way a web company can manually inspect every message without going bankrupt or leaving high-regulation countries altogether. Besides, perpetrators often use code words; only the very dumbest of crooks spell out exactly what they are up to.

    Perhaps something like "X hours of human inspection for every Y words posted". Or perhaps companies should be required to inspect only those messages in which a keyword from a sanctioned list appears. Otherwise, it risks
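    A minimal sketch, in Python, of the keyword-triggered review policy imagined above. The keyword list, function name, and sample messages are all hypothetical, purely for illustration; a real system would need stemming, phrase matching, and code-word detection that simple keyword lookup cannot provide.

```python
# Hypothetical sanctioned keyword list; only messages containing one of
# these words would be routed to human reviewers.
FLAGGED_KEYWORDS = {"attack", "bomb", "recruit"}

def needs_human_review(message: str) -> bool:
    """Flag a message for manual inspection only if it contains a word
    from the sanctioned list; everything else passes unreviewed."""
    words = {w.strip(".,!?").lower() for w in message.split()}
    return not FLAGGED_KEYWORDS.isdisjoint(words)

# Build the human-review queue from an incoming batch of messages.
queue = [m for m in ["nice cafe in Paris", "plans to recruit members"]
         if needs_human_review(m)]
```

    Note how easily code words defeat this: the comment's own point is that a message avoiding every listed keyword sails straight through.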

  • by Sloppy ( 14984 ) on Monday October 03, 2022 @11:07AM (#62933517) Homepage Journal

    I don't understand why the courts are so myopic and think so small. Nearly every crime I've heard of in modern times involved the use of either a CPU (and not just Intel/AMD) or a paved road.

    You'd think that when someone paves a road, they wouldn't be so negligent as to allow it to be used for terrorism, but it's like nobody cares at all, until after we're counting the bodies of loved ones and fellow citizens. Careless bastards!

    And the other day I read a story about a person who wanted to get an abortion, and though it was light on the details, she accidentally admitted that she communicated with other people. You damn well know she made some phone calls on a modern smartphone, full of criminally-negligent RAM, a terrorist-sympathizing CPU, a crime-tolerant GPU which always just looks the other way no matter what is rendered, and even a USB power connector where the manufacturer paid no heed at all to the possibility that it would be vicariously abused to coordinate conspiracies to commit abortion.

    We should all know about this by now. It makes me so mad, whenever I remember that some asshole sold Mohamed Atta a candy bar, just hours before he crashed that airplane. What could that vendor have been thinking, and why do we keep letting them get away with it? It's been two decades, and yet still neither the candy merchant nor its maker has been called to account for their contribution to 9/11.

    • When someone builds a road, the pavement that the road is built from has no potential to stop terrorists. When someone builds a site with technology, that same technology absolutely has the potential to stop some of the problems that it creates.
      • by Calydor ( 739835 ) on Monday October 03, 2022 @11:47AM (#62933697)

        Sure it does. They just need to install spikes that can be deployed up through the surface every few meters, so with a press of a button law enforcement can stop a terrorist act in progress. It won't be abused at ALL to stop anyone who's not a terrorist but just annoyed a cop (or script kiddie) somehow.

        • The taxes from people going to work on those roads pay for the law enforcement in the first place! Now you want to spend trillions of dollars on something law enforcement is trained to do? Form a roadblock?
          • by Calydor ( 739835 )

            Ohhh, you mean same as how they're trained to track down criminals after the crime has happened? No, you're right. We should just leave it that way and not spend crazy amounts on weird precautions like spikes in the road and automatic blacklisting on video sites.

            • So you would rather spend a trillion dollars on spikes that could pop up under anyone at any time than hire more police officers for a few million.
              • by Calydor ( 739835 )

                That is the exact opposite of what I just said. I was agreeing with you.

                I just compare those spikes to systems put in place to try to catch and stop all potentially terroristic or extremist content everywhere: they will inevitably miss something real while generating more and more false positives.

    • Furthermore, most roads don't have private companies making billions of dollars from them.
    • Roads are government-owned, so that makes things more tricky.

    • Nearly every crime I've heard of in modern times, involved the use of either a CPU

      And if Intel controlled precisely which code you're allowed to run on your CPU, and you had to subscribe to Intel and agree to their specific terms of service for how you act when using their CPU, you might have a point.

      But they don't.

      And neither do you.

    • by AmiMoJo ( 196126 )

      These companies created algorithms that encourage users to engage and watch ads. Sometimes, far too often in fact, they recommend stuff that radicalizes users.

      They profit from that. Users on the path to becoming terrorists are watching ads. They are engaged.

    • I think the point is that social media sites tend to push extremist media due to the algorithms / help to create echo chambers.

      Many links available online if you look it up.

      Example :
      https://www.brookings.edu/blog... [brookings.edu]

      I don't think a random CPU, a road, or your jeans does that.

  • by bubblyceiling ( 7940768 ) on Monday October 03, 2022 @11:08AM (#62933523)
    Hope the courts hold the companies accountable
    • So the ADL can sue slashdot when they don't delete ascii swastikas? Cool!

      • Re: (Score:3, Insightful)

        by thegarbz ( 1787294 )

        So the ADL can sue slashdot when they don't delete ascii swastikas? Cool!

        I sure hope so. Slashdot has shown over the past few years that it does moderate content. APK and his Hosts file spam have been eliminated. Anonymous posting without an account has been disabled. And if you click the "Terms" button below you'll see that there are terms on the content you post: "No user shall transmit Content or otherwise conduct or participate in any activities on the Sites that, in the judgment of Slashdot Media, is likely to be prohibited by law in any applicable jurisdiction, including laws governin

        • Despite these efforts, the Slashdot community is still full of toxic members who respond to political or simply factual statements that displease them by hurling insults at the poster (along with baseless and false accusations about what the poster literally just posted). Of course, this IS legal, and I am unsure that the Big Brother-style moderation that would be necessary to thwart it is really worth the cost.

        • Perhaps people are posting those swastikas as a political statement about the lack of historical knowledge around the swastika, and their anger that uneducated people like yourself fly into a rage and think their religious symbol should be made illegal (or subject to civil action). Probably not, of course, but this is why freedom of speech exists: so that singular and narrow viewpoints like your own don't become law.
    • If you want to make sure nobody hosts user generated content in the US, that might be a good way to do it.

  • I think the problem is how far companies have sunk in terms of ethical conscience in the name of making money. It seems that the more technology advances, the more people seek to operate in a way that maximizes their profits no matter what that technology is used for. People use the example of building roads, asking whether builders stop those roads from being used by terrorists. Well, actually they do, because those roads support people who go to work and pay taxes, and then those taxes go int
  • Back in the early days of the Internet (no, I'm not talking about the early web, I'm talking about the USENET era), many of the techno-visionaries saw the growth of the Internet and participation on USENET as a good thing, because it allowed marginalized communities to meet people they didn't know existed and develop their sense of place and power in the world.

    The techno-visionaries were thinking about minorities, people of color, LGBT+ folks, and other types you would find in San Francisco/San Jose at the time. What
    • by Tablizer ( 95088 )

      > What they didn't realize is that it also give a place for racists, Nazis, Fascists to meet with others

      They should have. There are always bad apples around, and various jerks have screwed up networks ever since they became common business and university tools in the '80s.

  • They'll find for Big Tech in this case, and against Big Tech in a censorship case. Political balancing.

  • Deep pockets: terrorists don't have them, but big corporations do. Plus, a corporation won't send someone with a bomb after the plaintiff.

