
Texas Court Rules Teens Can Sue Facebook For Its Alleged Role in Their Sex Trafficking (houstonchronicle.com)

The Houston Chronicle reports: The Texas Supreme Court ruled Friday in a Houston case that Facebook is not a "lawless no-man's-land" and can be held liable for the conduct of pimps who use its technology to recruit and prey on children.

The ruling came in a trio of Houston civil actions involving teenage trafficking victims who met their abusive pimps through Facebook's messaging functions. They sued the California-based social media juggernaut for negligence and product liability, saying that Facebook failed to warn about or attempt to prevent sex trafficking from taking place on its internet platforms. The suits also alleged that Facebook benefited from the sexual exploitation of trafficking victims. The justices said trafficking victims can move forward with lawsuits on the grounds that Facebook violated a provision of the Texas Civil Practice and Remedies Code passed in 2009.

Facebook lawyers argued the company was shielded from liability under Section 230 of the federal Communications Decency Act, which states that what users say or write online is not akin to a publisher conveying the same message. Essentially, they said, Facebook is immune to these types of lawsuits. The majority wrote, "We do not understand Section 230 to 'create a lawless no-man's-land on the Internet' in which states are powerless to impose liability on websites that knowingly or intentionally participate in the evil of online human trafficking... Holding internet platforms accountable for the words or actions of their users is one thing, and the federal precedent uniformly dictates that Section 230 does not allow it," the opinion said. "Holding internet platforms accountable for their own misdeeds is quite another thing. This is particularly the case for human trafficking."

The justices explained that Congress recently amended Section 230 to add the possibility of civil liability for websites that violate state and federal human-trafficking laws. They said that under the amended law, states may protect residents from internet companies that knowingly or intentionally participate in human trafficking through their action or inaction... Annie McAdams, a lead attorney for the plaintiffs, said it was a groundbreaking decision. This is the first case to beat Facebook on its argument that it had immunity under Section 230, she said.

Comments:
  • The moment Facebook started banning speech it didn't like, that made sure it was responsible for all the rest it allowed.

    You can't hide behind 230 when you are selective about who you do and do not let on the platform.

    • Watch out, Facebook and other platforms will wield the ban hammer on you for using logic and reason to their detriment.
        • Unfortunately for Facebook, the ultimate arbiter of what is fair lies in the civil courts and not at Facebook's discretion. While I think a platform should not be ruled negligent for the actions of those who use it fairly, I agree in this case with the OP. Because Facebook started moderating beyond the absolute requirement of law, it is now liable, unlike the telecom companies who didn't moderate voice.
        • by NFN_NLN ( 633283 )

          > Unfortunately for Facebook, the ultimate arbiter of what is fair lies in the civil courts and not at Facebooks discretion.

          Surely Facebook has the influence to bribe the judicial system to "not take the case"...
          That way the details of what happened don't get exposed and the Judge doesn't have to rule.

        • by sabri ( 584428 ) on Sunday June 27, 2021 @05:07PM (#61527712)

          Because Facebook started moderating beyond the absolute requirement of law,

          Yeah, as if you have any understanding of the law.

          There are two things that matter here.

          First of all, Facebook moderating public posts is way different from moderating messaging (which is what the case is about).

          Second, having the right to sue does not mean the plaintiff won the case. In order to establish that Facebook bears any liability whatsoever, they must provide sufficient evidence that Facebook was, at the very least, negligent. The plaintiff is alleging that Facebook is liable for the tort of a third party. Even if Facebook not moderating its messaging platform could somehow be construed as a tort, a significant amount of contributory negligence is present, because the plaintiff knowingly exchanged messages with an unknown person.

          If you read the article, you see that the only thing that happened on Facebook was the establishment of contact. Are you also going to sue the bar where a pimp meets a future trafficking victim?

          Don't you think there is a tad bit more liability on behalf of the actual pimp? And what about the parents of a 15-year-old, who were clearly negligent in monitoring her internet usage and allowing her to meet this unknown adult?

          No, we all know why Facebook is being sued: because that's where the money is. It's a roll of the dice, which the Supreme Court will shoot down in a heartbeat.

      • by alvian ( 6203170 )
        Let them. Those users and their followers will go somewhere else. That's how the free market works.
        • And then the NEW place will have the same problems. And after they go to yet another site, that site will have the same problems. Lather, rinse, repeat endlessly....
    • by Entrope ( 68843 )

      That's pretty much exactly what Section 230 says is not the rule. Facebook's potential liability is due to three-year-old changes in Section 230, usually called FOSTA-SESTA [wikipedia.org].


        by guruevi ( 827432 )

        FOSTA-SESTA was based around and extended section 230.

        It made providers liable for usage of their platforms that facilitates sex trafficking, IF they moderate content. Facebook demonstrated it was willing to moderate the sitting President of the United States, which was perfectly legal. As a libertarian, I have no problem with a company I don't interact with doing random stuff to its customers, but the Section 230 protections fall away, as they should.

        Twitter next.

    • Shouldn't they have been held accountable long before this?
      Seems like this exact EXCUSE was tried in the trials of Backpage (et alia) and Craigslist.
      Is it possible that FB's 'charitable contributions' to our politicians near and far have to date delayed this reckoning?

    • You can't hide behind 230 when you are selective about who you do and do not let on the platform.

      Are you saying that Facebook received easily verifiable reports that those pimps were using its platform to do child trafficking but did nothing about it? Because if Facebook didn't know what was going on, I'm not sure how you could fault them for these three cases.

      • by NaCh0 ( 6124 )

        Are you saying that Facebook received easily verifiable reports that those pimps were using its platform to do child trafficking but did nothing about it?

        Yes. When you read the details of the case, that is exactly what happened.

        • by Xenx ( 2211586 )
          I can only say that the details, as provided in the article, do not back you up. If there is proof that Facebook was aware of these particular pimps, and allowed their activity, it will come out in court. However, with the information available right now, Facebook doesn't appear to have been aware.
          • by NaCh0 ( 6124 ) on Sunday June 27, 2021 @11:07PM (#61528484) Homepage

            I'll quote the article since you missed it:

            "After the teen was rescued from his operation, traffickers kept using her profile to lure in other minors, according to the ruling. In this case the family says the girl’s mother reported what had happened to Facebook and the company never responded."

            The court can decide if the reported abuse was sufficient notification, but it does appear that a report of abuse was made.

            • If they did report it then it's on Facebook; they can't claim 230 when they have banned people for far less. They have banned people for viewpoints that broke no laws, but now they want to be immune from this.
            • by Xenx ( 2211586 )
              Fair enough. I did miss that somehow. I even recall reading that paragraph.
    • by Frank Burly ( 4247955 ) on Sunday June 27, 2021 @03:06PM (#61527364)

      The moment Facebook started banning speech it didn't like, that made sure it was responsible for all the rest it allowed.

      You can't hide behind 230 when you are selective about who you do and do not let on the platform.

      Some variation of this atextual idiocy keeps getting posted, and it keeps being wrong. By your logic, a website that tried but failed to block human traffickers would be liable, while a website that didn't do anything would retain immunity. The law says the exact opposite at 230(c)(2)(A):

      No provider or user of an interactive computer service shall be held liable on account of—(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected

      The full text of Section 230 is here: https://www.law.cornell.edu/us... [cornell.edu] Please point out how a "provider or user of an interactive computer service" can be held liable for "restricting [the] availability of material that the provider . . . considers . . . otherwise objectionable" by banning the speaker. And if you're going to claim that the term "good faith" somehow revives the fairness doctrine, please show your work and prepare to be taunted a second time.

      • by raymorris ( 2726007 ) on Sunday June 27, 2021 @03:14PM (#61527398) Journal

        You're absolutely right that 230 says they can't be held liable solely because they made a good-faith effort to control clearly objectionable things, like threats of violence and hardcore porn.*

        The REASON 230 was needed is precisely because without 230, if they control what's said and what isn't, they become liable. You're replying to someone who correctly stated the common law. You're right that 230 provides an exception to the common law.

        The plaintiffs in this case will need to show that either Facebook did things that are outside of 230, or the damages caused, the injury, is outside of 230. If it's not within the protections of 230, then it falls into the common law rule that Facebook may be liable.

        * Exactly what they can and cannot control under 230 is debatable because a few words of the law are poorly written.

        • The person I was replying to did not correctly state the common law; they incorrectly described the effect of Section 230.

          But I have no idea how the case will play out for this plaintiff. Someone else posted a link to the wikipedia page for FOSTA-SESTA, which I have not read, nor have I read the complaint or the decision, and I have no idea whether the Texas Supreme Court got that right. But "reckless disregard" seems to be the standard, and I imagine that would be a bit tough to prove, unless FB was allowin

          • by raymorris ( 2726007 ) on Sunday June 27, 2021 @04:10PM (#61527540) Journal

            > FOSTA-SESTA, which I have not read, nor have I read the complaint or the decision, and I have no idea whether the Texas Supreme Court got that right. But "reckless disregard" seems to be the standard

            FOSTA-SESTA is pretty short:

            https://www.congress.gov/bill/... [congress.gov]

            There seem to be two standards, for two different types of conduct.

            1. Promotion *or facilitation* of prostitution (where prostitution is not legal)

            OR

            2. Reckless disregard for whether sex trafficking is occurring.

            Note that's OR, not AND. Plaintiffs will need to prove one or the other.

            If Facebook provided filters or targeting that facilitates prostitution, that's not covered by 230. Generally because 230 says they aren't liable for what *users* post, it doesn't provide cover for what Facebook itself does; and specifically because FOSTA-SESTA says so. :)

            If Facebook did NOT facilitate prostitution, then we'd have to check reckless disregard for whether trafficking was occurring. If they received notice (complaints) about trafficking and took no action even after having been notified, that could be reckless disregard.

            I also haven't looked at the facts of these cases; I'm only looking at the law for now. Which might be good, actually. It can be helpful to first understand exactly what the law says, without being distracted by a particular case. Then after understanding the law, check the facts of a particular case to see how it aligns with the law.
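
            To make the structure of that test concrete, here is a toy Python encoding of the two prongs as read above (the prong names are illustrative; this sketches the comment's interpretation of the statute, not legal advice):

            def potentially_liable(facilitated_prostitution, reckless_disregard_of_trafficking):
                # FOSTA-SESTA as read above: either prong alone suffices -- OR, not AND.
                return facilitated_prostitution or reckless_disregard_of_trafficking

            print(potentially_liable(True, False))   # True: facilitation alone suffices
            print(potentially_liable(False, True))   # True: reckless disregard alone suffices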

            • P.S. The other thing to keep in mind is that 230 says only that the ISP is not to be treated as the publisher or speaker of the user-generated message, if they qualify. It doesn't say they can't have any liability of any type.

              That is, a plaintiff can't say "they are liable because they published this". A plaintiff could absolutely argue "they are liable because they allowed known sketchy dude to continue contacting underage girls with whom they had no known connection". They can say "Facebook is liable bec

        • by pavon ( 30274 )

          > The REASON 230 was needed is precisely because without 230, if they control what's said and what isn't, they become liable. You're replying to someone who correctly stated the common law. You're right that 230 provides an exception to the common law.

          Even that is not necessarily true. Prior to the enactment of the Communications Decency Act, there were all of two court cases considering this issue. Cubby v. CompuServe said web services were considered distributors for purposes of third-party liability, w

      • You are trying to make sense to an Internet lawyer...? :)
      • Is Facebook's "banning of speech it didn't like" considered "in good faith"? What about in other cases?

        Certainly, we can argue about whether or not they banned speech they didn't like. Also, it must be shown whether or not their actions were "in good faith."

        But, with the release of the Project Veritas videos from FB, I think the "good faith" argument is now HARDER for them. We might say, "Well, bad faith in one case does not negate good faith in other cases." BUT, if a pattern of bad faith efforts, d

      • What would constitute a lack of good faith then, according to you? Banning 'conservatives' for the slightest ToU violation ('misgendering'?) whilst allowing 'progressives' to call for and support BurnLootMurder (BLM) riots? Is that 'good faith' moderation?

        • There is no subject-matter connection between misgendering and support for BLM, so complaining that the moderation on these two topics is inconsistent is like complaining that posts with racial slurs are treated differently from posts supporting Bitcoin. Section 230 is not about validating your political identity or forcing Twitter to walk a mile in your shoes. Section 230 is about deciding whether Twitter has actually stepped into your shoes as a speaker, and thereby became liable for the illegal or tortio

    • by Anon E. Muss ( 808473 ) on Sunday June 27, 2021 @03:09PM (#61527378)

      You can't hide behind 230 when you are selective about who you do and do not let on the platform.

      Oh, sorry, wrong answer, but thank you for playing.

      Section 230 actually says the exact opposite. It explicitly allows sites to perform content moderation without losing their liability protections.

    • Only, that's not at all true. In fact you essentially made that up out of whole cloth. Don't feel bad, though, a lot of people simultaneously daydream that this is a real thing.
    • Section 230 is crystal clear. It gives Facebook the right to ban anyone they want for nearly any reason (save a few protected classes) while maintaining immunity.

      And despite what Fox News is telling you this is a good thing. The internet is not like traditional broadcast media. It's an entirely new technology. And new tech needs new laws.

      Take away Section 230 and Facebook will ban everything that anyone who can afford to sue disagrees with. It'll be just like the DMCA. A single take down notice and
    • by meglon ( 1001833 )
      You are, once again, wrong. It isn't an either/or, it's both. They can moderate their forums and still not be liable. I'm sorry you do not understand Section 230, but I have to think it's because you're intentionally being a fucking liar... because you have been told, and shown court cases, for several years now that show YOU ARE WRONG. So, either use that brain you supposedly have in your head for something other than keeping your ears apart to finally learn reality, or quit being a lying little piece o
    • The moment Facebook started banning speech it didn't like, that made sure it was responsible for all the rest it allowed.

      You can't hide behind 230 when you are selective about who you do and do not let on the platform.

      Yes, you can. Section 230 protects that, except for federal crimes (which these claims actually involve), where it doesn't provide any protection at all. The judge obviously has a concurring opinion, but that just means that Facebook will request that the trial be moved to federal court, at which time they'll get their Section 230 dismissal.

    • If that were true, we would not have the current Internet. The whole point is to protect from liability even if there's some editing or moderation.

    • The moment Facebook started banning speech it didn't like, that made sure it was responsible for all the rest it allowed.

      You can't hide behind 230 when you are selective about who you do and do not let on the platform.

      Those who modded this down, please listen: SuperKendall pisses me off sometimes too. But really? You had to mod him down into oblivion when he simply made a reasonable observation? Can't you evaluate a comment on its own merits and refrain from ad hominem modding? Sheesh!

    • The moment Facebook started banning speech it didn't like, that made sure it was responsible for all the rest it allowed.

      That's not how things work.

      You can't hide behind 230

      Honestly, I don't see how this ever had anything to do with 230. This is about enabling sex trafficking, not speech. Sex trafficking is simply not speech.

    • The moment Facebook started banning speech it didn't like, that made sure it was responsible for all the rest it allowed.

      You can't hide behind 230 when you are selective about who you do and do not let on the platform.

      Every word you posted is a lie.

      Read the law. [cornell.edu] There is no such clause.

      The law specifically encourages platforms to censor anything they find offensive. It also shields them from liability for anything they do not censor.

      There is nothing in the law about losing protection based on allowing or disallowing people to post on the site.

      Facebook may well lose, because of the 2018 FOSTA-SESTA act, which specifically applies to sex trafficking online. It will be a difficult case, due to the requirement to show f

  • by crunchygranola ( 1954152 ) on Sunday June 27, 2021 @02:38PM (#61527260)

    This will spoil Zuckerberg's eventual plan to monetize a cut from the action.

    • by NFN_NLN ( 633283 )

      He has enough money already... I think he wants to be like that guy that ran a dating website, just so he could poach the best ones for himself.

  • This is a lesson: laws like FOSTA that are targeted at your little-guy enemies (Craigslist) can and will eventually be targeted at you. Similarly, the Patriot Act is being used against white nationalists now, and not just against Muslims.
  • by hdyoung ( 5182939 ) on Sunday June 27, 2021 @02:57PM (#61527328)
    This is literally just a single bump in the road of the litigation process. Facebook has VERY good lawyers. People will harrumph about how this is great then forget about it. Then the appeal will quietly come. 1-4 years from now, the decision will be tossed or whittled away to nothing.

    Laws won't fix it either. This is money we're talking about. MONEY. Repeat after me: MONEY. One of the parties in Congress will not pass a single law or allow a single regulation to be put in place, because the rank and file of this party want a government that can't actually govern. It's even harder to do something that interferes with a company's ability to MAKE MONEY.

    It's not right. It's not wrong. It just is. There's very little that our entire society agrees on.

    Downmod in 3, 2, 1...
    • by fermion ( 181285 )
      When it comes to underage solicitation, money eventually loses. Look at Epstein. And this is an issue where there is some momentum. Several years ago a girl was groomed by a fellow classmate and trafficked by the classmate's father. The police were aware of this, knew the father was a child molester, but only wanted to gather evidence. The girl's father went in and got her out himself. This is the level of motivation here. Houston not only has guns, but lots of law schools and lawyers. It is one of the places that
      • Epstein??? That’s your example of society not tolerating underage solicitation???? The guy literally built a small industry providing underage girls for himself and his friends. Decades later, yeah, he gets arrested. Are you really going to call that a win? I'd say that example supports my own assertion more than yours. Money, money, money. Consequences are for old age, where they really don't matter much. Same goes for a single example of a Texan father going ham to get his daughter back. Do you have any idea
        • by fermion ( 181285 )
          I call it a win because even with full political support, including the likes of no less than Ken Starr, we got him in jail and dead. Anyone who molests children and rapes minors and has some power is very difficult to stop. Look at Dennis Hastert. Matt Gaetz clearly believes he is untouchable. Which is why eroding their tools to begin with is the best approach.
    • by AmiMoJo ( 196126 )

      If that was true then e.g. tobacco companies would have avoided massive fines and laws requiring them to put health warnings on their products.

    • Honestly, companies don't actually do that well if a case gets beyond the first hurdle of dismissal, and this case in particular feels like one of the bad ones for them: the PR damage of being in court over sex trafficking is pretty bad, even if you ultimately win. In fact, I would fully expect to see this settled very quickly to avoid just that; getting dragged into the courts, with the dangers of discovery, can quickly turn very ugly.
  • by NicknamesAreStupid ( 1040118 ) on Sunday June 27, 2021 @03:14PM (#61527396)
    Craigslist was putting pimps out of business and liberating female sex workers when, all of a sudden, https://www.congress.gov/bill/... [congress.gov] came along and shut that down.
  • Section 230 is pretty explicit. Facebook is protected here. The judge should know this. He's either incompetent or corrupt.
    • All of the armchair lawyers on /. seem to think they understand 230, but not many do. There is nothing in the summary to tell us the actual allegations. "We do not understand Section 230 to 'create a lawless no-man's-land on the Internet' in which states are powerless to impose liability on websites that knowingly or intentionally participate in the evil of online human trafficking... Holding internet platforms accountable for the words or actions of their users is one thing, and the federal precedent unif
      • The plaintiffs claim they were lured using messenger and Instagram.

        The lawsuits were brought by three Houston women recruited as teens through Facebook apps and trafficked as a result of those online connections. The young women said in court filings that the social media giant cloaked traffickers with credibility and provided “a point of first contact between sex traffickers and these children” and “an unrestricted platform to stalk, exploit, recruit, groom, and extort children into the sex trade.”

        The women admit they were contacted and chose to interact with the men on the platforms and that they also chose to meet the men in person.

        The judges' full quote is:

        “Holding internet platforms accountable for the words or actions of their users is one thing, and the federal precedent uniformly dictates that Section 230 does not allow it,” the opinion said. “Holding internet platforms accountable for their own misdeeds is quite another thing. This is particularly the case for human trafficking.”

        Somehow, the judges decided that the "human trafficking" was a misdeed of Facebook and not the perpetrators.

        My question is whom the women would sue if they had been lured and groomed at the public library, the mall, or at a local park?

        • by MrL0G1C ( 867445 )

          My question is whom the women would sue if they had been lured and groomed at the public library, the mall, or at a local park?

          How about the local gov't or the police? There's a considerable chance that if these creeps hung around in those public places, people would have seen them and reported them.

          • No, there isn't a good chance of that happening.
          • Also, the police and government can't do anything until the creeps break the law. Until they actually force the girls into prostitution, they haven't done anything wrong. You aren't very bright, are you?
  • What do the plaintiffs think Facebook should have done? Any answer that boils down to "have a human review everything" is simply not possible. It doesn't matter how badly you wish it were possible -- it isn't possible.

    • Any answer that boils down to "have a human review everything" is simply not possible. It doesn't matter how badly you wish it were possible -- it isn't possible.

      I don't see why that isn't possible.

      • by flink ( 18449 )

        Any answer that boils down to "have a human review everything" is simply not possible. It doesn't matter how badly you wish it were possible -- it isn't possible.

        I don't see why that isn't possible.

        Because that would render any internet platform that allows two arbitrary human beings to communicate so expensive it would be impossible for them to exist.

        • Would it? How much would it cost?

          • In 2016, there were about 60 billion messages sent on FB daily. We can assume that number has risen since then. If it costs $1 to check every message, that's $60+ billion per day.

            • If it costs $1 to check every message

              Yes, if you pay people too much, then it can't be done.

              • Yes, if you pay people too much, then it can't be done.

                So what is not too much? $6 billion/day? $600 million/day? $60 million/day? Remember, those who check the messages need a living wage.

                There's another very obvious problem you fail to grasp: how many people do you need to hire so they can check 60+ billion messages every day?

                Say you can check 200 messages an hour (spending 18 seconds per message) and staff work in 8-hour shifts 24/7. That's 4,800 messages per work-slot per day, and each slot has to be filled by 3 employees. How many millions of employees do they need to reas
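
                A back-of-the-envelope sketch of this arithmetic in Python, using only the figures assumed in this thread plus a hypothetical $15/hour wage (none of these are real Facebook numbers):

                MESSAGES_PER_DAY = 60_000_000_000   # ~60 billion/day, the 2016 figure cited above
                MSGS_PER_REVIEWER_HOUR = 200        # 18 seconds per message, as assumed above
                SHIFT_HOURS = 8                     # one reviewer covers an 8-hour shift
                HOURLY_WAGE = 15.0                  # assumed living wage, USD/hour (hypothetical)

                msgs_per_slot_day = MSGS_PER_REVIEWER_HOUR * 24     # 4,800 messages per 24/7 work slot
                slots = MESSAGES_PER_DAY / msgs_per_slot_day        # 12,500,000 slots
                reviewers = slots * (24 / SHIFT_HOURS)              # 37,500,000 people on payroll
                daily_wages = MESSAGES_PER_DAY / MSGS_PER_REVIEWER_HOUR * HOURLY_WAGE

                print(f"{reviewers:,.0f} reviewers")        # 37,500,000
                print(f"${daily_wages:,.0f} in wages/day")  # $4,500,000,000

                So even at $15/hour rather than $1/message, the wage bill alone is about $4.5 billion a day, which is the poster's point.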

                • Early in AT&T's history, when human switchboard operators were the way telephone calls connected, their HR department concluded that telephony growth would require that everyone in America become a switchboard operator within 20 years. Then they invented the electro-mechanical telephony switch.
                  • A) Telephone operators don't monitor the entire conversation; they simply connect the parties. This makes your comparison false.
                    B) We already have automated monitoring of posts, and people complain about how bad it is and that it is either too strict or not strict enough.
                    • by Rhipf ( 525263 )

                      A) Telephone operators don't monitor the entire conversation; they simply connect the parties. This makes your comparison false.

                      The fact that telephone operators don't monitor conversations only makes NicknamesAreStupid's comment more relevant. If it would take every American to become a switchboard operator to make physical switchboards work it would take even more than that many people to review every post on Facebook. So the comparison isn't false it is just understated.

                  • by flink ( 18449 )

                    Early in AT&T's history, when human switchboard operators were the way telephone calls connected, their HR department concluded that telephony growth would require that everyone in America become a switchboard operator within 20 years. Then they invented the electro-mechanical telephony switch.

                    The whole premise of this discussion is that some people like phantomfive contend that every message on Facebook should be vetted by a human. Yes automation exists and is being employed, but apparently that is not good enough.

                    If we required a human to listen in on every telephone conversation to make sure the phone system wasn't being used to abet the trafficking of teens, then yes, either everyone would be a telephone operator or a telephone call would be so expensive only the well off could afford them.

            • by HiThere ( 15173 )

              It's probably closer to 25 cents or less on average. There will be some edge cases that are harder, but those can be bounced with a "please rephrase".

              But it's still impossibly expensive.

            • by sconeu ( 64226 )

              To hell with the cost.

              Assume there are 60 billion messages. Assume that a person can check 1,000 messages per day (I think that's high, but let's go with it). That means FB alone would need 60 million employees doing nothing but checking posts 24/7. That's roughly the population of Italy [worldometers.info].

              Somehow, I don't think that's feasible.
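
              For what it's worth, this version of the estimate in Python (the 1,000 messages/reviewer/day figure is the comment's own generous assumption):

              reviewers = 60_000_000_000 // 1_000   # messages per day / messages per reviewer-day
              print(f"{reviewers:,} reviewers")     # 60,000,000 -- roughly Italy's population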

            • If it costs $1 to check every message, that's $60+ billion dollars.

              Automation reduces the cost. FB seemed to have no cost issues when it came to negative posts about the Biden family business prior to the election, or posts pointing out where Fauci was being political rather than scientific, or any of a host of other topics that fit the politics of the staff. Opposition attorneys will love delving into Facebook's practices and costs in such areas to establish FB's capabilities.

              • by flink ( 18449 )

                Is it really so surprising that big companies treat people in power differently than the average person? When Trump was in power, he could shit post all he wanted. Even when he was arguably violating the ToS, exceptions were carved out for politicians because their messages supposedly have news value. When it was clear he was out of power and a bunch of his followers trashed the workplace of the people responsible for writing regulations that govern big media companies, surprise, he was out, and we should

                • by drnb ( 2434720 )

                  Is it really so surprising that big companies treat people in power differently than the average person? When Trump was in power, he could shit post all he wanted. Even when he was arguably violating the ToS, exceptions were carved out for politicians because their messages supposedly have news value. When it was clear he was out of power and a bunch of his followers trashed the workplace of the people responsible for writing regulations that govern big media companies, surprise, he was out, and we shouldn't be treating anyone special, except hey, look, this new guy's in power and now we will clean up his messes. It's not shocking or anything, it's just capital protecting itself.

                  The media did not clean up Trump's messes. They amplified stories of his actual messes and created some negative stories that were not true. The protection is only extended to one side in this case. Letting him spout crazy shit at 3am actually hurt his re-election chances, so allowing him to continue did him no favors.

      • How many people would it take to read everything posted on Facebook, Instagram, or sent through messenger every day?
      • by Rhipf ( 525263 )

        Based on the vast number of posts on Facebook every day, you would need more than a million people reviewing posts and private messages.
        The actual number of live human reviewers needed would probably be in the tens to hundreds of millions.

    • What do the plaintiffs think Facebook should have done?

      Put in a sizable fraction of the effort to remove sex trafficking that they put into covering up negative Biden stories in the run-up to the election.

      • by catprog ( 849688 )

        Does the story contain the word Biden? If so, put it in the queue for checking whether it's bad.

        Does the story/message contain the words "sex trafficking"? If so, put it in the queue for checking.

        Done.
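
        For what it's worth, the naive keyword queue described above is only a few lines of Python; a minimal sketch (the keyword list and queue are illustrative assumptions, not Facebook's actual pipeline):

        from collections import deque

        FLAG_KEYWORDS = ("biden", "sex trafficking")  # terms suggested above
        review_queue = deque()                        # messages awaiting human review

        def flag_for_review(message):
            """Queue the message for human review if it mentions a flagged term."""
            text = message.lower()
            if any(keyword in text for keyword in FLAG_KEYWORDS):
                review_queue.append(message)
                return True
            return False

        flag_for_review("totally innocent chat")           # False, not queued
        flag_for_review("story mentions sex trafficking")  # True, queued

        The catch, as the rest of the thread argues, is that keyword matching misses euphemisms and floods the queue with false positives, so "Done." is doing a lot of work here.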


  • by VeryFluffyBunny ( 5037285 ) on Sunday June 27, 2021 @04:16PM (#61527564)
    ...please, dear judges & juries, find Facebook guilty on all counts.
  • I hope Slashdot has mechanisms in place to deal with this! You wouldn't believe the trafficking in 42-year-old software developers that's out there...
  • As much as I hate Facebook, I have to side with them in this case. This ruling will open the floodgates on nuisance and other frivolous legal actions against ALL web sites that post user comments that sometimes contain content that is offensive or otherwise "emotionally harmful" to readers, like pointing out that someone's favorite celeb is a rapist ("That's not true! He would NEVER drug and rape a 13-year-old! The witnesses are all lying!")

    And now that the "gloves are off" in that a web site is legally r

  • ... shielded from liability under Section 230 of the federal Communications Decency Act ...

    They ban female nipples because 'rules' but not pimps. Facebook knows it has double standards. Or maybe not: pimps spend money, naked breasts don't.

    • You would have them ban all those rappers claiming to be pimps? Are you a racist?
      • You would have them ban all those rappers claiming to be pimps? Are you a racist?

        They can't do that and not also ban the racist pimps who claim to be patriots....

  • Wonder who is running the class action? Me? You? Split the diff, lol.

"The vast majority of successful major crimes against property are perpetrated by individuals abusing positions of trust." -- Lawrence Dalzell

Working...