Texas Court Rules Teens Can Sue Facebook For Its Alleged Role in Their Sex Trafficking (houstonchronicle.com)
The Houston Chronicle reports:
The Texas Supreme Court ruled Friday in a Houston case that Facebook is not a "lawless no-man's-land" and can be held liable for the conduct of pimps who use its technology to recruit and prey on children.
The ruling came in a trio of Houston civil actions involving teenage trafficking victims who met their abusive pimps through Facebook's messaging functions. They sued the California-based social media juggernaut for negligence and product liability, saying that Facebook failed to warn about or attempt to prevent sex trafficking from taking place on its internet platforms. The suits also alleged that Facebook benefited from the sexual exploitation of trafficking victims. The justices said trafficking victims can move forward with lawsuits on the grounds that Facebook violated a provision of the Texas Civil Practice and Remedies Code passed in 2009.
Facebook lawyers argued the company was shielded from liability under Section 230 of the federal Communications Decency Act, which states that what users say or write online is not akin to a publisher conveying the same message. Essentially, they said, Facebook is immune to these types of lawsuits. The majority wrote, "We do not understand Section 230 to 'create a lawless no-man's-land on the Internet' in which states are powerless to impose liability on websites that knowingly or intentionally participate in the evil of online human trafficking... Holding internet platforms accountable for the words or actions of their users is one thing, and the federal precedent uniformly dictates that Section 230 does not allow it," the opinion said. "Holding internet platforms accountable for their own misdeeds is quite another thing. This is particularly the case for human trafficking."
The justices explained that Congress recently amended Section 230 to add the possibility of civil liability for websites that violate state and federal human-trafficking laws. They said under the amended law states may protect residents from internet companies that knowingly or intentionally participate in human trafficking through their action or inaction... Annie McAdams, a lead attorney for the plaintiffs, said it was a groundbreaking decision. This is the first case to beat Facebook on its argument that it had immunity under Section 230, she said.
Sorry Facebook, you lost this (Score:2, Insightful)
The moment Facebook started banning speech it didn't like, it made itself responsible for all the rest it allowed.
You can't hide behind 230 when you are selective about who you do and do not let on the platform.
Re: (Score:2)
Re: Sorry Facebook, you lost this (Score:1)
Re: (Score:3)
> Unfortunately for Facebook, the ultimate arbiter of what is fair lies in the civil courts and not at Facebook's discretion.
Surely Facebook has the influence to bribe the judicial system to "not take the case"...
That way the details of what happened don't get exposed and the Judge doesn't have to rule.
Re: (Score:2)
I don't think there was any implication that the bribing of the judge would be legal, just not actionable. Like a huge campaign donation.
Re: Sorry Facebook, you lost this (Score:5, Informative)
Because Facebook started moderating beyond the absolute requirement of law,
Yeah, as if you have any understanding of the law.
There are two things that matter here.
First of all, Facebook moderating public posts is way different from moderating messaging (which is what the case is about).
Second, having the right to sue does not mean P won the case. In order to establish that Facebook bears any liability whatsoever, they must provide sufficient evidence that Facebook was, at the very least, negligent. P is alleging that Facebook is liable for the tort of a third party. Even if Facebook not moderating its messaging platform could somehow be construed as a tort, a significant amount of contributory negligence is present due to the fact that P knowingly exchanged messages with an unknown person.
If you read the article, you see that the only thing that happened on Facebook is the establishing of contact. Are you also going to sue the bar where a pimp meets a future trafficking victim?
Don't you think there is a tad more liability on behalf of the actual pimp? And what about the parents of a 15-year-old who were clearly negligent in monitoring her internet usage and allowing her to meet this unknown adult?
No, we all know why Facebook is being sued: because that's where the money is. It's a roll of the dice, which the Supreme Court will shoot down in a heartbeat.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
That's pretty much exactly what Section 230 says is not the rule. Facebook's potential liability is due to three-year-old changes in Section 230, usually called FOSTA-SESTA [wikipedia.org].
Re: (Score:1, Interesting)
FOSTA-SESTA was based around and extended section 230.
It made providers liable for usage of their platforms that facilitates sex trafficking, IF they moderate content. Facebook demonstrated it was willing to moderate the sitting President of the United States. That's perfectly legal; as a libertarian, I have no problem with a company I don't interact with doing random stuff to its customers, but Section 230 protections fall away, as they should.
Twitter next.
Re: Sorry Facebook, you lost this (Score:1)
Re: (Score:2)
Shouldn't they have been held accountable long before this?
Seems like this exact EXCUSE was tried in the trials of Backpage (et alia) and Craigslist.
Is it possible that FB's 'charitable contributions' to our politicians near and far have to date delayed this reckoning?
Re: (Score:1)
Re: (Score:2)
Shouldn't they have been held accountable long before this?
Laws only apply to people who aren't in the club.
Re: (Score:2)
You can't hide behind 230 when you are selective about who you do and do not let on the platform.
Are you saying that Facebook received easily verifiable reports that those pimps were using its platform to do child trafficking but did nothing about it? Because if Facebook didn't know what was going on, I'm not sure how you could fault them for these three cases.
Re: (Score:2)
Are you saying that Facebook received easily verifiable reports that those pimps were using its platform to do child trafficking but did nothing about it?
Yes. When you read the details of the case, that is exactly what happened.
Re: (Score:1)
Re:Sorry Facebook, you lost this (Score:4, Informative)
I'll quote the article since you missed it:
"After the teen was rescued from his operation, traffickers kept using her profile to lure in other minors, according to the ruling. In this case the family says the girl’s mother reported what had happened to Facebook and the company never responded."
The court can decide if the reported abuse was sufficient notification, but it does appear that a report of abuse was made.
Re: (Score:1)
Re: (Score:3)
Re:Sorry Facebook, you lost this (Score:5, Informative)
The moment Facebook started banning speech it didn't like, it made itself responsible for all the rest it allowed.
You can't hide behind 230 when you are selective about who you do and do not let on the platform.
Some variation of this atextual idiocy keeps getting posted, and it keeps being wrong. By your logic, a website that tried, but failed, to block human traffickers would be liable, while a website that didn't do anything would retain immunity. The law says the exact opposite at 230(c)(2)(A):
No provider or user of an interactive computer service shall be held liable on account of—(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected
The full text of Section 230 is here: https://www.law.cornell.edu/us... [cornell.edu] Please point out how a "provider or user of an interactive computer service" can be held liable for "restricting [the] availability of material that the provider . . . considers . . . otherwise objectionable" by banning the speaker. And if you're going to claim that the term "good faith" somehow revives the fairness doctrine, please show your work and prepare to be taunted a second time.
That's right - so was the opposite (Score:4, Interesting)
You're absolutely right that 230 says they can't be held liable solely because they made a good-faith effort to control clearly objectionable things, like threats of violence and hardcore porn.*
The REASON 230 was needed is precisely because without 230, if they control what's said and what isn't, they become liable. You're replying to someone who correctly stated the common law. You're right that 230 provides an exception to the common law.
The plaintiffs in this case will need to show either that Facebook did things that are outside of 230, or that the injury itself is outside of 230. If it's not within the protections of 230, then it falls under the common-law rule that Facebook may be liable.
* Exactly what they can and cannot control under 230 is debatable because a few words of the law are poorly written.
Re: (Score:2)
The person I was replying to did not correctly state the common law; he incorrectly described the effect of Section 230.
But I have no idea how the case will play out for this plaintiff. Someone else posted a link to the wikipedia page for FOSTA-SESTA, which I have not read, nor have I read the complaint or the decision, and I have no idea whether the Texas Supreme Court got that right. But "reckless disregard" seems to be the standard, and I imagine that would be a bit tough to prove, unless FB was allowin
FOSTA-SESTA is a short law (Score:4, Informative)
> FOSTA-SESTA, which I have not read, nor have I read the complaint or the decision, and I have no idea whether the Texas Supreme Court got that right. But "reckless disregard" seems to be the standard
FOSTA-SESTA is pretty short:
https://www.congress.gov/bill/... [congress.gov]
There seem to be two standards, for two different types of conduct:
1. Promotion *or facilitation* of prostitution (where prostitution is not legal)
OR
2. Reckless disregard for whether sex trafficking is occurring.
Note that's OR, not AND. Plaintiffs will need to prove one or the other.
If Facebook provided filters or targeting that facilitates prostitution, that's not covered by 230. Generally, because 230 says they aren't liable for what *users* post, not for what Facebook itself does; and specifically, because FOSTA-SESTA says so. :)
If Facebook did NOT facilitate prostitution, then we'd have to check for reckless disregard for whether trafficking was occurring. If they received notice (complaints) about trafficking and took no action, that could be reckless disregard.
I also haven't looked at the facts of these cases; I'm only looking at the law for now. Which might be good, actually. It can be helpful to first understand exactly what the law says, without being distracted by a particular case. Then after understanding the law, check the facts of a particular case to see how it aligns with the law.
Re: (Score:2)
PS: the other thing to keep in mind is that 230 says only that the ISP is not to be treated as the publisher or speaker of the user-generated message, if they qualify. It doesn't say they can't have any liability of any type.
That is, a plaintiff can't say "they are liable because they published this". A plaintiff could absolutely argue "they are liable because they allowed known sketchy dude to continue contacting underage girls with whom they had no known connection". They can say "Facebook is liable bec
Re: (Score:2)
> The REASON 230 was needed is precisely because without 230, if they control what's said and what isn't, they become liable. You're replying to someone who correctly stated the common law. You're right that 230 provides an exception to the common law.
Even that is not necessarily true. Prior to the enactment of the Communications Decency Act, there were all of two court cases considering this issue. Cubby v. CompuServe said web services were considered distributors for purposes of third-party liability, while Stratton Oakmont v. Prodigy treated a service that moderated content as a publisher that could be held liable.
Re: (Score:2)
Re: (Score:1)
Is Facebook's "banning of speech it didn't like" considered "in good faith"? What about in other cases?
Certainly, we can argue about whether or not they banned speech they didn't like. Also, it must be shown whether or not their actions were "in good faith."
But, with the release of the Project Veritas videos from FB, I think the "good faith" argument is now HARDER for them. We might say, "Well bad faith in one case does not negate good faith in other cases." BUT, if a pattern of bad faith efforts, d
Re: Sorry Facebook, you lost this (Score:2)
What would constitute a lack of good faith then, according to you? Banning 'conservatives' for the slightest ToU violation ('misgendering'?) whilst allowing 'progressives' to call for and support BurnLootMurder (BLM) riots? Is that 'good faith' moderation?
Re: (Score:2)
There is no subject-matter connection between misgendering and support for BLM, so complaining that the moderation on these two topics is inconsistent is like complaining that posts with racial slurs are treated differently from posts supporting Bitcoin. Section 230 is not about validating your political identity or forcing Twitter to walk a mile in your shoes. Section 230 is about deciding whether Twitter has actually stepped into your shoes as a speaker, and thereby became liable for the illegal or tortious content.
Re:Sorry Facebook, you lost this (Score:4, Insightful)
You can't hide behind 230 when you are selective about who you do and do not let on the platform.
Oh, sorry, wrong answer, but thank you for playing.
Section 230 actually says the exact opposite. It explicitly allows sites to perform content moderation without losing their liability protections.
Re: (Score:3)
No, they didn't (Score:2)
And despite what Fox News is telling you, this is a good thing. The internet is not like traditional broadcast media. It's an entirely new technology. And new tech needs new laws.
Take away Section 230 and Facebook will ban everything that anyone who can afford to sue disagrees with. It'll be just like the DMCA: a single takedown notice and the content comes down, no questions asked.
Re: (Score:1)
Re: (Score:1)
The moment Facebook started banning speech it didn't like, it made itself responsible for all the rest it allowed.
You can't hide behind 230 when you are selective about who you do and do not let on the platform.
Yes, you can. In fact, 230 protects moderation across the board except for federal crimes (which these claims actually are), where it doesn't provide any protection at all. The judge obviously has a concurring opinion, but that just means that Facebook will request that the trial be moved to federal court, at which time they'll get their Section 230 dismissal.
Re: (Score:2)
If that were true, we would not have the current Internet. The whole point is to protect from liability even if there's some editing or moderation.
Re: (Score:2)
The moment Facebook started banning speech it didn't like, it made itself responsible for all the rest it allowed.
You can't hide behind 230 when you are selective about who you do and do not let on the platform.
Those who modded this down, please listen: SuperKendall pisses me off sometimes too. But really? You had to mod him down into oblivion when he simply made a reasonable observation? Can't you evaluate a comment on its own merits and refrain from ad hominem modding? Sheesh!
Re: (Score:2)
The moment Facebook started banning speech it didn't like, it made itself responsible for all the rest it allowed.
That's not how things work.
You can't hide behind 230
Honestly, I don't see how this ever had anything to do with 230. This is about enabling sex trafficking, not speech. Sex trafficking is simply not speech.
Re: (Score:2)
The moment Facebook started banning speech it didn't like, it made itself responsible for all the rest it allowed.
You can't hide behind 230 when you are selective about who you do and do not let on the platform.
Every word you posted is a lie.
Read the law. [cornell.edu] There is no such clause.
The law specifically encourages platforms to censor anything they find offensive. It also shields them from liability for anything they do not censor.
There is nothing in the law about losing protection based on allowing or disallowing people to post on the site.
Facebook may well lose, because the 2018 FOSTA-SESTA act specifically applies to sex trafficking online. It will be a difficult case, due to the requirement to show Facebook knowingly facilitated the trafficking.
Z Is Sad (Score:3)
This will spoil Zuckerberg's eventual plan to monetize a cut from the action.
Re: (Score:1)
He has enough money already... I think he wants to be like that guy who ran a dating website just so he could poach the best ones for himself.
FOSTA (Score:2)
they will dodge this (Score:3)
Laws won't fix it either. This is money we're talking about. MONEY. Repeat after me: MONEY. One of the parties in Congress will not pass a single law or allow a single regulation to be put in place, because the rank and file of this party want a government that can't actually govern. It's even harder to do something that interferes with a company's ability to MAKE MONEY.
It's not right. It's not wrong. It just is. There's very little that our entire society agrees on.
Downmod in 3, 2, 1...
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
If that were true, then e.g. tobacco companies would have avoided massive fines and laws requiring them to put health warnings on their products.
Re: (Score:2)
Kind of reminds me of Craigslist (Score:5, Insightful)
It's just going to get thrown out (Score:1)
Re: (Score:2)
Re: (Score:2)
The lawsuits were brought by three Houston women recruited as teens through Facebook apps and trafficked as a result of those online connections. The young women said in court filings that the social media giant cloaked traffickers with credibility and provided “a point of first contact between sex traffickers and these children” and “an unrestricted platform to stalk, exploit, recruit, groom, and extort children into the sex trade.”
The women admit they were contacted and chose to interact with the men on the platforms and that they also chose to meet the men in person.
The judges' full quote is:
“Holding internet platforms accountable for the words or actions of their users is one thing, and the federal precedent uniformly dictates that Section 230 does not allow it,” the opinion said. “Holding internet platforms accountable for their own misdeeds is quite another thing. This is particularly the case for human trafficking.”
Somehow, the judges decided that the "human trafficking" was a misdeed of Facebook and not the perpetrators.
My question is whom the women would sue if they had been lured and groomed at the public library, the mall, or at a local park?
Re: (Score:2)
How about the local gov't or the police? There's a considerable chance that if these creeps hung around in those public places, people would have seen them and reported them.
Re: (Score:2)
Re: (Score:2)
What exactly would they have Facebook do? (Score:2)
What do the plaintiffs think Facebook should have done? Any answer that boils down to "have a human review everything" is simply not possible. It doesn't matter how badly you wish it were possible -- it isn't possible.
Re: (Score:1)
Any answer that boils down to "have a human review everything" is simply not possible. It doesn't matter how badly you wish it were possible -- it isn't possible.
I don't see why that isn't possible.
Re: (Score:2)
Any answer that boils down to "have a human review everything" is simply not possible. It doesn't matter how badly you wish it were possible -- it isn't possible.
I don't see why that isn't possible.
Because that would render any internet platform that allows two arbitrary human beings to communicate so expensive that it would be impossible for it to exist.
Re: (Score:1)
Would it? How much would it cost?
Re: (Score:2)
In 2016, there were about 60 billion messages sent on FB daily. We can assume that number has risen since then. If it costs $1 to check every message, that's $60+ billion a day.
Re: (Score:2)
If it costs $1 to check every message
Yes, if you pay people too much, then it can't be done.
Re: (Score:2)
Yes, if you pay people too much, then it can't be done.
So what is not too much? $6 billion/day? $600 million/day? $60 million/day? Remember, those who check the messages need a living wage.
There's another very obvious problem you fail to grasp: how many people do you need to hire so they can check 60+ billion messages every day?
Say you can check 200 messages an hour (spending 18 sec per message) and reviewers work 8-hour shifts 24/7. That's 4,800 messages for one work slot per day, and each slot has to be filled by 3 employees. You'd need about 12.5 million slots, or some 37.5 million employees, just to keep up.
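A quick back-of-the-envelope sketch of both figures in this thread, using the numbers assumed above (60 billion messages/day, $1 per review, 200 messages reviewed per hour, 8-hour shifts, 24/7 coverage). These are the commenters' assumptions, not Facebook's actual volumes or costs:

    # Back-of-the-envelope review cost and staffing, per the thread's assumptions.
    MESSAGES_PER_DAY = 60_000_000_000   # assumed daily message volume
    COST_PER_REVIEW = 1.00              # assumed $1 per message reviewed
    MSGS_PER_HOUR = 200                 # one reviewer, ~18 seconds per message
    SHIFT_HOURS = 8
    SHIFTS_PER_DAY = 3                  # 24/7 coverage

    daily_cost = MESSAGES_PER_DAY * COST_PER_REVIEW                       # $60 billion/day
    msgs_per_slot_per_day = MSGS_PER_HOUR * SHIFT_HOURS * SHIFTS_PER_DAY  # 4,800
    slots = MESSAGES_PER_DAY / msgs_per_slot_per_day                      # 12.5 million
    employees = slots * SHIFTS_PER_DAY                                    # 37.5 million

    print(f"Daily review cost:            ${daily_cost:,.0f}")
    print(f"Round-the-clock review slots: {slots:,.0f}")
    print(f"Total reviewers needed:       {employees:,.0f}")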
Re: (Score:2)
Re: (Score:2)
B) We already have automated monitoring of posts, and people complain about how bad it is: it's either too strict or not strict enough.
Re: (Score:2)
A) Telephone operators don't monitor the entire conversation. They simply connect the callers. This makes your comparison false.
The fact that telephone operators don't monitor conversations only makes NicknamesAreStupid's comment more relevant. If making physical switchboards work would have required every American to become a switchboard operator, reviewing every post on Facebook would take even more people than that. So the comparison isn't false; it is just understated.
Re: (Score:2)
Early in AT&T's history, when human switchboard operators were how telephone calls got connected, their HR department concluded that telephony growth would require everyone in America to become a switchboard operator within 20 years. Then they invented the electro-mechanical telephony switch.
The whole premise of this discussion is that some people, like phantomfive, contend that every message on Facebook should be vetted by a human. Yes, automation exists and is being employed, but apparently that is not good enough.
If we required a human to listen in on every telephone conversation to make sure the phone system wasn't being used to abet the trafficking of teens, then yes, either everyone would be a telephone operator or a telephone call would be so expensive that only the well-off could afford one.
Re: (Score:2)
It's probably closer to 25 cents or less on average. There will be some edge cases that are harder, but those can be bounced with a "please rephrase".
But it's still impossibly expensive.
Re: (Score:2)
To hell with the cost.
Assume there are 60 billion messages. Assume that a person can check 1,000 messages per day (I think that's high, but let's go with it). That means FB alone would need 60 million employees doing nothing but checking posts 24/7. That's roughly the population of Italy [worldometers.info].
Somehow, I don't think that's feasible.
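For what it's worth, the parent's arithmetic checks out under its own assumptions (1,000 reviewed messages per person per day; Italy's population of roughly 60 million per the linked source):

    # Sanity check of the parent's estimate.
    MESSAGES_PER_DAY = 60_000_000_000
    REVIEWS_PER_PERSON_PER_DAY = 1_000
    ITALY_POPULATION = 60_000_000  # approximate

    reviewers = MESSAGES_PER_DAY // REVIEWS_PER_PERSON_PER_DAY
    print(f"{reviewers:,} full-time reviewers")                   # 60,000,000
    print(f"{reviewers / ITALY_POPULATION:.1f}x Italy's population")  # 1.0x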
Automation reduces the cost (Score:2)
If it costs $1 to check every message, that's $60+ billion dollars.
Automation reduces the cost. FB seemed to have no cost issues when it came to negative posts about the Biden family business prior to the election, or posts pointing out where Fauci was being political rather than scientific, or any of a host of other topics that fit the politics of the staff. Opposition attorneys will love delving into Facebook's practices and costs in such areas to establish FB's capabilities.
Re: (Score:2)
Is it really so surprising that big companies treat people in power differently than the average person? When Trump was in power, he could shitpost all he wanted. Even when he was arguably violating the ToS, exceptions were carved out for politicians because their messages supposedly have news value. When it was clear he was out of power and a bunch of his followers trashed the workplace of the people responsible for writing regulations that govern big media companies, surprise, he was out, and we shouldn't be treating anyone special, except hey, look, this new guy's in power and now we will clean up his messes. It's not shocking or anything, it's just capital protecting itself.
Re: (Score:2)
Is it really so surprising that big companies treat people in power differently than the average person? When Trump was in power, he could shitpost all he wanted. Even when he was arguably violating the ToS, exceptions were carved out for politicians because their messages supposedly have news value. When it was clear he was out of power and a bunch of his followers trashed the workplace of the people responsible for writing regulations that govern big media companies, surprise, he was out, and we shouldn't be treating anyone special, except hey, look, this new guy's in power and now we will clean up his messes. It's not shocking or anything, it's just capital protecting itself.
The media did not clean up Trump's messes. They amplified stories of his actual messes and created some negative stories that were not true. The protection is only extended to one side in this case. Letting him spout crazy shit at 3am actually hurt his re-election chances, so allowing him to continue did him no favors.
Re: (Score:2)
Re: (Score:2)
Based on the vast number of posts on Facebook every day, you would need >1 million people reviewing posts and private messages.
The actual number of live human reviewers needed would probably be in the tens to hundreds of millions.
React to trafficking as they do with Biden stories (Score:1)
What do the plaintiffs think Facebook should have done?
Put in a sizable fraction of the effort to remove sex trafficking that they put into covering up negative Biden stories in the run-up to the election.
Re: (Score:2)
Does the story contain the word "Biden"? If so, put it in the queue for checking whether it's bad.
Does the story/message contain the words "sex trafficking"? If so, put it in the queue for checking.
Done.
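A minimal sketch of the keyword-flagging approach described above, assuming a simple in-memory queue; the watch list and function name are hypothetical placeholders, not anything Facebook actually runs:

    # Minimal sketch of keyword flagging feeding a human review queue.
    # Watch list and queue are hypothetical placeholders.
    FLAGGED_KEYWORDS = ("biden", "sex trafficking")

    review_queue = []  # messages awaiting human review

    def flag_for_review(message: str) -> bool:
        """Queue the message for review if it contains a watched keyword."""
        text = message.lower()
        if any(keyword in text for keyword in FLAGGED_KEYWORDS):
            review_queue.append(message)
            return True
        return False

    flag_for_review("meet me downtown tonight")    # False: not queued
    flag_for_review("new sex trafficking report")  # True: queued

Of course, raw keyword matching is trivially evaded by misspellings, which is why a filter like this can only feed a review queue rather than act on its own.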
Oh please... (Score:3)
This affects Slashdot, too (Score:2)
Re: (Score:2)
Slashdot thinks an ellipsis is ASCII art. What morons.
My 2 cents ... (Score:2)
As much as I hate Facebook, I have to side with them in this case. This ruling will open the floodgates on nuisance and other frivolous legal actions against ALL web sites that post user comments that sometimes contain content that is offensive or otherwise "emotionally harmful" to readers, like pointing out that someone's favorite celeb is a rapist ("That's not true! He would NEVER drug and rape a 13 year old! The witnesses are all lying!")
And now that the "gloves are off" in that a web site is legally r
Double-standards (Score:2)
They ban female nipples because 'rules' but not pimps. Facebook knows it has double standards. Or maybe not: pimps spend money, naked breasts don't.
Re: (Score:2)
Re: (Score:1)
You would have them ban all those rappers claiming to be pimps? Are you a racist?
They can't do that and not also ban the racist pimps who claim to be patriots...
class action? (Score:1)