Snapchat Isn't Liable For Connecting 12-Year-Old To Convicted Sex Offenders (arstechnica.com)
An anonymous reader quotes a report from Ars Technica: A judge has dismissed (PDF) a complaint from a parent and guardian of a girl, now 15, who was sexually assaulted when she was 12 years old after Snapchat recommended that she connect with convicted sex offenders. According to the court filing, the abuse that the girl, C.O., experienced on Snapchat happened soon after she signed up for the app in 2019. Through its "Quick Add" feature, Snapchat "directed her" to connect with "a registered sex offender using the profile name JASONMORGAN5660." After a little more than a week on the app, C.O. was bombarded with inappropriate images and subjected to sextortion and threats before the adult user pressured her to meet up, then raped her. Cops arrested the adult user the next day, resulting in his incarceration, but his Snapchat account remained active for three years despite reports of harassment, the complaint alleged.
Two years later, at 14, C.O. connected with another convicted sex offender on Snapchat, a former police officer who offered to give C.O. a ride to school and then sexually assaulted her. The second offender is also currently incarcerated, the judge's opinion noted. The lawsuit painted a picture of Snapchat's ongoing neglect of minors it knows are being targeted by sexual predators. Prior to C.O.'s attacks, both adult users sent and requested sexually explicit photos, seemingly without the app detecting any child sexual abuse materials exchanged on the platform. C.O. had previously reported other adult accounts sending her photos of male genitals, but Snapchat allegedly "did nothing to block these individuals from sending her inappropriate photographs."
Among other complaints, C.O.'s lawsuit alleged that Snapchat's algorithm for its "Quick Add" feature was the problem. It allegedly recklessly works to detect when adult accounts are seeking to connect with young girls and, by design, sends more young girls their way -- continually directing sexual predators toward vulnerable targets. Snapchat is allegedly aware of these abuses and, therefore, should be held liable for harm caused to C.O., the lawsuit argued. Although C.O.'s case raised difficult questions, Judge Barbara Bellis ultimately agreed with Snapchat that Section 230 of the Communications Decency Act barred all claims and shielded Snap because "the allegations of this case fall squarely within the ambit of the immunity afforded to" platforms publishing third-party content. According to Bellis, C.O.'s family had "clearly alleged" that Snap had failed to design its recommendation systems to block young girls from receiving messages from sexual predators. Specifically, Section 230 immunity shields Snap from liability in this case because Bellis considered the messages exchanged to be third-party content. Snapchat designing its recommendation systems to deliver content is a protected activity, Bellis ruled. Despite a seemingly conflicting ruling in Los Angeles that found that "Section 230 didn't protect Snapchat from liability for allegedly connecting teens with drug dealers," Bellis didn't appear to consider it persuasive. She did, however, critique Section 230's broad application, suggesting courts are limited without legislative changes, despite the morally challenging nature of some cases.
Parents (Score:2)
Re: (Score:2)
Today, the correct word would be "parent" in too many cases, with that parent likely working two jobs.
Re: (Score:2)
The sins of the mother are visited upon the children.
Nice philosophy you got there. What other victim groups do you blame?
Re:Parents (Score:4, Insightful)
Yeah, the parents don't have time to raise their kids these days, that's why they give the car keys to the 12-year-old and send them grocery shopping with the credit card. Nothing could ever go wrong with that!
That's exactly how that excuse sounds. Lock down the phone/tablet/smart device so it's restricted only to age-appropriate stuff, or don't give your kid an internet-capable device until they're mature enough in the first place. Yes, in an ideal world all the pedophiles should be behind bars, but it's not Snapchat's fault that they're not.
Re: (Score:2)
Do you blame the raped child for having shitty parents?
Re: Parents (Score:3)
Re: (Score:2)
I did.
Did you read the thread? He's blaming the kid for her parents being stupid.
Re: Parents (Score:4, Informative)
Re: (Score:2)
I was going to give you mod points, but wanted to expand on your post instead. More likely, if in the US:
If they are rich, probably too busy with work and social activities to care about their kids. These people could watch their children but have more important things to do.
If middle class, probably working 60+ hours per week without overtime pay, and possibly a single parent. Not much time to watch their children. If they take time off, they lose their job and fall to a lower "class".
If poor, working many part t
Re: (Score:2)
You cannot monitor your child 24/7, but the child is literally exposed to electronic devices, unrelentingly, 24/7.
It is not like they teach penmanship any longer.
The parent is incapable of supervision without cooperation from the media, and the community.
They have neither ... The company was culpable, and the court was wrong.
Re: (Score:2, Insightful)
I was a latch key kid by age 12. I think a lot of people here were.
Funny how half the time it's "we grew up w/o helicopter parents and we turned out fine!" but whenever anything goes pear shaped it's "where were the parents?!".
And folks wonder why our birth rate is below sustainability...
Re:Parents (Score:4, Insightful)
When I was 12, I knew who the neighborhood perverts were, and to steer clear of them. But messaging apps didn't exist yet, and when they were invented, all the handed-down wisdom about how to spot a weirdo went right out the window. Not many people had yet grasped that the person behind the picture of a "12-year-old" making friends online wasn't really 12.
Now that's becoming common knowledge. But it still makes the online world a lot more difficult for kids to handle when the strangers have a much more powerful suite of tools at their disposal. Our neighborhood perv was a big fat slob that no amount of Photoshop could have fixed.
Re: (Score:3)
Re: (Score:2)
because they didn't fit your image of how one is supposed to look
More like the neighborhood was small. Everyone knew who the creeps were. Fat, skinny, whatever. They couldn't hide. Online, people have tools and avatars behind which to hide.
Re: (Score:2)
That's not even close to true. In many cases the local priest was a sex offender, and everybody trusted their kids with the local priest.
Re: (Score:3)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: Parents (Score:1)
See here, someone who has neither a job, money, nor children, lecturing us about jobs, money, and children.
If you have children in the US and you have a decent job, only one parent needs to work. My family does it, and plenty of people I know do it. The myth that you need a two-income household, fancy cars, and a place in the middle of the city for kids to have a decent upbringing is just that: a myth. I grew up in Europe on one income as well; you are a lot poorer doing that in Europe.
As far as children being on their
Re: (Score:2)
>Funny how half the time it's "we grew up w/o helicopter parents and we turned out fine!" but whenever anything goes pear shaped it's "where were the parents?!".
Hmm, yeah, because you HAVE to be either overbearing and controlling of every little thing, OR just toss the brat in the jungle and hope they come out as a semi-functional adult after the feral phase.
I mean, you couldn't ever, I don't know, keep an eye on your kid while letting them make their own minor mistakes and step in when they are going t
Re:Parents (Score:4, Informative)
C.O. eventually opened multiple Snapchat accounts, so that when her parents eventually realized she was using these products, she was able to hide her continued usage.
Re: (Score:2)
Exactly. This whole thing is a case of bad parenting, all the way around.
Re: (Score:3)
Parents have been stripped of the ability to raise their children. Digital devices are everywhere, and these public platforms will not permit parents to ban their children from them, even when the use is discovered and the parent complains to the platform. Parents have ZERO control of their child's digital interaction unless you want to emancipate the child at age 13 and kick them out of the house.
The court is simply wrong here. It is playing Russian roulette with the lives of children.
Re: (Score:2)
Perhaps, but what can they do, really? For one thing, they are generally so huge that there is no way to be aware of individuals. You can say they are too big, or that they should be able to anyway, but then they get yelled at for being over-cautious. There is no way to verify the other end is a real person, their age, etc. This is why the platforms have legal shields. They end up in court constantly for either not filtering enough or filtering too much. Even AI isn't that good.
Let's say, we want to prevent all users under 1
Re: (Score:1)
Due to privacy, though, NOTHING from this censorship should be repo
Re: (Score:2)
It is nonsense that they can't figure out how to keep kids off the media. Wikipedia does it with sock puppets, and Facebook can do it also. The fact is, they already know. They just refuse to boot the kids off the platforms.
She fell for this twice? (Score:4, Interesting)
There seems to be a lot of information missing here.
Re: (Score:2)
Most news stories are half wrong and half incomplete.
This one doesn't convince me otherwise.
Re: (Score:2)
Re:She fell for this twice? (Score:4, Interesting)
Re: (Score:2)
There seems to be a lot of information missing here.
Not really; all the information is there: 12 years old and 14 years old. If you're expecting well-reasoned adult thinking, especially on topics as complex as how to spot predatory human behaviour, then you're quite clueless as to how kids work.
Re: (Score:2)
What about, say, after getting raped because of it once, stay away from that social network completely? Maybe enforced by a parent if needed?
As usual, you completely miss the questions that would need to be asked and hence post garbage.
Re: (Score:2)
How does their algo work? (Score:2)
Was the criminal able to use some variation of SEO to trick the algo into getting himself on the recommended list of 12-year-old girls?
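For illustration only -- a purely hypothetical sketch in Python, not Snapchat's actual algorithm or anything from the court filing -- here is how a naive "people you may know" recommender that ranks strangers by mutual-connection counts could be gamed without any real SEO trickery: an account that mass-adds a target's classmates ends up looking like the most "relevant" suggestion. All names, the toy graph, and the scoring rule are invented for the example.

from collections import Counter

def quick_add_candidates(user, graph, limit=5):
    # Hypothetical scoring: rank non-friends by how many of `user`'s friends
    # already know them. Real recommenders weigh far more signals than this.
    scores = Counter()
    for friend in graph[user]:
        for candidate in graph[friend]:
            if candidate != user and candidate not in graph[user]:
                scores[candidate] += 1
    return [account for account, _ in scores.most_common(limit)]

# Toy graph: "mass_adder" has friended three of the target's classmates,
# so it outranks a genuine acquaintance despite being a total stranger.
graph = {
    "target":     {"classmate1", "classmate2", "classmate3"},
    "classmate1": {"target", "mass_adder"},
    "classmate2": {"target", "mass_adder"},
    "classmate3": {"target", "mass_adder", "cousin"},
    "mass_adder": {"classmate1", "classmate2", "classmate3"},
    "cousin":     {"classmate3"},
}

print(quick_add_candidates("target", graph))  # ['mass_adder', 'cousin']

Under that toy model, indiscriminately sending friend requests is enough to climb the suggestion list; no gaming of keywords or profiles is needed.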
Re: (Score:1)
It makes sense. (Score:2)
Was the phone company liable in the olden days when pervs used the phone to lure teenage girls?
Re: (Score:2)
If the operator recommended the pervs repeatedly, and ignored complaints, I'd say yes.
Minimum age (Score:2)
Looks like 13 is the minimum age to be on the platform, so someone gave her an account without the necessary guidance:
https://www.snap.com/en-US/ter... [snap.com]
Steve Burton's gonna beat that first one. (Score:2)
Steve Burton should show up at the first offender's cell and "accidentally" go Jason Morgan on him.
Yes, I am inundated with out of date pop-culture references. Thank you very much.
Maybe we need a culture where parenting is given some form of priority again. This idea that to raise a family both parents must have at least one full-time job, probably two, and kids need to be "babysat" by electronic devices because the parents can't afford reliable daycare 24/7 as they work themselves to the bone isn't re
This has existed, this is why... (Score:1)
Platforms back then weren't liable and they're not liable now, which is fair if you ask me. I add everyone on Quick Add, and a lot of kids get caught up in that. I make sure to ask everyone's age before I speak with them. Being a kid doesn't necessarily mean you are disqualified from knowing me, but it does censor what I say.