Google Scans Gmail and Drive For Cartoons of Child Sexual Abuse (forbes.com) 146
A Forbes associate editor covering privacy, cybercrime, and security/surveillance reports on a recently-revealed search warrant.
Instead of investigating a photo, it asked Google to provide information on a suspect who allegedly owned graphic illegal cartoons involving children: That kind of content is potentially illegal to own under U.S. law and can be detected by Google's anti-child sexual abuse material (CSAM) systems, a fact not previously discussed in the public domain, the warrant reveals.... Google also uses machine learning tools to look at files and analyze them for any sign they're of abused children....
As per its legal requirements, Google handed information on what it found, as well as the IP addresses used to access the images, to the National Center for Missing and Exploited Children (NCMEC), which then passed on the findings to the DHS Homeland Security Investigations unit. Investigators used the IP addresses provided by Google to identify the suspect as the alleged owner of the cartoons, and searched his Google account, receiving back information on emails to and from the defendant. It appears the suspect may actually be a known artist. As no charges have been filed, Forbes isn't publishing his name, but the man identified in the warrant had won several small Midwest art competitions, and one artwork from the 1990s had been mentioned in a major West Coast newspaper...
Google, meanwhile, has in recent years released transparency reports showing how many times it reports issues to NCMEC. The figures reveal a disturbing trend. In the first six months of 2021, it found more than 3.4 million pieces of potentially illegal content in 410,000 separate reports. That was up from 2.9 million in 365,000 reports in the last six months of 2020, and well over double the figure from January to June 2020, when 1.5 million pieces of CSAM were discovered and reported to NCMEC in 180,000 reports...
As Google doesn't end-to-end encrypt its communications tools like Gmail or its file storage tech like Drive, it's still possible for the tech company to scan for illegal content. And as it has no plans to introduce those features, law enforcement can still rely on Google to warn NCMEC when abuse happens on its servers. Whether the majority of users want Google to scan people's accounts so it can help find child abusers, or would rather have improved privacy with end-to-end encryption instead, is a balance the Mountain View, California-based business will have to struggle with in perpetuity. The same goes for any one of its rivals.
Child pr0n today... (Score:4, Insightful)
Re:Child pr0n today... (Score:5, Funny)
An example of cartoon child porn (Score:2)
Clearly illegal, anyone looking at this should go to jail.
Re:An example of cartoon child porn (Score:5, Funny)
Nah. Too well developed to be child porn. More like
|. .|
Re: (Score:2)
Not to mention Japanese tentacle porn; some of the abused squids are only 12 years old.
Re: (Score:2)
In tentacle porn, the tentacled creatures are the abusers, you insensitive clod.
Re: (Score:2)
"In tentacle porn, the tentacled creatures are the abusers, you insensitive clod."
All the little girls claim that.
thirty years ago (Score:5, Insightful)
I had a cartoon of Bart Simpson getting a BJ from the purple headed twins, and other similar cartoons, maybe a dozen all told.
Illegal today, because it depicts someone meant to be a minor.
Am I going to have to worry it may be in one of my archives somewhere? jeez louise.
Re:thirty years ago (Score:4, Insightful)
Yes... that's basically where this is going. I mean, bestiality is banned in most countries too. Does this mean watching some Beauty and the Beast animated porn should be illegal? I mean, that's not my cup of tea, but this really opens a can of worms.
This guy won art competitions, so I would even wager whatever he made was likely "tasteful" in the sense that there is a lot worse shit out there.
This doesn't even get into "faces of death" kind of stuff, which personally I'd say is a far worse taboo... I mean, murder is illegal everywhere; should having a video of a murder be illegal? Because then CCTV footage puts you at risk. Shit, a CCTV probably captures actual child porn, and if it's backed up, that's another "tick" against you.
So personally fuck this shit.
Re:thirty years ago (Score:4, Interesting)
There are US based artists making sexual images of cartoon children right now. They have Patreon pages where they accept money.
Something doesn't seem right here, either the law is not being accurately described or is often not enforced.
Re: (Score:2)
Patreon takes them down, just not in an automated fashion. If you were to publish the name of the account here, it wouldn't last long. My guess is that it's gone in less than a year anyway.
My thing is furries, so I'm pretty sure I can calculate in dog years to determine if they are legal or not. Anything above 3 and I should be fine.
Re:thirty years ago (Score:5, Interesting)
Something doesn't seem right here, either the law is not being accurately described or is often not enforced.
It seems that the justification given for the search is "potentially illegal," not "illegal."
It is only explicitly illegal if the material depicts exploitation of an actual child. That could not exist in a cartoon involving entirely fictional characters that does not involve an actual person, but it could if one of the characters shown in the cartoon represents an actual person.
They also wrote that no charges have been filed, but Google apparently is asked to find the images all the same. It sounds like scope creep.
What started as a system and method for finding images associated with actual abuse is being expanded to do more and more. Next thing you know it will become a more general search that reports people with uncertain or questionable materials (rather than definitively illicit items) to law enforcement for further study; this can easily expand to other categories of info too, such as potentially hateful imagery, potentially subversive or seditious documents, misinformation, propaganda, etc.
Re: (Score:2)
Given the political activism of the parent company, they're already scanning for political right/wrongthink and keeping a profile on each person they have data on. The TOS allows it; they're in the business of selling intrusiveness, and political profiles pay (you can predict what messaging will win elections, for example).
Re: thirty years ago (Score:2)
Tell GPT-5 to predict the response, given your message history, if someone asks for child porn. If GPT-5 predicts you can be convinced to provide it, you're guilty.
Re: (Score:2)
If GPT-5 predicts you can be convinced to provide it, you're guilty
First of all, they will change CP to include any legally "obscene" materials, which is a broader, more legally vague, completely separate category of items that various states have laws against with varying degrees of constitutionality. There have been convictions of people on obscenity charges over possessing some cartoons, but not solely for obscenity, if that makes sense.
More like pre-crime for the purpose of search justification.
Re: (Score:2)
Do you have an actual link to the law? This topic comes up here relatively often and everyone seems to have a different take
There is no simple single law anyone can reference, because each state has its own specific laws regarding CSAM, and they are constrained by the US Constitution. Just because they wrote something in their statute does not mean all the state laws are permissible as written; some of the federal law tried to prohibit virtual or "simulated" porn but was found to be unconstitutional.
Re:thirty years ago (Score:4, Interesting)
You're understating things here. This is about cartoons, so the question is: should a cartoon depicting murder be illegal? I think we can agree that would be a high level of absurdity.
I don't know why anyone would want to watch "faces of death"; it's grotesque, but I don't want to impinge upon other people's freedom.
What next, banning schoolgirl fantasy dress outfits? Now, why would women want to dress as schoolgirls anyway? Is it because men actually fantasise about having sex with schoolgirls? Of course it is. So should the St Trinian's films all be banned? There is some obvious sexualization in those.
Shitty laws like this have a name - thought crimes.
Re: thirty years ago (Score:2)
Agreed.
Re: (Score:2)
I had a cartoon of Bart Simpson getting a BJ from the purple headed twins
Bart has been around since 1987, and he was ten years old then. So he is 44.
Re: (Score:2)
Strange thing is, that kind of porn is easily discoverable with Google searches, and the sites hosting it don't seem to have legal issues.
Is there any information on the criteria that an image must meet to get into the naughty list?
Re: (Score:2)
People in the UK aged 16 can have sex but if they draw a cartoon of it afterwards then they can be thrown in prison for 3 years. Makes perfect sense doesn't it ~sarc.
Re: (Score:2)
Australian legal precedent was set in 2008 with regard to that exact cartoon (quite a few people have been charged with child pornography for having it since): https://www.abc.net.au/news/20... [abc.net.au]
Let that insanity sink in for a moment, this is a judge, explaining what is a 'person' under Australian law:
"In my view, the magistrate was correct in determining that, in respect of both the Commonwealth and the NSW offences, the word 'person' included fictional or imaginary characters ... the mere fact that the figure
I guess Zone-tan better watch out... (Score:2)
Re: thirty years ago (Score:2, Insightful)
-The First One Up Against the Wall
Re: (Score:2)
Complicated (Score:2)
Re: (Score:2)
Uh, is that sarcasm? Can't tell if you really want to ban Snow White but it's a good comparison for the issue.
Re:Complicated (Score:4, Insightful)
Re: (Score:2)
Interesting points. Reminded me a bit of Akira, which seems pretty clearly to have a scene of underage rape, though since we aren't told the kids' ages, it can be left up to the assumption that they are 18+ (though maturity-wise we might guess otherwise). Such a fairy tale does seem to have some of these "life lessons" potentially within reach of the audience.
So how does this work? (Score:4, Insightful)
I clicked on a few links looking for just how you write software to detect this type of material and came up pretty empty. They aren't talking about how it works.
So just how does such a system discriminate between "guilty" and "innocent?" I would think that is far more difficult than Level 5 car automation.
Given the numbers they do report -- in the millions -- I suspect their "AI" just deals out huge numbers of false positives for an organic unit to sort out. So they're just flooding a government agency with crap data which will keep them from getting real work on child abuse done. Or did I miss something on this?
don't take an deal and demand source code in court (Score:2)
Don't take a deal; demand the source code in court. The state needs to prove in court that it is CP, and if it's just a cartoon that may be very hard. And if it's, say, 8-bit / 16-bit art, then you can point to video games that have stuff that looks like what you have.
And don't forget the Best Buy case, where a fruit-of-the-poisonous-tree issue killed the case.
Re: (Score:2)
Re: don't take an deal and demand source code in c (Score:2)
Apple completely dropped their CSAM detection plans.
https://slashdot.org/story/393... [slashdot.org]
Re: don't take an deal and demand source code in (Score:3)
Re: (Score:2)
EULA does not take away your constitutional rights in court.
And they need to have a clean chain of custody. Also, you have the right to face your accuser, and if your accuser is an algorithm then I want the source code and the names of the people who looked over the report OR YOU MUST ACQUIT!
Re: (Score:2)
Re: (Score:2)
But the K9 officer is an LEO and Google is not an LEO. Also, people have gotten off / gotten really good deals because they did not have access to the DUI Intoxilyzer source code.
So maybe some can take a really good deal that drops any CP charge down to something lesser.
Re: (Score:2)
I.E. identify a suspect via illegal means; create a fake 'legal' method; use the fake narrative to support the rest of the case - this is how intel agency domestic spying is used.
Re: (Score:2)
There are organizations that provide a list of known CSAM images - either government, or in partnership with government. I don't know about the US, but the UK has the IWF. Since these are perceptual hashes though, false positives are inevitable - and common.
I suspect the machine learning part is something of a brag. Hard to say though, as the whole field is really secretive. If they actually said how they find the material, it would be hinting at what countermeasures might hide it. Also some of the organisa
Re: (Score:2)
Create a database of 'offending' images and compare all files to it. This resulted in the photo of a Vietnamese girl burned by napalm being flagged as sexual abuse, and copies of the war-crime image being censored.
Use AI to identify female areolas and male glans (purple knobs). Anything that matches the pattern is banned.
Or did I miss something on this?
Everyone can claim "I'm really concerned and doing something".
Re: (Score:2)
They have databases of image hashes.
Most of the reports are of websites. My guess would be that they first check if they are US based, otherwise no jurisdiction and possibly not illegal in the country in which they are based.
Google probably helped with the tech. They have an image hashing system that can recognise things like resized images, and copies that have been through the JPEG process a few times. Hopefully a human checks everything before any allegations are made.
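To make the "resized images" point concrete, here is a minimal perceptual-hash sketch (a difference hash, or dHash) in Python, assuming the Pillow library; the file names, the distance threshold, and the flag_for_human_review call are placeholders for illustration, and whatever Google and NCMEC actually use is not public and is far more robust than this:

# Minimal dHash sketch: shrink to grayscale, compare adjacent pixels,
# and pack the comparisons into a 64-bit fingerprint. Small edits,
# rescaling, and recompression mostly leave the hash unchanged.
from PIL import Image

def dhash(path: str, hash_size: int = 8) -> int:
    img = Image.open(path).convert("L").resize((hash_size + 1, hash_size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(hash_size):
        for col in range(hash_size):
            left = pixels[row * (hash_size + 1) + col]
            right = pixels[row * (hash_size + 1) + col + 1]
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a: int, b: int) -> int:
    # Number of differing bits; a small distance means "probably the same image".
    return bin(a ^ b).count("1")

# Hypothetical usage against a database of known hashes:
# if hamming(dhash("upload.jpg"), known_hash) <= 10:
#     flag_for_human_review()

This also shows why false positives are inevitable: visually unrelated images can land within a few bits of each other, which is exactly why a human is supposed to check matches before anything goes to NCMEC.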
Re: (Score:2)
As no charges have been filed, Forbes isn't publishing his name, but the man identified in the warrant had won several small Midwest art competitions, and one artwork from the 1990s had been mentioned in a major West Coast newspaper...
Fortunately we're not bound by such considerations and can name the child pornographer as one Raffaello Sanzio da Urbino. Here's a sample of his work [aaronartprints.org]. Warning: Possessing or viewing that may be illegal in some US states.
I guess it also means (Score:3)
I guess it means anime fanservice and hentai must also be illegal in the US.
It's time for pedo cartoons to stop! (Score:5, Funny)
You think a few drawings and animes hurt no-one. What about the toons who were kidnapped, trafficked and forced to shoot sick perverted cartoons? Scarred for life, that's what they are. Some of them ended up so mentally destabilized they even agreed to work for Disney...
Re: (Score:2)
Nothing a nice rubber eraser can't fix!
don't upload non-public data to the cloud (Score:5, Insightful)
Anyone who uploads non-public data to the cloud without first encrypting it is a fool.
I use Google Drive for two purposes: to publish stuff I want everyone to read, and as off-site backup for my most valuable data. Before uploading my backups I encrypt them. The key is not known to Google, so Google can't snoop on them. When I download the backup data I decrypt it myself.
Think of the cloud as "somebody else's computer". Any data you wouldn't give to a stranger is data you shouldn't be uploading to the cloud.
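For anyone who wants to do the same, here is a minimal sketch of the encrypt-before-upload idea using the Python cryptography package (Fernet); the file names and where the key lives are placeholders, and dedicated tools like restic or rclone's crypt backend handle this more robustly:

# Encrypt locally before upload so the cloud provider only ever sees
# ciphertext and cannot scan the contents. Keep the key file off the cloud.
from cryptography.fernet import Fernet

def make_key(path: str = "backup.key") -> bytes:
    key = Fernet.generate_key()
    with open(path, "wb") as fh:
        fh.write(key)
    return key

def encrypt_file(src: str, dst: str, key: bytes) -> None:
    with open(src, "rb") as fh:
        ciphertext = Fernet(key).encrypt(fh.read())
    with open(dst, "wb") as fh:
        fh.write(ciphertext)

def decrypt_file(src: str, dst: str, key: bytes) -> None:
    with open(src, "rb") as fh:
        plaintext = Fernet(key).decrypt(fh.read())
    with open(dst, "wb") as fh:
        fh.write(plaintext)

# key = make_key()
# encrypt_file("photos.tar", "photos.tar.enc", key)  # upload only the .enc file
# decrypt_file("photos.tar.enc", "photos.tar", key)  # after downloading it back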
Re:don't upload non-public data to the cloud (Score:5, Insightful)
Re: (Score:3)
Re: (Score:2)
There are institutions, both government and private, that are serious about computer security. How do they make sure their secrets are not accidentally exfiltrated? Perhaps they have a set of "best practices" that could be useful to those of us with small budgets.
Re: (Score:2)
Re: (Score:2)
Keeping important data on physically isolated networks requires discipline. You have to make sure nobody bridges the gap for his convenience, either by order from the top ("Give me full access to everything or you're fired") or by planting a wi-fi hot spot somewhere. I can see how the military, or a company run by veterans, could have such discipline, but I expect it is rare among the private sector.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
G-suite services like Photos are already scanning local files on Android devices for advertising purposes. Scanning for "bad pictures" is the next step. (This was the same path they took in regards to scanning your emails.) After seeing the backlash Apple's program generated, they are not likely to publicize the fact that they are doing this.
Besides that, the trajectory of how consumer tech works is increasingly to upload your data to "the cloud" by default. Even if you carefully opt-out from that at every
Re: (Score:2)
Rolling your own smartphone doesn't seem reasonable unless you have a very large IT budget. Those of us without such resources can avoid using smartphones, or not put any secrets on them. As far as Windows is concerned, using a distribution of GNU/Linux doesn't seem unreasonable. I am not into computer games, and I don't know of anything else I had to give up when I started using Fedora. My last application switch was from Quicken to GnuCash.
I'm ok witht that. (Score:2)
Before you get upset, you should remember that Gmail/Google Drive are free services to just about everyone. If you're paying for it then sure, be upset, but you bought in knowing the ToS and the recent behavior of Alphabet/Google. Everyone who doesn't like this is free to go to other services... and there is no shortage of them.
I do value my privacy so I use ProtonMail. It's not perfect but it's good. For storage I built a small NAS.
Re: (Score:2)
For storage I built a small NAS.
Was it a Lil Nas X?
Okay, I'll show myself out...
Re: (Score:2)
Users have to be technically vigilant to have a chance of stopping it, if it's possible at all.
Re: (Score:2)
If you're using Android or iOS then you've already failed at privacy. Also, it's 100% possible to live life without a smart phone.
Re: (Score:2)
No, the obvious lesson is that your files are not private in "the cloud". How one responds to that will vary.
child porn versus first amendment (Score:5, Insightful)
How can you prosecute someone for possessing a picture? Isn't a picture "expression", and isn't expression protected by the First Amendment?
The workaround to the First Amendment used by the child porn law is that a picture of child porn is considered evidence of a child having been molested. Law enforcement has a database of known child porn pictures, and if you are caught with one of them you can be prosecuted.
However, what if I create a picture, purely from my imagination, that depicts a child being molested? That isn't evidence of anything, so if I am prosecuted for possessing it, can I use the fact that I created the picture from my imagination in my defense? Would it help if I demonstrated to the court my ability to create a picture of a child being molested from my imagination?
A cartoon is even more problematic than a photo-realistic picture. It is probably easy for a skilled graphic artist to demonstrate to a court that he can draw a picture of a child being molested, and therefore such a drawing is not, by itself, evidence of anything.
Re:child porn versus first amendment (Score:5, Interesting)
If some sick fuck has a cartoon drawing of depraved shit, there's no victim, so no common-law crime.
Does it encourage him or give him an outlet that keeps him off the street?
People used to have this discussion about violent video games but it's mostly been settled that they reduce overall crime.
Re: (Score:2)
I don't think people are arguing about depraved shit, it's the vagaries that are the problem.
If I draw two stick figures having sex, at what point does some algorithm (would this be a TipperGorithm?) find it pedophilic and thus obscene and destroy my public reputation etc?
When one stick person is only 2/3 the height of the other? Half? Is it possible to draw people having sex with, say, a little person at all? Or a fantasy pic of a slutty hobbit chick?
Re: (Score:2, Informative)
Does it encourage him or give him an outlet that keeps him off the street?
The theory of an "outlet" is completely disproven. Acting out fantasies encourages the fantasies. There is no ambiguity about that part.
You are a fucking repulsive liar. For example, there are _very_ solid numbers that availability of porn significantly reduces rape.
Re: (Score:2)
I've now heard both sides proclaim the scientific consensus on both sides of the argument.
Will anyone care to back up their claims with a link to anything?
Re: (Score:2)
Here is something:
https://www.psychologytoday.co... [psychologytoday.com]
References to peer-reviewed scientific publications at the end of the article. There is really no way this effect is not there at this time.
There is no research on the impact of child porn availability (with or without victims), because doing such research has intentionally been made hard and getting it funded is basically impossible. At least the moral question for drawings, computer-generated pictures and texts not referring to any real people is that the
Re: (Score:2)
It's not an easy study to do. Very likely the consumers of child-like porn are more likely to commit offenses against children than the average population, because you have pre-selected people who are attracted to children. I don't see a legal and ethical way to do this study - but maybe someone has figured it out.
Must be a nice job (Score:2)
Experts on the age of cartoon characters must be paid a lot, then.
But why do we care about drawings? (Score:3)
There was an argument that porn that looked like child porn, even if the actors were all over 18, might be used to camouflage real child porn. I don't agree with that argument, but it is not irrational.
But what is the argument that a drawing of a child is hurting anyone? And if a drawing is illegal, why is that so different from a story? If someone draws a picture of a naked Romeo and Juliet, is that somehow far more damaging to society than the story itself?
I'm not in favor of scanning of online files for any reason - the risks of false positives, of releasing private information etc, are far too high, even if the scans are looking for real and serious harm. But this?
The argument not to store anything unencrypted in the cloud is good - but it's easy to get it wrong. Various cloud services will try to entice you to put files in the cloud, and a non-expert could get confused. At one point I discovered Microsoft was storing my desktop in the cloud - without my knowingly having given permission. (I must have missed an OK box somewhere.) So good practice, but easy to get wrong.
Re:But why do we care about drawings? (Score:5, Insightful)
Simple: This is not about protecting children at all. Once you accept that this is only and exclusively about inflicting some arbitrary moral ideas on everybody with as much violence as needed, you can see why drawings are treated the same as actual images and why they seem to care much more about images than about actually stopping child abuse. That is simply not their goal. That is just the lie used to facilitate their actual goals.
Re:But why do we care about drawings? (Score:5, Insightful)
Their goal is to have a hammer they can use against anyone at any time. Even if the target is not prosecuted, once such information becomes public, they're likely to lose their job, membership in their community, friends and possibly family. I guarantee you, the CIA has enough zero-days to get a "bad" image onto your computer no matter how secure you think you are.
Re: (Score:2)
Re: (Score:2)
Their goal is to have a hammer they can use against anyone at any time.
Umm... Normal people don't have child pornography on their computer, cartoon or otherwise.
I guarantee you, the CIA has enough zero-days to get a "bad" image onto your computer no matter how secure you think you are.
Is that what you're going to tell the judge?
Re: (Score:3)
Amusing how you approach your parent poster with derision while immediately making his point.
Re: (Score:2)
It'd be oh so much easier to seize consumer electronics and add something incriminating prior to the 'official' forensics. No zero day is needed, just local folks in the chain of custody. And it's so much safer! If there is a slip-up and evidence tamperin
Re: (Score:2)
This is about drawings and other content that was generated without harming children. There is a real possibility that keeping this stuff illegal increases child abuse. There is also a ton of evidence now that this whole thing is being used as a pretext to remove freedoms and increase surveillance.
You, on the other hand, are trying to shut down this discussion. Why?
Re: (Score:2)
You, on the other hand, are trying to shut down this discussion. Why?
I'm calling out perverts. Apparently, Slashdot is infested with them. You must feel right at home.
No, you are not. You are being a useful idiot.
Re: (Score:2)
No, you are not.
Wow, what a compelling argument.
I'm still waiting for your alleged evidence that keeping child porn illegal increases abuse and the "ton of evidence" you claim to have for your weird conspiracy nonsense.
As it stands, I can only assume that this alleged evidence is just a comforting lie that you tell yourself to deal with the shame of your sick perversion.
Why don't I hear this more? (Score:2)
why does google do this? (Score:2)
Stick figure child porn (Score:2)
So if I am a lousy artist, and make drawings in a style similar to the xkcd characters (or worse), label them as children, and have my stick figures depicting "sort of" stick-figure porn, is that child porn?
After all, you have all sorts of art, from drawings that don't look like anything to very realistic paintings.
Is all it takes to make it child porn for the artist (or someone else) to say it depicts child abuse / child porn?
ridiculous. (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Sounds like you're going to prison then, pedo!
Re: (Score:3)
I will shoot xxxx and blow up xxx.
Even if you listed a specific person and place, it is still not illegal.
Only a "credible" threat is illegal. There must be both an intent and an ability to carry it out. So you aren't breaking the law because nobody believes you are serious.
The words aren't illegal. Only the action and intent that they represent are illegal.
Re: (Score:2)
Re: (Score:2)
They didn't just randomly go after the parents. The "don't do it" text implies that they knew what he had planned and knew he had a gun in his backpack. That puts their refusal to allow a search in a very different context. We don't have all the facts, but the facts we do have look damning.
Re: (Score:2)
The freedom of speech falls under the First Amendment, and has nothing to do with the Second in this context. This is a speech issue, which falls under the right to free speech, the right to bear arms doesn't even fit in the case you are trying to make.
Re:Did you know...a... (Score:4, Interesting)
I recall some news about a school calling the cops because the kids were singing the Fresh Prince theme, and someone misheard "shooting some b-ball outside of the school" as "shooting some people outside of the school".
Re:Did you know...a... (Score:5, Interesting)
I know someone from Eastern Europe whose kid got in a lot of trouble with the thought police because of a ball game, where he cheered his home team with "go, heroes" in his language, and that sounded like "you N@zi" to some language-challenged woke dumbfuck. You know, the brother of the dumbfuck who had this filter that makes me put @ in a perfectly cromulent word installed here on slashdot.
and japanese anime cases will hold in the USA for (Score:2)
And will Japanese anime cases hold up in the USA, for content that is not really marketed for the USA and is sold legally in Japan?
Re: (Score:2)
Well, that is a bold rant.
Re: (Score:3)
As far as I can determine, it's posted by a bot. Random complaints from a pool seem to be concatenated, as the posts differ very slightly. Between this and the other mass-posted hatefests, it's clear Slashdot is being seriously abused. Having said that, the moderation system usually cleans those out of most people's feeds quickly, proof that the system is mostly functioning as intended.
Re: (Score:2)
Re: (Score:2)
Sounds like you're going to prison soon!
Re: (Score:3)
Also, let's ban pediatrics while we are at it. These are obviously all perverts who want to see children naked and even touch them!
Re: Those freaks should be imprisoned... (Score:3)
I am. What's the problem? Except for a few characters who are children, all ponies in the show are adults. Not even teens, adults. The protagonist is a post-doctorate government advisor. Her friends, forming the main cast, are: a veterinarian with her own clinic and several years of practice; a professional cook and party organizer; a fashionista who owns a network of stores; a farm owner; and a fire department officer and semi-professional athlete. The youngest of them is 24 years old by the point the show
Re: (Score:3)
This is the troll++ language, it has the assignment and equality operators done the right way.