Cloud-Powered Facial Recognition Is Terrifying
oker sends this quote from The Atlantic:
"With Carnegie Mellon's cloud-centric new mobile app, the process of matching a casual snapshot with a person's online identity takes less than a minute. Tools like PittPatt and other cloud-based facial recognition services rely on finding publicly available pictures of you online, whether it's a profile image for social networks like Facebook and Google Plus or from something more official from a company website or a college athletic portrait. In their most recent round of facial recognition studies, researchers at Carnegie Mellon were able to not only match unidentified profile photos from a dating website (where the vast majority of users operate pseudonymously) with positively identified Facebook photos, but also match pedestrians on a North American college campus with their online identities. ... '[C]onceptually, the goal of Experiment 3 was to show that it is possible to start from an anonymous face in the street, and end up with very sensitive information about that person, in a process of data "accretion." In the context of our experiment, it is this blending of online and offline data — made possible by the convergence of face recognition, social networks, data mining, and cloud computing — that we refer to as augmented reality.'
Google decided against this. (Score:5, Interesting)
Where Are the Recall Rates? (Score:5, Insightful)
This is why Google shelved their version of this tech. The implications were too big.
Having studied this in college and witnessed many failed implementations of it [slashdot.org] I casually ask: Where are the recall rates [wikipedia.org] (see also sensitivity and specificity [wikipedia.org]) of these experiments?
Because when I read the articles, I found this instead of hard numbers:
Q. Are these results scalable?
The capabilities of automated face recognition *today* are still limited - but keep improving. Although our studies were completed in the "wild" (that is, with real social networks profiles data, and webcam shots taken in public, and so forth), they are nevertheless the output of a controlled (set of) experiment(s). The results of a controlled experiment do not necessarily translate to reality with the same level of accuracy. However, considering the technological trends in cloud computing, face recognition accuracy, and online self-disclosures, it is hard not to conclude that what today we presented as a proof-of-concept in our study, tomorrow may become as common as everyday's text-based search engine queries.
Why Google decided against continuing down this road is for you to judge. Frankly, I would surmise that the type I and type II errors [wikipedia.org] become woefully problematic when applied to an entire population. Facial recognition is not there yet, and I won't believe otherwise until I see hard numbers showing the error rate is low enough. Right now I bet that if you were to snap pictures of 10,000 people, you would incorrectly classify at least 100 of them, leading to wasted time, violated rights, and missed opportunities (depending on the misclassification).
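The arithmetic behind that bet can be sketched in a few lines of Python. This is a back-of-envelope illustration: the 10,000 snapshots and 1% error rate come from the comment above, while the 1-in-10,000 prevalence of actual targets is an invented assumption for the sake of the example.

```python
# Back-of-envelope: why a low error rate still floods a screening
# system with false positives when the condition being screened for
# is rare (the base-rate problem behind the recall-rate question).
def screening_outcomes(population, prevalence, sensitivity, specificity):
    """Return (true_positives, false_positives, precision)."""
    actual_positives = population * prevalence
    actual_negatives = population - actual_positives
    tp = actual_positives * sensitivity            # real targets flagged
    fp = actual_negatives * (1 - specificity)      # innocents flagged
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    return tp, fp, precision

# The parent comment's scenario: 10,000 snapshots, a 1% error rate
# (modeled here as 99% sensitivity and 99% specificity), and an
# assumed 1-in-10,000 prevalence of genuine targets.
tp, fp, precision = screening_outcomes(10_000, 1e-4, 0.99, 0.99)
print(f"true positives: {tp:.2f}, false positives: {fp:.0f}")
print(f"precision: {precision:.1%}")
```

Even with 99% accuracy on both classes, roughly 99 out of every 100 "hits" are false alarms, which is exactly what the missing recall and precision numbers would reveal.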
Re: (Score:2)
Re: (Score:2, Interesting)
Depends what the inconvenience is. If it's a quick background check with no lasting effects (i.e. not being added to a do-not-fly list or terrorist watch list or your record, or subjecting you to public humiliation or arrest), then perhaps... If it's a 5 year vacation in Guantanamo without access to legal counsel, then no way--that would be a horrible perversion of justice!
Consider this question: Do only famous people have look-a-likes? Why would that be, especially since famous people often look non-aver
Just shows to go you.... (Score:2)
the importance of NOT being seen [youtube.com]
Re: (Score:3)
Depends what the inconvenience is. If it's a quick background check with no lasting effects (i.e. not being added to a do-not-fly list or terrorist watch list or your record, or subjecting you to public humiliation or arrest), then perhaps...if we start applying this tech to the population at large, we had better be certain that the consequences of a false match WHEN IT HAPPENS are acceptable, legally, ethically, and morally, or we shouldn't do it at all, IMHO.
Not to get on my political soapbox, but have you been living under a rock for the last ten years -- or at least the last one year? You don't think being felt up by TSA at the airport is "subjecting you to public humiliation"? How about this [wordpress.com] woman [washingtonpost.com] who was removed from a Frontier Airlines flight, cuffed, detained, strip-searched, interrogated and finally released? Her crime was nothing more nefarious than sitting next to two men of Indian (the country, not Native American) descent, one of whom wa
Re: (Score:2)
The problem is scale. 1% of people in the USA is 3 million false positives. With 100,000+ people flying every day, that is 1,000 false positives a day, spread across 5,000 possible airports.
That would be 30,000 jobs just to track down those false positives.
Re: (Score:2)
An odd definition of "wreck" (Score:3)
If you wreck even three hundred lives because your technology isn't accurate enough, that's three hundred too many.
That statement is correct, yet you have slanted it the wrong way.
You seem to think the worse error is in false positives. But all that happens is that the person would be selected for extra screening. How is that "wrecking" someone's life?
Compare that to not trying anything and letting someone take down a plane with a few hundred people. Would you not admit that people who die on a plane are
Re:An odd definition of "wreck" (Score:5, Insightful)
What we are talking about is risk management. Risk management is not just a matter of comparing scenarios; it is a matter of multiplying risk probabilities to risk weight (i.e., the severity of that risk), then summing all of the results of that operation. For example, a hijacker crashing an airplane into a building is a very severe risk -- it killed over three thousand people ten years ago -- but it has only happened *ONCE* (okay, four flights) in what...fifty? sixty?...years of airline service. That's a really, REALLY low probability times a really, really severe risk weight, which I'd argue results in a moderately low OVERALL risk. There is also the possibility of a hijacker murdering individual passengers until his (her) demands are met. That's happened significantly more often than a 9/11 hijacking (although still rare, in terms of number of hijacked flights vs. number of uneventful flights), but it directly affects (comparatively) fewer people. However, because it is more common, I'd argue that this scenario results in roughly the same OVERALL risk. Then there is the risk of an unruly passenger. That's much more common than the other two risks, but the risk weight is comparatively minor, which again results in an overall low risk.
As far as scenarios you are comparing...if all that happens is a false positive gets the luggage swabbed, then I really couldn't care less. If a false positive gets removed from an airplane, cuffed, locked into a cell, strip-searched and interrogated before finally being determined to be a false positive and released [washingtonpost.com] then I have a MAJOR problem with it. Consider it this way: if there were 520 people detained in Gitmo [npr.org] and the error rate for false positives (as assumed in the above thread) is 1%, then that means there were likely at least 5 innocent people detained at Gitmo. THAT is what I meant by "wrecked", and I maintain that's an accurate description. Ms. Hebshi's life may not have been wrecked, but I'd say that it has been severely and negatively impacted.
So, yeah. I do think that the worse error is false positives because the risk probability is significantly higher, and the risk impact is moderate to severe as well, which leads to a much, much greater overall risk than a one-in-twenty-million probability of 9/11, even when multiplied by the impact of the death of 3,000+ people.
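The risk-management framing above can be written down as a small expected-value calculation. All probabilities and severity weights below are invented placeholders for illustration, not real aviation statistics:

```python
# Hedged sketch of risk management as described in the parent post:
# overall risk is the sum over scenarios of probability * severity.
# Every number here is made up purely to illustrate the structure.
scenarios = {
    # name: (assumed probability per flight, assumed severity weight)
    "9/11-style hijacking": (1e-8, 3000),
    "hostage hijacking":    (1e-7, 300),
    "unruly passenger":     (1e-4, 1),
}

def overall_risk(scenarios):
    """Sum of probability-weighted severities across all scenarios."""
    return sum(p * severity for p, severity in scenarios.values())

for name, (p, sev) in scenarios.items():
    print(f"{name}: expected impact {p * sev:.2e} per flight")
print(f"total expected impact: {overall_risk(scenarios):.2e} per flight")
```

Note how, with these illustrative numbers, the rare-but-catastrophic scenario and the common-but-minor one contribute comparable amounts to the total, which is the parent's point about comparing OVERALL risks rather than worst cases.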
98% Accurate! (Score:5, Interesting)
You mean to tell me that 98% accuracy when trying to spot terrorists in airports isn't good enough? That's only 200,000 false positives per year for a typical airport.
False positives OK at airport? (Score:5, Insightful)
You mean to tell me that 98% accuracy when trying to spot terrorists in airports isn't good enough? That's only 200,000 false positives per year for a typical airport.
Perhaps the false positives at airports are OK? Rather than randomly choosing people for more attentive searches, plus the occasional grandma to give the facade of fairness and non-profiling, we could focus on the 2% who are higher probability. Of course that 2% is unfairly inconvenienced, but isn't that better than 100% unfairly inconvenienced? Clearly a negative/negative decision.
Of course this is all academic and falls apart if the false negatives are at a non-trivial level.
Re:False positives OK at airport? (Score:4, Insightful)
Those 2% start to get a bit pissed off after the first two or three times. I suspect we might prefer to stick to random in the interests of fairness.
The problem with randomness is that it is less effective, since finite screening time is spent on low-probability individuals. What is fair about increasing the likelihood that a bad guy gets through and innocents die? I think what you describe is better described as a facade of political correctness than fairness.
Perhaps the inconvenience could be ameliorated with the known/trusted flier biometric IDs that some are proposing.
Again, I see the unfair burden placed on the 2%, as I said its a negative/negative decision.
Re: (Score:3, Insightful)
But if you happen to look like Abul bin Awfulguy it means that you will be inconvenienced every time you go to the airport. Every time. While that might be fine for you (or might not, did you know you look just like Sean McIRAnut?), it's not exactly great for Robert Hussien, who's a fourth generation American and has a security clearance. But convince the automated systems of that, why don't you?
Re: (Score:3)
But if you happen to look like Abul bin Awfulguy it means that you will be inconvenienced every time you go to the airport. Every time. While that might be fine for you (or might not, did you know you look just like Sean McIRAnut?), it's not exactly great for Robert Hussien, who's a fourth generation American and has a security clearance. But convince the automated systems of that, why don't you?
If you in fact look like Abul bin Awfulguy or Sean McIRAnut, shouldn't security stop you and have a chat to determine whether you merely resemble or actually are the person in question? Should a human security agent who thinks he recognizes the aforementioned individuals not do anything unless the random number generator says it's their turn for a conversation?
Again, I see the unfairness to the folks who resemble a bad guy, but I'm not sure the cost of "fairness" is reasonable. Especially if some biometric ID i
Re:98% Accurate! (Score:5, Informative)
Let's take JFK. From Wikipedia:
In 2010, the airport handled 46,514,154 passengers
2% of that is almost a million people. Every year. Now, let's assume handling each of these false positives is an hour's work on average. That's about a million hours spent.
Let's assume a workday of 8 hours, and 250 workdays a year. That's about 2000 hours a year for an average worker. So it'll take 500 people to track these false positives at JFK.
I think it's a little unacceptable, but YMMV of course.
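The staffing estimate above works out like this. This sketch uses the thread's own assumptions (the Wikipedia passenger count for JFK in 2010, a 2% flag rate, one hour of handling per flagged passenger, and a 2,000-hour work year), none of which are measured figures:

```python
# Reproducing the parent comment's staffing arithmetic.
# All inputs are the thread's assumptions, not measurements.
passengers = 46_514_154        # JFK, 2010 (per Wikipedia, as quoted above)
flag_rate = 0.02               # assumed 2% false-positive rate
hours_per_case = 1             # assumed handling time per flagged passenger
hours_per_worker = 8 * 250     # 8-hour days, 250 workdays a year

flagged = passengers * flag_rate
staff = flagged * hours_per_case / hours_per_worker
print(f"{flagged:,.0f} flagged passengers/year -> ~{staff:.0f} full-time staff")
```

The exact figure comes out near 465 full-time workers, consistent with the "about 500 people" estimate in the comment, and that is for a single airport.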
Re: (Score:3)
First off, facial recognition is already widely used -- by casinos. The way it works is that if it matches you to a known cheat (or maybe an MIT math major) then they either just watch you or bar you from the casino. They don't waste any time chasing down false positives. They just continue to improve their software, which is probably better than the commercial stuff now.
The simplest thing is for TSA to do is to just make an extra check on your documentation and bar you from flying if anything is amiss. Why
Re: (Score:3)
When has the TSA ever had to reveal or explain anything they do?
When they get a call from an angry congressperson about the treatment their relative/friend/self just received.
A private business is allowed to deny you entry to their property for no reason at all. So far, we have been operating under the assumption that the government is *not* allowed to deny you passage on a private airplane for no reason at all. Yes, they have tried, but the truth is they are not yet completely above the law.
Of course, personally, I don't feel the need to give them that opportunity.
Re: (Score:3)
Sorry but this pop-culture fixation on "terrorists" has been hijacked by the US Government to facilitate the systematic abrogation of all civil liberties and constitutionally guaranteed rights with the approval and assistance of the oppressed, (US).
We are no safer, the rogue government is infinitely more dangerous to the American people than "terrorists".
Re: (Score:2)
that's only a 1% error... is that supposed to make me feel more comfortable? Sounds like the technology works pretty well, pragmatically...
Anyway, sounds mildly-moderately threatening to general privacy. Who's paying for this?
FTFA, grants from:
National Science Foundation, grant # 0713361
US Army Research Office, contract # DAAD190210389
How much?
Re: (Score:2)
Right now I bet if you were to snap pictures of 10,000 people, you would incorrectly classify at least 100 of them... thats only a 1% error... is that supposed to make me feel more comfortable? Sounds like the technology works pretty well, pragmatically... Anyway, sounds mildly-moderately threatening to general privacy. Who's paying for this? FTFA, grants from: National Science Foundation, grant # 0713361 US Army Research Office, contract # DAAD190210389 How much?
We'll never get any better if we don't try. That's what these grants are for: improving the state of the art.
Re:Where Are the Recall Rates? (Score:4, Interesting)
How you want to decide Google passed on continuing down this road is up to you. Frankly, I would surmise that the type I and type II errors [wikipedia.org] become woefully problematic when applied to an entire population.
I dunno. I bet if you combine the location of a photo with what Google knows about where you live/hang out the results would be pretty good.
Re: (Score:2)
Also, it's not just the police/government mistaking someone's identity that is scary [nj.com]
Re:Google decided against this. (Score:5, Funny)
This is why Google shelved their version of this tech. The implications were too big.
I don't know... I fed my pr0n directory to Picasa's face recognition, and the results were pretty awesome.
Re:Google decided against this. (Score:5, Funny)
This is why Google shelved their version of this tech. The implications were too big.
I don't know... I fed my pr0n directory to Picasa's face recognition, and the results were pretty awesome.
You mean there are people with noses shaped like... that?
Re: (Score:2)
Bingo... just wait until this tech isn't restricted to just faces - why can't you use other body parts or background objects as well to refine the search? One day you might not even need a face to get an identity... I hope everyone is OK with their naked sexy pics coming back to haunt them in 20 years.
Re: (Score:2)
They shelved it, did they? Why would they do that?
Government would pay good money for it, as would many larger corporations (for internal use, of course).
We knew it was coming (Score:4, Insightful)
Re: (Score:3)
Because terrorists all have facebook accounts? I would assume most of them have very little online presence, pictorially anyway.
Re: (Score:3)
Duh, of course they don't have Facebook. They have Terrorbook, and most of their faces are partially covered with handkerchiefs or some other items.
Re: (Score:2)
Facebook centric because its academic research (Score:4, Insightful)
Because terrorists all have facebook accounts? I would assume most of them have very little online presence, pictorially anyway.
Oddly, whenever a new terrorist is discovered and remains at large, law enforcement and the mass media seem able to come up with a facial photo. Perhaps there are sources of photos other than Facebook, in particular sources available to government agencies: DMV photos, passport photos, school photos, team photos, etc.
The experiment is facebook centric because it is an academic project that needs to stick to info made public by the individual to avoid privacy issues.
Re: (Score:3)
That is the cool but unnerving part of government tech. It is hard to tell how much is overestimated (like 2001: A Space Odyssey's commercial flights to the moon), how far ahead they genuinely are, and how much of the bleeding edge is released.
New York was revealed in the media recently to have the tech to track down everyone wearing a "red jacket" through their camera security systems.
Welcome to the world of tomorrow (Score:2)
I already don't like it.
Re: (Score:2)
I do.
Re: (Score:2)
Of course you don't, you watch a lot of sci-fi.
But Facebook... (Score:5, Insightful)
Look, I am not a paranoid man. I am perfectly willing to give out private and personal information - for a reasonable fee.
I give out private information to my bank all the time. In exchange, I get financial services.
Facebook offers - a) a blog, b) email, c) games, d) convenient log in
The first 3 are available for free elsewhere, the last is not worth much.
I'm not paranoid, I'm just not cheap. And Facebook is asking way way too much for the minimal services it provides.
Re: (Score:3, Interesting)
One of the reasons I have a facebook account is so I can untag photos others say are me.
Re: (Score:3)
That's a lot of work. Didn't you know you can change your privacy settings so that tagged photos of you aren't searchable by other people? http://www.facebook.com/help/?faq=267508226592992 [facebook.com]
Re: (Score:3)
One of the reasons I have a facebook account is so I can untag photos others say are me.
This is one of my arguments for maintaining a public presence on the internet: control over my image/likeness. When someone Googles my name, the first things they see are my professional webpage, personal webpage, and Facebook account. Anyone else with the same name is pushed to the second page of results. Anything not under my direct control is pushed to the bottom of the first page of results.
With a Facebook account and publicly available webpages, I am able to broadcast my side of the story and drown
Re: (Score:3)
I tag random photos of others as me. They can't untag a photo that you say is yourself ;)
I call it FaceBombing (combination of Photo Bomb and Facebook). I wish I could trademark the phrase.
Re: (Score:2)
But they can't tag you if you don't have an account. They can write your name, but that is not internally or externally searchable by ordinary users. I think your strategy is opening you up to more search connections, by being searchable and, at times, tagged.
FTFY. You can be sure FB has database entries for people that don't have accounts, and that their facial recognition program uses these tags. When they build up enough info on a person, they might start sending them email solicitations* like "We have this photo tagged of you. Please create an account to confirm/deny that this is you." I bet it's two years or so away.
*A lot of people use the "upload my addressbook to facebook" option. If they do it from their smartphone, it might scrape the contact ph
public pics? (Score:3, Insightful)
Note that the pic in question (a) does not show a face clearly and (b) may or may not be me.
Re: (Score:2)
Makes no difference on Facebook. While you're doing that your auntie/mom/friends are busy uploading and tagging hundreds of pictures of you.
Re: (Score:2)
Then you find out FaceBook still has a log that it was tagged you, and they are granting back door access to certain governments/businesses to said logs.
Re: (Score:2)
Then you find out FaceBook still has a log that it was tagged you, and they are selling back door access to certain governments/businesses to said logs.
FTFY
Re: (Score:2)
If you are a Facebook user you can untag yourself. If you are not a Facebook user they can't tag you.
You think Facebook really throws that information away just because you clicked "untag"?
0 errors? (Score:2)
Re: (Score:2)
By the standards of some of the dodgier corners of forensics, this stuff will be downright impressive...
Face it (Score:5, Insightful)
The first real-world, publicly available use of this will be an app that lets you:
1. Take a picture of someone with your smart phone
2. Find naked pictures of this person online
BRB, heading to the local college campus...
Re: (Score:3)
And the next app will be a virtual reality overlay that delivers whatever metrics you like. Health, wealth, criminal history. This will be a boon for criminals!
Re: (Score:2)
You think all those images on Google Earth were taken vertically downwards...?
That sort of image transform is everyday stuff to people who work in geomapping.
Software the future of computing (Score:3)
Re: (Score:2)
Out of curiosity, I found and tried running one of those old PC magazine performance test programs from the 1990s (SpeedPro or something similar) on a modern PC (3 GHz, dual-core Intel). Performance was 120,000 times faster than the original IBM XT, not taking into account the use of GPUs. The tests did things like random disk access, FFTs, and memory operations.
Given that for some tasks a GPU is 100x faster than a CPU, and cloud computing puts together a grid of thousands or such PC's, that is an insane amou
it's annoying (Score:2)
when you get an anonymous email telling you you have a booger hanging on the end of a long nose hair
There was once a time that Star Trek predicted... (Score:2)
There was once a time that Star Trek predicted future technology. CSI is now doing it. And it is far less benevolent than the cell phone and portable medical diagnostic devices.
Re: (Score:2)
How about a reflection off a water droplet on a handrail at that stadium? I love this video: Red Dwarf CSI Spoof [youtube.com]
Sigh (Score:5, Insightful)
Time to start dressing like The Stig again.
Re: (Score:2)
Re: (Score:2)
If there are 300 million Stigs... how will they know which one is the real one?
Re: (Score:2)
Re: (Score:2)
The real Stig is clearly the one that's the best driver.
Re: (Score:2)
No worries, gait analysis will still get you.
Re: (Score:2)
Re: (Score:2)
Ben Collins, is that you? Some say he has a full tattoo of his face on his face..
I'm glad I have a clone ... (Score:3)
Of course, they just managed to link to *someone* ... did they then ask the person to confirm if they were correct?
I have a LinkedIn page, but without a picture. My twin brother on the other hand, uses Facebook, while I don't. (I'm rather sensitive about my info being out there, after having a stalker during undergrad) So, it's entirely possible that they would've gotten information from my face ... but unlikely that it'd have been my information
In this case, the error might still lead them to me, as my brother would recognize me if they showed him the picture ... but how many other incorrect matches might there have been? Just getting *a* match is not the same as getting the *correct* match.
Nothing to worry about! (Score:3)
Fucking luddites. Go tighten your tinfoil hats.
Low cost workaround... (Score:4)
Re: (Score:2)
Re: (Score:2)
who killed privacy? (Score:4, Insightful)
you did
it's funny that the tech industry holds some of the most privacy-concerned individuals, yet all their dedication to their craft has done is provide the most privacy destroying entity ever to exist
privacy is dead as a doornail. just forget about the concept. really, you needn't bother about privacy anymore, it's a nonstarter in today's world. big brother? try little brother: every joe shmoe with a smart phone with a camera has more power than the NSA, KGB, MI6, MSS: those guys are amateur hour
i'm not saying it's wrong, i'm not saying it's right. i'm just saying it's the simple truth of the matter, right or wrong: privacy is dead. acceptance is your only option now. you simply can't fight this
and government didn't kill it, you paranoid schizophrenic goons
your technolust did
Re:who killed privacy? (Score:4, Insightful)
it's funny that the tech industry holds some of the most privacy-concerned individuals (..)
That is only if you believe the all-caps paragraphs on all the EULAs and TOS you click through. Often the following paragraphs will contradict the bombastic declarations of commitment to privacy - on the same page.
anyway, it's dead (Score:2)
Technological advance is the purview of many, not only technophiles, academics, or governments. Technology to monitor and correlate just advances. Other than that, I agree with you.
More data goes online every day, even aside from what we put there ourselves, data sourced a myriad ways, ways multiplying constantly. It's a(n ever more) digital life.
There's no pulling the plug. There's only learning to cope. It's just fact that our lives, the lives of everyone, grow ever more transparent.
So, how will we a
Re: (Score:2)
Large group of people stubbornly refuses to act in uniform manner. Film at 11.
Hysteria (Score:3)
They say the false accept rate is .001, or one in a thousand. That is, they can extract about 10 bits of information from a picture. From those 10 bits they claim to get the SSN? Or, they have the picture of a person, and need to identify them in a sample of a million people, they will get back 1000 possible matches.
The complaints about privacy seem greatly overblown. In essence they are saying that if you post a picture with your name, and then another picture without your name, someone with a million dollars of software might recognize the similarities. Of course they might without the computer too. This is just another in the long line of "security" scares which presume that items of public knowledge such as your appearance, name, DOB and SSN can be turned into secret passwords after 40 years of being public knowledge. The security experts should be spending their time convincing banks not to pretend an SSN is a secret, rather than enabling them by agitating for legislation to make it so.
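The "10 bits" figure follows directly from the false-accept rate. A quick sketch, using the 0.001 rate and the million-person gallery from the comment above:

```python
# A false-accept rate of 0.001 means one match narrows the candidate
# pool by a factor of ~1000, i.e. it carries about log2(1000) ~= 10
# bits of identifying information.
import math

false_accept_rate = 0.001
bits = -math.log2(false_accept_rate)
print(f"information per match: ~{bits:.1f} bits")

# Against a gallery of a million people, a single match still leaves
# a large pool of plausible candidates:
gallery = 1_000_000
expected_matches = gallery * false_accept_rate
print(f"expected false matches in a {gallery:,}-person gallery: "
      f"{expected_matches:,.0f}")
```

So on its own, one face match against a large population is nowhere near enough to single out an individual; the accretion of other data (location, social graph, and so on) is what makes the identification work.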
Finally, a wake-up call on privacy policy? (Score:3)
Re: (Score:3)
I am not sure what point you are trying to make here. The article talks about matching a picture taken in public with information such as Facebook images that are also set as public. Where is the violation of your privacy?
If you don't want random people in the street to be able to look at your Facebook pictures then don't put them online, or mark them as private.
Aggregating public information doesn't suddenly create privacy violations.
Get a website. (Score:2)
This isn't going away. The only real answer is to clog the information channels about you with what you actually want the world to know.
Does this pose a problem for, say, pseudonym online dating? Yep. Unless you're willing to drop the pseudonym and link out to your dating profile, alongside your work profile, your hobby blog. It's time to stop pretending that we can post to Facebook and compartmentalize it -- the service providers do not want to do this, and increasingly are unable to provide this even if w
Date screening (Score:2)
So now I can trace my date to see whether she ever did a porno?
Wait, this is new? (Score:2)
They've been doing this on shows like CSI and NCIS for years. :) You mean... they were just making it up? Wow. My faith in Hollywood's technical advisers is shattered forever.
For example, this is dangerous for women (Score:5, Interesting)
Re:For example, this is dangerous for women (Score:5, Funny)
I am a good looking female.
On Slashdot? Are you lost?
Re: (Score:2)
Girls can be nerds too. News flash, at least some of them are likely to be attractive.
Re: (Score:3)
What? No links???
If you've got a pr0n website for $$...I'd have to think a link on Slashdot would bring a fortune in a day....if it could handle the slashdotting....
Re: (Score:3)
if it could handle the slashdotting....
Hot server pounded all night by gang of horny geek studs (5 stars)
Re: (Score:3)
I understand that unwanted attention from anyone can be unpleasant, upsetting and in some cases dangerous. But your experience shows that this kind of technology isn't required for that to be the case. A guy stalked you, tried to follow you home and made you feel threatened, and he didn't need to look you up online to do that.
He could well have searched for you online. Finding out your name could be as easy as overhearing a co-worker calling for you, or reading your name badge. But to a large extent the amo
Re: (Score:3, Informative)
Yes, now besides getting raped, she can be shot too!
Re: (Score:3)
Well, that's the thing.
You do not carry a gun...unless you are prepared to use it when needed without flinching.
I, for one...have no compunction about unloading a magazine into someone that is threatening to do me bodily harm...ESPECIALLY if it is in my own home.
If you're not willing to pull that trigger, then no...don't carry a gun, it will likely end up being used on you (assuming of course they don't have one already).
Re: (Score:2)
Can one really know if they're prepared until it happens? It's not like you can train shooting at people (you can train with other targets, but that doesn't carry the emotional response).
I'm ignorant of such subjects, but I doubt that can be accomplished without some serious military training.
assuming of course they don't have one already
I assume - but I don't have any data to confirm - that even if they have a gun, they're much more likely to actually use it if they're against an armed opponent.
Re: (Score:3)
Outside of staying close to a man, the only real thing a 120lb woman can do to physically protect herself is to carry. I've met some amazingly good women fighters, but even they wouldn't have a chance against 50% of th
Re: (Score:3)
Frankly, I don't know if I want to be a person who has no problem shooting another, even if (s)he poses a danger to me. The repercussions of that would be disturbing.
Re: (Score:3)
Simple, doubt about whether the person is really a threat. You see someone chasing your son and hitting him on the back and shoulders - it turns out they were being chased by hornets and the guy was trying to knock the hornets off your son's back. (A year ago I was seen chasing someone's kid and hitting him for that very reason.)
You wake up in the middle of the night to investigate a noise - you see someone hunched over your wife who fell asleep on the couch - he turns and approaches you - is that your
Re: (Score:2)
Not if she learns how to use it. People that get shot with their own guns are those that never learned how to use them or protect themselves in the first place. If you want an option for defense and don't want to actually learn how to use a gun to protect yourself (which takes a fair amount of time and commitment) get pepper spray. If pepper spray is illegal where you live I suggest moving to a location that isn't hostile to your own safety.
Re: (Score:3)
She should get some of those .50cal cartoon guns (Desert eagle?) and take pics of herself in sexy nude poses with them :-P=
Hey she said she's a porn star, it's legitimate career advice.
Combine this with video storage (Score:2)
There are companies actually selling access to large stocks of video surveillance. Imagine combining facial recognition software with the video from thousands of security cameras. You could do all kinds of scary things.
A Little Callous (Score:2)
FTA:
I think judgment matters. If you have something that you don't want anyone to know, maybe you shouldn't be doing it in the first place.
It's just too much of a binary approach to the matter; either you accept being part of the network and are fine, or you choose not to join and are hiding something. The fact is that there are grey areas here: it's not often that there's something I don't want anyone to know, but there are thousands of things I don't want some people to know.
Likewise, I don't think it's entirely appropriate to have you automatically opted in. Suddenly, everyone is a part of a network whether they want to be or
every time i hear "the cloud" i reach for my gun (Score:2)
Then why on earth didn't they just say "the Internet"?!? Are we really going to see the term "cloud" replace "Internet"?
Re: (Score:3)
Not if you live in new york city. Wearing a mask is a crime.
Re: (Score:2)
Finally, a practical application.