Smart Cameras To Predict Crimes 245
hairybacchus writes: "The Independent News is reporting that scientists at Kingston University in London have developed video processing software that is able to predict behavior patterns of the people on-screen. They say it will be used to alleviate congestion in the London Underground or alert police to potential muggings. I wonder how long it will be before this is combined with face-recognition technology? It's spooky." I can't wait. "We searched you because the computer told us to." Trust the Computer.
Just imagine if the RIAA got hold of this (Score:1, Funny)
The Independent News? (Score:2, Insightful)
Sorry, but I'm still in 'Stunned by the Americacentrism' mode after the story where every man and his dog bemoaned a story that spoilt a television programme before it had been shown in the whole of the *states*....
Re:The Independent News? (Score:2)
Did you bother to click on the link before complaining? The browser title is "Independent News".
Re:The Independent News? (Score:2, Funny)
;-)
Re:The Independent News? (Score:1, Funny)
could be good (Score:1)
Of course... that's if it works.
Re:could be good (Score:1)
Re:could be good (Score:3, Insightful)
Anyway, I'm getting a little off topic, but from what I've seen, the London camera system was installed to combat the IRA terrorists (sound familiar Americans?) but according to the program hasn't ever actually resulted in capturing an IRA terrorist. So, pray tell, what is the massive camera system in London used for? Spying on the citizens of course. Am I paranoid? A little, but without paranoid people we would not have a Bill of Rights in the US. We'd all be ignorant trusting twats who believe evil men don't exist and believe everything spoon-fed to us by the media and our government.
Re:could be good (Score:2)
You know, if you were robbed, you would be only too pleased to pull up the video evidence of it to help nail the person who did it.
Indeed, many people want cameras up in train stations and on trains (and probably would prefer to have a real person watching it to arrange for help if there was a problem). On a similar note of machine prediction, most of us are happy to have metal detectors at airports as a "predictor" of subsequent potential illegal acts.
So what gives? I think that what we are worrying about here is about a couple of themes:
1) The possibility that someone can collate and document your activities and use innocent (and legal) behaviour against you. Of course, all bets are off with some groups. Just ask Bill about Monica - I don't think he actually broke the law, but everyone wanted to know anyway. (OK, you expect a little scrutiny as a president!)
2) The possibility of persecution for acts which have never been committed, where people are being judged on presumed intent. (Although most people still don't want guns on airplanes, funnily enough.)
If this is the case, then what we need is more in the way of legislative (or ideally, constitutional) protection of:
1) rights to privacy, and
2) the presumption of innocence.
These concepts do exist legally in some areas for protection of privacy (e.g., medical records) and from persecution (when accused of a major felony).
But they really don't exist at a day to day level for most people living normal lives. Protection against these actions with legal rights is probably the best solution here.
Because the technology isn't going to go away.
My 2c worth - Michael
Thoughtcrime (Score:3, Redundant)
Re:Thoughtcrime (Score:1)
Re:Thoughtcrime (Score:1)
Re:Thoughtcrime (Score:2)
That is a line that scares me, because it isn't inconceivable for this to happen if this technology takes off. Computers don't have the ability to distinguish between someone who might commit a crime and someone who won't. Police, and the rest of law enforcement, have a hard enough time doing this, and they can think.
Re:Thoughtcrime (Score:2)
We arrested you because the computer said you were going to {insert you favorite crime here}.
I used "" by mistake. I guess I should hit preview more often.
Smart camera (Score:5, Funny)
...
**Damn** I hate it when I'm right!
Re:Smart camera (Score:2)
For when it gets /.'ed (Score:4, Informative)
CCTV: By learning behaviour patterns, computers could soon alert police when an unmanned camera sees 'suspicious' activity
By Andrew Johnson
21 April 2002
Computers and CCTV cameras could be used to predict and prevent crime before it happens.
Scientists at Kingston University in London have developed software able to anticipate if someone is about to mug an old lady or plant a bomb at an airport.
It works by examining images coming in from closed-circuit television (CCTV) cameras and comparing them to behaviour patterns that have already been programmed into its memory.
The software, called Cromatica, can then mathematically work out what is likely to happen next. And if it is likely to be a crime it can send a warning signal to a security guard or police officer.
The system was developed by Dr Sergio Velastin, of Kingston University's Digital Imaging Research Centre, to improve public transport.
By predicting crowd flow, congestion patterns and potential suicides on the London Underground, the aim was to increase the efficiency and safety of transport systems.
The software has already been tested at London's Liverpool Street Station.
Dr Velastin explained that not feeling safe was a major reason why some people did not use public transport. "In some ways, women and the elderly are effectively excluded from the public transport system," he said.
CCTV cameras help improve security, he said, but they are monitored by humans who can lose concentration or miss things. It is especially difficult for the person watching CCTV to remain vigilant if nothing happens for a long period of time, he said.
"Our technology excels at carrying out the boring, repetitive tasks and highlighting potential situations that could otherwise go unnoticed," he added.
While recent studies have shown that cameras tend to move crime on elsewhere rather than prevent it completely, in certain environments, such as train stations, they are still useful.
And Dr Velastin believes his creation has a much wider social use than just improving transport.
His team of European researchers are improving the software so that eventually it will be capable of spotting unattended luggage in an airport. And it will be able to tell who left it there and where that person has gone.
However, the computer is not yet set to replace the human being altogether.
"The idea is that the computer detects a potential event and shows it to the operator, who then decides what to do - so we are still a long way off from machines replacing humans," Dr Velastin says.
Trustworthy? (Score:1)
So considering it is better to err on the side of caution, the best we can hope for is that these computers show the operator everything...
How exactly are they testing this, and do they get many "CrimeNotFound" exceptions?
Re:Trustworthy? (Score:2)
Excellent... (Score:5, Funny)
"Gnovos, the computer has informed us that your progress in the 'QuakeSex Research Project' has been incredibly successful, and we are to give you another $100 million extension to the grant. Personally, I don't see how playing deathmatch games against your friends between sexual encounters with supermodels contributes to global peace, but it's not my place to dispute the wisdom of the computer. Machines are always right, after all. Oh, and another Nobel prize came today, should I put it in the box with the others?"
You slashdotters are a bunch of cynics.. (Score:3, Interesting)
Re:You slashdotters are a bunch of cynics.. (Score:5, Insightful)
It all boils down to whether you trust them to responsibly use the power they have in cases like this.
Well, do you?
STUDY THE PAST
Re:You slashdotters are a bunch of cynics.. (Score:4, Insightful)
Study the past, indeed! From your post, you would think that all this began with "the settling of the New World"! ROTFL! Try reading any history about any part of the world at any time!
When we consider whether we allow the police to have guns, we don't ask whether we can always trust them to use their guns wisely. Of course we can't. Instead, we ask what are the advantages of the police having guns versus their not having guns and what procedures we can have in place that will minimise the abuses.
We don't ban police from interrogating suspects even though sometimes they abuse their power in those interrogations. We do prevent them from torturing suspects, and we also will exclude certain evidence if police disregard the rights of suspects. Some jurisdictions also videotape all (custodial) interrogations of serious crimes, an excellent practice, which should be required.
But, notice, we do not ban interrogations. Nor do we say, we trust police to do the right thing always. The very foundations of our government are based on accountability to the people and checks and balances, not on trusting authorities to always do the right thing. Try reading The Federalist Papers some time instead of watching Oliver Stone movies.
Of all technologies, this one - having computers analyzing video surveillance cameras in public places - seems amazingly innocuous. I can hardly imagine anything less threatening to me.
Re:You slashdotters are a bunch of cynics.. (Score:3, Insightful)
Re:You slashdotters are a bunch of cynics.. (Score:5, Interesting)
Consider surveillance cameras on city streets. Sure, the fact that I walk down a particular street at a particular time is public knowledge -- anyone could see me and remember. But what if every step I took in public was recorded on video and tracked? Whoever had that information would know a great deal about my behaviour, and that information could be used against me. Pervasive collection of information, even public information, can be a grave threat to privacy.
Now consider the technology discussed in this article. Phenomena such as racial profiling have taught us that an innocent person can suffer horribly at the hands of law enforcement personnel just because they fit a perceived statistical profile. Imagine a world where everyone is afraid to act in any way unusual for fear of being stopped for "questioning."
And you can forget the argument about "if it works, it's okay." First of all, these methods are inherently statistical, and statistical methods are never 100% accurate. If they were, they would be logical, deductive methods. Statistics is inductive.
Secondly, even if you did claim to have perfect foreknowledge of crimes to be committed, you create a predestination paradox. At what point does a would-be criminal make up his or her mind to commit a crime? Who's to say he or she wouldn't back down at the critical moment, or be unable to go through with it due to some chance event?
My real point here is that we can't always rely upon "more is better" methodology as our technology progresses. We have to consider how scale affects the nature of our technological activities. If we are blind to issues such as these, then eventually we'll get screwed. Maybe this prediction thing will turn out to be benign or even beneficial. But there are many, many issues of this sort, and some of them are going to bite us in the ass if we don't raise hell when we see a problem. Dig?
Re:You slashdotters are a bunch of cynics.. (Score:2)
Re:You slashdotters are a bunch of cynics.. (Score:2)
"No one goes from idealism to realism. There's a cynical stage inbetween." (--Sir Fred Hoyle, IIRC)
Re:You slashdotters are a bunch of cynics.. (Score:2)
Obviously, we're faggots, busily offending the sensibilities of victorian society behind closed doors.
Of course, if we had a transparent society - in which all our personal data were available to anyone who'd care to look - they'd realize that we're just a couple of heterosexual geeks having a small LAN party. (We both read Slashdot, but he's the only one within 13000 feet of the CO.)
Of course, since such a meeting would also be conducive to things that would offend the sensibilities of RIAA and MPAA executives, and the penalties for that are far worse, maybe it's better that the security apparatus doesn't know what goes on behind modded cases :)
Re:You slashdotters are a bunch of cynics.. (Score:2)
NO CARRIER
Re:You slashdotters are a bunch of cynics.. (Score:2)
I'll also add - depending on where she lived, between 1m and 3m resolution satellite photos of her home.
Always freaks 'em out when I say "So, is your room on the side of the house facing the row of trees, or do you wake up looking at that ugly apartment block across the street?"
Funny, I never seem to get a second date :)
Re:You slashdotters are a bunch of cynics.. (Score:1)
Re:You slashdotters are a bunch of cynics.. (Score:2, Informative)
I think the point that everyone is missing here is that this work is already done by humans; there's no new invasion of privacy. You'll find that the network of cameras in London helps stop a lot of street crime (it needs doing - London now has a higher level of street crime than New York). All the system will do is alert security personnel working on the street to keep an eye on one particular person - not to arrest them and lock them up instantly, just keep an eye on them. Considering this is currently done by some guy who has a hunch about the person he's looking at, it'll probably reduce the occasions of people being wrongfully followed.
Re:You slashdotters are a bunch of cynics.. (Score:2, Interesting)
the cynic is you (Score:2)
Trying to substitute cheap technology for a functioning society is the wrong path. You can put in cameras to detect potential criminals, but that doesn't get at the root of the problem. Crime and violence are the result of failed government policies. Cameras won't make you secure, and neither will minimum-wage security guards or a stressed police force.
The cynic is you: rather than trying to prevent crime at the root, you give up and want to throw more and more people into jail.
Re:the cynic is you (Score:2)
Crime and violence are the result of failed government policies.
Hmmm. Some of the time, maybe. But I think a "functioning society" is not absolutely correlated with government policies.
There are plenty of examples of societies with lousy governmental policies and, yet, some fine, upstanding good citizens.
Likewise, there are places with progressive, enlightened governmental policies where, nevertheless, criminals can be found.
I think the roots of crime and violence grow much deeper into culture as a whole. It would be convenient if government policies were so effective, but my observation is that they are only roughly correlated with society's behavior.
Re:the cynic is you (Score:2)
Nothing in the real world is "absolutely correlated" with anything.
There are plenty of examples of societies with lousy governmental policies and, yet, some fine, upstanding good citizens. Likewise, there are places with progressive, enlightened governmental policies where, nevertheless, criminals can be found.
Crime and terrorism aren't about existence or non-existence; they're about statistics and frequency. And the US statistics are lousy.
It would be convenient if government policies were so effective, but my observation is that they are only roughly correlated with society's behavior.
Government is one of the mechanisms by which culture is made. And, in a democracy, government is the mechanism by which culture acts. It's the one place where culture becomes visible and where it can be changed.
Give the system something to think about... (Score:4, Funny)
Very difficult to spot during editing, apparently ;-) Wonder what it would make of that?
This reminds me... (Score:4, Interesting)
VO: Some newspapers stop here.
Unfreeze and said Skinhead sweeps man out of the way of falling masonry i.e. it was a rescue and not a mugging.
VO: The Guardian - get the full picture.
I guess with this technology in place, computer-controlled lasers would have taken out the rescuer before he could act
Re:This reminds me...Fuller version (Score:1)
There were more than two viewpoints. You missed the middle sequence.
As you say, the first viewpoint was of a dodgy-looking skinhead running towards a businessman; it looks like a mugging about to happen.
The next set of shots shows a car (not visible from the previous angle) with some dodgy-looking geezers in it, slowing to a halt at a junction, next to the skinhead walking along the pavement (pavement = sidewalk). The skinhead starts running away from the car.
Final sequence: the skinhead running towards the businessman, who is walking past a building site. Some heavy building material is falling from above; the skinhead grabs the businessman and pulls him out of the way.
An excellent advert.
>>I guess with this technology in place, computer-controlled lasers would have taken out the rescuer before he could act
But of course. Think of all the starving lawyers who could have made a tidy packet out of suing the building site. (Won't someone think of the children^H^H^H^H^H^Hlawyers!)
Some figures re London surveillance cameras (Score:4, Interesting)
Privacy issues aside, somehow a 0:2,000,000 success:cost ratio strikes me as a wee bit useless, not to mention being an utter waste of tax money and gov't time.
And that doesn't begin to touch the problem of sorting out the mass of data from 300 screencaps per day per citizen.
More figures re London surveillance cameras (Score:2)
Assuming that's tolerably close, that means there is one camera for every 5 residents!!
And postulating that perhaps 20% of Londoners are out in public at any given moment, that's one camera per publicly-visible citizen at all times.
So.. with what statistically amounts to 100% surveillance of each and every citizen while they're out in public, the cameras still can't catch ONE terrorist.
[sarcasm] If the surveillance system is accurate in determining potentially naughty behaviour, it follows that the number of terrorists in London is zero. [/sarcasm]
Tom Cruise? (Score:3, Insightful)
It's a trashy promo for the new movie Minority Report [tnmc.org]. Computers predicting crimes before you commit them (in the 'not too distant future' they'd have you believe).
What I find funny is that Philip K. Dick is listed as an 'author' of the movie on that web page. Promotional BS. He died in 1982, just before Blade Runner was released (his novel 'Do Androids Dream of Electric Sheep?' was the philosophical foundation for it).
Re:Tom Cruise? (Score:3, Informative)
News on philipKdick.com [philipkdick.com]
hmm (Score:3, Insightful)
Re:hmm (Score:2)
Precisely. I was just contemplating how to use this new surveillance technology for personal amusement.
I bet a pound I can convince it that I'm about to mug myself...
Re:hmm (Score:2)
Re:hmm (Score:2)
I never really understood why people would expect privacy in public places.
Are public bathrooms private places? What about dressing rooms? Underneath the tables at restaurants?
Privacy is a matter of where we expect to be private. If the cameras are obvious, or otherwise publicized, and they are in "public places", then I agree with you (and that appears to be the case in this article). But if the cameras are hidden, even if it's a public place, I think that's problematic.
Above all though the key is to have checks to make sure the system is working properly. If used properly, cameras in public places could stop police brutality and could save some innocent people from being falsely imprisoned. But if used improperly, well, we've all read that book.
Re:hmm (Score:2)
No, no, and no.
It's nice and courteous of people to treat them as such, but you can't reasonably demand that. Well, dressing rooms are usually located in places like stores, gyms and whatnot which, along with restaurants, aren't public places as such; they are private property where the owner admits a certain number of people, on certain terms - the owner makes the rules there. Which isn't to say that I think store owners should be able to clandestinely film their customers, but they can require of their customers that others be given privacy in their dressing rooms.
Eh, I'm just arguing for the sake of it :) It's not like I care about this one way or another.
Reminds me of a story I heard about... (Score:5, Interesting)
Anyway, they wrote some software- it more or less just looked for a human sized blob that moved. Worked too- it could detect human beings pretty well.
Trouble was, they found that it was unreliable - it tended to think birds landing in flocks and groups were people appearing and disappearing. So they improved the algorithm and put in some code so that if the system could see the wings flapping, it would realise it was birds and ignore them.
Anyway, it worked pretty well, so they thought they'd give it a hard test. Could someone deliberately evade it? They got a grad student and told him to work out a way to fool it. They set up the computer guarding a notional prize and set him at it.
The grad student puzzled over it for a while, then sidled into the middle of view and removed his jacket. He then waved his jacket over his head vigorously. The computer saw all the flapping, activated the 'bird' assignment, and he was able to steal the item...
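The detector described here is simple enough to caricature in a few lines of modern OpenCV. The following is a toy reconstruction built entirely on my own assumptions (background subtraction, a person-sized area gate, and a crude "the silhouette area is swinging wildly, so it must be flapping wings" test), not the original code - and the jacket trick would fool it for exactly the reason given above.

import cv2

PERSON_AREA_RANGE = (2000, 30000)   # pixels; depends entirely on the camera setup
FLAP_SWING = 0.4                    # >40% area change frame-to-frame = "birds"

subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=False)
last_total_area = None

def detect_people(frame):
    """Return bounding boxes of person-sized moving blobs, unless it looks like birds."""
    global last_total_area
    mask = subtractor.apply(frame)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    blobs = [(c, cv2.contourArea(c)) for c in contours]
    people = [(c, a) for c, a in blobs
              if PERSON_AREA_RANGE[0] <= a <= PERSON_AREA_RANGE[1]]

    # Crude "wings flapping" heuristic: if the total moving area jumps around
    # a lot from one frame to the next, assume birds and report nothing.
    total = sum(a for _, a in people)
    flapping = bool(last_total_area and total and
                    abs(total - last_total_area) / last_total_area > FLAP_SWING)
    last_total_area = total or last_total_area

    return [] if flapping else [cv2.boundingRect(c) for c, _ in people]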
That may have worked in trials... (Score:5, Funny)
BYTE's Circuit Cellar (Score:2, Informative)
These analysis programs are interesting research, but it'll be years before there's anything even close to picking up enough threats, with few enough false positives, to be considered production-ready. Besides, by the time physical movement is visible, the target/victim has already been selected, monitored, and assessed by the attacker. Proactive measures are much more effective than reactive ones. Look at the PC virus industry for detailed case studies in prevention versus cure.
Re:BYTE's Circuit Cellar (Score:2)
Re:BYTE's Circuit Cellar (Score:2)
Hmm. Well, I can't say much for legal reasons, but the technology has come along a lot further than you realize.
Keep in mind that you're referring to an article that's almost 20 years old dealing with consumer-available technologies. The current commercial and government-grade stuff is way way way beyond that.
The Birds and the Bombs (Score:1)
So? Flocks of geese look like Soviet nuclear missiles to radar operators - I didn't hear anyone complaining about that!
Re:The Birds and the Bombs (Score:3, Funny)
Re:The Birds and the Bombs (Score:2, Funny)
The software (Score:3, Informative)
http://www.cordis.lu/telematics/tap_transport/research/projects/cromatica.html [cordis.lu]
Their other projects [cordis.lu] are also interesting
I have seen this (Score:5, Interesting)
Locating "suspect packages" left in public places
Spotting vehicles parked in dodgy places
Watching for people accessing secure areas
Making sure no service vehicles get onto runways
Yes, all this is possible with more conventional technology but these often need a human being in close attendance. This system filters out noise like stray animals, cyclists, etc because it learns what suspect packages, vehicles and aeroplanes look like and also how they move and behave.
and yes... it could be used to spot human behaviours. It appears that someone plotting a crime moves differently to someone just going about their business. This system knows the rules about human shapes and modalities and fluidity of movement.
My view is that the final bit is a bit of spin for the consumption of venture capitalists and is unlikely to be of much use in prime time - so no need to panic yet. It does however raise interesting questions about "reasonable suspicion", evidence and culpability if someone is wrongly detained. Police would no doubt try to shift responsibility onto the technology, as is their wont.
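As a rough illustration of the 'suspect packages' item above: the usual approach is to flag any tracked object that has sat still for a while with no person left nearby. A hedged sketch follows - the Track interface, the thresholds and the units are all assumptions of mine, not details of the system I saw.

from dataclasses import dataclass

STATIONARY_SECONDS = 60      # object must sit still at least this long
OWNER_DISTANCE_M = 10.0      # ...with no person within this radius

@dataclass
class Track:
    kind: str                # "person" or "object", as labelled by the tracker
    position: tuple          # (x, y) in metres
    stationary_for: float    # seconds without significant movement

def suspect_packages(tracks):
    """Return stationary objects with no person nearby."""
    people = [t for t in tracks if t.kind == "person"]
    flagged = []
    for obj in (t for t in tracks if t.kind == "object"):
        if obj.stationary_for < STATIONARY_SECONDS:
            continue
        if not any(_dist(obj.position, p.position) < OWNER_DISTANCE_M for p in people):
            flagged.append(obj)
    return flagged

def _dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

Working out who left the package and where that person went - as the article claims - needs identity tracking on top of this, which is the genuinely hard part.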
Re:I have seen this (Score:1, Insightful)
Do the authorities care about it if there is a difference?
Would being black, Hispanic or Asian raise the chance of triggering an "alarm"?
What does this mean for the definition of public space? Will there be any public space in the future if this technology is used, or will it shift the notion of public spaces and fragment them into spaces for certain people, forbidden to others just because a computer system's database says they are likely to commit a crime, even if they don't want to?
Re:I have seen this (Score:1, Funny)
Camera notes customer velocity, gait down hallway.
Computer computes desperation factor, and switches the other stall doors to 'engaged', except the last 'VIP' stall, which accepts $2 in coins.
For extra realism, dummy legs and shoes can be electronically positioned on the other thrones and loudspeaker noises added, to convince the payee that that was the best 'penny' ever spent.
Dreamt up by the same architects who put the same number of stalls in the men's and the women's. Adding a coinbox to stalls is good, but just wait till the camera tells it to set a higher price.
Re:I have seen this (Score:3, Insightful)
It does however raise interesting questions about "reasonable suspicion", evidence and culpability if someone is wrongly detained. Police would no doubt try to shift responsibility onto the technology, as is their wont.
I would hope that trying to shift responsibility for wrongful detention/arrest/prosecution would be met with a resounding, "So what?" If you use a tool to do your job, you're still responsible for what you do with the tool. If a house I build collapses and kills people, I shouldn't be able to blame the hammer - even if it's a special prototype hammer with artificial intelligence and accelerometers. I decided to use that particular hammer, so I am responsible for the results of that decision. (I'll get around to suing the hammer manufacturer later).
Also, we hear time and time again about how police don't have the power to act until a crime is committed (e.g. domestic violence) so how will this stop crime? It might assist in arrest or conviction rates by capturing evidence, but unless we have even more fundamental rights taken from us by our "representatives" and "protectors..."
It does seem to be a cool technology, but the potential for abuse is so high that I have trouble supporting it. When a technology exists that has a high potential for criminal abuse (e.g. MP3 copying) legislators fall all over themselves trying to quash it. But they conveniently look the other way when it's something that government might abuse (e.g. radar guns, surveillance equipment, drunk driving check points, Patriot laws...).
in a related story... (Score:4, Funny)
--
Mike Nugent
Come on... (Score:3, Funny)
Every psychologist worth his salt knows that you can't predict the behavior of individuals or even small groups. You need a large group before the mathematics of psychology can be applied with any acceptable degree of accuracy - on the order of the population of a medium-to-highly populated planet. Seldon would be rolling in his grave if he'd been born yet.
Re:Come on... (Score:2)
For the flow-analysis stuff they will have a large number of people to deal with and predict. On the mugging front, however, it's going to be harder. Maybe it's as simple as:
"That bloke is wearing 50k of gold round his neck and a 10k Rolex... he better watch out or he'll get mugged"
Wrong (Score:2, Informative)
I don't think we have to worry yet about a computer causing the erroneous arrest of someone committing thoughtcrime or attempting a mugging: "'The idea is that the computer detects a potential event and shows it to the operator, who then decides what to do - so we are still a long way off from machines replacing humans,' Dr Velastin says." It's simply a tool to help the operators sort through the huge amount of visual data they are presented with.
BTW, I don't support the idea of a Big Brother monitoring the public. However, I'm equally unsupportive of the spread of FUD like this article write-up.
Potential to Abuse... (Score:2, Insightful)
However, the ideas present in the system are not poor. When at university, I knew many students who worked nights as security guards. Most of them would either be studying notes or sleeping! Having a machine to help during the monotony isn't necessarily a bad thing.
If, however, this leads to harassment from the authorities just because you have bad social skills, that's another matter. Hence its use must be monitored, with regulations in place to tackle misuse.
Maybe not all bad... (Score:5, Funny)
1. If I seem lost in thought, change the contents of some of the digital billboards to warn me about wandering into traffic.
2. If I seem sleepy, send an email to my employer warning them not to let me touch any code that day.
3. If I seem irritable, call my girlfriend and warn her to leave me alone for a few hours.
4. And of course, if I seem shifty and nervous, like someone about to do something hazardous and antisocial, someone with something to hide, who is going to do harm to everyone around them... warn the police because I am about to experience flatulence.
;-P
Re:Maybe not all bad... (Score:3, Funny)
Sadly, this is all that NBC has to offer for their fall lineup
1984 (Score:3, Insightful)
The popular myth is that "computers never make mistakes". Well, we ALL know this is bullshit. No computer is any better than the software that it is running, and the hardware is no better than the people who designed it.
Show me ONE bug-free piece of software that exists, anywhere, that is more complex than the "hello world!" level, and then you can argue with me.
Better yet, show me one OPERATING SYSTEM - the layer atop the hardware that any application software (such as this Orwell-Ware) runs on - that is bug free.
Bug=mistake.
That said, the odds of any such application being flawless itself, running on a flawless OS, running on flawless hardware, are SO small as to be non-existent.
The best that can be hoped for is accuracy in the 90%+ range. Multiply that by 300 million people, and the number of people who are going to be harassed is in the TENS of millions... The potential for abuse, by both law enforcement and by hackers with agendas, is staggering...
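Spelling that arithmetic out (illustrative figures only):

population = 300_000_000
accuracy = 0.90                               # i.e. 10% of people misclassified
wrongly_flagged = population * (1 - accuracy)
print(f"{wrongly_flagged:,.0f}")              # 30,000,000 -- "tens of millions"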
Already the face scanners have been proven to be so inaccurate that they are being dropped in places. This is a FAR more complex algorithm... I'd think an accuracy rate of 20% would be generous.
For one thing, they are assuming that normal people will behave normally but that criminals will behave differently, evasively, etc. Well, I for one will NOT act normally anyplace I know such a thing is operating, and I doubt anyone else will either. This, I doubt, can be taken into account.
Machines are dumb , this won't work. (Score:1)
is spot patterns. If a criminal can learn these patterns, he can avoid making them, and even have a friend somewhere else deliberately MAKING those patterns to draw the attention of the CCTV operator elsewhere. People assume criminals are stupid.
They're not.
This is a Good Thing (Score:5, Informative)
The trouble is, humans are inefficient and expensive, and their "gut instincts" may be fallible. The mall security guard may be the only guy watching a dozen closed-circuit monitors, and he may even be dozing off from the monotony of his job. The airport guard might be a minimum wage high-school dropout with barely any training. The cop's instincts are pretty good, but as objective as he tries to be, he unconsciously tends to target members of a particular race instead of going by solid scientific indicators.
This technology (if it works) will be a Good Thing because:
1. It improves upon an existing system that helps keep us safe.
2. It could be more effective and consistent.
3. It could apply rules objectively, and could be designed to flag activities that truly are suspicious (e.g. "casing" a department store) rather than those that merely look suspicious to biased humans (e.g. a young black man in a record store). This means that it could help protect our rights more than the current system.
Cheers,
IT
Re:This is a Good Thing (Score:4, Insightful)
Once new management came in, it took approximately three hours for them to come up with a rule that changed all that. They were tired of stuff being shoplifted (can you blame them?), so they said nobody can wear coats or backpacks into the place. We all had to leave them outside the front door. And it wasn't their responsibility to watch the coats and bags, either.
The very first day, someone walked out and picked up two backpacks, the next day a leather coat was stolen. After that, nobody wanted to go.
The problem? They assumed everyone with a coat or a backpack was a shoplifter. Inconveniencing everyone in order to stop one or two people seems wrong to me. I imagine this new camera system will use some sort of stereotyping as well, like watching for people who bounce around nervously, looking all around them for escape routes or police (many armed robberies in gas stations are like this). But, will the software be able to tell that from someone who really has to use the bathroom, and is bouncing up and down impatiently, searching around the room for the nearest restroom? I think not.
I admire the optimism, though.
no, it is not (Score:2)
The trouble is that people have too much confidence in the efficiency and infallibility of machines. A department store security guard that suspects you of being a shoplifter might be annoying, but he can't do anything until you actually shoplift.
Also, these kinds of machine vision applications are almost impossible to validate. Where do you get the training data from? How do you measure false alarm rate? Most likely, they will have to get trained by some person's judgement of what looks suspicious, which merely enshrines a fallible human judgment into perpetuity, inexactly at that.
The potential for false alarms is enormous. If you have some disability, carry a heavy package in an unusual way, or wear some strange outfit, this system is likely going to tag you as suspicious. Video cameras and computers have nowhere near the reasoning ability to figure out what is going on, or the resolution to even see the necessary details if they could.
Re: Smart Cameras (Score:1)
Also, London is filled with tens of thousands of video cameras, and now they all have face recognition software, so they can see a criminal and follow him through various areas of the city on camera until a cop can catch up to him.
And then there's this story [slashdot.org] about Connecticut doing the roughly the same thing.
better link (Score:3, Informative)
Briefly: Cromatica views crowds as changing colors against a background. When the colors stop, this is congestion. Likewise, suicide attempts are indicated by lingering for 10 minutes or more. It's pretty easy to identify a single person against an empty backdrop.
Of course, people are working on predicting muggings, and the article goes into that as well.
The article also has links to the research itself.
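A rough sketch of those two heuristics as summarised above - occupancy that stops moving means congestion, an isolated blob that lingers past ten minutes raises an alert. Everything here (the thresholds, the background frame, the state dictionary) is my paraphrase of the summary, not the project's code.

import cv2
import numpy as np

LINGER_SECONDS = 10 * 60            # "lingering for 10 minutes or more"
CONGESTION_FRACTION = 0.5           # half the platform occupied and static

def analyse(frame, background, state, fps=25):
    """Call once per frame; state is a dict carried between calls."""
    diff = cv2.absdiff(frame, background)
    moving = cv2.threshold(cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY), 30, 255,
                           cv2.THRESH_BINARY)[1]
    occupied = np.count_nonzero(moving) / moving.size

    # "When the colors stop": large occupied area whose centroid barely moves.
    centroid = np.argwhere(moving).mean(axis=0) if occupied > 0 else None
    still = (centroid is not None and state.get("last") is not None
             and np.linalg.norm(centroid - state["last"]) < 2.0)
    state["last"] = centroid

    congested = occupied > CONGESTION_FRACTION and still

    # Lingering: a small, isolated blob that stays put for LINGER_SECONDS.
    state["linger"] = state.get("linger", 0) + 1 if still else 0
    linger_alert = occupied < 0.05 and state["linger"] > LINGER_SECONDS * fps

    return congested, linger_alert

The background would be a previously captured empty-platform frame; identifying a single person against that empty backdrop is, as noted above, the easy case.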
Better then letting some cops choose on their own (Score:2)
It is being done on a casual basis by police around the world for personal preventive searches and car searches. For the time being it is "the trained operator told us so" instead of the computer. And to be honest, I would rather have a computer decide than some cops. It will be less racially and ethnically biased.
Statistics from observing policemen in some US states and the number of blacks and whites they stop for checks and searches are well known, no point in reiterating them...
Re:Better then letting some cops choose on their o (Score:3, Interesting)
Statistics from observing policemen in some US states and the number of blacks and whites they stop for checks and searches are well known, no point in reiterating them...
Well, the accusations are well known. Then the US Justice Department got New Jersey to "agree" to actually commission a study of the issue, in a consent decree.
The company hired to do the study found that the incidence of speeding varied by race. In a way fairly consistent with the stop ratio.
The Justice department was outraged, has "grave doubts", etc. because that isn't what they wanted to find.
Similar Systems used for Traffic (Score:2, Interesting)
NTS uses real-time video and neural-network technology at traffic intersections and railroad crossings to predict traffic accidents and enforce traffic violations (bad news for you guys who blow red lights) - kind of similar to the situation in the article...instead of predicting the actions of people, it's predicting the actions of automobiles. There are already many deployments nationwide and lots more being installed.
BTW, the same predictive neural network technology is used to predict all types of financial fraud, including credit card fraud and money laundering.
Can anyone say Logan5 ? (Score:2)
Likely motivations (Score:3, Interesting)
1. It's a complete scam. They can't get facial recognition to work, so they've moved on to new BS that doesn't yet have a bunch of defrauded users telling the marks that it's crap.
2. A source for independent and arbitrary racism. It's no longer racist to search you for looking black. The computer has determined that you should move along.
Re:Some facts please? (Score:2)
If subject.race = black Then
    subject.deservesbeating = True
    Camera.continuerecording = False
End If
With all the fancy claims of how it will monitor what the subject is looking around at, hunching his shoulders, and nervously twitching so as to reveal his nefarious intentions, you can bet that either the salesman, or the mark getting conned, will see it as a tool to arrest their least favourite ethnic group.
Lawsuits (Score:4, Interesting)
Stopping old ladies (Score:2, Funny)
Quick and the Dead (Score:2, Interesting)
But the point about face recognition would truly be a kicker. Once that system actually becomes reliable, anybody with a record notorious enough to have their face mapped would be tracked the moment they entered a store. Assuming you can't obscure your likeness in some way, of course.
Re:Quick and the Dead (Score:2)
ha. hahaha. hahahahahaaaaaaaaaahahahahaaaaaaa... (Score:2)
Right, this company out of nowhere can suddenly predict human behavior? Humans in large groups?
This is akin to the millions of dollars that CA just needlessly spent on Oracle licenses -- it's an example of some government flunky with a budget picking up some snake oil from an overzealous salesperson.
Anyone who claims they can "mathematically predict" human behavior is lying through his or her respective teeth.
No worry, it's just the image of the beast. (Score:2)
The machines are just a good excuse and distraction for that beasty point, via reflection.
I need a camera that can predict sex (Score:2)
what was their test suite? (Score:2)
hopefully they didn't test this software on the typical soap opera... which was probably written by "plot writer version 1.0" anyway.
I can see it now: "The camera predicts that the person on screen will turn out to be the long-lost, transgender half-brother of the amnesiac ex-stripper, and that he will marry the heiress to the papaya plantation..."
Re:If you're not a criminal, you have nothing to f (Score:2)
No, but I will now! Oh, what fun to be had...
Re:If you're not a criminal, you have nothing to f (Score:1)
Yes... (Score:1, Informative)
Minority Report [imdb.com]
What would you do if you were accused of a murder you had not committed... yet?
Based on a Philip K. Dick short story, Minority Report is about a cop in the future working in a division of the police department that arrests killers before they commit their crimes, courtesy of some future-viewing technology.
Re:Yes... (Score:2)
The would-be perpetrator is either arrested pre-emptively or, in one case, just told in a phone call that they know they're about to commit a crime, which is enough to deter them.
Re:tom cruise movie like this (Score:1)
I don't think the movie was based on thoughts, though; rather, I think they could somehow predict the future or something. But it certainly does look like an interesting movie (even if it does star Tom Cruise).
Re:You do realise... (Score:2)
The Independent is by no means the UK's usual tabloid rubbish. We've got plenty of those - the Sun and Daily Mail spring instantly to mind.
The Independent is a respected, pretty objective, high-quality broadsheet. It's not politically aligned at all (hence the name), which I suppose might mean some doubt its accuracy because it's not just following the normal right-wing bias (see, the UK media is overwhelmingly right-wing; it's not just you in the US who have that problem!). But really, let's be honest: you may not like it, but it's straight.
Re:You do realise... (Score:2)
Frankly, as a left-of-centre Scottish Nationalist, if I was trolling or inciting flames, you'd know about it
Accidents double in areas implementing cameras. (Score:3, Insightful)
Last year they put a load of static and mobile cameras all over the place. Basically, their "Safety camera" scheme has been a devastating failure.
Cameras have no effect on the casualty rate and are nothing more than revenue generation mechanisms.
Re:Accidents double in areas implementing cameras. (Score:2)
Re:I want to see this given to traffic cops (Score:2)
Re:Software company to predict human rights violat (Score:2)