
Video Surveillance Tech Detects Abnormal Activity

Repton writes with news of a company, Behavioral Recognition Systems, that has received 16 patents on a new video surveillance application that can convert video images into machine-readable language, and then analyze them for anomalies that suggest suspicious behavior in the camera's field of view. The software can 'recognize' up to 300 objects and establish a baseline of activity. It should go on sale in September. "...the BRS Labs technology will likely create a fair number of false positives, [the CEO] concedes. 'We think a three-to-one ratio of alerts to actual events is what the market will accept,' he says. 'We could be wrong.'"
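Neither the story nor the company explains how the system actually works, so here is a minimal sketch of one common approach to "establishing a baseline of activity" and flagging deviations: keep running statistics of motion per region of the frame and alert when current activity strays too far from them. This is an illustration only, not BRS Labs' method; the class name, grid size, and thresholds below are all hypothetical.

    import numpy as np

    class BaselineAnomalyDetector:
        """Toy per-cell activity model: learn a running mean/variance of
        motion energy for each grid cell and flag large deviations."""

        def __init__(self, grid_shape=(8, 8), alpha=0.01, threshold_sigma=3.0):
            self.mean = np.zeros(grid_shape)   # running mean activity per cell
            self.var = np.ones(grid_shape)     # running variance per cell
            self.alpha = alpha                 # learning rate for the baseline
            self.threshold_sigma = threshold_sigma

        def update(self, activity):
            """activity: per-cell motion energy for the current frame.
            Returns a boolean mask of cells that look anomalous."""
            deviation = activity - self.mean
            score = np.abs(deviation) / np.sqrt(self.var + 1e-9)
            # Fold the new frame into the baseline (exponential moving average).
            self.mean += self.alpha * deviation
            self.var += self.alpha * (deviation ** 2 - self.var)
            return score > self.threshold_sigma

    # Example: feed it per-cell motion energy, e.g. from frame differencing.
    detector = BaselineAnomalyDetector()
    frame_activity = np.random.rand(8, 8)      # stand-in for real motion data
    alerts = detector.update(frame_activity)   # True where activity is unusual

In a scheme like this, the 3:1 alert-to-event ratio quoted above is largely a question of where the threshold is set: lower it and you catch more real events at the cost of more false alarms.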
  • Cool (Score:5, Interesting)

    by clang_jangle ( 975789 ) * on Tuesday July 29, 2008 @02:51AM (#24381333) Journal
    Great! Now, all they have to do is combine that with this [youtube.com], and we can all sleep soundly.
  • by Channard ( 693317 ) on Tuesday July 29, 2008 @03:10AM (#24381435) Journal
    ... when it first gets deployed - if it gets deployed - spread the word across the internet and get people to regularly silly-walk past it and do other weird but non-threatening stuff. Hey presto: so many false positives it's rendered useless.
  • by frovingslosh ( 582462 ) on Tuesday July 29, 2008 @03:14AM (#24381455)
    And you don't think that will soon be made illegal? You sure sound like a terrorist to me. To Gitmo with you!
  • Re:grenade in mouth (Score:3, Interesting)

    by zobier ( 585066 ) <zobier@NosPam.zobier.net> on Tuesday July 29, 2008 @03:47AM (#24381635)

    I just don't think this is that big a deal. I do think it will be extremely easy to game, e.g. by flooding it with false positives, like the congestion-charge cameras in the UK. It will be interesting to see what games people come up with.

  • by shawb ( 16347 ) on Tuesday July 29, 2008 @04:07AM (#24381717)
    The ratio of false positives really shouldn't be that much of an issue if the system is implemented properly.

    A bigger issue with a system like this would be false negatives. Economics being what it is, the organizations deploying these cameras would likely end up hiring fewer people per camera to watch the monitors (whether that means an increase in cameras or a decrease in staff). The people watching the monitors would therefore come to rely on the system to spot suspicious behavior, and that's where false negatives come into play: behavior a human would notice and investigate may be missed by the system, and therefore go uninvestigated. The problem escalates once people learn which behaviors trigger a "suspicious" flag and then go about their nefarious deeds where only a human could have spotted them.

    Sure, an automated suspicious-behavior system could be instituted to augment existing measures, but in reality it would end up drawing resources away from security. Even if it didn't reduce security levels outright, a system such as this would at least reduce future investment in other proven security methods, such as hiring more competent staff to watch the monitors.
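    (To put rough numbers on the trade-off above: the article's 3:1 alert-to-event ratio fixes the false-positive side, while any false-negative rate is unknown; the figures below are purely hypothetical.)

        # Back-of-the-envelope numbers for alerts vs. missed events.
        real_events_per_day = 5        # hypothetical
        alerts_per_event = 3           # the 3:1 ratio quoted in the article
        alerts_per_day = real_events_per_day * alerts_per_event   # 15
        precision = 1 / alerts_per_event       # ~33%: 1 alert in 3 is real

        miss_rate = 0.10               # hypothetical false-negative rate
        missed_per_day = real_events_per_day * miss_rate   # 0.5/day unflagged

        print(f"{alerts_per_day} alerts/day at {precision:.0%} precision, "
              f"~{missed_per_day:.1f} real events/day never flagged at all")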
  • by misterhypno ( 978442 ) on Tuesday July 29, 2008 @04:13AM (#24381763)

    If that's what they're shooting for, then I have a name for the system:

    Cry Wolf!

    Because that's all it's really going to do!

    Heaven help any street performer who gets caught by this video Frankenstein's monster, because in some jurisdictions the cops will come in blasting away. And a mime is a terrible thing to waste!

  • by Potatomasher ( 798018 ) on Tuesday July 29, 2008 @05:22AM (#24382103)
    I call shenanigans on this.

    There is no way they can recognize 300 objects under real-world conditions. I work in machine learning (academia), and the current record for generic object recognition sits at around 54-57% on the Caltech 101 database (which contains images of 101 object categories). So even the algorithms of the best and brightest minds in academia (LeCun, Poggio, Lowe, etc.) get it wrong roughly half the time!

    If any government officials are listening: please don't waste our tax money on this!
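    (The quoted 54-57% is per object; in a scene containing many objects, the odds of labelling everything correctly collapse quickly. The scene size below is hypothetical and the independence assumption is a simplification.)

        per_object_accuracy = 0.55    # midpoint of the 54-57% Caltech 101 range
        objects_in_scene = 10         # hypothetical busy camera view

        # Chance that every object in the scene is labelled correctly,
        # assuming independent errors (a simplification).
        all_correct = per_object_accuracy ** objects_in_scene
        print(f"P(all {objects_in_scene} correct) = {all_correct:.4f}")  # ~0.0025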
  • by John Hasler ( 414242 ) on Tuesday July 29, 2008 @08:32AM (#24383111) Homepage

    > You could actually HIRE HUMANS, say 20 of'em, at $75K each to watch 5 screens each, and
    > have a 1:1 ratio of accuracy.

    You never pulled guard duty, did you? You won't get a "1:1 ratio of accuracy" even if you hire someone for each individual camera.

  • Scope creep (Score:3, Interesting)

    by Anonymous Brave Guy ( 457657 ) on Tuesday July 29, 2008 @09:23AM (#24383695)

    Yes, this is why any such technology must be viewed with some concern. In parallel with requiring people to change their behaviour for the benefit of the machine, we also have the danger of trusting the machine. Sooner or later, some jobsworth will decide that a beep from the machine constitutes "reasonable grounds" for suspicion, which is all that is required to stop, search or arrest someone in some places. Ironically, the high false-alert rate (two alerts in three, by the article's own 3:1 figure) is probably a good thing here, since such a decision is then likely to be overruled by a higher authority sooner rather than later. But what failure rate is ever acceptable, given the consequences for the innocent victims of the system?

    You need a pretty severe penalty/compensation/appeals system to overcome the downward spiral once you start to trust the machine: perhaps exponentially increasing compensation payments with each false accusation of the same person, plus personal responsibility on the part of the operators, so that they could be taken to court and sanctioned if they abuse their position. But of course this sort of thing doesn't really happen. What really happens is that the Powers That Be, whether government or corporate, pretend it's OK for a minority of cases to be wrong, and most of the little people don't have the resources to fight the abuse.
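    (A toy model of the escalating-compensation idea above; the base amount and doubling factor are purely hypothetical.)

        # Each repeated false accusation of the same person doubles the
        # payout, so systematic abuse gets expensive fast.
        def compensation(nth_false_accusation, base=1000.0, factor=2.0):
            return base * factor ** (nth_false_accusation - 1)

        for n in range(1, 6):
            print(n, compensation(n))   # 1000, 2000, 4000, 8000, 16000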

    Meanwhile, the person on the other end gets to be like the lady in the UK who featured in a recent TV report because her car's number plate had been cloned: she receives automated fines for motoring offences from various government agencies at a silly rate, each fine has to be challenged individually in court, and there is no mechanism to flag as suspect the record in the DVLA database that those agencies use to find her. Her life has become an ongoing, government-sanctioned harassment campaign, and while there ought to be one hell of a due-process lawsuit in the works, since it is the government doing the dirty work, at the time of the report no lawyer had been able to advise her on how to stop the madness.

    Regardless of the theory, for every mass surveillance technology we allow to be introduced, stories like that one are easy to find. And most mass surveillance technologies don't really work very well anyway: city centre CCTV hasn't been proved to reduce crime, ANPR cameras result in the real criminals investing in false number plates, etc. We would do well to remember the old saying that man should not be judged by machine, and to oppose any sort of machine-based summary "justice", lest we slide further into the hole we are digging for ourselves.
