Privacy Technology

Video Surveillance Tech Detects Abnormal Activity

Repton writes with news of a company, Behavioral Recognition Systems, that has received 16 patents on a new video surveillance application that can convert video images into machine-readable language, and then analyze them for anomalies that suggest suspicious behavior in the camera's field of view. The software can 'recognize' up to 300 objects and establish a baseline of activity. It should go on sale in September. "...the BRS Labs technology will likely create a fair number of false positives, [the CEO] concedes. 'We think a three-to-one ratio of alerts to actual events is what the market will accept,' he says. 'We could be wrong.'"
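To put the CEO's "three-to-one ratio of alerts to actual events" in concrete terms: it implies only one in three alerts would be genuine. A quick back-of-the-envelope check (the incident count is purely illustrative, not from BRS Labs):

```python
# What a 3:1 alert-to-event ratio means in practice.
# "real_events" is a hypothetical figure for illustration only.

alerts_per_event = 3           # CEO's stated acceptable ratio
real_events = 100              # hypothetical real incidents in some period

total_alerts = real_events * alerts_per_event
false_alarms = total_alerts - real_events

precision = real_events / total_alerts   # fraction of alerts that are genuine
print(f"alerts: {total_alerts}, false alarms: {false_alarms}, "
      f"precision: {precision:.0%}")
```

So at the stated ratio, security staff would review two false alarms for every real incident.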
  • oh yeah? (Score:5, Insightful)

    by ILuvRamen ( 1026668 ) on Tuesday July 29, 2008 @02:57AM (#24381375)
If I walked past a security camera in a full-sized squirrel outfit, a human couldn't even figure that one out, let alone a computer. These systems are just dumb. Wait until computers are smart enough to talk with us before developing something like this; otherwise it'll never be remotely accurate.
  • false positives (Score:5, Insightful)

    by zobier ( 585066 ) <zobier AT zobier DOT net> on Tuesday July 29, 2008 @03:04AM (#24381403)
While it's worrying how 1984-like society is becoming, I don't think false positives are a particularly bad issue with this technology. An operator who would normally have to watch multiple video streams for anomalous activity can instead use these alerts more like bookmarks for subsequent human verification. The bigger issue, as I see it, is that 16 new patents were just granted on software/algorithms.
  • Re:Cool (Score:3, Insightful)

    by roguetrick ( 1147853 ) <kazer@brIIIigands.org minus threevowels> on Tuesday July 29, 2008 @03:18AM (#24381465) Homepage Journal

    Abnormal activity for a machine like that would just be some guy wearing blue.

    Need a sentry here!

  • by Anonymous Coward on Tuesday July 29, 2008 @03:22AM (#24381481)

    1. File a bunch of mumbo jumbo patents on video surveillance

    2. Convince some trade rag to interview your CEO

    3. Submit the story to /. as a clear and present danger to "Your Rights Online"

    4. ?

    5. Go public!

  • by Sycraft-fu ( 314770 ) on Tuesday July 29, 2008 @03:26AM (#24381507)

I'm sure everyone on Slashdot is donning their tinfoil hats and screaming Big Brother (I've already seen a couple of posts to that effect), but that really isn't the target market. You'll find that by far the biggest customers for CCTV equipment are private companies. Pretty much any large store will have an extensive CCTV system to watch for shoplifting.

    Ok well the problem is that you have to have humans watching it for suspicious activity. It is completely infeasible to hire one human per camera, and the more cameras a given human has to watch, the less they catch. Well, something like this could help. If it sees something suspicious, it brings it up on a display to one of the security personnel. The person then decides if it is a problem, or a false alarm.

    A moderate amount of false alarms is fine. This wouldn't be a case of "The system went off, arrest him!" It'd be a case of "The system went off, let's have a human watch and see what's going on." It would allow for better use of security personnel.

    Heck, I'd be interested in a system like this at work. We have CCTV on our computer labs. However we don't have anyone monitoring it. It's more for liability reasons, and so that if someone steals or damages a computer, we can hopefully help the police catch them. However prevention is better than clean up. So it'd be cool if when the system thought something was wrong, it'd notify staff and we could look. If everything was fine, we carry on as normal. If something is indeed happening, we call the police.

    You've got to stop with the idea that these sort of things are designed to figure out what you are thinking for some evil government plan. They aren't. They are designed to help make security systems more effective.

  • by moteyalpha ( 1228680 ) * on Tuesday July 29, 2008 @03:38AM (#24381563) Homepage Journal
I wonder if this software would flag suspicious activity if it observed a lobbyist for the firm entering a congressman's office with a hooker and a briefcase. It seems it would select for the people who show up on the radar, and as a result the actual complex threats would be ignored.
  • Re:Acceptance (Score:4, Insightful)

    by oodaloop ( 1229816 ) on Tuesday July 29, 2008 @04:18AM (#24381789)
    Jebus, what imaginations you people have. Who the hell is talking about arresting the false positives? There's a HUMAN IN THE LOOP. Someone is watching the alerts, which means innocent behavior gets discarded before the police show up. Fucking relax.
  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Tuesday July 29, 2008 @04:40AM (#24381901)
    Comment removed based on user account deletion
  • by oodaloop ( 1229816 ) on Tuesday July 29, 2008 @04:42AM (#24381913)
The ratio is 3:1, alerts to false alarms. That is, four events, three of which are real and one a false positive. Given the real events security personnel are going to be looking for (mugging, rape, vandalism, etc.), it would be quite a while before four total events came to pass with only one of them being a false alarm. I would not call that crying wolf.
  • by Hognoxious ( 631665 ) on Tuesday July 29, 2008 @04:45AM (#24381927) Homepage Journal
    Unless you have enough guards that everybody is being watched at all times, you'd have false negatives anyway. The idea is not to remove/replace human surveillance but to target it more effectively by focusing on the more suspicious people.
  • Re:Don't fret (Score:2, Insightful)

    by rugatero ( 1292060 ) on Tuesday July 29, 2008 @04:58AM (#24381995)

    UK citizens are safe in the knowledge that they're being watched [wired.com].

That poster is no one-off, either. After it became fashionable to compare everything to 1984, government agencies realised they could play on people's paranoia. Hence the 'watchful eyes' poster, the campaign targeting benefit fraud, and the warning to car tax evaders ("You can't escape the DVLA computer").

  • by RMH101 ( 636144 ) on Tuesday July 29, 2008 @05:33AM (#24382149)
The problem is that this tech leads to this: http://www.pcworld.com/businesscenter/article/148370/cctv_camera_identifies_people_by_race.html [pcworld.com] - a proof-of-concept art piece that identifies people's race via CCTV images. The creator did this precisely to point out the logical end point of this tech.
  • by jibjibjib ( 889679 ) on Tuesday July 29, 2008 @06:23AM (#24382397) Journal

    To the fool who tagged this 'unusualisnotabnormal':

    You're wrong. Unusual and abnormal mean essentially the same thing - something out of the ordinary, something not routine.

    If the point you were trying to make is that authorities shouldn't be suspicious of every unusual occurrence, then perhaps something like 'unusualisnotwrong' would have better served your purpose.

  • by JetScootr ( 319545 ) on Tuesday July 29, 2008 @06:23AM (#24382399) Journal
    From TFA: $1,500 to $4,500 per camera.
100 screens; assume each one rotates through, um, 5 cameras. That's 500 cameras at, say, $3,000 per: 1.5 million bucks. You could actually HIRE HUMANS, say 20 of 'em at $75K each watching 5 screens apiece, and have a 1:1 ratio of accuracy.
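The parent's arithmetic checks out, at least on its own assumptions (all figures are the comment's own, with $3,000 taken as a midpoint of TFA's $1,500-$4,500 per-camera range):

```python
# Cost comparison using the parent comment's figures (all hypothetical).

screens = 100
cameras_per_screen = 5
cameras = screens * cameras_per_screen      # 500 cameras total
software_cost = cameras * 3_000             # $3,000 per camera (midpoint of TFA range)

guards = 20
salary = 75_000                             # per guard, per year
screens_per_guard = screens // guards       # 5 screens each
human_cost_per_year = guards * salary

print(f"software: ${software_cost:,}  humans: ${human_cost_per_year:,}/yr")
```

Both come out to $1.5 million, though note the software fee is presumably a one-time purchase while salaries recur every year.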
  • by Anonymous Coward on Tuesday July 29, 2008 @07:09AM (#24382589)

    The ratio of false positives really shouldn't be that much of an issue if the system is implemented properly.

This is a very difficult AI problem, so it takes a lot more than just software engineering to get right. The computer vision field has advanced enough that someone can now attempt something like this. Five years ago, vision technology wasn't far enough along to do as much as they're doing. In five years, the research will make a much better solution possible.

(quoting another poster) A bigger issue with a system like this would be false negatives. Economics being what it is, this means that the organizations deploying these cameras would likely end up hiring fewer people to watch the monitors per camera (whether that means an increase in cameras or a decrease in staff).

    Quite the opposite, most likely. As another poster pointed out, most places have *nobody* watching the cameras because it's way too expensive.

    Places that do have cameras often have a single person watching, so in this case you're probably right that it might allow a company to add more cameras.

    (from below) So this will not only mean "search person X, because the computer tells us he is a thief". It will mean they keep searching till they find something. Can you think of places they might want to look?

    Actually, it's not likely to increase that kind of thing too much: the chances of lawsuits are too high. Management cares enough about that to put the extra effort into not overdoing it.

  • by FlyingBishop ( 1293238 ) on Tuesday July 29, 2008 @08:16AM (#24382955)
    In order for that to be suspicious, it would have to be an irregular event.
  • Re:Cool (Score:2, Insightful)

    by bdenton42 ( 1313735 ) on Tuesday July 29, 2008 @09:13AM (#24383499)

    Great! Now, all they have to do is combine that with this [youtube.com], and we can all sleep soundly.

    With one of those I'd be a little afraid of that "three-to-one ratio of alerts to actual events"...
