Privacy Technology

Video Surveillance Tech Detects Abnormal Activity 189

Repton writes with news of a company, Behavioral Recognition Systems, that has received 16 patents on a new video surveillance application that can convert video images into machine-readable language, and then analyze them for anomalies that suggest suspicious behavior in the camera's field of view. The software can 'recognize' up to 300 objects and establish a baseline of activity. It should go on sale in September. "...the BRS Labs technology will likely create a fair number of false positives, [the CEO] concedes. 'We think a three-to-one ratio of alerts to actual events is what the market will accept,' he says. 'We could be wrong.'"
This discussion has been archived. No new comments can be posted.

  • Cool (Score:5, Interesting)

    by clang_jangle ( 975789 ) * on Tuesday July 29, 2008 @01:51AM (#24381333) Journal
    Great! Now, all they have to do is combine that with this [youtube.com], and we can all sleep soundly.
  • oh yeah? (Score:5, Insightful)

    by ILuvRamen ( 1026668 ) on Tuesday July 29, 2008 @01:57AM (#24381375)
    If I walk past a security camera in a full sized squirrel outfit, humans couldn't even figure that one out let alone a computer. These systems are just dumb. Wait until computers are smart enough to talk with us before developing something like this; otherwise it'll never be remotely accurate.
    • Re:oh yeah? (Score:5, Funny)

      by Cassius Corodes ( 1084513 ) on Tuesday July 29, 2008 @02:14AM (#24381453)
      Computers are already smart enough to talk to us. They just don't have anything interesting to say.

      Why don't you try starting a conversation?
    • by Sycraft-fu ( 314770 ) on Tuesday July 29, 2008 @02:26AM (#24381507)

      I'm sure everyone on Slashdot is donning their tinfoil hats and screaming Big Brother (I've already seen a couple of posts to that effect), but that really isn't the target market. You'll find that by far the biggest customers for CCTV equipment are private companies. Pretty much any large store will have an extensive CCTV system to watch for shoplifting.

      OK, well, the problem is that you have to have humans watching it for suspicious activity. It is completely infeasible to hire one human per camera, and the more cameras a given human has to watch, the less they catch. Something like this could help: if it sees something suspicious, it brings it up on a display for one of the security personnel, who then decides whether it is a problem or a false alarm.

      A moderate amount of false alarms is fine. This wouldn't be a case of "The system went off, arrest him!" It'd be a case of "The system went off, let's have a human watch and see what's going on." It would allow for better use of security personnel.

      Heck, I'd be interested in a system like this at work. We have CCTV on our computer labs. However we don't have anyone monitoring it. It's more for liability reasons, and so that if someone steals or damages a computer, we can hopefully help the police catch them. However prevention is better than clean up. So it'd be cool if when the system thought something was wrong, it'd notify staff and we could look. If everything was fine, we carry on as normal. If something is indeed happening, we call the police.

      You've got to stop with the idea that these sorts of things are designed to figure out what you are thinking for some evil government plan. They aren't. They are designed to help make security systems more effective.

      • by shawb ( 16347 ) on Tuesday July 29, 2008 @03:07AM (#24381717)
        The ratio of false positives really shouldn't be that much of an issue if the system is implemented properly.

        A bigger issue with a system like this would be false negatives. Economics being what it is, the organizations deploying these cameras would likely end up hiring fewer people to watch the monitors per camera (whether that means an increase in cameras or a decrease in staff). Therefore, the people watching the monitors would end up relying on the system to look for suspicious behavior. Then false negatives start to come into play: "suspicious behavior" that a human would notice and investigate may be missed by the system, and therefore go uninvestigated. This could cause escalating problems when people learn which behaviors trigger a "suspicious" flag and then go about their nefarious deeds where a human could have spotted them.

        Sure, it would be possible to institute an automated suspicious behavior system to augment existing systems, but in reality it would end up taking away from resources used for security. Even if the system would not reduce security levels, a system such as this would at least reduce the future investment in other proven security methods, such as an increase of competent staff to watch the monitors.
        • by Hognoxious ( 631665 ) on Tuesday July 29, 2008 @03:45AM (#24381927) Homepage Journal
          Unless you have enough guards that everybody is being watched at all times, you'd have false negatives anyway. The idea is not to remove/replace human surveillance but to target it more effectively by focusing on the more suspicious people.
          • I wonder if the camera's fuzzy logic is set to increase the confidence level if there's a lot of brown detected in the moving object.
        • by TapeCutter ( 624760 ) * on Tuesday July 29, 2008 @05:30AM (#24382425) Journal
          "A bigger issue with a system like this would be false negatives. Economics being what it is, the organizations deploying these cameras would likely end up hiring fewer people to watch the monitors per camera (whether that means an increase in cameras or a decrease in staff)."

          Stocktakes tell the store owner how much is being lost to theft. Economics being what it is, you can use this information to measure your security bang for buck and notice there is a point where diminishing returns makes eliminating the remaining false negatives a net loss. At this point your best option is to maintain the same level of risk aversion for less money. People who run large sets of cameras are the target market, so IMHO the false negatives will be expected/ignored by the buyer; they (rational but non-technical business buyers) will simply want to know what it all means for their bottom line.
      • Comment removed (Score:5, Insightful)

        by account_deleted ( 4530225 ) on Tuesday July 29, 2008 @03:40AM (#24381901)
        Comment removed based on user account deletion
        • Re: (Score:3, Funny)

          by Hognoxious ( 631665 )

          So this will not only mean "search person X, because the computer tells us he is a thief". It will mean they keep searching till they find something.

          I think your foil hat is too tight.

          • by jacks0n ( 112153 )

            Computers are magic to most users.

            They just believe the results they spit out uncritically.

            I've seen this behavior in groups you wouldn't believe could possibly be that dumb.

      • "You've got to stop with the idea that these sorts of things are designed to figure out what you are thinking for some evil government plan. They aren't. They are designed to help make security systems more effective."

        Consider this. For years we have been medicating and locking up people who think that governments, and companies, are watching them under the diagnosis of mental illness. Now you are accepting that such measures need to become a reality.
      • by RMH101 ( 636144 ) on Tuesday July 29, 2008 @04:33AM (#24382149)
        The problem is that this tech leads to this: http://www.pcworld.com/businesscenter/article/148370/cctv_camera_identifies_people_by_race.html [pcworld.com] - a proof of concept art piece that identifies people's race via CCTV images. The creator's done this precisely to point out to people the logical end point of this tech.
      • A moderate amount of false alarms is fine. This wouldn't be a case of "The system went off, arrest him!" It'd be a case of "The system went off, let's have a human watch and see what's going on." It would allow for better use of security personnel.

        Exactly. At the moment a lot of CCTV systems have video switches that are triggered by PIR sensors in the area covered by the camera - what BRS are proposing sounds like a smarter way of doing this.

        I don't want to have to watch my CCTV monitors all the time. I'

      • They are designed to help make security systems more effective.

        Did you ever think that maybe security systems are already "effective" enough?

      • by kabocox ( 199019 )

        Heck, I'd be interested in a system like this at work. We have CCTV on our computer labs. However we don't have anyone monitoring it. It's more for liability reasons, and so that if someone steals or damages a computer, we can hopefully help the police catch them. However prevention is better than clean up. So it'd be cool if when the system thought something was wrong, it'd notify staff and we could look. If everything was fine, we carry on as normal. If something is indeed happening, we call the police.

        Th

      • "You've got to stop with the idea that these sorts of things are designed to figure out what you are thinking for some evil government plan. They aren't. They are designed to help make security systems more effective."

        What a system is designed for and what a system is ultimately used for are two very different things. The road to hell is often paved with good intentions.
    • If I walk past a security camera in a full sized squirrel outfit, humans couldn't even figure that one out let alone a computer.

      Imagine deploying this system at a furry convention.

      • by ya really ( 1257084 ) on Tuesday July 29, 2008 @03:18AM (#24381791)

        If I walk past a security camera in a full sized squirrel outfit, humans couldn't even figure that one out let alone a computer.

        Imagine deploying this system at a furry convention.

        I'd guess the cameras would most likely be programmed to identify the subject(s) as mentally disturbed, but most likely harmless.

      • by nizo ( 81281 ) *

        If this was plugged into nearby billboards (like in Minority Report), wearing the costume through the metro would be worth it just to see people freak out over the ads for furry porn.

  • by frovingslosh ( 582462 ) on Tuesday July 29, 2008 @02:00AM (#24381387)
    As long as the cops don't beat too many people too extremely for false positive behavior I can't see where this could be a problem. And Homeland Security is already working on getting some Executive Orders written up that will make it a crime to act in ways that cause false positives, so there should be no false positives in the near future (by definition they will be real positives). Problem solved.
  • I hope they don't bet the company on these patents since, as was discussed on /. less than 24 hours ago, software patents involving digital signals may be invalid. (The field is generally called Digital Signal Processing.)

  • false positives (Score:5, Insightful)

    by zobier ( 585066 ) <zobier@zobieLAPLACEr.net minus math_god> on Tuesday July 29, 2008 @02:04AM (#24381403)
    While it's a worry how 1984 society is becoming, I don't think false positives are a particularly bad issue with this technology. An operator who would normally be supposed to watch multiple streams of video for anomalous activity can use these more like bookmarks for subsequent human verification. The bigger issue as I see it is that 16 new patents were just granted on software/algorithms.
    • grenade in mouth (Score:5, Informative)

      by globaljustin ( 574257 ) on Tuesday July 29, 2008 @02:24AM (#24381493) Journal

      I swear you must be a troll from Homeland Security...

      While it's a worry how 1984 society is becoming, I don't think false positives are a particularly bad issue with this technology

      That's like saying "Oh sure, it is worrisome that I have a live hand grenade with the pin pulled jammed in my mouth, but I don't think it would be extremely bad if it just blew off one of my pinky toes"

      This kind of technology makes me want to riot... ahem... I mean... to exercise my 1st Amendment right to protest in a law-abiding way.

      I'm sickened. The CEO says: "We think a three-to-one ratio of alerts to actual events is what the market will accept."

      • by moteyalpha ( 1228680 ) * on Tuesday July 29, 2008 @02:38AM (#24381563) Homepage Journal
        I wonder if this software would detect a positive for suspicious activity if it observed a lobbyist for the firm entering the office of a congressman with a hooker and a briefcase. It would seem that it would select for people who fly into the radar and as a result the actual complex threats would be ignored.
      • Re: (Score:3, Interesting)

        by zobier ( 585066 )

        I just don't think this is that big of a deal. I do think it will be extremely easy to game, e.g. flood it with FPs like the congestion charge cameras in the UK. It will be interesting to see what games people come up with.

      • I'm not really sure what you're upset about. That a security guard at a camera has to look at an occasional false positive the computer recorded? What is your alternative? Someone staring at the camera 24 hours a day? No security?
        • false dichotomy..."either we have behavior monitoring cameras or we have nothing"

          C'mon, man... depending on where you're talking about, it can be a host of strategies. Strategies that work. Look, 9/11 could have EASILY been prevented if screeners had checked the damn No-fly list and detained the guys with FUCKING UTILITY KNIVES for some background checks and questioning.

          shoe bomber...hmmm guy with one way ticket and no luggage shabbily dressed and unkempt...detain for questioning...don't need a damn bullshit 'behavi

  • I thought what I'd do was, I'd pretend I was one of those deaf-mutes.
  • by Channard ( 693317 ) on Tuesday July 29, 2008 @02:10AM (#24381435) Journal
    ... when it first gets deployed - if it gets deployed - spread the word across the internet and get people to regularly silly walk past it and do other weird but non-threatening stuff. Hey presto, so many false positives it's rendered useless.
  • by Anonymous Coward on Tuesday July 29, 2008 @02:22AM (#24381481)

    1. File a bunch of mumbo jumbo patents on video surveillance

    2. Convince some trade rag to interview your CEO

    3. Submit the story to /. as a clear and present danger to "Your Rights Online"

    4. ?

    5. Go public!

  • I just was just viewing this [imdb.com] right before logging onto slashdot and seeing this topic.
  • As noted in TFA, if the false positive ratio can be reduced to even 10-to-1, this technology might rapidly become the best friend of the fellow who has to constantly scan 100 surveillance screens for unusual activity.

    But this system's definition of "unusual activity" intrigues me. If one of these toys is set up for example in a bank to monitor a vault door and a bank guard passes by the door every hour on his rounds, the software would presumably record that as "normal" activity. What is the "unusual" element that would prod the AI into sending an alert if a thief did exactly the same thing? What dynamic does the system employ to determine if a bank guard is a legitimate bank guard or Willie Sutton? The time it happened? Facial recognition? The fact that the "bank guard" pulled a cutting torch or dynamite out of his backpack and started going to town on the vault door?

    • But this system's definition of "unusual activity" intrigues me. If one of these toys is set up for example in a bank to monitor a vault door and a bank guard passes by the door every hour on his rounds, the software would presumably record that as "normal" activity. What is the "unusual" element that would prod the AI into sending an alert if a thief did exactly the same thing? What dynamic does the system employ to determine if a bank guard is a legitimate bank guard or Willie Sutton? The time it happened? Facial recognition? The fact that the "bank guard" pulled a cutting torch or dynamite out of his backpack and started going to town on the vault door?

      Maybe it flags people who hide their face from the camera.

    • From TFA: $1,500 to $4,500 per camera.
      100 screens, assume each one rotates thru, um, 5 cameras, that's 500 cameras, say $3000 per, 1.5 million bucks. You could actually HIRE HUMANS, say 20 of'em, at $75K each to watch 5 screens each, and have a 1:1 ratio of accuracy.
      • Re: (Score:3, Interesting)

        by John Hasler ( 414242 )

        > You could actually HIRE HUMANS, say 20 of'em, at $75K each to watch 5 screens each, and
        > have a 1:1 ratio of accuracy.

        Never pulled guard duty, did you? You won't get a "1:1 ratio of accuracy" even if you hire someone for each individual camera.

        • Never pulled guard duty, did you?
          Actually, yes I have. You'd be surprised how "abnormal behavior" fails to appear when a guard, even unarmed, walks by every few hours.
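As an editorial aside, the back-of-the-envelope cost comparison in this sub-thread is easy to check. All figures below are the commenter's hypothetical assumptions (100 screens, 5 cameras per screen, a $3,000 midpoint of TFA's per-camera price range, 20 guards at $75K), not vendor pricing:

```python
# Hypothetical figures from the comment above, not vendor pricing.
SCREENS = 100
CAMERAS_PER_SCREEN = 5       # each monitor rotates through 5 cameras
COST_PER_CAMERA = 3_000      # midpoint of TFA's $1,500-$4,500 range
GUARDS = 20
SALARY = 75_000              # per guard, per year

cameras = SCREENS * CAMERAS_PER_SCREEN
software_cost = cameras * COST_PER_CAMERA   # one-time outlay
staffing_cost_per_year = GUARDS * SALARY    # recurring, every year

print(f"software (one-time): ${software_cost:,}")         # $1,500,000
print(f"staffing (annual):   ${staffing_cost_per_year:,}")  # $1,500,000
```

The two totals only look equal at a glance: the software figure is a one-time cost, while the salaries recur every year. That, plus the guard-duty accuracy point made in the reply, is the obvious counter to the "just hire humans" argument.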
  • Does it detect ghosts? [youtube.com]

  • UK citizens are safe in the knowledge that they're being watched [wired.com]. That's it citizen, keep moving. Nothing to see here.

    Ignoring the Orwellian references, could this be used for some good?
    • Re: (Score:2, Insightful)

      by rugatero ( 1292060 )

      UK citizens are safe in the knowledge that they're being watched [wired.com].

      That poster is no one-off either. After it became fashionable to compare everything to 1984, the government agencies realised they could play on people's paranoia. Hence the 'watchful eyes' poster, the 'Targeting Benefit Fraud' campaign, and the warning to car tax evaders ("You can't escape the DVLA computer").

  • by Tablizer ( 95088 ) on Tuesday July 29, 2008 @02:40AM (#24381581) Journal

    Abnormal activity? You mean like a slashdotter outside, in the sun, with a date?
             

    • Re: (Score:3, Funny)

      by lena_10326 ( 1100441 )
      One must not let the slashdotter near bright light, especially sunlight, which can kill the slashdotter; one must not get water on the slashdotter; and, most importantly, one must never feed it after midnight.
    • I hope it validates the inputs to the abnormality calculation function or it'll get a divide-by-zero error and crash when it sees that.

  • Sure, the market might accept a three-to-one failure rate. But what about the 3 guys who get locked up in a DHS holding cell, get waterboarded and then released without any means of setting the record straight just because they were lost in a shopping mall with a cakebox in their hands?
    • Re:Acceptance (Score:4, Insightful)

      by oodaloop ( 1229816 ) on Tuesday July 29, 2008 @03:18AM (#24381789)
      Jebus, what imaginations you people have. Who the hell is talking about arresting the false positives? There's a HUMAN IN THE LOOP. Someone is watching the alerts, which means innocent behavior gets discarded before the police show up. Fucking relax.
      • by Aladrin ( 926209 )

        There wasn't even a mention of police. There was a -former- SS guy, who is not an 'IT consultant' that made 1 statement.

        Someone else said it earlier: The main customer for this is private businesses, not the government.

        And if we were going to protest private businesses monitoring us, that milestone was passed long ago. If you don't want to be monitored by a computer, don't shop at a retail store. Period.

        Personally, I'm just not worried about it. I don't steal, and if some store accused me of it, I'd ha

  • by Fluffeh ( 1273756 ) on Tuesday July 29, 2008 @02:44AM (#24381617)
    Machine: Uh oh, Jeremy stopped sticking his finger into his nose. I better call this one in!
    Operator: Ummm... Why has it shown me Jeremy just sitting there?
    Machine: Nope, there he goes again, digging away at his nose, everything back to normal. Better stop transmitting.
    ...
    Machine: Whoa, he stopped digging for boogers again! Better show the boss!!
    Operator: Why does this dumbass machine keep showing me Jeremy just sitting there for goodness sake...
    Machine: Boss! Boss! Come on, look! DIFFERENT! ABNORMAL!
    Operator: *Hmmm what's for lunch...*
  • Anyone else read that as "Video Surveillance TechNICIAN Detects Abnormal Activity"? I was confused for a bit.

  • by misterhypno ( 978442 ) on Tuesday July 29, 2008 @03:13AM (#24381763)

    If that's what they're shooting for, then I have a name for the system:

    Cry Wolf!

    Because, that's all it's really going to do!

    Heaven help any street performer that gets caught by this video Frankenstein's monster, because the cops will, in some jurisdictions, come in blasting away, and a mime is a terrible thing to waste!

    • Re: (Score:3, Insightful)

      by oodaloop ( 1229816 )
      The ratio is 3-1 alerts to false alarms. That is, 4 events, three of which are real and one a false positive. Given the real events for which security personnel are going to be looking (mugging, rape, vandalism, etc.), it would take quite a while before 4 total events come to pass and only one of those was a false alarm. I would not call that crying wolf.
      • The ratio is 3-1 alerts to false alarms. That is, 4 events, three of which are real and one a false positive. Given the real events for which security personnel are going to be looking (mugging, rape, vandalism, etc.), it would take quite a while before 4 total events come to pass and only one of those was a false alarm. I would not call that crying wolf.

        This has to depend on how often genuine events occur. For example, if they never occur, every alert will be a false positive. Obviously it wouldn't be deployed w

      • The ratio is 3-1 alerts to false alarms

        From the article:

        We think a three-to-one ratio of alerts to actual events is what the market will accept, he says.

        I read that as three alerts, only one of which is an "actual event". It could still be useful if applied properly, of course.
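Whichever reading of the CEO's quote is right, how useful a fixed alert ratio is depends heavily on the base rate of genuine events, as the parallel comment about genuine events "never occurring" points out. A minimal sketch, assuming a constant nuisance-alert rate per camera (the 2.0/day figure is invented for illustration, not from TFA):

```python
# Assumed illustrative constant: nuisance alerts per camera per day.
FALSE_ALERTS_PER_DAY = 2.0

def precision(true_events_per_day: float) -> float:
    """Fraction of alerts that correspond to a genuine event."""
    alerts = true_events_per_day + FALSE_ALERTS_PER_DAY
    return true_events_per_day / alerts if alerts else 0.0

print(precision(1.0))  # ~0.33: matches the CEO's "three-to-one" figure
print(precision(0.1))  # ~0.05: at a quiet site, almost every alert is false
print(precision(0.0))  # 0.0: with no genuine events, every alert is false
```

The same system therefore produces mostly false alarms wherever real incidents are rare, which is exactly where much surveillance gear ends up installed.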

  • by Potatomasher ( 798018 ) on Tuesday July 29, 2008 @04:22AM (#24382103)
    I call shenanigans on this.

    There is no way they can recognize 300 objects in real-world conditions. I work in machine learning (academics), and the current record for generic object recognition sits at around 54-57% on the Caltech 101 database (which contains images of 101 different objects). So basically the algorithms of the best and brightest minds in academia (LeCun, Poggio, Lowe, etc.) get it wrong half the time!

    If any government officials are listening... Please don't waste our tax money on this!
    • Consider the 1970s-era F-4 Phantom II jet fighter. The leading edge of the wing required an extender to help it pull out of dives. The outboard wingtips required an upward angle to stop flat spins. During development, the fuselage design was expanded to accommodate the fuel tanks the beast would need, and provision was made for up to three under-wing external fuel tanks. The nose had to be extended to make room for the combat radar defined in the original requirements.
      And it had to fly with 4 degree no
    • Perhaps this is less generic though. I'm sure differentiation of say a motorbike, a car, a human, and a tree is a lot easier than a ball, a plate, and another generally round object.
    • by jez9999 ( 618189 )

      If any government officials are listening...

      They're not.

  • Great so now we'll all have to watch our basic movements and how close together we have certain objects. The future of freedom is looking so wonderful. Sigh....
  • by jibjibjib ( 889679 ) on Tuesday July 29, 2008 @05:23AM (#24382397) Journal

    To the fool who tagged this 'unusualisnotabnormal':

    You're wrong. Unusual and abnormal mean essentially the same thing - something out of the ordinary, something not routine.

    If the point you were trying to make is that authorities shouldn't be suspicious of every unusual occurrence, then perhaps something like 'unusualisnotwrong' would have better served your purpose.

    • by Tim C ( 15259 )

      And that sort of thing is part of the reason why I disabled tags in my preferences - for every one good, useful tag there are half a dozen or more stupid, useless or just plain wrong ones.

    • No, he's right.

      It's unusual, but sometimes geysers spray water in the air.

      It's abnormal that Old Faithful no longer sprays at one-hour intervals.

      It's unusual, but normal for February to have a number of days other than 28.

  • I walk with a limp. Does that constitute abnormal behavior according to this software?
  • Comment removed based on user account deletion
  • ...once it becomes known what sets it off.

  • by Assmasher ( 456699 ) on Tuesday July 29, 2008 @07:49AM (#24383261) Journal

    ...wide area surveillance system (which included video analytics for loitering, wrong way traffic in a crowd, crowd panic analysis, smoke/fire detection by video, et cetera) I can point out that there are MANY companies, corporate research groups, and universities that have been doing things like this for several years (in some cases almost 10 years.)

    This company is in for a rude awakening when they realize that (a) their price per camera is extraordinarily high (this one metric is the biggest decider in large installation proposals [along with whether or not you have to mount the cameras on poles or just hallways/buildings]) and (b) a false alert rate of 3 to 1 is TOTALLY unacceptable. The entire purpose of video analytics in a security environment is to reduce the workload on the monitoring staff (and hopefully put more of them out into the field) while being able to scale up your coverage. I assure you that a 3 to 1 false alert rate will result in zero customers in a year. Measuring the false alert rate is also highly subjective. Companies tend to use a given scenario repeatedly to measure their results when, of course, this has little to no bearing on reality. Things like the weather (moving shadows affect certain algorithms even when accounted for algorithmically, headlights, flashlights, camera flashes change things, wind, rain, snow, bugs, everything you can imagine, lol...) negate all of these measurements.

    It is nice to see new blood in this space, but I hope they were smart enough to make their software offerings totally distinct from their hardware (many companies do not) so that they can integrate with other systems without too much work. That's the best way to make money in the video analytics market right now. The big boys (like SIEMENS) got into the game about 3 years ago and they'll squeeze you out every time unless you can offer something that helps them land a big deal.

  • Notice that the spokesperson says "We think a three-to-one ratio of alerts to actual events is what the market will accept."

    What he doesn't say is what their actual false positive rate in testing was... or any reason for believing the false positive rate will, in fact, be anywhere near that low.

    I've come to be very leery of that sort of hypothetical statement... ones that lead you to think something has been said that hasn't been said. He's saying that 3:1 is a plausible goal, but he's not saying they've ac
