
Chicago Developing 'Suspicious Behavior' Monitoring System

narramissic writes "Over the past few years, Chicago's Office of Emergency Management and Communications (OEMC) has been blanketing the city with a network of thousands of video cameras in an effort to remotely keep track of emergencies in real time. Now, with the help of IBM, the network is getting some smarts. IBM software will analyze the video and ultimately 'recognize suspicious behavior,' says OEMC spokesman Kevin Smith. 'The challenge is going to be teaching computers to recognize the suspicious behavior,' said Smith. 'Once this is done this will be a very impressive city in terms of public safety.'"

Comments Filter:
  • Re:London .. yes (Score:2, Informative)

    by jessiej ( 1019654 ) on Thursday September 27, 2007 @01:08PM (#20770749)
    Here's the discussion [slashdot.org] about that article. Plenty of opinions on both sides of the issue there.
  • by pclminion ( 145572 ) on Thursday September 27, 2007 @02:06PM (#20771523)

    Let's apply a little Bayesian reasoning, shall we?

    Given that system X identifies your behavior as suspicious, what is the probability that you are a terrorist? This probability is written P(T|S). This is what we want to find.

    Bayes' Rule: P(T|S) = P(S|T)*P(T)/P(S).

    P(S|T) is the probability that the system will identify you as suspicious, given that you are a terrorist. You can call this the system's "accuracy." Let's be generous and say the accuracy is 99.99% = 0.9999.

    P(T) is the probability that you are a terrorist. Let's say that this probability is one in a million: 0.000001.

    P(S) is the probability that the system thinks you are suspicious. There are two sources of suspicion: true positives, and false positives. The true positives are given by P(S|T)*P(T). The false positives are given by P(S|~T)*P(~T).

    Let's again be generous and say that the false positive rate P(S|~T) is only 0.1%, or 0.001.

    P(~T) is just 1-P(T) = 0.999999.

    So, let's substitute everything in:

    P(T|S) = P(S|T)*P(T) / (P(S|T)*P(T)+P(S|~T)*P(~T)) = 0.9999*0.000001 / (0.9999*0.000001+0.001*0.999999)

    What's that equal, everybody? 0.0009989, which is about 0.001, or 0.1%.

    What does it mean? Even with a system that has a true positive rate of 99.99% and a false positive rate of only 0.1%, the probability of a "suspicious person" actually being a terrorist is only 0.1%.

    In other words, these systems are inherently useless in identifying terrorists. This is because terrorists are inherently RARE in the population. The massive accuracy of the test cannot make up for this fact.
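
    (A quick check of that arithmetic, as a minimal Python sketch using the same numbers as above; nothing here is specific to IBM's system:)

        # Bayes' rule with the numbers from this comment
        p_t = 1e-6               # P(T): prior probability a random person is a terrorist
        p_s_given_t = 0.9999     # P(S|T): true positive rate ("accuracy")
        p_s_given_not_t = 0.001  # P(S|~T): false positive rate

        p_s = p_s_given_t * p_t + p_s_given_not_t * (1 - p_t)  # total probability of being flagged
        p_t_given_s = p_s_given_t * p_t / p_s

        print(f"P(T|S) = {p_t_given_s:.7f}")  # ~0.0009989, i.e. about 0.1%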

  • Re:Good or bad? (Score:3, Informative)

    by Kozar_The_Malignant ( 738483 ) on Thursday September 27, 2007 @03:22PM (#20772651)

    Now, there's still room for abuse (train the system to recognize likely politically unpopular groups and send police to intimidate, for example), but that doesn't automatically mean that there can't be ways of making this system useful without making it intrusive.
    In Chicago? With the Chicago police? Welcome to Earth, Stranger.
  • by erlehmann ( 1045500 ) on Thursday September 27, 2007 @04:09PM (#20773443)

    Let's look at some numbers. We'll be optimistic. We'll assume the system has a 1 in 100 false positive rate (99% accurate), and a 1 in 1,000 false negative rate (99.9% accurate).

    Assume one trillion possible indicators to sift through: that's about ten events -- e-mails, phone calls, purchases, web surfings, whatever -- per person in the U.S. per day. Also assume that 10 of them are actually terrorists plotting.

    This unrealistically-accurate system will generate one billion false alarms for every real terrorist plot it uncovers. Every day of every year, the police will have to investigate 27 million potential plots in order to find the one real terrorist plot per month. Raise that false-positive accuracy to an absurd 99.9999% and you're still chasing 2,750 false alarms per day -- but that will inevitably raise your false negatives, and you're going to miss some of those ten real plots.
    source: http://www.schneier.com/blog/archives/2006/03/data_mining_for.html [schneier.com]
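
    (For reference, the same back-of-the-envelope numbers in a short Python sketch; the event count and plot count are the assumptions quoted above, not data about any real system:)

        events_per_year = 1e12      # ~10 events per person per day across the U.S., per the quote
        real_plots = 10             # assumed genuine plots hidden among those events
        false_positive_rate = 0.01  # 1 in 100
        false_negative_rate = 0.001 # 1 in 1,000

        false_alarms = events_per_year * false_positive_rate   # 1e10 per year
        plots_caught = real_plots * (1 - false_negative_rate)  # ~10

        print(false_alarms / plots_caught)   # ~1e9 false alarms per real plot uncovered
        print(false_alarms / 365)            # ~27 million alerts to investigate per day
        print(events_per_year * 1e-6 / 365)  # ~2,740/day even at 99.9999% false-positive accuracy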
  • Re:Good or bad? (Score:3, Informative)

    by h4ck7h3p14n37 ( 926070 ) on Thursday September 27, 2007 @06:15PM (#20775293) Homepage

    Just so you know, many of the cameras are encased in bulletproof boxes and have acoustic sensors that detect gunshots; the camera automatically focuses on the source of the shot.

    The things transmit their video wirelessly and allow for remote control via a wireless link to police cruisers in addition to their wired link to the monitoring center downtown. I'd focus on jamming the signal or disabling the wired link.
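
    (Nothing public says how the acoustic localization actually works; purely as an illustration, a far-field two-microphone time-difference-of-arrival estimate, one common way to get a bearing to a gunshot, looks roughly like this in Python:)

        import math

        SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C

        def bearing_from_tdoa(dt_seconds, mic_spacing_m):
            """Far-field bearing (radians from broadside) of a sound source,
            estimated from the arrival-time difference between two microphones."""
            x = SPEED_OF_SOUND * dt_seconds / mic_spacing_m
            x = max(-1.0, min(1.0, x))  # clamp measurement/rounding noise
            return math.asin(x)

        # Example: the sound reaches the second mic 0.5 ms later on a 0.5 m baseline.
        print(math.degrees(bearing_from_tdoa(0.0005, 0.5)))  # ~20 degrees off broadside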
