
Tampa Police Give Up On Face Recognition Cameras 267

Saint Aardvark writes "The city of Tampa has given up on their face-recognition system attached to street surveillance cameras."
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Doesn't work (Score:5, Insightful)

    by JohnGrahamCumming ( 684871 ) * <slashdot.jgc@org> on Thursday August 21, 2003 @07:40AM (#6753442) Homepage Journal
    If you ignore the privacy worries for a minute the most interesting thing
    in this story is that the system didn't work. It didn't work in Tampa,
    it didn't work in Pinellas County and it isn't working in Virginia Beach.

    So you've got a dud system that's wasting police time. In Tampa they had
    a full time officer using the system who could have been out on the streets
    in the community that he is trying to protect understanding and interacting
    with that community. If you talk to police officers, reporters, or social
    workers, I think you'd find that they place a high value on local knowledge in
    doing their jobs, not on all-seeing, all-knowing eyes in the sky.

    John.
    • Re:Doesn't work (Score:5, Insightful)

      by Azghoul ( 25786 ) on Thursday August 21, 2003 @07:43AM (#6753469) Homepage
      And that's about all that has to be said about the project, though we'll get plenty of people complaining about the privacy concerns.

      What's more interesting to me than the fact that it doesn't work is that the guys interviewed (policemen, IIRC) didn't know WHY it didn't work.

      And they didn't waste entirely too much money; the company offered it as a free trial. I wouldn't want to be working for that company any more, though. :)
      • Re:Doesn't work (Score:5, Insightful)

        by 0123456 ( 636235 ) on Thursday August 21, 2003 @07:48AM (#6753507)
        "I wouldn't want to be working for that company any more though. :)"

        Why? Just because it doesn't work, that doesn't mean they can't get the government to mandate its installation in all public places to catch "terrorists". That's the great thing about government contracts: it's not whether it works, it's who you know with their face in the pork trough...
        • Re:Doesn't work (Score:3, Insightful)

          by Azghoul ( 25786 )
          Well, that's particularly cynical. I like to believe that at least in some circumstances a failed product will NOT be "mandated" by any agency.

          The Tampa police already gave them a shot, and even though they apparently liked the company originally, that wasn't enough to save them. So I'd like to think that the company will be having trouble making future sales, at least until they figure out how to make it actually work...

          And so, given that times would likely be tight in the company, I wouldn't want to w
          • Re:Doesn't work (Score:5, Insightful)

            by 0123456 ( 636235 ) on Thursday August 21, 2003 @08:05AM (#6753650)
            "Well, that's particularly cynical."

            No, it's realistic. Just look at something like the Osprey, which the US military didn't want, which doesn't really work, and which has killed quite a few people in crashes, but Congress kept forcing funding onto the military because it kept the pork going to their mates.

            If they know the right people they will get the contracts whether or not it works: there's a huge amount of pork available for "anti-terror" projects at the moment, so they merely need to grease the right palms to get their share. Not working is irrelevant when politicos are involved.
            • Re:Doesn't work (Score:3, Informative)

              by autocracy ( 192714 )
              The Osprey's errors were simply well reported. Nearly every new plane design has to go through the same mess - new planes are dangerous. When you get a pilot's license for anything but basic flight, you are type-certified and do training either in a simulator or with a pre-certified pilot who knows the plane. New plane == no experienced people; especially something like the Osprey, which was such a major change from typical designs. None of this means that the plane didn't work and didn't have many successful fl
            • by kikta ( 200092 ) on Thursday August 21, 2003 @09:16AM (#6754226)
              The Osprey's shortcomings were over-publicized for the most part. Many new aircraft have significant problems, especially one so radically different as the Osprey. The fact that some officers decided to try and cover up the problems didn't help public perception, but that's just what it was - perception, not fact.

              The Marine Corps needs a new medium-lift helo. The CH-46 Sea Knight is entirely too old. Have you ever ridden in one? I have, and believe me when I tell you that we don't call them "Flying Coffins" because we thought the name sounded pretty.

              The Osprey isn't perfect, but it's an example of a system that can be great if given the chance.
            • Re:Doesn't work (Score:4, Informative)

              by Anonymous Coward on Thursday August 21, 2003 @09:29AM (#6754329)
              Although your point is valid, the Osprey (V-22) is a bad example. The V-22 is actually a very well-put-together aircraft and has been performing very well. All the test pilots seem to love flying it, and all of them have told me that the plane transitions from hover to flight seamlessly. The vehicle flies about twice as fast as a conventional rotary aircraft and can lift a fair amount more cargo. The USMC is slated to purchase quite a few to replace the aging (30-40 year old) CH-46 fleet, and it is sorely needed.

              As far as it being forced on the military, that's a line of crap. The Marines (to write about what I know) are very much excited about getting their grubbies on the plane; however, as usual with transitions, there are a few old horses who feel that the CH-46 is fine: 'it's not broke so don't fix it'.

              The crashes you refer to numbered three, and they all occurred early in the V-22's test cycle. Although not official, some opinions are that the crashes were caused by pilots trying to hotshot a little too much in the planes.
            • Although it's true that the DoD attempted to kill the plane, the Marines worked their friends in Congress to override their own "bosses."

              Are you aware that the helicopters being replaced, primarily CH-46 Sea Knights, were so horrid initially that they earned the nickname, "Boeing Body Bag?"
        • That's the great thing about government contracts: it's not whether it works, it's who you know with their face in the pork trough...

          Roger that. Lie detectors don't work, have been scientifically shown not to work since sometime around 1616 CE, and yet the USG continues to use them as a condition of employment in many areas. Moronic.

          I'm just happy that in this case a law enforcement agency actually stopped doing something because it didn't work. That doesn't always happen.

            That's not true. Yes, it's fairly easy to beat a lie detector test, if you've been trained how to do it.

            But most people haven't been trained to beat the test. So it's a very effective way to tell if a person is reliable or not.
            • Umm, no.

              The Polygraph and Lie Detection [nas.edu] Juicy quote: "Polygraph testing now rests on weak scientific underpinnings despite nearly a century of study, the committee said. And much of the available evidence for judging its validity lacks scientific rigor."

              There's never been a study that conclusively shows that lie detectors work. Never.

        • by Anonymous Coward
          Don't forget the other point you need to make. The govt dept spending the money NEEDS to spend the money on something. Whether it works or not is beside the point.

          Why, you ask, do they NEED to spend the money? Simple: if they don't spend their entire budget (on stupid crap or not), they will get cut the next budget cycle. However, if they run out of money early, then they're rewarded the next budget cycle with a larger budget.

          Wouldn't it be nice if our jobs worked that way. "Yes, I spent my entire month's bud
      • Re:Doesn't work (Score:5, Insightful)

        by esper ( 11644 ) on Thursday August 21, 2003 @12:00PM (#6756067) Homepage
        that's about all that has to be said about the project, though we'll get plenty of people complaining about the privacy concerns.

        As well we should, for two reasons:
        1. The cameras are still in place (and, I believe, in operation). I prefer not to be on camera without my consent, even if there isn't a computer trying to match my face against a criminal database.

        2. If the only reason they retired it was because it doesn't work, then they're likely to try it again once the technology has gotten a bit better. It's not dead for good.
    • Re:Doesn't work (Score:2, Insightful)

      I personally think that having that officer patrolling that area would be a more valuable use of his time anyway. Just the presence of an officer can help to keep the crime down a bit. It also doesn't worry people with the privacy concern. And how about the response time? If the officer is right there, then the response time is cut down to almost zero.

      However, if it was one officer watching more than one camera, then you run into a problem, because now it'll take more than one officer to do the job. B
    • Re:Doesn't work (Score:5, Informative)

      by Znork ( 31774 ) on Thursday August 21, 2003 @08:15AM (#6753730)
      Not even the fact that it doesn't work is really interesting; the fact that face recognition technology used in this way is, and always will be, worthless was already known.

      Face recognition is useful when comparing small groups against a large database, or a large group against a small database since you can trim the fuzzy factors to get more false positives or more false negatives. For example, if you want to find the identity of a certain suspect in a large database you can have it spit out 10 suggestions of who it could be and eliminate the false positives manually. Or if you use it for access control you can trim it to reject as much as possible, as someone going through an access control can adjust their face for optimal lighting and try again.

      But to use it to scan random people under bad conditions and compare against a large database where you don't want either false positives or false negatives is idiocy, and the system will be completely useless: you'll either get dozens of random false positives each day and haul in innocent people who probably look nothing like the match, or you won't get the actual matches at all.

      The companies, like the former Visionics, trying to push these systems for crowd use are selling snake oil. It doesn't work today, and as the factors making it unusable can't really be significantly improved upon, it won't work in the future either.
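
      A minimal sketch of that base-rate arithmetic, with assumed (illustrative) numbers for the crowd size, watch-list size and error rates rather than anything from the article:

      # Rough 1:N watch-list arithmetic; every number here is an assumption.
      def watchlist_stats(faces_scanned, watchlist_size, false_match_rate,
                          miss_rate, wanted_people_present):
          """Expected alarms for a crowd-scanning system (illustrative only)."""
          comparisons = faces_scanned * watchlist_size
          expected_false_alarms = comparisons * false_match_rate
          expected_true_alarms = wanted_people_present * (1 - miss_rate)
          return expected_false_alarms, expected_true_alarms

      # Assume 10,000 faces scanned per night, a 24,000-entry database,
      # a one-in-a-million false-match rate per comparison, a 30% miss rate,
      # and (optimistically) one wanted person actually in the crowd.
      fa, ta = watchlist_stats(10_000, 24_000, 1e-6, 0.30, 1)
      print(f"false alarms: {fa:.0f}, true alarms: {ta:.2f}")
      # -> roughly 240 false alarms against ~0.7 expected true hits per night,
      #    which is exactly the "haul in innocent people" failure mode above.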
      • by kenyob ( 209893 ) on Thursday August 21, 2003 @09:16AM (#6754230) Homepage
        Not necessarily true. Visionics failed miserably, as we all know, and fell flat on their faces. However, new technologies are being developed that greatly enhance facial recognition, such as 2D-to-3D facial modelling Cyberextruder [cyberextruder.com], wireless mobile, light-compensating camera systems JonesCAM.tv [jonescam.tv], and high-speed database systems. Facial recognition and other biometric technologies are merely in their infancy. Biometrics is at the point where the world wide web was in 1994. It's truly about to explode, and privacy issues will come up. But I feel that in this day and age, with all the acts of terrorism, people will give up a little bit of privacy to feel safer. You will also see the greatest use of the technologies in casinos first and foremost. They have money and a ton fewer privacy issues.
        • Not necessarily true. Visionics failed miserably as we all know and fell flat on their faces. However new technologies are being developed that greatly enhance facial recognition technologies

          I'd argue that it is necessarily true. The point is that even a tiny - say 0.01% - false positive rate is a show-stopper in the applications that we're discussing here. If there are 1,000 people in your database and you're monitoring the throngs inside a busy airport, you're probably talking several false-positives per

      • ...the system will be completely useless as you'll either get dozens of random false positives each day and haul in innocent people...

        That's exactly why it's so important to people like Ashcroft that the government not be burdened with the requirement of actually proving their guilt. If you can simply declare that to be suspected - by man or machine - is to be guilty, then you can look like a hero in the war on terrorism in the next election. Getting lawyers and judges involved will just mess things up by

    • Re:Doesn't work (Score:3, Insightful)

      by Safety Cap ( 253500 )
      So you've got a dud system that's wasting police time.
      Because it is technology, it is "magic" and therefore above reproach.

      If this were anything that the dullard politicians could understand, like a sewage treatment plant, highway construction project, or new jail, they would have the vendor brought up on fraud and racketeering charges faster than you could say, "Pig in a Poke."

    • If you ignore the privacy worries for a minute...

      You have no right to privacy on a public street or in a public place.

      In Tampa they had a full time officer using the system who could have been out on the streets
      in the community that he is trying to protect understanding and interacting with that community.


      While the software is a failure, having a single full time officer watching the cameras is a good way to 'patrol' a larger area. Examine the case of having six cameras that scan six widely separa
      • You have no right to privacy on a public street or in a public place.

        While you are technically correct, I (and several others in this place) make a distinction between casual observation and active surveillance. I accept that I have no right to privacy in a public place to the degree that it is inevitable that people will see me. However, that does not mean that it is acceptable for someone to follow me around and closely watch my actions. Remote surveillance is even worse, as it denies me both the oppor
    • Re:Doesn't work (Score:3, Informative)

      by swillden ( 191260 ) *

      it didn't work in Pinellas County

      Actually, this isn't true; I attended a presentation given by the Pinellas County Sheriff's dept., and they're thrilled with their system and don't know how they'd get by without it.

      The reason it does work for them is because of the way they're using it. Face recognition systems are lousy at identifying people against a large database, but they're very good at locating a small subset of the database that may be a match. And people are quite good at looking at a small n
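
      A minimal sketch of that "shortlist for a human" use, assuming some face-feature extractor has already turned each mugshot into a vector (the random embeddings below are stand-ins, not a real matcher):

      import numpy as np

      def top_k_candidates(probe, gallery, k=10):
          """Indices of the k gallery entries most similar to the probe face."""
          gallery_norm = gallery / np.linalg.norm(gallery, axis=1, keepdims=True)
          probe_norm = probe / np.linalg.norm(probe)
          scores = gallery_norm @ probe_norm          # cosine similarity
          return np.argsort(scores)[::-1][:k]

      rng = np.random.default_rng(1)
      gallery = rng.normal(size=(24_000, 128))        # one 128-d vector per mugshot
      probe = rng.normal(size=128)                    # the face being looked up
      print(top_k_candidates(probe, gallery))         # ten suggestions for a person to vet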

  • by Lawrence_Bird ( 67278 ) on Thursday August 21, 2003 @07:42AM (#6753452) Homepage
    that they have lost face?
    • Re:Does this mean (Score:5, Interesting)

      by TheViffer ( 128272 ) on Thursday August 21, 2003 @08:13AM (#6753714)
      Did they ever have one?

      If they were the same quality cameras they use in, say, convenience stores or banks, how could they work in the first place?

      You know what I mean: black and white, fuzzy, jerky motion of the people it's shooting. I am sure we have all seen them on the 10 o'clock news, with the news person saying "Have you seen this man/woman?"

      They are so bad that a few months ago a truck rolled into a local convenience store for a smash and grab. The cameras were not even able to make out the license plate at this one particular location.

      One would think that with all of today's technology, massive digital storage space, and low prices for this hardware, a decent system could be put in.
  • Yes but... (Score:5, Funny)

    by CGP314 ( 672613 ) <CGP@NOSpAM.ColinGregoryPalmer.net> on Thursday August 21, 2003 @07:42AM (#6753455) Homepage
    But after two years, it yielded no positive identifications

    I'm sorry, I didn't catch that... how many false positives did you say the system had?
    • Re:Yes but... (Score:2, Insightful)

      This is a valid point.

      How many times was there a false positive identification made and some innocent person picked up or questioned by the police? How many manhours were wasted on these false positives?

  • Not surprising (Score:4, Insightful)

    by Wierd Willy ( 161814 ) on Thursday August 21, 2003 @07:43AM (#6753465) Journal
    These systems will never work until they can figure out a way to store such information as faces and other physical attributes holographically. 2D photography won't ever do it accurately enough to make the system functional.

    • Re:Not surprising (Score:5, Insightful)

      by tiled_rainbows ( 686195 ) on Thursday August 21, 2003 @07:54AM (#6753556) Homepage Journal
      That doesn't make sense. I can recognise a particular face from a 2D photograph. Therefore it must be possible, just difficult.
      • I can recognise a particular face from a 2D photograph
        You want that job? Positions are free, but you must be able to climb walls and pillars...
      • Re:Not surprising (Score:4, Informative)

        by Wierd Willy ( 161814 ) on Thursday August 21, 2003 @08:11AM (#6753695) Journal
        That's because your mind stores such information holographically. You have two eyes set 1-3/4 inches or so apart. That gives you a 3D image that is stored in your visual cortex AS a 3D image. The eyes of the observee are a major aspect of facial recognition. 2D cameras don't record the subtleties of eye color and iris detail.

        You don't recognize people that you have never seen before. If you were to see a photograph in 2D of some random individual, then try to find that person in a flowing crowd under varying light conditions and facial expressions, you probably wouldn't be able to recognize that individual. It takes several months to teach a person to do this. Even expert law enforcement personnel cannot do this without a certain ingrained talent for recognizing faces.

        • You have two eyes set 1-3/4 inches or so apart

          Hey, I'm not George W Bush, you know!

        • Re:Not surprising (Score:3, Informative)

          by ojQj ( 657924 )
          People who are blind in one eye can also recognize faces from 2D photographs. I have a fairly subtle problem which reduces the effectiveness of this particular depth cue for me, and I too have no problem recognizing faces from 2D photographs. The two-eye trick is not the only cue people use to determine third-dimension information.

          And face recognition doesn't have much to do with the 3rd dimension even for "normal" people. Try looking at the back of a mask that imitates the facial contours on both s

      • I'm sure you can recognize your ten closest friends from a 2D photograph. So can a facial recognition system.

        Now try picking out one of your 40,000 closest friends in a crowd of several million people.

        How many times will you go 'oh, isn't that... no, wait, it wasn't', or pass your 23,472nd friend by without recognizing him?

        It's not just difficult. It is impossible. The human mind is *good* at recognizing faces, but the problem is that human facial features just aren't that unique, and varying conditions pl
      • The brain has specialized mechanisms that do nothing but recognize faces. It is in the frontal lobe, I think. (I get my brain parts mixed up) It is one of the reasons you may recognize someone, but cannot remember their name or where you know them from.
  • by The Brain Murderer ( 641875 ) on Thursday August 21, 2003 @07:44AM (#6753473)
    Here, whenever a criminal appears on the TV, their face is always partially obscured whereas everyone else looks normal.

    Why not site these cameras at ports and airports as any dodgy people would appear with their faces obscured so you could just arrest them...

  • by markom ( 220743 ) on Thursday August 21, 2003 @07:44AM (#6753476) Homepage
    The Register has a story here [theregister.co.uk].

    Interestingly enough, they mention a successful system in Scotland being up to 70% successful in a "crowd".
  • by beacher ( 82033 ) on Thursday August 21, 2003 @07:45AM (#6753482) Homepage
    Cameras on every corner. Web based cameras. Pan and zooming cameras... With some recognition software.. We could build something that dispenses beads when it recognizes ... umm...
  • Shocking.... (Score:5, Insightful)

    by moehoward ( 668736 ) on Thursday August 21, 2003 @07:45AM (#6753486)
    Or not. It didn't work. No duh. Did anyone really think it would? I always got the idea that the guy selling these was like the monorail salesman in the Simpsons.

    I'm completely amazed that the general public has become conditioned to tolerate this crap from law enforcement. Yes, it's nice that it's gone now, but we all know it will be back. And furthermore, the cameras themselves are still there!!! I mean, come on!! We should be outraged enough that the cameras are there, let alone the facial recognition.

    Is civil disobedience dead, or has civil disobedience become outlawed? What sort of legal/semi-legal countermeasures can be taken against surveillance cameras set up in public places? I'd love to have some sort of laser pointer that I can point at cameras in public areas to break them.
    • Re:Shocking.... (Score:3, Informative)

      "I'd love to have some sort of laser pointer that I can point at cameras in public areas to break them"

      Why not use a regular laser pointer? I believe the link below was featured on Slashdot once (or was it www.cryptome.org?).

      http://www.naimark.net/projects/zap/howto.html

      "Or not. It didn't work. No duh. Did anyone really think it would?"

      Sure. It DOES work, if set up properly. What, you think it's not possible in theory? Why?
    • It's really useful for helping prevent street crime, and catching real criminals. Rather than have dozens of police officers wandering about the streets more-or-less aimlessly, a smaller number can be directed to trouble spots very quickly.

      Furthermore, CCTV helps catch vandals where there wouldn't ordinarily be police in an area. Car got the windows tanned? Get the police to check the CCTV tapes.

      I don't buy into this "They're invading our privacy!" thing. If you're walking down a busy street, you are
      • by GigsVT ( 208848 ) on Thursday August 21, 2003 @08:15AM (#6753736) Journal
        Rather than have dozens of police officers wandering about the streets more-or-less aimlessly, a smaller number can be directed to trouble spots very quickly.

        The logical extension is cameras in homes. Get robbed? No problem, the police have all your video on file, and can just pull up the footage to see who broke into your home.

        Or maybe there are pesky political demonstrators marching down the street, interrupting traffic. With the cameras in place, it will be easy to convict them for something to shut them up for a while.

        It's not so much what they're doing now; it's that the same arguments for what they are doing now can be used to justify real loss of freedom.
        • Why on earth would people put cameras in private homes? It's one thing to put cameras on public streets, quite another to put them in houses. It's something that works quite well in the UK. Possibly because we're not twitching and quivering at the mention of "terrorism", having had to put up with US-funded terrorism for around 40 years.
        • by IIRCAFAIKIANAL ( 572786 ) on Thursday August 21, 2003 @09:24AM (#6754285) Journal
          The logical extension is cameras in homes. Get robbed? No problem, the police have all your video on file, and can just pull up the footage to see who broke into your home.

          No, that's the illogical suggestion. A slippery slope fallacy, if you will. X does not follow Y.

          Or maybe there are pesky political demonstrators marching down the street, interrupting traffic. With the cameras in place, it will be easy to convict them for something to shut them up for a while.

          Protesters use their own cameras. It helps prevent police brutality. If a protester takes part in an illegal protest, they are well aware that they can get arrested - oftentimes, that is the whole point. Police already use video provided by the media in cases such as these as well. Should we ban the free press because it's infringing on your (misguided) expectations of privacy in public?

          It's not so much what they're doing now; it's that the same arguments for what they are doing now can be used to justify real loss of freedom.

          CCTVs in public are no different than having a cop on every corner, as long as people are aware that the cameras are there. In fact, in a few ways, they are better than a cop. A camera can't be racist or sexist. A camera can't plant drugs on someone or entice them to commit a crime.

          It's also much better to secure a conviction based on an image rather than someone's description. It has been proven numerous times that people are downright useless when it comes to recalling a person's face.

          Basically, what I am saying is CCTV can put more criminals in jail and keep more innocent people out of jail if it is properly used.
      • It's really useful for helping prevent street crime, and catching real criminals

        Not really. It just moves the crime away from the cameras.

        • Not really. It just moves the crime away from the cameras.

          So the logical conclusion is cameras don't work? Hmm, I add a camera and the crime relocates out of the view of the camera. What if you keep adding camera coverage? The crimes have fewer areas in which to occur unimpeded. Larger areas enjoy fewer crimes, and more honest citizens traffic in those protected areas since they are safer. Since fewer dispatchers can monitor larger areas, you can concentrate the law enforcement officers on the beat in the unmonito
          • You remind me of the people that say mandatory sentencing doesn't work.

            Mandatory sentencing doesn't work.

            Funny thing is, now that we have mandatory sentencing guidelines and put criminals in prison and leave them there, there are people who whine about the rising prison population.

            We are bitching about the fact that we have more people in prison than any other country on Earth, that most of them are non-violent drug offenders who would be better served by treatment, and that we are releasing violent cri

      • What's wrong with CCTV is that it does not catch criminals, it merely displaces crime to non-CCTV equipped areas. In the worst case, that can mean the blind spots of all the cameras along a street.

        Back in the day, a detective would begin with a crime and look for evidence which might help them find the person who committed the crime. Today's new breed of coppers, armed with CCTV, DNA testing and sundry intrusions into privacy, begin with a person and look for evidence which might help them find a crime c
    • by Anonymous Coward
      Is civil disobedience dead or has civil disobedience become outlawed?

      Civil disobedience has always been outlawed. Its definition is to frigging disobey a law you don't agree with.

      Its effectiveness is a whole other matter. If enough people are willing to go to jail for their beliefs, politicians usually take notice of them. Just make sure the law you're breaking is a misdemeanor and not a felony... felons don't vote.

    • Well, some of my friends back in Oregon used to take their frustrations out on highway signs with .22s.
  • by Chip Salzenberg ( 1124 ) on Thursday August 21, 2003 @07:46AM (#6753492) Homepage
    ... with more effective pattern matching software watching it: human cops. I think that's a better deterrent to crime than the flaky software they've given up on.
  • It broke man. (Score:5, Insightful)

    by secondsun ( 195377 ) <secondsun@gmail.com> on Thursday August 21, 2003 @07:47AM (#6753495) Journal
    Police are at a loss to explain why the software wasn't effective, since it seemed to work fine in controlled testing, Guidara said.

    If I were selling you a million dollar system it would work when I showed it off too.
  • Text! (Score:3, Informative)

    by anonymous coword ( 615639 ) on Thursday August 21, 2003 @07:47AM (#6753496) Homepage Journal
    This news site doesn't seem to be up to the Slashdot effect. Here's the text.

    Tampa police eliminate facial-recognition system

    By MITCH STACY
    Associated Press

    AP Photo
    A surveillance camera is seen in the Ybor City area of Tampa, Fla., in this June 2001, file photo.

    TAMPA, Fla. -- Civil-rights advocates celebrated a decision by Tampa police to scrap a highly touted facial-recognition software system that was designed to scan the city's entertainment district for wanted criminals.

    But after two years, it yielded no positive identifications and no arrests.

    "It was of no benefit to us, and it served no real purpose," Capt. Bob Guidara said Wednesday, emphasizing the decision to drop the software was based on its ineffectiveness rather than privacy issues.

    Tampa became the first city in the United States to install the software in June 2001 to scan faces in Ybor City nightlife district and check them against a database of more than 24,000 felons, sexual predators and runaway children.

    But critics said it violated privacy rights, forcing Ybor City visitors to be in what amounted to an electronic police lineup without their consent.

    Darlene Williams, chairwoman of the Tampa area chapter of the American Civil Liberties Union, said she's glad it's gone.

    "People have the right to be anonymous, and not to be put in a police lineup for committing the offense of walking down a public street," Williams said.

    "As a culture we have always given police the tools that are deemed appropriate to do their jobs. (But) this was handled without public input or foreknowledge, and that was wrong."

    New Jersey-based Visionics Corp. had offered the city free trial use of the program, called FaceIt. It was installed on closed-circuit cameras that police used to monitor Ybor City crowds on Thursday, Friday and Saturday nights.

    A police officer in a room three blocks away monitored video images and could pick out faces in the crowd to scan and run through a criminal database to search for matches.

    Initially, it could be used only with one of the system's 36 cameras at a time, but an upgrade last year allowed use on up to six of the cameras.

    Critics compared it to George Orwell's novel "Animal Farm," and Texas Rep. Dick Armey, the U.S. House majority leader at the time, called for congressional hearings on the technology. Protesters donned bandanas, masks and Groucho glasses on one busy Saturday night to show their contempt.

    Police are at a loss to explain why the software wasn't effective, since it seemed to work fine in controlled testing, Guidara said.

    Meir Kahtan, a spokesman for the company, now known as Identix Inc. after a merger between Visionics and the security technology company Identix, declined to answer questions on the matter Wednesday.

    The company's only comment came in a one-sentence statement that seems to suggest privacy issues were behind Tampa's decision.

    "Identix has always stated that this technology requires safeguards, and that as a society we need to be comfortable with its use."

    Guidara said the closed-circuit cameras installed in 1997 will remain in Ybor City without the face-scanning capabilities. They are effective as a deterrent and have helped police foil crimes, he said.

    Face-scanning technology is still being used in other cities. The airport, jail and jail visitation areas in Pinellas County are using it, but it has never resulted in an arrest, officials said.

    Virginia Beach, Va., installed the software on closed-circuit cameras along the city's boardwalk last summer. While it has never produced a hit or an arrest, police spokesman Sgt. Max Hayden said it performed well in controlled tests and may be a deterrent to criminals. Signs along the boardwalk inform visitors of its use.

    "It would not be prudent to take technology offline when it's been up and running for a year, based on another city deciding not to use it," Hayden said.
  • Well, DUH... (Score:4, Insightful)

    by Hayzeus ( 596826 ) on Thursday August 21, 2003 @07:48AM (#6753504) Homepage
    Police are at a loss to explain why the software wasn't effective, since it seemed to work fine in controlled testing, Guidara said.

    This has been a MAJORLY over-hyped technology. Facial recognition isn't so hard, but the attentional mechanisms required to pick faces out of a crowd reliably under varying lighting conditions are still iffy at best. Most still seem to rely on skin color detection to pick out candidate areas of a scene, and, frankly, that method is still pretty dicey when used out in the real world.
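
    A minimal sketch of the skin-tone candidate pass described above, assuming OpenCV 4.x and NumPy are available; the HSV thresholds are rough illustrative guesses, not a tuned detector:

    import cv2
    import numpy as np

    def skin_candidate_boxes(bgr_frame, min_area=400):
        """Return bounding boxes of skin-colored blobs worth handing to a matcher."""
        hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
        # Loose hue/saturation band that often covers skin under decent lighting;
        # exactly the sort of cue that falls apart outdoors under varying light.
        mask = cv2.inRange(hsv, (0, 40, 60), (25, 180, 255))
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        return [cv2.boundingRect(c) for c in contours
                if cv2.contourArea(c) >= min_area]

    # Each (x, y, w, h) box is only a candidate region; shadows, street lamps and
    # skin tones outside the assumed band all defeat this kind of pre-filter.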

  • Finally! (Score:5, Funny)

    by Anonymous Coward on Thursday August 21, 2003 @07:48AM (#6753510)
    This is awesome ... finally I can visit Tampa again!

  • So what happened to Miami Vice?

    Did they catch everyone already?
  • Felons (Score:3, Insightful)

    by travdaddy ( 527149 ) <travo@[ ]uxmail.org ['lin' in gap]> on Thursday August 21, 2003 @08:02AM (#6753623)
    If there are signs everywhere informing people of its use, would a felon really go anywhere *near* the system? Doesn't seem like it. To felons, these signs mean "come walk over here and we'll arrest you." Perhaps this is why it's not working.
  • Advertisments (Score:5, Insightful)

    by Pompatus ( 642396 ) on Thursday August 21, 2003 @08:04AM (#6753641) Journal
    Virginia Beach, Va., installed the software on closed-circuit cameras along the city's boardwalk last summer. While it has never produced a hit or an arrest, police spokesman Sgt. Max Hayden said it performed well in controlled tests and may be a deterrent to criminals. Signs along the boardwalk inform visitors of its use.

    This reminds me of a DUI checkpoint I saw a couple of months ago. They had not one, but TWO signs, 6 and 4 blocks ahead respectively, that said, "DUI checkpoint ahead". There were plenty of opportunities to turn down another street and avoid it altogether.

    Does it really take that much intelligence for a criminal to avoid an area where he/she might get caught? While one might be so drunk as to not be able to read the signs, I think law enforcement in these circumstances is being as stupid as these criminals. [globe-rider.com] Maybe it's that think-like-your-enemy strategy.
    • It's not stupidity... it's called legal issues with not advertising DUI checkpoints.

      I know that in the town where my parents live, they need to advertise the checkpoints in the local paper a week before.

      This is to avoid DUIs getting off by using entrapment as a defense or something... I dunno exactly, I have no law experience, but that's the gist of it.
    • Re:Advertisments (Score:5, Informative)

      by untaken_name ( 660789 ) on Thursday August 21, 2003 @08:21AM (#6753781) Homepage
      This reminds me of a DUI checkpoint I saw a couple of months ago. They had not one, but TWO signs, 6 and 4 blocks ahead respectively, that said, "DUI checkpoint ahead". There were plenty of opportunities to turn down another street and avoid it altogether.

      I actually saw a pretty intelligent use of signs by cops once. Lollapalooza was held in an outdoor venue near where I used to live. People leaving the show had to get on a limited-access highway and go about 2 miles before there was an exit, and everyone leaving the show had to drive to that exit. About a mile before the exit, they placed several large signs that said 'Drug checkpoint ahead. All cars will be searched.'
      Of course, that would be illegal to do, and there was no 'drug checkpoint' at all. Instead, the police waited around for people to illegally U-turn across the median and then busted those people. We just kept driving, and sure enough, no checkpoint. After we made a legal U-turn at the next exit, we saw someone swerve across the median, and then saw two cops streak after them, sirens blaring. I don't think we stopped laughing the whole way home. Sure, it's an underhanded method, but anyone who knows their rights wouldn't fall for it.
      • They were going to do something similar in Indianapolis, but the papers got wind of it and protests of entrapment scuttled the idea. Personally, I think it's pretty clever...
    • Re:Advertisments (Score:3, Insightful)

      by YrWrstNtmr ( 564987 )
      They had not one, but TWO signs, 6 and 4 blocks ahead respectively, that said, "DUI checkpoint ahead". There were plenty of opportunities to turn down another street and avoid it altogether.

      And down those side streets, they may have had a cop or two waiting for the drunken avoiders to come weaving their way.
    • Re:Advertisments (Score:3, Insightful)

      by Cyno ( 85911 )
      What bothers me about this stuff is that we're supposed to be innocent until proven guilty. Stopping everyone to test for a DUI is accusing everyone of being guilty until they test innocent.
    • That reminds me of a small town just south of here. The local police put up a sign on the highway about a mile south of the town saying "DWI checkpoint ahead." People who are obviously drunk get off at the exit thinking they are avoiding the cops. What they don't realize is that the checkpoint is at the top of the exit.
    • by dpbsmith ( 263124 ) on Thursday August 21, 2003 @10:52AM (#6755156) Homepage
      According to this story [zwire.com]

      During [Aurora, CO's] annual bluegrass festival in 2000, officers posted signs saying "Narcotics checkpoint, one mile ahead" and "Narcotics canine ahead". They then hid on a hill, clad in camouflage, and watched for any people who turned around or appeared to toss drugs out of their windows after seeing the signs.
      Stephen Corbin Roth, 60, was pulled over for littering after he threw out what appeared to be a marijuana pipe. Police found a marijuana pipe and mushrooms during a search of his car and he appealed his conviction on possession of drug paraphernalia to the appeals court.
      Under the procedure that day, an officer down the road would be told by radio to pull over any vehicle seen littering while an officer on the hill would run down and find the items thrown away.
      The appeals court ruled drug checkpoints are illegal because motorists are stopped at random and without reasonable suspicion of committing a crime. However, in Roth's case, the court concluded that finding the marijuana pipe gave the officers probable cause to stop Roth's vehicle.
  • Not a victory (Score:3, Insightful)

    by tundog ( 445786 ) on Thursday August 21, 2003 @08:10AM (#6753687) Homepage

    My first reaction was, did they drop it because of community pressure or because it was ineffective?

    But after two years, it yielded no positive identifications and no arrests

    Unfortunately, this is not a victory. When the technology is ready, it will be back.
  • I work for a local CCTV/surveillance/locksmith company here, and we do work for some pretty major companies in the area (power companies, paper mills, food production facilities, etc.).

    We put in some incredibly high-end equipment, and I can't imagine how difficult it would be to match a face to a database. Biometrics has come pretty far, but to match enough points of a face and have it at just the right angle would be nearly impossible. For even a hand scanner to work correctly you need to have your hand in there ju
  • by dpbsmith ( 263124 ) on Thursday August 21, 2003 @08:19AM (#6753761) Homepage
    According to a radio report on NPR, Tampa did not spend money directly on the system. The surveillance cameras were already in place (and will remain in place) and Identix provided the software on some kind of free-trial or beta basis. Of course, I'm sure a great deal of police time = money was wasted on training, etc.

    The reporter discussed the issue of false positives with the interviewee, in a somewhat vague way. The reporter said, sensibly enough, something like "Isn't the problem that if you require too many measurements to match you don't get identifications, and that if you only require a few you get false positives?" The interviewee concurred. I got the impression that the police department might have insisted that the system be tuned to a level where they were not wasting time on false positives, and at that level there were simply no matches.
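
    A minimal sketch of that tuning trade-off, using made-up score distributions (nothing here comes from the Tampa system) just to show why a threshold strict enough to silence the false positives also kills the matches:

    import numpy as np

    rng = np.random.default_rng(0)
    impostor_scores = rng.normal(0.30, 0.10, 1_000_000)   # non-matching passers-by
    genuine_scores = rng.normal(0.55, 0.12, 50)            # the rare true matches

    for threshold in (0.4, 0.6, 0.8):
        false_alarms = int((impostor_scores >= threshold).sum())
        hits = int((genuine_scores >= threshold).sum())
        print(f"threshold {threshold:.1f}: {false_alarms} false alarms, {hits} hits")
    # Raising the threshold from 0.4 to 0.8 drops the false alarms from ~160,000
    # to essentially zero, but the handful of genuine matches vanishes with them.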

    The reporter also asked (also sensibly) whether the apparent lack of success could have been because the system's installation was widely publicized and the bad guys knew better than to show up in Ybor City. Interestingly enough, the interviewee said something like "If I believed that, it would be a great thing and I'd want to keep the system in place forever." I was, however, left with the distinct impression that the interviewee did NOT believe that.
  • They should have put CowboyNeal in front of a camera. At least they would have one positive ID... ("hey, that's the f***er who stole all those pizzas and portables last week")
  • by epicstruggle ( 311178 ) on Thursday August 21, 2003 @08:26AM (#6753817)
    was that they did not want many false positives. So they decided on a very high match threshold before a person was flagged. They did not want a "looser" match, as that would give them false positives.

    The technology is there to get the bad guy, but we might have to put up with getting mistaken for the bad guy from time to time. We need to decide if it's worth it.

    later,
    epic
  • by AlecC ( 512609 ) <aleccawley@gmail.com> on Thursday August 21, 2003 @08:27AM (#6753828)
    This system seems to have tripped across a common problem with all id recognition systems - face, retina, voice, fingerprint, whatever. That is that they are used in two completely different modes.

    One mode is the verification mode: this person claims to be Mr XYZ: is he? For this purpose, you only have one identity to match. If the answer comes out "maybe" instead of "yes" or "no", you can take another photo/scan/whatever. You can use extremely number-intensive checking techniques because you are only trying to match ONE face/eye/... to ONE record. And the people being checked have at least some incentive to help the system (remove glasses, get a rescan when they have a haircut or grow a beard). Systems can be made to do this very reliably in this mode - call it mode 1.

    You can scale this up a little bit, while maintaining reliability. A car, for example, might recognise the voices of four registered drivers and adjust itself to suit, or a secure area for a few tens of people. Call this mode 1A.

    The second mode is when you are trying to detect any one of a large list of possible people in a huge crowd, when they may have changed their characteristics significantly, either intentionally or unintentionally. Call this mode 2.

    The trouble is that a lot of people assume that, if you can scale from 1 to 1A, the scaling from 1A to 2 will be linear. Which it won't. On top of the linear scaling from vastly more records to match, there is the no-rescan, changed-face, uncooperative factor, which acts quadratically with the first. This means the problem explodes uncontrollably very soon.

    Some of the people making this assumption should know better.
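
    A minimal way to put numbers on that blow-up, assuming a hypothetical matcher whose per-record false-match rate stays fixed as the watch list grows (the rate and crowd size below are illustrative, not measured):

    # With a fixed per-record false-match rate f, the chance that *some* record
    # falsely fires on a given face is 1 - (1 - f)**N, and every passer-by rolls
    # those dice again.
    f = 1e-4          # per-record false-match rate (respectable in mode 1)
    crowd = 50_000    # faces scanned per week

    for n_records in (1, 10, 1_000, 24_000):
        p_any_false = 1 - (1 - f) ** n_records
        print(f"{n_records:>6} records: P(false hit per face) = {p_any_false:.4f}, "
              f"expected weekly false hits ~ {p_any_false * crowd:,.0f}")
    # 1 record: ~5 false hits a week (livable); 24,000 records: ~45,000 a week
    # (useless), before even counting the changed-face and uncooperative factors.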
    • I see four possible uses of face-recognition systems:

      1. One to One: Like the verification example you used. Yeah, this must be the easiest, but you still don't see it a lot, even where, eg ATMs, it would be useful.

      2. One to Many: For example, the police pick up a suspect, take his picture, and run it through their database to see if he's been picked up before, like they do with your name at the moment, but removing the opportunity to lie.

      3. Many to One: for example, putting a bunch of cameras in a
  • In Virginia (emphasis mine):
    While it has never produced a hit or an arrest, police spokesman Sgt. Max Hayden said it performed well in controlled tests and may be a deterrent to criminals. Signs along the boardwalk inform visitors of its use.

    They didn't have any positive identifications, but do they know for certain that the criminals they were looking for passed in front of the camera? Were criminals aware of the cameras and simply avoiding them?

  • by immel ( 699491 )
    The history of electronics has been filled with these sorts of duds that later became big deals. Example: the PDA. In the 90s, an early PDA called the Newton had similar software that was supposed to recognise handwriting, but it didn't work too well. Later that decade, everyone who was anyone had a PDA that worked. I think this face recognition technology could become big in the future; just give it some time. Hopefully it will be used for good, not evil? Just a thought. :)
  • for once. Wasn't Tampa the first city to implement low-power X-rays (aka automatic strip-search) in airport security checkpoints?
  • It doesn't work today. Hooray.
    It won't work tomorrow. Great.
    It will not work 10 years from today. Orgasm.
    But....

    20 years ago we couldn't stuff a computer into a camera. Couldn't program a computer to track motion using a camera. Couldn't program a computer to record surveillance images in a standard file format.

    You could, 20 years ago, program a computer to record images in non-standard image files, but they weren't very useful for much.

    Now imagine an image processor capable of analyzing 50 to 500 faces at
    • Mod parent up.

      How much of a victory is this? I think we should be very careful about celebrating when something bad fails due to technological constraints. We of all people should understand that this is only temporary. People always talk about unenforced laws being no problem. Well, they can be enforced selectively in a way that basically amounts to tyranny, like the Texas anti-sodomy law. Similarly, a law that cannot be enforced is still a problem if the law could someday be enforced. Then we'll have t

  • by Dark Coder ( 66759 ) on Thursday August 21, 2003 @10:02AM (#6754628)

    Of course, it doesn't work...

    The CCD needs to be of high resolution [cplab.com] (greater than 15 megapixels), along with a zoom lens and a 24/7 staff of camera operators, in order to garner sufficient pixel detail for adequate facial resolution.

    Don't forget, the best evasion technique against this setup is a simple New Orleans masquerade mask [littleshopoffantasy.com].

    Oh yes, want night-time surveillance and target elimination? Don't forget an infrared laser [fas.org] with remote-control, software-adjustable, variable beam-width lenses [cnn.com].

    Come on... Slapping together a 2-megapixel sensor and a fixed lens isn't going to cut it.
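
    Rough pixel arithmetic behind that claim, with assumed lens and distance numbers (the 15 cm face width, fields of view and 20 m standoff are illustrative, not from the post):

    import math

    def pixels_across_face(sensor_width_px, fov_degrees, distance_m, face_width_m=0.15):
        """How many horizontal pixels land on a ~15 cm-wide face."""
        scene_width_m = 2 * distance_m * math.tan(math.radians(fov_degrees) / 2)
        return sensor_width_px * face_width_m / scene_width_m

    # A 2-megapixel camera (~1600 px wide) with a 60-degree lens, face 20 m away:
    print(round(pixels_across_face(1600, 60, 20)))    # ~10 px -- hopeless
    # A ~15-megapixel sensor (~4700 px wide), zoomed in to a 20-degree view:
    print(round(pixels_across_face(4700, 20, 20)))    # ~100 px -- barely workable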

  • by Mr Beano ( 650508 ) on Thursday August 21, 2003 @11:54AM (#6755999) Homepage
    After 10 beers, I too have given up my "facial recognition" system.
    So many false positives, so many regrets, so much itching and burning....
  • by onyxruby ( 118189 ) <onyxruby@ c o m c a s t . net> on Thursday August 21, 2003 @11:59AM (#6756056)
    The problem with the Tampa system, or an airport system, or any other public environment is that they are all transient in nature. An airport has millions of people going through every year, with a pretty good chunk in a hurry to catch a flight. Many of these people will never pass through that given airport again. You also have a much larger database of positives to pick from. The Tampa system had 30,000 mugshots to work from. There were simply too many uncontrolled variables for the system to be effective. In essence you are looking at a system that doesn't deal well with transient environments. Now let's compare this to casinos, where the technology was developed, and you'll soon see the flaws with the Tampa system.

    The casinos in Vegas have an official "black book" list of only 38 [lasvegassun.com] people they are required to keep out; Atlantic City has 173 [thegoodgam...uide.co.uk]. In addition to this, most casinos participate in a mutual database of people that they know or suspect are cheats. From these sources you have a listing of some 3,000-5,000 cheats (source being techs from Vegas I worked with for a while; can't find a link to verify) that they want to look out for. They also have something more important: an environment where people enter and tend to stay for a few hours. They also have a lot of high-quality video cameras from many angles, and a fixed viewing area. Translation: they don't have nearly as many people to look for, can view a relatively stationary target from multiple angles, have a lot more time to compare a picked-out face to a database, and have none of the security needs of an airport dictating immediate detention.

    The reasons this works in casinos almost all stack against this working in a public environment like a city center or an airport. The question is, how long until technology improves before such systems would be considered to have an acceptable false positive rate? Standards are also needed for compensation for people who are falsely picked up and miss flights, hotels and the like. A missed airplane flight can be thousands of dollars, what is the appropriate compensation to the poor detained soul that is not in fact a terrorist or criminal?
  • by exp(pi*sqrt(163)) ( 613870 ) on Thursday August 21, 2003 @12:54PM (#6756631) Journal
    ...technologies that seem so far beyond what I thought was possible that it upsets my world view. I sometimes work in image processing. Finding features in photos of scenes can be damn hard. Just finding a face reliably can be hard, let alone identifying its owner! Sure, you'll find lots of papers claiming to be able to do these things, but my success rate with implementing (or just downloading) code from papers is pretty low. Publish or perish, even if you have to exaggerate a wee bit. So when I read about face identification software actually being used in airports, it surprised me a lot. How can these guys be a quantum leap above everyone else? But now I see I had no need to be surprised.

    I remember 20 years ago the father of a friend of mine [ic.ac.uk] claimed he had hardware/software that could identify faces. It was all over the TV and newspapers. Nothing's changed.

  • by skia ( 100784 ) * <skia@nOspAM.skia.net> on Thursday August 21, 2003 @04:32PM (#6759180) Homepage
    Police are at a loss to explain why the software wasn't effective, since it seemed to work fine in controlled testing, Guidara said.

    "Though," Guidara elaborated, "We did think it odd that all of the test subjects had barcodes tattooed across their foreheads."
