3D Biometric Facial Recognition Comes To UK

Roland Piquepaille writes "In the UK, where the recent Queen's speech about national identity cards generated lots of -- mostly negative -- coverage, another potentially invasive technology is being tested with very little criticism. For example, several police departments are now testing 3D biometric facial recognition software from Aurora, a company based near Northampton. The use of facial recognition 'is rapidly becoming the third forensic science alongside fingerprints and DNA,' according to a police officer who talked to BBC News for 'How your face could open doors.'" (More below.)
"The company claims its software is so sophisticated it can make the distinction between identical twins. And if the civil liberties groups continue to be neutral, this technology could also be deployed in airports or by private companies. Even banks are thinking to put cameras in their ATM machines to identify you. The good thing is that you will not have to remember your PIN. On the other hand, as with every new technology, is it safe for your privacy and is it possible to hack the system? Read more before making your decision."
  • by CheesyPeteza ( 814646 ) on Sunday November 28, 2004 @06:26AM (#10936322)
    My friend was building a 3D scanner for his final year project. He went to Aurora for a tour and to see how they did it.

    According to him, they said they'd taken Geri Halliwell's face and put it onto the body of a model for one of her videos, as she was pregnant at the time. :o (This was around the beginning of 2000; I'm not a big Geri fan, so I can't tell you which video it was.)

    I wasn't sure if I believed what Aurora had said at the time, and I'm still not. But if it's true, this technology must be pretty advanced, as that was four years ago.
  • by UpnAtom ( 551727 ) on Sunday November 28, 2004 @06:47AM (#10936369)
    If you saw the Discovery Channel documentaries where they resculpted the faces of Hitler, Churchill & Roosevelt onto actors' bodies, it's obvious that we can no longer trust video as an authentic source.

    For example, does the recent blurry video of Bin Laden [welfarestate.com] prove that he's still alive?
  • by h4rm0ny ( 722443 ) * on Sunday November 28, 2004 @06:55AM (#10936382) Journal

    Seriously, are the cameras going to be set to raise an alert whenever someone they can't identify walks down the street? Will police occasionally stop you and ask you to remove your stetson so that the CCTV can calm down?

    How reliable can this be? And if the system can scan and recognize a face this effectively, can we reproduce one in latex a la Mission Impossible -- well enough to fool the system, anyway?

    And do we want the government to have this much data on people?

    I can certainly answer the last question.

  • Easy to defeat (Score:5, Interesting)

    by MancDiceman ( 776332 ) on Sunday November 28, 2004 @07:28AM (#10936445)
    I've recently lost about 5 kg, and my face, particularly its shape, is quite different. I look at pictures of myself from just 3 months ago and even I think I look quite different. Even friends who see me every day comment on it.

    This technology could be foiled by people just gaining and losing weight. Look at pictures of people who have lost a lot of weight and you'll see their cheeks, chin, even lips all look completely different. If this system is so "accurate" it can distinguish between identical twins, what happens when people eat too many Twinkies or lose a few kilos?
  • CCTV Footage? (Score:1, Interesting)

    by zmollusc ( 763634 ) on Sunday November 28, 2004 @08:24AM (#10936561)
    CCTV must have come on in leaps and bounds recently. The CCTV footage shown on TV ("Does anyone recognise this man, seen robbing a post office last week?") is usually of appalling quality.
  • Re:foolproof (Score:3, Interesting)

    by Zemran ( 3101 ) on Sunday November 28, 2004 @09:27AM (#10936736) Homepage Journal
    They can claim all they like, but I think they are being disingenuous. I have read many independent trials of this, and all found problems. I do not profess to be an expert, but I did work with forensic experts for 5 years until recently, and we never found that facial recognition worked well enough even to use as a guide. It was far too easy to fool, and even when no one was trying to fool it, the failure rate was high.

    Even though someone's hairdo is not supposed to matter, the system did seem to get confused when someone changed their hair. A swollen face or nose from a fight or accident can affect it (relevant if you are talking about criminals), and worse still, it gave many false positives when looking for someone.

    I think it is like the polygraph: it will be used excessively and create a lot of problems that will have to be sorted out later, but because it is great-sounding technology, the suits will like it.
  • by h4rm0ny ( 722443 ) * on Sunday November 28, 2004 @11:26AM (#10937133) Journal

    Too late. It is already against the law to, say, wear a balaclava at a protest.

    Yes. Doesn't seem to stop the coppers wearing concealing face gear, or getting vicious when someone points a video camera back at THEM.

  • by icepick72 ( 834363 ) on Sunday November 28, 2004 @02:13PM (#10937861)
    The company claims its software is so sophisticated it can make the distinction between identical twins

    Is it just me, or does something seem to be missing from this sentence? Some proposed content, for example: "...however, the software failed to distinguish a Halliburton executive from a West-African Pygmy."

    Just walk around the airport with a goofy look on your face. You'll never be suspected ... by the facial recognition equipment. Or better yet come through the cafeteria with chunks of food smeared on your face. That's inconspicuous.

  • by Anonymous Coward on Sunday November 28, 2004 @03:45PM (#10938298)
    Comments on 3-D face recognition in the United Kingdom

    A few comments:

    1. The claimed performance of the Aurora system seems unlikely. There is a long history of exaggerated claims by companies marketing face recognition products. For example, see news coverage of face recognition immediately following the September 11 terrorist attacks.

    Face recognition is quite difficult:

    Faces vary over time due to natural aging, gain or loss of weight, weathering of the face due to environmental factors, changes in facial hair in men, and other factors.

    Even human face recognition is not particularly accurate, as many cases of mistaken identity and incorrect eyewitness identification demonstrate.

    The uniqueness of faces has not been demonstrated. In addition to identical twins and close relatives with striking resemblances, people with strikingly similar faces sometimes occur in the general population, usually among the same ethnic group.

    All forms of pattern recognition by computers -- computer vision, speech recognition, face recognition, etc. -- have proven extremely difficult. It seems likely that duplicating human level pattern recognition requires the discovery of a new scientific principle or principles. Obviously, one cannot rule out that Aurora or another company has made such a discovery.

    2. Whether using fingerprints, DNA, face patterns, or other biometrics, even a small rate of duplicates -- for example, one in a million -- presents serious problems for forensic applications. In major urban areas, several million people live within a few hours' travel of any crime scene, meaning that any biometric could be matched (at a one-in-a-million duplicate rate) to several suspects with plausible access to the crime scene. In fact, with modern airline transportation, the entire world's population of over 6 billion people is within at most a few days' travel of any crime scene.
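
    To put rough numbers on that argument, here is a minimal Python sketch. The one-in-a-million duplicate rate and the population figures are the comment's own assumptions; everything else is purely illustrative:

        # Expected number of coincidental biometric matches in a candidate pool,
        # given a duplicate (false-match) rate. Figures follow the comment above.

        DUPLICATE_RATE = 1e-6  # one in a million, as assumed above

        def expected_false_matches(population: int, rate: float) -> float:
            """Expected count of innocent people whose biometric matches."""
            return population * rate

        for label, population in [
            ("major urban area (a few hours' travel)", 5_000_000),
            ("world population (a few days' travel)", 6_000_000_000),
        ]:
            n = expected_false_matches(population, DUPLICATE_RATE)
            print(f"{label}: ~{n:,.0f} expected coincidental matches")

        # ~5 plausible suspects in one metro area, ~6,000 worldwide --
        # so a bare "match" cannot, by itself, single out one person.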

    This is not an academic concern. For example, in the recent Brandon Mayfield case, the United States FBI matched a fingerprint taken from a bag at the scene of the Madrid train bombing to Brandon Mayfield, a lawyer in Portland, Oregon. The Spanish authorities, however, produced their own match to the fingerprint, an Algerian suspect. The FBI argued for a time that Brandon Mayfield might have traveled to Spain by plane using a false passport -- since there was no record of any travel. The considerable distance between Portland, Oregon and Madrid did not provide an adequate alibi, due to modern transportation. Eventually, the FBI backed down and claimed that its expert fingerprint examiners had made a mistake.

    The Brandon Mayfield case is not an isolated one. There are several known cases of mistaken fingerprint identification. In all of them, the authorities blame the misidentification on error or fraud by fingerprint examiners, rather than a duplicate or near-duplicate fingerprint -- a finding which would invalidate the method entirely.

    3. Historically, the databases of biometrics -- fingerprints and mug shots -- used in police investigations appear to be highly biased, consisting primarily of persons with criminal records or circumstances that make them plausible suspects for crimes (for example, terrorist watch lists). Thus, if a misidentification occurs -- especially one based on fingerprints -- the suspect will have a very difficult time proving his or her innocence. It takes an airtight alibi, contradictory DNA evidence, or something similarly compelling to call the biometric identification into question, especially if the wrongly accused is a known "bad person". If the databases included all citizens, then suspicious misidentifications of persons without criminal records or other plausible grounds for suspicion would occur more often, and would be more likely to call the biometric method itself into question.

    4. Biometrics used for access control, in place of keys for example, face serious problems even with error rates as low as 1%. For example, consider a hotel...
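
    The comment is cut off at this point, but the arithmetic it gestures toward is easy to sketch. A minimal Python illustration of a hypothetical hotel scenario -- the room count and usage figures below are invented for illustration, and only the 1% error rate comes from the comment:

        # Why a 1% false-reject rate is unworkable for door-lock access control.
        # Scenario numbers are assumptions for illustration only.

        FALSE_REJECT_RATE = 0.01       # the 1% error rate cited above
        ROOMS = 1_000                  # hypothetical hotel size
        ENTRIES_PER_ROOM_PER_DAY = 5   # assumed guest entries per room per day

        attempts = ROOMS * ENTRIES_PER_ROOM_PER_DAY
        lockouts = attempts * FALSE_REJECT_RATE
        print(f"~{lockouts:.0f} guests locked out per day")  # ~50 lockouts/day

        # A metal key "fails" almost never; 50 front-desk complaints a day
        # would sink the system regardless of any security benefit.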
  • Blunkett (Score:5, Interesting)

    by carldot67 ( 678632 ) on Sunday November 28, 2004 @04:08PM (#10938418)
    Non-UK readers should be aware that David Blunkett, the UK Home Secretary and parent of this god-forsaken legislation, often uses the old "you can trust us with your data -- it's not like we're the Nazis or anything" line when people complain about ID cards, biometrics and all the other good stuff he has in mind.

    Non-UK readers should also be aware that Blunkett is this week facing allegations of inappropriate behaviour, having been caught personally intervening in the visa application of his mistress's nanny.

    I find myself needing to give my face/fingerprints to a man who would appear to be a corrupt adulterer. How excellent is that.
  • by mikechant ( 729173 ) on Monday November 29, 2004 @06:51AM (#10941476)
    Seemed to work impressively until three people showed up at the door: one peered into the iris reader, the door opened, and the other two just tailgated through.

    We're constantly reminded at work how security is our responsibility, but they're too mean to install the one simple piece of technology that would actually stop people getting in without a swipe card: a one-person-at-a-time turnstile. Instead we're all supposed to stop people tailgating us and check they have a valid ID. Of course, no one does this, since it's quite likely to be some senior manager behind you who thinks IDs etc. are for 'the little people' and will give you serious trouble if you challenge them. So the tailgating goes on, the thefts continue, and they send us another email reminding us how security is *our* responsibility.
