Facial Recognition Database Used By FBI Is Out of Control, House Committee Hears (theguardian.com)

The House oversight committee claims the FBI's facial recognition database is out of control, noting that "no federal law controls this technology" and "no court decision limits it." At last week's House oversight committee hearing, politicians and privacy campaigners presented several "damning facts" about the databases. "About 80% of photos in the FBI's network are non-criminal entries, including pictures from driver's licenses and passports," reports The Guardian. "The algorithms used to identify matches are inaccurate about 15% of the time, and are more likely to misidentify black people than white people." From the report: "Facial recognition technology is a powerful tool law enforcement can use to protect people, their property, our borders, and our nation," said the committee chair, Jason Chaffetz, adding that in the private sector it can be used to protect financial transactions and prevent fraud or identity theft. "But it can also be used by bad actors to harass or stalk individuals. It can be used in a way that chills free speech and free association by targeting people attending certain political meetings, protests, churches, or other types of places in the public." Furthermore, the rise of real-time face recognition technology that allows surveillance and body cameras to scan the faces of people walking down the street was, according to Chaffetz, "most concerning." "For those reasons and others, we must conduct proper oversight of this emerging technology," he said.


  • I think police should need a warrant to use facial recognition in many cases. I also feel that perhaps searches of electronic devices and online accounts need to strictly limit exactly what is searched for and disallow any evidence of any crimes not listed in the warrant from being used.

    The 4th amendment is supposed to make it hard to prosecute certain kinds of crime. In my opinion, the police really have no business going after crime that isn't reported to them anyway, except for a few exceptions like murder.

    • by Anonymous Coward

      I think police should need a warrant to use facial recognition in many cases. I also feel that perhaps searches of electronic devices and online accounts need to strictly limit exactly what is searched for and disallow any evidence of any crimes not listed in the warrant from being used.

      The 4th amendment is supposed to make it hard to prosecute certain kinds of crime. In my opinion, the police really have no business going after crime that isn't reported to them anyway, except for a few exceptions like murder.

      The text of the United States Constitution's 4th Amendment:
      The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.

      I'm not sure I would go so far as to say the police cannot use evidence they witness of, say, stolen property, that is found d

    • I think police should need a warrant to use facial recognition in many cases. I also feel that perhaps searches of electronic devices and online accounts need to strictly limit exactly what is searched for and disallow any evidence of any crimes not listed in the warrant from being used.

      I don't think that is realistic or even remotely reasonable. You know, people recognise other people simply by eyesight; police officers do the same. We can use binoculars to see further away and thus enhance our vision, or we can use an infrared camera to see things at night - and record it, and so on. There is a continuum from using no technological aid at all to using automatic recognition technology, so where should the limit be set? I think we have to weigh up the benefits against the costs. There is

  • The FBI then needs to be disbanded.

    I am disgusted at anyone who puts security above freedom. One of the wisest men of our country once said, "Those who surrender freedom for security will not have, nor do they deserve, either one."

    • by Tablizer ( 95088 )

      Unfortunately, fear-mongering works politically, at least in the USA. It's why we have inflated and impractical incarceration durations. Painting demons for the masses is psychologically easy to do compared to the intricacies and nuances of explaining and protecting rights.

  • by Gravis Zero ( 934156 ) on Monday March 27, 2017 @07:28PM (#54123301)

    ... are more likely to misidentify black people than white people.

    Before you jump to conclusions about "racist software," I can tell you that the reason for this is very well known and understood: lighting and contrast ratios. Specifically, image sensors give you a much higher contrast ratio on light-skinned faces than on dark-skinned faces unless you have their face very well lit up. Simply put, camera sensors are a poor substitute for the human eye, and this is one of the side effects of that.
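
    To put a number on it, here is a rough Python/OpenCV sketch (the file names are made up, and RMS contrast is just one crude measure) of how much intensity variation a face crop actually hands to a matcher:

      import cv2
      import numpy as np

      def rms_contrast(path):
          # Read as grayscale, scale to [0, 1], and take the standard deviation
          # of pixel intensities -- a crude stand-in for "how much contrast is there".
          gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE).astype(np.float64) / 255.0
          return gray.std()

      # Hypothetical inputs: a well-lit face crop versus a dimly lit one.
      # The dim, low-contrast crop is the case the error rates above describe.
      print(rms_contrast("well_lit_face.jpg"), rms_contrast("dim_face.jpg"))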

    • by Tablizer ( 95088 )

      Before you jump to conclusions about "racist software"...

      What exactly are the accusations you are defending against?

      I will add that bias is not necessarily conscious racism; it could be that those who build the equipment and/or train staff on using the equipment are mostly used to dealing with people of a certain ethnic group. These are mistakes of familiarity, not "racism" per se. These kinds of mistakes have appeared in commercial enterprises before, such as manual photo processing labs.

      • Before you jump to conclusions about "racist software"...

        What exactly are the accusations you are defending against?

        I think you missed the keyword in there: before. If you didn't think it was coming, this is the internet: some people think the Earth is flat and the US government "did 9/11", so it's only a matter of time before someone claims the software is racist.

        I will add that bias is not necessarily conscious racism; it could be that those who build the equipment and/or train staff on using the equipment are mostly used to dealing with people of a certain ethnic group.

        You misunderstand. When I wrote "have their face very well lit up," I mean that you need to have lights specifically for lighting their face. A bunch of the pictures were from the DMV, which isn't exactly a photography studio. A better solution might be to use po

        • by Tablizer ( 95088 )

          I'm still not following. Do you mean "racist software" in the literal AI sense that the software is thinking like a racist human being?

          I mean that you need to have lights specifically for lighting their face.

          You mean like a spotlight to the face? Then you get squinty photos. And awkward. One probably has to manually adjust the gamma levels instead; otherwise the background or clothing has too much effect on auto-adjustment. Either train the staff or have the photos sent to an adjustment lab to be tuned by

          • I'm still not following. Do you mean "racist software" in the literal AI sense that the software is thinking like a racist human being?

            There is no point in pondering what nut-jobs think, so stop wasting your time on it.

            You mean like a spotlight to the face? Then you get squinty photos.

            Look at how they do photo-shoots for models and you'll understand because that kind of lighting is what's needed.

            One probably has to manually adjust the gamma levels instead

            A) that won't help
            B) the computer vision software already filters images in multiple ways to get the optimal contrast.

            Either train the staff or...

            The staff where? All staff everywhere? We're going to make cops into photographers now? Also, what about video camera footage? There are two sides of the equation and your introductory suggestio
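
            For reference, the "filters images in multiple ways" step is typically something along the lines of adaptive histogram equalization applied before feature extraction. A minimal Python/OpenCV sketch, with the file name and the CLAHE parameters purely illustrative:

              import cv2

              # Hypothetical input; in practice this would be a detected face crop.
              gray = cv2.imread("face_crop.jpg", cv2.IMREAD_GRAYSCALE)

              # Contrast-limited adaptive histogram equalization: stretches the
              # intensity range tile by tile instead of one global adjustment.
              clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
              cv2.imwrite("face_crop_equalized.jpg", clahe.apply(gray))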

            • by Tablizer ( 95088 )

              Look at how they do photo-shoots for models

              Models are used to harsh lights. They learn not to squint. It's part of their job.

              The problem isn't tweaking the images; it's that there isn't enough information in the images themselves.

              If so, they need better cameras, such as cameras that take photos under multiple different exposures and/or have a high brightness sensitivity range. Software and/or remote human experts can then select the best exposure and/or tune levels. In the end you probably still need a huma

              • If so, they need better cameras, such as cameras that take photos under multiple different exposures and/or have a high brightness sensitivity range. Software and/or remote human experts can then select the best exposure and/or tune levels. In the end you probably still need a human to adjust them; AI is still sketchy.

                I hope you saved up your pennies to buy everyone new cameras, both still and video, because nobody is going to want to pay for that.

                • by Tablizer ( 95088 )

                  Commercial iPhones optionally take multiple exposures for a single "snap" so that one can skip blinking subjects, for example. Just do something similar with exposure time.

                  And I realize suddenly replacing all that equipment would be quite expensive, but at least make the next batch of cameras better than the last batch so the ratio of good cameras in production gradually goes up over time.
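
                  Merging a bracket like that is already routine; a rough Python/OpenCV sketch (file names hypothetical) using Mertens exposure fusion, which needs no camera calibration:

                    import cv2
                    import numpy as np

                    # Hypothetical bracket: the same scene at short, normal and long exposure.
                    bracket = [cv2.imread(p) for p in ("short.jpg", "normal.jpg", "long.jpg")]

                    # Mertens fusion keeps the best-exposed parts of each frame;
                    # it returns a float image in [0, 1], so rescale before saving.
                    fused = cv2.createMergeMertens().process(bracket)
                    cv2.imwrite("fused.jpg", np.clip(fused * 255, 0, 255).astype(np.uint8))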

                  • And I realize suddenly replacing all that equipment would be quite expensive, but at least make the next batch of cameras better than the last batch so the ratio of good cameras in production gradually goes up over time.

                    Contact your local representative if you want to change the world a little for the better, but don't expect them to be sympathetic. Besides, I only use my global psychic projection powers for evil. ;P

    • Racist hardware then...
  • ...the FBI's facial recognition database is out of control, noting that "no federal law controls this technology" and "no court decision limits it."

    And that's just the way they like it, with no pesky "rules" or "laws" to hamper their activities.

  • by Zibodiz ( 2160038 ) on Monday March 27, 2017 @07:33PM (#54123351) Homepage
    The algorithms generally tend to identify an individual based on relative (i.e. comparison-based) facial structure. In order to have a facial structure to measure, the algorithm must identify the eyebrows, eyes, nose, jawline, hairline, etc. The way it identifies these features is based on contrast from one element to another; when hair and skin color are similar (such as with a platinum blond like me, or with a dark-haired, dark-skinned individual), there's a lot less contrast there to measure.
    Furthermore, with darker skin, shadows aren't as noticeable, so the shadows that would otherwise allow measurement of eyebrow prominence or jawline will also be much harder to identify.
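
    That dependence on contrast is easy to see with the classic Haar-cascade detectors, which score light/dark differences between rectangular regions (not necessarily what the FBI runs, just an illustration). A short Python/OpenCV sketch with a hypothetical input photo; the cascade files ship with OpenCV:

      import cv2

      # Hypothetical input photo, loaded as grayscale.
      gray = cv2.imread("portrait.jpg", cv2.IMREAD_GRAYSCALE)

      # Haar cascades live or die on local intensity contrasts, so a
      # low-contrast face gives them very little to latch onto.
      cascades = cv2.data.haarcascades
      face_det = cv2.CascadeClassifier(cascades + "haarcascade_frontalface_default.xml")
      eye_det = cv2.CascadeClassifier(cascades + "haarcascade_eye.xml")

      for (x, y, w, h) in face_det.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
          eyes = eye_det.detectMultiScale(gray[y:y + h, x:x + w])
          print(f"face at ({x}, {y}), {len(eyes)} eye(s) located")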
  • The FBI only has a criminal database to look at internally. So that is the color-of-law talking point if asked about any such photo issues.
    The FBI has access to a lot of other databases to search for images. The databases do not belong to the FBI so are not considered part of the FBI.
    A lot of color-of-law words and terms are used.
    The US face collection is public, mil, private, social media. The talking point on such access is always that the FBI only has criminal photographs in their repository.
    Not
    • The reboot of Hawaii Five-0 depends on Steve McGarrett and Co. looking up a TON of info on suspects on their fancy touchscreen computer. Knowing how each jurisdiction is essentially its own Tower of Babel, I doubt that something like that is actually doable these days, but the possibilities are in place, as demonstrated by this article.

      • by AHuxley ( 892839 )
        The jurisdiction issue has been worked around.
        A lot of the new data is from CCTV upgrades that keep images for many months.
        Private security uses CCTV or walks out in front of their building and takes a photo. The photo is submitted to the city or state.
        The city or state then offers further database access to get federal funding.
        It's not a federal database, and it's not really a city or state funded database if questions are ever asked. A hidden public-private partnership that allows a lot of access a
  • Databases you may have volunteered info to: 43 Things, Academia.edu, About.me, Advogato, aNobii, AsianAvenue, aSmallWorld, Athlinks, Audimated.com, Badoo, Bebo, Biip.no, BlackPlanet, Bolt.com, Busuu, Buzznet, CafeMom, Care2, CaringBridge, Classmates.com, Cloob, ClusterFlunk, CouchSurfing, CozyCot, Crunchyroll, Cucumbertown, Cyworld, DailyBooth, DailyStrength, delicious, DeviantArt, Diaspora*, Disaboom, Dol2day, DontStayIn, Draugiem.lv, douban, Doximity, Dreamwidth, DXY.cn, Elftown, Ello, Elixio, English, Eper
  • On its own, face recog is pretty crap due to high false-positive rates and often poor-quality input footage. But if you're trying to identify someone from footage of a specific location and have the location metadata to isolate a pool of potential matches first, and then use face recog to narrow down that list, you have a good chance of ID'ing a perp if they are on the grid.
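
    Purely as a hypothetical sketch of that two-stage idea (the tuple layout, the was_near callback, and the face embeddings are all invented for illustration, not any real agency or vendor API):

      import numpy as np

      def shortlist(probe_vec, probe_place, probe_time, candidates, was_near, radius_km=5.0):
          # candidates: (candidate_id, location_history, face_vector) tuples.
          # Stage 1: cut the pool down with location metadata first.
          nearby = [c for c in candidates
                    if was_near(c[1], probe_place, probe_time, radius_km)]
          # Stage 2: rank only that small pool by face-embedding distance,
          # so the recognizer's false positives matter far less.
          return sorted(nearby, key=lambda c: float(np.linalg.norm(c[2] - probe_vec)))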

    • by niks42 ( 768188 )
      ANPR (Automatic Number Plate Recognition) is deployed on UK roads. Between 25 and 40 MILLION license plates are read every day, locating vehicles across 270,000 miles of roads. The police retain the plate reads for two years, and of course they capture every plate that passes by.

      That means that 70% of journeys are recorded. Stick that data together with facial recognition and you know where everyone has been for the last two years.

      If you add mobile phone data and GPS data from Google to that, sheesh, I could tell you wh
