...but, on the other hand, government spyware is way off the Orwellian deep end already.
Lawsuits seem about the only way to arrive at a sane analysis for balancing liberty with minimizing crime, and then feed that back into legislation.
Maybe some want to live in a panopticon, whereas others prefer a more traditional feel. One hopes that we can let people sort themselves by taste.
Wrongful arrest is wrongful arrest no matter whether it was due to some fault in a computer system, a jackass abusing their power, or even just an innocent mistake in some paperwork.
In some cases it can be caught early and corrected and maybe all that's needed is an apology. In other cases, the only way to really fix the problem is for someone to have to pay.
Wrongful arrest is wrongful arrest no matter whether it was due to some fault in a computer system, a jackass abusing their power, or even just an innocent mistake in some paperwork.
Yep, at least he wasn't thrown in federal prison for 20 years.
(...to share a designed-for-3-inmates cell with 5 large gangbangers).
I was just thinking that this is so much like some software development. I.e., I don't know what the root cause of the bug is, but I found something suspicious, I changed the code, and checked it in. I didn't test it myself, because that's time-consuming and it gets in the way of the fast response I make so that the boss thinks I'm not a slacker. I can't say I don't know what the bug's cause was, because the boss will think I'm incompetent, and the customer won't like that as an answer. Upper management is happy.
Nothing will deter abuse of the Constitution until the blue thugs are, themselves, subject to imprisonment and decertification for life. Like any career criminal, they never change unless met with greater force.
"Tough on crime" as a slogan gets you re-elected much easier than "respect for the people". Even worse as a campaign slogan would be "respect for people who may not even look like you or live in your neighborhood."
The paraphrase is accurate (according to the fine complaint):
Defendant Bussa arranged a photo lineup with a security contractor who was not even present in the store on the day of the crime, and who had only watched the same grainy surveillance video that was already in Detective Bussa’s possession. When that person picked Mr. Williams from the lineup . . . . The warrant request was faulty and misleading because Defendant Bussa hid the fact that the person who picked Mr. Williams out of the lineup had never actually seen the shoplifter in person.
Since the security guard wasn't even in the building on the day of the incident (and neither was the man who was arrested), it follows that the guard could not have seen him.
1) The guard never saw him because it was later established the man wasn't there.
2) The guard wasn't a witness to the crime, so why does his testimony matter?
"Qualified immunity" (q.v.) means they can't be held to account unless there's a court precedent with near-identical facts. I'd be surprised if face recognition technology has been around long enough for that to have happened already.
While this is true, the police department passes its costs up the chain to the city, which in turn dutifully pays out taxpayers' money to fund the lawsuits, knowing that it can't do anything to actually stop the bad behavior because it has no control over the police (see QI again), and the police associations prevent group bad behavior from being acted upon too.
It's the continuing narrative that places that have been blue for decades (or half a century plus) must cope with evil, racist police with no ability to do anything about it.
Ironically, the heavily blue areas blame police unions for this while also heavily promoting unions.
Even more ironically, people keep putting the same do-nothing "leaders" back in office locally and keep doing their level best to export the failed policies as far and wide as possible.
Finally, when called out on this insanity, they simply...
It is not a wrong narrative though. In Chicago, as an example, the mayor asked the police for a report on how many times they served no-knock warrants on the wrong address. The police didn't bother answering, and when they were finally forced to produce a report, it was missing several addresses where wrong raids were known to have happened.
Far from refuting what I said, you're actually reinforcing my point about putting do-nothing leaders with failed policies back in office. Throwing up your hands saying, "my subordinates will not do what I tell them to, I guess I tried" means either you need new subordinates, or you don't need to be doing the job.
The point is that they don't view themselves as her subordinates. Also, no one says she stopped trying, but they dragged the process out in an attempt to avoid accountability. For some reason they don't want to give up their ability to bash down doors at any hour they feel like.
There are also issues with arbitration clauses that can reverse penalties that are out of line with ones issued in the past, which is problematic if, in the past, the police got almost no punishment for excessive force.
So stop accepting said arbitration clauses in the contract?
I realize negotiations aren't easy, and police unions really DO have a lot of power, but the "we can't do anything" mindset just ensures the cycle will continue. At some point, the politicians running the cities are going to have to choose between the police union and the people they ostensibly serve. Break them like Reagan broke the air traffic controllers. Of course, that would be politically disadvantageous for those politicians, wouldn't it?
The police have a fun set of pressure tactics that can make a politician's life miserable when they don't get what they want. They are also really good at spinning the arbitration clauses as a good thing. But I agree with you...
The point is that they don't view themselves as her subordinates.
Their view of the subject doesn't really matter, does it? "My boss is an idiot, fuck that guy" doesn't tend to do very well when the boss is willing to do something about that attitude.
Also, no one says she stopped trying, but they dragged the process out in an attempt to avoid accountability. For some reason they don't want to give up their ability to bash down doors at any hour they feel like.
If only they had access to some resources like subpoenas or state/federal authorities who could simply seize the information being withheld. The Chicago police department may think themselves to be above the law, but they only are as long as they are allowed to be. I'm sorry you can't see that the politicians who claim to be...
It's because police unions don't act as a union, they act as a bad faith bully. This is evident in tactics like "well then we won't respond to 911 calls" and "we'll stop preventing crime" as though they do anything but show up afterward.
Don't know about the locality where this happened, but Baltimore, Maryland, USA, hasn't controlled the Baltimore City PD since 1860. It's actually a state agency.
While true, a bill became law in Maryland last Friday to return control of the Baltimore City PD to the city. I don't know the details and I only heard about it because it was one of many bills that became law either without the Governor's signature or through veto overrides and that made national news.
And at some point, the city's budget for the police department *does* run out, and the police pensions get a severe reduction when the city declares bankruptcy and the federal pension backup system takes over the pensions.
City police pensions are usually held at the state level with all the other police in the state, or in some cases, all the other public employees in the state. It would be better to eliminate qualified immunity and allow lawsuits to hit the value of the individual's pension.
We need to do something about it... but we also need a police force. The police force as a career could easily reach a point where no qualified candidates wanted to work there.
I don't have a good answer.
Qualified immunity is bad. But so is losing your pension to a false accusation after 11 years work.
Qualified immunity is protection from criminal/civil action. It doesn't mean you can't be fired or fined, employment contracts can't be revised, laws can't be changed etc.
No, but the Police Associations and the contracts written in the "Law and Order"-obsessed 1990s deal with the whole issue of firing police who commit what, to everyone else, is obvious malpractice on the job.
We need wholesale reform of LE in this country. Unfortunately, those who promoted defunding the police, together with those who adopted the slogan to cover their own sense of what reforms are needed, did everyone a huge disservice in the last year and made it harder to talk about that. But QI, the 1990s...
No, but the Police Associations and the contracts written in the "Law and Order"-obsessed 1990s deal with the whole issue of firing police who commit what, to everyone else, is obvious malpractice on the job.
Let me preface this by saying: I agree in principle with your general point. Our police are militarized. Soldiers make (and always have made) awful police officers, because the primary mission of a soldier is "kill people and break things," which is decidedly at odds with "serve and protect."
That said, it's disingenuous to blame "contracts written in the 90s" for today's ills, when it's the mindset that began in the 90s (actually, it's probably the 70s) that is the problem. No labor contract lasts for three decades.
Important note - the police department and the city are the same entity. The city absolutely has control over police - someone from the city side has to deal with the bad faith police union representatives.
Qualified immunity doesn't protect them from being punished by citizens in various ways, all legal: banning them from stores, following them around, recording them while off duty, freezing them out / ostracizing them from society, etc. If citizens really cared, this is the only way to punish cops who hide behind qualified immunity when, for instance, shooting unarmed people during a welfare check. The cops can quit their job, but that punishment will still haunt them for life. Not even moving could prevent that.
It's from Wikipedia, but here we go: "... extending to all [officials] but the plainly incompetent or those who knowingly violate the law".
If a police officer tells a security guard to pick someone out of a lineup, _knowing_ that the security guard hasn't witnessed any crime, and that the security guard doesn't have any information that the police officer doesn't have, then this is either plainly incompetent, or an intentional attempt to get a warrant when getting a warrant wasn't justified, which is exactly the kind of conduct qualified immunity does not protect.
Algorithmic bias is a proper term for this type of problem. Algorithmic bias, as it applies to race, would be racial algorithmic bias. Or racial bias, for short. Most people would understand the term's usage within this context.
I find it a weird abuse of the word "bias". Bias is a systematic error, meaning that, for example, Mr. Johnson would be consistently misidentified as Mr. Peters. Merely making a random error is not bias. Where exactly would you see "racial bias" anyway? Would Asian people be consistently misidentified as black people, for example? This doesn't seem to be the case here. From what I understand, he was misidentified for another black person.
It's because mediafolk with mediocre education and basic math and science training are now AI experts.
Now they can use words without understanding what they mean, as long as they get the desired reaction.
What happened to the guy is messed up, but this whole AI "race bias" thing is stupid and shows serious ignorance about how statistical learning works.
The point is that the person who trained the system or who selected the training data had biases. Maybe they selected a training set that had mostly white faces but didn't notice because it looked "normal" to them, causing the system to make errors more often when dealing with black faces. The system then recapitulates the biases of the people who worked on it. It's really not that hard to understand.
The point is that the person who trained the system or who selected the training data had biases
The bias of a user (and the user may very well be biased) has little to do with the errors of a system.
causing the system to make errors more often when dealing with black faces
That's very much like some instruments having a different random error at different points of their measurement range, which is *still* not a bias. It just forces the manufacturer to specify the worst of these random errors in the datasheet as the instrument's error.
That's very much like some instruments having a different random error at different points of their measurement range, which is *still* not a bias. It just forces the manufacturer to specify the worst of these random errors in the datasheet as the instrument's error.
A systematic error over a particular range in a random sample of values is pretty much the textbook definition of bias.
The bias of a user (which may very well be biased) has little to do with errors of a system.
How the errors get into the system is irrelevant to whether the system is biased.
That's very much like some instruments having a different random error in different points of their measurement range, which is *still* not a bias.
Except, in this case, the randomness differs based upon race. The system isn't doing anything wrong by not being as accurate with black people. But, the system does not handle different races equally. That is all that matters in this context.
There IS a systematic error here. There is randomness in how the AI does the matching. There is no randomness in the fact that the AI cannot perform as well on some races. THAT is the systematic error here.
But that's not a systematic error. A systematic error would be for example labeling white people regularly as black people but not labeling black people regularly as white people. Randomness is not systematic error if randomness doesn't move the expected value.
You're arguing against the reality of the situation, and I don't know what words to use for you to understand where your flaw in logic is. I've already explained why to the best of my capabilities. I can only hope someone else can do it better than I.
No, so far I haven't argued about the REALITY of the situation at all. I've been talking merely of MISNOMERS used by some to describe it. And, quite logically, you can't know what words to use to point out a non-existing flaw.
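For what it's worth, the two definitions being argued past each other can be written down. In measurement terms, bias is a shift in the expected value of the error, while random error is the spread around it; the open question in this thread is whether a group-dependent error *rate* counts as the former. A toy sketch, with all readings invented for illustration:

```python
import statistics

# Two hypothetical instruments measuring a true value of 10.0
# (every reading below is invented for illustration).
true_value = 10.0

# Instrument 1: noisy but unbiased -- the errors average out to ~zero.
readings_1 = [9.8, 10.2, 9.9, 10.1, 10.0]

# Instrument 2: equally noisy, but it systematically reads high.
readings_2 = [10.3, 10.7, 10.4, 10.6, 10.5]

bias_1 = statistics.mean(readings_1) - true_value   # ~0.0: random error only
bias_2 = statistics.mean(readings_2) - true_value   # ~0.5: a systematic shift
spread = statistics.stdev(readings_1)               # the random error remains

print(round(abs(bias_1), 6), round(bias_2, 6))      # 0.0 0.5
```

Both instruments are equally "noisy" (same spread), but only the second has a nonzero expected error, which is the textbook sense of bias the thread keeps circling.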
Bias is irrelevant here. Even for non-black people the results will be poor and the recognition system will most likely pick someone who had nothing to do with this incident.
Poor image quality is a legitimate complaint. But there is no "racial bias" in computer software.
Yes, there is if you do not properly train the algorithm. If you train it with predominantly white faces it will be objectively poorer at matching black people's faces making it literally racially biased. That bias clearly is not the fault of the algorithm but the person who trained it. Nevertheless, the result is software that is less likely to accurately match certain races making it racially biased.
Yeah, further: I was reading about a machine-learning session which was trying to teach the machine what a Party is. So images of classrooms, parties, meetings, etc., but the researchers forgot to include any black people in the Party images. So the machine concluded that if there are any black people in the picture, it can't be a Party. That was an unintentional oversight on the part of the developers that translated into a racial bias in the algorithm.
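Whatever one calls it, the effect described above is easy to surface: break the evaluation results down by group and compare error rates. A minimal sketch where the group names, counts, and outcomes are all invented for illustration:

```python
# Hypothetical face-match outcomes tagged by demographic group.
# All group names, counts, and outcomes here are invented for illustration.
results = (
    [("group_a", True)] * 95 + [("group_a", False)] * 5 +    # 5% error rate
    [("group_b", True)] * 80 + [("group_b", False)] * 20     # 20% error rate
)

def error_rate_by_group(results):
    """Return {group: fraction of incorrect matches} for tagged outcomes."""
    totals, errors = {}, {}
    for group, correct in results:
        totals[group] = totals.get(group, 0) + 1
        if not correct:
            errors[group] = errors.get(group, 0) + 1
    return {g: errors.get(g, 0) / totals[g] for g in totals}

print(error_rate_by_group(results))   # {'group_a': 0.05, 'group_b': 0.2}
```

A skewed training set shows up exactly like this: the model isn't wrong at random across the whole population, it is wrong far more often for the underrepresented group.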
Yeah, this is pretty dumb. It has nothing to do with facial recognition, it's entirely reasonable to use facial recognition (or any other unreliable witness statement) as a guide to choose a possible suspect. It's what happens after that which was the issue here, apparently.
It's the policing version of those "(generic, obvious thing)...on the internet!" patents we used to joke about. Bad police work... after facial recognition!
>> it's entirely reasonable to use as a guide
Sure, if you're from a species that hasn't proven it has a fucking massive hardon for "the computer said".
The consequences range from mild inconvenience to fatality, because the scenario is universal, not unlike situations with "an accident", a word with a similarly wide range of impact.
The difference is that one of the incompetences above is avoidable. Not by futilely trying to improve or educate the idiot human operator and his immutable human nature, of course...
Part of the problem with digital vs human facial recognition is the set of faces being compared against. In this case, he was matched to his drivers license photo. Will a human really go through the database of all drivers license photos in the entire state and claim to pick a match? Of course not. There are aspects of this that are unique to the scale, methodologies, and obtuse nature of technological systems.
There's a statistic claiming that among any million people, there is on average _one_ who looks identical to you. So identical that your mum, or your brother, or your husband/wife could be shown a photo of that person, or could meet them in person, and couldn't tell it wasn't you.
There was a case where a young woman was arrested for shoplifting and hitting a security guard, was identified by two security guards, got convicted and was extremely lucky that the case, with her photo was reported in the newspaper: Because the real culprit saw it, came forward, and she was set free. They showed photos of both side by side, and they were identical. Closer than identical twins usually are.
With that statistic, if you took a high quality photo of any bank robbery in the USA and compared it to 200 million license photos, you could make 200 arrests immediately. With at least 199 people being innocent, so at that point the similarity alone cannot be enough.
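The arithmetic above is the classic base-rate problem, and it checks out if you take the one-in-a-million doppelganger figure at face value (it is the parent's claim, not an established statistic):

```python
# Back-of-envelope check of the parent's numbers. The one-in-a-million
# "doppelganger" rate is the parent's claim, not an established figure.
lookalikes_per_million = 1
database_size = 200_000_000            # license photos searched

expected_matches = database_size * lookalikes_per_million / 1_000_000
print(expected_matches)                # 200.0 plausible lookalikes

# If exactly one of those matches is the real culprit, the chance that
# any single match is the right person is tiny:
p_culprit_given_match = 1 / (expected_matches + 1)
print(round(p_culprit_given_match, 4))   # 0.005
```

So even a "perfect" lookalike match against a state-sized database is, on its own, roughly half-a-percent evidence of guilt, which is why a match can only ever be a lead, not probable cause.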
Cops keep making this same mistake over and over and over again.
Some new tech comes along. Fingerprints, DNA, IP addresses, ANPR, cell phone tracking, facial recognition. They buy into the hype that it's infallible and does their job for them, and start arresting people. Sometimes those people even get convicted by clueless juries, influenced by media hype and CSI.
They buy into the hype that it's infallible and does their job for them
...or they are aggressively sold the hype on the new technology by unscrupulous companies with large, effective marketing teams who want to make money and the police lack the independent scientific background to know better. This is made harder by the fact that usually, the new tech is useful and really can help them do their jobs better if handled properly which makes the hype they are sold far more believable.
It's the same ploy drug companies pull on doctors to sell their new wonder drug that they hav
The police have always used facial recognition by humans, both by the cops and the general public. This facial recognition is also unreliable.
So in all cases, either by computer program or by humans, you need corroborating evidence.
By corroborating evidence, you mean the business that has a $4000 8K HDTV for the front lobby visitors, while continuing to budget for My-First Security cameras made by Hasbro? 'Cause you know... priorities.
Basically the computer looked at the image and spit out a name. The police then issued a warrant for that person, and arrested him.
In interrogation, the guy looks at the surveillance photo, holds it up to his face and asks the interrogator if they look alike. The only similarity was "they were both black".
In fact, the police compared that image against the stored image of the guy and in a few seconds, you could tell the computer match was false.
In other words, someone could've spent 30 seconds comparing the image in the surveillance video with the guy's image, and basically tell they were different.
The computer spit out a name. No one took a few seconds to see if the two people were even remotely similar to each other. It was a shoplifting case. I'm sure you could afford a few seconds.
That is not the case.
The computer identified him.
Then they set up a block of pictures that included him.
From the block of pictures, he was identified by one of the store's security people.
At that point, they got a warrant and he was arrested.
You've already misunderstood what actually happened; and if you do compare the two pictures, there is a lot of similarity between the two.
The problem is also one of scale. Human eyeballs possessed by police are in limited supply and expensive. You can't create a panopticon using police eyeballs without obviously creating a police state. People would resist that.
Cameras hooked up to facial recognition systems are much more abundant and cheaper. You can have your total surveillance of the population without all the problematic visuals of secret police tailing people everywhere and at much less expense.
by Anonymous Coward on Tuesday April 13, 2021 @10:14PM (#61270970):
Any IT job that you interview for, they ask you about your *arrest* record. Not convictions. Mainly because HR feels that one can buy an acquittal, but if a cop thinks you are guilty enough to pull out the handcuffs, you are guilty.
An -arrest- for shoplifting can completely fuck up someone's future with regard to jobs. Even if expunged, all the third-party databases will have this entry in them forevermore, as well as the mug shots, which employers will find and use to ensure the candidate does not get hired, under the "where there is smoke, there is fire" pretense.
I served on a jury and we voted to acquit. During deliberations, one guy absolutely refused to do this despite the lack of evidence and a poor case by the prosecutor. He came into the trial assuming the defendant must be guilty because he had been arrested, and would not change his mind. He eventually talked to the judge; I don't know what he said, but he was removed from the jury and an alternate juror was added.
The department then produced a photo lineup that included Williams' picture, showed it to a security guard who hadn't actually witnessed the shoplifting incident, and obtained a warrant when that guard picked him from the lineup.
"Will you please pick the man you didn't see commit the crime from these pictures."
Laziness, incompetence and malfeasance should be grounds for government job termination.
Mr. Williams and his attorneys will probably make a few bucks from THE TAXPAYER, but the real folks responsible for this travesty are lazy police detectives. If policy doesn't say that at least 3 humans have to sign off that the crime image matches a suspect's driver's license photo or prior mug shot, then police higher-ups should be fired too.
That still means putting everything on a possible photo match. Using facial recognition as one investigative tool is fine, but it can't be sufficient for an arrest. There needs to be more than just two photos looking like they might be of the same person, no matter how confident the match. As a starting point it's fine, it's a lead worth following, but that's about it.
The natural response should be mass rioting in the urban shit hole that is Detroit. Throw some Molotov cocktails around, loot some minority-owned businesses, throw plastic water bottles loaded with cement at cops, set cars on fire, and the like. You know, the usual BLM bullshit.
That said, this guy obviously has a legitimate gripe and this never should have happened. Fuck the police!
Waiting for the idiots to come in saying it's working as designed, and that it's not discrimination therefore there are no problems anywhere in the chain of responsibility.
failed to mention the system's shortcomings -- including [...] known racial bias of recognition systems
If it fails to tell one person from another of the same race (damn I loathe the USA obsession with that word), is it really racial bias? Because I am guessing that this is not an instance of the video showing a white guy and the algo saying "it's likely the black guy, it usually is".
FR algos are not perfect, and people's application and interpretation of it is not perfect. In order to improve either issue, they should probably not be conflated.
If it fails to tell one person from another of the same race (damn I loathe the USA obsession with that word), is it really racial bias? Because I am guessing that this is not an instance of the video showing a white guy and the algo saying "it's likely the black guy, it usually is".
In this case they had a grainy photo, and the algorithm said "with this low quality photo, it is possible that Mr. Smith was the person on the photo". It should have also said "probably one in twenty random people could also be the person on that photo".
The real problem wasn't the photo, but the fact that the photo didn't justify a warrant, and a police officer basically created fake evidence out of thin air to justify a warrant.
If you look like a robber and a security guard thinks it was you, and you get arrested as a result and released after 30 hours, that's just bad luck. But if there is a photo that doesn't justify an arrest, and a police officer lets a security guard who never saw the robbery (and can't possibly know it was you) pick you out of a lineup, and the police officer then lies to the judge about exactly what he did to get a warrant (because I'm sure the judge would never have signed the warrant had he or she known the correct facts), then it is not bad luck; it is something the police force should pay you compensation for.
Exactly! The problem was shoddy (I'm being kind) police work. A possible photo match may be enough to question someone, but an arrest? Hell no. I'm not sure why the judge shouldn't be sued as well; unless the detective presented a stack of well-crafted lies, the judge did not do his job.
"...and the known racial bias of recognition systems."
30 years from now when AI is smart enough to make humans look like a goldfish by comparison, are we still going to be making "racial bias" excuses for recognition systems, or is there perhaps some truth to that statement that humans can look alike?
Case 1: Future employer asks police about arrests. That question should come up clean. This arrest should never be mentioned, because only the wrong conclusions can be drawn.
Case 2: Someone calls your employer, telling them you are a criminal because they saw you being arrested (and as far as the caller knows they are telling the truth). Then YOU should be able to call the police, and they should be able to provide the true facts - that you were arrested because a police officer faked evidence against you.
I think that's how it works in Germany; YOU can call the police and ask them, and they can give YOU a complete list of everything they know about you, including illegal arrests, proven or likely false allegations, and of course convictions where you served time 20 years ago. All things that your future employer would not be told.
Germany is a country that doesn't suck. Here in the US, as soon as an arrest is made and the mugshots/fingerprints/DNA tests hit the databases, private companies slurp that stuff off and send it offshore. Even if charges are dropped or it was obviously a mistake, that arrest is still permanently on the record and cannot be removed. Then, during business hiring, those sites are checked, and if someone pops up with mug shots, they are escorted out of the interview and it is guaranteed that they will never be hired.
Do you know how many people were wrongfully arrested in the past based on their description, decades before computers were ever used in law enforcement?
Obviously, but the roots of this problem have nothing to do with facial recognition technology.
Solving this problem at its core reduces to eliminating enough of the human race that everyone personally knows everyone else alive, and therefore this kind of mistaken identity cannot occur.
Everyone knows that human memory isn't reliable! The first memory is already a vague match to the patterns that were most dominant in the person's mind at that moment, based on what had happened in the short time before. And every time it is remembered again, it is re-modulated by the ones at *that* moment. So after a while, little is left.
Intentionally limiting the input at that time, with a lineup, mostly narrows that to those people in that line. Even unintentionally.
Re:biological facial recognition (Score:5, Insightful)
"obtained a warrant when that guard who never saw him picked him from the lineup"
If you're going to quote, don't leave out the important part.
Re:biological facial recognition (Score:4, Insightful)
showed it to a security guard who hadn't actually witnessed the shoplifting incident
Note that it does not say that the guard "never saw him". It merely says the guard didn't witness the shoplifting incident.
Re:biological facial recognition (Score:5, Informative)
Defendant Bussa arranged a photo lineup with a security contractor who was not even present in the store on the day of the crime, and who had only watched the same grainy surveillance video that was already in Detective Bussa's possession. [...] When that person picked Mr. Williams from the lineup... The warrant request was faulty and misleading because Defendant Bussa hid the fact that the person who picked Mr. Williams out of the lineup had never actually seen the shoplifter in person.
https://www.aclumich.org/en/pr... [aclumich.org]
Re: (Score:2)
Re: (Score:2)
2) The guard wasn't a witness to the crime, so why does his testimony matter?
You can think if you try but you gotta try.
Right or wrong, likely to fail (Score:2)
"Qualified immunity" (q.v.) means they can't be held to account unless there's a court precedent with near-identical facts. I'd be surprised if face recognition technology has been around long enough for that to have happened already.
Re: (Score:3)
Qualified immunity protects the individuals. It does not protect the Police Department itself.
Re: (Score:3)
While this is true, the police department passes its costs up the chain to the city, which in turn dutifully pays out taxpayers' money to fund the lawsuits, knowing that it can't do anything to actually stop the bad behavior because it has no control over the police (see QI again), and the police associations prevent group bad behavior from being acted upon too.
So nothing will happen.
Re: Right or wrong, likely to fail (Score:3)
> the city ... has no control over the police
Oh really?
Re: (Score:2)
It's the continuing narrative that places that have been blue for decades (or half a century plus) must cope with evil, racist police with no ability to do anything about it.
Ironically, the heavily blue areas blame police unions for this while also heavily promoting unions.
Even more ironically, people keep putting the same do-nothing "leaders" back in office locally and keep doing their level best to export the failed policies as far and wide as possible.
Finally, when called out on this insanity, they simpl
Re: (Score:2)
It is not a wrong narrative though. In Chicago, as an example, the mayor asked the police for a report on how many times they served no-knock warrants on the wrong address. The police didn't bother answering, and when they were finally forced to produce a report, it was missing several addresses where wrong raids were known to have happened. There are also issues with arbitration clauses that can reverse penalties that are out of line with ones issued in the past, which is problematic if in the past, the
Re: (Score:2)
It is not a wrong narrative though. In Chicago, as an example, the mayor asked the police for a report on how many times they served no-knock warrants on the wrong address. The police didn't bother answering, and when they were finally forced to produce a report, it was missing several addresses where wrong raids were known to have happened.
Far from refuting what I said, you're actually reinforcing my point about putting do-nothing leaders with failed policies back in office. Throwing up your hands saying, "my subordinates will not do what I tell them to, I guess I tried" means either you need new subordinates, or you don't need to be doing the job.
There are also issues with arbitration clauses that can reverse penalties that are out of line with ones issued in the past, which is problematic if, in the past, the police got almost no punishment for excessive force.
So stop accepting said arbitration clauses in the contract?
I realize negotiations aren't easy, and police unions really DO have a lot of power, but the "we can't do anything" mindset just ensures th
Re: (Score:2)
It is not a wrong narrative though. In Chicago, as an example, the mayor asked the police for a report on how many times they served no-knock warrants on the wrong address. The police didn't bother answering, and when they were finally forced to produce a report, it was missing several addresses where wrong raids were known to have happened.
Far from refuting what I said, you're actually reinforcing my point about putting do-nothing leaders with failed policies back in office. Throwing up your hands saying, "my subordinates will not do what I tell them to, I guess I tried" means either you need new subordinates, or you don't need to be doing the job.
The point is that they don't view themselves as her subordinates. Also, no one says she stopped trying, but they dragged the process out in an attempt to avoid accountability. For some reason they don't want to give up their ability to bash down doors at any hour they feel like.
There are also issues with arbitration clauses that can reverse penalties that are out of line with ones issued in the past, which is problematic if, in the past, the police got almost no punishment for excessive force.
So stop accepting said arbitration clauses in the contract?
I realize negotiations aren't easy, and police unions really DO have a lot of power, but the "we can't do anything" mindset just ensures the cycle will continue. At some point, the politicians running the cities are going to have to choose between the police union and the people they ostensibly serve. Break them like Reagan broke the air traffic controllers. Of course, that would be politically disadvantageous for those politicians, wouldn't it?
The police have a fun set of pressure tactics that can make a politician's life miserable when they don't get what they want. They are also really good at spinning the arbitration clauses as a good thing. But I agree with you, in s
Re: (Score:2)
The point is that they don't view themselves as her subordinates.
Their view of the subject doesn't really matter, does it? "My boss is an idiot, fuck that guy" doesn't tend to do very well when the boss is willing to do something about that attitude.
Also, no one says she stopped trying, but they dragged the process out in an attempt to avoid accountability. For some reason they don't want to give up their ability to bash down doors at any hour they feel like.
If only they had access to some resources like subpoenas or state/federal authorities who could simply seize the information being withheld. The Chicago police department may think themselves to be above the law, but they only are as long as they are allowed to be. I'm sorry you can't see that the politicians who claim to b
Re: (Score:2)
Re: (Score:2)
Don't know about the locality where this happened, but Baltimore, Maryland, USA, hasn't controlled the Baltimore City PD since 1860. It's actually a state agency.
Re: (Score:2)
Re: (Score:1)
And at some point, the city's budget for the police department *does* run out, and the police pensions get a severe reduction when the city declares bankruptcy and the federal pension backup system takes over the pensions.
Re: (Score:2)
Re: (Score:1)
I have very mixed emotions about that.
We need to do something about it... but we also need a police force. The police force as a career could easily reach a point where no qualified candidates wanted to work there.
I don't have a good answer.
Qualified immunity is bad. But so is losing your pension to a false accusation after 11 years work.
Re: (Score:2)
Qualified immunity is protection from criminal/civil action. It doesn't mean you can't be fired or fined, employment contracts can't be revised, laws can't be changed etc.
Re: (Score:2)
No, but the Police Associations and contracts written in the "Law and Order"-obsessed 1990s deal with the whole issue of firing police who perform what, to everyone else, is obvious malpractice on the job.
We need wholesale reform of LE in this country. Unfortunately those who promoted defunding the police, together with those who adopted the slogan to cover their own sense of what reforms are needed, did everyone a huge disservice in the last year and made it harder to talk about that. But QI, the 1990s
This comment deserves +6. (Score:0)
Thank you for taking the time to make it.
Re: (Score:2)
No, but the Police Associations and contracts written in the "Law and Order"-obsessed 1990s deal with the whole issue of firing police who perform what, to everyone else, is obvious malpractice on the job.
Let me preface this by saying: I agree in principle with your general point. Our police are militarized. Soldiers make (and always have made) awful police officers, because the primary mission of a soldier is "kill people and break things," which is decidedly at odds with "serve and protect."
That said, it's disingenuous to blame "contracts written in the 90s" for today's ills, when it's the mindset that began in the 90s (actually, it's probably the 70s) that is the problem. No labor contract lasts for thre
Re: (Score:2)
Re: (Score:1)
Qualified immunity doesn't protect them from being punished by citizens in various ways. All legal. Banning them from stores, following them around, recording them while off duty, freezing them out / ostracizing them from society, etc. If citizens really cared this is the only way to go to punish cops that hide behind qualified immunity when for instance shooting unarmed people during a welfare check.
The cops can quit their job but that punishment will still haunt them for life. Not even moving could preve
Re: (Score:2)
If a police officer tells a security guard to pick out someone out of a lineup, _knowing_ that the security guard hasn't witnessed any crime, and that the security guard doesn't have any information that the police officer doesn't have, then this is either plainly incompetent, or an intentional attempt to get a warrant when getting a warrant wasn't justified, which is
could have the case expunged = have to pay for it? (Score:2)
could have the case expunged = have to pay for it?
At very least the cops need to cover all costs and all legal fees
Stop this B.S. already (Score:-1)
including a "woefully substandard" image and the known racial bias of recognition systems.
Poor image quality is a legitimate complaint. But there is no "racial bias" in computer software. Its not our fault that all monkeys look the same.
Re: (Score:3)
Re: (Score:2)
Re: Stop this B.S. already (Score:2)
It's because mediafolk with mediocre education and basic math and science training are now AI experts.
Now they can use words without understanding what they mean, as long as they get the desired reaction.
What happened to the guy is messed up, but this whole AI "race bias" thing is stupid and shows serious ignorance about how statistical learning works.
Re: (Score:2)
The point is that the person who trained the system or who selected the training data had biases. Maybe they selected a training set that had mostly white faces but didn't notice because it looked "normal" to them, causing the system to make errors more often when dealing with black faces. The system then recapitulates the biases of the people who worked on it. It's really not that hard to understand.
Re: (Score:2)
The point is that the person who trained the system or who selected the training data had biases
The bias of a user (who may very well be biased) has little to do with errors of a system.
causing the system to make errors more often when dealing with black faces
That's very much like some instruments having a different random error at different points of their measurement range, which is *still* not a bias. It just forces the manufacturer to specify the worst of these random errors in the datasheet as the instrument's error.
Re: (Score:2)
That's very much like some instruments having a different random error at different points of their measurement range, which is *still* not a bias. It just forces the manufacturer to specify the worst of these random errors in the datasheet as the instrument's error.
A systematic error over a particular range in a random sample of values is pretty much the textbook definition of bias.
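The distinction the two posters are arguing over is easy to make concrete. A minimal simulation (all numbers synthetic and purely illustrative): an instrument whose *random* error merely grows over part of its range still averages to the true value there, while one with a *systematic* offset over that range does not; the latter is what "bias" means.

```python
import random

random.seed(1)

TRUE_VALUE = 10.0
N = 100_000

# Instrument 1: large *random* error over this range, but zero-mean.
readings_noisy = [TRUE_VALUE + random.gauss(0, 2.0) for _ in range(N)]

# Instrument 2: small noise, but a *systematic* offset over this range.
readings_offset = [TRUE_VALUE + 0.5 + random.gauss(0, 0.2) for _ in range(N)]

mean_err_noisy = sum(readings_noisy) / N - TRUE_VALUE    # ~0: imprecise, unbiased
mean_err_offset = sum(readings_offset) / N - TRUE_VALUE  # ~0.5: biased

print(round(mean_err_noisy, 2), round(mean_err_offset, 2))
```

The noisy instrument's errors wash out on average; the offset instrument's do not, no matter how many readings you take. That is why "the error differs across the range" and "the error is systematic across the range" are different claims.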
Re: (Score:2)
Re: (Score:2)
The bias of a user (which may very well be biased) has little to do with errors of a system.
How the errors get into the system is irrelevant to whether the system is biased.
That's very much like some instruments having a different random error in different points of their measurement range, which is *still* not a bias.
Except, in this case, the randomness differs based upon race. The system isn't doing anything wrong by not being as accurate with black people. But, the system does not handle different races equally. That is all that matters in this context.
Re: (Score:2)
How the errors get into the system is irrelevant to whether the system is biased.
Perhaps true, but irrelevant here.
Except, in this case, the randomness differs based upon race.
But that's not bias. Bias is a systematic error, not a varying random error.
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Re: (Score:2)
Bias through Poor Training (Score:2)
Poor image quality is a legitimate complaint. But there is no "racial bias" in computer software.
Yes, there is, if you do not properly train the algorithm. If you train it with predominantly white faces, it will be objectively poorer at matching black people's faces, making it literally racially biased. That bias is clearly not the fault of the algorithm but of the person who trained it. Nevertheless, the result is software that is less likely to accurately match certain races, making it racially biased.
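For what it's worth, the effect is easy to reproduce with a toy model. A sketch (synthetic 1-D "features" and a 1-nearest-neighbour "recognizer", all assumptions, nothing like a real face pipeline): train on a gallery that underrepresents one group and the error rate for that group climbs.

```python
import random

random.seed(0)

def synth_feature(group):
    # Synthetic 1-D "face feature": group A centred at 0.0, group B at 1.0,
    # with heavily overlapping noise. Purely illustrative, not real data.
    centre = 0.0 if group == "A" else 1.0
    return centre + random.gauss(0, 0.6)

# Training gallery: 95% group A, 5% group B -- the imbalance under test.
gallery = [("A", synth_feature("A")) for _ in range(950)] + \
          [("B", synth_feature("B")) for _ in range(50)]

def recognise(x):
    # Toy 1-nearest-neighbour "recognizer": label of the closest gallery entry.
    return min(gallery, key=lambda entry: abs(entry[1] - x))[0]

errors = {"A": 0, "B": 0}
TRIALS = 2000
for group in ("A", "B"):
    for _ in range(TRIALS):
        if recognise(synth_feature(group)) != group:
            errors[group] += 1

# The underrepresented group gets misidentified far more often.
print(errors)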
Re: (Score:2)
Yeah, further: I was reading about a machine learning session which was trying to teach the machine what a party is. So: images of classrooms, parties, meetings, etc. But the researchers forgot to include any black people in the party images. So the machine concluded that if there are any black people in the picture, it can't be a party. That was an unintentional oversight on the part of the developers that translated into a racial bias in the algorithm.
Who Cares? (Score:-1, Flamebait)
The black was guilty of something anyway. Why not just let the arrest stand?
Police Always Used Facial Recognition by Humans (Score:5, Insightful)
The police have always used facial recognition by humans, both by the cops and the general public. This facial recognition is also unreliable.
So in all cases, either by computer program or by humans, you need corroborating evidence.
Re: (Score:3)
Yeah, this is pretty dumb. It has nothing to do with facial recognition, it's entirely reasonable to use facial recognition (or any other unreliable witness statement) as a guide to choose a possible suspect. It's what happens after that which was the issue here, apparently.
It's the policing version of those "(generic, obvious thing)...on the internet!" patents we used to joke about. Bad police work... after facial recognition!
Re: (Score:1)
>> it's entirely reasonable to use as a guide
Sure, if you're from a species that hasn't proven it has a fucking massive hardon for "the computer said".
The consequences range from mild inconvenience to fatality, because the scenario is universal, not unlike situations with "an accident", a word with a similarly wide range of impact.
The difference is one of the incompetences above is avoidable. Not by futilely trying to improve or educate the idiot human operator and his immutable human nature, of cours
Re: Police Always Used Facial Recognition by Human (Score:3)
Part of the problem with digital vs human facial recognition is the set of faces being compared against. In this case, he was matched to his drivers license photo. Will a human really go through the database of all drivers license photos in the entire state and claim to pick a match? Of course not. There are aspects of this that are unique to the scale, methodologies, and obtuse nature of technological systems.
Re: Police Always Used Facial Recognition by Human (Score:5, Informative)
There was a case where a young woman was arrested for shoplifting and hitting a security guard, was identified by two security guards, got convicted and was extremely lucky that the case, with her photo was reported in the newspaper: Because the real culprit saw it, came forward, and she was set free. They showed photos of both side by side, and they were identical. Closer than identical twins usually are.
With that statistic, if you took a high quality photo of any bank robbery in the USA and compared it to 200 million license photos, you could make 200 arrests immediately. With at least 199 people being innocent, so at that point the similarity alone cannot be enough.
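The parent's arithmetic, spelled out (the one-in-a-million lookalike rate is the parent's hypothetical, not a measured figure for any real system):

```python
# Hypothetical numbers, following the parent comment.
population = 200_000_000        # licence photos searched
lookalikes_per_million = 1      # assumed lookalike rate: 1 in a million

expected_false_matches = population * lookalikes_per_million / 1_000_000
print(expected_false_matches)   # 200.0 innocent lookalikes

# Even if the real culprit is in the database and matches too, any one
# hit is the culprit with probability only:
p_actual_culprit = 1 / (1 + expected_false_matches)
print(round(p_actual_culprit, 4))   # 0.005
```

This is the classic base-rate problem: a tiny per-person false-match rate multiplied by an enormous database still yields hundreds of innocent hits, so a match by itself is nearly worthless as evidence.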
Re:Police Always Used Facial Recognition by Humans (Score:5, Insightful)
Cops keep making this same mistake over and over and over again.
Some new tech comes along. Fingerprints, DNA, IP addresses, ANPR, cell phone tracking, facial recognition. They buy into the hype that it's infallible and does their job for them, and start arresting people. Sometimes those people even get convicted by clueless juries, influenced by media hype and CSI.
Aggressive Marketing (Score:2)
They buy into the hype that it's infallible and does their job for them
It's the same ploy drug companies pull on doctors to sell their new wonder drug that they hav
Re: (Score:2, Insightful)
The police have always used facial recognition by humans, both by the cops and the general public. This facial recognition is also unreliable.
So in all cases, either by computer program or by humans, you need corroborating evidence.
By corroborating evidence, you mean the business that has a $4000 8K HDTV for the front lobby visitors, while continuing to budget for My-First Security cameras made by Hasbro? 'Cause you know... priorities.
Re:Police Always Used Facial Recognition by Humans (Score:4, Insightful)
But in this case it was particularly egregious.
Basically the computer looked at the image and spit out a name. The police then issued a warrant for that person, and arrested him.
In interrogation, the guy looks at the surveillance photo, holds it up to his face and asks the interrogator if they look alike. The only similarity was "they were both black".
In fact, the police compared that image against the stored image of the guy and in a few seconds, you could tell the computer match was false.
In other words, someone could've spent 30 seconds comparing the image in the surveillance video with the guy's image, and basically tell they were different.
The computer spit out a name, and no one took a few seconds to see if the two people were even remotely similar to each other. It was a shoplifting case. I'm sure you could afford a few seconds.
Re: (Score:1)
The computer identified him.
Then they set up a block of pictures that included him.
From the block of pictures, he was identified by one of the store's security people.
At that point, they got a warrant and he was arrested.
You have misunderstood what actually happened. And if you do compare the two pictures, there is a lot of similarity between the two.
Re: (Score:2)
The problem is also one of scale. Human eyeballs possessed by police are in limited supply and expensive. You can't create a panopticon using police eyeballs without obviously creating a police state. People would resist that.
Cameras hooked up to facial recognition systems are much more abundant and cheaper. You can have your total surveillance of the population without all the problematic visuals of secret police tailing people everywhere and at much less expense.
Even without malintent, having ubiquito
Hope he wins... (Score:5, Insightful)
Any IT job that you interview for, they ask you about your *arrest* record. Not convictions. Mainly because HR feels that one can buy an acquittal, but if a cop thinks you are guilty enough to pull out the handcuffs, you are guilty.
An -arrest- for shoplifting can fuck up someone's future with regards to jobs completely. Even if expunged, all the third-party databases will have this entry in them forevermore, as well as the mug shots, which employers will find and use to ensure the candidate does not get hired, under the "where there is smoke, there is fire" pretense.
Re: (Score:0)
2021, allegation is everything
lol jk been here for years
Re: (Score:2)
I served on a jury trial and we voted to acquit. During deliberations, one guy absolutely refused to do this despite the lack of evidence and a poor case by the prosecutor. He came into the trial assuming the defendant must be guilty, because he was arrested, and would not change his mind. He eventually talked to the judge; I don't know what was said, but he was removed from the jury and an alternate juror was added.
I think he should count himself lucky (Score:-1)
he didn't get shot or beaten to death.
Re: (Score:-1)
And so he should let the racist piece of shit pigs get away with it?
No.
And go fuck yourself.
Re: (Score:0)
sarcasm is complicated on the internet.
Re: (Score:2)
whoosh
Wat? (Score:3)
The department then produced a photo lineup that included Williams' picture, showed it to a security guard who hadn't actually witnessed the shoplifting incident, and obtained a warrant when that guard picked him from the lineup.
"Will you please pick the man you didn't see commit the crime from these pictures."
Qualified Immunity (Score:0)
Laziness, incompetence and malfeasance should be grounds for government job termination.
Mr. Williams and his attorneys will probably make a few bucks from THE TAXPAYER, but the real folks responsible for this travesty are lazy police detectives. If policy doesn't say that at least 3 humans have to sign off that the crime image matches a suspect's driver's license photo or prior mug shot, then police higher-ups should be fired too.
But in the end, the taxpayer will take it.
Re: (Score:2)
Looking at the current live news... (Score:-1)
That said, this guy obviously has a legitimate gripe and this never should have happened. Fuck the police!
Re: (Score:0)
Sounds like you memorized your handbook. I expect you and your buddies brought your "BLM 4evar!" shirts and everything.
Working as designed (Score:2)
Conflated issues (Score:2)
failed to mention the system's shortcomings -- including [...] known racial bias of recognition systems
If it fails to tell one person from another of the same race (damn I loathe the USA obsession with that word), is it really racial bias? Because I am guessing that this is not an instance of the video showing a white guy and the algo saying "it's likely the black guy, it usually is".
FR algos are not perfect, and people's application and interpretation of it is not perfect. In order to improve either issue, they should probably not be conflated.
Re:Conflated issues (Score:4, Insightful)
If it fails to tell one person from another of the same race (damn I loathe the USA obsession with that word), is it really racial bias? Because I am guessing that this is not an instance of the video showing a white guy and the algo saying "it's likely the black guy, it usually is".
In this case they had a grainy photo, and the algorithm said "with this low quality photo, it is possible that Mr. Smith was the person on the photo". It should have also said "probably one in twenty random people could also be the person on that photo".
The real problem wasn't the photo, but the fact that the photo didn't justify a warrant, and a police officer basically created fake evidence out of thin air to justify a warrant.
If you look like a robber and a security guard thinks it was you, and you get arrested as a result and released after 30 hours, that's just bad luck. But if there is a photo that doesn't justify an arrest, and a police officer lets a security guard who has not seen the robbery and can't possibly know it was you pick you out of a lineup, and the officer then lies to the judge about exactly what he did to get a warrant (because I'm sure the judge would never have signed the warrant if he or she had known the correct facts), then it is not bad luck; it is something the police force should pay you compensation for.
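To put numbers on that "one in twenty" point (both figures below are assumptions for illustration; the city size is roughly Detroit's):

```python
# Both figures are assumptions: the 1-in-20 rate is the hypothetical
# above, and 600,000 is roughly the population of Detroit.
city_population = 600_000
match_one_in = 20

consistent_people = city_population // match_one_in
print(consistent_people)        # 30000 residents fit the grainy photo

# Absent any other evidence, the photo alone points at any one of them
# with probability:
print(1 / consistent_people)
```

Which is why a low-quality match can justify further investigation, but never, on its own, a warrant.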
Re: (Score:3)
AI; racist, or accurate? (Score:0)
"...and the known racial bias of recognition systems.
30 years from now when AI is smart enough to make humans look like a goldfish by comparison, are we still going to be making "racial bias" excuses for recognition systems, or is there perhaps some truth to that statement that humans can look alike?
Re: (Score:2)
What's in a police database? (Score:4, Insightful)
Case 2: Someone calls your employer, telling them you are a criminal because they saw you being arrested (and as far as the caller knows they are telling the truth). Then YOU should be able to call the police, and they should be able to provide the true facts - that you were arrested because a police officer faked evidence against you.
I think that's how it works in Germany; YOU can call the police and ask them, and they can give YOU a complete list of everything they know about you, including illegal arrests, proven or likely false allegations, and of course convictions where you served time 20 years ago. All things that your future employer would not be told.
Re: (Score:0)
Germany is a country that doesn't suck. Here in the US, as soon as an arrest is made, and the mugshots/fingerprints/DNA tests hit the databases, private companies slurp that stuff off and send it offshore. Even if charges are dropped or it was obviously a mistake, that arrest is still permanently on the record, and cannot be removed. Then, during business hiring, those sites are checked, and if someone pops up with mug shots, they are escorted out of the interview and they are guaranteed that they will n
Oh, please (Score:2)
Re: Oh, please (Score:2)
Now there's one more.
That is the problem.
Re: (Score:2)
Obviously, but the roots of this problem have nothing to do with facial recognition technology.
Solving this problem at its core reduces to eliminating enough of the human race that everyone personally knows everyone else alive, and therefore this kind of mistaken identity cannot occur.
Lineup? Is this a joke?? (Score:2)
Everyone knows that human memory isn't reliable!
The first memory is already a vague matching to patterns that were most dominant in the person's memory at that moment, based on what had happened in the short time before. And every time it is remembered again, it is re-modulated by the ones dominant at *that* moment. So after a while, little is left.
Intentionally limiting the input at that time, with a lineup, mostly narrows that to those people in that line then. Even unintentionally.
That's very close to deliberate c