Wisconsin's Prison-Sentencing Algorithm Challenged in Court (engadget.com) 228
"Do you want a computer to help decide a convict's fate?" asks Engadget, telling the story of a Wisconsin convict who "claims that the justice system relied too heavily on its COMPAS algorithm to determine the likelihood of repeat offenses and sentenced him to six years in prison." Sentencing algorithms have apparently been in use for 10 years.
His attorneys claim that the code is "full of holes," including secret criteria and generic decisions that aren't as individually tailored as they have to be. For instance, they'll skew predictions based on your gender or age -- how does that reflect the actual offender...?
[T]he court challenge could force Wisconsin and other states to think about the weight they give to algorithms. While they do hold the promise of both preventing repeat offenses and avoiding excessive sentences for low-threat criminals, the American Civil Liberties Union is worried that they can amplify biases or make mistakes based on imperfect law enforcement data.
The biggest issue seems to be a lack of transparency, which makes it impossible to determine whether convicts actually are receiving fair sentences.
Yes please (Score:5, Insightful)
If you think a computer is biased and unfair and makes decisions using secret criteria, wait till you meet a human!
Re: (Score:2, Funny)
Can we close the comments section now, please? He said everything we need.
full of holes (Score:2)
His attorneys claim that the code is "full of holes," including secret criteria and generic decisions that aren't as individually tailored as they have to be.
This statement was kind of funny. Unless one of his attorneys is named Alan Cox and has a Merlin's beard (which is highly unlikely).
Re: (Score:2)
Re:Yes please (Score:5, Insightful)
There is no AI involved, just some code written by humans. A computer cannot express compassion. IMHO, compassion is the highest virtue, and the first to be discarded. Hope can be rekindled, faith restored, love reunited, but compassion once lost is rarely seen again. Even the guilty deserve compassion; it is that which elevates man from beast. We do that which is necessary, but it should NOT stop us from being sorry it has to come to that.
Re:Yes please (Score:5, Funny)
Having the computer show compassion is easy.
if (criminal.gender == female & criminal.attractive == true)
showCompassion = true;
Re:Yes please (Score:5, Funny)
Having the computer show compassion is easy.
if (criminal.gender == female & criminal.attractive == true)
showCompassion = true;
that looks like the algorithm they use in custody battles.
Re: (Score:3)
{
haveErection ;
sleep 1
}
Re: (Score:2)
So according to you every single court the world over behaves exactly the same..!
If you want to complain about something at least do it with a modicum of ability. That was just pathetic to read.
Re: (Score:2)
So according to you every single court the world over behaves exactly the same..!
If you want to complain about something at least do it with a modicum of ability. That was just pathetic to read.
dave420, maybe you should slow down with the "420", it seems to be impacting your thought process.
Re: (Score:2)
It's not really a gender issue though; it applies to race and nationality and accent and stature and all sorts of things.
Re: (Score:2)
Having the computer show compassion is easy.
if (criminal.gender == female & criminal.attractive == true)
showCompassion = true;
that looks like the algorithm they use in custody battles.
No, that one is simpler. if(female==TRUE) then female.custody=1;
It cost me $25k and four years to get full custody when it was blatantly apparent she was being abused.
Re: (Score:2)
AND
tits (big) = true
compassion = true
Re: (Score:2)
Are you claiming a syntax error on pseudocode? Or saying that pseudocode should always match the conventions of the prevailing real language?
Re: (Score:2)
Re:Yes please (Score:5, Funny)
My compiler says that criminal.attractive is undefined. Can you post the code for that property so that the system knows what attractive is.
Also, I think criminal.gender is deprecated. These days, I think you're supposed to use criminal.birthGender or criminal.identifiedGender.
Re:Yes please (Score:4, Informative)
You missed the "... || criminal.wealthAndPower == high" clause.
Also, you did a bitwise and instead of a logical one.
Re: (Score:2)
auto showCompassion = (criminal.gender == female & criminal.attractive);
Re: (Score:3)
if prisonOperator.hasPaidUs == true {
prisoner.sentence = sentenceMax }
Re:Yes please (Score:4, Insightful)
People are complaining because the computer is biased, and unfair, and uses secret criteria and it disagrees with human intuition.
But algorithmic bias is worse than human bias, because people know other humans are biased, but give a pass to computers.
Re: (Score:2)
But algorithmic bias is worse than human bias, because people know other humans are biased, but give a pass to computers.
Well, the latter is provably false, otherwise the complaints mentioned wouldn't exist in the first place.
Re:Yes please (Score:5, Insightful)
The advantage of a computer algorithm is that it can be analyzed, refined, improved, and relied upon to function more or less as well as it did in the past (assuming the new inputs aren't radically different from the previous ones.)
Now, judges - they come to their position through politics and lawyering, two selection criteria that would seem to get you the last people you would want to determine what is fair or reasonable for "the people at large." Even 200+ years ago this was recognized and is the basis for "trial by jury of peers" laws, and everyone knows that amounts to a random lottery style decision, but it's still better than leaving it up to a judge.
Re:Yes please (Score:4, Insightful)
The advantage of a computer algorithm is that it can be analyzed, refined, improved, and relied upon to function more or less as well as it did in the past (assuming the new inputs aren't radically different from the previous ones.)
Not when the entire process is kept secret. Which is the complaint here.
Re: (Score:2)
You seem to be confusing the verdict with the sentence.
Re:Yes please (Score:4, Insightful)
If the algorithm is secret, then it's worse than trusting a judge who can be removed from the bench any number of ways.
Judges are supposed to follow the law, which is public knowledge. If we're algorithmically systematizing sentencing, that algorithm needs to be public too - before the evidence comes out in the algorithm's output.
Re:Yes please (Score:4, Insightful)
Most of the algo's used are "privately developed", and thus not released because that would involve revealing "business critical information" (which obviously trumps the interests of the person being sentenced on the basis of it). Because the primary aim is to provide private entities with contracts/work/govt money.
Re: (Score:3)
At least the "secret criteria" complaint could be easily fixed by making it output its reasoning.
Re: (Score:2)
Re: (Score:3)
The "secret criteria" are probably along the lines of "criterion X has a statistical significance of 0.02456, and a weight of -0.12466 improves the predictions by 0.03154". The criteria are probably not secret so much as completely opaque to the human mind.
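For illustration, weights like that typically live inside a regression-style model. A toy sketch (all weights, feature names, and numbers here are invented; COMPAS's real model is proprietary):

```python
import math

# Invented weights -- purely illustrative, not COMPAS's actual model.
WEIGHTS = {"prior_arrests": 0.31, "age_at_first_offense": -0.12, "bias": -1.5}

def recidivism_risk(features):
    """Logistic-regression-style risk score in (0, 1)."""
    z = WEIGHTS["bias"]
    z += WEIGHTS["prior_arrests"] * features["prior_arrests"]
    z += WEIGHTS["age_at_first_offense"] * features["age_at_first_offense"]
    return 1.0 / (1.0 + math.exp(-z))

score = recidivism_risk({"prior_arrests": 4, "age_at_first_offense": 19})
```

Each weight is individually meaningless to a human reader, yet together they fully determine the score -- exactly the "opaque rather than secret" situation described above.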
Re: (Score:2)
"Bernie/Trump: Change that makes the establishment shit their pants."
It would probably make bernie and trump shit their pants too, considering both of these men occupy polar opposites of the ideological spectrum.
Re: (Score:2)
Yep.
All they have in common is not being Hillary Clinton, in the sense of not being THE choice the establishment made.
Re: (Score:2)
And if you've met a human, you know they're the ones that write these computer programs.
Re: (Score:2)
That's understood and can lead to a reconsideration. Shoving that into a computer doesn't improve it, but does tend to shut down any discussion of unfairness among lay people who don't understand that Garbage in == Garbage out.
It's the new version of the '70s corporate classic "Our *COMPUTER* says you still owe us $1.98!"
Re: (Score:2)
Well, obviously if the algorithm treats race as a sentencing factor then that too needs to be removed.
Nobody's suggested so far that it does take race into account. Good. Criminals need appropriate sentencing, irrespective of gender, race or age.
And this is different how? (Score:2)
And how is this different from inscrutable judges who decide sentencing? Case in point: Judge Aaron Persky and the Stanford rape case. Was there any openness or fairness in that sentence? I don't see Brock Turner complaining much.
Re: (Score:3, Insightful)
Actually, Brock Turner's dad absolutely did complain that 6 months was too severe of a sentence for his special snowflake rapist son.
Re: (Score:3)
He wasn't convicted of rape.
Actually, the prescribed sentence for rape in California is 3, 6, or 8 years, depending on the circumstances; all of these are less than the maximum sentence for the crimes for which Brock was convicted. He was convicted on three felony counts: assault with intent to commit rape of an intoxicated or unconscious person, sexual penetration of an intoxicated person, and sexual penetration of an unconscious person. Maximum prison sentence of 10 years.
Hard to imagine that a computer would have come up with a sen
Justice is blind and buggy (Score:3)
Re:Justice is blind and buggy (Score:5, Interesting)
If the software is open source, transparent, and clear in how it functions, I'd rather have sentencing software than a judge in most cases.
The problem comes when you get politicians meddling in the software, writing in "zero tolerance" code and "mandatory minimum sentences" -- they've done this already in the legislatures, tying judges' hands in sentencing decisions. With the data available -- prior convictions, credit scores, family ties, etc. -- software can determine probable outcomes of lenient sentences.
Yes, I'm saying credit score is a point for consideration in sentencing - not decision of guilt or innocence, but once guilt has been established, credit score tells about a person's history of making good on commitments, and should be a strong predictor of their likelihood of meeting terms of a suspended sentence, probation or parole.
Re: (Score:2)
Wish I had mod points. Great point, that, and I wish I'd thought of it....
Supporting data? (Score:2)
Yes, I'm saying credit score is a point for consideration in sentencing - not decision of guilt or innocence, but once guilt has been established, credit score tells about a person's history of making good on commitments, and should be a strong predictor of their likelihood of meeting terms of a suspended sentence, probation or parole.
Wish I had mod points. Great point, that, and I wish I'd thought of it....
Why? Do you have scientific evidence that there is any meaningful strong correlation between credit scores and recidivism? I'm guessing you are just assuming there is a link without actually having data to back it up. How do you plan to account for people who have terrible credit scores for reasons beyond their control (age, poor health, bad luck, etc)?
Favoring the rich and priviliged (Score:2)
Yes, I'm saying credit score is a point for consideration in sentencing - not decision of guilt or innocence, but once guilt has been established, credit score tells about a person's history of making good on commitments, and should be a strong predictor of their likelihood of meeting terms of a suspended sentence, probation or parole.
And pray tell what good is that for the double-digit percentage of the population who have bad credit scores for reasons beyond their control? People who are young, very old, poor, disabled, unhealthy or have just had a bad run of luck can have a terrible credit score, and it has no correlation with anything that should be a consideration in sentencing. Not everybody is a 45-year-old white male with a college degree and a steady job and 2.5 kids. Credit scores are complicated and there are many re
Re: (Score:2)
As others have said, the humanitarian side of the system errs on the side of forgiveness. If someone's credit is tanked, then don't consider their credit rating as a positive indicator of their compatibility with society's rules and expectations; look for other positive signs. This isn't saying "bad credit score: throw them under the jail and let them rot", this is saying "good credit score: indication that this person makes good on their promises; on a first offense give them benefit of the doubt and simple probation"
Punishes the disadvantaged (Score:2)
As others have said, the humanitarian side of the system errs on the side of forgiveness. If someone's credit is tanked, then don't consider their credit rating as a positive indicator of their compatibility with society's rules and expectations; look for other positive signs. This isn't saying "bad credit score: throw them under the jail and let them rot", this is saying "good credit score: indication that this person makes good on their promises; on a first offense give them benefit of the doubt and simple probation"
Unless you can show me some causal relationship between credit score and recidivism, it should not be a consideration unless the case is about something related to personal finance. Even using it only as a positive has a skewing effect. Minorities, the poor, the elderly, and the young are all more likely to have poor credit ratings. So you are not helping the most disadvantaged groups, but you are helping white, middle/upper-class privileged people based on "evidence" that is unrelated to the case. It's no different than saying "he looks like a nice young man so we shouldn't punish him too hard". It makes zero sense.
Re: (Score:3)
As others have said, the humanitarian side of the system errs on the side of forgiveness. If someone's credit is tanked, then don't consider their credit rating as a positive indicator of their compatibility with society's rules and expectations; look for other positive signs. This isn't saying "bad credit score: throw them under the jail and let them rot", this is saying "good credit score: indication that this person makes good on their promises; on a first offense give them benefit of the doubt and simple probation"
Unless you can show me some causal relationship between credit score and recidivism it should not be a consideration unless the case is about something related to personal finance. Even using it only as a positive has a skewing effect. Minorities, poor, elderly, young all are more likely to have poor credit ratings. So you are not helping the most disadvantaged groups but you are helping white, middle/upper class privileged people based on "evidence" that is unrelated to the case. It's no different than saying "he looks like a nice young man so we shouldn't punish him too hard". It makes zero sense.
Y'all ever been to Texas? "he looks like a nice young man so we shouldn't punish him too hard" gets taken to the extreme of letting rich kids off lightly for manslaughter, and executing innocents. You can say "it makes zero sense," to which the Texas judge (and most of the community) will say "that's how we do things here, son."
Algorithmic sentencing is one way to move toward a more consistent system, less subject to the vagaries of individual jurisdictions. Judgements still have to be made based on something, and credit scores (and, more specifically, the underlying data from which they are computed) are one of the strongest windows into personality and prediction of future behavior we have in today's society.
Algorithmic sentencing is a terrible idea (Score:4, Insightful)
Y'all ever been to Texas?
Not only have I been there, I have family that lives there. Outside of Austin, they should require a passport to visit most of the state. I make no apologies when I say many Texans have some seriously messed up ideas about what constitutes "justice".
Algorithmic sentencing is one way to move toward a more consistent system, less subject to the vagaries of individual jurisdictions.
Consistency isn't necessarily as valuable as you might be implying. The entire point of having a judge is, you know, to judge things and come to a reasoned opinion about how the law should apply to a particular case. I could see an algorithm being useful as an aid to advise a judge on possible options but there are WAY too many corner cases for it to be a good idea to put it front and center. Are judges imperfect? Yep! That's why we have appeals systems. But you literally cannot come up with an algorithm that will properly address all the corner cases. You are merely turning the programmer and lawmakers into the de-facto judge which is a terrible idea. See three strikes laws [wikipedia.org] if you need an example of how stupid algorithmic sentencing can be.
Judgements still have to be made based on something, and credit scores (and, more specifically, the underlying data from which they are computed) are one of the strongest windows into personality and prediction of future behavior we have in today's society.
I reject your framing of the issue. You are presuming that credit scores have any meaningful relationship to criminality without presenting any actual evidence that such an assertion is backed by facts. You are extrapolating purchasing and financial management behavior to have some relationship to criminality without any basis. Even a correlation isn't adequate because there are all sorts of ridiculous correlations [hbr.org] between completely unrelated things. You have to PROVE a causal relationship between an individual person's credit score and their likelihood to commit future crimes for your argument to have any basis at all. Good luck with that.
Re: (Score:2)
Yes, I'm saying credit score is a point for consideration in sentencing - not decision of guilt or innocence, but once guilt has been established, credit score tells about a person's history of making good on commitments, and should be a strong predictor of their likelihood of meeting terms of a suspended sentence, probation or parole.
Reminds me of a person I heard about that moved to the US with their partner, had secured a job and income, but mortgage? Bzzzt, you got no credit score. Took up all the credit card loans they could find, paid them down a month later, perfect credit score. Doh. I agree that a bad credit score should count against you; no credit score should count as "too smart to pay >10% interest", not "never had any responsibility for paying on time"...
Re: (Score:2)
I can totally relate to this: fresh out of school with a Master's degree, couldn't get a home mortgage to save my life - totally had demonstrated ability to pay all the costs over the previous 12-18 months, but "insufficient history to secure a 30 year mortgage." I eventually did find a willing lender with good terms, and after a few decades of on-time payments, now they're trying to shove credit in every conceivable body orifice - phone contracts in the ears, store cards up the nose, etc.
Re: (Score:2)
Clearly, credit score shouldn't be the sole, or even a heavy, factor in sentencing decisions - but it, or something like it, is one for human judges.
If I were writing the algorithm, I'd also look into things like: how hard has the person "actively worked" on their credit score - tell-tales like taking out a large number of trivial loans, active disputes in the system, etc. There's a baseline for the "average citizen," and there's cases where people's activities are 3SD outside the mean. One might say that
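The "3SD outside the mean" tell-tale described above could be sketched like this (the population counts are invented for illustration):

```python
import statistics

# Hypothetical data: number of trivial loans opened per year by typical
# citizens. The flag fires when someone sits 3+ standard deviations out.
population = [1, 2, 0, 3, 1, 2, 1, 0, 2, 1]
mean = statistics.mean(population)
sd = statistics.pstdev(population)

def gaming_the_score(n_trivial_loans):
    """Flag loan activity far outside the population baseline."""
    return abs(n_trivial_loans - mean) > 3 * sd
```

Someone opening a dozen trivial loans a year would trip the flag; one or two would not. This is only a sketch of the idea in the comment above, not any real bureau's method.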
Re: (Score:3)
No, what we should really be doing in the US is holding you fuckers at the CRAs accountable for your systemic pattern of libel. CRAs should be obligated to vet any information submitted to them for accuracy before passing it along to others or using it to calculate a credit score.
Re: (Score:3)
That's called a credit dispute - I've had them, I'd be suspicious of anyone over the age of 35 who hasn't had at least one by that time.
If you've had one run-in with one crappy landlord, that should exonerate any impact that single case had on your credit rating (I managed to get through my crappy landlord experience without a court case, but many good tenants before me did not, and the algorithm could actually look into that and forgive them their head-butting since the pattern is clear that the other part
Re: (Score:2)
"I'd be suspicious of anyone over the age of 35 who hasn't had at least one by that time."
And this is why, dear USA, your country is fucked.
Re: (Score:2)
"I'd be suspicious of anyone over the age of 35 who hasn't had at least one by that time."
And this is why, dear USA, your country is fucked.
My first dispute was with AT&T - back before their terms and conditions included "we can change the rules whenever we want by posting them on our website, your payment of your bill indicates your agreement with our current terms." They were the most f-ing arrogant company, ever. My personal dispute was because they tripled their billing rates, and only notified me by sending a bill with the newly tripled rates. It was worth the hassle I got for non-payment of the $35 just to be able to tell the story
Re: (Score:2)
Re: Justice is blind and buggy (Score:2)
Re: (Score:2)
Re: (Score:3)
My first black stain came from student loans that got erroneously put into repayment after my freshman year, showed multiple 120+ days past due on the report, nobody informed me until after graduate school almost 6 years later. These things get explained and the final decision takes those explanations into account.
So, if the algorithm says: "650, maximum sentence." that's a bad algorithm, use a judge instead. If the algorithm can take into account circumstances, disputes, resolutions, etc. then it might b
Re: (Score:3)
This is so misguided on so many levels, I don't even know where to start.
Re: (Score:2)
A reminder that the penal system serves several purposes and has several goals: punishing people who break the law; reforming law-breakers into upstanding citizens; protecting upstanding citizens from dangerous elements; threatening upstanding citizens with dire consequences if they break the law. For way too long, prisons were all about punishment, which doesn't serve much purpose, especially if the general populace doesn't know of it. Reforming criminals is much better. But you can't forget its other goals.
Re: (Score:2)
That is a false dichotomy. If you honestly think that is the only alternative, you probably shouldn't be commenting on this subject until you fix that...
Re: (Score:2)
Here's an idea: why don't you give him a few examples of the alternatives to (keeping the repeat offender locked up) or (setting the repeat offender free to offend again)? If you aren't willing to do that, maybe you shouldn't be commenting on this subject either.
it's the mindset (Score:2)
We use algorithms if we believe we can generalize and use statistics and averages -- thinking that people are more or less robots, obeying instincts and the influence of environment and society, without free will.
On the other hand, if we believe that each individual is different and has free will, there cannot be anything but superficial use for generalizations and averages, and certainly no sentencing algorithms.
Of course, judges and the like, who sentence without algorithms, can make mistakes, but that is the price of
Re: (Score:2)
Algorithms are much harder to bribe, with money, political favors, etc.
Gender? (Score:2, Interesting)
Wait, it's iCurrentYear; why would we be giving any weight to gender in sentencing? Men and women have equal rights, yes? So why should, say, someone who raped a teenager over 50 times get off scot-free just because of their gender? [mirror.co.uk]
Re: (Score:2)
You're right. Most rapists don't get any jail time at all.
Re: (Score:2)
Quite correct, since most rapes are male on male prison rapes
That's not quite true. While it does skew the distribution, there's no evidence it changes the balance. (Admittedly, though, it's not being given the same weight in the media as the things happening outside prisons, but whether that's because of misandry or because of the good ole' American view of "who cares about prisoners" is anyone's guess.)
Algorithms for sentencing ... (Score:3)
... and I thought Facebook was fucked up.
1980 (Score:2)
Hmmm.. I reckon the algos started around 1980.
https://commons.wikimedia.org/... [wikimedia.org]
Re: (Score:2)
I'm thinking more like 20 minutes into the future... [youtube.com]
The real issue is lack of transparency (Score:5, Insightful)
While the idea of using an algorithm to sentence a human being is bone-chilling, you might be able to justify this as a "formula" for sentencing -- which, of course, merits its own debate.
What is unconscionable about this is the fact that it's a SECRET algorithm. As in closed source. Essentially a secret law.
This has no place in democracy.
(Also, any algorithm which ingests statistical and demographic data is bound to come up with unpalatable and/or spurious demographic correlations [engadget.com] (since there is a causal link between poverty and crime and a historic link between race and poverty) which I would rather have society refrain from codifying -- in law or in actual computer code).
Re: (Score:3)
(Also, any algorithm which ingests statistical and demographic data is bound to come up with unpalatable and/or spurious demographic correlations [engadget.com] (since there is a causal link between poverty and crime and a historic link between race and poverty) which I would rather have society refrain from codifying -- in law or in actual computer code).
Heh. That actually reminded me of Minsky's article on "Why programming is a good medium for expressing poorly understood and sloppily formulated ideas". Why, it even speaks of "A program as a court of law"! But I would like to see some experimental computerized justice -- not in the sense of this thing that is being criticized, rather as a research vehicle -- simply to see if it uncovers some of the "poorly understood" and "sloppily formulated" parts of our justice systems. Perhaps it could enable us to right
Re: (Score:2)
The most glaring example of a contradiction I can think of is the original ACA ruling. The first part of the ruling was to determine if the thing you had to pay for not having health insurance was a fine or a tax to determine if the plaintiffs had standing. The court found that it was in fact not a tax so therefore the plaintiffs had standing. The second part of the ruling was the part about expanded medic
Re: (Score:2)
Also, any algorithm which ingests statistical and demographic data is bound to come up with unpalatable and/or spurious demographic correlations [engadget.com] (since there is a causal link between poverty and crime and a historic link between race and poverty) which I would rather have society refrain from codifying -- in law or in actual computer code).
As opposed to having the same bias encoded, consciously or unconsciously, in the minds of judges?
Doing this algorithmically at least raises the possibility of analysis and criticism (assuming it's transparent -- no argument there, transparency is critical), which means that if done with appropriate oversight and scrutiny, it may have significant advantages over human judgement. I would still want the human judge to be able to overrule the algorithm, though. And it would probably be good to add a review pr
Statistics (Score:5, Insightful)
It doesn't and it's not intended to.
It's statistics and probabilities, just like how insurance premiums work.
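A minimal sketch of that analogy, with invented numbers: an actuarial premium is just the group's expected payout plus a loading, so two people who differ only in their group's claim statistics get different prices.

```python
# Toy actuarial pricing: premium = expected payout plus a 15% loading.
# All probabilities and amounts are invented for illustration.
def premium(p_claim, payout, loading=0.15):
    return p_claim * payout * (1 + loading)

young_driver = premium(p_claim=0.12, payout=10_000)  # statistically riskier group
older_driver = premium(p_claim=0.05, payout=10_000)
```

The individual never enters the formula; only group statistics do -- which is precisely the objection raised against applying the same logic to sentencing.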
Re:Statistics (Score:5, Insightful)
like how insurance premiums work.
That *is* skewing predictions based on gender and age.
A computer program can deduce from the fact that you are male that you will, statistically, live a shorter time; this means women receive lower annuity payments because they are expected to collect them for longer.
As this discriminated against women, the practice was swiftly forbidden by the EU. It probably wouldn't have been forbidden if it had discriminated against men, but that's another issue. It was actual gender-based discrimination.
If you let a computer program take factors like skin color or religion into account, that computer program will arrive at conclusions similar to the ones humans arrived at:
* black people commit more crimes per thousand people than white people
* highly educated people commit fewer crimes than less educated people
* more Muslims commit terrorist attacks than non-Muslims
This doesn't mean that black people are ape-like, or that less educated people are automatically criminals. It is usually just a correlation, which doesn't imply causation. If you took a black person and removed them from all the racist and discriminatory influences, AND from the lower social status (which has itself been caused by lots of prior discrimination), then you won't end up with a higher crime likelihood than if you took a "normal" white person. It is probably caused more by the social status than by the discrimination.
As for Muslims, it's simply that there is no powerful Christian group radicalizing Christians to commit terror attacks. If you take the radicals out, the average Muslim is as peace-loving as the average Christian or Jew.
Denying the existence of these correlations to protect against discrimination is wrong. That is just fuel for people who claim their existence is denied because people want to "cover up" something, or because people are too dumb -- the usual criticism from people like Trump. It is also wrong, though, to practice actual discrimination, as Israel does or as Trump proposes.
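The correlation-without-causation point can be demonstrated with a toy simulation (all numbers invented): in the model below, group membership has no direct effect on offending, yet the group pushed toward low status still shows a higher offense rate.

```python
import random

random.seed(0)

def simulate(n=100_000):
    # Group has NO direct effect on offending; only status does.
    counts = {"A": [0, 0], "B": [0, 0]}   # [offenses, members]
    for _ in range(n):
        group = random.choice("AB")
        # Outside forces push group B toward low status.
        low_status = random.random() < (0.6 if group == "B" else 0.2)
        p_offense = 0.10 if low_status else 0.02  # status is the only cause
        counts[group][0] += random.random() < p_offense
        counts[group][1] += 1
    return {g: offenses / members for g, (offenses, members) in counts.items()}

rates = simulate()
```

A model trained on this data would happily "learn" that group B is riskier, even though the simulation gives group membership no causal role at all.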
Re: (Score:2)
What's that got to do with the likelihood of wrecking a car?
Re: (Score:2)
What's that got to do with the likelihood of wrecking a car?
The EU ruling was about life insurance: http://www.bbc.com/news/busine... [bbc.com]
The example I brought up has nothing to do with car insurance, right, although there is lots of age-based discrimination you can do with car insurance.
Young people are likely to cause accidents due to missing experience and lots of self-esteem. Elderly drivers might cause accidents because they are senile or have limited senses (eyesight, etc.).
Re: (Score:2)
And you are not allowed to take gender into account either when deciding on car insurance premiums in the E.U. either.
Re: (Score:2)
The EU ruling was about life insurance: http://www.bbc.com/news/busine... [bbc.com]
And yet, it does mention "notably car insurance". Likewise, contrary to your (apparent) claim that this ruling is biased in favor of women, the article mentions that the premiums for car insurance and life insurance for women are likely to rise significantly as a result of this decision.
Re: (Score:2)
If a computer program deduces from the fact that you are male that you will live a shorter time, and this makes women receiving lower payments due to them being more likely to live longer.
What's that got to do with the likelihood of wrecking a car?
I guess that all depends on if you survive the crash. It's one of many reasons that men live statistically shorter lives.
Re: (Score:2)
As this discriminated against women, this practice was swiftly forbidden by the EU. It probably wouldn't have been forbidden if it discriminated against men, but that's another issue. It was actual gender based discrimination.
Actually, insurance companies are not allowed to charge men more for motor insurance because it is discriminatory. The EU applies the rules equally.
* More muslims commit terrorist attacks than non-muslims
The most recent stats I could find from the FBI [fbi.gov] say that Jews committed more terrorist attacks in the US in the period 2002-2005. I read something similar about Europe that I can't be bothered to find now, because this one has been debunked quite thoroughly.
Denying the existence of these correlations to protect from discrimination is wrong.
People aren't trying to deny the existence of the ones you listed that are correct; in fact they are acknowle
Re: (Score:2)
Actually, insurance companies are not allowed to charge men more for motor insurance because it is discriminatory.
An argument could be made that this is discriminatory against women, requiring them to pay higher premiums than are justified by actuarial risk, subsidizing male drivers -- who already make more money than they do.
Re: (Score:2)
Interesting argument. I think it would likely fail though, as the courts tend to treat issues of equality as requiring a very good reason for any difference, in pursuit of some "legitimate" aim. By "legitimate" they mean furthering greater equality or some other human-rights-related issue, or at least not disadvantaging one group.
There is some commentary on the ruling here: http://www.slaughterandmay.com... [slaughterandmay.com]
Basically they couldn't argue that offering one sex lower premiums had
Re: (Score:2)
As for Muslims, it's simply because there is no powerful Christian group that radicalizes Christians to commit terror attacks. If you take the radicals out, the average Muslim is as peace-loving as the average Christian or Jew.
If you took away the radicals, there'd be no more Muslims strapping themselves to a bomb to become martyrs, but there are an awful lot of Muslims who support the stoning of gays, adulterers, believers who leave the faith, people who insult the prophet and so on as a matter of law, or who support the people who organize and commit terrorism. The last time I did the numbers there were more than 100 million Muslims who supported gross violations of human rights, and a helluva lot more who I'd call Westboro Bapt
Re: (Score:3)
If convicts could get out of prison as soon as an insurance company will insure them against repeat offenses, then all sentences would be the same: they would be the grown-up version of "sit in the corner until you've learned your lesson."
Can you think of a better incentive to rehabilitate than to know that you'll get out as soon as you're cured? Is there any good reason to keep people locked up longer than that? Is it wise to release people before they've rehabilitated?
Link to actual story (Score:2)
The link supplied goes to a page with barely more text than a slashdot summary. Skip the middleman and go to the actual source [nytimes.com].
what I found in their source code: (Score:2)
Re: (Score:2)
So a poor homeless latino gets "(sentence + 5 + 10) * 2", and a poor homeless black man gets "(sentence + 5 + 10) * 3"
Sounds about right, unfortunately...
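The parent's satirical formula can be written out as a tiny sketch. To be clear, nothing here comes from COMPAS; the function name, the demographic flags, and the weights are all hypothetical, and the point is precisely how objectionable opaque demographic multipliers would be in a sentencing score:

```python
# Hypothetical sketch of the satirical formula in the comment above.
# None of these names or weights come from COMPAS; they are made up
# to illustrate the joke about demographic multipliers.

def satirical_sentence(base_years, poor=False, homeless=False, multiplier=1):
    years = base_years
    if poor:
        years += 5
    if homeless:
        years += 10
    return years * multiplier

# The comment's two cases, starting from a hypothetical 2-year base:
print(satirical_sentence(2, poor=True, homeless=True, multiplier=2))  # (2+5+10)*2 = 34
print(satirical_sentence(2, poor=True, homeless=True, multiplier=3))  # (2+5+10)*3 = 51
```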
Re: (Score:2)
Insurance (Score:2)
Sounds like the same algorithms that are used to calculate your insurance premium...
Original article I read on the subject (Score:4, Informative)
http://www.jsonline.com/news/c... [jsonline.com]
They rely on cookies to limit the number of articles that you read. Unlike similar sites, they don't block you if you block cookies.
Demonstrably unfair (Score:2)
The problem is the algorithms were optimized for a specific set of people, and guess what, they weren't the most common offenders.
That is, the algorithms are supposed to predict what you will do after you get out - go legit or commit more crimes.
They are very accurate for older, white offenders but very inaccurate for younger, black offenders. Effectively, the algorithms were written in an attempt to be race-blind by ignoring race. But certain activities, like the number of previous interactions with the
Hmmm... (Score:2)
"claims that the justice system relied too heavily on its COMPAS algorithm to determine the likelihood of repeat offenses and sentenced him to six years in prison."
So a defendant's past isn't supposed to be used in trial, but some unknown factor about what he MIGHT do in the future is allowed in sentencing? I'm all for executing criminals, but this sort of nonsense is just plain dangerous.
Re:The 'real' software (Score:5, Insightful)
Why don't they use the fundamentals used in developing the algorithm to identify and eliminate the root causes of crime (through rehabilitation/counseling) and thus reduce crime in highly prone areas?
Honestly, they should; there are pushes to do so, it's just that funding is an issue, as always. The funny thing is, it's supposed to actually save money!
Okay, up here in Alaska low oil prices have resulted in a government crisis - the government's income has dropped drastically. One of the reforms being put in place, finally, is 'community corrections', which has been shown to save money by actually preventing repeat criminal occurrences. As they mentioned, locking people up for long periods actually increases the chance they'll reoffend, especially if you don't provide support after they get out.
So rather than locking somebody up for 12 years, you lock them up for, say, 4. You take the money for the next 4 years of prison and put it into rehabilitating the criminal, which is enough to cover extended services in prison, as well as at least 4 years outside, because, surprise, it's cheaper than keeping them in prison. The last 4 years of prison sentence avoided is pure savings, though they mentioned that they're putting half the money into what you mentioned - addressing the core situations causing criminality in the first place.
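The arithmetic behind that tradeoff is simple enough to sketch. The per-year figures below are made-up placeholders, not Alaska numbers; the point is just that swapping prison years for (equally priced) support years still leaves the avoided years as pure savings:

```python
# Illustrative arithmetic for the community-corrections tradeoff above.
# Both per-year costs are hypothetical placeholders, not real figures.
COST_PER_PRISON_YEAR = 50_000    # hypothetical cost of one year inside
COST_PER_SUPPORT_YEAR = 50_000   # hypothetical cost of one year of services

old_plan = 12 * COST_PER_PRISON_YEAR                              # 12 years inside
new_plan = 4 * COST_PER_PRISON_YEAR + 4 * COST_PER_SUPPORT_YEAR   # 4 inside, 4 supported
savings = old_plan - new_plan                                     # the 4 avoided years

print(savings)  # 200000
```

Even if the support years cost as much as prison years, the four years of sentence never served are money never spent.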
As for the op -
His attorneys claim that the code is "full of holes," including secret criteria and generic decisions that aren't as individually tailored as they have to be. For instance, they'll skew predictions based on your gender or age -- how does that reflect the actual offender...?
Secret criteria is a problem, but gender and age have clear differences on how likely you are to re-offend. If you're a first time offender at 40 odds are something very strange happened, and if you fix that problem the person is unlikely to offend again, while being a first time offender at 15 is a bad sign that it might become a habit.
Here is the problem (Score:3)
An algorithm should NEVER give a sentence length based on predicted recidivism rate; it should be based solely on the crime itself and past recidivism. Giving a longer sentence based on other criteria like probability of recidivism i
Re: (Score:2)
eliminate the root causes of crime
They are. They've correlated it to gun ownership in general. If you have a problem with that, then you can see why that might not work.
Re: (Score:3)
Actual repeat offender? Sure take that into account.
Projected repeat offender on grounds of 'young man' ? Sorry but that's ageist and sexist, and very much grounds for challenge as basing sentencing on that runs very counter to provision of equal justice.
Re: (Score:2)
I don't support pre-emptive punishment, so I can't support sentencing based on probabilities.
This is especially true where the justice system is inherently sexist, itself causing at least some of the gender disparity in offending and re-offending rates.
Use the probabilities to guide and direct resources aimed at prevention, not to punish in advance. A 70% chance of recidivism means 30% of people would be unfairly punished. I can't support that.
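The 70%/30% point above is just expectation arithmetic, and a one-liner makes it concrete. The function name and the group size of 1000 are illustrative assumptions, not anything from the article:

```python
# Back-of-the-envelope arithmetic for the comment above: if each of N
# people gets extra time because a model says they have a 70% chance of
# reoffending, the expected number who would never have reoffended at
# all is N * (1 - 0.70), i.e. 30% of them.

def expected_unfairly_punished(n_people, recidivism_prob):
    return n_people * (1 - recidivism_prob)

print(round(expected_unfairly_punished(1000, 0.70)))  # 300
```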
Re: (Score:2)
Humans are only statistics in aggregate. An individual deserves equal rights.
Re: (Score:2)
I think it's because the justice and prison systems are heavily involved in misusing or misunderstanding technology.
Slashdot readers are interested in technology generally, and also on balance tend towards preferring actual justice (rather than 'social' justice) so want to know about abuses of the system.