The Courts | Crime | Software | United States

Wisconsin's Prison-Sentencing Algorithm Challenged in Court (engadget.com) 228

"Do you want a computer to help decide a convict's fate?" asks Engadget, telling the story of a Wisconsin convict who "claims that the justice system relied too heavily on its COMPAS algorithm to determine the likelihood of repeat offenses and sentenced him to six years in prison." Sentencing algorithms have apparently been in use for 10 years. His attorneys claim that the code is "full of holes," including secret criteria and generic decisions that aren't as individually tailored as they have to be. For instance, they'll skew predictions based on your gender or age -- how does that reflect the actual offender...?

[T]he court challenge could force Wisconsin and other states to think about the weight they give to algorithms. While they do hold the promise of both preventing repeat offenses and avoiding excessive sentences for low-threat criminals, the American Civil Liberties Union is worried that they can amplify biases or make mistakes based on imperfect law enforcement data.

The biggest issue seems to be a lack of transparency, which makes it impossible to determine whether convicts actually are receiving fair sentences.
This discussion has been archived. No new comments can be posted.

  • Yes please (Score:5, Insightful)

    by penguinoid ( 724646 ) on Sunday June 26, 2016 @11:39PM (#52396383) Homepage Journal

    If you think a computer is biased and unfair and makes decisions using secret criteria, wait till you meet a human!

    • Re: (Score:2, Funny)

      by Anonymous Coward

      Can we close the comments section now, please? He said everything we need.

      • His attorneys claim that the code is "full of holes," including secret criteria and generic decisions that aren't as individually tailored as they have to be.

        This statement was kinda funny. Unless one of his attorneys is named Alan Cox and has a Merlin's beard (which is highly unlikely).

    • Your comment offends me and I shall be asking the jury for... one million dollars! As I'm from the EU and we need dat shit right nao!
    • Re:Yes please (Score:5, Insightful)

      by Archfeld ( 6757 ) <treboreel@live.com> on Sunday June 26, 2016 @11:49PM (#52396425) Journal

      There is no AI involved, just some code written by humans. A computer cannot express compassion. IMHO, compassion is the highest virtue, and the first to be discarded. Hope can be rekindled, faith restored, love reunited, but compassion once lost is rarely seen again. Even the guilty deserve compassion; it is that which elevates man from beast. We do that which is necessary, but it should NOT stop us from being sorry it has to come to that.

      • by penguinoid ( 724646 ) on Monday June 27, 2016 @12:20AM (#52396553) Homepage Journal

        Having the computer show compassion is easy.
        if (criminal.gender == female & criminal.attractive == true)
              showCompassion = true;

        • by lucm ( 889690 ) on Monday June 27, 2016 @12:47AM (#52396619)

          Having the computer show compassion is easy.
          if (criminal.gender == female & criminal.attractive == true)

                showCompassion = true;

          that looks like the algorithm they use in custody battles.

          • showCompassion( )
            {

            haveErection ;
            sleep 1 ;

            }
          • by dave420 ( 699308 )

            So according to you, every single court the world over behaves exactly the same?!

            If you want to complain about something, at least do it with a modicum of ability. That was just pathetic to read.

            • by lucm ( 889690 )

              So according to you, every single court the world over behaves exactly the same?!

              If you want to complain about something, at least do it with a modicum of ability. That was just pathetic to read.

              dave420, maybe you should slow down with the "420", it seems to be impacting your thought process.

          • by AmiMoJo ( 196126 )

            It's not really a gender issue though; it applies to race and nationality and accent and stature and all sorts of things.

          • Having the computer show compassion is easy.
            if (criminal.gender == female & criminal.attractive == true)


                  showCompassion = true;

            that looks like the algorithm they use in custody battles.

            No, that one is simpler. if(female==TRUE) then female.custody=1;

            It cost me $25k and four years to get full custody when it was blatantly apparent she was being abused.

        • if attractive = false
          AND
          tits (big) = true
          compassion = true
        • by cdrudge ( 68377 ) on Monday June 27, 2016 @08:06AM (#52397853) Homepage

          My compiler says that criminal.attractive is undefined. Can you post the code for that property so that the system knows what attractive is?

          Also, I think criminal.gender is deprecated. These days, I think you're supposed to use criminal.birthGender or criminal.identifiedGender.

        • Re:Yes please (Score:4, Informative)

          by mrchaotica ( 681592 ) * on Monday June 27, 2016 @08:11AM (#52397867)

          You missed the "... || criminal.wealthAndPower == high" clause.

          Also, you did a bitwise and instead of a logical one.

        • by e r ( 2847683 )
          Shorter and more legible:
          auto showCompassion = (criminal.gender == female & criminal.attractive);
      • if prisonOperator.hasPaidUs == true {
                    prisoner.sentence = sentenceMax }

    • Re:Yes please (Score:4, Insightful)

      by Actually, I do RTFA ( 1058596 ) on Sunday June 26, 2016 @11:53PM (#52396451)

      People are complaining because the computer is biased and unfair, uses secret criteria, and disagrees with human intuition.

      But algorithmic bias is worse than human bias, because people know other humans are biased, but give a pass to computers.

      • But algorithmic bias is worse than human bias, because people know other humans are biased, but give a pass to computers.

        Well, the latter is provably false, otherwise the complaints mentioned wouldn't exist in the first place.

    • Re:Yes please (Score:5, Insightful)

      by JoeMerchant ( 803320 ) on Monday June 27, 2016 @12:11AM (#52396517)

      The advantage of a computer algorithm is that it can be analyzed, refined, improved, and relied upon to function more or less as well as it did in the past (assuming the new inputs aren't radically different from the previous ones.)

      Now, judges - they come to their position through politics and lawyering, two selection criteria that would seem to get you the last people you would want to determine what is fair or reasonable for "the people at large." Even 200+ years ago this was recognized and is the basis for "trial by jury of peers" laws, and everyone knows that amounts to a random lottery style decision, but it's still better than leaving it up to a judge.

      • Re:Yes please (Score:4, Insightful)

        by taustin ( 171655 ) on Monday June 27, 2016 @01:12AM (#52396679) Homepage Journal

        The advantage of a computer algorithm is that it can be analyzed, refined, improved, and relied upon to function more or less as well as it did in the past (assuming the new inputs aren't radically different from the previous ones.)

        Not when the entire process is kept secret. Which is the complaint here.

        • You seem to be confusing the verdict with the sentence.

        • Re:Yes please (Score:4, Insightful)

          by JoeMerchant ( 803320 ) on Monday June 27, 2016 @08:09AM (#52397861)

          If the algorithm is secret, then it's worse than trusting a judge who can be removed from the bench any number of ways.

          Judges are supposed to follow the law, which is public knowledge. If we're algorithmically systematizing sentencing, that algorithm needs to be public too - before the evidence comes out in the algorithm's output.

      • Re:Yes please (Score:4, Insightful)

        by Anonymous Coward on Monday June 27, 2016 @02:57AM (#52396927)

        Most of the algos used are "privately developed", and thus not released, because that would involve revealing "business critical information" (which obviously trumps the interests of the person being sentenced on the basis of it). Because the primary aim is to provide private entities with contracts/work/govt money.

    • by mwvdlee ( 775178 )

      At least the "secret criteria" could easily be fixed by making it output its reasoning.

      • Which is just a good habit for all decision-making systems built in the last forty years or so.
      • by GuB-42 ( 2483988 )

        The "secret criteria" are probably along the lines of "criterion X has a statistical significance of 0.02456 and a weight of -0.12466 improves the predictions by 0.03154". The criteria are probably not secret so much as completely opaque to the human mind.
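A score built from such fitted weights can still be made to "output its reasoning", as suggested above, even when the weights themselves resist interpretation. A minimal sketch (the feature names and weights are invented for illustration, not taken from COMPAS):

```python
# Hypothetical linear risk score: transparent in form, opaque in meaning.
# All feature names and weights below are invented for illustration.
WEIGHTS = {
    "prior_arrests": 0.31,
    "age_at_first_offense": -0.12466,
    "employment_gaps": 0.08,
}

def risk_score(offender):
    """Return the weighted sum plus a per-feature breakdown.

    Returning the breakdown is the 'show its reasoning' fix: every
    contribution is visible, even if the weights came out of an
    opaque statistical fitting process.
    """
    contributions = {f: WEIGHTS[f] * offender[f] for f in WEIGHTS}
    return sum(contributions.values()), contributions

score, why = risk_score(
    {"prior_arrests": 2, "age_at_first_offense": 19, "employment_gaps": 3}
)
```

Even with the full breakdown printed, a weight like -0.12466 on age tells a defendant nothing intelligible about why it is what it is, which is the point being made above.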

    • "Bernie/Trump: Change that makes the establishment shit their pants."

      It would probably make Bernie and Trump shit their pants too, considering the two men occupy polar opposites of the ideological spectrum.

      • by Z80a ( 971949 )

        Yep.
        All they have in common is not being Hillary Clinton, in the sense of not being THE choice the establishment made.

    • And if you've met a human, you know they're the ones that write these computer programs.

    • by sjames ( 1099 )

      That's understood and can lead to a reconsideration. Shoving that into a computer doesn't improve it, but does tend to shut down any discussion of unfairness among lay people who don't understand that Garbage in == Garbage out.

      It's the new version of the '70s corporate classic "Our *COMPUTER* says you still owe us $1.98!"

  • And how is this different from inscrutable judges who decide sentencing? Case in point: Judge Aaron Persky and the Stanford rape case. Was there any openness or fairness in that sentence? I don't see Brock Turner complaining much.

    • Re: (Score:3, Insightful)

      by PopeRatzo ( 965947 )

      don't see Brock Turner complaining much.

      Actually, Brock Turner's dad absolutely did complain that 6 months was too severe a sentence for his special snowflake rapist son.

  • by justcauseisjustthat ( 1150803 ) on Sunday June 26, 2016 @11:51PM (#52396437)
    In the US there are already way too many people in jail, letting buggy software sentence people is a joke.
    • by JoeMerchant ( 803320 ) on Monday June 27, 2016 @12:17AM (#52396537)

      If the software is open source, transparent, and clear in how it functions, I'd rather have sentencing software than a judge in most cases.

      The problem comes when you get politicians meddling in the software, writing in "zero tolerance" code and "mandatory minimum sentences" - they've done this already in the legislatures, tying judges hands in sentencing decisions. With the data available: prior convictions, credit scores, family ties, etc., software can determine probable outcomes of lenient sentences.

      Yes, I'm saying credit score is a point for consideration in sentencing - not decision of guilt or innocence, but once guilt has been established, credit score tells about a person's history of making good on commitments, and should be a strong predictor of their likelihood of meeting terms of a suspended sentence, probation or parole.

      • Yes, I'm saying credit score is a point for consideration in sentencing - not decision of guilt or innocence, but once guilt has been established, credit score tells about a person's history of making good on commitments, and should be a strong predictor of their likelihood of meeting terms of a suspended sentence, probation or parole.

        Wish I had mod points. Great point, that, and I wish I'd thought of it....

        • Yes, I'm saying credit score is a point for consideration in sentencing - not decision of guilt or innocence, but once guilt has been established, credit score tells about a person's history of making good on commitments, and should be a strong predictor of their likelihood of meeting terms of a suspended sentence, probation or parole.

          Wish I had mod points. Great point, that, and I wish I'd thought of it....

          Why? Do you have scientific evidence that there is any meaningful correlation between credit scores and recidivism? I'm guessing you are just assuming there is a link without actually having data to back it up. How do you plan to account for people who have terrible credit scores for reasons beyond their control (age, poor health, bad luck, etc)?

      • Yes, I'm saying credit score is a point for consideration in sentencing - not decision of guilt or innocence, but once guilt has been established, credit score tells about a person's history of making good on commitments, and should be a strong predictor of their likelihood of meeting terms of a suspended sentence, probation or parole.

        And pray tell what good is that for the double-digit percentage of the population who have bad credit scores for reasons beyond their control? People who are young, very old, poor, disabled, unhealthy or have just had a bad run of luck can have a terrible credit score, and it has no correlation with anything that should be a consideration in sentencing. Not everybody is a 45 year old white male with a college degree and a steady job and 2.5 kids. Credit scores are complicated and there are many re

        • As others have said, the humanitarian side of the system errs on the side of forgiveness. If someone's credit is tanked, then don't consider their credit rating as a positive indicator of their compatibility with society's rules and expectations; look for other positive signs. This isn't saying "bad credit score: throw them under the jail and let them rot", this is saying "good credit score: indication that this person makes good on their promises, on a first offense give them benefit of the doubt and simple probation"

          • As others have said, the humanitarian side of the system errs on the side of forgiveness. If someone's credit is tanked, then don't consider their credit rating as a positive indicator of their compatibility with society's rules and expectations; look for other positive signs. This isn't saying "bad credit score: throw them under the jail and let them rot", this is saying "good credit score: indication that this person makes good on their promises, on a first offense give them benefit of the doubt and simple probation"

            Unless you can show me some causal relationship between credit score and recidivism it should not be a consideration unless the case is about something related to personal finance. Even using it only as a positive has a skewing effect. Minorities, poor, elderly, young all are more likely to have poor credit ratings. So you are not helping the most disadvantaged groups but you are helping white, middle/upper class privileged people based on "evidence" that is unrelated to the case. It's no different than

            • As others have said, the humanitarian side of the system errs on the side of forgiveness. If someone's credit is tanked, then don't consider their credit rating as a positive indicator of their compatibility with society's rules and expectations; look for other positive signs. This isn't saying "bad credit score: throw them under the jail and let them rot", this is saying "good credit score: indication that this person makes good on their promises, on a first offense give them benefit of the doubt and simple probation"

              Unless you can show me some causal relationship between credit score and recidivism it should not be a consideration unless the case is about something related to personal finance. Even using it only as a positive has a skewing effect. Minorities, poor, elderly, young all are more likely to have poor credit ratings. So you are not helping the most disadvantaged groups but you are helping white, middle/upper class privileged people based on "evidence" that is unrelated to the case. It's no different than saying "he looks like a nice young man so we shouldn't punish him too hard". It makes zero sense.

              Y'all ever been to Texas? "he looks like a nice young man so we shouldn't punish him too hard" gets taken to the extreme of letting rich kids off lightly for manslaughter, and executing innocents. You can say "it makes zero sense," to which the Texas judge (and most of the community) will say "that's how we do things here, son."

              Algorithmic sentencing is one way to move toward a more consistent system, less subject to the vagaries of individual jurisdictions. Judgements still have to be made based on something, and credit scores (and, more specifically, the underlying data from which they are computed) are one of the strongest windows into personality and prediction of future behavior we have in today's society.

              • by sjbe ( 173966 ) on Monday June 27, 2016 @02:18PM (#52400359)

                Y'all ever been to Texas?

                Not only have I been there, I have family that lives there. Outside of Austin, they should require a passport to visit most of the state. I make no apologies when I say many Texans have some seriously messed up ideas about what constitutes "justice".

                Algorithmic sentencing is one way to move toward a more consistent system, less subject to the vagaries of individual jurisdictions.

                Consistency isn't necessarily as valuable as you might be implying. The entire point of having a judge is, you know, to judge things and come to a reasoned opinion about how the law should apply to a particular case. I could see an algorithm being useful as an aid to advise a judge on possible options but there are WAY too many corner cases for it to be a good idea to put it front and center. Are judges imperfect? Yep! That's why we have appeals systems. But you literally cannot come up with an algorithm that will properly address all the corner cases. You are merely turning the programmer and lawmakers into the de-facto judge which is a terrible idea. See three strikes laws [wikipedia.org] if you need an example of how stupid algorithmic sentencing can be.

                Judgements still have to be made based on something, and credit scores (and, more specifically, the underlying data from which they are computed) are one of the strongest windows into personality and prediction of future behavior we have in today's society.

                I reject your framing of the issue. You are presuming that credit scores have any meaningful relationship to criminality without presenting any actual evidence that such an assertion is backed by facts. You are extrapolating purchasing and financial management behavior to have some relationship to criminality without any basis. Even a correlation isn't adequate because there are all sorts of ridiculous correlations [hbr.org] between completely unrelated things. You have to PROVE a causal relationship between an individual person's credit score and their likelihood to commit future crimes for your argument to have any basis at all. Good luck with that.

      • by Kjella ( 173770 )

        Yes, I'm saying credit score is a point for consideration in sentencing - not decision of guilt or innocence, but once guilt has been established, credit score tells about a person's history of making good on commitments, and should be a strong predictor of their likelihood of meeting terms of a suspended sentence, probation or parole.

        Reminds me of a person I heard about who moved to the US with their partner, had secured a job and income, but a mortgage? Bzzzt, you got no credit score. Took out all the credit card loans they could find, paid them down a month later: perfect credit score. Doh. I agree that a bad credit score should count against you, but no credit score should count as "too smart to pay >10% interest", not "never had any responsibility for paying on time"...

        • I can totally relate to this: fresh out of school with a Master's degree, couldn't get a home mortgage to save my life - totally had demonstrated ability to pay all the costs over the previous 12-18 months, but "insufficient history to secure a 30 year mortgage." I eventually did find a willing lender with good terms, and after a few decades of on-time payments, now they're trying to shove credit in every conceivable body orifice - phone contracts in the ears, store cards up the nose, etc.

    • by jopsen ( 885607 )
    Combating repeat offenders with longer prison sentences, i.e. essentially locking them up... is not a solution.

      This is so misguided on so many levels, I don't even know where to start.
      • A reminder that the penal system serves several purposes and has several goals: punishing people who break the law; reforming law-breakers into upstanding citizens; protecting upstanding citizens from dangerous elements; threatening upstanding citizens with dire consequences if they break the law. For way too long, prisons were all about punishment, which doesn't serve much purpose, especially if the general populace doesn't know of it. Reforming criminals is much better. But you can't forget its other g

  • use algorithms, if we believe we can generalize and use statistics and averages, thinking that people are more or less robots, obeying instinct and the influence of environment and society, without free will.
    On the other hand, if we believe that each individual is different and has free will, there cannot be anything but superficial use for generalizations and averages, and certainly no sentencing algorithms.

    Of course judges etc., who sentence without algorithms, can make mistakes, but that is the price of

  • Gender? (Score:2, Interesting)

    by Anonymous Coward

    Wait, it's iCurrentYear, why would we be giving any weight to gender in sentencing? Men and women have equal rights, yes? So why should, say, someone who raped a teenager over 50 times get off scot-free just because of their gender? [mirror.co.uk]

  • by CaptainDork ( 3678879 ) on Sunday June 26, 2016 @11:53PM (#52396457)

    ... and I thought Facebook was fucked up.

  • by labnet ( 457441 )

    Hmmm.. I reckon the algos started around 1980.
    https://commons.wikimedia.org/... [wikimedia.org]

  • by cowtamer ( 311087 ) on Monday June 27, 2016 @12:46AM (#52396615) Journal

    While the idea of using an algorithm to sentence a human being is bone-chilling, you might be able to justify this as a "formula" for sentencing -- which, of course, merits its own debate.

    What is unconscionable about this is the fact that it's a SECRET algorithm. As in closed source. Essentially a secret law.

    This has no place in democracy.

    (Also, any algorithm which ingests statistical and demographic data is bound to come up with unpalatable and/or spurious demographic correlations [engadget.com] (since there is a causal link between poverty and crime and a historic link between race and poverty) which I would rather have society refrain from codifying -- in law or in actual computer code).

    • (Also, any algorithm which ingests statistical and demographic data is bound to come up with unpalatable and/or spurious demographic correlations [engadget.com] (since there is a causal link between poverty and crime and a historic link between race and poverty) which I would rather have society refrain from codifying -- in law or in actual computer code).

      Heh. That actually reminded me of Minsky's article on "Why programming is a good medium for expressing poorly understood and sloppily formulated ideas". Why, it even speaks of "A program as a court of law"! But I would like to see some experimental computerized justice - not in the sense of this thing that is being criticized, but rather as a research vehicle - to simply see if it uncovers some of the "poorly understood" and "sloppily formulated" parts of our justice systems. Perhaps it could enable us to right

      • I would be all for this just as a means of sanity checking things and finding obvious contradictions.

        The most glaring example of a contradiction I can think of is the original ACA ruling. The first part of the ruling was to determine whether the thing you had to pay for not having health insurance was a fine or a tax, in order to decide whether the plaintiffs had standing. The court found that it was in fact not a tax, and therefore the plaintiffs had standing. The second part of the ruling was the part about expanded medic
    • Also, any algorithm which ingests statistical and demographic data is bound to come up with unpalatable and/or spurious demographic correlations [engadget.com] (since there is a causal link between poverty and crime and a historic link between race and poverty) which I would rather have society refrain from codifying -- in law or in actual computer code).

      As opposed to having the same bias encoded, consciously or unconsciously, in the minds of judges?

      Doing this algorithmically at least raises the possibility of analysis and criticism (assuming it's transparent -- no argument there, transparency is critical), which means that if done with appropriate oversight and scrutiny, it may have significant advantages over human judgement. I would still want the human judge to be able to overrule the algorithm, though. And it would probably be good to add a review pr

  • Statistics (Score:5, Insightful)

    by Hognoxious ( 631665 ) on Monday June 27, 2016 @12:58AM (#52396649) Homepage Journal

    For instance, they'll skew predictions based on your gender or age -- how does that reflect the actual offender...?

    It doesn't and it's not intended to.

    It's statistics and probabilities, just like how insurance premiums work.

    • Re:Statistics (Score:5, Insightful)

      by NotInHere ( 3654617 ) on Monday June 27, 2016 @02:06AM (#52396813)

      like how insurance premiums work.

      That *is* skewing predictions based on gender and age.

      If a computer program deduces from the fact that you are male that you will live a shorter time, women end up receiving lower payments because they are likely to live longer.

      As this discriminated against women, the practice was swiftly forbidden by the EU. It probably wouldn't have been forbidden if it had discriminated against men, but that's another issue. It was actual gender-based discrimination.

      If you let a computer program take factors like skin color or religion into account, that computer program will arrive at conclusions similar to those humans arrived at:

      * Blacks commit more crimes per thousand people than whites
      * Highly educated people commit fewer crimes than less educated people
      * More Muslims commit terrorist attacks than non-Muslims

      This doesn't mean that black people are ape-like, or that less educated people are automatically criminals. It usually is just a correlation, which doesn't imply causation. If you took a black person and removed all the racist and discriminatory influences, AND the lower social status (itself caused by a lot of prior discrimination), then you won't end up with a higher likelihood of crime than if you took a "normal" white person. It's probably caused more by social status than by discrimination.

      As for Muslims, it's simply because there is no powerful Christian group radicalizing Christians to commit terror attacks. If you take the radicals out, the average Muslim is as peace-loving as the average Christian or Jew.

      Denying the existence of these correlations to protect against discrimination is wrong. It just fuels people who claim their existence is denied because of a "cover-up", or because people are too dumb (the usual criticism from people like Trump). It is also wrong, though, to practice actual discrimination, as Israel does or as Trump proposes.
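The correlation-without-causation point above can be made concrete with toy numbers (entirely invented for illustration): hold the offence rate identical across two groups within each social-status level, and the per-group totals still diverge, simply because status is unevenly distributed between the groups.

```python
# Invented counts illustrating a group-level correlation driven entirely
# by a confounder (social status). Within each status level the offence
# rate is the same for both groups.
population = [
    # (group, status, head_count, offences)
    ("A", "low",  800, 80),   # 10% rate
    ("A", "high", 200, 10),   # 5% rate
    ("B", "low",  200, 20),   # 10% rate
    ("B", "high", 800, 40),   # 5% rate
]

def rate(rows):
    """Offences per head across the given rows."""
    heads = sum(c for _, _, c, _ in rows)
    offences = sum(o for _, _, _, o in rows)
    return offences / heads

rate_a = rate([r for r in population if r[0] == "A"])  # overall group A
rate_b = rate([r for r in population if r[0] == "B"])  # overall group B
rate_low_a = rate([r for r in population if r[0] == "A" and r[1] == "low"])
rate_low_b = rate([r for r in population if r[0] == "B" and r[1] == "low"])
```

Group A's overall rate comes out higher than group B's even though, at equal status, the two groups are identical; an algorithm fed only the group label would "learn" the confounded difference.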

      • If a computer program deduces from the fact that you are male that you will live a shorter time, women end up receiving lower payments because they are likely to live longer.

        What's that got to do with the likelihood of wrecking a car?

        If you took a black person, and remove them from all the racist and discriminatory influences, AND the lower social status (that has been caused by lots of discrimination before), then won't end up with a higher crime likeliness than if you took a "normal" whi

        • What's that got to do with the likelihood of wrecking a car?

          The EU ruling was about life insurance: http://www.bbc.com/news/busine... [bbc.com]

          The example I brought up has nothing to do with car insurance, right, although there is a lot of age-based discrimination you can do for car insurance.

          Young people are likely to cause accidents due to lack of experience and overconfidence. Elderly drivers might cause accidents because they are senile or have impaired senses (eyesight, etc.).

          • by jabuzz ( 182671 )

            And you are not allowed to take gender into account when deciding on car insurance premiums in the EU either.

          • The EU ruling was about life insurance: http://www.bbc.com/news/busine... [bbc.com]

            And yet, it does mention "notably car insurance". Likewise, contrary to your (apparent) claim that this ruling is biased in favor of women, the article mentions that the premiums for car insurance and life insurance for women are likely to rise significantly as a result of this decision.

        • If a computer program deduces from the fact that you are male that you will live a shorter time, women end up receiving lower payments because they are likely to live longer.

          What's that got to do with the likelihood of wrecking a car?

          I guess that all depends on if you survive the crash. It's one of many reasons that men live statistically shorter lives.

      • by AmiMoJo ( 196126 )

        As this discriminated against women, the practice was swiftly forbidden by the EU. It probably wouldn't have been forbidden if it had discriminated against men, but that's another issue. It was actual gender-based discrimination.

        Actually, insurance companies are not allowed to charge men more for motor insurance because it is discriminatory. The EU applies the rules equally.

        * More muslims commit terrorist attacks than non-muslims

        The most recent stats I could find from the FBI [fbi.gov] say that Jews committed more terrorist attacks in the US in the period 2002-2005. I read something similar about Europe that I can't be bothered to find now, because this one has been debunked quite thoroughly.

        Denying the existence of these correlations to protect from discrimination is wrong.

        People aren't trying to deny the existence of the ones you listed that are correct; in fact they are acknowle

        • Actually, insurance companies are not allowed to charge men more for motor insurance because it is discriminatory.

          An argument could be made that this is discriminatory against women, requiring them to pay higher premiums than are justified by actuarial risk, subsidizing male drivers -- who already make more money than they do.

          • by AmiMoJo ( 196126 )

            Interesting argument. I think it would likely fail though, as the courts tend to look at issues of equality as requiring a very good reason for there to be a difference, to achieve some "legitimate" aim. By "legitimate" they mean furthering greater equality or some other human-rights-related issue, or at least not disadvantaging one group.

            There is some commentary on the ruling here: http://www.slaughterandmay.com... [slaughterandmay.com]

            Basically they couldn't argue that offering one sex lower premiums had

      • by Kjella ( 173770 )

        As for muslims, its simply caused because there is no powerful christian group that radicalizes christians to commit terror attacks. If you take the radicals out, the average muslim is as peace-loving as the average christian or jew.

        If you took away the radicals, there'd be no more Muslims strapping themselves to bombs to become martyrs, but there's an awful lot of Muslims who do support the stoning of gays, adulterers, believers who leave the faith, and people who insult the prophet as a matter of law, or who support the people who organize and commit terrorism. The last time I did the numbers there were more than 100 million Muslims who supported gross violations of human rights and a helluva lot more whom I'd call Westboro Bapt

    • by Ichijo ( 607641 )

      If convicts could get out of prison as soon as an insurance company will insure them against repeat offenses, then all sentences would be the same: they would be the grown-up version of "sit in the corner until you've learned your lesson."

      Can you think of a better incentive to rehabilitate than to know that you'll get out as soon as you're cured? Is there any good reason to keep people locked up longer than that? Is it wise to release people before they've rehabilitated?

  • The link supplied goes to a page with barely more text than a slashdot summary. Skip the middleman and go to the actual source [nytimes.com].

  • What's in the algorithm:
    if $offender = poor { $sentence = $sentence + 5 }
    if $offender = homeless { $sentence = $sentence + 10 }
    if $offender = latino { $sentence = $sentence * 2 }
    elseif $offender = black { $sentence = $sentence * 3 }
    • by Tomahawk ( 1343 )

      So a poor homeless latino gets "(sentence + 5 + 10) * 2", and a poor homeless black man gets "(sentence + 5 + 10) * 3"
      Sounds about right, unfortunately...
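For what it's worth, writing the satirical rules above out as runnable code (the numbers are the joke's, not real sentencing data) confirms that arithmetic:

```python
# Runnable version of the satirical sentencing 'algorithm' above.
# The adjustments are the joke's numbers, not real sentencing data.
def sentence(base, poor=False, homeless=False, latino=False, black=False):
    s = base
    if poor:
        s += 5          # additive bump for being poor
    if homeless:
        s += 10         # additive bump for being homeless
    if latino:
        s *= 2          # multiplier, mutually exclusive with the next
    elif black:
        s *= 3
    return s

# A base sentence of 4 for a poor, homeless latino: (4 + 5 + 10) * 2 = 38
# Same offender but black instead:                  (4 + 5 + 10) * 3 = 57
```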

  • Comment removed based on user account deletion
  • Sounds like the same algorithms that are used to calculate your insurance premium...

  • by JonahsDad ( 1332091 ) on Monday June 27, 2016 @08:17AM (#52397889)
    Living in Wisconsin, I remembered reading about this last month. Here's the article from the Milwaukee Journal Sentinel:

    http://www.jsonline.com/news/c... [jsonline.com]

    They rely on cookies to limit the number of articles that you read. Unlike similar sites, they don't block you if you block cookies.

  • The problem is the algorithms were optimized for a specific set of people, and guess what, they weren't the most common offenders.

    That is, the algorithms are supposed to predict what you will do after you get out - go legit or commit more crimes.

    They are fairly accurate for older white offenders, but very inaccurate for younger black offenders. Effectively, the algorithms were written in an attempt to be race-blind by ignoring race. But certain activities, like the number of previous interactions with the
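The per-group accuracy gap described above is exactly what an error audit surfaces: compare, for each group, how often the tool flags people as high risk who then do not reoffend. A sketch with invented records (not actual COMPAS data):

```python
# Per-group false-positive audit of a risk tool's predictions.
# Records are invented for illustration:
# (group, predicted_high_risk, actually_reoffended)
records = [
    ("older_white",   True,  True),
    ("older_white",   False, False),
    ("older_white",   False, False),
    ("older_white",   False, True),
    ("younger_black", True,  False),
    ("younger_black", True,  False),
    ("younger_black", True,  True),
    ("younger_black", False, False),
]

def false_positive_rate(rows):
    """Share of non-reoffenders who were nonetheless flagged high risk."""
    non_reoffenders = [r for r in rows if not r[2]]
    flagged = [r for r in non_reoffenders if r[1]]
    return len(flagged) / len(non_reoffenders)

fpr = {
    g: false_positive_rate([r for r in records if r[0] == g])
    for g in ("older_white", "younger_black")
}
```

With these toy records the tool never wrongly flags the first group but wrongly flags two-thirds of the second, the kind of asymmetry that stays invisible unless the error rates are broken out per group.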

  • "claims that the justice system relied too heavily on its COMPAS algorithm to determine the likelihood of repeat offenses and sentenced him to six years in prison."

    So a defendant's past isn't supposed to be used at trial, but some unknown factor about what he MIGHT do in the future is allowed in sentencing? I'm all for executing criminals, but this sort of nonsense is just plain dangerous.
