When Sentencing Criminals, Should Judges Use Closed-Source Algorithms? (technologyreview.com)
Some judges in America have recently started using a closed-source algorithm that predicts how likely convicts are to commit another crime. Mosquito Bites shared an article by law professor Frank Pasquale raising concerns about the algorithms:
They may seem scientific, an injection of computational rationality into a criminal justice system riddled with discrimination and inefficiency. However, they are troubling for several reasons: many are secretly computed; they deny due process and intelligible explanations to defendants; and they promote a crabbed and inhumane vision of the role of punishment in society...
When an algorithmic scoring process is kept secret, it is impossible to challenge key aspects of it. How is the algorithm weighting different data points, and why? Each of these inquiries is crucial to two core legal principles: due process, and the ability to meaningfully appeal an adverse decision... A secret risk assessment algorithm that offers a damning score is analogous to evidence offered by an anonymous expert, whom one cannot cross-examine... Humans are in charge of governments, and can demand explanations for decisions in natural language, not computer code. Failing to do so in the criminal context risks ceding inherently governmental and legal functions to an unaccountable computational elite.
This issue will grow more and more important, the law professor argues, since there's now proprietary analytics software that also predicts "the chances that any given person will be mentally ill, a bad employee, a failing student, a criminal, or a terrorist."
No (Score:2, Insightful)
No
Re: (Score:2)
Agreed. Algorithms are made by man, and as such, are fallible; that they involve math / programming (Magic) merely obscures their origin.
Re: (Score:3)
Your sarcasm is misaimed. The problem is not that they are algorithms--it of course has always been algorithms, though not always machines; "laws, not men". The problem is that they are closed source: sentences are being meted out by a set of rules that those being sentenced are not allowed to know. That's not acceptable.
Re: No (Score:2)
The problem is that they are closed source: sentences are being meted out by a set of rules that those being sentenced are not allowed to know. That's not acceptable.
Then we're agreed, sentences should be determined by sentencing guidelines set at the whim of politicians attempting to look 'tough on crime' (like, during the crack cocaine epidemic in the 80s/90s), and then sympathetic when our prisons are over-crowded with victims of 'three strike' regulations?
Frankly, I'd like it to be in the hands of judges, with very limited exceptions, but that will never fly in today's litigious society.
Re: (Score:2)
Re: (Score:3)
Saying that its acceptable to use algorithms doesn't mean you think the algorithms in use are the right ones.
Which ensures that the unfairness is better covered up and harder to uproot (not that it's easy now).
Re: (Score:2)
> Frankly, I'd like it to be in the hands of judges
That makes sense almost anywhere but in America - you have elected judges. Judges are politicians themselves. Judges can be complete fucking morons with no comprehension of the law, just their own collection of prejudices. Judges can be elected just by being popular - or perceived as "tough on crime" - in some dumb-fuck inbred county.
But yeah, regardless of that, sentencing algorithms, like the laws themselves, should be open source whether they're ru
Re: (Score:2)
The algorithms themselves are actually the least important aspect. As I have said before [slashdot.org], even if the algorithms are 100% open and transparent, that means nothing if the data fed into them is poor. If a bank uses an algorithm to determine whether it wants to lend money to you, how is the data about you collected? Who decided to classify you as, say, a medium-risk person? What criteria did he/she/they use for that? How thorough were he/she/they in gathering decision material? What did he/she/they miss/ignore/mis
Re: (Score:2)
"I love love LOVE the official Slashdot forced assumption of treating everyone like criminals being OK"
I have been happily insulting Slashdot for 15+ years but even I have to say it's totally the other way here. Slashdot overwhelmingly takes "innocent until proven guilty" way too far; remember Hans Reiser?
Re: (Score:2)
Slashdot is much more pro-gun and libertarian than the average person. And that includes being more anti-crime (which is undefined in many circumstances). Being libertarian, I'd expect a small-government stance on crime. More fines (linked to income), and public lashings would be much much cheaper than time in prison, and no less effective, but slashdot, like American
Re: (Score:2)
In what universe?
The Constitution I KNOW says that every person shall enjoy EQUAL rights, privileges and immunities, the exact opposite of the "Winner take all" Libertarian pro-corporate power posture.
Short answer: No (Score:3)
Long answer: Nooooooooooooooo!!!!!!
This is as crazy as having closed source software determine guilt (think breathalyser, radar gun etc) -- which means it is probably inevitable.
In the legal system, on every encounter with an object running code (firmware or otherwise), an accused party should move for discovery and expert analysis of said code, and of how it was developed.
Were they using best practices? Running valgrind? (Let's see the suppression list.) AddressSanitizer, ThreadSanitizer, the clang static analyzer? Etc.
Do they have unit tests? Do they track code coverage of those tests? (it had better be 100% across the board)
Even using all the latest tools and with plenty of unit tests with good coverage, there are still bugs.
Re: (Score:2, Interesting)
think breathalyser, radar gun etc
Those are not really comparable, they measure something by objective metrics, and the reliability of the results produced by those tools can be verified independently, even if the tools themselves are closed source. How likely someone is to commit a crime again is not something that can be reliably calculated by using some well defined formula, thus, a secret algorithm that tries to do that is far worse than the quoted examples.
Re: (Score:1)
At the very least, they could reveal the algorithm used to determine the risk. If that's a trade secret, it calls into serious question how accurate it actually is.
Re: (Score:2)
I'm assuming you are trolling, but anyway ... The question here isn't guilt or innocence, as stated in the summary. Guilt has already been determined before these algorithms are then applied to attempt to predict the likelihood that the convicted will be a repeat offender. Sentence lengths are then determined (in part) based on this score.
And it is, of course, highly unlikely that race is even an input into the system. It is, however, still possible that other criteria which might be used (income, educati
Re: (Score:2)
Re: (Score:2)
Incredibly unlikely. All it would take is one leak showing that race is included as a factor and the lives of just about everyone involved in it would be ruined. Why would they risk that? The warm fuzzies of satisfied racism don't even seem remotely plausible as a motivation for taking this risk.
Re: (Score:2)
Of COURSE race will not be explicitly included in the algorithm inputs. It will include all sorts of things correlated with race instead.
Re: (Score:2)
Re: (Score:2)
Not the same thing. The problem isn't so much that the software is closed source as that the algorithm is. Breathalyzers and radar guns may have closed-source software, but what they measure is a well-publicized, independently verifiable fact.
Re: (Score:2)
This is as crazy as having closed source software determine guilt
Or indeed using software at all, when it comes to doling out justice. The law is imperfect - one might almost say by design - and is only ever meant to be a guideline or framework for the judge, who is there to add some human insight and understanding. I think it is a very disturbing idea, using computers to "calculate" justice; before we know it, we will have courts run by call centers.
Re: (Score:3, Insightful)
There are many places where there simply is no open alternative, and the closed alternative was funded by a company that would never open its source, and shouldn't be forced to (after all, they paid for the research and development and should be able to profit from it).
No one is proposing forcing them to open source their code, they're simply proposing not using their code, and letting the market do the rest.
Companies don't have the right to make a profit just because they spent money on research. They only get to make a profit if their product is actually something useful. In this case, the parent poster argues that it's not useful since you can't verify any of the determinations it makes.
Re: Short answer: No (Score:4, Insightful)
Not using their code means an immediate loss of good tools/software, which will make it harder for government to fairly evaluate criminal cases. Court cases will become even more of what one person said versus what the other said, and therefore have worse outcomes than if technology was used to provide evidence.
This assumes that the tools/software are good (proof?), that they allow for easier and fairer evaluation of cases (proof?), and that not using these tools would lead to worse outcomes for the defendant (proof?). Proving such things would require knowing and understanding the details of how the algorithm works, which is impossible while the algorithm is closed.
Yes, of course there is potential for abuse in closed source software, but most likely there isn't blatant abuse, because that would look really bad if it got out that the government was, for example, targeting minorities or gays directly
It wouldn't be the first time the government got caught doing something reprehensible that it assumed would never become public. Furthermore, the "abuse" doesn't need to be malicious; it can also arise from unexpected behaviour of the algorithm in specific cases.
Having it closed source isn't ideal, but if it can be reliably shown to make accurate predictions, it's better than nothing.
No matter how accurate it is, the defendant has the right to know the reasoning leading to his sentencing and "the algorithm we trust said so" is definitely not enough.
Re: (Score:2)
They can reveal the algorithm without revealing the source code. This is just secrecy so the government can get away with bias.
When freedom is contingent on secret rules, freedom is lost. The same goes for "copyrighted" laws where you can't get a copy without paying for it.
Re: (Score:2)
Even better, just put the raw anonymized recidivism data on Kaggle [kaggle.com] and let everyone compete to come up with the best model.
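A minimal sketch of what a baseline entry for such a competition might look like, assuming a hypothetical anonymized dataset (the file and column names here are invented for illustration):

```python
# Hypothetical baseline for an open recidivism-prediction benchmark.
# "recidivism.csv" and its columns are invented for illustration.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("recidivism.csv")
X = df.drop(columns=["reoffended"])  # features
y = df["reoffended"]                 # binary label: reoffended or not

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```

The point of an open competition like that would be that anyone can reproduce the evaluation, rather than having to trust a vendor's accuracy claims.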
Re: (Score:2)
Race is not an allowed input...duh.
The model will be discriminatory based on zip code instead. Smart defense attorneys will move the defendant, on paper, to a whitebread suburb.
Re: (Score:2)
What you just wrote accepts bias and proposes hacking around it as the solution.
A legal system that requires hacking to approximate an unbiased outcome (or to alter the bias in the defendant's favour) is inherently biased.
BTW, race doesn't need to be an explicit input - there are many proxies for race, zip code being one of them. number of encounters with police is another. number of relatives with convictions (and relatives with long sentences) too.
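One way to demonstrate the proxy effect, sketched here on hypothetical data (every file and column name below is invented): train a model to predict the nominally excluded attribute from the remaining features. If it predicts well above chance, the "excluded" attribute is still effectively an input.

```python
# Proxy check (sketch): can the remaining features reconstruct the
# attribute that was nominally left out? "features.csv" and all
# column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

df = pd.read_csv("features.csv")
X = df[["zip_code_median_income", "police_encounters", "relatives_convicted"]]
y = df["race_flag"]  # the attribute the model supposedly never sees

# Accuracy well above the base rate means the proxies leak the attribute.
scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5)
print("mean accuracy:", scores.mean())
```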
Re: (Score:2)
Crime statistics are inherently biased...or something.
There is simply no way to find inputs that won't proxy for race. Until the black community gets its act together.
Re: (Score:3)
Sorry, but the Constitution contains no feasibility clause. The people, especially those who stand accused of a crime, have a right to examine the evidence against them. That includes raising questions of its trustworthiness in the particular circumstances of their case. The choice is simple. Want law enforcement to use your breathalyzer? Open the source. Otherwise it's not legally possible. Your choice. Faced with that, I'm guessing they will find a way to open it.
Re: (Score:2)
This is fucking rubbish. There's such a thing as black box testing. You don't need to see the source for a breathalyser, you can calibrate it with a dozen college students and a keg.
Re: (Score:3)
No, you can't. There's a great many medical conditions and simple genetic quirks that can throw a breathalyser off. Without opening the black box, you'd need a huge sample size to get even modest coverage of those cases where it wouldn't read true.
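To put rough numbers on that (the prevalence figure is assumed, purely for illustration): the chance that a sample of n subjects contains at least one person with a rare confounding condition of prevalence p is 1 - (1 - p)^n.

```python
# Sketch: with an assumed 1-in-1000 condition, how likely is a test
# group of n subjects to include even one affected person?
p = 0.001
for n in (12, 100, 1000):
    print(n, round(1 - (1 - p) ** n, 3))
# 12 -> 0.012, 100 -> 0.095, 1000 -> 0.632
```

A dozen college students surface the problem about 1% of the time; even a thousand subjects would miss it more than a third of the time, and you'd need several affected subjects to actually characterize the failure mode.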
Re: (Score:2)
The way the system works, the software isn't "open source" but was seen by non-employees. The devices were tested in court and found to be reliable. Then, under the "common law" system, when they are trusted to a standard, all courts accept them as trusted black boxes, and attacks against the accuracy aren't allowed. Thank the speeder
Re: (Score:2)
In most cases, a breathalyzer fail isn't an automatic conviction (though a refusal is). So take the flawed test, and get two doctors to testify that the result is invalid.
Testify based on what? The sample was destroyed and the court is somehow OK with that. In many jurisdictions, no blood draw will happen even if the defendant asked for one. We are left with just that number and hearsay claiming its accuracy in that particular instance.
As for the digital photo, in many cases I WILL be wanting the raw image data, not because I believe a random bit flip during compression somehow made a duck appear, but because the technical bar for shopping in a duck is quite low.
Re: (Score:2)
Ideally, yes. There are jurisdictions that now accept the breathalyzer alone.
Re: Short answer: No (Score:2)
Right! Because that sort of testing really worked in the auto emissions industry.
Re: (Score:2)
This is not "evidence against them." Judges often have wide discretion to determine sentences; if a judge can just arbitrarily pick a sentence between 2 and 10 years, I'd rather go with the algorithm.
Re: (Score:2)
The problem is that the algorithm might have an illegal bias against some class of defendant.
Re: (Score:3)
Re: (Score:2)
I was speaking to the more general case. However, there are some prejudicial factors that may not be considered in sentencing. A black box offers no assurances those factors were correctly left out of consideration. For all we know, the thing is using numerology.
Re: (Score:2)
Even the idea of forcing all future software to be open source is unrealistic because companies won't do the R and D without profit potential.
Open source does not mean non-profit. There are plenty of profitable open source companies.
Re: (Score:2)
I think some people are thinking open source as in GPL or MIT or BSD. In this case, the source only needs to be open to inspection by a party trusted by the defendant/convicted; there needs to be no licensing involved, and it can be done by independent analysis under non-compete and carefully crafted non-disclosure agreements that only allow the disclosure of problems or the lack thereof.
I do not otherwise disagree with your statement. It just doesn't need to go that far. If you or I are subject t
Re: (Score:2)
The software itself doesn't need to be open source, but the algorithm and data need to be public knowledge. Otherwise we - the public - can't confirm whether we're just making the courts more efficient or whether we're institutionalizing undesired biases or creating other injustices.
Re: (Score:2)
the software also needs to be open source - there's no other way to verify that it implements the open algorithm and uses the open data.
Re: (Score:2)
Undesired by whom? It looks like a great way to hide biases someone desires.
Government should be all open source (Score:4, Interesting)
Government, as being tax funded, should use entirely open source software and open formats. Anything otherwise is favoring certain corporations (Microsoft formats for example) or having potential to be abused (FBI backdoors in government software).
We the people elected them. We the people should be able to inspect them.
Re: (Score:2)
"I sentence you to ten years? Why? Well, I can't tell you why, because then your lawyer might get you off." This is the kind of legal procedure you're arguing for?
Absolutely not... (Score:1)
This is tricky, but I think "No" (Score:2)
This is tricky because most people don't get the issues related to open/closed source. Remember that the majority of people now have smart phones and tablets with their walled garden of app ecosystems. The idea of libre software simply does not occur to them. They likely didn't grow up learning computers in the era when source code commonly shipped with the computer and you could inspect/tweak it if something went wrong.
That said, I firmly believe that any software which is used in determining innocence/
Hail Friend Computer! (Score:1)
"I'm sorry, Citizen - the reason for your incarceration is not available at your security clearance level. Please report to you closest walled garden for mandatory fun."
-The Computer.
What do they do now? (Score:5, Insightful)
Imagine a judge just picks a number out of the air based on his own experience, opinions and yes, prejudices.
That's using a closed source algorithm, except it runs on a wet carbon platform rather than silicon.
Re: (Score:2, Insightful)
Yes, and it's trivially easy to get people to believe the judge might have made a 'bad call' in your case because we all know that people make mistakes.
The problem with the black box algorithm is that in spite of years of things blowing up in their faces, a number that came out of a computer is still imbued with an air of 'rightness' that it most assuredly doesn't deserve. But that air of 'rightness' will make it much harder for a defendant to challenge inappropriately harsh sentencing because people think
Rightness (Score:1)
People's irrational belief in the accuracy of technology.
"Garbage in; garbage out" has been forgotten by the current generations.
AND - you can just bet some very sharp person will figure out how to game the algorithm.
CONSULTANT: Let's see, several drug offenses, rape, homicide - order the defendant to go to Church and "find Jesus".
COMPUTER: "200 hours of public service and time served."
Re: (Score:2)
It also makes it harder to challenge inappropriately lenient sentencing (in jurisdictions where that's permitted).
Re: (Score:2)
And like I said, it's the same as it is now - except s/Computer/Judge/
Re: (Score:2)
"Yes, and it's trivially easy to get people to believe the judge might have made a 'bad call' in your case because we all know that people make mistakes."
Doesn't really matter what people believe or not; if the judge has the discretion to select a number, that number is almost always going to stand.
Re: (Score:2)
Why not? People already appeal their sentences for being "too harsh" - whether it was made by a judge, or whether it wa
Re: (Score:2)
I don't see what the fuss is about. It's not that different to how insurance companies determine premiums.
Re: (Score:3)
Moreover, this is being used for sentencing, not for determining one's innocence. The defendant's guilt has already been decided at this point. And on top of that, bounds are already established for what penalties can be levied for each variety of crime, so it can't exceed those bounds either. As you said, the judge is already a black box. All this does is apply some consistency between the black boxes we're using.
Re: (Score:2)
Re: (Score:2)
There's an algorithm that will break that into paragraphs for you.
Perhaps they don't know themselves? (Score:2)
Perhaps the secret is that the algorithm is a black box method, and no-one actually knows exactly why it scores the way it does?
Neural networks are all the rage now, and they are one of the absolutely worst when it comes to transparency and intelligibility.
Had the company worked with rule-based classifiers or other transparent machine learning I'm sure they'd be happy to show just how smart they are, snippet by snippet. After all, we only want to avoid showing others when we're doing something simple or stu
Re: (Score:2)
I doubt they'd be willing to show off even if they thought they were clever. They'll be afraid that someone will point out something stupid that they've done which calls the whole mess into question.
The best thing to do with a naked emperor is to not let him out in public.
Re: (Score:2)
That makes it especially problematic since they then can't prove that the system didn't infer race and then use that in hidden variables to determine the sentence.
Re: (Score:2)
You still have to be quite careful if it's inferring race rather than having it input. Just because you imagine the former gang affiliate from Compton to be white doesn't mean the software does.
Re: (Score:2)
In other words, it influences the judge's determination of sentence without any ability to see its 'reasoning'. And race is just an example of many factors that it would be illegal for it to consider. Others would include gender and poverty.
It's not a problem, it's a feature. (Score:2)
Just think how much money the state will save on legal costs if convicts are unable to appeal their sentencing. Sure, there's the chance of giving people a few more years in prison than is really appropriate, but has there ever been a time when the public called for more rights for convicted criminals?
It is not about the software (Score:2)
AFAIK software has no legal rights.
It should be the responsibility of the lawyer using the software. Closed or open source doesn't matter, what is important are the results and the legal entity backing them.
If a lawyer uses voodoo magic to find culpability, why not, as long as the facts are right. If not, he'd better get a good explanation; "magic" won't cut it. Same for the software. The good thing with open source is that it is easier to explain results when challenged.
I'd like opensource everywhere but I d
Lack of reason(ing) (Score:2)
It is obvious that the answer should be no.
Sentencing decisions should include the reasoning that motivates the decision.
"Because The Software Says So" is no more valid than "Because I Say So".
I think that if software is being used, not only should the software be Open Source, the sentencing document should also include a mathematical explanation of the algorithm the program uses, together with the input parameters for the person being sentenced and the statistics the decision leans on.
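As an illustration of what such a disclosure might look like (the functional form is typical of these instruments, but any concrete features and weights would be invented here), a linear risk score:

$$\text{score} = w_0 + \sum_i w_i x_i, \qquad P(\text{reoffend}) \approx \frac{1}{1 + e^{-\text{score}}}$$

With the weights w_i and the defendant's inputs x_i printed in the sentencing document, anyone could recompute the score and contest any individual input or weight.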
Re: (Score:2)
Re:Lack of reason(ing) (Score:5, Funny)
As a lawyer, I will admit most lawyers are not very mathematically savvy, except when it comes to calculating legal fees, then they turn into Srinivasa Ramanujan.
The answer to headline questions: NO (Score:2)
Others have already shown why the answer to this question is No.
I would like to point out why No is always the answer to questions like this.
If the answer is Yes, the editor rephrases the question as a statement; i.e., no one writes a headline "Did Trump get elected?" Instead we write "Trump wins."
We only use a question when the real headline would be boring, so we re-write it to sound more exciting and make it a question so we can't get sued. So if we know that Trump has not resigned, but we want peop
A better alternative (Score:3)
Use compassion and moral guidance. Remember, a prison is supposed to be about rehab, not outright punishment. The Nordic countries do it right.
Re: (Score:3)
Prisons have multiple functions: they protect society by physically separating criminals, they serve as punishment, they serve as deterrent, and they may also rehabilitate. If you say that their purpose ought to be only rehabilitation, well, you're probably largely on your own.
That's your opinion, not a fact. Many people are offended by Breivik's conditions of imprisonment [bbc.com] for example.
Furthermore, it's
Re: (Score:3, Insightful)
> the US has a much more diverse population, and hence we have a much larger number of people who are difficult to rehabilitate.
The elephant in the room is: the US has a revenge-based justice system. It is not alone in this; the element of revenge is, alas, quite widespread in our societies, sometimes stronger, sometimes weaker.
In a modern society, revenge has no place. Protect the society, OK. Learn from mistakes (the "perpetrator", but also the society surrounding him/her), definitely. But revenge? We are at one level wi
Re: (Score:2)
I'm more interested in the end result. If we rehabilitate a prisoner, we reduce the number of criminals and increase the number of productive citizens. If we don't, we don't.
I also don't see how any form of prison time wouldn't be considered somewhat retributive.
Re: (Score:2)
Sure we do. Look under "Minimum Security Prisons":
https://www.bop.gov/about/faci... [bop.gov]
I believe SCOTUS found that there was no constitutional right to rehabilitation; that doesn't mean that the federal government never rehabilitates. But if there's a point you're trying to make, please cite the actual decision you're referring to.
Re: (Score:2)
I don't see how that environment is all that different from typical monastic life: manual labor, communal living, meditatio
Re: (Score:2)
You probably can keep people from committing violence by keeping them entertained, but that's merely treating the symptoms. Most people might not choose a monastic or military life, but if they g
short answer (Score:3)
Government needs to be able to explain its decisions about citizens to the public. Private organizations don't.
So, proprietary analytical software that cannot justify its decisions is not acceptable for courts of any kind, for policing, for awarding government contracts, for public schools, etc.
It is acceptable for businesses, medical providers, employee evaluations, private schools, etc.
Absolutely! (Score:4, Funny)
Gender issues (Score:2)
https://www.law.umich.edu/news... [umich.edu] For those too busy to read the citation, the research by Professor Sonja Starr indicates men receive prison sentences that average 63% longer than women convicted of the same crime. Women are also twice as likely to avoid incarceration altogether. The paper itself: https://papers.ssrn.com/sol3/p... [ssrn.com]
So what we have here a bit of a minefield. If we are to continue this practice, the algorithm must t
Re: (Score:2)
Men commit far more crimes than women. If the premise of basing sentencing on probability of future crimes is accepted, gender (and race etc.) disparity makes perfect sense. Even if the algorithm is not given gender and race inputs, it's likely to reach the same conclusions by considering other factors such as growing up in poverty.
Re: (Score:2)
Men commit far more crimes than women
Men are prosecuted for crimes more than women, receive harsher sentences and are also more likely to have untreated mental issues that lead to behaviour considered criminal.
So no, putting gender (or race) in doesn't make sense at all. In fact leaving gender out would probably help remove some of the sentencing discrepancy.
Re: (Score:2)
Men commit far more crimes than women. If the premise of basing sentencing on probability of future crimes is accepted, gender (and race etc.) disparity makes perfect sense. Even if the algorithm is not given gender and race inputs, it's likely to reach the same conclusions by considering other factors such as growing up in poverty.
Wow - good to see that the bigots have chimed in. That - iamcat - would be you.
Any other groups whose members you want punished based on your pre-decided bigotry?
Re: (Score:2)
You seem to be assuming that the longer sentences are the fairer/better ones. If the "that guy looks big and scary" bias is removed and everyone gets the shorter sentence, it's good for all.
Re: (Score:2)
You seem to be assuming that the longer sentences are the fairer/better ones. If the "that guy looks big and scary" bias is removed and everyone gets the shorter sentence, it's good for all.
Explain how I suggested anything other than that there is a verifiable difference between male and female sentencing. I said nothing about the length of punishment I desired. For the record, punishments in the US tend to be longer than I think is adequate.
As for assumptions, you appear to be assuming what I am assuming. In this particular case, you are wrong.
Re: (Score:2)
3 people shoot a gun at other people. One person misses hitting anyone, and gets a few years in prison, one person wounds someone and gets several years in prison, one person kills someone and get 20-life or the death penalty. Why? All three shot at people. Why should the results make a difference?
The study I cited is not based just on Murder, but arrests and outcomes across a broad spectrum.
As for the results not making a difference: that's a real slippery slope. Is thinking you want a person dead a crime? Buying a firearm for self defense against the person? There's a reason that the severity of the crime is the main consideration for the punishment.
Re: (Score:2)
Not even going to bring up race?
We all know that black people go to jail at higher rates than whites. Considering or not considering race as a factor is also terrible.
Race is also a problem, according to other research by Professor Starr: https://papers.ssrn.com/sol3/p... [ssrn.com] It's real, and it's happening. Interestingly, the bias in favor of women is such that the gender gap is about six times as large as the racial disparity. Read the articles for the info.
Except...human nature (Score:2)
While the open-sourcing (for critical purposes) of such an algorithm is all well and good in theory, the fact is it would then be gamed for advantage.
If, for example, there is a statistically significant correlation between defendants who show up at trial with a facial tattoo and later re-offending, any intelligent defense attorney is going to have it either covered with makeup or removed.
Re: (Score:2)
Re: (Score:2)
Lawyer: Look, the alogorithm will give you a lower sentence if you get a haircut, find a job and stop getting drunk every night.
Society: Stop gaming the system!
Algorithms cannot be "closed source" (Score:2)
We are talking about Law, so pedantry and precision are the way to go[*]...
By definition an algorithm cannot be "open source" or "closed source". It might be proprietary ... but that is a different thing.
Personally, I think that using some sort of Big Data / AI / Machine Learning thing to abdicate a Judge's responsibilities would seem to be the wrong way to go, particularly if you are using them to predict someone's future behaviour -- why have a Judge at all if you are going to do that?
Furthermore, pre
Counterargument (Score:5, Informative)
Frank Pasquale left out a couple of details in his opinion piece. First, these algorithms are only used in determining sentences, not in determining guilt. At that point in the trial, guilt has already been established beyond a reasonable doubt. At the sentencing phase the rules of evidence no longer apply and almost anything goes; that's why the prosecutor puts crying victims on the stand.

There have been two state supreme court cases (Malenchik v. Indiana and Loomis v. Wisconsin) that challenged the use of algorithms in sentencing, and both upheld (the latter with some minor restrictions) that these algorithms can be used. The general condition is that the algorithm has to be scientifically sound, which was the case in both challenges, as there is peer-reviewed literature examining the algorithms. Frank neglected to mention that.

The secrecy of the algorithms is a consequence of patent and copyright law, btw. The algorithms in these cases are a scoring function, and math is not (and should not be) patentable or copyrightable. The consequence is that the manufacturer's only recourse is to keep it a trade secret. That could be solved better, but in my opinion people shouldn't get their hopes up that there's some exploitable loophole in the algorithm or something.
We can debate whether assessments (actuarial prediction instruments) should be used in sentencing or criminal justice. I'm very much in favor, as it reduces bias and leads to reproducible results. It's much easier to control for biases in decision making with statistical methods than it is to control or fix bias in humans. Does anybody believe that human judgement is less biased? You can read up on the work of Paul Meehl, who spent his lifetime showing that even simple assessment tests outperform the judgment of trained clinicians. Part of sentencing is taking into consideration how likely the perpetrator is to commit a new offense. Humans suck at making predictions and estimating probabilities. This is no different in criminal justice.
Let me end by pointing out some of the positive change that systems like this have brought: early release from incarceration. Low-risk prisoners are more frequently released early (not just from overcrowded California prisons when ordered to do so by a federal judge) and then put on probation/parole. And work out well it did: http://time.com/4065359/califo... [time.com] The expected crime wave from federally mandated early release didn't materialize, in my opinion thanks to these prediction models.
There are many things wrong with policing and criminal justice in the US, but the move to what’s generally referred to as “evidence based practices” (incl. actuarial prediction instruments) has been pretty positive. The great part is that both Dems and Reps are behind the idea of risk assessments so we might actually see some change for the better.
Past performance etc. (Score:2)
The entire Concept is Flawed (Score:2)
Locking a person away because you think they will probably commit future crimes.
A justice system is about objectivity. It is not about punishing someone because you dislike them. Justice is blind, not a popularity contest.
Re: (Score:2)
A justice system is about objectivity. It is not about punishing someone because you dislike them. Justice is blind, not a popularity contest.
What's more objective than using an algorithm for sentencing that is using data that has been shown to be predictive for recidivism? Indeed it's not about disliking somebody but about public safety and deterrence.
Consistent or arbitrary? (Score:2)
When a judge "decides" on a sentence, there can be many factors that influence the result. Not all of them are strictly relevant or objective and not all of them are easily explainable. Different judges would give different punishments for the same crime. Hell, the same judge can give different sentences for seemingly identical circumstances. When a program computes a sentence then the same data entered each time will lead to the same result.
Whether it is an open-source program that gives a result or a pr
The EU says "no" (Score:3)
Re: (Score:2)
With a lot of caveats, of course, but basically an individual has the right to 'an explanation of the decision reached after [algorithmic] assessment'. Described in http://fusion.kinja.com/eu-cit... [kinja.com]
Interesting. Thanks for sharing. The decisions of linear models (like the ones used for sentencing, credit scores etc.) can indeed be explained. The credit bureaus sell explanations of the score and tips on improving the score as an extra service, for example. That doesn't mean that people will like the explanation... If the EU will indeed require explanations for algorithmic decisions, then models will be limited to simple linear models.
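For a linear model that kind of explanation is mechanical: each feature's contribution to the score is just its weight times its value, so the largest contributions can be reported back as reason codes, credit-bureau style. A sketch with invented weights and features:

```python
# Reason codes for a linear score (sketch; all weights and feature
# names are invented for illustration).
weights = {"prior_offenses": 0.8, "age_at_first_offense": -0.05, "employed": -0.6}
person = {"prior_offenses": 3, "age_at_first_offense": 19, "employed": 0}

# Each feature's contribution to the score is weight * value.
contributions = {f: weights[f] * person[f] for f in weights}
for feature, c in sorted(contributions.items(), key=lambda kv: -abs(kv[1])):
    print(f"{feature}: {c:+.2f}")  # largest contribution reported first
```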
Re: (Score:2)