How would you like to be sentenced by a computer program?

mobilehaathi

macrumors G3
Original poster
Aug 19, 2008
9,347
6,215
The Anthropocene
When Chief Justice John G. Roberts Jr. visited Rensselaer Polytechnic Institute last month, he was asked a startling question, one with overtones of science fiction.

“Can you foresee a day,” asked Shirley Ann Jackson, president of the college in upstate New York, “when smart machines, driven with artificial intelligences, will assist with courtroom fact-finding or, more controversially even, judicial decision-making?”

The chief justice’s answer was more surprising than the question. “It’s a day that’s here,” he said, “and it’s putting a significant strain on how the judiciary goes about doing things.”

He may have been thinking about the case of a Wisconsin man, Eric L. Loomis, who was sentenced to six years in prison based in part on a private company’s proprietary software. Mr. Loomis says his right to due process was violated by a judge’s consideration of a report generated by the software’s secret algorithm, one Mr. Loomis was unable to inspect or challenge.
https://mobile.nytimes.com/2017/05/01/us/politics/sent-to-prison-by-a-software-programs-secret-algorithms.html

Maybe you see nothing wrong with this? At a minimum, a defendant has a constitutional right to inspect and challenge all evidence used against him. But maybe you think that Big Data is Unbiased and Fair?

Well perhaps you should consider this:

In debates over the future of artificial intelligence, many experts think of the new systems as coldly logical and objectively rational. But in a new study, researchers have demonstrated how machines can be reflections of us, their creators, in potentially problematic ways. Common machine learning programs, when trained with ordinary human language available online, can acquire cultural biases embedded in the patterns of wording, the researchers found. These biases range from the morally neutral, like a preference for flowers over insects, to objectionable views of race and gender.
https://www.sciencedaily.com/releases/2017/04/170413141055.htm

Machine learning is a means to derive artificial intelligence by discovering patterns in existing data. Here, we show that applying machine learning to ordinary human language results in human-like semantic biases. We replicated a spectrum of known biases, as measured by the Implicit Association Test, using a widely used, purely statistical machine-learning model trained on a standard corpus of text from the World Wide Web. Our results indicate that text corpora contain recoverable and accurate imprints of our historic biases, whether morally neutral as toward insects or flowers, problematic as toward race or gender, or even simply veridical, reflecting the status quo distribution of gender with respect to careers or first names. Our methods hold promise for identifying and addressing sources of bias in culture, including technology.
http://science.sciencemag.org/content/356/6334/183
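To make concrete what the paper is measuring, here's a toy sketch of that kind of association test. The little hand-made vectors below are purely illustrative (the actual study uses GloVe embeddings trained on web text); only the scoring idea, how close target words sit to "pleasant" versus "unpleasant" words, follows the paper.

Code:
import numpy as np

def cosine(a, b):
    # Cosine similarity between two word vectors.
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

def association(w, pleasant, unpleasant):
    # How much closer word w sits to the pleasant set than to the unpleasant set.
    return (np.mean([cosine(w, p) for p in pleasant])
            - np.mean([cosine(w, u) for u in unpleasant]))

# Toy 3-d "embeddings" (made-up numbers, not from any real corpus).
flowers    = [np.array([0.9, 0.1, 0.0]), np.array([0.8, 0.2, 0.1])]
insects    = [np.array([0.1, 0.9, 0.0]), np.array([0.2, 0.8, 0.1])]
pleasant   = [np.array([1.0, 0.0, 0.0])]
unpleasant = [np.array([0.0, 1.0, 0.0])]

# Differential association of the two target sets (the WEAT-style statistic).
score = (sum(association(f, pleasant, unpleasant) for f in flowers)
         - sum(association(i, pleasant, unpleasant) for i in insects))
print(score)  # positive: "flowers" lean pleasant, "insects" lean unpleasant

Swap in embeddings actually trained on web text and the same arithmetic reproduces the race and gender associations the paper reports, which is the point: the bias arrives with the data.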

Good luck.
 
  • Like
Reactions: Eraserhead

DrewDaHilp1

macrumors 6502a
Mar 29, 2009
578
11,573
All Your Memes Are Belong to US
That's not quite what happened. The man was convicted, presumably by a jury (good luck finding any info via Google about his original trial), and the sentence was handed down by a judge. Six years is pretty light considering that he had already been convicted of sexual assault, he was in a stolen car, and he evaded a LEO. So, a history of violence and grand theft. My care meter is pegged at zero.
 

shinji

macrumors 65816
Mar 18, 2007
1,306
1,497
If we're going to use algorithms for sentencing at all, then they should be open source. We have no idea if this private company's trade secret is heavily biased, and yet the whole selling point is that the algorithm is "more objective."
 

Eraserhead

macrumors G4
Nov 3, 2005
10,300
10,372
UK
That's not quite what happened. The man was convicted, presumably by a jury (good luck finding any info via Google about his original trial), and the sentence was handed down by a judge. Six years is pretty light considering that he had already been convicted of sexual assault, he was in a stolen car, and he evaded a LEO. So, a history of violence and grand theft. My care meter is pegged at zero.
That's not what the justice system is supposed to do. The right of a fair trial applies to everyone.
 
  • Like
Reactions: satcomer

satcomer

macrumors 603
Feb 19, 2008
6,300
929
The Finger Lakes Region
That's not what the justice system is supposed to do. The right of a fair trial applies to everyone.
I'm of the mind that justice is not fair. Just ask a Black man, or any other man falsely accused of rape! Then think real hard about a sexual predator who happens to be a cute white female teacher and the slap on the wrist they always get! :mad:
 
Last edited:

yaxomoxay

macrumors 68040
Mar 3, 2010
3,606
24,517
Texas
I'm of the mind that justice is not fair. Just ask a Black man, or any other man falsely accused of rape! Then think real hard about a sexual predator who happens to be a cute white female teacher and the slap on the wrist they always get! :mad:
What?
 

Raid

macrumors 68020
Feb 18, 2003
2,144
3,926
Toronto
To answer the question in the OP, it depends on whether I have access to the source code or not... :D

Seriously though, I think the algorithm can be a useful tool, but it shouldn't be the final say.
 

chown33

Moderator
Staff member
Aug 9, 2009
8,369
4,357
Pumpkindale
To answer the question in the OP, it depends on whether I have access to the source code or not... :D

Seriously though, I think the algorithm can be a useful tool, but it shouldn't be the final say.
I realize you posted with a smiley, but source code alone won't suffice. You'd also need the entire training data set, and the order in which the data was applied.

The program's behavior depends on multiple factors, only one of which is source code. Think of the training data as a huge blob of state. The output depends on the next input, as well as all the variables making up the current state.

"Machine learning" is really "machine-directed data acquisition", where the "machine-directed" part uses some of the previously acquired data to decide how to process subsequent data.
 
  • Like
Reactions: mobilehaathi

mobilehaathi

macrumors G3
Original poster
Aug 19, 2008
9,347
6,215
The Anthropocene
To answer the question in the OP, it depends on whether I have access to the source code or not... :D

Seriously though, I think the algorithm can be a useful tool, but it shouldn't be the final say.
Well, then we can agree that Trade Secrets do not trump Due Process, but as chown33 rightly points out---and as is suggested in the second set of links I quoted---the training data is a critical component of the process.

To lay out my two concerns more concretely: first, due process requires complete transparency at every level. This means the ability for a defendant to examine and confront not only the source code of the algorithm that has an effect on his/her proceedings but also the underlying process and data by which it was trained.

Second, there are many people who promote the idea that the analysis of Big Data with the proper procedure is some kind of unbiased truth machine that will not only solve all the world's problems but is actually a fundamental representation of objective truth. I'd say that the people working in these fields who aren't blinkered by arrogance would readily admit that this is ridiculous, but there are some who believe it, promote it, and have convinced lay people of it---maybe even courts. And that is dangerous.
 

Gutwrench

Contributor
Jan 2, 2011
3,914
9,033
If we're going to use algorithms for sentencing at all, then they should be open source. We have no idea if this private company's trade secret is heavily biased, and yet the whole selling point is that the algorithm is "more objective."
The thread title is misleading. That isn't what happened. Please read the article.

Mr. Loomis says his right to due process was violated by a judge’s consideration of a report generated by the software’s secret algorithm, one Mr. Loomis was unable to inspect or challenge.
 

Gutwrench

Contributor
Jan 2, 2011
3,914
9,033
Are you really this pedantic?
It has nothing to do with being pedantic, because it's inaccurate to imply software is sentencing people. What happened was that a judge improperly considered a report generated by software in determining a person's sentence. That was inappropriate. What actually happened was bad enough. Exaggeration, sensationalism, and just plain inaccuracies are at epic proportions.
 

Raid

macrumors 68020
Feb 18, 2003
2,144
3,926
Toronto
I realize you posted with a smiley, but source code alone won't suffice.
Jeez, if I were going to fix it for everyone, sure, but for me I just need a simple if statement just prior to releasing the AI's decision (like: If $Defendant = "Raid" Then $Sentence = "Two years less a day; time served. No probation required." Else $Sentence = $Big_Data_Result End If) :p

To lay out my two concerns more concretely: first, due process requires complete transparency at every level. This means the ability for a defendant to examine and confront not only the source code of the algorithm that has an effect on his/her proceedings but also the underlying process and data by which it was trained.

Second, there are many people who promote the idea that the analysis of Big Data with the proper procedure is some kind of unbiased truth machine that will not only solve all the world's problems but is actually a fundamental representation of objective truth. I'd say that the people working in these fields who aren't blinkered by arrogance would readily admit that this is ridiculous, but there are some who believe it, promote it, and have convinced lay people of it---maybe even courts. And that is dangerous.
I agree with this completely. While AI can be useful, quick (and even transparent, if so desired) in sourcing relevant facts and mapping them to sentencing ranges, there could be bias in the results due to poor or skewed source data.

For every estimation equation there's this term stuck way at the far end called 'e'. It's tiny and just sits there, but it means error, and its rather innocuous presence can mean a whole lot of trouble for the variability in predictions... anyone who tells you different is probably the one marketing such systems.
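For anyone who wants to see that 'e' do its work, here's a quick sketch with made-up numbers: fit a line, then compare the single point prediction with the band implied by the residual error. The point estimate looks authoritative; the band is what the marketing leaves out.

Code:
import numpy as np

rng = np.random.default_rng(0)

# Made-up data: a "risk factor" x and an outcome y with genuine noise in it (the 'e').
x = np.linspace(0, 10, 50)
y = 2.0 * x + 5.0 + rng.normal(scale=4.0, size=x.size)

# Least-squares fit of y = b1*x + b0.
b1, b0 = np.polyfit(x, y, 1)

# Residual standard error: the part of y the fitted line could not explain.
residuals = y - (b1 * x + b0)
s = np.sqrt(np.sum(residuals**2) / (x.size - 2))

x_new = 7.0
point = b1 * x_new + b0
print(f"point prediction: {point:.1f}")
print(f"rough 95% band:   {point - 2 * s:.1f} to {point + 2 * s:.1f}")

The band here is just plus or minus two residual standard errors, a rough-and-ready interval rather than a full prediction interval, but it makes the point: the spread can swamp the differences the single number seems to distinguish.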
 
  • Like
Reactions: mobilehaathi