Just Algorithms

Statistically-derived algorithms, adopted by many jurisdictions in an effort to identify the risk of reoffending posed by criminal defendants, have been lambasted as racist, de-humanizing, and antithetical to the foundational tenets of criminal justice. Just Algorithms argues that these attacks are misguided and that, properly regulated, risk assessment tools can be a crucial means of safely and humanely dismantling our massive jail and prison complex. The book explains how risk algorithms work, the types of legal questions they should answer, and the criteria for judging whether they do so in a way that minimizes bias and respects human dignity. It also shows how risk assessment instruments can provide leverage for curtailing draconian prison sentences and the plea-bargaining system that produces them. The ultimate goal of Christopher Slobogin's insightful analysis is to develop the principles that should govern, in both the pretrial and sentencing settings, the criminal justice system's consideration of risk.

Christopher Slobogin holds the Milton Underwood Chair at Vanderbilt University Law School. He has authored or co-authored eight books and over 150 articles on criminal justice issues. He is one of the most heavily cited law professors in the criminal justice field and is the only law professor to have received Distinguished Scholar awards from both the American Psychology-Law Society and the American Board of Forensic Psychology.

Just Algorithms
Using Science to Reduce Incarceration and Inform a Jurisprudence of Risk

CHRISTOPHER SLOBOGIN
Vanderbilt University Law School

University Printing House, Cambridge CB2 8BS, United Kingdom
One Liberty Plaza, 20th Floor, New York, NY 10006, USA
477 Williamstown Road, Port Melbourne, VIC 3207, Australia
314–321, 3rd Floor, Plot 3, Splendor Forum, Jasola District Centre, New Delhi – 110025, India
79 Anson Road, #06–04/06, Singapore 079906

Cambridge University Press is part of the University of Cambridge. It furthers the University's mission by disseminating knowledge in the pursuit of education, learning, and research at the highest international levels of excellence.

www.cambridge.org
Information on this title: www.cambridge.org/9781108833974
doi: 10.1017/9781108988025

© Christopher Slobogin 2021

This publication is in copyright. Subject to statutory exception and to the provisions of relevant collective licensing agreements, no reproduction of any part may take place without the written permission of Cambridge University Press.

First published 2021

A catalogue record for this publication is available from the British Library.

ISBN 978-1-108-83397-4 Hardback
ISBN 978-1-108-98434-8 Paperback

Cambridge University Press has no responsibility for the persistence or accuracy of URLs for external or third-party internet websites referred to in this publication and does not guarantee that any content on such websites is, or will remain, accurate or appropriate.
Contents

Preface: The Point of This Book
1 Rationale: What Risk Algorithms Can Do for the Criminal Justice System
  1.1 The Plague of Mass Incarceration
  1.2 Why? Crime, Culture, and Codes
  1.3 The Potential Role of Risk Algorithms
2 Fit: Why and When Data about Groups Are Relevant to Individuals
  2.1 Risk Assessment Instruments
  2.2 G2i—Using Information about Groups to Resolve Individual Cases
  2.3 The Probability Criterion
  2.4 The Outcome Criterion
  2.5 The Duration Criterion
  2.6 The Intervention Criterion
  2.7 Conclusion: Implementing the Fit Criteria
    2.7.1 Prison Release/Diversion
    2.7.2 Pretrial Detention Based on Risk of Offending
    2.7.3 Pretrial Detention Based on Flight Risk
    2.7.4 Prison Population Reduction
3 Validity: Figuring Out When Risk Algorithms Are Sufficiently Accurate
  3.1 Incremental Validity and Helpfulness
  3.2 Calibration Validity
  3.3 Discriminant Validity
  3.4 External Validity (Local and Group Validation)
  3.5 Implementation Validity: Inter-Rater Reliability and "Adjustments" of Scores
  3.6 Current Validity/Re-Validation
  3.7 The Meaning of Predictive Validity
  3.8 Conclusion: Assuring Accuracy
4 Fairness: Avoiding Unjust Algorithms
  4.1 Egalitarian Injustice
    4.1.1 Race
    4.1.2 Sex
    4.1.3 Other Traits
  4.2 Retributive Injustice
  4.3 Procedural Injustice
    4.3.1 Process
    4.3.2 Transparency
    4.3.3 Voice
  4.4 Conclusion: One Court's Struggle with Assuring Fairness
5 Structure: Limiting Retributivism and Individual Prevention
  5.1 The Problems with Desert
    5.1.1 The Accuracy of Desert
    5.1.2 Desert and Disparity
    5.1.3 Desert and Blameworthiness
  5.2 Preventive Justice and Limiting Retributivism
    5.2.1 The Determination of Sentence Ranges
    5.2.2 Governing Principles of Preventive Justice
  5.3 Parole as a Constitutional Right
  5.4 Preventive Justice and Plea Bargaining
  5.5 Conclusion: Preventive Justice in Practice
6 Moving Forward: The Need for Experimentation
Index

Preface: The Point of This Book

In today's world, algorithms are everywhere. Algorithms—a shorthand way of describing statistical and computer-driven decision-making processes—are crucial to our commercial and social life. Insurance companies use them to set premiums, credit card companies rely on them to detect fraud, and investors depend on them to figure out market trends. Algorithms permeate the Internet, smartphones, marketing, and the forecasting of virtually everything, from weather to sports outcomes to pandemics.

Not surprisingly, government agencies rely on algorithms as well, to conduct tax audits, welfare checks, police investigations, and DNA and other forensic analysis.1 Of most significance to this book, for at least a decade virtually every state has authorized the use of algorithms that purport to determine the recidivism risk posed by people who have been charged or convicted of crime.2 Commonly called risk assessment instruments, or RAIs, these algorithms help judges figure out whether arrested individuals should be released pending trial and whether convicted offenders should receive prison time or an enhanced sentence; they assist parole boards in determining whether to release a prisoner; and they aid correctional officials in deciding how offenders should be handled in prison. Most of these algorithms consist of from five to twenty risk factors associated with criminal history, age, and diagnosis, although an increasing number incorporate other demographic traits and psychological factors as well. Each of these risk factors correlates with a certain number of points that are usually added to compute a person's risk score; the higher the score, the higher the risk. Some tools may also aim at identifying needs, such as substance abuse treatment and vocational training, thought to be relevant to rehabilitative interventions that might reduce recidivism. This book will provide examples of a number of these instruments so that the reader can get a sense of their diversity and nuances.

1 See David Freedman Engstrom et al., Government by Algorithm: Artificial Intelligence in Federal Administrative Agencies, at https://ssrn.com/abstract=3551505; Susan Miller, How Law Enforcement Uses Forensic Algorithms, GCN, May 12, 2020, at https://gcn.com/articles/2020/05/12/gao-forensic-algorithms.aspx.

2 Memorandum from the Vera Inst. of Justice, Ctr. on Sentencing & Corr. to Del. Justice Reinvestment Task Force 4 (Oct. 12, 2011), at http://www.ma4jr.org/wp-content/uploads/2014/10/vera-institute-memo-on-risk-assessment-for-delaware-2011.pdf ("almost every state uses an assessment tool at one or more points in the criminal justice system to assist in the better management of offenders in institutions and in the community. Overall, over 60 community supervision agencies in 41 states reported using an actuarial assessment tool, suggesting that an overwhelming majority of corrections agencies nationwide routinely utilize assessment tools to some degree.").
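[Editor's illustration] The additive point-scoring described in the paragraph above can be made concrete with a short sketch. The following Python fragment is purely illustrative: the factor names, point values, and category cutoffs are hypothetical inventions for this example, not taken from any actual instrument discussed in the book. It shows only the general mechanic the preface describes, namely summing points for applicable risk factors and binning the total into a risk level.

```python
# Illustrative sketch of the additive point-scoring used by many risk
# assessment instruments (RAIs). All factor names, point values, and
# cutoffs below are hypothetical and chosen only for demonstration.

HYPOTHETICAL_FACTORS = {
    "three_or_more_prior_convictions": 3,
    "prior_failure_to_appear": 2,
    "age_under_25": 2,
    "substance_abuse_diagnosis": 1,
    "unemployed_at_arrest": 1,
}

def risk_score(answers: dict) -> int:
    """Add up the points for every risk factor marked True for this person."""
    return sum(
        points
        for factor, points in HYPOTHETICAL_FACTORS.items()
        if answers.get(factor, False)
    )

def risk_category(score: int) -> str:
    """Map the total score to a coarse risk bin (the cutoffs are arbitrary)."""
    if score <= 2:
        return "low"
    if score <= 5:
        return "moderate"
    return "high"

if __name__ == "__main__":
    person = {"three_or_more_prior_convictions": True, "age_under_25": True}
    total = risk_score(person)
    print(total, risk_category(total))  # prints: 5 moderate
```

Actual instruments differ mainly in which factors they include, how the weights are derived statistically, and where the category cutoffs fall; those design choices are the kind of thing the book's fit, validity, and fairness criteria are meant to evaluate.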
One purpose of this book is to explain how risk algorithms might improve the criminal justice system. If developed and used properly, RAIs could become a major tool of reform. Most importantly, they can help reduce the use of pretrial detention and prison, as well as the length of prison sentences, without appreciably increasing the peril to the public (goals that are particularly pressing as COVID-19 ravages our penal facilities). In doing so, they can mitigate the excessively punitive bail and sentencing regimes that currently exist in most states. They can also allocate correctional resources more efficiently and consistently. And they can provide the springboard for evidence-based rehabilitative programs aimed at reducing recidivism. More broadly, by making criminal justice decision-making more transparent, these tools could force long overdue reexamination of the purposes of the criminal justice system and of the outcomes it should be trying to achieve.

Despite their potential advantages, the risk algorithms used in the criminal justice system today are highly controversial. A common claim is that they are not good at what they purport to do, which is to identify who will offend and who will not, who will be responsive to rehabilitative efforts and who will not be. But the tools are also maligned as racially biased, dehumanizing, and, for good measure, antithetical to the foundational principles of criminal justice. A sampling of recent article and book titles makes the point: "Impoverished Algorithms: Misguided Governments, Flawed Technologies, and Social Control,"3 "Risk as a Proxy for Race: The Dangers of Risk Assessment,"4 and "Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor."5 In 2019, over 110 civil rights groups signed a statement calling for an end to pretrial risk assessment instruments.6
That same year twenty-seven Ivy League and MIT academics stated that "technical problems" with risk assessment instruments "cannot be resolved."7 And in 2020 another group of 2435 scholars from a wide range of disciplines "demanded" that Springer publishing company, one of the largest purveyors of healthcare and behavioral science books and journals, "issue a statement condemning the use of criminal justice statistics to predict criminality" because of their unscientific nature.8

A second purpose of this book is to explore these claims. All of them have some basis in fact. But they can easily be overblown. And if the impact of these criticisms is to prevent the criminal justice system from using algorithms, a potentially valuable means of reform will be lost. A key argument in favor of algorithms is comparative in nature. While algorithms can be associated with a number of problems, alternative predictive techniques may well be much worse in each of these respects. Unstructured decision-making by judges, parole officers, and mental health professionals is notoriously bad, biased, and reflexive, and often relies on stereotypes and generalizations that ignore the goals of the system. Algorithms can do better, at least if subject to certain constraints.

Surfacing those constraints is a third purpose of this book. Although algorithms, on average, are superior to unstructured judgment when it comes to prediction, many are seriously defective in a number of respects. This book provides a set of principles meant to govern the risk assessment enterprise. Influenced by insights gleaned from the algorithms themselves, it advances, in short, a much-needed "jurisprudence of risk" analogous to the jurisprudence of criminal liability that has long governed the definition of crimes and the scope of punishment.

Without a jurisprudence of risk, judges and other legal decision-makers have very little guidance concerning which RAIs, if any, are worthy of consideration, how to evaluate their results, and how much weight to give those results. Social scientists and other researchers who develop these instruments do not have a clear idea of the outcome measures that the law considers relevant, the types of risk factors they may or may not consider, or the

3 Sarah Valentine, Impoverished Algorithms: Misguided Governments, Flawed Technologies, and Social Control, 44 Fordham Urb. L. 364 (2019).

4 Bernard Harcourt, Risk as a Proxy for Race: The Dangers of Risk Assessment, 27 Fed. Sent. Rep. 237 (2015).

5 Virginia Eubanks, Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor (2018).

6 The Use of Pretrial "Risk Assessment" Instruments: A Shared Statement of Civil Rights Concerns (Jan. 2019), at https://www.supremecourt.ohio.gov/JCS/casemng/PJRSummit/materials/pretrialRiskAssessInstruments.pdf.

7 Chelsea Barabas, Karthik Dinakar & Colin Doyle, The Problems with Risk Assessment Tools, N.Y. Times (July 17, 2019), at https://dam-prod.media.mit.edu/x/2019/07/16/TechnicalFlawsOfPretrial_ML%20site.pdf.

8 Coalition for Critical Technology, Abolish the #TechtoPrisonPipeline (Dec. 30, 2020), at https://medium.com/@CoalitionForCriticalTechnology/abolish-the-techtoprisonpipeline-9b5b14366b16.