Artificial Intelligence (AI) has become part of our daily lives. People use AI to decide what to cook for dinner, to write email replies at work, and even to do their homework. How is AI being integrated into the criminal justice system?
AI has been used in policing for many years. SoundThinking (formerly ShotSpotter) offers gunshot detection software that locates gunfire and alerts local police departments for response. Police departments also use facial recognition software to find suspects in surveillance video around cities, and AI can analyze historical crime data to help police allocate manpower to specific areas and shifts. Departments have long overlaid past crime data on maps to determine which areas to send more officers to, but those maps are reactive to past data. AI can extract patterns from collected data and predict future events, making the approach more proactive. For example, a 2019 Vancouver study used the past 15 years of crime data to create a heatmap of the areas most likely to experience crime, using a boosted decision tree with K-nearest neighbors.[1]
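To make the heatmap idea concrete, here is a minimal toy sketch (not the Vancouver study's actual model, and with invented incident coordinates): each cell of a city grid is scored by its distance to the k nearest historical incidents, so cells surrounded by past crime score highest.

```python
import math

# Hypothetical historical incident locations (x, y) on a city grid.
incidents = [(1, 1), (1, 2), (2, 1), (2, 2), (8, 8), (8, 9)]

def risk_score(cell, incidents, k=3):
    """Score a grid cell by the mean distance to its k nearest past
    incidents: smaller distances mean a higher predicted risk."""
    dists = sorted(math.dist(cell, pt) for pt in incidents)
    mean_k = sum(dists[:k]) / k
    return 1.0 / (1.0 + mean_k)  # map to (0, 1]; higher = riskier

# Build a coarse "heatmap" over a 10x10 grid.
heatmap = {(x, y): risk_score((x, y), incidents)
           for x in range(10) for y in range(10)}

hottest = max(heatmap, key=heatmap.get)
```

A production system would replace this distance rule with a trained model (the study combined boosted decision trees with K-nearest neighbors), but the output is the same kind of artifact: a per-cell risk surface used to steer patrols.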
AI can also be used in risk assessment instruments (RAIs) to help estimate the likelihood that an individual will reoffend. These tools can assist judges in making informed decisions about bail, sentencing, and parole. The Public Safety Assessment (PSA), a risk assessment tool used in the criminal justice system, uses a defendant’s age, criminal history, and other factors to produce three scores: the risk that the individual will be convicted of any new crime, the risk that the individual will be convicted of a new violent crime, and the risk that the individual will fail to appear in court.[2] A 2020 study showed that if judges had used a simple RAI based only on the defendant’s age and number of prior failures to appear, 30% fewer defendants would have been detained without a rise in pretrial misconduct (failure to appear, etc.).[3] Both pretrial misconduct and incarceration have the potential to decrease, a win for everyone.
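A two-factor rule like the one the 2020 study evaluated amounts to a short points checklist. The sketch below illustrates the idea; the point values and release cutoff here are invented for illustration, not the weights the study actually derived.

```python
def simple_rai(age, prior_ftas):
    """Toy two-factor pretrial risk score: younger defendants and those
    with more prior failures to appear (FTAs) score higher. The point
    values below are hypothetical."""
    score = 0
    if age < 23:
        score += 2
    elif age < 30:
        score += 1
    score += min(prior_ftas, 4)  # cap so one factor cannot dominate
    return score

def recommend_release(age, prior_ftas, cutoff=3):
    """Recommend release when the score falls at or below the cutoff."""
    return simple_rai(age, prior_ftas) <= cutoff
```

The appeal of such rules is that a judge can audit them at a glance, unlike an opaque statistical model, while still capturing most of the predictive signal.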
However, AI cannot take into account extenuating circumstances or the human element of a crime. Relying solely on AI tools to determine bail, a sentence, or parole risks exacerbating the biases that already exist in the criminal justice system. Loomis v. Wisconsin, a 2016 Wisconsin Supreme Court case, dealt specifically with these issues: Eric Loomis argued, among other claims, that his sentence, which was informed by an RAI called Correctional Offender Management Profiling for Alternative Sanctions (COMPAS), was not individualized. The court’s full opinion is publicly available.
The integration of AI into the criminal justice system presents both opportunities and challenges. Police departments and courts will continue to implement more RAIs over time. AI can increase objectivity and efficiency, but there are still concerns about bias, transparency, and accountability.
[1] Dakalbab, F., Talib, M., Waraga, O., Nassif, A., Abbas, S., & Nasir, Q. (2022). Artificial intelligence & crime prediction: A systematic literature review. Social Sciences & Humanities Open, 6(1).
[2] DeMichele, M., Baumgartner, P., Wenger, M., Barrick, K., Comfort, M., & Misra, S. (2018). The Public Safety Assessment: A re-validation and assessment of predictive utility and differential prediction by race and gender in Kentucky.
[3] Jung, J., Concannon, C., Shroff, R., Goel, S., & Goldstein, D. G. (2020). Simple rules to guide expert classifications. Journal of the Royal Statistical Society: Series A (Statistics in Society).