Algorithmic emotion - algorithms in the criminal justice system
The following is a journal article that I wrote for UGBA 107: The Social, Political, and Ethical Environment of Business. The opinions below were written in an academic context and may or may not reflect my actual opinions on the subject matter.
The incredible rise in computing power, coupled with the explosion of personal computing, has resulted in a society in which corporations and governments alike compete to buy, barter, and collect data – data that’s then run through models to make predictions about future actions and events. Although supporters of a data-driven criminal justice system assert that using algorithms reduces human bias, an analysis of the probabilistic structures behind these models, combined with a comparison to other applications of the technology, suggests that removing emotion from the decision-making process of these often emotionally driven cases leads to fallacious conclusions.
For industries that collect an expansive quantity of objective data – in particular, the healthcare industry – machine learning models can be highly effective at deducing common patterns in patients (source). Similarly, in the space of consumer advertising, purely objective data collected from a user’s clicks and purchase history can be incredibly powerful in identifying the user’s wants and needs (source). These models, however, are only as good as the quantity and quality of the factors they’re trained on: for example, it wouldn’t make sense to predict one’s likelihood of contracting the common cold based solely on their age.
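To make that point concrete, here is a minimal sketch – using scikit-learn and entirely synthetic data, not any real health records – in which a classifier is asked to "predict" the common cold from age alone. Because the feature carries no signal by construction, the model cannot do better than guessing, no matter how correctly it is trained.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical synthetic data: whether someone catches a cold is simulated
# as a coin flip, independent of age by construction.
n = 5000
age = rng.integers(5, 90, size=n).reshape(-1, 1)
caught_cold = rng.integers(0, 2, size=n)

X_train, X_test, y_train, y_test = train_test_split(
    age, caught_cold, test_size=0.2, random_state=0
)

model = LogisticRegression().fit(X_train, y_train)

# With an uninformative feature, accuracy hovers around the 50% base rate:
# the model is "working," but the input simply cannot support the prediction.
print(f"Accuracy using age alone: {model.score(X_test, y_test):.2f}")
```

The failure here isn’t in the algorithm; it’s in asking a question the data can’t answer – which is exactly the concern with feeding crime data into the same machinery.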
Crime is, arguably, a highly emotionally driven act – and thus, in a similar regard, assessing criminal potential without encapsulating the raw human emotion behind the crime makes no more sense than predicting the common cold based solely on one’s age. While data scientists may make a compelling case for creating a “complete picture” through the incorporation of a wide variety of metrics – like the Bristol algorithm that identifies children at high risk of being recruited into crime using educational records and housing information – fundamentally, the human emotions that drive these actions remain indecipherable to algorithms.
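As a purely illustrative sketch – not a description of how the Bristol system or any real risk tool actually works – the snippet below trains a toy risk model on a handful of made-up administrative proxies. Whatever score it produces is, by construction, a weighted combination of those proxies and nothing else.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Hypothetical administrative proxies for each child; these stand in for the
# kinds of records an agency might hold and are NOT the actual inputs or
# methodology of the Bristol system.
n = 2000
X = np.column_stack([
    rng.uniform(0.5, 1.0, n),   # school attendance rate
    rng.integers(0, 5, n),      # number of housing moves
    rng.integers(0, 3, n),      # prior referrals to social services
])
y = rng.integers(0, 2, n)       # placeholder outcome labels for illustration

model = LogisticRegression().fit(X, y)

# The learned "risk" is a function of the three columns above and nothing else.
# There is no feature for grief, fear, coercion, or any other lived context,
# so those factors cannot influence the score even in principle.
child = np.array([[0.82, 3, 1]])
print(f"Predicted risk score: {model.predict_proba(child)[0, 1]:.2f}")
```

The point of the sketch is structural: whatever the model outputs, it can only ever reflect the proxies it was handed, not the emotional reality those proxies stand in for.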