How the Rise in Artificial Intelligence has Exposed Computing Technology to a New Definition of Performance
Advancements in computing technology have introduced neural networks and similar algorithmic processes that we barely understand, yet which have significant impacts on our everyday lives. COMPAS, one such algorithm used in sentencing decisions in Wisconsin, recently came under scrutiny after ProPublica revealed that its results are racially biased against blacks. Examining this algorithm as performance, through the lens of performance studies, reveals its nature as a double-edged sword in the fight for social equality against structural racism.
The COMPAS Algorithm
Computing technology has grown massively in the past several decades, leading to a rapid rise in the prevalence of algorithms in our everyday lives. From superstore chains determining what coupons to mail you from your shopping habits to credit companies calculating your credit score from your financial history to the infamous “YouTube algorithm,” it is quickly becoming harder and harder to find areas of life without some sort of learning algorithm embedded within.
The title of this article, “Performance Computing,” contends that a computer’s performance today is not only a measure of its gigahertz or petaFLOPS: it is truly a “performance” in the sense of performing America itself. With omnipresent algorithms becoming more and more of a reality, it is crucial that we study exactly how they are shaping the ongoing performance of racism in America, and that we ensure they become a force for breaking down structural racism rather than yet another of its strongholds. Such examination reveals, quite worryingly, that computational algorithms act, ultimately, as amplification engines: they reflect the current state of America’s performance and push it toward one extreme or the other. However, while the current state of society positions them as a barrier to social equality, they may yet become its ally given enough time and effort.
The Correctional Offender Management Profiling for Alternative Sanctions algorithm, COMPAS for short, is used extensively in Wisconsin to assign “risk scores” to defendants before trial, informing decisions such as whether to release a defendant on bail. However, an article published by the investigative journalism outlet ProPublica on May 23, 2016, revealed that the COMPAS algorithm is itself racially biased: it is more likely to assign a black defendant a higher risk score than a white defendant with the same background and case.
Interestingly enough, Northpointe, the developer of COMPAS, specifically excluded race as a factor in the algorithm, though it does take into account other factors such as income and past offenses (Dieterich et al.). This is indicative of just how much of a grasp structural racism has on modern-day society, and it is the starting point of the analysis of this particular way in which America performs.
It is no secret that structural racism is still an issue in our current times. Blacks and other minorities still face marginalization as a result of the systems in place in America (Bonilla-Silva), and COMPAS is the latest manifestation of this, with a potential scope greater than anything in the past. The implications and precedents set by COMPAS will lay the groundwork for the next era of social justice and equality in an age of ubiquitous computing, and the conclusions we can draw from COMPAS today give us a window into the state of structural racism and the effect of algorithms on America as a whole.
The first question to answer is why exactly the COMPAS algorithm appears to discriminate against blacks. As mentioned earlier, race is not directly input into the algorithm when it computes a score, yet blacks still receive a higher mean score, as the graphs below show (Dieterich).
[Figure: distributions of COMPAS risk scores for white defendants and for black defendants (Dieterich)]
Furthermore, statistical analysis of COMPAS’s results reveals that the risk scores it assigns are equally predictive of reoffending across races (Dieterich 20). Combined with the higher mean scores above, this indicates that blacks are, overall, being rearrested more frequently than whites. This is confirmed by the chart below: while the higher risk categories contain higher proportions of blacks, the percentage of reoffenders among all defendants is roughly consistent between races within each risk category.
An article published by Kleinberg et al. shortly after the revelation of COMPAS’s bias demonstrated that when two population groups have different overall recidivism rates, it is mathematically impossible to guarantee both that scores are equally predictive of recidivism in each group (calibration) and that the groups experience equal error rates, such as the rate at which people who never reoffend are labeled high risk.
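Kleinberg et al.’s trade-off can be illustrated with a small numeric sketch. The numbers below are invented purely for illustration, not drawn from COMPAS data: a score that is calibrated in both groups, meaning each score bin’s value equals its actual reoffense rate in each group, still produces very different false positive rates when the groups’ base rates differ.

```python
# Hypothetical illustration of the Kleinberg et al. trade-off: calibrated
# scores in both groups, yet unequal false positive rates, because the
# groups have different base rates. All numbers are invented.

def rates(low_n, high_n, low_score=0.2, high_score=0.6):
    """Return (base_rate, false_positive_rate) for a group split between a
    'low' and a 'high' score bin, where each bin's score equals its actual
    reoffense rate (i.e. the score is calibrated within the group)."""
    reoffenders = low_n * low_score + high_n * high_score
    base_rate = reoffenders / (low_n + high_n)
    # Treat the high bin as a "high risk" label; a false positive is a
    # non-reoffender who received that label.
    non_reoffenders = (low_n + high_n) - reoffenders
    false_positives = high_n * (1 - high_score)
    return base_rate, false_positives / non_reoffenders

# Group W sits mostly in the low bin; group B mostly in the high bin.
w_base, w_fpr = rates(low_n=80, high_n=20)
b_base, b_fpr = rates(low_n=40, high_n=60)

print(f"W: base rate {w_base:.2f}, false positive rate {w_fpr:.2f}")  # 0.28, 0.11
print(f"B: base rate {b_base:.2f}, false positive rate {b_fpr:.2f}")  # 0.44, 0.43
```

Both groups see perfectly calibrated scores, yet non-reoffenders in group B are labeled high risk nearly four times as often, which is precisely the pattern ProPublica reported.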
We can draw two conclusions from this. One is that, through the scores generated by COMPAS, we can take a sobering look at exactly what the state of racism in America is. While COMPAS is ultimately a tool meant to aid judges in deciding whether to permit bail for defendants, it is also a window through which we can quantify the current status of America’s racism. Unbiased scores would indicate the equal treatment of blacks and whites in America, but this is obviously far from the case.
The other conclusion directly follows: structural racism is still a major issue in our time, and COMPAS provides undeniable, quantifiable evidence of this. If the fact that blacks have an overall higher recidivism rate is reflected in the results of the COMPAS algorithm, and if we dispel the myth of inherent, biological differences between races, then we must conclude that blacks have higher recidivism rates because of the structures of America itself: structural racism.
In the specific case of recidivism, while the scores may be indicative of whether someone is likely to be rearrested, the racism lies within the statistic itself. The current status quo is that neighborhoods with a high black population tend to be more heavily policed, and areas with heavier policing are bound to have higher recidivism rates simply because there are more police present to make arrests in the first place (Briggs).
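A toy calculation makes this mechanism concrete. All numbers here are hypothetical, invented purely for illustration: if a risk model learns from arrest records rather than from offending itself, identical behavior in a lightly policed and a heavily policed neighborhood yields very different learned “risk.”

```python
# Hypothetical sketch: a naive model "trained" on arrest records learns the
# observed rearrest frequency, which is the offending rate times the chance
# that an offense leads to an arrest. All numbers are invented.

def learned_risk(offend_rate, arrest_prob):
    """Risk score a naive model would learn from arrest data alone."""
    return offend_rate * arrest_prob

OFFEND_RATE = 0.30  # identical underlying behavior in both neighborhoods

lightly_policed = learned_risk(OFFEND_RATE, arrest_prob=0.3)
heavily_policed = learned_risk(OFFEND_RATE, arrest_prob=0.8)

print(f"lightly policed: {lightly_policed:.2f}")  # 0.09
print(f"heavily policed: {heavily_policed:.2f}")  # 0.24
```

Heavier policing alone nearly triples the learned score, even though the underlying offending rate never changes; the biased statistic, not the behavior, is what the model absorbs.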
This is what is reflected in the bias of COMPAS scores. Historic discrimination against blacks, tracing all the way back to the era of slavery and through the redlining of the homeownership era, confined black residents to disinvested neighborhoods, depressing land values and increasing crime; that in turn intensified policing, driving up recidivism rates and thus the COMPAS scores we see today. We can see the performance of centuries of American racism in today’s COMPAS scores.
However, I argue that COMPAS is more than just a passive indicator of structural racism: it is an actor, not merely an observer, in the performance of structural racism in modern-day America. As it currently stands, COMPAS works to reinforce the organs of structural racism. In short, COMPAS demonstrates structural racism in its results, and those results are then used to further discriminate against blacks and other minority groups. This creates a self-reinforcing feedback loop that tightens the already-tight grip of structural racism on such communities; the key difference with COMPAS is that it is an algorithm intended for wide-scale use in the public sector.
Unlike real-estate agents redlining within their own region or local law enforcement deploying police with bias in a given jurisdiction, COMPAS breaks ground by introducing algorithmic tools at every level from state circuit courts to the US Supreme Court. State v. Loomis, a 2016 case decided by the Wisconsin Supreme Court, upheld the use of COMPAS in sentencing decisions, locking its legality into case law. It now has its place in our justice system, and for now we can see how it acts to cement structural racism in our society by reflecting social structures and amplifying them.
There is also a place for COMPAS and algorithms like it in the fight for social justice, however. Because COMPAS is ultimately a mathematical function with no sentience, it reflects the realities it was “trained” in, to borrow the term from machine learning. It occupies the unique position of being both a barometer of the social status quo and a ballast that locks it in. Fortunately, social justice in America, the fight to tear down structural racism, continues to march forward, and COMPAS gives us a way to quantify our progress, assigning real, numerical values to how far we are from achieving social equality. And as we approach that ideal, COMPAS has the potential to work in our favor just as it works against us now: in a new environment of social equality, COMPAS will reflect those sentiments and act as an anchor against a return to structural racism.
Of course, much work lies ahead of us. The ballast effect of COMPAS and algorithms like it works both ways, and we are fighting against it right now. In time, however, the hope is that as algorithmic development marches on, these tools will become a force that helps bring about social justice, turning from a stronghold of structural racism into a stronghold of structural equality. The computers of today and tomorrow can and will continue to perform in ways they never have before, shaping, reflecting, and amplifying America through mathematics and computer science.
Bonilla-Silva, Eduardo. “Rethinking Racism: Toward a Structural Interpretation.” American Sociological Review, vol. 62, no. 3, 1997, pp. 465–480. JSTOR, doi:10.2307/2657316. Accessed 20 Oct. 2014.
Briggs, Steven J., and Kelsey A. Keimig. “The Impact of Police Deployment on Racial Disparities in Discretionary Searches.” Race and Justice, vol. 7, no. 3, 2017, pp. 256–275, doi:10.1177/2153368716646163.
Dieterich, William, et al. COMPAS Risk Scales: Demonstrating Accuracy Equity and Predictive Parity. Northpointe Inc. Research Department, 2016.
Kleinberg, Jon, et al. “Inherent Trade-Offs in the Fair Determination of Risk Scores.” arXiv, Cornell University, 17 Nov. 2016, arxiv.org/abs/1609.05807. Accessed 11 Nov. 2019.
“State v. Loomis.” Harvard Law Review, 10 Mar. 2017, harvardlawreview.org/2017/03/state-v-loomis/. Accessed 12 Nov. 2019.