DETERMINISM AND PREDICTIVE POLICING


For the uninitiated, determinism is the philosophy that we are not in charge of our own destiny and that our actions are predetermined long before we act on them. On this view, a determinist would hold that a criminal is bound to their fate of conviction from the moment they are born. While this may well be true, we appear to be far from being able to predict criminal action to such a high degree of accuracy. Nevertheless, recent developments in Artificial Intelligence (hereinafter ‘AI’) have allowed certain police departments around the globe to implement predictive policing systems which promise to put an end to crime before it even occurs.

Originally, these systems were designed to predict where violent crime was likely to occur; the location was the main focus of the predictions. Their natural evolution, of course, was to eventually predict ‘who’ rather than ‘where’, applying both to victims and to offenders. One might instantly realise how this could turn into a mismanaged tool that determines individuals’ future incarceration based on their statistical chances of recidivism. This is where determinism enters the equation. These systems use data analysis to predict who is more likely to reoffend based on factors such as their place of residence, age, income and a huge variety of others. But it is one thing to use such technology to identify locations where crime is likely to be committed, and another thing entirely to apply it to individuals. Deciding on the basis of data essentially boils individuals down to numbers, deciding their fate for them. So the question must be asked: is it moral to apply this technique to individuals?

There are many potential flaws in the datasets that are fed into prediction algorithms. The most blatant injustice that such flaws can cause is one that has plagued police departments globally for decades: discriminatory data collection produces datasets that are themselves discriminatory. If a police department has a bias in collecting data, the resulting algorithm will merely confirm the discriminatory approach, thus creating a feedback loop[1]. This would mean that applying predictive algorithms to individuals rather than to locations could serve as an excuse for discriminatory policing indefinitely. It is worth noting that such predictive algorithms do not presently take into account the infrastructure of an area or the socio-economic status of its residents[2]. However, as these algorithms become self-learning, the possibility of such rudimentary data being automatically included in the prediction method will inevitably lead to discrimination against groups that the pattern-recognition algorithm has determined to have a higher chance of offending, creating yet another ‘snowballing’ effect of discriminatory police targeting[3].
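To illustrate how such a feedback loop can arise even where the underlying behaviour of two areas is identical, consider the following deliberately simplified simulation. The districts, crime rates and allocation rule are hypothetical, chosen only to sketch the mechanism described above; they do not model any real predictive policing product.

```python
import random

random.seed(42)

# Two districts with IDENTICAL underlying crime rates.
TRUE_CRIME_RATE = {"district_a": 0.1, "district_b": 0.1}

# The initial patrol allocation is biased towards district_a.
patrols = {"district_a": 80, "district_b": 20}
recorded = {"district_a": 0, "district_b": 0}

for year in range(10):
    # Crimes are only *recorded* where officers are present to observe them,
    # so recorded crime tracks patrol presence, not true crime.
    for district, n_patrols in patrols.items():
        for _ in range(n_patrols):
            if random.random() < TRUE_CRIME_RATE[district]:
                recorded[district] += 1

    # The "predictive" step: next year's patrols follow past recorded crime.
    total = sum(recorded.values())
    patrols = {d: round(100 * recorded[d] / total) for d in recorded}

# district_a accumulates far more recorded crime despite identical true rates.
print(recorded)
```

Because the algorithm is trained on where crime was recorded rather than where it occurred, the initial bias is confirmed and perpetuated each year, which is precisely the ‘snowballing’ effect the literature warns of.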

While this is a distinct possibility, perhaps equally terrifying, and in fact more likely, is the very concept that the police may target individuals not because they are guilty of a crime, but because of the statistical possibility that they may commit one in the future. It is essentially a system dictating that one may be found guilty of a crime without having committed one. The police would then have decided our future for us, excluding free will as a factor.

The dangers of this are multifaceted. If these algorithms become the norm, who is to say that an algorithmic prediction will not constitute sufficient evidence for a judge to grant a search or arrest warrant? If so, the state will have effectively adopted determinism as government policy, jeopardising one of the most fundamental presumptions in criminal evidence law: innocent until proven guilty[4]. This has already been translated into public policy in Chicago, where police go as far as sending letters to potential victims and perpetrators of crime informing them that they are on a ‘heat list’[5].

That is not the only issue at hand, however. Although the 4th Amendment of the US Constitution prevents state services from gathering information on a suspect in the absence of a warrant, the case of United States v Miller does not prevent state services from gathering such information from third parties. The Court has thereby effectively created a workaround to the 4th Amendment: businesses creating the predictive software are not bound by such concerns when they sell the final product to police departments[6]. This should rightly generate additional privacy concerns. One should not mistake this for a purely American issue, however; similar software has been tested in Franconia, Zurich, Munich and many other European cities. Thankfully Cyprus, like most European countries, has a constitutional clause protecting accused individuals from being found guilty until proven as such[7]. This has been affirmed in many court cases as well[8]. There is, however, no guarantee that cultural and technological shifts will not change this standard, and so we must remain ever vigilant against such encroaching technologies.

I believe it is logical to fear such a technological takeover of the justice system. Technologies of this kind may prove to be a threat to civil liberties and privacy rights, in addition to being a gateway to authoritarian practices of criminal prosecution. There is much potential in developing such software, but I for one remain sceptical about the appropriate application of such technologies.


By George Kassapis

LLB Law University of Aberdeen, Penultimate Year

Summer Intern at Papantoniou & Papantoniou LLC


[1] Aaron Cant, ‘Algorithms and Future Crimes: Welcome to the Racial Profiling of the Future’, San Diego Free Press (Mar. 1, 2014) http://sandiegofreepress.org/2014/03/algorithms-and-futurecrimes-welcome-to-the-racial-profiling-of-the-future/

[2] Simon Egbert & Susanne Krasmann, ‘Predictive policing: not yet, but soon preemptive?’ [2020] 30(8) Policing and Society 905–919, DOI: 10.1080/10439463.2019.1611821

[3] ibid.

[4] Woolmington v DPP [1935] UKHL 1

[5] Andrew Guthrie Ferguson, ‘Policing Predictive Policing’ [2017] 94(5) Washington University Law Review

[6] Walter L. Perry, Brian McInnis, Carter C. Price, Susan C. Smith and John S. Hollywood, ‘Using Predictions to Support Investigations of Potential Offenders’ in Predictive Policing: The Role of Crime Forecasting in Law Enforcement Operations (RAND Corporation 2013) 81–114, http://www.jstor.org/stable/10.7249/j.ctt4cgdcz.12 accessed 7 July 2021

[7] Cyprus Constitution, Article 12.  

[8] e.g., Siakolas v Police.