On September 1, the University of Toronto's Citizen Lab published a report on predictive policing technologies, which use artificial intelligence (AI) to prevent crime before it is ever committed.

There are two main types of algorithmic policing technologies:

1. Location-based: Historical police data predict where crime is more likely to occur within a specific geographic boundary. Essentially, areas with higher recorded crime rates are targeted by the predictive technology because the AI assumes crime is most likely to recur where crime reports are most consistent (a toy sketch of this logic follows the list).

2. People-based: These technologies also draw on historical police data, but instead of flagging places, they flag individuals deemed to have a greater likelihood of breaking the law or reoffending.
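
For illustration, here is a minimal, hypothetical sketch of the hotspot logic behind location-based systems. The incident records, grid cells, and function are invented for this example; real products use far more elaborate statistical models, but the underlying assumption, that past reports predict future crime, is the same.

```python
from collections import Counter

# Hypothetical historical incident records: (grid_cell, offence) pairs.
# In a real system these would come from a police records database.
historical_incidents = [
    ("cell_12", "theft"), ("cell_12", "assault"), ("cell_12", "theft"),
    ("cell_07", "theft"), ("cell_07", "mischief"),
    ("cell_03", "assault"),
]

def rank_hotspots(incidents, top_n=3):
    """Rank grid cells by historical incident count.

    This encodes the core assumption of location-based prediction:
    cells with more past reports are forecast to see more crime,
    so they receive more patrol attention.
    """
    counts = Counter(cell for cell, _ in incidents)
    return counts.most_common(top_n)

print(rank_hotspots(historical_incidents))
# [('cell_12', 3), ('cell_07', 2), ('cell_03', 1)]
```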

In theory, AI technology that can predict where crime will occur and who will commit it sounds ideal. One would be inclined to believe that those who have committed a crime in the past are likely to repeat their deviant actions, and that places where crime occurs often are hotspots where police should focus their resources.

Furthermore, potential victims of predicted crimes could be saved and spared from physical and mental trauma. With this ideal system, Canada's judicial system could have a lighter caseload, since the need for arrests would decrease.

Moreover, from the taxpayer's perspective, the costs of the judicial system, including but not limited to housing inmates and paying judges and lawyers, could all be dramatically lessened if predictive technologies were successful.

However, the technology fails to account for policing bias and the sociological factors that contribute to crime. This results in more harm to BIPOC (Black, Indigenous, and People of Colour) communities, which already disproportionately suffer from poverty, homelessness, a lack of social services, and racism. This fundamental flaw destroys the credibility of predictive policing altogether.

Both location-based and people-based technologies are rooted in historical police data. Unfortunately, Canadian police forces have a history of targeting BIPOC communities, explicitly through increased surveillance, carding, and the disproportionate rate at which Indigenous people are represented among offenders relative to their share of the population, among other practices. The sad truth is that police databases are not free of bias or discrimination, meaning those prejudices will bleed into the predictive technology and warp an already broken system.

For example, since Indigenous communities show higher recorded crime rates than non-Indigenous communities, the technology would predict more crime on reserves and in areas where Indigenous populations are most concentrated. As a result, communities already suffering at the hands of the police would face even more surveillance, compounded by the Canadian government's negligence.
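
The mechanism at work here is a feedback loop, and a toy simulation makes it visible. The sketch below is entirely hypothetical, with invented numbers: two neighbourhoods share the same true crime rate, but the one with more historical records draws more patrols, patrols generate more records, and the bias compounds on its own.

```python
import random

random.seed(0)

# Two hypothetical neighbourhoods with the SAME underlying crime rate,
# but neighbourhood A starts with more recorded incidents because it
# was historically over-policed. All numbers here are invented.
true_rate = 0.10                   # identical true rate in both areas
recorded = {"A": 30, "B": 10}      # biased historical records
patrols_per_round = 20

for _ in range(10):
    total = sum(recorded.values())
    # A count-based predictor allocates patrols in proportion to
    # recorded incidents, which is the hotspot logic sketched earlier.
    allocation = {hood: recorded[hood] / total for hood in recorded}
    for hood, share in allocation.items():
        patrols = round(patrols_per_round * share)
        # Police can only record crime they are present to observe, so
        # more patrols in an area means more recorded incidents there.
        recorded[hood] += sum(random.random() < true_rate
                              for _ in range(patrols))

print(recorded)  # A's "lead" keeps growing despite identical true rates
```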

Why should their suffering continue based on the preconceived notion that crime may occur in their local community? The key word here is predictive. The technology does not claim to be 100 per cent accurate or effective. So why should vulnerable communities, such as BIPOC communities, settle for a system with inherent flaws and the risk of detrimental consequences when there is no guarantee of effectiveness?

Moreover, the Canadian Charter of Rights and Freedoms protects our liberty and privacy, and artificial intelligence should not be allowed to interfere with those rights. There is no justification for infringing on any individual's liberty simply because they may commit a crime. Even before a crime is committed, minority communities are already under scrutiny.

Unfortunately, we do not live in a fantasy world like that of the sci-fi action movie Minority Report, starring Tom Cruise, which depicts technology similar to predictive policing. If you have seen that movie, though, you know that in the end, human trial and error works better than any AI.

Our society is simply not ready for predictive technology. It will never be ready until we can rid ourselves entirely of the systemic bias rooted in our policing and justice system.
