
08/27/2020

COVID-19 Proves It’s Time to Abolish ‘Predictive’ Policing Algorithms

Why too many groups still rely on punitive software

As summer comes to a close, local governments are returning to their council chambers and facing massive pressures. Municipal budgets are losing hundreds of millions of dollars in revenue in the wake of the coronavirus (COVID-19) pandemic. Meanwhile, a generational uprising is pushing our government to divest from militarized, racist policing, calling instead for the resources that our neighborhoods have been starved of for generations—the resources that actually increase safety.

The broad-based support for these calls sends hopeful signals about where our cities and country are headed. But if we want to get there, we must take care not to repeat the mistakes of the past.

During the last great economic crisis this country faced, in 2008, local policymakers sought to save money while making their communities “safer” with new tech-based solutions. In the years since, police departments, probation officers, and courts have embedded this technology—like crime-predicting algorithms, facial recognition, and pretrial and sentencing software—deep inside America’s criminal legal system, even as budgets have risen and police forces have grown. But instead of actually predicting and reducing crime and violence, these algorithms promote systems of over-policing and mass incarceration, perpetuating racism and increasing tensions between police and communities.

Read the complete article at WIRED.