Why predictive policing doesn’t work

Nov 8, 2022 | Criminal Law

In an effort to remove bias from policing, some police departments began using computer models in a practice known as predictive policing. Essentially, the computer system looks at all the available data about crime in the area and predicts where a crime is likely to occur in the future. Officers can then be dispatched to that location, even if nothing has happened yet.

Departments hoped this would deter crime or allow officers to make arrests. However, critics of the system say it is just as biased as a human police officer, if not more so. If a computer cannot be inherently biased against anyone, why doesn't the system work?

Here’s what you should know:

The data comes from the police officers

The problem is that a predictive policing algorithm has to be trained: officers feed it information about previous arrests. That means any bias in those arrests is reflected in how the model reads the data, and the model can sometimes amplify that bias.

For instance, an officer making arrests may be biased against people of a certain ethnic background. If those people live in a specific neighborhood, the officer may stereotype them and spend more time making arrests in that neighborhood than anywhere else.

Those inflated arrest numbers tell the computer that crime in that neighborhood is very high, so it sends more officers there. More officers on patrol produce more arrests, which feed back into the model and strengthen its prediction, even though the computer is simply reflecting the racial bias of the original officer. Meanwhile, it may largely ignore crime in other neighborhoods while concentrating far too much of the police force on one group of people.
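To make that feedback loop concrete, here is a minimal, hypothetical sketch (the neighborhoods, crime rates, and arrest counts are all invented for illustration, not drawn from any real department's system): two neighborhoods have identical true crime rates, but the historical arrest records are skewed toward one of them, and patrols are allocated in proportion to those records.

```python
# Toy simulation of a predictive-policing feedback loop.
# All numbers here are hypothetical; both neighborhoods have the
# SAME underlying crime rate, but the arrest history is skewed.
import random

random.seed(42)

TRUE_CRIME_RATE = {"A": 0.10, "B": 0.10}  # identical real crime levels
arrests = {"A": 30, "B": 10}              # biased historical records
TOTAL_PATROLS = 100

for week in range(10):
    total = arrests["A"] + arrests["B"]
    # The model dispatches patrols in proportion to past arrests.
    patrols = {n: round(TOTAL_PATROLS * arrests[n] / total) for n in arrests}
    for n in arrests:
        # Arrests can only happen where officers are present, so the
        # neighborhood with more patrols records more arrests.
        arrests[n] += sum(random.random() < TRUE_CRIME_RATE[n]
                          for _ in range(patrols[n]))
    print(f"week {week + 1}: patrols={patrols}, cumulative arrests={arrests}")
```

Because arrests can only be made where officers are looking, the skewed starting data keeps reinforcing itself, and the gap between the two neighborhoods widens week after week even though the underlying crime rates never differ.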

After an arrest

If you’ve been arrested, you may be worried that bias and prejudice played a role. If so, your rights may have been violated, and it’s important to know about all of your criminal defense options.