Why is predictive policing flawed?

Predictive policing is the use of a computer system to predict crime. This sounds like something out of a science fiction novel, but it’s actually pretty simple. If you put all the arrest and response data into a computer, it can create a model of where and when certain types of crimes tend to happen. It can then “predict” that similar future crimes will happen at similar locations. Police officers are dispatched in advance, and they know where to look.
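At its simplest, the idea can be illustrated with a toy sketch (not any vendor's actual algorithm): count past recorded incidents by neighborhood and flag the busiest areas as the next "hotspots." The neighborhood names and records below are hypothetical.

```python
from collections import Counter

def predict_hotspots(incident_log, top_n=2):
    """Toy hotspot model: rank neighborhoods by how many past incidents
    were recorded there and 'predict' the top few. Real systems are far
    more elaborate, but the core idea is similar."""
    counts = Counter(record["neighborhood"] for record in incident_log)
    return [area for area, _ in counts.most_common(top_n)]

# Hypothetical historical arrest/response records
incident_log = [
    {"neighborhood": "Eastside", "type": "theft"},
    {"neighborhood": "Eastside", "type": "assault"},
    {"neighborhood": "Downtown", "type": "theft"},
]

print(predict_hotspots(incident_log))  # ['Eastside', 'Downtown']
```

Notice that the model only ever sees what was recorded, which is exactly where the trouble begins.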

When departments talk about predictive policing, they often glowingly report that it takes all the bias out of the system. An officer cannot be biased against people of any age, race, gender or economic level because it is not an officer deciding where to go; it is a computer. A computer, the argument goes, does not have any biases, so the system must be fair.

But is it? It sounds like a good theory, but it is deeply flawed. Remember, the initial data still comes from the police officers on the ground. As one expert put it, "the data collection for your system is generated by police." If those officers were biased to begin with, that bias feeds right back into the system.

Say an officer is biased against an ethnic minority. He or she tends to arrest more people from this group and spends more time in the neighborhoods where they live. The arrest data fed to the computer then indicates that a lot of crime happens there, but only because that officer is targeting those individuals and overlooking crime elsewhere. The computer then predicts that more crime will happen in that area and sends more officers, who report their own data from the area, leading to even more predictions. Over time, the system's predictions become extremely biased.
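A small simulation makes this feedback loop concrete. Purely for illustration, assume two neighborhoods with identical true crime rates, a small initial recording bias toward one of them, and a dispatch rule that sends most patrols to whichever area has the most recorded crime. Because crime is only recorded where officers are present to see it, the favored area's lead grows every year. All numbers and the dispatch rule here are assumptions, not any real department's policy.

```python
import random

random.seed(42)

TRUE_RATE = 0.5                 # true crime rate is identical in both areas
recorded = {"A": 12, "B": 10}   # small initial recording bias toward area A

for year in range(1, 11):
    # The "prediction": send most patrols where most crime was recorded.
    hotspot = max(recorded, key=recorded.get)
    patrols = {area: (8 if area == hotspot else 2) for area in recorded}

    # Crime is only recorded where officers are present to observe it.
    for area, n in patrols.items():
        recorded[area] += sum(random.random() < TRUE_RATE for _ in range(n))

    print(year, recorded)
```

Even though the underlying crime rates never differ, the gap in recorded crime widens year after year: the model is learning where police have looked, not where crime actually occurs.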

There is no room for bias and prejudice in the justice system. If you believe you have been arrested unfairly, make sure you know what defense options you have.