MINORITY REPORT POLICING

Predictive policing…Jesus Christ, it’s here.

Thanks to a lawsuit filed by the very specifically named civil rights watchdog group Stop LAPD Spying Coalition, the Los Angeles Police Department was forced to release documents detailing the use of its predictive policing algorithms.

https://injusticetoday.com/the-lapd-has-a-new-surveillance-formula-powered-by-palantir-1e277a95762a?gi=deb46640ba7f

The system works by assigning individual citizens a number of points. How many points someone is assigned depends on their past offenses. Even just having a recorded interaction with a police officer can generate a point.

Points are then used to estimate the likelihood that an individual will commit a crime in the future. The algorithm processes the points and delivers a “must watch” list, and police then monitor the people whose names appear on it.
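To make that concrete, here’s a rough sketch of what a point-based scoring heuristic like this could look like. The categories, weights, and function names are all hypothetical; they illustrate the structure of the system, not the LAPD’s actual formula.

```python
# A hypothetical sketch of a point-based "must watch" scoring system.
# The categories and weights below are invented for illustration only;
# they are NOT the LAPD's actual formula.

def offender_score(record):
    """Total up points from a person's recorded history with police."""
    points = 0
    points += 5 * record.get("violent_crime_arrests", 0)  # past offenses weigh heavily
    points += 3 * record.get("other_arrests", 0)
    points += 1 * record.get("police_contacts", 0)         # even a routine stop adds a point
    return points

def must_watch_list(records, top_n=10):
    """Rank everyone by score and return the highest-scoring names."""
    ranked = sorted(records, key=lambda name: offender_score(records[name]), reverse=True)
    return ranked[:top_n]

people = {
    "person_a": {"violent_crime_arrests": 1, "police_contacts": 4},
    "person_b": {"police_contacts": 12},   # no arrests at all, just lots of recorded stops
    "person_c": {},
}
print(must_watch_list(people, top_n=2))    # person_b tops the list on stops alone
```

The last line is the point the article is making: with no offense on file, someone can end up at the top of the list purely from recorded police contacts.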

Here’s the problem. The predictions are developed using previous policing data and are lacking in context. We know for a fact that in many cities, Los Angeles especially, there is a serious problem with over-policing of certain minority neighborhoods. Policing in America tends to be hugely racist, which itself creates criminality, which creates over-policing, which creates... But the more you hound any group, the more illegality you’ll discover. Crack down on any given gated community and there’ll be a march of white-collar criminals filling up the county jailhouse in no time at all.

The program comes from Peter (stop helping) Thiel’s A.I.-centric surveillance company Palantir, which is named after the all-seeing stones in Lord of the Rings. Ugh…

The use of old, racist data creates a feedback loop that the algorithm not only encourages but amplifies. But now it’s a machine making the prejudiced decisions, and machines are pure and good and color-blind, so how can it be wrong? What is already a difficult topic to navigate becomes all the more fraught because accountability has been passed along to something many perceive to be infallible.
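That feedback loop is easy to reproduce in a toy simulation. In the sketch below (with numbers made up purely for illustration), two neighborhoods have the exact same underlying crime rate, but one starts with more recorded incidents because it was patrolled more heavily in the past. If patrols are then allocated in proportion to recorded incidents, the historical disparity never washes out, and the gap in recorded crime just keeps growing.

```python
# Toy feedback-loop simulation -- all numbers are invented for illustration.
import random

random.seed(0)

TRUE_CRIME_RATE = 0.3              # identical in both neighborhoods
TOTAL_PATROLS = 100
recorded = {"A": 60, "B": 40}      # A starts with more records due to past over-policing

for year in range(10):
    total = sum(recorded.values())
    for hood in recorded:
        # patrols go where the data says the crime is...
        patrols = round(TOTAL_PATROLS * recorded[hood] / total)
        # ...but you only record the crime you're there to observe
        new_records = sum(1 for _ in range(patrols) if random.random() < TRUE_CRIME_RATE)
        recorded[hood] += new_records
    print(f"year {year}: {recorded}")
```

Neighborhood A ends up with far more recorded crime than B even though, by construction, people in both commit crimes at exactly the same rate. The data the next round of predictions trains on has the original disparity baked in.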

The LAPD isn’t intentionally using discriminatory programs, but it may have radically jumped the gun by deploying algorithms that depend on flawed data. The idea is to improve life for everyone by using unbiased technology.

The problem is that right now our technology is only as good as we are.



@writesbackwards is a group of friends who love to write about life, sports, comedy, tech and other fun stuff!

Consider leaving a comment; we love rewarding engaging Steemians!


Good post. I've followed you and given a vote. I'm new to Steemit, so please visit my page and help me make the most of it.
Thanks from me, @fauzan93.

Interesting

I agree. The algorithm makes no sense and is skewed by a number of biases. It assumes some kind of linearity and even distribution. It's a terrible theory that should be filed away as an "ok, that didn't work out so well" idea.

That is the biggest flaw in the desire to implement AI prediction systems.

Because these neural networks need training, they will be trained to hold those same biases.
