It’s July 17, 2014, and as Eric Garner is killed by the police, his final words are, “I can’t breathe.”
It’s April 12, 2018, and a barista calls the cops on two black men waiting patiently for a friend in a Starbucks.

It’s August 4, 2025, and the Chicago Police Department, now relying heavily on facial recognition artificial intelligence software, wrongly identifies and arrests Barack Obama.

While that last example is hypothetical, we’ve already seen the damaging ramifications of biased A.I. technology. Courts in Broward County, Florida, currently use risk assessment A.I. to predict whether a defendant charged with a petty crime is likely to commit more serious crimes in the future. This software wrongly labels black defendants as likely future criminals almost twice as often as it does white defendants.