Chicago PD's automated policing program got this man shot twice
What the hell. This is incredibly dystopian shit.
They told McDaniel something he could hardly believe: an algorithm built by the Chicago Police Department predicted […] that McDaniel would be involved in a shooting. That he would be a “party to violence,” but it wasn’t clear what side of the barrel he might be on. He could be the shooter, he might get shot. They didn’t know. But the data said he was at risk either way. McDaniel was both a potential victim and a potential perpetrator, and the visitors on his porch treated him as such. A social worker told him that he could help him if he was interested in finding assistance to secure a job, for example, or mental health services. And police were there, too, with a warning: from here on out, the Chicago Police Department would be watching him. The algorithm indicated Robert McDaniel was more likely than 99.9 percent of Chicago’s population to either be shot or to have a shooting connected to him. That made him dangerous, and top brass at the Chicago PD knew it. So McDaniel had better be on his best behavior.
tl;dr: police attention, and the apparently-"suspicious" interactions with cops that resulted from the predictive-policing listing, led to McDaniel being assumed to be a "snitch" -- which in turn resulted in several attempts on his life. What a mess.

(tags: precrime predictive-policing policing chicago dystopia future ai heat-list)