This story is in the news mainly because it involves AI, which is seen as some kind of miraculous solution to lots of problems. In reality, it is just another risk assessment tool; the only difference is that a machine, rather than a probation officer, is totting up a person's score to decide, for example, whether they can be released from prison.
However, I wonder whether this tool might be used for other things, such as deciding who to put on the police watchlist when they run their facial recognition fishing expeditions in the local high street. We know that someone can already be harassed just for walking down the street, and the criteria for the homicide prediction project include health markers such as mental illness. So anybody with both a conviction and a mental health issue could potentially end up on a list to be stopped and questioned "just in case".