Predictive policing has unmasked a new breed of future criminals: MPs.
A new testing system has put five EU politicians in the spotlight as “at risk” of committing future crimes. Luckily for them, it’s not a tool used by law enforcement agencies, but rather one designed to expose the dangers of such schemes.
The project is the brainchild of Fair Trials, a criminal justice monitoring organization. The NGO is campaigning for a ban on predictive policing that uses data analytics to predict when and where crimes are likely to happen – and who might commit them.
Proponents argue that the approach can be more accurate, objective, and effective than traditional policing. But critics warn that it perpetuates historical prejudices, disproportionately targets marginalized groups, reinforces structural discrimination, and violates civil liberties.
“It may seem incredible that law enforcement and criminal justice agencies make predictions about crime based on people’s background, class, ethnicity and associations, but that is the reality of what is happening in the EU,” said Griff Ferris, senior legal and policy officer at Fair Trials.
In fact, the technology is becoming increasingly popular in Europe. In Italy, for example, a tool called Dalia analyzed ethnicity data to profile people and predict future crime. In the Netherlands, the so-called Top 600 list was used to predict which young people would commit serious crimes. One in three people on the list – many of whom reported being harassed by police – was of Moroccan descent.
To illustrate the effects, Fair Trials developed a mock assessment of future criminal behavior.
Unlike many real systems used by the police, the analysis has been made completely transparent. The test uses a questionnaire to profile each user. The more “yes” answers they give, the higher their risk score. Here you can try it yourself.
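The scoring mechanism described above – more “yes” answers, higher risk – can be sketched in a few lines. This is a toy illustration of that kind of additive questionnaire score, not Fair Trials’ actual tool; the questions and the band cutoffs below are invented for the example:

```python
# Toy sketch of a yes/no questionnaire risk score.
# Illustrative only: questions and thresholds are invented,
# not Fair Trials' real assessment.

QUESTIONS = [
    "Did you grow up in a low-income neighborhood?",
    "Have you ever been stopped by police?",
    "Do you know anyone with a criminal record?",
]

def risk_score(answers):
    """Each 'yes' answer adds one point to the risk score."""
    return sum(1 for a in answers if a)

def risk_band(score, total):
    """Map the score onto low/medium/high bands (invented cutoffs)."""
    ratio = score / total
    if ratio < 1 / 3:
        return "low risk"
    if ratio < 2 / 3:
        return "medium risk"
    return "high risk"

answers = [True, False, False]  # one 'yes' answer out of three
print(risk_band(risk_score(answers), len(QUESTIONS)))  # -> medium risk
```

The point of the transparency is visible even in this sketch: every input and cutoff is inspectable, which is exactly what many real deployed systems do not allow.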
Politicians from the Social Democrats, Renew, Greens/EFA and the Left Group were invited to test the tool. After completing the quiz, MPs Karen Melchior, Cornelia Ernst, Tiemo Wölken, Petar Vitanov and Patrick Breyer were all rated as “medium risk” for future offences.
The group faces no consequences for their hypothetical crimes. In real life, however, such systems could add them to police databases and subject them to close monitoring, random questioning, or stop-and-search. Their risk assessments could also be shared with schools, employers, immigration authorities and child protection services. Algorithms have even led to people being jailed on scant evidence.
“I grew up in a low-income neighborhood in a poor Eastern European country and the algorithm profiled me as a potential criminal,” Petar Vitanov, an MEP from the Bulgarian Socialist Party, said in a statement.
“There should be no place in the EU for such systems – they are unreliable, biased and unfair.”
Fair Trials released the test results amid growing calls to ban predictive policing.
The issue has proven controversial in proposals for the AI Act, which aims to become the first-ever legal framework for artificial intelligence. Some lawmakers are pushing for a total ban on predictive policing, while others want to give law enforcement agencies leeway.
Fair Trials has given proponents of the systems a new reason to reconsider their views: the technology can target them, too.