It was fraud on a grand scale. The “Fuck the Police” criminal gang based in Luton and Romania stole £800,000 in more than 3,000 withdrawals from cash machines in dozens of locations throughout 2024.
The police investigation matched the crime in its complexity. When detectives in Bedfordshire seized the suspects’ two dozen smartphones, they were faced with a mountain of potential digital evidence – 1.4 terabytes of information, according to the authorities, connecting co-conspirators across eastern England and the Bacau region of Romania.
Alongside human intelligence, forensics and the other mainstays of traditional policing, detectives increasingly face dauntingly vast reservoirs of digital information that could contain vital clues.
The data haul, which covered messages, geolocation data, emails, notes and photographs, was equivalent to about 500,000 ebooks and would ordinarily take months, if not years, to comb through.
It was then that the detectives of the Eastern Region Special Operations Unit reached for new AI tools, supplied by the controversial US tech company Palantir, co-founded by the Trump-supporting billionaire Peter Thiel. Palantir has more than £500m in high-profile contracts with the NHS and the Ministry of Defence. But it is also providing AI investigation tools to 11 police forces, and the government last month pledged to invest more than £115m in the “rapid and responsible development, testing and rollout of AI tools” across all 43 forces in England and Wales, including creating Police.AI: a new national centre for AI in policing.
In the cashpoint case, the AI system – known as Nectar – unlocked the trove. It read and translated more than 100,000 messages, drew charts connecting suspects, analysed their movements, combed pictures and texts for suggestions of crimes and alerted detectives to potential leads. Human policing led to the arrests, and six men were jailed in November, but without the AI the detectives would still be digging through the evidence mountain now, the force said.
“It’s not Robocop,” said Dan James, the programme manager at the unit. “It’s about how we can make our investigators more efficient.”
But the use of Palantir’s AI technology in policing has caused concern. Last year, documents showed that the system was intended to “assist with decision making” and “aid in the prevention, detection and investigation of crimes”. Data the AI system processed included political and religious opinions. Last week, Shockat Adam, the MP for Leicester South, called on ministers to provide greater transparency about another Palantir contract with Leicestershire police, which he called “dystopian”. The Lib Dem MP Martin Wrigley called for UK AI companies to be encouraged to bid for police AI contracts, and Liberty, the civil liberties campaign group, has called for the government to install “a system of strong guardrails” before police forces deploy more AI.
For Bedfordshire police, the most immediate benefit offered by its Palantir-enabled system in the cashpoint case was the translation of the phones’ contents from the suspects’ native Romanian.
“We did 100,000 messages in a day,” said an official, requesting anonymity. “We would have been waiting weeks, if not months, for that translation to be done.”
Human translation would have cost £30,000, and in the time taken to complete it, suspects in custody might have been bailed, released under investigation or fled the country, requiring re-arrests.
The AI system scanned texts for clues about other crimes and, in the cashpoint case, it identified about 120 potential offences. “If they’re talking about drugs, they’re talking about ATMs, if they’re talking about guns, [the AI] highlights them,” the official said.
The AI is trained to recognise images of cocaine and cannabis and reads all the texts and messages in a phone and tries to “understand them against UK law”. The technology is also able to help detectives put together a more accurate picture of suspects’ movements than would otherwise be possible, but the officials declined to elaborate on how this was done.
The system also creates live association charts – not dissimilar to photo pinboards familiar from TV dramas – that are constantly updated as new information arrives.
“You’re able to click on [a person and see they are] linked to that person through this and it explains why,” said James. “It’s really efficient and it stays up to date.”
At present, the AI tools are used only for investigation, and AI outputs are not used directly in prosecutions, although that remains a goal for the future.
“People could be lazy and people will get lazy,” said James. “So we need to make sure that … the AI’s suggestion is affirmed by a person.”
The force said Palantir does not hold or have access to any of its data, and that its AI tools cannot learn from it.