A powerful new policing tool transforms everyday data into a comprehensive portrait of people’s lives. UK police authorities have begun procuring AI software from an American technology company that combines sensitive personal data, such as race, health, political opinions, religious beliefs, sexuality and trade union membership, into a unified information platform. A leaked internal memo from Bedfordshire Police, released under a freedom of information request, reveals plans to roll out the “Nectar” system beyond the pilot stage.
Developed in collaboration with Palantir Technologies, Nectar combines around 80 data streams on a single platform, from traffic cameras to intelligence files. Its stated aim is to build detailed profiles of suspects and to support investigations involving victims, witnesses and vulnerable groups, including minors. The 34-page briefing indicates that police leadership wants to expand the software, used by Bedfordshire’s Eastern Region Serious Organized Crime Unit, to the national level. According to the briefing, the system could improve crime prevention and protect vulnerable people more effectively.
The move is part of a broader government initiative to deploy AI across public services, including the health and defense sectors, often through partnerships with the private sector. However, the use of Nectar, which has access to eleven “special categories” of personal data, has raised concerns among privacy advocates and some lawmakers. These categories include ethnicity, sexual orientation, political opinions, and trade union membership.
Although Palantir and Bedfordshire Police emphasize that Nectar only uses information already stored in existing law enforcement databases and remains inaccessible to non-police officers, concerns persist. Critics point to potential abuses, such as data being retained without proper deletion procedures, and the risk that innocent people could be flagged by algorithms designed to identify criminal networks. Former shadow Home Secretary David Davis told I Magazine that he was alarmed, urged parliamentary scrutiny and warned that “zero surveillance” could lead to police “gaining the powers they want.” Liberty and other campaigners have also raised concerns that Nectar is in effect a mass surveillance tool capable of creating detailed “360-degree” profiles of people.
A spokesperson for Bedfordshire Police responded that the initiative was an “exploratory exercise” focused on data that was legally obtained and securely processed. According to the force, the system speeds up case processing and supports intervention in cases of abuse or exploitation, especially those involving children. Palantir added that in the first eight days after launch, Nectar helped identify more than 120 potentially vulnerable young people and made it easier to process Clare’s Law disclosures.
The company says its role is limited to organizing data, not making decisions. Nevertheless, experts remain concerned. Although a nationwide rollout has not yet been approved, the Home Office has confirmed that the results of the pilot will feed into future decisions.
As privately developed AI tools become increasingly integrated into policing, questions of surveillance, transparency, data deletion and individual rights are growing ever more urgent.
Translated and edited by L.Earth