By Phil Pennington of RNZ
Police are using powerful new Artificial Intelligence (AI) to help them assess the risk posed by offenders when officers are called out to emergencies, sparked by the shooting murder of an unarmed constable.
SearchX is at the heart of a $200 million front-line safety programme being rolled out in response to the death of Matthew Hunt in West Auckland in 2020, and other recent gun violence.
Police have disclosed through the OIA what they have been working on for months with big tech multinationals that build data-driven policing systems.
The papers show the intelligence system used till now was so disjointed it could not access key information about gangs and firearms. Finding connections between people, one at a time, took minutes.
In a trial, SearchX instantly found 15 times more connections, such as to associates or firearms. A whole new visual intelligence function sends this to front-line police.
It can “find a person that lives in Taranaki who is connected to someone with access to a black sedan”, documents reveal.
However, data-driven policing has been highly controversial in the US and UK, accused of intensifying police racial biases, and compounded by lack of transparency.
SearchX is the intelligence system behind the Tactical Response Model or TRM, launched in March, the OIA documents said.
TRM gives police much easier access to guns and dog teams, and “the model uses police intelligence to risk-assess situations early”, the government said in March. In a trial, TRM had benefits but also put strain on existing police resources, a study showed.
SearchX was built this year by IBM and Canadian software giant Constellation’s subsidiary VA Worldwide of Canberra, using IBM Watson AI and so-called i2 visual tech spun off from IBM.
It provides “dramatically more useful” intelligence, especially about firearm threats, the papers said.
This is disclosed in a 102-page OIA response to RNZ of business cases for both parts of the double-pronged system; all costs are blanked out.
Police did not disclose the work to the public until now, with the system already in place.
The documents show they perceived a “high risk” that, “If adverse media attention occurs then the project may be delayed or closed.”
The documents also show police have set up a “Top Secret Network” to communicate with the spy and national security agencies inside government. They would not say anything more except it had been working well since late 2022.
For SearchX, one option police briefly considered was buying new technology; this could have been from the secretive firm Palantir, the papers show.
Palantir, founded by US billionaire and New Zealand citizen Peter Thiel, is closely linked to US defence and spy agencies, and poised to win a billion-dollar public health data contract in the UK.
Instead, police chose to expand their existing IBM Watson AI, and to buy and integrate a whole new visual intelligence application called i2 Notebook Premium. IBM sold off i2 recently, but prior to 2020 - when IBM turned away from facial recognition tech - the notebook had a facial recognition module that could be added.
IBM’s Watson AI is used to search across police’s National Intelligence Application and other main databases, analyse findings and send those to front-line officers’ smartphones, in the blink of an eye, the documents show. The system uses machine learning to teach itself.
Police have done a privacy impact assessment on the Watson AI, but not on the new visual intelligence application, the papers suggest.
Police’s intelligence systems were harshly faulted up to and after the 2019 mosque attacks in internal assessments, RNZ revealed in 2021 using the OIA. That same year, a police report card on the Gang Intelligence Centre revealed poor leadership and a failure to tackle harm caused by organised crime.
The newly released SearchX business cases show the National Intelligence Application “contains rich gang organisation and association data which is critical to front line safety”.
But the system - in use up till this year - was siloed and “does not retrieve any of the new NIA gang data”.
That and a swathe of other shortcomings put intelligence blinkers on front-line officers.
“The SearchX project was established to improve the speed of access and the ways in which to link intelligence together,” police director of national intelligence, Dr Dan Wildy, told RNZ in the OIA.
The OIA documents show that in 2022 police set out to build an intelligence tool able to:
· “a. Tell me who is directly or indirectly linked to a POI [person of interest].
· “b. Tell me how one POI is linked to another (perhaps through a complicated chain of connections).
· “c. Identify clusters of entities such as people in an as yet unidentified criminal organisation.”
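The three capabilities quoted above read like standard graph link-analysis queries over an association network. As a rough illustration only - the names and data below are invented, and nothing here reflects how SearchX is actually built - each query can be answered with basic graph traversal:

```python
from collections import deque

# Hypothetical association graph: edges represent known links
# (shared addresses, vehicles, affiliations, and so on).
links = {
    "poi_a": {"assoc_1", "assoc_2"},
    "assoc_1": {"poi_a", "poi_b"},
    "assoc_2": {"poi_a"},
    "poi_b": {"assoc_1"},
    "poi_c": {"assoc_3"},
    "assoc_3": {"poi_c"},
}

def linked_entities(start, graph):
    """(a) Everyone directly or indirectly linked to a POI: BFS reachability."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen - {start}

def link_chain(start, goal, graph):
    """(b) Shortest chain of connections between two POIs, or None."""
    parents, queue = {start: None}, deque([start])
    while queue:
        node = queue.popleft()
        if node == goal:
            chain = []
            while node is not None:
                chain.append(node)
                node = parents[node]
            return chain[::-1]
        for nxt in graph.get(node, ()):
            if nxt not in parents:
                parents[nxt] = node
                queue.append(nxt)
    return None

def clusters(graph):
    """(c) Connected components: candidate clusters of associated entities."""
    seen, out = set(), []
    for node in graph:
        if node not in seen:
            comp = linked_entities(node, graph) | {node}
            seen |= comp
            out.append(comp)
    return out

print(link_chain("poi_a", "poi_b", links))  # ['poi_a', 'assoc_1', 'poi_b']
print(len(clusters(links)))                 # 2
```

At production scale such queries run against graph databases or dedicated link-analysis engines rather than in-memory dictionaries, but the underlying operations - reachability, shortest path, clustering - are the same.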
The two business cases repeatedly state the new system will allow much closer monitoring of police use - which officers are accessing it, and why. It was expected to start with 20-30 officers, and "if the capability proves valuable it will be picked up by larger groups in police".
In Los Angeles, a form of data-driven policing, called “predictive policing”, was used by the LAPD for a decade from 2010 to target crime “hotspots” identified by AI mapping.
Civil liberties groups said it resulted in heavier policing of black and Latino communities, and the LAPD jettisoned it in 2020 - though critics say the tech and many practices remain the same.
The mapping tool the LAPD used has recently been adopted by the New Zealand police.
Called ArcGIS, it is being used here to generate “hotspot” maps, among other types of mapping, a public police document shows.
ArcGIS’s creator, US company Esri, heavily promotes hot spot analysis.
“Crime analysts use mapping and analytical methods such as hot spot analysis to identify crime trends and patterns,” Esri said in online promos.
“ArcGIS ... is critical to implementing evidence-based, data-driven crime reduction strategies. Data-driven policing enriches resident and officer safety, promotes transparency, and improves operational efficiency.”
After US police murdered George Floyd in 2020, 1400 researchers signed a letter calling on all US mathematicians to stop working on predictive-policing algorithms.
The SearchX business case said police wanted Esri’s ArcGIS added into the newly integrated visual analytics tech, called i2 Analyst Notebook Premium or ANB.
It added: “As Esri continue to develop the product it’s become clear that they are moving into the space currently dominated by vendors such as i2 and Palantir.”
New Zealand police told RNZ on Friday: “Police do not use or have access to IBM software for facial recognition or predictive policing.”
The Tactical Response Model superseded the police Armed Response Team trials, abandoned in 2020 after it was revealed that working groups warned police the ART teams could be seen as a new force to use against Māori.
Police then launched research into their own biases; RNZ is seeking an update on what this has found.
Bias in such practices as warrant-less searches has been widely reported.
AI, and its subset machine learning, is at risk of entrenching biases because it draws on acres of historical data that is often the result of biased practices, international research has shown.
Police said in the OIA that delivering successful crime prevention and front line safety relied on efficient and effective intelligence from their information holdings, where data was often large, complex and semi-structured.
“It’s also highly linked, such as an example of a car owned by one person but commonly driven by close family and friends, one of whom may have a cause for concern.
“An increasing focus on tactical intelligence has meant that analysts must often work under significant time pressure searching for relevant information to support a front-line operation.”
A major reason police did not buy a whole new Palantir or other system was their own ICT team was under too much pressure to cope with the overhaul required, the documents showed.
In a review just two years ago, newly released to RNZ, IBM described police ICT overall as “incomplete”, “disjointed”, “undefined”, “unclear”, “inconsistent” and with an expensive bias towards trying to build in-house bespoke systems instead of buying off-the-shelf.
IBM said it no longer offered "general-purpose facial recognition".
In 2019, its Watson Visual Recognition Service was changed to remove any face detection, analysis or identification capabilities, and now only analysed “to detect and differentiate between objects”, the company told RNZ.
“We oppose and would not condone the use of facial recognition or any other technology for mass surveillance, racial profiling, or other human rights violations.”
RNZ will report more on that shortly.