Source: The Conversation (Au and NZ) – By Kathryn Henne, Professor and Director, School of Regulation and Global Governance, Australian National University
Police departments around the world are increasingly using body-worn cameras in an attempt to improve public trust and accountability. But this has created huge amounts of data, about 95% of which is never reviewed or even seen.
Enter companies such as Axon, Polis Solutions and Truleo. These companies market artificial intelligence (AI) tools for analysing the data generated by body-worn cameras and other policing technologies.
Some police departments in the United States have trialled these tools, only to abandon them over privacy concerns.
Truleo told The Conversation that police in Australia were now using its technology, but did not name any specific department. However, when The Conversation asked Australian police departments if they were using or considering using Truleo’s software, all except the Queensland Police Service said they were not.
In a statement, a Queensland Police Service spokesperson said it is currently conducting an AI trial with “a variety of technology” as part of its work tackling domestic and family violence. The spokesperson added: “Once the trial is completed, a detailed evaluation will be undertaken before the QPS considers future options for using the technology”.
But AI will not solve the challenges facing police – at least, not by itself.
The unfulfilled promise of body-worn cameras
The increased use of body-worn cameras by law enforcement agencies in recent years follows a number of high-profile cases involving police use of force. In Australia, for example, a police officer is currently on trial for manslaughter after using a taser on a 95-year-old great-grandmother.
There is debate about whether body-worn cameras actually make police officers’ behaviour more transparent and accountable.
Some experts have said their effectiveness is uncertain. Others have said they are a failed attempt at reform.
These sentiments were echoed by a major study published earlier this year.
The study examined the use of body-worn cameras in response to domestic and family violence in Australia. It acknowledged the cameras' potential utility, but showed how footage from them may not be used to support victim-survivors because of more foundational problems with how police engage with them.
AI’s many uses in policing
Police have been using AI as part of their work for a long time.
For example, in 2000 New South Wales Police launched a program that used data analytics to predict which people were at risk of committing a crime so they could be placed under closer police supervision.
A report from the Law Enforcement Conduct Commission later revealed the program disproportionately targeted Indigenous youth, who subsequently faced heightened surveillance and increased arrests for minor crimes. This led to NSW Police ending the program in 2023.
The Queensland Police Service has also proposed a program using AI technologies to predict risk of domestic and family violence.
Experts, however, have pointed to potential unintended consequences, including criminalising victim-survivors.
Companies such as Truleo, which provides police with AI tools to analyse body-worn camera footage, say these tools improve police “professionalism”. However, it is not clear if what is being measured and assessed as “professionalism” correlates with officers’ core duties and responsibilities.
In fact, the Seattle Police Department in the US ended its contract with Truleo despite describing the trial as “promising”.
It did so after a case of unprofessional conduct came to light and the police union argued that the use of camera footage infringed on the officer’s privacy.
The need for structural reform
AI tools could help police manage and analyse body-worn camera data, but their value depends on several conditions.
First, police must thoroughly evaluate any AI tools to ensure they are fit for purpose in a local context. Many of these technologies are developed overseas and trained on data that have linguistic features such as accents, inflections and insults that are not common in Australia.
Second, police – and the companies that offer AI data analysis tools – must be transparent about how they use body-worn camera footage. In particular, they must disclose where, how and under what arrangements data is processed and stored.
Finally – and most importantly – the use of AI technologies by police should not supersede organisational and structural reforms.
Police need to examine the impact of behaviours and processes that have resulted in inequitable surveillance practices. AI technologies are not solutions to these underlying dynamics.
Without an understanding of the systemic structures that sustain disparities in the criminal legal system, police will not be prepared to address the implications of integrating AI technologies into their work. Without that understanding, these technologies are more likely to exacerbate existing injustices and inequalities.
In short, the questions about AI in policing shouldn’t simply be about the technology, but about police legitimacy.
Kathryn Henne’s research is supported by funding from the Australian Research Council, Consortium of Humanities Centers and Institutes, Google Academic Research Awards Program, National Endowment for the Humanities and Social Sciences and Humanities Research Council.
Charles Gretton receives funding from Australian government agencies. Charles’ research has also been supported by Oracle for Research. Charles is a Fellow Member of Engineers Australia, a member of the Association for the Advancement of Artificial Intelligence, and an Associate Member of the Australian Information Industry Association.
Kanika Samuels-Wortley receives funding from the Social Sciences and Humanities Research Council and the Canada Research Chairs Program.
– ref. Australian police are trialling AI to analyse body-worn camera footage, despite overseas failures and expert criticism – https://theconversation.com/australian-police-are-trialling-ai-to-analyse-body-worn-camera-footage-despite-overseas-failures-and-expert-criticism-243371