Recently, we have seen technology take a growing role in police investigations. In the past few years, many police departments, especially in the United States, have introduced body-cameras. Advances in video technology allow officers and investigators to gain insight into critical situations. Body-cam footage can serve both as a tool for police accountability and as evidence supporting officers who behave lawfully.
Axon, one of the pioneers in the field and the largest producer of police body-cameras, has announced a new AI system for analyzing body-cam videos. Beyond cameras and data storage, Axon wants to broaden its scope with this system. According to the company, the system will be able to interpret recorded events, describe them in written form, and eventually help generate police reports from those descriptions.
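Axon has not disclosed how such a pipeline would work, but the general idea is to turn video into text and then into a draft report. A minimal sketch of that idea, sampling frames and joining per-frame descriptions into a rough narrative, might look like the following; the `caption_frame` function is purely hypothetical (Axon's model is proprietary), and only the OpenCV frame extraction reflects a real API.

```python
import cv2


def caption_frame(frame):
    """Hypothetical stand-in for a video-captioning model.
    Axon's actual system is proprietary; this placeholder only
    marks where such a model would plug in."""
    return "officer approaches a vehicle"  # dummy caption


def draft_report(video_path, sample_every_n=150):
    """Sample frames from a body-cam video and join per-frame
    captions into a rough, human-reviewable draft narrative."""
    cap = cv2.VideoCapture(video_path)
    captions = []
    frame_idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if frame_idx % sample_every_n == 0:
            captions.append(caption_frame(frame))
        frame_idx += 1
    cap.release()
    return ". ".join(captions)


# Example usage (the path is illustrative):
# print(draft_report("bodycam_clip.mp4"))
```

Even under these generous assumptions, the output is only a draft that a human officer would still have to review and correct.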
However, police oversight is a critical application area, and trusting commercial AI solutions with it raises concerns. Daniel Greene, a professor at the University of Maryland’s College of Information Studies, and Genevieve Patterson, chief scientist at a computer-vision startup, address this problem in their article “Can We Trust Computers With Body-Cam Video?”.
According to them, many of the AI capabilities that Axon proposes to deploy are not yet mature enough for such a sensitive task. Moreover, they emphasize that this kind of software is proprietary, so there would be no independent way to tell whether the technology is free from bias.
A number of problems can arise when deploying AI-based technology, mainly because of biased training data. Axon says it will train its system on its existing database of roughly 30 petabytes of body-camera video collected by some 200,000 officers. Since both the AI system and the database are proprietary, there is no guarantee that the resulting system will be unbiased.
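The article does not describe Axon's training procedure, but the underlying concern is easy to illustrate: if the footage over- or under-represents certain groups or situations, a model trained on it inherits that skew. A minimal synthetic sketch (scikit-learn, entirely made-up data, not Axon's system) shows how a model can look accurate overall while performing noticeably worse on an under-represented group:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic "events": one feature plus an implicit group split.
# Group B is heavily under-represented in the training data,
# mimicking footage collected unevenly across communities.
n_a, n_b = 5000, 100
X_a = rng.normal(0.0, 1.0, size=(n_a, 1))
X_b = rng.normal(0.5, 1.0, size=(n_b, 1))
y_a = (X_a[:, 0] > 0.0).astype(int)
y_b = (X_b[:, 0] > 0.5).astype(int)   # group B follows a different rule

X = np.vstack([X_a, X_b])
y = np.concatenate([y_a, y_b])

model = LogisticRegression().fit(X, y)

# Evaluating per group: the boundary learned from the majority group
# makes noticeably more errors on the minority group.
print(f"accuracy on group A: {model.score(X_a, y_a):.2f}")
print(f"accuracy on group B: {model.score(X_b, y_b):.2f}")
```

With a proprietary system and a closed database, outsiders cannot even run this kind of per-group check.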
As noted above, fully automated video interpretation may be too optimistic a goal, even with the technology that we (or Axon) have now. According to the authors, complete video interpretation and report generation is currently impossible, and even if it became feasible, the same issues of fairness, accountability, and transparency would remain.
Still, according to the authors, Axon’s new AI system will offer at least one clearly useful feature: automatically obscuring the faces of people in body-camera videos so that they cannot be identified.
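The article does not say how Axon implements this, but face obscuring is a well-understood task. A minimal sketch using OpenCV's bundled Haar-cascade face detector and a Gaussian blur (a stand-in for illustration, not Axon's actual method) could look like this:

```python
import cv2

# Load OpenCV's bundled frontal-face Haar cascade.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)


def blur_faces(frame):
    """Detect faces in a single frame and blur each detected region
    so the person cannot be identified. Returns the modified frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        roi = frame[y:y + h, x:x + w]
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 0)
    return frame


# Example usage on one frame of a clip (paths are illustrative):
# cap = cv2.VideoCapture("bodycam_clip.mp4")
# ok, frame = cap.read()
# if ok:
#     cv2.imwrite("frame_redacted.png", blur_faces(frame))
```

A production system would need a stronger detector (Haar cascades miss profiles and partially occluded faces), but the sketch shows why this feature is far more tractable than full scene interpretation.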
In conclusion, the technology exists, but it may not be mature enough to deploy. The authors express concern and call for a public debate on this topic and the issues surrounding it. Such a debate may prove crucial before taking any further steps in either direction.