Police helicopter: speed measurement and object detection with AI

Traditional radar speed checks are getting competition from the air. At the digital police day hosted by the Behörden-Spiegel on Tuesday, the Munich startup Helsing presented a military helicopter adapted for law enforcement that uses artificial intelligence (AI) to carry out automatic object recognition with real-time geolocation and can also measure vehicle speeds.

With a corresponding platform, Helsing has made the Airbus H145M helicopter "AI-ready for a large state police force," explained Christian Fischbach, program manager at the company, which was spun off from the Helmholtz Center for Applied AI in 2016. He did not name the state. What is known is that the Bavarian Ministry of the Interior ordered eight H145 D3 transport helicopters from Airbus Helicopters for the Free State's police at the end of 2021, for 145 million euros.

Fischbach reported that the AI technology was installed together with Airbus itself and the defense company MBDA. It builds on Helsing's basic infrastructure, which first collects data from radar, telecommunications and signals reconnaissance as well as from camera and sensor feeds, among other sources. The platform for the H145M is one of several "mission modules," each of which also runs its own operating system.

Fischbach showed footage from test flights in which the technology recognized different classes of objects such as trucks, cars and people and displayed them on the helicopter's screen. Speed detection is included as well, although it "does not yet work so well" at the higher speeds encountered on freeways.
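The article does not disclose how the speed measurement works. One plausible approach, sketched here purely as an illustration, is to derive ground speed from successive geolocated detections of the same vehicle: the great-circle distance between consecutive (latitude, longitude) fixes divided by the elapsed time. All names and the track format below are assumptions, not Helsing's actual implementation.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 points."""
    p1, p2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlmb = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(p1) * cos(p2) * sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def ground_speed_kmh(track):
    """Average speed over a chronological list of (timestamp_s, lat, lon)
    detections of one tracked vehicle."""
    dist = sum(
        haversine_m(a[1], a[2], b[1], b[2])
        for a, b in zip(track, track[1:])
    )
    dt = track[-1][0] - track[0][0]
    return dist / dt * 3.6 if dt > 0 else 0.0
```

This also hints at why high freeway speeds are hard: localization error in each fix translates directly into speed error, and the effect grows as detection intervals shrink.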



Fischbach, formerly an employee of the Federal Ministry of Defense, said the platform also enables semantic searches in unstructured data combined with "pattern of life" analysis. As examples, he cited showing all white vehicles that have parked in the last few minutes or that have crossed a line at more than 30 km/h. For now, the function is not intended to "hand out fines"; it is aimed more at detecting a getaway.
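Fischbach's example query could be expressed as a simple filter over the detection stream. The sketch below is hypothetical: the `Detection` fields and the function name are assumptions for illustration, not part of Helsing's platform.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    obj_class: str          # e.g. "car", "truck", "person"
    color: str              # dominant color assigned by the classifier
    speed_kmh: float        # estimated ground speed
    seconds_since_seen: float  # age of the detection

def query_white_fast_vehicles(detections, max_age_s=300, min_speed=30.0):
    """Return recent white vehicles moving faster than min_speed km/h -
    a toy version of a 'pattern of life' query."""
    return [
        d for d in detections
        if d.obj_class in ("car", "truck")
        and d.color == "white"
        and d.seconds_since_seen <= max_age_s
        and d.speed_kmh > min_speed
    ]
```

In a real system the interesting part is upstream: object tracking and geolocation have to be reliable before such queries produce usable results.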

In principle, the technology can also be mounted on drones, even if a larger helicopter is better suited for the main functions thanks to its on-board resources, Fischbach explained. Tests with micro drones and a 3D visualization platform have already taken place in Bavaria. This provides an overview of where the unmanned aerial vehicles are "located, where they are looking, what they are seeing." Here, too, the aim is to run the core detection on the platform itself in order to be prepared for temporary failures of communication networks, for example due to jamming.

The technology is also well suited to rescue operations, Fischbach said. In recordings from the flood disaster last summer, many people were immediately recognized in one sequence on partially still flooded terrain. Such findings, which can also be obtained at twilight, could be passed on directly to the operations center and served as decision support in assessing the situation. At the same time, the representative of the company, in which Spotify co-founder Daniel Ek has invested millions, emphasized that it works "exclusively with liberal democracies" and favors ethical AI regulation.



Ulrich Wilmsmann from the French IT company Atos illustrated what the use of AI in a police officer's everyday work could soon look like. For a "spontaneous demo with vandalism," for example, officers could be optimally positioned in advance using virtual reality glasses so that the area "can be continuously monitored." Drones flying in a swarm would then identify certain actions in real time, allowing a rioter to be arrested at a "convenient location."

The company has already rehearsed such a scenario for Frankfurt, and for the Bundeswehr it also monitors battlefields to detect all enemy movements, Wilmsmann explained. The approach can be "transferred 1:1 to the police," for example for public festivals. In the chosen example, the interrogation of the arrested person is then automatically logged and translated so that it can be used in court; the judge can later navigate directly to a passage and hear the original audio.

After work, the police officer enjoys a beer "in the automatically monitored zone with movement and sound analysis," the big data expert continued the scenario. Foot patrols there would be positioned using predictive policing so that violence does not break out. Atos is already doing this in the Netherlands, in a nightlife district comparable to St. Pauli in Hamburg.

The investigative pressure is high, for example in cases of online hate and the exchange of child sexual abuse material, confirmed Wilfried Karl, president of the Central Office for Information Technology in the Security Sector (Zitis). Using learning systems, for instance for the initial inspection of digital evidence, is therefore an obvious step. But algorithms should not be trusted blindly, and users must keep citizens' fundamental rights in view.

In addition, it is important to maintain the "digital sovereignty of the security authorities," Karl emphasized. The AI solutions available on the market, however, are often niche products from only a few manufacturers worldwide. He therefore argued against over-regulation, so that needs in the security sector can continue to be met. It is nevertheless important to know the procedures and training data used and to be able to manage the risks. Martin Thüne from the Thuringian University of Applied Sciences for Public Administration described an "early discussion of questions of data protection, fundamental rights, ethics and the population's sense of security" as crucial.

"We are being overwhelmed with evidence," reported Carsten Gußmann from the central and contact point for cybercrime (ZAC) of the North Rhine-Westphalia judiciary. The unit increasingly evaluates texts, audio, images and videos automatically with AI. "We are lagging behind," he conceded. The technology has potential but has not yet arrived in everyday practice. As a pilot, the AIRA solution was tested with Microsoft; procurement is still underway and the tender is running. Especially when searching for depictions of child sexual abuse, the technology would enable an "intervention that protects fundamental rights," since it would relieve the employees. In the future, it could also be used to identify criminally relevant comments under the Network Enforcement Act or to evaluate chats.

According to Gußmann, the ZAC's experience with the Leap AI software from the Viennese company T3K-Forensics has also been good: after analyzing a mobile phone image, including large amounts of raw data, "we were able to go to the investigating judge within hours." The software delivers reports on digital evidence tailored to criminal prosecution, said T3K operations manager Martina Tschapka, pleased with the praise. Data can be read from smartphones via "various extraction devices" such as Cellebrite, Oxygen, MSAB, Grayshift and Mobiledit. It is then evaluated, via relevant image sections and video frames, with a focus on abuse material as well as extremism and terrorism. A semantic analysis to identify similarities from text to image and from image to image is also possible. All products are available in the cloud.

Michael Brand of the CDU/CSU parliamentary group campaigned for a new power allowing the police to use biometric data from video surveillance at crime hotspots, for example for facial recognition. Marcel Emmerich of the Greens put a "big question mark" behind such methods, and not only with regard to their added value. He pointed out that they would also enable "constant identification in public space."

"We want to strengthen the work of the local authorities," emphasized FDP interior policy expert Manuel Höferlin. However, this must "be able to be carried out safely under the rule of law." With the planned overall surveillance audit, the traffic light coalition therefore first wants to create transparency about the large number of powers that already exist.

As a data protection-friendly approach to video analysis, Dominik Lawatsch from Secunet brought software from the Berlin startup Brighter AI into play. It creates a "natural-looking anonymization of people" and license plates: a face, for example, is automatically exchanged for a synthetically generated mask that looks similar but can no longer be linked to the original person. In this way, the situational wealth of information in camera recordings can still be evaluated in a manner that protects fundamental rights, for example to identify unusual behavior.
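Generating a convincing synthetic replacement face requires a generative model, which is Brighter AI's core product. The principle, however, can be shown with a much cruder stand-in: given bounding boxes from an upstream face or license-plate detector, overwrite each region so the identifying detail is destroyed while the surrounding scene context survives. This toy sketch uses mean-value flattening in place of synthetic replacement; the function name and data layout are assumptions.

```python
def anonymize_regions(image, boxes):
    """image: 2D list of grayscale pixel values.
    boxes: (top, left, bottom, right) exclusive bounds from a detector.
    Each region is overwritten with its mean value - a crude stand-in
    for synthetic-face replacement."""
    out = [row[:] for row in image]  # work on a copy
    for top, left, bottom, right in boxes:
        vals = [image[y][x]
                for y in range(top, bottom)
                for x in range(left, right)]
        mean = sum(vals) // len(vals)
        for y in range(top, bottom):
            for x in range(left, right):
                out[y][x] = mean
    return out
```

The design point of the real approach is that, unlike blurring or flattening, a natural-looking synthetic face keeps downstream analytics (person detection, behavior analysis) working on the anonymized footage.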


(mho)
