Detecting Drones from Miles Away

As drone technology becomes increasingly advanced, so does the potential threat it poses to our daily lives. In situations like the crisis in Ukraine, a drone carrying an IED can put lives at risk. This makes it all the more important to detect drones from as far away as possible.


A team of experts from the University of South Australia, Flinders University, and defense company Midspar Systems has developed a detection device with a processing algorithm that significantly boosts the range of current detection systems on the market.


The idea for the device started with observations of the hoverfly, which has a keen ability not only to fly and hover around an object with speed and accuracy, but also to detect predators and other oncoming threats. It’s this multitasking adaptation that intrigued and inspired Professor Anthony Finn of UniSA and Dr. Russell Brinkworth of Flinders University, two of the lead engineers on the project.


“At the moment, most robots are totally engineered to the environment around them, and conditions must be perfect for them to be able to operate,” Brinkworth explained. “As soon as one tiny thing changes, they can't do it anymore. You have industrial robots that can make cars, but they can't make sandwiches. Building something that changes and is flexible is very hard to do, but that's what biology has to do all the time. Their adaptation is key to their survival.”


The detection system’s main component is a smart adaptive algorithm that can take data from a variety of sensor types, such as cameras or microphones. In their most recent work, the team used an array of seven microphones arranged as two concentric circles, with one set placed farther out to form an equilateral triangle. As sound propagates over the array, this precise configuration allows the system to create a direction vector and pinpoint where an acoustic signature is coming from. The fly-inspired processing boosted the detection range of the microphone system by 50 percent over the same array without it, which, in ideal conditions, is up to 2.5 miles.
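To make the direction-finding step concrete, the sketch below shows one common way a pair of microphones can estimate a bearing: cross-correlate the two signals to find the time difference of arrival, then convert that delay into an angle. The array geometry, sample rate, and correlation method here are illustrative assumptions, not the team’s published implementation; a seven-microphone array would combine several such pairings to resolve a full direction vector.

```python
# Hypothetical sketch: bearing estimation from one microphone pair via
# time difference of arrival (TDOA). Not the team's actual algorithm.
import numpy as np

FS = 48_000              # sample rate in Hz (assumed)
SPEED_OF_SOUND = 343.0   # metres per second at roughly 20 C

def tdoa(early, late, fs=FS):
    """Delay (seconds) of `late` relative to `early`, via cross-correlation."""
    corr = np.correlate(late, early, mode="full")
    lag = np.argmax(corr) - (len(early) - 1)   # peak position -> lag in samples
    return lag / fs

def bearing_from_pair(sig_a, sig_b, spacing_m, fs=FS):
    """Far-field bearing (degrees) from one microphone pair.
    Plane-wave model: delay = spacing * cos(theta) / c."""
    delay = tdoa(sig_a, sig_b, fs)
    cos_theta = np.clip(delay * SPEED_OF_SOUND / spacing_m, -1.0, 1.0)
    return np.degrees(np.arccos(cos_theta))

# Toy example: the same broadband burst reaching the second mic 30 samples later.
rng = np.random.default_rng(0)
burst = rng.standard_normal(FS // 10)              # 0.1 s of noise
delay_samples = 30
sig_a = np.concatenate([burst, np.zeros(delay_samples)])
sig_b = np.concatenate([np.zeros(delay_samples), burst])
print(f"estimated bearing: {bearing_from_pair(sig_a, sig_b, spacing_m=0.5):.1f} degrees")
```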


“Drones have distinct tonal harmonics that the human ear can identify easily when they are in close range,” Finn said. “The beauty of the adaptability and functionality of this bioinspired model is that it intrinsically attempts to enhance that which is considered to be a signal and suppresses that which is considered to be noise.”
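As a rough illustration of that “enhance the signal, suppress the noise” behavior, the sketch below normalizes each frequency band of a spectrogram by a slowly adapting estimate of its own background level, so steady broadband noise flattens out while newly appearing tonal components stand out. This is only a loose analogy to the biological adaptation the researchers describe; the time constant, spectrogram shape, and toy data are assumed values.

```python
# A minimal sketch, assuming a magnitude spectrogram as input: each frequency
# band is divided by a slow running estimate of its own background level.
import numpy as np

def adaptive_band_gain(spectrogram, alpha=0.995, eps=1e-8):
    """spectrogram: (n_bands, n_frames) magnitude values.
    Returns each band relative to its slowly adapting background, so steady
    noise sits near 1.0 while new tonal components rise above it."""
    n_bands, n_frames = spectrogram.shape
    background = spectrogram[:, 0].copy()              # initial per-band background
    enhanced = np.empty_like(spectrogram)
    for t in range(n_frames):
        frame = spectrogram[:, t]
        enhanced[:, t] = frame / (background + eps)    # boost relative to background
        background = alpha * background + (1 - alpha) * frame  # slow adaptation
    return enhanced

# Toy example: a steady noise floor, plus a faint tone appearing halfway through.
rng = np.random.default_rng(1)
spec = rng.random((64, 400)) * 0.5 + 1.0               # broadband background
spec[20, 200:] += 0.3                                  # faint drone-like tonal component
out = adaptive_band_gain(spec)
print("band 20 mean gain before vs after tone onset:",
      out[20, :200].mean().round(2), out[20, 200:].mean().round(2))
```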


One area where the team was able to advance the design further was the use of wind-cancelling microphones, which greatly improved the reliability of signals coming from even farther away. Even with a traditional processing approach, the device can determine the distance and location of a drone with significantly higher accuracy, and at much greater range, than the human ear.


It’s important to note that this system isn’t identifying and confirming a drone so much as it is flagging regions where attention should be paid. Processing and classifying what type of drone might be there is not the priority, which is different from what other systems currently in development are trying to achieve. This is what makes the work unique: it can act as a complement, not an alternative, to current standard approaches.


“By saying, ‘here's where your attention should be,’ it negates the need to analyze everything in excruciating detail, which is what's currently being done,” Brinkworth said. “This is why current detection systems are so slow and unable to adapt to their environments.”
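The cueing idea Brinkworth describes can be sketched as a cheap front end that scores every bearing-and-time cell and hands only the cells above a threshold to a downstream classifier. The score map, threshold, and classify_region() hook below are hypothetical placeholders for illustration, not part of the published system.

```python
# A minimal sketch of attention cueing: flag only the cells worth a closer look.
import numpy as np

def regions_of_interest(score_map, threshold=4.0):
    """score_map: (n_bearings, n_frames) detection scores from the front end.
    Returns (bearing, frame) indices that exceed the threshold."""
    return np.argwhere(score_map > threshold)

def classify_region(bearing, frame):
    # Placeholder for a downstream classifier that only ever sees flagged cells.
    return f"inspect bearing bin {bearing} at frame {frame}"

rng = np.random.default_rng(2)
scores = rng.normal(0.0, 1.0, size=(36, 500))   # background: nothing interesting
scores[12, 300:320] += 6.0                      # a brief, strong return at one bearing
for bearing, frame in regions_of_interest(scores)[:5]:
    print(classify_region(bearing, frame))
```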


Brinkworth noted that video surveillance systems today are inefficient because they are overwhelmed with data. There are thousands of cameras set up and millions of hours of footage being collected, but no one has time to sort through it all.


“You've all seen the footage of security guards sitting in front of dozens of monitors,” he said. “They can't pay attention to everything all at once. So, we are building a system that tells you when there's an anomaly, when there's something that needs you to pay attention to it, whether that's visual or acoustic.”


Currently, the system is mounted on the ground, but the team wants to develop the design further by incorporating it into a drone itself so that it can monitor and scan while in the air. They’d also like to create a system that can monitor on and in water. The same principles can be applied to a visual system, so that a series of cameras can detect and pinpoint areas of interest.


The key is that the system, whether audio or visual, is the first to filter out all of the unnecessary incoming data and handle the initial processing before a classification system steps in. This technology would be incredibly useful in a multitude of situations, from autonomous vehicles to basic speech recognition.


“If we are able to remove the environmental effects on the signal, we don't need to test every possible configuration of lighting or noise that could come into the sensor because the signal is still the same,” Brinkworth said. “So, we remove a dramatic confusing factor, which will improve the speed and reliability under different conditions.”

