DHS Tests Drone Video Detection with Sandia National Laboratories
Friday, November 02, 2018 | Comments

The Department of Homeland Security (DHS) Science and Technology Directorate (S&T) and Sandia National Laboratories are developing a more precise drone detection capability based on video imagery.

“If you have a video of something, you can kind of identify it based on certain characteristics,” said Jeff Randorf, an S&T engineering advisor. “You can train neural networks to recognize patterns, and the algorithm can begin to pick up on certain features.”

Until now, video of drones was limited to raw data analysis, which meant capturing and learning from the footage alone. The temporal frequency analysis (TFA) being tested at Sandia dives deeper into the image. Instead of relying on heat signatures, acoustic signatures or taking a video at face value, TFA analyzes the frequency at which pixels fluctuate in an image over time, eventually obtaining a “temporal frequency signature” for each drone it has observed. Pairing robust imaging systems with machine learning in this way could make seamless drone discrimination only a matter of time.
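The idea behind a temporal frequency signature can be sketched in a few lines: track each pixel's intensity across a stack of frames, take a Fourier transform along the time axis, and average the magnitude spectra into one signature for the scene. The sketch below (NumPy; all function names and parameters are hypothetical, and this is not Sandia's patented method) illustrates the concept on a synthetic clip whose pixels flicker at 40 Hz, as a fast-spinning rotor might.

```python
import numpy as np

def temporal_frequency_signature(frames, fps):
    """Crude temporal-frequency signature of a video clip.

    frames: array of shape (num_frames, height, width), grayscale intensities.
    fps: frames per second of the capture.
    Returns (freqs, signature): frequency bins and the mean magnitude
    spectrum of per-pixel intensity fluctuations over time.
    """
    frames = np.asarray(frames, dtype=float)
    # Subtract each pixel's mean so the DC component doesn't dominate.
    fluctuations = frames - frames.mean(axis=0, keepdims=True)
    # FFT along the time axis gives each pixel's fluctuation spectrum.
    spectra = np.abs(np.fft.rfft(fluctuations, axis=0))
    # Average over all pixels to get one signature for the clip.
    signature = spectra.mean(axis=(1, 2))
    freqs = np.fft.rfftfreq(frames.shape[0], d=1.0 / fps)
    return freqs, signature

# Synthetic 2-second clip at 240 fps: every pixel oscillates at 40 Hz.
fps, n = 240, 480
t = np.arange(n) / fps
clip = 0.5 + 0.1 * np.sin(2 * np.pi * 40 * t)[:, None, None] * np.ones((n, 8, 8))
freqs, sig = temporal_frequency_signature(clip, fps)
print(freqs[np.argmax(sig)])  # spectral peak at 40.0 Hz
```

In practice the interesting information is where the energy sits in the spectrum: a drone's rotors and a bird's wingbeats fluctuate at very different rates, which is what makes the signature discriminative.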

“Current systems rely upon exploiting electronic signals emitted from drones, but they can’t detect drones that do not transmit and receive these signals,” said Bryana Woo of Sandia National Laboratories.

Previously, drones could be spotted by picking up the radio signal between a remote control and the drone itself, but as drones become autonomous, that capability may quickly vanish. TFA, by contrast, captures tens of thousands of frames in a video, so a machine can learn about an object from how it moves through time. If mastered, TFA could be the most precise discrimination method to date.

The Sandia tests consisted of capturing impressions of three different multirotor drones on a streaming video camera. Each drone traveled forward and back, side to side, and up and down while the camera captured its spatial and temporal location. A machine learning algorithm was trained on the frames taken. Ultimately, the analysis renders the full flight path of the target object in all directions.
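The training step described above can be illustrated with a toy classifier over such signatures. The sketch below (NumPy only; the peak frequencies, labels, and nearest-centroid approach are all invented for illustration and are not the algorithm Sandia used) builds synthetic "drone" and "bird" signatures, averages each class into a centroid, and labels new signatures by their closest centroid.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_signature(peak_bin, n_bins=64, noise=0.05):
    """Synthetic temporal-frequency signature: a spectral peak plus noise.
    Stand-in for signatures extracted from real video; values invented."""
    sig = noise * rng.random(n_bins)
    sig[peak_bin] += 1.0
    return sig

# Hypothetical training set: drone rotors fluctuate fast (high bin),
# bird wingbeats fluctuate slowly (low bin).
drone_train = np.array([make_signature(40) for _ in range(20)])
bird_train = np.array([make_signature(5) for _ in range(20)])

# "Training" here is just one mean signature (centroid) per class.
centroids = {"drone": drone_train.mean(axis=0), "bird": bird_train.mean(axis=0)}

def classify(signature):
    """Label a signature by its nearest class centroid (Euclidean distance)."""
    return min(centroids, key=lambda c: np.linalg.norm(signature - centroids[c]))

print(classify(make_signature(40)))  # drone
print(classify(make_signature(5)))   # bird
```

A real system would replace the nearest-centroid step with a trained neural network, as the article's sources describe, but the pipeline shape is the same: extract a frequency signature per object, then discriminate drone from clutter.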

To challenge the system, testers provided more complex data with lots of clutter in the environment: birds, cars and helicopters around the drone. Over time, Sandia noticed a considerable difference in the system’s ability to discern whether an object was a drone or a bird.

The number of commercial and personal drones in the sky is expected to nearly triple within the current decade, raising concerns about how their traffic will be managed, how nefarious drones can be identified and how to simply tell drones apart from their surroundings.

The technology developed by Sandia is the subject of U.S. Patent Application Serial No. 16/141,385, entitled “Unmanned Aircraft System (UAS) Detection And Assessment Via Temporal Intensity Aliasing,” filed Sept. 24.
