The Naval Research Laboratory (NRL) and the Space Dynamics Laboratory (SDL) have been working with the support of the Office of Naval Research (ONR) on an autonomous multi-sensor motion-tracking and interrogation system.
This system reduces the workload for analysts by automatically finding moving objects, then presenting high-resolution images of those objects with no human input.
Intelligence, surveillance, and reconnaissance (ISR) assets tend to generate huge amounts of data that can overwhelm human handlers, restricting an analyst's ability to produce fast, complete intelligence reports during real-time operations. The new system being developed and tested is designed to ease exactly that bottleneck.
Dr. Brian Daniel, a research physicist in the NRL ISR Systems and Processing Section, said the tests demonstrate how a single imaging sensor can provide imagery of multiple tracked objects, a job that typically requires multiple sensors.
The interrogation sensor was the precision, jitter-stabilized EyePod, developed under the Fusion, Exploitation, Algorithm, and Targeting High-Altitude Reconnaissance (FEATHAR) program. EyePod is a dual-band visible/near-infrared and long-wave infrared sensor mounted inside a nine-inch gimbal pod assembly designed for small UAV platforms.
The mid-wave infrared nighttime WAPSS (N-WAPSS) was chosen as the wide-area sensor. It has a 16-megapixel, large-format camera that captures single frames at four hertz and provides a step-stare capability with a one-hertz refresh rate.
Dr. Michael Duncan, ONR program manager, said the demonstration was a complete success. Not only did the network sensing demonstration achieve simultaneous real-time tracking, sensor cross-cueing, and inspection of multiple vehicle-sized objects, it also showed an ability to follow smaller human-sized objects under specialized conditions, he added.
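The cross-cueing described above amounts to a scheduling problem: the wide-area sensor refreshes its tracks once per second, and a single interrogation sensor must be slewed among all of them. A minimal sketch of one plausible policy, round-robin revisiting, is shown below; the `Track` type, the `plan_cues` function, and the round-robin rule are illustrative assumptions for this article, not the actual NRL/SDL software.

```python
from dataclasses import dataclass
from typing import List

@dataclass(frozen=True)
class Track:
    """One moving object reported by the wide-area sensor (hypothetical type)."""
    track_id: int
    az_deg: float  # gimbal pointing azimuth toward the object, degrees
    el_deg: float  # gimbal pointing elevation toward the object, degrees

def plan_cues(tracks: List[Track], refreshes: int,
              budget_per_refresh: int) -> List[List[int]]:
    """For each wide-area refresh (e.g. the 1 Hz step-stare cycle), cue the
    single interrogation sensor to the next few tracks in round-robin order,
    so every tracked object is eventually revisited for a high-res look."""
    schedule: List[List[int]] = []
    idx = 0
    for _ in range(refreshes):
        frame: List[int] = []
        for _ in range(budget_per_refresh):
            if not tracks:
                break
            frame.append(tracks[idx % len(tracks)].track_id)
            idx += 1
        schedule.append(frame)
    return schedule
```

With three tracks and a budget of two interrogations per refresh, two refresh cycles yield the cue plan `[[1, 2], [3, 1]]`: every object gets imaged, and track 1 is revisited first. Real systems would weigh slew time, target priority, and track confidence rather than simple rotation.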