The increasing prevalence of drones in warfare is evident: Ukraine's defense against Russian forces, for instance, is hampered by the loss of thousands of drones every month. The widespread availability of affordable commercial drones has exacerbated vulnerabilities in the traditional security perimeters of civilian and military assets, critical infrastructure, and public gatherings, compelling the armed forces of several countries to invest in counter-UAS solutions. Seven primary technologies are employed to detect, track, and identify unmanned aerial systems (UAS): radar, visible-light cameras, thermal imaging cameras, infrared sensors, lidar, frequency monitoring devices, and acoustic sensors. A technology's effectiveness in detecting a UAS depends on factors such as the drone's size, its distance, and atmospheric conditions, so the choice of technology has a significant impact on detection probability. Beyond detection, the system must also determine the UAS's spatial position, velocity, and direction of movement.
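To make the size-and-distance trade-off concrete, the classical radar range equation ties maximum detection range to a target's radar cross-section (RCS), and a small consumer quadcopter presents an RCS orders of magnitude below that of a crewed aircraft. The Python sketch below is a minimal, idealized calculation under free-space, loss-free assumptions; every parameter value (transmit power, antenna gain, wavelength, receiver sensitivity, RCS figures) is an illustrative assumption rather than data from any fielded system.

```python
import math

def max_detection_range(pt_w, gain, wavelength_m, rcs_m2, min_detectable_w):
    """Idealized monostatic radar range equation:
    R_max = (Pt * G^2 * lambda^2 * sigma / ((4*pi)^3 * S_min)) ** 0.25
    Assumes free-space propagation and no system or weather losses."""
    numerator = pt_w * gain**2 * wavelength_m**2 * rcs_m2
    denominator = (4 * math.pi) ** 3 * min_detectable_w
    return (numerator / denominator) ** 0.25

# Illustrative (assumed) parameters: 1 kW peak power, 30 dB antenna gain,
# X-band (~3 cm wavelength), -110 dBm minimum detectable signal.
pt, g, lam = 1e3, 10**3.0, 0.03
s_min = 10**(-110 / 10) * 1e-3  # dBm -> watts

for label, rcs in [("small quadcopter (~0.01 m^2 RCS)", 0.01),
                   ("light aircraft (~1 m^2 RCS)", 1.0)]:
    r = max_detection_range(pt, g, lam, rcs, s_min)
    print(f"{label}: ~{r / 1000:.1f} km")
```

Because range scales with the fourth root of RCS, the hundredfold drop in RCS from a light aircraft to a small quadcopter shrinks the idealized detection range only to about a third, which helps explain why perimeters sized for conventional aircraft struggle against small UAS.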
Leading corporations such as Clear Align, Trakka USA Defense, ICR, ELTA North America, Teledyne FLIR, SAIC, ATSC, and Anduril Industries are at the forefront of developing cutting-edge counter-UAS (C-UAS) solutions for the U.S. Department of Defense.
Selecting the appropriate technology for C-UAS is crucial. Each drone detection and identification method has inherent limitations imposed by the physical principles of the devices it relies on. At the same time, there has been a concerted effort to develop and deploy C-UAS systems capable of detecting, tracking, identifying, and neutralizing drone threats. A diverse range of detection, tracking, and identification (DTI) systems, comprising various subsystems and solutions, is currently available. The fundamental capability of every DTI system is to detect a UAS within a specified range, track its trajectory over time, and, where possible, classify the drone. In essence, a DTI system can be divided into three core functional components:
Detection functionality: Detection identifies an object through the reception of a signal emitted by or reflected from it. Signal analysis, including time-series integration and feature extraction, enhances detection, and the derived features allow the object's state, such as position, speed, and size, to be estimated. Relevant attributes for UAS detection include the presence of rotors (as opposed to the flapping wings of birds) and position. These attributes are associated with specific sensing domains: active or passive. Active sensing relies on signal transmission and reflection, while passive sensing exploits the drone's own emissions or external energy sources. The sensing domain also encompasses the wave type, such as acoustic, electromagnetic, magnetic, or seismic. Each sub-function is characterized by specific attributes and domains.
Tracking functionality: Tracking, also known as object assessment, is essential for DTI systems. Building upon detection data, it can be performed with a single sensor or by fusing several, and its output contributes to the operational picture. Key characteristics of a comprehensive operational picture include detection range, track continuity, complete coverage of objects within the area of interest, and accurate position and velocity data for each tracked object.
Identification functionality: The identification process supplies the information needed to make informed decisions about counter-drone actions, with drone classification providing the essential data. Using detection attributes and track information as input, the identification function can also incorporate historical and external data for richer analysis; artificial intelligence (AI) techniques often handle the more complex classification tasks. The combined output of identification and tracking forms a recognized situational awareness picture. A minimal end-to-end sketch of these three functions working together is shown below.
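The sketch strings the three functions together in a deliberately simplified form: detection supplies position measurements plus a raw attribute, tracking smooths them into position and velocity estimates with a constant-velocity alpha-beta filter (a stripped-down stand-in for the Kalman-style filters real systems use), and identification applies a crude rule in place of the AI classifiers mentioned above. All class names, filter gains, and thresholds are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """Output of the detection function: a position fix plus a raw attribute."""
    t: float               # timestamp (s)
    x: float               # position east (m)
    y: float               # position north (m)
    rotor_signature: bool  # e.g. rotor lines present in the returned signal

class AlphaBetaTracker:
    """Tracking function: smooths detections into position/velocity estimates
    using a constant-velocity alpha-beta filter (a simplified Kalman filter)."""
    def __init__(self, first: Detection, alpha=0.85, beta=0.35):
        self.x, self.y = first.x, first.y
        self.vx = self.vy = 0.0
        self.t = first.t
        self.alpha, self.beta = alpha, beta

    def update(self, det: Detection):
        dt = det.t - self.t
        # Predict forward under constant velocity, then correct with the residual.
        px, py = self.x + self.vx * dt, self.y + self.vy * dt
        rx, ry = det.x - px, det.y - py
        self.x, self.y = px + self.alpha * rx, py + self.alpha * ry
        self.vx += self.beta * rx / dt
        self.vy += self.beta * ry / dt
        self.t = det.t

def identify(track: AlphaBetaTracker, det: Detection) -> str:
    """Identification function: a toy rule-based stand-in for an AI classifier."""
    speed = (track.vx**2 + track.vy**2) ** 0.5
    if det.rotor_signature and speed < 30.0:
        return "probable multirotor UAS"
    if speed > 60.0:
        return "probable fixed-wing aircraft"
    return "unresolved"

# Feed a short stream of simulated detections through the pipeline.
dets = [Detection(t, 5.0 * t, 2.0 * t, True) for t in range(1, 6)]
trk = AlphaBetaTracker(dets[0])
for d in dets[1:]:
    trk.update(d)
print(identify(trk, dets[-1]))  # -> probable multirotor UAS
```

In a fielded DTI system the identification step would draw on far richer inputs (micro-Doppler spectra, RF fingerprints, imagery, historical tracks), but the data flow from detection through tracking to classification follows the same pattern.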
Beyond the United States, armed forces of NATO member countries and other nations are evaluating both non-lethal and lethal C-UAS solutions. These systems incorporate a variety of technologies, including multi-mission radars, electro-optical/infrared cameras, radio-frequency scanners, jammers, guided rockets, interceptor drones, and small arms.
The integration of artificial intelligence is enabling drones to navigate autonomously and respond to complex environments that were previously inaccessible or too dangerous for humans. This increased drone intelligence poses significant challenges for existing counter-drone systems. By combining pre-planned flight routes with self-guided LiDAR and other sensor technologies, drones can collect data faster, more safely, more accurately, and at lower cost.
Until autonomous C-UAS can reliably and prudently handle every unpredictable set of conditions they may encounter, autonomous systems will have to be developed as human-machine collaboration systems. Recently, the field of cognitive ergonomics has shifted its attention from time and motion to mind and motion. The emphasis is on building tools and technology that match the conscious and unconscious ways humans observe, infer, and perceive their surroundings, make choices, and carry out complex tasks. Rather than designing around the naive assumption that people are consciously attentive and make fully engaged, coherent decisions, designers of autonomous systems now account for the fact that human behavior is strongly influenced, and sometimes governed, by brain functions of which people have no conscious awareness.