Transforming a UAV into an actively connected surveillance vehicle by leveraging multisensory data.

An effective assessment of illegal border crossings is planned through active data fusion from various types of sensors that complement each other's capabilities. This process uses aerial and ground sensors to devise optimal strategies for reducing type I and type II errors in airborne detections. To accomplish this, ICT, sensorics, machine learning systems and unmanned aviation technology will play a dominant role in overcoming challenges such as the deluge of uncertain sensor data and high resource consumption. Modern border surveillance platforms will have to improve sensor range and coverage capabilities and extend mission time in order to improve the effectiveness of surveillance along all border sections, remote and nearby alike. Towards this goal, the BorderUAS project will fuse and analyse multi-stream data sources to produce actionable insight into hidden data patterns, enabling the rapid decision-making demanded by strict border security requirements. Modern systems need to maximise environment sensing and bridge the physical world with computer-sensed entities through cutting-edge data analysis and context-awareness methods, supported by data fusion, feature extraction and perception, while ensuring efficient network communication for timely and secure border monitoring and surveillance.
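The intuition behind reducing type I (false alarm) and type II (missed detection) errors through fusion can be illustrated with a minimal sketch. The example below uses naive-Bayes log-odds fusion of independent per-sensor detection probabilities; this is an illustrative simplification under an assumed conditional-independence model, not the project's actual fusion algorithm, and the function name and probability values are hypothetical.

```python
import math

def fuse_detections(probs, prior=0.5):
    """Log-odds fusion of independent per-sensor detection probabilities.

    probs: per-sensor P(target | that sensor's observation).
    Assumes conditional independence between sensors -- a simplification;
    a real fusion layer must also model sensor correlations.
    """
    prior_lo = math.log(prior / (1 - prior))
    log_odds = prior_lo
    for p in probs:
        p = min(max(p, 1e-9), 1 - 1e-9)  # clamp to avoid log(0)
        # each sensor shifts the log-odds by its evidence relative to the prior
        log_odds += math.log(p / (1 - p)) - prior_lo
    return 1 / (1 + math.exp(-log_odds))

# Three weak but agreeing sensors yield a confident fused detection,
# reducing type II errors relative to any single sensor:
fused = fuse_detections([0.7, 0.65, 0.8])   # ~0.95

# A single strong false alarm is damped by two dissenting sensors,
# reducing type I errors:
damped = fuse_detections([0.9, 0.3, 0.35])  # ~0.68
```

The key property is that corroborating evidence accumulates additively in log-odds space, so complementary sensors raise confidence in true detections while isolated spurious responses are attenuated.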

Concept of BorderUAS Sensors & Fusion Integration

BorderUAS sensorics (a technical overview): The payload will include optical and hyperspectral camera arrays, LTA-UAV-specific synthetic aperture radar (SAR), laser detection and ranging (LADAR), shortwave/longwave infrared (SWIR/LWIR) and acoustic cameras, for both direct and indirect target detection (e.g. via vegetation disturbance). The SAR imaging capability will be implemented using a MIMO radar that does not require the platform to be in motion while acquiring SAR images, leveraging the unique characteristics of Hypersfera's UAV, which can hover above a given target. Besides the imaging capabilities, each antenna of the MIMO radar will be used for moving target identification. All data streams will be processed through a dedicated data processing, fusion and interpretation layer, turning "big" high-dimensional streaming data into appropriately processed, fused and interpreted information that enables effective and efficient detection in difficult terrain. The aim is for the BorderUAS solution specifications to drive the transformation of the service beyond the state of the art, providing an ultra-high-resolution multi-sensing array payload for the project's border surveillance vehicle.
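The described layering, with per-stream processing feeding a common fusion and interpretation stage, can be sketched as follows. This is an assumed toy structure: the stream names, the per-stream extractors and the `fusion_layer` function are placeholders for illustration, not the BorderUAS payload interfaces.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class SensorStream:
    name: str               # e.g. "SWIR", "LWIR", "SAR", "acoustic"
    extract: Callable       # per-stream feature extractor (placeholder)

def fusion_layer(streams: List[SensorStream],
                 frames: Dict[str, list]) -> Dict[str, float]:
    """Toy processing/fusion/interpretation stage: each raw frame is
    reduced to a per-stream feature, then merged into one record that
    a downstream detector can score."""
    record = {}
    for stream in streams:
        record[stream.name] = stream.extract(frames[stream.name])
    return record

# Hypothetical extractors: simple statistics as stand-in features.
streams = [
    SensorStream("LWIR", lambda f: sum(f) / len(f)),   # mean intensity
    SensorStream("acoustic", lambda f: max(f)),        # peak amplitude
]
fused = fusion_layer(streams, {"LWIR": [0.2, 0.4], "acoustic": [0.1, 0.9]})
```

The design point is that each modality is reduced to features independently, so the fusion stage only handles low-dimensional summaries rather than the raw high-dimensional streams.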


Overall, BorderUAS aims to combine an unmanned aerial vehicle with an ultra-high-resolution multi-sensor surveillance payload in order to support border surveillance applications and, specifically, the detection of activity in difficult terrain. The primary objectives of the BorderUAS sensorics layer include (i) sensing critical information from the external physical environment, (ii) sampling internal system signals, and (iii) extracting meaningful information from sensor data to support effective and efficient decision-making.