  • Publication
    A multi-sensorial approach for the protection of operational vehicles by detection and classification of small flying objects
    Due to the high availability and easy handling of small drones, the number of reported incidents caused by UAVs, both intentional and accidental, is increasing. To prevent such incidents in the future, it is essential to be able to detect UAVs. However, not every small flying object poses a potential threat, and therefore the object not only has to be detected, but also classified or identified. Typical 360° scanning LiDAR systems can be deployed to detect and track small objects in the 3D sensor data at ranges of up to 50 m. Unfortunately, in most cases the verification and classification of the detected objects is not possible due to the low resolution of this type of sensor. In high-resolution 2D images, a differentiation of flying objects seems more practical, and cameras in the visible spectrum, at least, are well established and inexpensive. The major drawback of this type of sensor is its dependence on adequate illumination. Active illumination could be a solution to this problem, but it is usually impossible to illuminate the scene permanently. A more practical way is to select a sensor with a different spectral sensitivity, for example in the thermal IR. In this paper, we present an approach for a complete chain of detection, tracking and classification of small flying objects such as micro UAVs or birds, using a mobile multi-sensor platform with two 360° LiDAR scanners and pan-and-tilt cameras in the visible and thermal IR spectrum. The flying objects are initially detected and tracked in 3D LiDAR data. After detection, the cameras (a grayscale camera in the visible spectrum and a bolometer sensitive in the wavelength range of 7.5 µm to 14 µm) are automatically pointed to the object's position, and each sensor records a 2D image.
A convolutional neural network (CNN) performs the identification of the region of interest (ROI) as well as the object classification (we consider classes for eight different types of UAVs and birds). In particular, we compare the classification results of the CNN for the two camera types, i.e. for the different wavelengths. The large amount of training data for the CNN, as well as the test data used for the experiments described in this paper, was recorded at a field trial of the NATO group SET-260 ("Assessment of EO/IR Technologies for Detection of Small UAVs in an Urban Environment") at CENZUB, Sissonne, France.
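The sensor-cueing step described in this abstract (a flying object detected in 3D LiDAR data, then pan-and-tilt cameras pointed to its position) can be sketched as a simple coordinate conversion. The following is an illustrative sketch only, not the authors' code: the function name and the assumption that the cameras are co-located with the LiDAR origin are hypothetical, and the CNN classification step is omitted.

```python
import math

def pan_tilt_angles(x, y, z):
    """Convert a 3D LiDAR detection (sensor-centric coordinates, metres)
    into pan/tilt angles (degrees) for a pan-and-tilt camera assumed to
    be co-located with the LiDAR origin."""
    pan = math.degrees(math.atan2(y, x))               # azimuth angle
    ground_range = math.hypot(x, y)                    # horizontal distance
    tilt = math.degrees(math.atan2(z, ground_range))   # elevation angle
    return pan, tilt

# Example: an object 30 m ahead, 30 m to the left, at 45° elevation.
pan, tilt = pan_tilt_angles(30.0, 30.0, math.hypot(30.0, 30.0))
```

In the paper's chain, the resulting angles would drive the pan-and-tilt unit so each camera can record a 2D image of the object for the CNN to classify.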
  • Publication
    Image based classification of small flying objects detected in LiDAR point clouds
    The number of reported incidents caused by UAVs, intentional as well as accidental, is rising. To avoid such incidents in the future, it is essential to be able to detect UAVs. However, not every small flying object is a potential threat, and therefore the object not only has to be detected, but also classified or identified. Typical 360° scanning LiDAR sensors, like those developed for automotive applications, can be deployed for the detection and tracking of small objects at ranges of up to 50 m. Unfortunately, the verification and classification of the detected objects is not possible in most cases, due to the low resolution of this kind of sensor. In visual images, a differentiation of flying objects seems more practical. In this paper, we present a method for distinguishing between UAVs and birds in multi-sensor data (LiDAR point clouds and visual images). The flying objects are initially detected and tracked in LiDAR data. After detection, a grayscale camera is automatically pointed at the object and an image is recorded. The differentiation between UAV and bird is then realized by a convolutional neural network (CNN). In addition, we investigate the potential of this approach for a more detailed classification of the type of UAV. The paper shows first results of this multi-sensor classification approach. The large amount of training data for the CNN, as well as the test data from the experiments, was recorded at a field trial of the NATO group SET-260 ("Assessment of EO/IR Technologies for Detection of Small UAVs in an Urban Environment").