Paper Title

Combining Visible and Infrared Spectrum Imagery using Machine Learning for Small Unmanned Aerial System Detection

Paper Authors

Goecks, Vinicius G., Woods, Grayson, Valasek, John

Paper Abstract

Advances in machine learning and deep neural networks for object detection, coupled with the lower cost and power requirements of cameras, have led to promising vision-based solutions for sUAS detection. However, solely relying on the visible spectrum has previously led to reliability issues in low-contrast scenarios, such as sUAS flying below the treeline and against bright sources of light. Alternatively, due to the relatively high heat signatures emitted from sUAS during flight, a long-wave infrared (LWIR) sensor is able to produce images that clearly contrast the sUAS against its background. However, compared to widely available visible spectrum sensors, LWIR sensors have lower resolution and may produce more false positives when exposed to birds or other heat sources. This research work proposes combining the advantages of the LWIR and visible spectrum sensors using machine learning for vision-based detection of sUAS. Utilizing the heightened background contrast from the LWIR sensor, combined and synchronized with the relatively increased resolution of the visible spectrum sensor, a deep learning model was trained to detect sUAS in previously difficult environments. More specifically, the approach demonstrated effective detection of multiple sUAS flying above and below the treeline, in the presence of heat sources, and against glare from the sun. Our approach achieved a detection rate of 71.2 ± 8.3%, improving by 69% when compared to LWIR alone and by 30.4% when compared to visible spectrum alone, and achieved a false alarm rate of 2.7 ± 2.6%, decreasing by 74.1% and by 47.1% when compared to LWIR and visible spectrum alone, respectively, on average, for single and multiple drone scenarios, controlled for the same confidence metric of the machine learning object detector of at least 50%. Videos of the solution's performance can be seen at https://sites.google.com/view/tamudrone-spie2020/.
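
As a rough illustration of the sensor-combination step described above (the abstract does not specify the actual fusion pipeline, sensor resolutions, or detector architecture), the sketch below upsamples a synchronized LWIR frame to the visible frame's resolution and stacks the two as a single multi-channel detector input; the function and variable names are assumptions, not the authors' implementation.

    # Minimal sketch of one way to pair synchronized LWIR and visible frames
    # as a multi-channel detector input. Illustrative assumption only; the
    # abstract does not describe the paper's fusion scheme.
    import cv2
    import numpy as np

    def fuse_frames(visible_bgr: np.ndarray, lwir_gray: np.ndarray) -> np.ndarray:
        """Return a 4-channel (B, G, R, LWIR) array aligned to the visible frame."""
        h, w = visible_bgr.shape[:2]
        # Upsample the lower-resolution thermal frame to the visible resolution.
        lwir_resized = cv2.resize(lwir_gray, (w, h), interpolation=cv2.INTER_LINEAR)
        # Stack the thermal channel alongside the visible channels -> (h, w, 4).
        return np.dstack([visible_bgr, lwir_resized])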
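
The reported detection rate and false alarm rate are controlled for a detector confidence of at least 50%. The sketch below shows one plausible per-frame computation of these two metrics under that confidence cutoff; the IoU-based matching rule, the 0.5 IoU threshold, and the exact metric definitions are assumptions, since the abstract does not state the evaluation protocol.

    # Hypothetical per-frame evaluation at a 50% confidence cutoff; matching
    # rule and metric definitions are assumptions, not the paper's protocol.
    from typing import List, Tuple

    Box = Tuple[float, float, float, float]  # (x1, y1, x2, y2)

    def iou(a: Box, b: Box) -> float:
        """Intersection-over-union of two axis-aligned boxes."""
        ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
        ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
        inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
        union = ((a[2] - a[0]) * (a[3] - a[1]) +
                 (b[2] - b[0]) * (b[3] - b[1]) - inter)
        return inter / union if union > 0 else 0.0

    def frame_metrics(detections: List[Tuple[Box, float]], truths: List[Box],
                      conf_thresh: float = 0.5, iou_thresh: float = 0.5):
        """Return (detection_rate, false_alarm_rate) for one frame."""
        kept = [box for box, conf in detections if conf >= conf_thresh]
        matched, false_alarms = set(), 0
        for det in kept:
            # Greedily match each kept detection to an unmatched ground-truth box.
            hit = next((i for i, gt in enumerate(truths)
                        if i not in matched and iou(det, gt) >= iou_thresh), None)
            if hit is None:
                false_alarms += 1
            else:
                matched.add(hit)
        detection_rate = len(matched) / len(truths) if truths else 0.0
        false_alarm_rate = false_alarms / len(kept) if kept else 0.0
        return detection_rate, false_alarm_rate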
