Object Detection and Pattern of Life Analysis from Remotely Piloted Aircraft System Acquired Full Motion Video
Keywords: object detection, pattern of life, Remotely Piloted Aircraft System, Full Motion Video, drone, object tracking, virtual reality, four-dimensional visualization, YOLOv3, unsupervised, artificial intelligence, machine learning, computer vision, flight parameters, survey design, unmanned aerial vehicle, uninhabited aircraft, photogrammetry, 3D model, feature engineering, exploratory data analysis, data visualization, clustering, human machine interface, game engine, Unity, Oculus Rift, geospatial
Remotely piloted aircraft systems (RPAS) enable the rapid deployment of low-cost, fully or partially autonomous aerial sensor platforms, creating new intelligence, surveillance, and reconnaissance capabilities across various domains using the cameras that are ubiquitous on most RPAS. Despite the utility of these aerial sensor systems, the full motion video (FMV) they acquire presents a big data challenge for operators: the large volumes of data generated are impractical to analyze with current workflows because of excessive time requirements, computational resources, cost, or the limited availability of human analysts. Additionally, moving the camera, rather than relying on a static network of stationary cameras, complicates the data processing steps required to generate valuable outputs. To address this big data challenge, various artificial intelligence (AI) based algorithms and data analytic workflows that can extract useful insights and knowledge from large volumes of complex and ambiguous FMV streams from airborne sensors are developed and assessed. A data acquisition campaign produced a dataset of 33 flights recording approximately eight and a half hours of RPAS-acquired FMV, which was used to assess the suite of AI-based algorithmic tools. Among the tools useful for analyzing aerial FMV are object detection and tracking, applied here to conduct pattern of life (POL) analysis, a task for which RPAS-mounted aerial sensors are well suited because they capture the spatiotemporal information crucial to understanding the context of a scenario. Analysis and interpretation of the acquired dataset revealed that state-of-the-art performance was achieved with the AI-based tools when the RPAS was deployed below an altitude of 30 m, at a velocity under 7 m/s, and at pitch angles from 25° to 65° while acquiring FMV at a resolution of 4.16 MP.
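As an illustration of the detection stage described above, the sketch below shows the non-maximum suppression (NMS) step that YOLOv3-style detectors apply to per-frame candidate boxes before tracking. It is a minimal, self-contained example, not the thesis workflow itself; the box format `(x1, y1, x2, y2)` and the 0.45 overlap threshold are assumptions chosen for illustration.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2) in pixels."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def nms(detections, iou_thresh=0.45):
    """Greedy NMS: keep the highest-scoring boxes, suppress strong overlaps.

    detections: list of (box, confidence_score) pairs for one video frame.
    """
    kept = []
    for box, score in sorted(detections, key=lambda d: d[1], reverse=True):
        if all(iou(box, kept_box) < iou_thresh for kept_box, _ in kept):
            kept.append((box, score))
    return kept
```

In a full pipeline, the surviving boxes from each frame would be handed to a tracker that associates them across frames to build the object trajectories used for POL analysis.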
The POL analysis conducted on two flights showed the two developed feature-engineering-based workflows to be robust behavioral anomaly detection tools for the staged pedestrian traffic and high-value-target assailant scenarios. The acquired data were also visualized in virtual reality within an immersive four-dimensional scene, a novel enhanced dissemination tool to aid POL interpretation and decision making. The acquisition, processing, analysis, and dissemination of the data from the 33 flights indicate that RPAS-acquired FMV combined with AI-based algorithmic tools can serve as an effective and reliable platform for creating and handling big data in a variety of applications such as peace support, public safety, and aerial monitoring.
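To make the feature-engineering-based anomaly detection idea concrete, the sketch below computes simple per-track features (mean speed and path length) from object trajectories and flags tracks whose features deviate strongly from the population. This is a hedged, simplified stand-in for the thesis workflows: the feature set, the z-score test, and the threshold of 2.0 are illustrative assumptions, not the actual method.

```python
from statistics import mean, stdev

def track_features(track):
    """Engineered features for one object track given as (t, x, y) samples.

    Illustrative choice: [mean speed, total path length].
    """
    speeds, path = [], 0.0
    for (t0, x0, y0), (t1, x1, y1) in zip(track, track[1:]):
        d = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        path += d
        if t1 > t0:
            speeds.append(d / (t1 - t0))
    return [mean(speeds) if speeds else 0.0, path]

def flag_anomalies(tracks, z_thresh=2.0):
    """Flag tracks whose feature z-score exceeds z_thresh in any dimension."""
    feats = [track_features(t) for t in tracks]
    flags = [False] * len(tracks)
    for dim in range(len(feats[0])):
        col = [f[dim] for f in feats]
        mu = mean(col)
        sd = stdev(col) if len(col) > 1 else 0.0
        if sd == 0.0:
            continue
        for i, value in enumerate(col):
            if abs(value - mu) / sd > z_thresh:
                flags[i] = True
    return flags
```

For example, nine pedestrians walking at a steady pace alongside one moving an order of magnitude faster would see only the fast track flagged, which is the kind of behavioral outlier a POL workflow surfaces for an analyst.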