This paper presents a joint manifold learning (JML) based distributed sensor fusion approach for image and radio frequency (RF) data. A typical scenario includes several objects (with RF emitters) observed by a network of platforms carrying Medium Wavelength Infrared (MWIR) cameras and/or RF Doppler sensors. Building on JML sensor fusion, we design and implement a distributed heterogeneous data fusion approach for improved Detection, Classification, and Identification (DCI) of targets and entities in dynamic environments with constrained communications. The distributed JML is implemented using diffusion and consensus approaches. In our distributed mechanism, we first partition the JML matrices into submatrices, one per platform; each submatrix represents the mapping from that platform's sensor data to its contribution to the final fused result. Each node processes its local measurements using its submatrix and shares the results with a limited number of neighbors. A prototype is constructed from drones, onboard processing units (Intel Next Unit of Computing, NUC), cameras, and radars to demonstrate the proposed distributed data fusion approach. Supporting results are obtained from field tests and compared with the centralized JML framework.
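As a loose illustration of the partitioned-matrix mechanism described in the abstract, the Python sketch below shows one way such a scheme could look: a global fusion matrix is split column-wise into per-platform blocks, each node applies its block to its own measurements, and an average-consensus exchange over a sparse neighbor graph recovers the centralized fused result. All names, dimensions, the ring topology, and the Metropolis mixing weights are assumptions made for illustration and are not taken from the paper.

```python
import numpy as np

# Hypothetical sketch (dimensions and topology are assumptions, not from the paper).
# Centralized fusion:  y = W @ x, with x the stacked measurements of all N platforms.
# Partition W column-wise into blocks W_i so that  y = sum_i W_i @ x_i.
# Each node i computes its local contribution W_i @ x_i; average consensus over the
# communication graph then recovers y at every node without a fusion center.

rng = np.random.default_rng(0)

N = 4            # number of platforms (assumed)
d_local = 6      # per-platform measurement dimension (assumed)
k = 3            # joint-manifold embedding dimension (assumed)

# Global mapping and its column partition (one block per platform)
W = rng.standard_normal((k, N * d_local))
W_blocks = np.split(W, N, axis=1)

# Local sensor measurements (e.g. MWIR image features, RF Doppler features)
x_local = [rng.standard_normal(d_local) for _ in range(N)]

# Ring communication graph; Metropolis weights yield a doubly stochastic mixing matrix
A = np.zeros((N, N))
for i in range(N):
    A[i, (i - 1) % N] = A[i, (i + 1) % N] = 1.0
deg = A.sum(axis=1)
P = np.zeros((N, N))
for i in range(N):
    for j in range(N):
        if A[i, j]:
            P[i, j] = 1.0 / (1.0 + max(deg[i], deg[j]))
    P[i, i] = 1.0 - P[i].sum()

# Each node starts from N * (its own contribution), so the consensus average equals y
z = np.stack([N * (W_blocks[i] @ x_local[i]) for i in range(N)])

for _ in range(100):          # consensus iterations: neighbor-only exchanges
    z = P @ z

y_centralized = W @ np.concatenate(x_local)
print(np.allclose(z[0], y_centralized, atol=1e-6))   # nodes agree with the centralized result
```

The doubly stochastic Metropolis weights are just one convenient choice that guarantees convergence of plain average consensus on an undirected graph; a diffusion variant would instead interleave local adaptation steps with the neighbor combination step.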
Joint Manifold Learning Based Distributed Sensor Fusion of Image and Radio-Frequency Data
01.03.2019
2122752 bytes
Conference paper
Electronic resource
English