Paper
7 June 2024
Asynchronous sensor fusion for multiple object tag-less activity tracking in manufacturing
Kaipei Yang, Aria Pezeshk, Charles Lehman, Amin Arbabian
Abstract
Centralized fusion minimizes information loss and maximizes the effectiveness of state estimation; statistically, it is the optimal solution among sensor fusion configurations. In this paper, we introduce a local-sensor-driven asynchronous low-level centralized fusion methodology that seamlessly integrates radar and camera data at the level of detections from each sensor. In a local-sensor-driven asynchronous system, detections from the two sensing modalities, which have different sampling rates, are transmitted to a centralized filter that is updated whenever it receives a measurement. We implemented the proposed algorithm and validated the results using real data from manufacturing and industrial work sites. The data were collected with Plato System’s Argus perception system, which combines high-resolution imaging mm-wave radar with camera sensors to provide indoor and outdoor activity tracking. We further compare the fusion results with vision-only multiple object tracking (MOT) as well as with track-level (track-to-track) fusion.
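To illustrate the measurement-driven update scheme described above, the sketch below shows a minimal asynchronous, detection-level fusion loop: radar and camera detections with different sampling rates are processed in timestamp order by a single central Kalman filter, which predicts to each measurement time and then updates with that sensor's noise model. This is an illustrative assumption, not the authors' implementation; all names (fuse_detections, the constant-velocity model, the noise values) are hypothetical, and multi-object data association is omitted for brevity.

import numpy as np

def cv_transition(dt):
    """Constant-velocity transition matrix for state [x, y, vx, vy]."""
    F = np.eye(4)
    F[0, 2] = F[1, 3] = dt
    return F

def kalman_update(x, P, z, H, R):
    """Standard Kalman measurement update."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

def fuse_detections(detections, x0, P0, q=0.5):
    """detections: list of (timestamp, sensor, z) sorted by timestamp.
    The central filter is updated whenever any sensor reports, so the
    two modalities' different sampling rates need no synchronization."""
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)   # both sensors report a 2-D position
    R = {"radar":  np.diag([0.10, 0.10]),       # assumed radar position noise
         "camera": np.diag([0.25, 0.25])}       # assumed camera position noise
    x, P, t_prev = x0, P0, detections[0][0]
    track = []
    for t, sensor, z in detections:
        dt = t - t_prev
        F = cv_transition(dt)
        Q = q * dt * np.eye(4)                  # simple process noise, assumed
        x, P = F @ x, F @ P @ F.T + Q           # predict to the measurement time
        x, P = kalman_update(x, P, np.asarray(z, float), H, R[sensor])
        t_prev = t
        track.append((t, x.copy()))
    return track

# Example: radar at ~10 Hz and camera at ~30 Hz, arriving interleaved.
dets = sorted([(0.00, "camera", [0.00, 0.00]),
               (0.03, "camera", [0.03, 0.00]),
               (0.10, "radar",  [0.11, 0.01]),
               (0.13, "camera", [0.13, 0.00])])
print(fuse_detections(dets, x0=np.zeros(4), P0=np.eye(4)))

Because each incoming detection triggers its own predict-and-update step, the central filter naturally accommodates asynchronous sensors without resampling or buffering to a common clock, which is the key property of the low-level fusion configuration described in the abstract.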
© (2024) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Kaipei Yang, Aria Pezeshk, Charles Lehman, and Amin Arbabian "Asynchronous sensor fusion for multiple object tag-less activity tracking in manufacturing", Proc. SPIE 13057, Signal Processing, Sensor/Information Fusion, and Target Recognition XXXIII, 1305706 (7 June 2024); https://doi.org/10.1117/12.3013107
KEYWORDS
Object detection, Radar sensor technology, Cameras, Sensors, Systems modeling, Manufacturing, Sensor fusion