In this paper, a self-contained system capable of precisely recognizing traffic data in real time is designed and tested. The system detects the range and velocity of objects using a high-frequency automotive radar module. It also records a video stream and applies a YOLOv3 detection algorithm trained on the COCO dataset to identify, label, and track different classes of vehicles and pedestrians. Fusing the two sensors combines the radar's measurement accuracy with the camera's object-detection capability. The final design is deployed in a real-world environment and validated against collected ground-truth data, where it provides accurate traffic information.
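The abstract does not spell out how the radar and camera detections are associated; below is a minimal sketch of one common camera-radar fusion approach, in which radar returns are projected into the image plane through the camera intrinsics and a radar-to-camera extrinsic transform, and each return is attached to the YOLO bounding box it falls inside. The function names, the intrinsic matrix K, and the transform T_cam_radar are illustrative assumptions, not the paper's actual implementation.

import numpy as np

def project_radar_to_image(radar_points_xyz, K, T_cam_radar):
    """Project 3-D radar detections (N x 3, radar frame) into pixel coordinates."""
    pts_h = np.hstack([radar_points_xyz, np.ones((len(radar_points_xyz), 1))])  # homogeneous coords
    pts_cam = (T_cam_radar @ pts_h.T).T[:, :3]                                  # radar frame -> camera frame
    uvw = (K @ pts_cam.T).T                                                     # pinhole projection
    return uvw[:, :2] / uvw[:, 2:3]                                             # normalize by depth

def fuse_detections(boxes, radar_pixels, radar_ranges, radar_speeds):
    """Attach the nearest projected radar return to each YOLO bounding box.

    boxes: list of (x1, y1, x2, y2, label) from the camera detector (assumed format).
    """
    fused = []
    for (x1, y1, x2, y2, label) in boxes:
        cx, cy = (x1 + x2) / 2.0, (y1 + y2) / 2.0
        # Radar returns that project inside the box, ranked by distance to the box center.
        inside = [(i, (u - cx) ** 2 + (v - cy) ** 2)
                  for i, (u, v) in enumerate(radar_pixels)
                  if x1 <= u <= x2 and y1 <= v <= y2]
        if inside:
            i, _ = min(inside, key=lambda t: t[1])
            fused.append({"label": label,
                          "range_m": radar_ranges[i],
                          "speed_mps": radar_speeds[i]})
    return fused

In the deployed system, the bounding boxes would come from the YOLOv3 detector and the radar arrays from the radar module's per-frame target list; a time-synchronization step between the two streams (for example via ROS message filters) is omitted from this sketch.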
Sensor Fusion for Traffic Monitoring Using Camera, Radar, and ROS
11.08.2022
Article (Conference)
Electronic resource
English
A Camera-LiDAR Fusion Framework for Traffic Monitoring
IEEE | 2024
Traffic Incident Detection Based on mmWave Radar and Improvement Using Fusion with Camera
DOAJ | 2022