Surrounding vehicle detection is one of the most important modules in a vision-based driver assistance system (VB-DAS) or an autonomous vehicle. In this paper, we propose a wireless panoramic camera system for real-time, seamless imaging of the 360-degree driving scene. Using an embedded FPGA design, the proposed panoramic camera system performs fast image stitching and produces panoramic video in real time, which greatly relieves the computational and storage burden of a traditional multi-camera-based panoramic system. For surrounding vehicle detection, we present a novel deep convolutional neural network, EZ-Net, which perceives potential vehicles using 13 convolutional layers and locates them through a local non-maximum suppression process. Experimental results demonstrate that the proposed EZ-Net performs vehicle detection on the panoramic video at 140 fps while achieving accuracy competitive with state-of-the-art detectors.
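The abstract only states that EZ-Net is a deep CNN with 13 convolutional layers whose output is post-processed by a local non-maximum suppression (NMS) step. The sketch below illustrates that general pattern: a fully convolutional network producing a per-cell vehicle confidence map, followed by a local-maximum filter. The channel widths, kernel sizes, pooling schedule, and threshold are illustrative assumptions, not the authors' actual configuration.

```python
# Minimal sketch of an EZ-Net-style fully convolutional detector.
# Layer widths and the pooling schedule are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class EZNetSketch(nn.Module):
    def __init__(self):
        super().__init__()
        # 12 backbone conv layers; pooling every other layer shrinks the
        # panoramic frame down to a coarse grid of candidate cells.
        cfg = [(3, 32), (32, 32), (32, 64), (64, 64), (64, 128),
               (128, 128), (128, 256), (256, 256), (256, 256),
               (256, 512), (512, 512), (512, 512)]
        layers = []
        for i, (c_in, c_out) in enumerate(cfg):
            layers.append(nn.Conv2d(c_in, c_out, 3, padding=1))
            layers.append(nn.ReLU(inplace=True))
            if i % 2 == 1:
                layers.append(nn.MaxPool2d(2))
        self.features = nn.Sequential(*layers)
        # 13th conv layer: 1x1 head giving a per-cell vehicle score.
        self.score = nn.Conv2d(512, 1, kernel_size=1)

    def forward(self, x):
        return torch.sigmoid(self.score(self.features(x)))


def local_nms(heatmap, kernel=3, threshold=0.5):
    """Keep only cells that are the maximum of their local window."""
    pooled = F.max_pool2d(heatmap, kernel, stride=1, padding=kernel // 2)
    keep = (heatmap == pooled) & (heatmap > threshold)
    return keep.nonzero(as_tuple=False)  # (batch, channel, row, col) peaks


if __name__ == "__main__":
    net = EZNetSketch().eval()
    frame = torch.rand(1, 3, 256, 1024)  # one panoramic frame (toy size)
    with torch.no_grad():
        peaks = local_nms(net(frame))
    print(peaks.shape)  # coordinates of detected vehicle cells
```

In this reading, the local NMS replaces the usual anchor-box decoding: every cell that is both above threshold and a maximum within its neighborhood is reported as a vehicle location, which keeps the post-processing cheap enough for the reported frame rates.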
Surrounding Vehicle Detection Using an FPGA Panoramic Camera and Deep CNNs
IEEE Transactions on Intelligent Transportation Systems, vol. 21, no. 12, pp. 5110-5122
2020-12-01
Article (Journal)
Electronic Resource
English