We propose an automatic infrastructure-based calibration approach for vehicular sensor systems. The approach flexibly calibrates the extrinsics between multiple cameras, multiple lidars, and a GNSS/INS. It supports sensors with various fields of view (FOVs) and does not rely on overlapping FOVs. The calibration is executed for all sensors in a single unified framework and does not require pairwise calibration between sensors, which makes the approach convenient for calibrating large vehicular sensor systems, e.g., on autonomous driving trucks. The proposed approach introduces a novel, low-cost calibration target, named CalibTower, which is composed of multiple highly reflective fiducial tags. A vehicular sensor system moves around CalibTower to collect calibration data. Our approach first stitches a pointcloud map of the scene from the lidar data sequence and calibrates the extrinsics between the lidars and the GNSS/INS. We then propose a novel method to accurately detect the fiducial tags in the stitched pointcloud. With corresponding tags detected in both the pointcloud and the images, the extrinsics of the cameras are calibrated. Our experiments show that the approach is practically effective when calibrating complex vehicular sensor systems with various setups.
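As a rough illustration of the final step outlined above, the sketch below estimates a camera's extrinsics from fiducial-tag corners matched between the stitched lidar pointcloud (3D) and a camera image (2D). The function name, variable names, and the use of OpenCV's RANSAC-based PnP solver are illustrative assumptions, not the paper's actual implementation.

```python
# Minimal sketch (assumed, not the authors' code): camera extrinsic
# calibration from tag corners matched between the stitched pointcloud
# and a camera image, via RANSAC-based PnP.
import cv2
import numpy as np

def calibrate_camera_extrinsics(tag_corners_3d, tag_corners_2d, K, dist):
    """Estimate the rigid transform from the pointcloud (reference) frame
    into the camera frame.

    tag_corners_3d: (N, 3) tag corners detected in the stitched pointcloud.
    tag_corners_2d: (N, 2) corresponding pixel coordinates in the image.
    K, dist:        camera intrinsic matrix and distortion coefficients.
    """
    obj = np.asarray(tag_corners_3d, dtype=np.float64)
    img = np.asarray(tag_corners_2d, dtype=np.float64)

    # RANSAC tolerates a few mismatched or poorly localized corners.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        obj, img, K, dist,
        reprojectionError=3.0, flags=cv2.SOLVEPNP_ITERATIVE,
    )
    if not ok:
        raise RuntimeError("PnP failed; check the tag correspondences")

    R, _ = cv2.Rodrigues(rvec)      # rotation: reference frame -> camera
    T = np.eye(4)                   # homogeneous 4x4 extrinsic transform
    T[:3, :3] = R
    T[:3, 3] = tvec.ravel()
    return T, inliers
```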
CalibTower: Automatic Camera-Lidar-GNSS/INS Calibration Based on Low-Cost Infrastructure
24.09.2023
Conference paper
Electronic resource
English