This paper applies a method to obtain the extrinsic calibration parameters between a camera and a 3D-LiDAR using 3D point-to-point correspondences. We use a calibration board with an ArUco marker as a reference to obtain features of interest in both sensor frames. Through an easy-to-operate manual procedure, the plane and edges of the calibration board are extracted from the LiDAR point cloud by exploiting the geometry of the board, and the board vertices are then estimated by nonlinear optimization. The corresponding vertices in the camera image are detected with the ArUco marker API. Once the point-to-point correspondences are obtained, we apply the Kabsch algorithm to compute the final rotation and translation. The calibration accuracy is demonstrated by evaluating it in real application scenarios.
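The abstract's final alignment step, recovering a rigid rotation and translation from matched 3D vertices with the Kabsch algorithm, can be illustrated with a minimal sketch. The snippet below is an assumption-based illustration of the standard SVD-based Kabsch solution, not the authors' implementation; the function name and array layout are hypothetical.

```python
import numpy as np

def kabsch(lidar_pts, cam_pts):
    """Estimate R (3x3) and t (3,) such that cam_pts ~= R @ lidar_pts + t.

    lidar_pts, cam_pts: (N, 3) arrays of corresponding 3D points
    (e.g., calibration-board vertices in the LiDAR and camera frames).
    """
    # Center both point sets on their centroids.
    p_mean = lidar_pts.mean(axis=0)
    q_mean = cam_pts.mean(axis=0)
    P = lidar_pts - p_mean
    Q = cam_pts - q_mean

    # Cross-covariance matrix and its SVD.
    H = P.T @ Q
    U, _, Vt = np.linalg.svd(H)

    # Correct for a possible reflection so that det(R) = +1.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])

    R = Vt.T @ D @ U.T
    t = q_mean - R @ p_mean
    return R, t
```

With at least three non-collinear correspondences, the returned pair (R, t) is the least-squares rigid transform mapping LiDAR points into the camera frame.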
Application of 3D-LiDAR & Camera Extrinsic Calibration in Urban Rail Transit
2020-09-01
572693 bytes
Conference paper
Electronic Resource
English
Extrinsic Calibration of a 3D-LIDAR and a Camera
IEEE | 2020
A Survey of Extrinsic Calibration of LiDAR and Camera
TIBKAT | 2022
A Survey of Extrinsic Calibration of LiDAR and Camera
British Library Conference Proceedings | 2022