In recent years, event cameras, a type of bio-inspired sensor, have become increasingly important in robotics and autonomous driving. Because events inherently lack depth information, multi-sensor solutions are commonly used to combine events with LiDAR scans, and their performance depends heavily on precise extrinsic calibration between the sensors. However, due to the distinctive representation of event cameras, which capture only dynamic objects and convey no texture or structure, most previous calibration methods for event cameras require complex setups and controlled lighting conditions. In this paper, we propose a novel extrinsic calibration method that simulates events from LiDAR data and requires no equipment beyond the sensors themselves. Specifically, we first extract LiDAR events from the continuous raw point clouds. A clustering algorithm is then applied to refine this event data and identify motion centroids. Finally, a two-step optimization process is proposed: an initial rough calibration based on these centroids, followed by a fine calibration achieved through nearest-neighbor matching between LiDAR events and camera events. Experimental results in both simulation and real-world settings validate that our algorithm achieves high accuracy and robustness.
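The following is a minimal sketch, not the authors' implementation, of the two-step optimization outlined in the abstract. It assumes that LiDAR events and camera events are available as 3D points in comparable frames, that motion centroids from the two sensors are already paired in corresponding order, and that a Kabsch-style rigid alignment followed by ICP-style nearest-neighbor refinement is an acceptable stand-in for the paper's optimization; all function and variable names are illustrative.

```python
import numpy as np
from scipy.spatial import cKDTree

def rough_calibration(src_pts, dst_pts):
    """Rigid transform (R, t) from paired 3D points via the Kabsch algorithm.

    Here src_pts would hold LiDAR motion centroids and dst_pts the
    corresponding camera-event centroids (illustrative assumption).
    """
    mu_s, mu_d = src_pts.mean(axis=0), dst_pts.mean(axis=0)
    H = (src_pts - mu_s).T @ (dst_pts - mu_d)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    Rot = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_d - Rot @ mu_s
    return Rot, t

def fine_calibration(lidar_events, camera_events, Rot, t, iters=20):
    """ICP-style refinement: nearest-neighbor matching between event sets."""
    tree = cKDTree(camera_events)
    for _ in range(iters):
        transformed = lidar_events @ Rot.T + t       # apply current estimate
        _, idx = tree.query(transformed)             # nearest camera event per LiDAR event
        Rot, t = rough_calibration(lidar_events, camera_events[idx])
    return Rot, t
```

In this sketch, the centroid-based rough step supplies the initialization that keeps the subsequent nearest-neighbor refinement from converging to a poor local minimum, mirroring the coarse-to-fine structure described in the abstract.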
EventAlign: LiDAR-Event Camera Calibration with Event Alignment Loss
2024-09-24
1878082 bytes
Conference paper
Electronic Resource
English