Three-dimensional Light Detection and Ranging (LiDAR) provides a new approach to sensing traffic environments. Currently, the spatial point clouds collected by a roadside LiDAR sensor can only be located in the sensor's own coordinate system rather than at their actual geographic locations. Because a roadside LiDAR sensor is installed at a fixed location, a static object has the same laser-point coordinates in different LiDAR data frames. With accurate Global Positioning System (GPS) data acquisition devices, it is possible to map points from a roadside LiDAR sensor to their actual coordinates on the Google Earth map. This paper presents an innovative algorithm that automatically converts roadside LiDAR data points in the Cartesian coordinate system to locations in the geodetic coordinate system used by Google Earth. The developed algorithm needs at least four reference points with both known LiDAR coordinates and known actual geographic coordinates. The process includes three major steps: 1) reference point matching, 2) transformation matrix calculation, and 3) LiDAR data coordinate system conversion. In addition, the least squares method is used to solve the resulting over-determined optimization problem when more than four reference points are available. The results of the case study showed that the method presented in this research can automatically map the point cloud collected by roadside LiDAR to real-world geographic locations.
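The transformation matrix calculation and least-squares steps described above can be illustrated with a short sketch. The snippet below is a minimal illustration, not the paper's exact formulation: it assumes the GPS reference coordinates have already been projected into a planar frame (for example, local east-north-up metres), models the mapping as a single affine transform, and uses the hypothetical function names fit_transform and apply_transform. With exactly four non-coplanar reference points the system is fully determined; with more points, numpy.linalg.lstsq returns the least-squares solution.

    import numpy as np

    def fit_transform(lidar_pts, geo_pts):
        # Estimate a 3x4 affine transform that maps LiDAR (x, y, z) points to
        # planar geographic coordinates. Requires at least four reference
        # points; with more, numpy.linalg.lstsq gives the least-squares fit.
        lidar_pts = np.asarray(lidar_pts, dtype=float)
        geo_pts = np.asarray(geo_pts, dtype=float)
        # Homogeneous coordinates so the solved matrix carries both the
        # rotation/scale terms and the translation.
        A = np.hstack([lidar_pts, np.ones((len(lidar_pts), 1))])
        X, *_ = np.linalg.lstsq(A, geo_pts, rcond=None)  # X has shape (4, 3)
        return X.T                                       # (3, 4) transform

    def apply_transform(T, pts):
        # Map arbitrary LiDAR points into the geographic frame using T.
        pts = np.asarray(pts, dtype=float)
        A = np.hstack([pts, np.ones((len(pts), 1))])
        return A @ T.T

For example, passing the LiDAR coordinates of four or more surveyed landmarks together with their projected GPS coordinates to fit_transform yields a matrix that apply_transform can then apply to every frame of the point cloud; converting the planar result back to latitude and longitude for display in Google Earth would require an additional map-projection step not shown here.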
A data mapping method for roadside LiDAR sensors
2019-10-01
1010675 bytes
Conference paper
Electronic Resource
English
Points Registration for Roadside LiDAR Sensors
Transportation Research Record | 2019
Automatic Vehicle Classification using Roadside LiDAR Data
Transportation Research Record | 2019
Automatic Background Filtering Method for Roadside LiDAR Data
Transportation Research Record | 2018
Automatic Identification of Vehicle Partial Occlusion in Data Collected by Roadside LiDAR Sensors
Transportation Research Record | 2022
Automated Object Detection, Mapping, and Assessment of Roadside Clear Zones Using Lidar Data
Transportation Research Record | 2021