Autonomous vehicles require lane-level accuracy for safe navigation and localization. Achieving this accuracy becomes challenging, however, when GPS is unavailable due to signal blockage or disruption. This issue is often addressed by relying on HD maps, which restrict autonomous vehicle operation to specific locations. The proposed method overcomes these limitations by relying only on on-board sensor data that the vehicle already needs for its own operation. Steering angle, steering rate, yaw rate, and wheel speed measurements read from the Controller Area Network (CAN) bus are used to calculate the vehicle kinematics. Kinematic dead reckoning is prone to drift, so a map-based map matching step corrects this drift to achieve and maintain lane-level localization accuracy.

The approach employs an arc-length-based map matching technique that uses a two-dimensional lane map to align the kinematic trajectory with the spatial information of the map. The kinematic model's prediction introduces a temporal notion to the spatial information available in the map data: the arc length is computed by summing the small increments of distance traveled by the vehicle over time. The lane map consists of interpolated points, and each map segment is defined by two adjacent points. Initially, the method localizes the vehicle at the last known GPS position while the kinematic model begins computing the trajectory, which accumulates drift. The method then places the vehicle at the point on the current map segment that corresponds to the traveled distance (arc length). When the corrected position reaches the end of a map segment, it moves on to the next segment, ensuring continued and accurate lane-level localization.

Additionally, the algorithm incorporates lane detection and classification, which assist the map matching algorithm in identifying lane changes and localizing the vehicle onto the new lane: the lane detection algorithms determine left and right lane changes, and the map matching algorithm then switches to the closest lane on the corresponding side. The lane change detection model is composed of three sub-models: lane detection, lane classification, and lane change detection. The proposed model can incorporate any detection model that accepts an image as input and produces a list of lanes as output; CLRKDNet is currently employed. CLRKDNet was selected because it builds on the CLRNet architecture, which extracts deep features and generates multi-scale feature maps, integrates ResNet or DLA backbones, and employs a comprehensive loss function for precise lane detection. Knowledge distillation optimizes the model for real-time applications, reducing model complexity and runtime without compromising accuracy. The lane classification sub-model uses classical computer vision techniques to determine the type and color of lane markings: it employs the HSV color space to mask yellow and white lane markings from the road surface and curbs, and analyzes the connected components of the marking masks to categorize lanes as dashed, solid, or double solid. The final sub-model performs lane change detection by evaluating lane positions relative to crossing thresholds; when a significant accumulation of lane markings appears directly ahead of the vehicle, the model signals a lane change.
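As a rough illustration of the arc-length map matching step described above, the sketch below places the vehicle on a polyline lane map at the arc length accumulated by the kinematic model. This is a minimal sketch under assumed conventions (a two-dimensional lane map given as an (N, 2) array of interpolated points, arc length obtained by summing speed × time-step increments); the function names and interfaces are illustrative and not taken from the paper.

```python
import numpy as np

def accumulate_arc_length(speeds_mps, dt_s):
    """Arc length as the sum of small distance increments (speed * dt)."""
    return float(np.sum(np.asarray(speeds_mps) * dt_s))

def arc_length_map_match(lane_pts, s_travelled, start_idx=0):
    """Place the vehicle on the lane polyline at the given arc length.

    lane_pts    : (N, 2) array of interpolated lane points, in metres.
    s_travelled : arc length accumulated since the last known GPS fix.
    start_idx   : index of the map point matched to that last known fix.
    Returns (matched_xy, segment_index).
    """
    s, idx = s_travelled, start_idx
    while idx < len(lane_pts) - 1:
        p0, p1 = lane_pts[idx], lane_pts[idx + 1]
        seg_len = float(np.linalg.norm(p1 - p0))
        if s <= seg_len:
            if seg_len < 1e-9:          # degenerate (duplicated) map point
                return p0, idx
            # The vehicle lies on this segment: interpolate between its endpoints.
            return p0 + (s / seg_len) * (p1 - p0), idx
        # The whole segment has been traversed: carry the remainder forward.
        s -= seg_len
        idx += 1
    return lane_pts[-1], len(lane_pts) - 2  # ran past the end of the mapped lane
```

In this sketch, once the traveled arc length exceeds the remaining length of the current segment, the match simply advances to the next segment, mirroring the segment hand-over described in the abstract.

The lane classification sub-model's HSV masking and connected-component analysis could look roughly like the following OpenCV sketch. The HSV threshold ranges and the blob-count rule for dashed versus solid markings are placeholder assumptions, not the paper's calibration.

```python
import cv2

def classify_lane_marking(bgr_crop):
    """Rough colour/style classification of one detected lane marking.

    bgr_crop : BGR image crop around the detected lane line (OpenCV convention).
    Returns a (colour, style) pair such as ("yellow", "dashed").
    """
    hsv = cv2.cvtColor(bgr_crop, cv2.COLOR_BGR2HSV)

    # Placeholder HSV ranges for white and yellow paint, not the paper's values.
    white_mask = cv2.inRange(hsv, (0, 0, 180), (180, 40, 255))
    yellow_mask = cv2.inRange(hsv, (15, 80, 120), (35, 255, 255))
    colour = "yellow" if int(yellow_mask.sum()) > int(white_mask.sum()) else "white"
    mask = yellow_mask if colour == "yellow" else white_mask

    # Connected components: many short blobs suggest a dashed marking, while one
    # (or two parallel) long blobs suggest a solid or double-solid one. A crude
    # blob-count rule stands in for the paper's actual criteria here.
    _, _, stats, _ = cv2.connectedComponentsWithStats(mask)
    blobs = [row for row in stats[1:] if row[cv2.CC_STAT_AREA] > 50]
    style = "dashed" if len(blobs) >= 3 else "solid"
    return colour, style
```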
The map matching approach with lane detection and classification is independent of HD maps. The method was validated across intersections with distinct geometries and along paths with different maneuvers and speeds. To assess repeatability and accuracy, it was tested multiple times in each scenario and achieved sub-meter accuracy throughout. This ensures reliable and continuous localization and significantly enhances the operational reliability and safety of autonomous vehicles in GPS-denied scenarios.


    Title:

    Lane-Level Localization of Vehicle in GPS-Denied Environment


    Contributors:


    Publication date:

    2025-04-28


    Size:

    7621457 bytes


    Type of media:

    Conference paper


    Type of material:

    Electronic Resource


    Language:

    English



    A particle filter for vehicle tracking with lane level accuracy under GNSS-denied environments

    Zhong, Xionghu / Rabiee, Ramtin / Yan, Yongsheng et al. | IEEE | 2017


    Spacecraft 6-DoF Localization in a GPS denied Environment

    Peng, Chao-Chung / Chan, Chen-Yu / Lin, Jhih-Hong et al. | IEEE | 2021


    LaIF: A Lane-Level Self-Positioning Scheme for Vehicles in GNSS-Denied Environments

    Rabiee, Ramtin / Zhong, Xionghu / Yan, Yongsheng et al. | IEEE | 2019