Track-selective localization of railway vehicles is a precondition for more efficient logistics, improved security, and autonomous driving. Satellite-based navigation systems are often used for localization tasks; however, in many cases satellite navigation is not available or the sensor information is corrupted. New sensors are needed to enhance availability and localization quality. In this work we propose the use of a monofocal video camera to improve localization quality. Our algorithm estimates the track recursively in the camera images, and the result is used for turnout detection. Compared to GPS/INS, curvature and turnouts can be detected in advance. Our approach uses recursive estimation to follow the tracks across the images and to estimate their geometry. Experimental results show the efficiency of the proposed algorithm.
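
To give an idea of what a recursive track-geometry estimator of this kind might look like, the following minimal Python sketch runs a Kalman filter over noisy per-frame rail measurements and recursively estimates lateral offset, heading and curvature. The state model, the look-ahead distances, and all noise parameters are assumptions made for illustration only; they are not taken from the paper itself.

    # Illustrative sketch only: a simple Kalman filter that recursively estimates
    # track geometry (lateral offset, heading, curvature) from noisy per-frame
    # rail measurements. All models and noise levels below are assumptions.
    import numpy as np

    # Assumed look-ahead distances (metres) at which the rail's lateral
    # position is measured in each camera frame.
    LOOKAHEAD = np.array([10.0, 20.0, 30.0, 40.0])

    # State x = [lateral offset y0, heading psi, curvature c].
    # Lateral offset at distance d is modelled as y(d) = y0 + psi*d + 0.5*c*d^2.
    H = np.column_stack([np.ones_like(LOOKAHEAD), LOOKAHEAD, 0.5 * LOOKAHEAD**2])

    F = np.eye(3)                      # near-constant track geometry between frames
    Q = np.diag([0.05, 0.01, 1e-4])    # assumed process noise
    R = np.eye(len(LOOKAHEAD)) * 0.2   # assumed measurement noise (m^2)

    def kalman_step(x, P, z):
        """One predict/update cycle given measurement vector z (lateral offsets)."""
        # Predict
        x = F @ x
        P = F @ P @ F.T + Q
        # Update
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(3) - K @ H) @ P
        return x, P

    if __name__ == "__main__":
        x = np.zeros(3)
        P = np.eye(3)
        rng = np.random.default_rng(0)
        true_c = 1e-3                  # simulated gentle curve
        for _ in range(50):
            z = 0.5 * true_c * LOOKAHEAD**2 + rng.normal(0.0, 0.3, LOOKAHEAD.size)
            x, P = kalman_step(x, P, z)
        print("estimated curvature [1/m]:", x[2])

Because the curvature state is estimated from look-ahead measurements, such a filter can indicate an upcoming curve or diverging turnout branch before the vehicle reaches it, which is the advantage over GPS/INS claimed in the abstract.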


    Title:

    Vision-based track estimation and turnout detection using recursive estimation


    Contributors:
    Ross, R. (author)


    Publication date:

    2010-09-01


    Format / extent:

    1,260,153 bytes


    Media type:

    Conference paper


    Format:

    Electronic resource


    Language:

    English



    Similar titles:

    Vision-based track turnout identification method
    YU GUIZHEN / FU ZI'ANG / WANG ZHANGYU et al. | European Patent Office | 2020
    Free access

    Track turnout and control method of track turnout
    LIU FEIXIANG / LUO JIANLI / ZHOU WEN et al. | European Patent Office | 2021
    Free access

    Track turnout chart
    Mifflen, S.C. | Engineering Index Backfile | 1930

    Track gauge widening type turnout track state detection method and device
    SUN XIANFU / YANG FEI / WEI ZILONG et al. | European Patent Office | 2023
    Free access

    Track turnout opening direction safety detection system and detection method
    LIANG JINGYUAN / LIU GUOPING / XIE PENGCHENG et al. | European Patent Office | 2022
    Free access