Driving assistance is a popular research topic. Generally, two pieces of information are necessary for safe driving: "Where am I?" and "Where are the dangerous obstacles on the road?". This paper presents an approach that can precisely localize both the vehicle and the obstacles on the road. Data fusion from proprioceptive and exteroceptive sensors (odometer, GPS, LIDAR, and vision), combined with knowledge of the road map, provides this localization. The system is fully operational and has been tested in the city of Clermont-Ferrand. The localization accuracy already achieved is decimetric for the lateral vehicle position and metric for the longitudinal vehicle position.
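
The abstract does not name the fusion algorithm, so the sketch below is only an illustration of the general idea: a hypothetical extended-Kalman-filter fusion in which proprioceptive odometry drives the pose prediction and an absolute exteroceptive fix (here GPS) corrects it. All state choices, noise values, and the simulated data are assumptions, not details taken from the paper.

    # Hypothetical EKF-style fusion sketch (not the paper's actual implementation).
    # State: [x, y, heading]; odometry predicts, GPS position fixes correct x and y.
    import numpy as np

    def predict(state, P, v, omega, dt, Q):
        """Propagate the pose with a simple unicycle odometry model."""
        x, y, theta = state
        state = np.array([x + v * dt * np.cos(theta),
                          y + v * dt * np.sin(theta),
                          theta + omega * dt])
        # Jacobian of the motion model with respect to the state
        F = np.array([[1.0, 0.0, -v * dt * np.sin(theta)],
                      [0.0, 1.0,  v * dt * np.cos(theta)],
                      [0.0, 0.0,  1.0]])
        P = F @ P @ F.T + Q
        return state, P

    def update_gps(state, P, z, R):
        """Correct the predicted pose with an absolute position measurement."""
        H = np.array([[1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0]])           # GPS observes x and y only
        innovation = z - H @ state
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
        state = state + K @ innovation
        P = (np.eye(3) - K @ H) @ P
        return state, P

    if __name__ == "__main__":
        state = np.zeros(3)                       # start at the origin, heading along x
        P = np.eye(3) * 0.1
        Q = np.diag([0.02, 0.02, 0.01])           # assumed odometry process noise
        R = np.diag([2.0, 2.0])                   # assumed GPS measurement noise
        rng = np.random.default_rng(0)
        for step in range(50):
            state, P = predict(state, P, v=10.0, omega=0.0, dt=0.1, Q=Q)
            if step % 10 == 0:                    # GPS arrives less often than odometry
                gps_fix = state[:2] + rng.normal(0.0, 1.0, 2)
                state, P = update_gps(state, P, gps_fix, R)
        print("fused pose estimate:", state)

In the setting the abstract describes, LIDAR and vision detections and the road-map constraint would presumably enter the same correction step as additional measurement models.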





    Title: Multisensorial data fusion for global vehicle and obstacles absolute positioning

    Contributors: Laneurit, J. (author) / Blanc, C. (author) / Chapuis, R. (author) / Trassoudaine, L. (author)

    Publication date: 2003-01-01

    Size: 455543 bytes

    Type of media: Conference paper

    Type of material: Electronic Resource

    Language: English



    Similar titles:

    Visual Tracking by a Multisensorial Approach

    Trassoudaine, L. / Alizon, J. / Collange, F. et al. | British Library Conference Proceedings | 1993


    Description and tests of a multisensorial driving interface for vehicle teleoperation

    Ortiz, Jesus / Tapia, Cecilia / Rossi, Lorenzo et al. | IEEE | 2008



    Bootstrapping Computer Vision and Sensor Fusion for Absolute and Relative Vehicle Positioning

    Janssen, Karel / Rademakers, Erwin / Boulkroune, Boulaid et al. | British Library Conference Proceedings | 2015


    Multisensorial Communication Method Between Services and Remote Users and System Architecture for Actuating It

    Bergamasco, Massimo / Avizzano, Carlo Alberto / Ruffaldi, Emanuele et al. | BASE | 2005
