Accurate position estimation is a fundamental requirement for mobile robot navigation. The positioning problem consists of maintaining, in real time, a reliable estimate of the robot's location with respect to a reference frame in the environment. This paper presents a fast landmark-based position estimation method. The technique combines the orientation of the mobile robot, obtained from a heading sensor (a compass), with observations of landmarks from a vision sensor (a CCD camera). Knowing the positions of the landmarks in a fixed coordinate system and the orientation of the camera's optical axis, it is possible to recover the robot position by simple geometric considerations. Experiments carried out in our laboratory demonstrate the reliability of the method and suggest its applicability in the context of autonomous robot navigation.
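
The abstract only outlines the geometry; the details are in the paper itself. As an illustrative sketch, not the authors' implementation, the Python snippet below shows one common way such a fix can be computed: with a compass heading and camera bearings to landmarks whose world positions are known, each observation constrains the robot to a line through the landmark, and two or more such lines intersect at the robot position. The function name locate_robot and all numbers in the usage example are hypothetical.

```python
import numpy as np

def locate_robot(landmarks, bearings, heading):
    """Least-squares triangulation of the robot position (x, y).

    landmarks : iterable of (x, y) landmark coordinates in the world frame
    bearings  : camera bearing (rad) to each landmark, measured relative to
                the optical axis (assumed aligned with the robot heading)
    heading   : robot heading (rad) in the world frame, from the compass

    Each observation constrains the robot to the line through the landmark
    with direction heading + bearing; the estimate is the least-squares
    intersection of those lines.
    """
    A, b = [], []
    for (lx, ly), beta in zip(landmarks, bearings):
        phi = heading + beta  # global bearing from robot to landmark
        # Robot position p satisfies:
        #   sin(phi)*px - cos(phi)*py = sin(phi)*lx - cos(phi)*ly
        A.append([np.sin(phi), -np.cos(phi)])
        b.append(np.sin(phi) * lx - np.cos(phi) * ly)
    (x, y), *_ = np.linalg.lstsq(np.asarray(A), np.asarray(b), rcond=None)
    return x, y

# Hypothetical check: a robot at (2, 1) with heading 0 observing two landmarks.
landmarks = [(0.0, 4.0), (5.0, 0.0)]
bearings = [np.arctan2(3.0, -2.0), np.arctan2(-1.0, 3.0)]
print(locate_robot(landmarks, bearings, heading=0.0))  # ~ (2.0, 1.0)
```

With exactly two landmarks the two bearing lines give a unique intersection; additional landmarks over-determine the system, and the least-squares solution averages out measurement noise.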


    Title:

    Self-location for indoor navigation of autonomous vehicles


    Contributors:

    Stella, E. / Cicirelli, G. / Branca, A. et al.


    Conference:

    Enhanced and Synthetic Vision 1998 ; 1998 ; Orlando, FL, USA


    Published in:

    Publication date:

    1998-07-30


    Type of media:

    Conference paper


    Type of material:

    Electronic Resource


    Language:

    English



    Self-location for indoor navigation of autonomous vehicles [3364-33]

    Stella, E. / Cicirelli, G. / Branca, A. et al. | British Library Conference Proceedings | 1998


    Autonomous indoor 3D navigation

    Vasin, Y. G. / Osipov, M. P. / Egorov, A. A. et al. | British Library Online Contents | 2015


    Indoor Autonomous Navigation System

    AMARASEKARA MELANIE / BERNARD MARC ALLEN | European Patent Office | 2017

    Free access

    Self-Contained Autonomous Indoor Flight with Ranging Sensor Navigation

    Chowdhary, Girish / Sobers, D. Michael / Pravitra, Chintasid et al. | AIAA | 2012


    Autonomous Systems: Indoor Drone Navigation

    Iyer, Aswin / Narayan, Santosh / M, Naren et al. | ArXiv | 2023

    Free access