Accurate position estimation is a fundamental requirement for mobile robot navigation. The positioning problem consists of maintaining, in real time, a reliable estimate of the robot's location with respect to a reference frame in the environment. A fast landmark-based position estimation method is presented in this paper. The technique combines the orientation of the mobile robot obtained from a heading sensor (a compass) with observations of landmarks from a vision sensor (a CCD camera). Knowing the positions of the landmarks in a fixed coordinate system and the orientation of the camera's optical axis, it is possible to recover the robot position by simple geometric considerations. Experiments carried out in our laboratory demonstrate the reliability of the method and suggest its applicability in the context of autonomous robot navigation.
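
The abstract only sketches the geometry, so the following minimal Python sketch illustrates how a bearing-only position fix of this kind might be computed. It assumes a pinhole camera whose optical axis is aligned with the compass heading and at least two landmarks with known map coordinates; the function names, pixel model, and sign conventions are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def bearing_to_landmark(u, image_width, fov_rad, heading_rad):
    # World-frame bearing of a landmark seen at horizontal pixel u, assuming a
    # pinhole camera whose optical axis is aligned with the compass heading.
    # (Pixel model and sign convention are illustrative assumptions.)
    offset = np.arctan((2.0 * u / image_width - 1.0) * np.tan(fov_rad / 2.0))
    return heading_rad - offset  # pixels right of centre rotate the bearing clockwise

def locate_robot(landmarks_xy, bearings_rad):
    # Each bearing b constrains the robot (x, y) to a line through landmark (xl, yl):
    #   sin(b) * x - cos(b) * y = sin(b) * xl - cos(b) * yl
    # Two or more non-parallel bearings fix the position; solve in least squares.
    A, rhs = [], []
    for (xl, yl), b in zip(landmarks_xy, bearings_rad):
        A.append([np.sin(b), -np.cos(b)])
        rhs.append(np.sin(b) * xl - np.cos(b) * yl)
    position, *_ = np.linalg.lstsq(np.array(A), np.array(rhs), rcond=None)
    return position  # estimated (x, y) of the robot in map coordinates

# Example: two landmarks at known map positions, bearings computed from a true pose.
landmarks = [(5.0, 0.0), (0.0, 5.0)]
true_xy = np.array([1.0, 1.0])
bearings = [np.arctan2(yl - true_xy[1], xl - true_xy[0]) for xl, yl in landmarks]
print(locate_robot(landmarks, bearings))  # ~ [1. 1.]
```

With more than two landmarks the same least-squares formulation averages out bearing noise, which is one plausible reason for combining several landmark observations per fix.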





    Title:

    Self-location for indoor navigation of autonomous vehicles


    Contributors:

    Stella, E. / Cicirelli, G. / Branca, A. et al.

    Conference:

    Enhanced and Synthetic Vision 1998 ; 1998 ; Orlando, FL, USA


    Published in:

    Publication date:

    30 July 1998





    Media type:

    Article (conference)


    Format:

    Electronic resource


    Language:

    English



    Self-location for indoor navigation of autonomous vehicles [3364-33]

    Stella, E. / Cicirelli, G. / Branca, A. et al. | British Library Conference Proceedings | 1998


    Autonomous indoor 3D navigation

    Vasin, Y. G. / Osipov, M. P. / Egorov, A. A. et al. | British Library Online Contents | 2015


    Indoor Autonomous Navigation System

    AMARASEKARA MELANIE / BERNARD MARC ALLEN | Europäisches Patentamt | 2017

    Free access

    Self-Contained Autonomous Indoor Flight with Ranging Sensor Navigation

    Chowdhary, Girish / Sobers, D. Michael / Pravitra, Chintasid et al. | AIAA | 2012


    Autonomous Systems: Indoor Drone Navigation

    Iyer, Aswin / Narayan, Santosh / M, Naren et al. | ArXiv | 2023

    Free access