When moving in generic indoor environments, robotic platforms generally rely solely on onboard sensors to determine their position and orientation. The lack of absolute references, however, often introduces severe drift into the computed estimates, making autonomous operation hard to accomplish. This paper proposes a solution that mitigates these issues by combining two vision-based pose estimation techniques, one working in a relative coordinate system and one in an absolute coordinate system. In particular, unknown ground features in the images captured by the vertical camera of a mobile platform are processed by a vision-based odometry algorithm that estimates the relative frame-to-frame motion. The errors accumulated in this step are then corrected using artificial markers placed at known positions in the environment. Whenever a marker is framed, the robot can keep the drift bounded and, in addition, obtain the navigation commands needed for autonomous flight. The accuracy and robustness of the proposed technique are demonstrated through extensive experimental tests on an off-the-shelf quadrotor.
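
The scheme described in the abstract can be pictured with a short sketch. The Python/OpenCV code below is a minimal illustration, not the authors' implementation: it estimates planar frame-to-frame motion from ORB features on the ground images, integrates it into a drifting odometry pose, and snaps the estimate back to an absolute pose whenever a known ArUco-style marker is detected. The marker map (MARKER_POSES), the ground-plane scale (METERS_PER_PIXEL), and the simplification that a detected marker lies directly below the camera are all assumptions made here for illustration; the ArUco detector API shown assumes OpenCV 4.7 or newer.

    import math
    import numpy as np
    import cv2

    # Hypothetical map of marker id -> known world pose (x [m], y [m], yaw [rad]).
    MARKER_POSES = {0: (0.0, 0.0, 0.0), 1: (2.0, 0.0, 0.0)}

    # Assumed ground-plane scale at flight altitude: metres per pixel.
    METERS_PER_PIXEL = 0.002

    orb = cv2.ORB_create(500)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(aruco_dict)  # OpenCV >= 4.7 API

    def frame_to_frame_motion(prev_gray, curr_gray):
        """Estimate planar motion (dx, dy, dyaw) between two ground images."""
        kp1, des1 = orb.detectAndCompute(prev_gray, None)
        kp2, des2 = orb.detectAndCompute(curr_gray, None)
        if des1 is None or des2 is None:
            return None
        matches = matcher.match(des1, des2)
        if len(matches) < 10:
            return None
        p1 = np.float32([kp1[m.queryIdx].pt for m in matches])
        p2 = np.float32([kp2[m.trainIdx].pt for m in matches])
        M, _ = cv2.estimateAffinePartial2D(p1, p2, method=cv2.RANSAC)
        if M is None:
            return None
        dyaw = math.atan2(M[1, 0], M[0, 0])          # rotation of the similarity transform
        dx = M[0, 2] * METERS_PER_PIXEL              # translation converted to metres
        dy = M[1, 2] * METERS_PER_PIXEL
        return dx, dy, dyaw

    def integrate(pose, motion):
        """Accumulate the relative motion into the drifting odometry estimate."""
        dx, dy, dyaw = motion
        c, s = math.cos(pose[2]), math.sin(pose[2])
        pose[0] += c * dx - s * dy
        pose[1] += s * dx + c * dy
        pose[2] += dyaw
        return pose

    def correct_with_marker(pose, gray):
        """If a known marker is visible, reset the estimate to its absolute pose."""
        corners, ids, _ = detector.detectMarkers(gray)
        if ids is None:
            return pose
        for marker_id in ids.flatten():
            if int(marker_id) in MARKER_POSES:
                # Simplification: the marker is assumed to be directly under the
                # camera, so its known world pose replaces the drifted estimate.
                pose[:] = MARKER_POSES[int(marker_id)]
                break
        return pose

In a processing loop, each new grayscale frame would first be fed to frame_to_frame_motion together with the previous frame, the result integrated into the pose, and correct_with_marker applied afterwards so that drift is discarded whenever an absolute reference is in view.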





Title: Mixed marker-based/marker-less visual odometry system for mobile robots

Publication date: 2013-01-01

Type of media: Article (Journal)

Type of material: Electronic Resource

Language: English

Classification: DDC 629



