Legged robots have recently received considerable attention for their maneuverability in diverse environments. Operating legged robots autonomously in harsh conditions such as bushes, mountains, and mud requires stable and accurate state estimation. However, estimating the robot's pose under such slippery and visually uninformative conditions is difficult. This paper therefore proposes a novel state estimation framework for legged robots. The proposed algorithm employs a deep inertial-joint factor derived from an inertial-joint network. Real-world experiments validate that the deep inertial-joint factor can improve the performance of the state estimator.
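
The abstract above describes a pipeline in which an inertial-joint network produces a "deep inertial-joint factor" that is fused into the legged robot's state estimator. The paper's actual network architecture, inputs, and factor formulation are not reproduced here; the PyTorch snippet below is only a minimal sketch of the general idea under assumed details (a GRU over concatenated IMU and joint-encoder windows, a 3-D relative-displacement output with a learned per-axis variance, and a whitened residual suitable for a pose-graph or sliding-window optimizer). All names and dimensions are hypothetical.

import torch
import torch.nn as nn


class InertialJointNet(nn.Module):
    """Regresses a body-frame relative displacement and its uncertainty
    from a short window of IMU and joint-encoder readings (assumed layout)."""

    def __init__(self, imu_dim=6, joint_dim=12, hidden=128):
        super().__init__()
        self.gru = nn.GRU(imu_dim + joint_dim, hidden, batch_first=True)
        self.head_disp = nn.Linear(hidden, 3)     # predicted relative translation
        self.head_logvar = nn.Linear(hidden, 3)   # predicted per-axis log-variance

    def forward(self, imu_seq, joint_seq):
        # imu_seq: (B, T, 6) gyro + accel; joint_seq: (B, T, 12) joint angles
        x = torch.cat([imu_seq, joint_seq], dim=-1)
        _, h = self.gru(x)              # h: (num_layers, B, hidden)
        h = h[-1]                       # last layer's final hidden state
        return self.head_disp(h), self.head_logvar(h)


def deep_inertial_joint_residual(pos_i, pos_j, pred_disp, pred_logvar):
    """Whitened residual between the displacement implied by two estimated
    positions and the network prediction (rotation handling omitted for brevity)."""
    measured = pos_j - pos_i
    sigma = torch.exp(0.5 * pred_logvar)
    return (measured - pred_disp) / sigma


if __name__ == "__main__":
    net = InertialJointNet()
    imu = torch.randn(1, 50, 6)         # 50 IMU samples in the window
    joints = torch.randn(1, 50, 12)     # matching joint-encoder samples
    disp, logvar = net(imu, joints)
    r = deep_inertial_joint_residual(torch.zeros(1, 3), torch.ones(1, 3), disp, logvar)
    print(disp.shape, r.shape)          # torch.Size([1, 3]) torch.Size([1, 3])

In a factor-graph back end, such a residual would typically sit alongside conventional IMU preintegration and leg-kinematics factors; whitening by the predicted standard deviation lets the optimizer downweight the learned measurement when the network is uncertain.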





    Title:

    Inertial-Joint Learning-Aided Robust Pose Estimation for Legged Robots


    Additional title:

    Lecture Notes in Networks and Systems


    Contributors:

    Conference:

    International Conference on Robot Intelligence Technology and Applications; Taicang; December 6-8, 2023


    Publication date:

    2024-11-22


    Size:

    7 pages


    Type of media:

    Article/Chapter (Book)


    Type of material:

    Electronic Resource


    Language:

    English




Similar titles:

    A sensor fusion method for pose estimation of C-legged robots

    León, Jorge de / Cebolla, Raúl / Barrientos, Antonio | BASE | 2020

    Free access

    Learning inertial odometry for dynamic legged robot state estimation

    Buchanan, R / Camurri, M / Dellaert, F et al. | BASE | 2022

    Free access

    Vision-Aided Inertial Navigation for Pose Estimation of Aerial Vehicles

    Saeedi, S. / Samadzadegan, F. / El-Sheimy, N. et al. | British Library Conference Proceedings | 2009


    Legged robots and methods for controlling legged robots

    LI RONGZHONG | European Patent Office | 2023

    Free access

    Legged robots and methods for controlling legged robots

    LI RONGZHONG | European Patent Office | 2019

    Free access