Safe navigation under resource constraints is a key concern for autonomous planetary rovers operating on extraterrestrial bodies. Computational power in such applications is typically constrained by radiation-hardness and energy-consumption requirements. For example, even though the microprocessors used on the Mars Science Laboratory (MSL) rover are an order of magnitude more powerful than those used on the Mars Exploration Rovers (MER), their computational power is still significantly less than that of contemporary desktop microprocessors. It is therefore important to move safely and efficiently through the environment while consuming a minimum of computational resources, energy, and time. Perception, pose estimation, and motion planning are generally three of the most computationally expensive processes in modern autonomous navigation architectures. On MER, for example, each rover must stop, acquire and process imagery to evaluate its surroundings, estimate the relative change in pose, and generate the next mobility-system maneuver [1]. This paper describes improvements in the energy efficiency and speed of planetary rover autonomous traverse, accomplished by moving processes typically performed on the CPU to a Field Programmable Gate Array (FPGA) coprocessor. Perception algorithms in general are well suited to FPGA implementation because much of the processing is naturally parallelizable. In this paper we present novel implementations of stereo and visual odometry algorithms on an FPGA. The FPGA stereo implementation extends [2] with "random in, linear out" rectification and a higher-performance interface between the rectification, filter, and disparity stages of the stereo pipeline. The improved visual odometry component uses an FPGA implementation of a Harris feature detector and a sum-of-absolute-differences (SAD) operator. The FPGA implementations of the stereo and visual odometry functionality have demonstrated a performance improvement of approximately three orders of magnitude over MER-class avionics. These more efficient perception and pose estimation modules have been merged with motion planning techniques that allow continuous steering and driving, so that cluttered obstacle fields can be navigated without stopping to perceive. The resulting faster visual odometry rates also allow wheel slip to be detected earlier and more reliably. Predictions of the resulting improvements in planetary rover energy efficiency and average traverse speed are reported. In addition, field results are presented that compare the performance of autonomous navigation on the Athena planetary rover prototype using continuous steering alone and using continuous steering and driving, in both cases with GESTALT traversability analysis and the FPGA perception and pose estimation improvements.
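As background for the correlation stage described above, the following is a minimal CPU reference sketch of sum-of-absolute-differences (SAD) block matching over a rectified stereo pair. It is not the paper's FPGA implementation; the function name and parameters (sadDisparity, maxDisparity, window) are illustrative assumptions. The per-disparity inner loop is the portion an FPGA evaluates concurrently, one pipeline per candidate disparity, which is why this stage maps so naturally to hardware.

    #include <cstddef>
    #include <cstdint>
    #include <cstdlib>
    #include <limits>
    #include <vector>

    // CPU reference sketch of SAD block matching on rectified, row-major,
    // 8-bit grayscale images. The window size is assumed odd (e.g., 7).
    // An FPGA evaluates the candidate disparities concurrently; here they
    // run sequentially for clarity.
    std::vector<uint8_t> sadDisparity(const std::vector<uint8_t>& left,
                                      const std::vector<uint8_t>& right,
                                      int width, int height,
                                      int maxDisparity, int window)
    {
        std::vector<uint8_t> disp(static_cast<size_t>(width) * height, 0);
        const int half = window / 2;

        for (int y = half; y < height - half; ++y) {
            for (int x = half + maxDisparity; x < width - half; ++x) {
                int bestD = 0;
                int bestCost = std::numeric_limits<int>::max();
                for (int d = 0; d <= maxDisparity; ++d) {
                    // Sum of absolute differences over the correlation window,
                    // comparing the left patch to the right patch shifted by d.
                    int cost = 0;
                    for (int dy = -half; dy <= half; ++dy)
                        for (int dx = -half; dx <= half; ++dx)
                            cost += std::abs(
                                int(left[(y + dy) * width + (x + dx)]) -
                                int(right[(y + dy) * width + (x + dx - d)]));
                    if (cost < bestCost) { bestCost = cost; bestD = d; }
                }
                disp[y * width + x] = static_cast<uint8_t>(bestD);
            }
        }
        return disp;
    }

The same SAD operator, applied around Harris feature locations rather than densely over the whole image, is the matching primitive the abstract describes for the visual odometry component.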


    Title: Enabling continuous planetary rover navigation through FPGA stereo and visual odometry

    Publication date: 2012-03-01

    Size: 1235567 bytes

    Type of media: Conference paper

    Type of material: Electronic Resource

    Language: English


    Similar items:

    Stereo-Based Visual Odometry for Autonomous Robot Navigation
    Kostavelis, Ioannis / Boukas, Evangelos / Nalpantidis, Lazaros et al. | BASE | 2016


    Stereo vision and rover navigation software for planetary exploration

    Goldberg, S.B. / Maimone, M.W. / Matthies, L. | IEEE | 2002


    7.0501 Stereo Vision and Rover Navigation Software for Planetary Exploration

    Institute of Electrical and Electronics Engineers | British Library Conference Proceedings | 2002


    Communicationless navigation through robust visual odometry

    Van Hamme, David / Veelaert, Peter / Philips, Wilfried | IEEE | 2012


    Evaluation of a Stereo Visual Odometry Algorithm for Passenger Vehicle Navigation

    Aladem, Mohamed / Rawashdeh, Samir / Rawashdeh, Nathir | British Library Conference Proceedings | 2017