A method of matching unequally spaced height maps is described. The method would be useful for a Mars rover that refines its position estimate by matching data from its stereo vision system or laser rangefinder to data obtained from a camera orbiting Mars. The refined position can then be used to merge the two data sets with proper registration. The method is designed to make full use of the information contained in the data, including accuracy estimates in the form of covariance matrices and reliability estimates in the form of probabilities. Means of extracting this information from stereo vision are discussed. The terrain-matching process uses a coarse-to-fine strategy and includes automatic editing to remove points with excessive disagreement. An example using real data from an experimental vehicle is presented.
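As a rough illustration of the ideas in the abstract only, and not Gennery's actual formulation, the sketch below shows one way covariance-weighted, coarse-to-fine height-map matching with automatic outlier editing could be structured. Every function and parameter name is hypothetical; the alignment is restricted to a horizontal-plus-vertical shift, and a simple grid search stands in for the paper's estimator.

```python
import numpy as np

def bilinear(dem, x, y):
    """Bilinearly interpolate a regular height grid dem[row, col] at
    fractional (x, y) coordinates (x indexes columns, y indexes rows)."""
    x0 = np.clip(np.floor(x).astype(int), 0, dem.shape[1] - 2)
    y0 = np.clip(np.floor(y).astype(int), 0, dem.shape[0] - 2)
    fx, fy = x - x0, y - y0
    return ((1 - fx) * (1 - fy) * dem[y0, x0]
            + fx * (1 - fy) * dem[y0, x0 + 1]
            + (1 - fx) * fy * dem[y0 + 1, x0]
            + fx * fy * dem[y0 + 1, x0 + 1])

def match_height_maps(orbit_dem, pts_xy, pts_z, var_z, reliability,
                      search=8.0, levels=3, edit_threshold=9.0):
    """Estimate a shift (dx, dy, dz) aligning rover points to an orbital DEM.

    orbit_dem   -- 2-D array of orbital heights on a unit grid
    pts_xy      -- (N, 2) rover point positions, in DEM grid units
    pts_z       -- (N,) rover heights at those positions
    var_z       -- (N,) height variances (accuracy estimates)
    reliability -- (N,) probabilities that each point is valid
    """
    keep = np.ones(len(pts_z), dtype=bool)
    dx = dy = dz = 0.0
    step = search / 2.0
    for _ in range(levels):                      # coarse-to-fine refinement
        best = (np.inf, dx, dy, dz)
        for cx in dx + step * np.arange(-2, 3):  # 5x5 search about current fix
            for cy in dy + step * np.arange(-2, 3):
                z = bilinear(orbit_dem, pts_xy[keep, 0] + cx,
                             pts_xy[keep, 1] + cy)
                r = z - pts_z[keep]
                w = reliability[keep] / var_z[keep]  # accuracy/reliability weights
                cz = np.sum(w * r) / np.sum(w)       # weighted LS vertical offset
                cost = np.sum(w * (r - cz) ** 2) / np.sum(w)
                if cost < best[0]:
                    best = (cost, cx, cy, cz)
        _, dx, dy, dz = best
        # Automatic editing: drop points whose normalized squared residual
        # exceeds the threshold at the current alignment.
        z = bilinear(orbit_dem, pts_xy[:, 0] + dx, pts_xy[:, 1] + dy)
        keep = (z - pts_z - dz) ** 2 / var_z < edit_threshold
        step /= 2.0                              # finer search at the next level
    return dx, dy, dz, keep
```

A full system along the lines the abstract describes would also have to handle unequally spaced points on both sides, use full covariance matrices rather than scalar height variances, and propagate the uncertainty of the resulting position fix; this sketch omits all of that for brevity.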


    Title:

    Visual terrain matching for a Mars rover


    Contributors:
    Gennery, D.B. (author)


    Publication date:

    01.01.1989


    Format / Extent:

    1171653 bytes


    Media type:

    Conference paper


    Format:

    Electronic resource


    Language:

    English



    Similar titles:

    Visual terrain matching for a Mars rover

    Gennery, D.B. | Tema Archiv | 1989


    Field trial results of planetary rover visual motion estimation in Mars analogue terrain

    Bakambu, J. N. / Langley, C. / Pushpanathan, G. et al. | British Library Online Contents | 2012


    SPOC: Deep Learning-Based Terrain Classification for Mars Rover Missions

    Rothrock, Brandon / Papon, Jeremie / Kennedy, Ryan et al. | NTRS | 2016


    Mars synthetic terrain generation and rover mission simulation using supercomputers

    Curkendall, D. / Block, G. / Husman, L. E. | NTRS | 2001


    SPOC: Deep Learning-based Terrain Classification for Mars Rover Missions

    Rothrock, Brandon / Kennedy, Ryan / Cunningham, Chris et al. | AIAA | 2016