In this paper we describe a method for the automatic determination of sensor pose (position and orientation) relative to a 3D landmark or scene model. The method is based on geometrical matching of 2D image structures with projected elements of the associated 3D model. For structural image analysis and scene interpretation, a blackboard-based production system is used, resulting in a symbolic description of the image data. Knowledge of the approximate sensor pose, measured for example by IMU or GPS, makes it possible to estimate an expected model projection that is used to solve the correspondence problem between image structures and model elements. These correspondences are a prerequisite for the pose computation, which is carried out by nonlinear numerical optimization algorithms. We demonstrate the efficiency of the proposed method with navigation updates in two scenarios, a bridge approach and a flight over an urban area, where the data were acquired with airborne infrared sensors in a high-oblique view. In doing so we simulated image-based navigation for target engagement and midcourse guidance suited to the concepts of future autonomous systems such as missiles and drones.
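
The pose computation outlined in the abstract, refining an approximate IMU/GPS pose by minimizing the reprojection error of 2D-3D correspondences with a nonlinear numerical optimizer, can be sketched as follows. This is a minimal illustration only, not the authors' implementation: the pinhole camera model, the Rodrigues rotation parameterization, the Levenberg-Marquardt solver from SciPy, and all names and numbers are assumptions made for the example.

```python
# Sketch of pose refinement from 2D-3D correspondences (illustrative, not the paper's code).
import numpy as np
from scipy.optimize import least_squares


def rodrigues(rvec):
    """Rotation matrix from an axis-angle vector (Rodrigues formula)."""
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = rvec / theta
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)


def project(points_3d, rvec, tvec, fx, fy, cx, cy):
    """Project 3D model points into the image with a pinhole camera model."""
    pc = points_3d @ rodrigues(rvec).T + tvec          # world -> camera coordinates
    u = fx * pc[:, 0] / pc[:, 2] + cx
    v = fy * pc[:, 1] / pc[:, 2] + cy
    return np.column_stack([u, v])


def refine_pose(points_3d, points_2d, pose0, intrinsics):
    """Refine a 6-DoF pose (rvec, tvec) by minimizing 2D reprojection error."""
    def residuals(pose):
        proj = project(points_3d, pose[:3], pose[3:], *intrinsics)
        return (proj - points_2d).ravel()
    result = least_squares(residuals, pose0, method="lm")  # Levenberg-Marquardt
    return result.x


if __name__ == "__main__":
    # Synthetic example: four matched model points of a bridge-like structure (metres).
    model = np.array([[0., 0., 50.], [10., 0., 50.], [10., 5., 60.], [0., 5., 60.]])
    intrinsics = (800.0, 800.0, 320.0, 240.0)            # fx, fy, cx, cy (pixels)
    true_pose = np.array([0.05, -0.02, 0.01, 1.0, -2.0, 5.0])
    observed = project(model, true_pose[:3], true_pose[3:], *intrinsics)
    coarse_pose = true_pose + 0.1                        # stands in for the IMU/GPS estimate
    refined = refine_pose(model, observed, coarse_pose, intrinsics)
    print("refined pose:", np.round(refined, 3))
```

In practice the initial pose would come from the IMU/GPS measurement and the 2D points from the symbolic image description produced by the blackboard-based production system; the sketch only shows the numerical refinement step.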



    Title: Image-based 3D scene analysis for navigation of autonomous airborne systems

    Contributors: Jaeger, K. (author) / Bers, K.-H. (author)

    Conference: 2001

    Publication date: 2001

    Size: 11 pages

    Type of media: Conference paper

    Type of material: Electronic Resource

    Language: English




    AUTONOMOUS AIRBORNE VIDEO-AIDED NAVIGATION

    Lee, Kyungsuk | Online Contents | 2010




    AUTONOMOUS AIRBORNE MISSION NAVIGATION AND TASKING SYSTEM

    BIHL TREVOR / COX CHADWICK | European Patent Office | 2023
