In this paper we describe a method for the automatic determination of sensor pose (position and orientation) relative to a 3D landmark or scene model. The method is based on geometric matching of 2D image structures with projected elements of the associated 3D model. For structural image analysis and scene interpretation, a blackboard-based production system is used, resulting in a symbolic description of the image data. Knowledge of the approximate sensor pose, measured for example by IMU or GPS, is used to estimate an expected model projection, which serves to solve the correspondence problem between image structures and model elements. These correspondences are a prerequisite for the pose computation, which is carried out by nonlinear numerical optimization algorithms. We demonstrate the efficiency of the proposed method with navigation updates during a bridge approach and a flight over an urban area, where the data were acquired with airborne infrared sensors in high oblique view. In doing so we simulated image-based navigation for target engagement and midcourse guidance, suited to the concepts of future autonomous systems such as missiles and drones.
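The paper itself includes no source code, but the pipeline described in the abstract — project the 3D scene model with an approximate sensor pose from IMU/GPS, associate projected model elements with extracted 2D image structures, then refine the pose by nonlinear optimization of the reprojection error — can be sketched roughly as follows. This is a minimal illustration under assumed names (`project`, `refine_pose`) and a simple pinhole camera model; it is not the authors' implementation and it omits the blackboard-based structural image analysis stage entirely.

```python
# Hypothetical sketch (not the authors' code): pose refinement from 2D-3D
# correspondences by nonlinear least squares, assuming a pinhole camera.
import numpy as np
from scipy.optimize import least_squares

def rodrigues(rvec):
    """Axis-angle vector -> 3x3 rotation matrix (Rodrigues' formula)."""
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = rvec / theta
    K = np.array([[0, -k[2], k[1]],
                  [k[2], 0, -k[0]],
                  [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def project(points_3d, rvec, tvec, fx, fy, cx, cy):
    """Project 3D model points into the image with a pinhole camera."""
    Xc = points_3d @ rodrigues(rvec).T + tvec   # world -> camera frame
    u = fx * Xc[:, 0] / Xc[:, 2] + cx
    v = fy * Xc[:, 1] / Xc[:, 2] + cy
    return np.column_stack([u, v])

def refine_pose(points_3d, points_2d, pose0, intrinsics):
    """Refine pose [rvec | tvec] by minimizing reprojection error
    with a Levenberg-Marquardt solver, starting from the coarse prior."""
    fx, fy, cx, cy = intrinsics

    def residuals(p):
        proj = project(points_3d, p[:3], p[3:], fx, fy, cx, cy)
        return (proj - points_2d).ravel()

    result = least_squares(residuals, pose0, method="lm")
    return result.x[:3], result.x[3:]

if __name__ == "__main__":
    # Synthetic example: a known pose is recovered from its own projections,
    # starting from a perturbed initial guess (standing in for an IMU/GPS prior).
    rng = np.random.default_rng(0)
    model = rng.uniform([-10, -10, 40], [10, 10, 60], size=(20, 3))
    true_pose = np.array([0.05, -0.02, 0.01, 1.0, -2.0, 5.0])
    K = (800.0, 800.0, 320.0, 240.0)
    img_pts = project(model, true_pose[:3], true_pose[3:], *K)
    rvec, tvec = refine_pose(model, img_pts, true_pose + 0.1, K)
    print("refined rotation:", rvec, "translation:", tvec)
```

In practice the 2D measurements would come from the matched image structures rather than synthetic projections, and a robust loss or outlier rejection step would typically guard the optimization against wrong correspondences.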





Title : Image-based 3D scene analysis for navigation of autonomous airborne systems

Contributors : Jaeger, K. / Bers, K.-H.

Conference : Intelligent Robots and Computer Vision XX: Algorithms, Techniques, and Active Vision ; 2001 ; Boston, MA, United States

Publication date : 2001-10-05

Type of media : Conference paper

Type of material : Electronic Resource

Language : English



Similar titles :

Image-based 3D scene analysis for navigation of autonomous airborne systems
Jaeger, K. / Bers, K.-H. | Fraunhofer Publica | 2001


    AUTONOMOUS AIRBORNE VIDEO-AIDED NAVIGATION

    Lee, Kyungsuk | Online Contents | 2010




    AUTONOMOUS AIRBORNE MISSION NAVIGATION AND TASKING SYSTEM

    BIHL TREVOR / COX CHADWICK | European Patent Office | 2023
