In this paper we describe a method for automatically determining the sensor pose (position and orientation) relative to a 3D landmark or scene model. The method is based on geometrical matching of 2D image structures with projected elements of the associated 3D model. For structural image analysis and scene interpretation, a blackboard-based production system is used, yielding a symbolic description of the image data. Knowledge of the approximate sensor pose, measured for example by an IMU or GPS, allows an expected model projection to be estimated, which is used to solve the correspondence problem between image structures and model elements. These correspondences are the prerequisite for the pose computation, which is carried out by nonlinear numerical optimization algorithms. We demonstrate the efficiency of the proposed method with navigation updates in a bridge-approach scenario and during a flight over an urban area, where the data were acquired with airborne infrared sensors in a highly oblique view. In doing so we simulated image-based navigation for target engagement and midcourse guidance, suited to the concepts of future autonomous systems such as missiles and drones.
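
The pose computation described in the abstract amounts to minimising the reprojection error of the matched 2D image structures against the projected 3D model elements, starting from the approximate IMU/GPS pose. The following Python sketch illustrates this step under a simple pinhole camera model; the function names, pose parametrisation and solver choice are assumptions for illustration, not the authors' implementation.

import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def project(points_3d, rvec, tvec, focal_length):
    # Rotate and translate model points into the camera frame, then apply
    # a pinhole projection (principal point assumed at the image origin).
    cam_pts = Rotation.from_rotvec(rvec).apply(points_3d) + tvec
    return focal_length * cam_pts[:, :2] / cam_pts[:, 2:3]

def refine_pose(points_3d, points_2d, rvec0, tvec0, focal_length=1000.0):
    # Refine the approximate pose (rvec0, tvec0) by nonlinear least squares
    # over the reprojection residuals of the 2D-3D correspondences.
    def residuals(params):
        rvec, tvec = params[:3], params[3:]
        return (project(points_3d, rvec, tvec, focal_length) - points_2d).ravel()
    x0 = np.concatenate([rvec0, tvec0])
    result = least_squares(residuals, x0, method="lm")  # Levenberg-Marquardt
    return result.x[:3], result.x[3:]

With a handful of correct correspondences and a reasonable initial pose, such a solver converges to the refined sensor position and orientation used for the navigation update.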


    Title:
    Image-based 3D scene analysis for navigation of autonomous airborne systems

    Contributors:
    Jaeger, K. (author) / Bers, K.-H. (author)

    Conference:
    2001

    Publication date:
    2001

    Format / extent:
    11 pages

    Media type:
    Conference paper

    Format:
    Electronic resource

    Language:
    English




    AUTONOMOUS AIRBORNE VIDEO-AIDED NAVIGATION
    Lee, Kyungsuk | Online Contents | 2010

    AUTONOMOUS AIRBORNE MISSION NAVIGATION AND TASKING SYSTEM
    BIHL TREVOR / COX CHADWICK | Europäisches Patentamt | 2023
    Free access