Under National Ground Intelligence Center (NGIC) sponsorship, BBN Technologies (BBN) has developed a physical model and a software system to estimate the range and orientation of a target from video images and known information about the target and video camera. This capability may be exploited in an unattended ground sensor field to enhance situation assessment for subsequent response decisions. The target range and orientation estimation system (Video Range Finder) is based on an interactive procedure enabling the user to associate identifiable features of the video image with physical components of the target. These points define a three-dimensional polyhedron in the coordinate frame of the target, which is projected onto a two-dimensional polygon at the focal plane of the video camera. The associated projection equations are nonlinear with respect to the azimuth, elevation, and viewing-axis rotation angles of the video camera; therefore, a solution for these angles is obtained numerically. The calculated angles are subsequently used in the estimation of the target range. The paper discusses the physical model, describes the algorithm for estimating target range and orientation, and demonstrates its application using both simulated and real objects.
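The pose-estimation step described in the abstract can be illustrated with a small numerical sketch. The Python code below (numpy and scipy are an assumed toolchain; the paper does not specify one) projects known target-frame feature points through a pinhole camera model and recovers the camera azimuth, elevation, and viewing-axis rotation angles together with the range by nonlinear least squares on the reprojection error. The Euler-angle convention, focal length, box-shaped target, and solver choice are illustrative assumptions, not the paper's actual formulation.

# Hypothetical sketch of the pose-from-point-correspondences idea in the abstract.
# Target-frame feature points are projected through a pinhole camera model, and the
# camera azimuth/elevation/roll angles plus the range are recovered numerically.
# Parameterization, solver, and all numbers are illustrative assumptions.
import numpy as np
from scipy.optimize import least_squares

FOCAL_LENGTH = 0.05  # metres; assumed known camera focal length


def rotation_matrix(azimuth, elevation, roll):
    """Rotation from the target frame to the camera frame (Z-Y-X convention, assumed)."""
    ca, sa = np.cos(azimuth), np.sin(azimuth)
    ce, se = np.cos(elevation), np.sin(elevation)
    cr, sr = np.cos(roll), np.sin(roll)
    rz = np.array([[ca, -sa, 0], [sa, ca, 0], [0, 0, 1]])
    ry = np.array([[ce, 0, se], [0, 1, 0], [-se, 0, ce]])
    rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return rx @ ry @ rz


def project(params, target_points):
    """Pinhole projection of 3-D target-frame points onto the image plane."""
    azimuth, elevation, roll, rng = params
    cam_points = target_points @ rotation_matrix(azimuth, elevation, roll).T
    cam_points[:, 2] += rng  # place the target at the given range along the viewing axis
    # Perspective division makes the equations nonlinear in the angles,
    # hence the numerical solution.
    u = FOCAL_LENGTH * cam_points[:, 0] / cam_points[:, 2]
    v = FOCAL_LENGTH * cam_points[:, 1] / cam_points[:, 2]
    return np.column_stack([u, v])


def residuals(params, target_points, image_points):
    """Reprojection error between predicted and observed image points."""
    return (project(params, target_points) - image_points).ravel()


# Illustrative data: corners of a 6 m x 3 m x 2 m box standing in for the target.
target_points = np.array(
    [[x, y, z] for x in (0.0, 6.0) for y in (0.0, 3.0) for z in (0.0, 2.0)]
)
true_params = np.array([0.4, 0.1, 0.02, 150.0])  # azimuth, elevation, roll [rad], range [m]
image_points = project(true_params, target_points)  # stands in for user-selected pixels

initial_guess = np.array([0.0, 0.0, 0.0, 100.0])
solution = least_squares(residuals, initial_guess, args=(target_points, image_points))
print("estimated azimuth, elevation, roll, range:", solution.x)

In this sketch the range is solved jointly with the angles for simplicity; the abstract describes solving for the angles first and then estimating range, so this is a simplification of the described two-step procedure.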





    Title :

    Estimation of target range and orientation from video images for tactical unattended sensor applications


    Contributors:


    Publication date :

    2002


    Size :

    7 pages, 1 source




    Type of media :

    Conference paper


    Type of material :

    Print


    Language :

    English




    Similar titles :

    Unattended ship system with target recognition function

    KONG JIANHUA / HUANG JINSHA / XU GUOLI et al. | European Patent Office | 2022



    Low-cost miniature unattended RF sensor suite

    Wild, N.C. / Coakley, P.G. / Doft, F. et al. | Tema Archive | 2001



    Unattended system

    LI JI | European Patent Office | 2023
