We introduce a motion estimation algorithm that fuses visual and range data to give an unambiguous estimate of the velocity of objects visible to a camera and range sensor. Dynamic scale space is used to avoid temporal aliasing, and a novel robust estimator based on Least Trimmed Squares is used to smooth results between boundaries established using range data. Simulation results (from a specially developed simulation environment) and experimental results (from an FPGA-based implementation of our algorithm) show that our approach gives accurate motion estimates.
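The paper's full text is not part of this record, so the sketch below is only a loose illustration of the Least Trimmed Squares idea the abstract mentions: fit to the subset of samples with the smallest residuals, so that values leaking in from neighbouring objects are trimmed away. The function name `lts_estimate`, the trim fraction, and the concentration-step scheme are assumptions for illustration and are not taken from the paper.

```python
import numpy as np

def lts_estimate(samples, trim_fraction=0.5, n_iter=20):
    """1-D Least Trimmed Squares location estimate via concentration steps:
    repeatedly refit to the h samples with the smallest residuals.
    (Illustrative only; not the paper's estimator.)"""
    samples = np.asarray(samples, dtype=float)
    h = int(np.ceil((1.0 - trim_fraction) * len(samples)))
    est = np.median(samples)                  # robust starting point
    for _ in range(n_iter):
        residuals = np.abs(samples - est)
        keep = np.argsort(residuals)[:h]      # h samples closest to current fit
        new_est = samples[keep].mean()        # least-squares fit on trimmed set
        if np.isclose(new_est, est):
            break
        est = new_est
    return est

# e.g. per-pixel motion samples inside one range-segmented region,
# contaminated by a few values from a neighbouring object
print(lts_estimate([2.1, 1.9, 2.0, 2.2, 2.0, 7.5, 7.8]))  # ~2.0
```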


    Title: Fusion of range and vision for real-time motion estimation

    Contributors: Kolodko, J. (author) / Vlacic, L. (author)

    Publication date: 2004-01-01

    Size: 651919 bytes

    Type of media: Conference paper

    Type of material: Electronic Resource

    Language: English



    Fusion of Range and Vision for Real-Time Motion Estimation

    Kolodko, J. / Vlacic, L. / IEEE | British Library Conference Proceedings | 2004



    Tracking Vision: A real-time motion tracking system

    British Library Online Contents | 1998


    A Real-Time Vision-Based 3D Motion Estimation System for Positioning and Trajectory Following

    Negahdaripour, S. / Jin, L. / Xu, X. et al. | British Library Conference Proceedings | 1996


    Stereoscopic and Range Sensor Fusion for Motion Estimation in Mobile Robotics

    Lherbier, R. / Tu, X. W. / Dubuisson, B. et al. | British Library Conference Proceedings | 1994