We introduce a motion estimation algorithm that fuses visual and range data to give an unambiguous estimate of the velocity of objects visible to both a camera and a range sensor. Dynamic scale space is used to avoid temporal aliasing, and a novel robust estimator based on Least Trimmed Squares is used to smooth results between boundaries established using the range data. Simulation results (from a specially developed simulation environment) and experimental results (from an FPGA-based implementation of our algorithm) show that our approach gives accurate motion estimates.
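The abstract does not give details of the estimator, so the following is only a minimal sketch of the Least Trimmed Squares idea it builds on, not the authors' method: it robustly estimates a single velocity value from noisy per-pixel estimates within one range-segmented region by minimising the sum of the h smallest squared residuals. The function name lts_location, the trimming fraction, and the example data are illustrative assumptions.

```python
import numpy as np

def lts_location(values, trim_fraction=0.5):
    """Least Trimmed Squares estimate of a scalar location parameter.

    Minimises the sum of the h smallest squared residuals, where
    h = ceil((1 - trim_fraction) * n). For a scalar location the exact
    optimum lies on a contiguous window of the sorted sample, so all
    such windows are scanned.
    """
    x = np.sort(np.asarray(values, dtype=float))
    n = x.size
    h = max(2, int(np.ceil((1.0 - trim_fraction) * n)))

    best_cost = np.inf
    best_est = x.mean()
    for start in range(n - h + 1):
        window = x[start:start + h]
        mu = window.mean()                 # candidate estimate from this window
        cost = np.sum((window - mu) ** 2)  # trimmed sum of squared residuals
        if cost < best_cost:
            best_cost = cost
            best_est = mu
    return best_est

# Hypothetical example: per-pixel velocity estimates inside one region
# delimited by range boundaries, contaminated by gross outliers.
rng = np.random.default_rng(0)
velocities = np.concatenate([rng.normal(2.0, 0.1, 40),    # inliers, ~2 px/frame
                             rng.uniform(8.0, 12.0, 10)])  # outliers near the boundary
print(lts_location(velocities))  # close to 2.0, where a plain mean would be pulled upward
```

Because the trimming discards the largest residuals, estimates on either side of a range-established boundary do not bleed into each other, which is the property the abstract attributes to the LTS-based smoothing.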
Title: Fusion of Range and Vision for Real-Time Motion Estimation
Date: 2004-01-01
Size: 651,919 bytes
Type: Conference paper (electronic resource)
Language: English
Source: British Library Conference Proceedings, 2004
Related items:
Real-time one-dimensional motion estimation and its application in computer vision (British Library Online Contents, 2015)
Tracking Vision: A real-time motion tracking system (British Library Online Contents, 1998)
A Real-Time Vision-Based 3D Motion Estimation System for Positioning and Trajectory Following (British Library Conference Proceedings, 1996)
Stereoscopic and Range Sensor Fusion for Motion Estimation in Mobile Robotics (British Library Conference Proceedings, 1994)