Presented at ICRA 2014, held in Hong Kong, May 31 to June 7. ; In this paper we present an automated system that is able to track and grasp a moving object within the workspace of a manipulator, using range images acquired with a Microsoft Kinect sensor. Realtime tracking is achieved by a geometric particle filter on the affine group. Based on the tracked output, the pose of a 7-DoF WAM robotic arm is continuously updated using dynamic motor primitives until a distance measure between the tracked object and the gripper mounted on the arm falls below a threshold; the gripper then closes its three fingers and grasps the object. The tracker works in real time and is robust to noise and partial occlusions. Using only the depth data makes our tracker independent of texture, which is one of the key design goals of our approach. An experimental evaluation is provided, along with a comparison of the proposed tracker with state-of-the-art approaches, including the OpenNI tracker. The developed system is integrated with ROS and made available as part of IRI's ROS stack. ; This work was supported by the EU project IntellAct FP7-269959, the project PAU+ DPI2011-27510 and the project CINNOVA 201150E088. B. Dellen was supported by the Spanish Ministry for Science and Innovation via a Ramon y Cajal fellowship. ; Peer Reviewed
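The abstract describes a simple servoing loop: the particle filter estimates the object pose from each range image, the arm pose is updated with motor primitives, and the grasp is triggered once the gripper-to-object distance drops below a threshold. The Python sketch below illustrates only that loop structure; the helper names (tracker.update, arm.update_pose, arm.close_fingers) and the threshold value are hypothetical placeholders, not the authors' actual IRI ROS stack API.

import numpy as np

GRASP_DISTANCE_THRESHOLD = 0.03  # metres; illustrative value, not taken from the paper

def distance(object_pose, gripper_pose):
    # Euclidean distance between tracked object position and gripper position
    return np.linalg.norm(object_pose[:3] - gripper_pose[:3])

def track_and_grasp(depth_frames, tracker, arm):
    # tracker: hypothetical geometric particle filter on the affine group
    # arm: hypothetical 7-DoF arm interface driven by motor primitives
    for frame in depth_frames:
        object_pose = tracker.update(frame)           # pose estimate from the range image
        gripper_pose = arm.update_pose(object_pose)   # step the arm toward the tracked pose
        if distance(object_pose, gripper_pose) < GRASP_DISTANCE_THRESHOLD:
            arm.close_fingers()                       # grasp once the gripper is close enough
            return True
    return False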
Realtime tracking and grasping of a moving object from range video
2014-01-01
Article (Journal)
Electronic Resource
English
DDC: 629
Similar items:
Moving object tracking in video. IEEE, 2000
Moving Object Tracking in Video. British Library Conference Proceedings, 2000
A realtime object tracking system using a color camera. IEEE, 2001