Recognition of human hand gestures in industrial environments is gaining popularity, especially in the context of assistance systems, thanks to advances in deep learning-based vision methods. Head-worn devices with cameras are also becoming more common, especially for smart assistance using Extended Reality (XR) technology, including in industrial use cases. Employing sensors from head-worn devices such as the HoloLens enhances communication between human and robot, thereby enabling interaction through egocentric vision. This study investigates human-robot interaction through egocentric hand gesture recognition for commanding robots. A pipeline is developed to collect HoloLens video frames and detect hand landmarks on them using Google's MediaPipe library. A Long Short-Term Memory (LSTM) network is then trained to classify hand gestures from the detected landmark sequences in near real-time, and the recognized gestures can be translated into robot commands. We also present results on the network's performance and the implementation pipeline.
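The abstract gives no implementation details, but the pipeline it describes (MediaPipe hand landmarks fed into an LSTM sequence classifier) can be sketched roughly as below. Everything beyond what the abstract states is an assumption: the 30-frame window length, the example gesture vocabulary, the two-layer LSTM architecture, and the use of MediaPipe's legacy `solutions.hands` API. HoloLens frame acquisition is out of scope here and is represented by a generic iterable of BGR frames.

```python
# Minimal sketch of the described landmark-to-gesture pipeline.
# Window length, gesture set, and architecture are assumptions,
# not details taken from the paper.
from collections import deque

import cv2
import mediapipe as mp
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

SEQ_LEN = 30                                # frames per classified sequence (assumed)
FEATURES = 21 * 3                           # 21 MediaPipe hand landmarks x (x, y, z)
GESTURES = ["stop", "go", "left", "right"]  # hypothetical robot command set


def build_model() -> tf.keras.Model:
    """LSTM classifier over landmark sequences (architecture assumed)."""
    model = tf.keras.Sequential([
        layers.Input(shape=(SEQ_LEN, FEATURES)),
        layers.LSTM(64, return_sequences=True),
        layers.LSTM(32),
        layers.Dense(32, activation="relu"),
        layers.Dense(len(GESTURES), activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model


def landmarks_from_frame(hands, frame_bgr):
    """Run MediaPipe Hands on one BGR frame; return a flat 63-dim vector."""
    rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)
    result = hands.process(rgb)
    if not result.multi_hand_landmarks:
        return None                          # no hand detected in this frame
    lm = result.multi_hand_landmarks[0].landmark
    return np.array([[p.x, p.y, p.z] for p in lm], dtype=np.float32).flatten()


def classify_stream(frames, model):
    """Slide a SEQ_LEN window over incoming frames and yield gesture labels."""
    window = deque(maxlen=SEQ_LEN)
    with mp.solutions.hands.Hands(static_image_mode=False, max_num_hands=1,
                                  min_detection_confidence=0.5) as hands:
        for frame in frames:                 # e.g. frames streamed from HoloLens
            vec = landmarks_from_frame(hands, frame)
            if vec is None:
                continue
            window.append(vec)
            if len(window) == SEQ_LEN:
                probs = model.predict(np.array(window)[None, ...], verbose=0)
                yield GESTURES[int(np.argmax(probs))]
```

A design note on this kind of setup: because MediaPipe landmarks are normalized to the image frame, the classifier operates on resolution-independent coordinates, which is convenient when camera parameters vary across head-worn devices.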
Human-Robot Interaction Through Egocentric Hand Gesture Recognition
Lecture Notes in Mechanical Engineering
European Symposium on Artificial Intelligence in Manufacturing 2024; Athens, Greece; October 16, 2024
2025-03-22
9 pages
Article/Chapter (Book)
Electronic Resource
English