This project addresses limitations in existing human-robot interaction systems by developing a real-time facial expression recognition and emotion response system on a Raspberry Pi-based humanoid robot. Unlike traditional systems that rely solely on explicit commands, our system enables the robot to respond autonomously to human emotions, demonstrating a level of emotional intelligence that makes interactions more natural and human-like. The system processes real-time video captured by a camera to identify user expressions, focusing specifically on sadness and happiness. Upon detecting a sad expression, the robot performs a song and dance routine to uplift the user's mood, and it ceases the performance once a happy expression is recognized. Experimental results indicate that the system recognizes emotions effectively and improves users' emotional states, showing its potential for therapeutic and interactive applications. The integration of deep learning with robotic control offers a novel approach to enhancing human-robot interaction through emotional intelligence.
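As a rough illustration of the control flow summarized above (a sad face starts the routine, a happy face stops it), the following Python sketch shows one way such a loop could be organized on a Raspberry Pi with OpenCV. The helper functions classify_expression, start_song_and_dance, and stop_performance are hypothetical placeholders standing in for the authors' deep-learning model and robot-control code, not their actual implementation.

# Minimal sketch of the detect-then-respond loop described in the abstract.
# Assumptions (not from the paper): Python with OpenCV on the Raspberry Pi;
# the helpers below are placeholders, not the authors' code.
import cv2


def classify_expression(frame):
    """Stub for the deep-learning expression classifier.

    A real implementation would run a trained CNN on the detected face
    and return a label such as "sad", "happy", or "neutral".
    """
    return "neutral"


def start_song_and_dance():
    """Stub: command the humanoid's servos and speaker to begin the routine."""
    print("Starting song-and-dance routine")


def stop_performance():
    """Stub: halt the routine once the user appears happy."""
    print("Stopping routine")


def main():
    cap = cv2.VideoCapture(0)   # Raspberry Pi camera module or USB webcam
    performing = False
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            emotion = classify_expression(frame)
            if emotion == "sad" and not performing:
                start_song_and_dance()   # a sad expression triggers the routine
                performing = True
            elif emotion == "happy" and performing:
                stop_performance()       # a happy expression ends it
                performing = False
    finally:
        cap.release()


if __name__ == "__main__":
    main()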
Autonomous Real-Time Human-Robot Emotional Interaction Through Facial Recognition
Lecture Notes in Networks and Systems
International Conference on Intelligent Manufacturing and Robotics (ICIMR 2024), Suzhou, China, 22-23 August 2024
Selected Proceedings from the 2nd International Conference on Intelligent Manufacturing and Robotics, ICIMR 2024, 22-23 August, Suzhou, China; Chapter 6; pp. 79-91
01.04.2025
13 pages
Article/Book chapter
Electronic resource
English
Emotional intelligence; Humanoid robot; Deep learning; Human-robot interaction; Real-time facial expression recognition; Engineering; Control, Robotics, Mechatronics; Artificial Intelligence; Industrial Chemistry/Chemical Engineering; Signal, Image and Speech Processing; Nanoscale Science and Technology; Computational Intelligence