TY - GEN
T1 - Autonomous Real-Time Human-Robot Emotional Interaction Through Facial Recognition
AU - Ma, Wenning
AU - Jin, Nanlin
AU - Guan, Steven
N1 - Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2025.
PY - 2025
Y1 - 2025
N2 - This project aims to address the limitations in existing human-robot interaction systems by developing a real-time facial expression recognition and emotion response system using a Raspberry Pi-based humanoid robot. Unlike traditional systems that rely solely on explicit commands, our system enables the robot to autonomously respond to human emotions, demonstrating a level of emotional intelligence. This approach enhances the naturalness and human-likeness of robot interactions. Our system processes real-time video feeds captured by a camera to identify user expressions, focusing specifically on sadness and happiness. Upon detecting a sad expression, the robot performs a song and dance routine to uplift the user’s mood and ceases the performance once a happy expression is recognized. Experimental results indicate that the system effectively recognizes emotions and improves users’ emotional states, showcasing its potential applications in therapeutic and interactive environments. The integration of deep learning with robotic control offers a novel approach to enhancing human-robot interaction through emotional intelligence.
AB - This project aims to address the limitations in existing human-robot interaction systems by developing a real-time facial expression recognition and emotion response system using a Raspberry Pi-based humanoid robot. Unlike traditional systems that rely solely on explicit commands, our system enables the robot to autonomously respond to human emotions, demonstrating a level of emotional intelligence. This approach enhances the naturalness and human-likeness of robot interactions. Our system processes real-time video feeds captured by a camera to identify user expressions, focusing specifically on sadness and happiness. Upon detecting a sad expression, the robot performs a song and dance routine to uplift the user’s mood and ceases the performance once a happy expression is recognized. Experimental results indicate that the system effectively recognizes emotions and improves users’ emotional states, showcasing its potential applications in therapeutic and interactive environments. The integration of deep learning with robotic control offers a novel approach to enhancing human-robot interaction through emotional intelligence.
KW - Deep learning
KW - Emotional intelligence
KW - Human-robot interaction
KW - Humanoid robot
KW - Real-time facial expression recognition
UR - http://www.scopus.com/inward/record.url?scp=105002717313&partnerID=8YFLogxK
U2 - 10.1007/978-981-96-3949-6_6
DO - 10.1007/978-981-96-3949-6_6
M3 - Conference Proceeding
AN - SCOPUS:105002717313
SN - 9789819639489
T3 - Lecture Notes in Networks and Systems
SP - 79
EP - 91
BT - Selected Proceedings from the 2nd International Conference on Intelligent Manufacturing and Robotics, ICIMR 2024 - Advances in Intelligent Manufacturing and Robotics
A2 - Chen, Wei
A2 - Ping Tan, Andrew Huey
A2 - Luo, Yang
A2 - Huang, Long
A2 - Zhu, Yuyi
A2 - PP Abdul Majeed, Anwar
A2 - Zhang, Fan
A2 - Yan, Yuyao
A2 - Liu, Chenguang
PB - Springer Science and Business Media Deutschland GmbH
T2 - 2nd International Conference on Intelligent Manufacturing and Robotics, ICIMR 2024
Y2 - 22 August 2024 through 23 August 2024
ER -