Publication: SPOTting Emotions: Dynamic Emotion Detection and Response in Human-Robot Interactions
Abstract
As robots are increasingly integrated into human-centric environments, equipping them with emotional intelligence is essential for safe, ethical, and meaningful interactions. This thesis explores the intersection of emotional intelligence and Human-Robot Interaction (HRI) by leveraging Boston Dynamics’ quadruped robot, Spot, to detect a single user’s emotional state, specifically happiness, and respond physically to it in real time. The project uses a machine learning model built with tools from the DeepFace Python library for facial emotion recognition (FER) to interpret the user’s emotion, which is captured by the camera in Spot’s gripper arm. When happiness is detected, a servo-driven flag waves left and right, providing a clear, expressive response.
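To illustrate the detection step described above, the following is a minimal sketch of per-frame happiness detection with DeepFace. It assumes a generic OpenCV webcam feed as a stand-in for Spot’s gripper camera (which would be accessed through the Spot SDK), and the HAPPY_THRESHOLD cutoff is an assumed illustrative parameter, not a value from the thesis.

```python
# Sketch: classify a single frame as "happy" or not using DeepFace's emotion model.
import cv2
from deepface import DeepFace

HAPPY_THRESHOLD = 60.0  # assumed confidence cutoff (percent), for illustration only


def frame_is_happy(frame) -> bool:
    """Return True if DeepFace labels the dominant emotion in the frame as happiness."""
    try:
        results = DeepFace.analyze(
            img_path=frame,             # DeepFace accepts a NumPy BGR image array
            actions=["emotion"],
            enforce_detection=False,    # don't raise an error if no face is found
        )
    except ValueError:
        return False
    # Recent DeepFace versions return a list of per-face results; older ones return a dict.
    analysis = results[0] if isinstance(results, list) else results
    return (
        analysis["dominant_emotion"] == "happy"
        and analysis["emotion"]["happy"] >= HAPPY_THRESHOLD
    )


cap = cv2.VideoCapture(0)  # stand-in for the Spot gripper camera stream
ok, frame = cap.read()
if ok and frame_is_happy(frame):
    print("Happiness detected -- trigger the flag-waving servo")
cap.release()
```

In the actual system, the print statement would be replaced by a command to the servo that waves the flag; the thresholding and camera interface shown here are assumptions made only to keep the sketch self-contained.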
This project serves as a proof of concept for real-time emotional responsiveness in non-humanoid robots and demonstrates the potential for future quantitative and qualitative studies in affective HRI. By focusing on a single emotion and a single user, the project also prioritizes controlled experimentation and ethical design, avoiding the implementation of generalized facial recognition. The research contributes to the emerging field of emotionally expressive robotics and highlights the potential of such systems to enhance human trust in, and engagement with, robots operating in human environments.