PERCY: Personal Emotional Robotic Conversational System
Meng, Zhijin; Althubyani, Mohammed; Xie, Shengyuan; Razzak, Muhammad Imran; Benítez Sandoval, Eduardo; Bamdad, Mahdi; Cruz, Francisco
Department
Computational Biology
Type
Conference proceeding
Date
2026
Language
English
Abstract
Traditional rule-based conversational robots, constrained by fixed scripts and static response mappings, fundamentally lack adaptability for sustained personalized human interaction. Although large language models (LLMs) such as GPT-4 enable open-domain dialogue capabilities, most existing social robot approaches remain deficient in emotional awareness and longitudinal personalization continuity. To address this critical gap, we present PERCY (Personal Emotional Robotic Conversational sYstem) – an innovative framework that dynamically integrates: (1) real-time affective signals through facial expression recognition, (2) semantic content of user utterances, and (3) contextual profile data, synthesizing these multimodal inputs into emotion-aware prompt engineering for GPT-4. This integration drives both contextually appropriate verbal responses and synchronized non-verbal robot behaviors. PERCY utilizes GPT-4 to dynamically model the robot’s internal affective state, with non-verbal feedback primarily expressed through facial expressions. The system architecture leverages ROS-based multimodal processing: visual emotion recognition via fine-tuned MobileNetV2, textual sentiment analysis using NLTK’s VADER, decision-level sensor fusion, and GPT-4 prompt conditioning to orchestrate ARI robot behaviors. Empirical evaluation with 30 human participants demonstrated statistically significant improvements in dialogue coherence, contextual relevance, and response diversity compared to baseline systems. PERCY highlights the potential of integrating advanced multimodal perception and personalization to build a scalable foundation for next-generation emotionally intelligent human-robot interaction systems, rooted in contextually conditioned, multimodal affective computing.
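The pipeline described in the abstract (visual emotion recognition, textual sentiment analysis, decision-level fusion, and emotion-aware prompt conditioning for GPT-4) can be sketched in plain Python. This is an illustrative sketch only: the fusion weights, emotion labels, and label mapping below are assumptions, not the authors' parameters. In the actual system the text score would come from NLTK's VADER compound score and the visual distribution from a fine-tuned MobileNetV2, both mocked here.

```python
# Decision-level fusion sketch: combine a visual emotion distribution
# (as a MobileNetV2 softmax output would provide) with a text sentiment
# score (as VADER's compound score in [-1, 1] would provide) into an
# emotion-aware prompt. All weights and labels are illustrative.

VISUAL_WEIGHT = 0.6  # hypothetical: weight the facial channel more
TEXT_WEIGHT = 0.4    # hypothetical: weight the text channel less

def text_to_distribution(compound):
    """Map a VADER-style compound score onto the visual label set
    (hypothetical three-label mapping for illustration)."""
    if compound >= 0.05:
        return {"happy": compound, "neutral": 1 - compound, "sad": 0.0}
    if compound <= -0.05:
        return {"happy": 0.0, "neutral": 1 + compound, "sad": -compound}
    return {"happy": 0.0, "neutral": 1.0, "sad": 0.0}

def fuse(visual, compound):
    """Weighted decision-level fusion; returns the dominant emotion label."""
    text = text_to_distribution(compound)
    fused = {
        label: VISUAL_WEIGHT * visual.get(label, 0.0)
               + TEXT_WEIGHT * text.get(label, 0.0)
        for label in visual
    }
    return max(fused, key=fused.get)

def build_prompt(user_utterance, emotion):
    """Emotion-aware prompt conditioning: prepend the inferred affective
    state so the LLM can tailor both the verbal reply and a suggested
    non-verbal robot behaviour (e.g. a facial expression)."""
    return (
        f"The user appears {emotion}. Respond empathetically and suggest "
        f"a matching facial expression for the robot.\n"
        f"User: {user_utterance}"
    )

visual = {"happy": 0.7, "neutral": 0.2, "sad": 0.1}  # mock MobileNetV2 output
emotion = fuse(visual, compound=0.6)                 # mock VADER compound score
prompt = build_prompt("I got the job!", emotion)
print(emotion)
```

In this sketch the two modalities agree, so the fused label is "happy"; with conflicting signals the weights decide which channel dominates, which is the essence of decision-level (late) fusion as opposed to fusing raw features.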
Citation
Z. Meng et al., “PERCY: Personal Emotional Robotic Conversational System,” Mar. 2025, doi: 10.1007/978-981-95-4972-6_36
Source
Lecture Notes in Computer Science
Conference
38th Australasian Joint Conference on Artificial Intelligence, AI 2025
Keywords
Cognitive modelling and computer-human interaction, Human-Robot Interaction, Social Robotics
Publisher
Springer Nature
