
An Adaptive Human-Robot Interaction Framework Using Real-Time Emotion Recognition and Context-Aware Task Planning

  • Udit Mamodiya
  • Indra Kishor
  • Abrar Ahmed Syed
  • Pranati Sankalkar
  • Nithesh Naik* (*Corresponding author for this work)

Research output: Contribution to journal › Article › peer-review

Abstract

Existing approaches often perform poorly under noisy or occluded conditions, rely on cloud-based inference subject to latency and privacy issues, and do not take emotional context into account during task planning. This research introduces the Emotion–Context Reinforced Planner (ECRP), a novel human–robot interaction framework designed to deliver emotionally and contextually adaptive behavior in real time. The objective is to enhance social coherence and responsiveness while enabling intelligent action sequencing on low-power robotic platforms. The framework combines a multimodal emotion recognition system that fuses facial and vocal features through confidence-weighted estimation, a context encoder, and a reinforcement learning policy based on a Double Deep Q-Network. The architecture was engineered for ready deployment on low-power embedded systems (Jetson Nano, Raspberry Pi 4) and optimized via TensorRT quantization and kernel fusion. Experimental evaluations show a 92.1% F1-score in emotion classification, 94.2% task-switching accuracy, and 34.8 ms response latency, outperforming state-of-the-art models including CLEF and Pepper-based multimodal planners. Real-time performance of up to 30 FPS was achieved on edge devices. A user satisfaction rating of 4.4/5 indicates strong perceived coherence and empathy in social interaction scenarios. ECRP offers a robust, scalable, and socially coherent solution for real-time human–robot interaction, outperforming existing methods in both accuracy and latency. Its fusion of emotional understanding and adaptive task planning significantly improves human engagement, making it suitable for elderly care, education, and customer service domains.
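The abstract describes fusing facial and vocal emotion estimates via confidence-weighted estimation. A minimal sketch of one such late-fusion scheme follows; the function name, the normalized-confidence weighting, and the example class probabilities are illustrative assumptions, not details taken from the paper:

```python
def fuse_emotion_scores(face_probs, voice_probs, face_conf, voice_conf):
    """Confidence-weighted late fusion of per-modality emotion
    probability vectors (hypothetical sketch of the idea only).

    Each modality contributes in proportion to its confidence, so a
    clear face signal can dominate a noisy audio channel and vice versa.
    """
    w_face = face_conf / (face_conf + voice_conf)  # normalized weight
    w_voice = 1.0 - w_face
    fused = [w_face * f + w_voice * v for f, v in zip(face_probs, voice_probs)]
    total = sum(fused)
    return [p / total for p in fused]  # renormalize to a distribution
```

With a high-confidence face estimate (0.9) and a low-confidence voice estimate (0.3), the fused distribution leans toward the facial prediction, which is the behavior a confidence-weighted scheme is meant to provide under occlusion or noise.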
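The planning policy is based on a Double Deep Q-Network. The defining step of Double DQN is computing the bootstrap target with action selection and action evaluation decoupled across two networks; a minimal sketch of that target computation (function name and arguments are assumptions for illustration, not the paper's implementation):

```python
def double_dqn_target(reward, next_q_online, next_q_target, gamma=0.99, done=False):
    """Double DQN bootstrap target (illustrative sketch).

    The online network's Q-values select the next action (argmax),
    while the target network's Q-values evaluate it. This decoupling
    reduces the overestimation bias of vanilla DQN.
    """
    if done:
        return reward  # terminal transition: no bootstrap term
    # Action selection with the online network.
    a_star = max(range(len(next_q_online)), key=lambda a: next_q_online[a])
    # Action evaluation with the target network.
    return reward + gamma * next_q_target[a_star]
```

Note that if the two networks disagree (as below, where the online network prefers action 1 but the target network scores it lowest), the target uses the target network's lower estimate, which is exactly where the overestimation correction comes from.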

Original language: English
Pages (from-to): 152219-152240
Number of pages: 22
Journal: IEEE Access
Volume: 13
DOIs
Publication status: Published - 2025

All Science Journal Classification (ASJC) codes

  • General Computer Science
  • General Materials Science
  • General Engineering
