Emotion-Centric AI Agents in Multi-User Virtual Reality Platforms

Authors

  • Er. Akshit Kohli, ABESIT Engineering College, Crossings Republik, Ghaziabad, Uttar Pradesh 201009, akshitkohli69@gmail.com

DOI:

https://doi.org/10.63345/

Keywords:

Emotion-Centric AI Agents, Multi-User Virtual Reality, Affective Computing, Social Presence, User Engagement

Abstract

Emotion‑Centric AI Agents (ECAAs) promise to transform multi‑user Virtual Reality (VR) experiences by endowing non‑player avatars with the capacity to sense, interpret, and respond to human affect in real time. This manuscript details the conception, integration, and evaluation of ECAAs within a group‑based VR environment built in Unity3D. We employ a multimodal emotion recognition pipeline—fusing facial expression analysis via Affectiva, vocal‑prosody features via openSMILE, and physiological signals from Empatica E4 wristbands—processed through a Bayesian inference engine to generate continuous estimates of user valence and arousal. These estimates feed into an adaptive behavior engine that modulates avatar nonverbal cues (e.g., proxemics, gestures) and dialogic content, yielding empathic and contextually appropriate responses.
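The Bayesian fusion step described above can be illustrated with a minimal sketch. Assuming each modality (face, voice, physiology) emits an independent Gaussian estimate of valence with its own uncertainty—an assumption not stated in the abstract, and the function name is hypothetical—the posterior under a flat prior reduces to a precision‑weighted average:

```python
# Hypothetical sketch of precision-weighted (Bayesian) fusion of per-modality
# valence estimates. Each modality is assumed to report a Gaussian
# (mean, variance); lower variance means higher confidence.

def fuse_gaussian_estimates(estimates):
    """Fuse a list of (mean, variance) tuples into one (mean, variance).

    Under a flat prior and independent Gaussian likelihoods, the posterior
    mean is the precision-weighted average and the posterior precision is
    the sum of the modality precisions.
    """
    total_precision = sum(1.0 / var for _, var in estimates)
    fused_mean = sum(mean / var for mean, var in estimates) / total_precision
    fused_variance = 1.0 / total_precision
    return fused_mean, fused_variance

# Illustrative values only (not from the study):
face = (0.4, 0.05)    # e.g., facial-expression channel, fairly confident
voice = (0.1, 0.20)   # e.g., noisier vocal-prosody channel
physio = (0.3, 0.10)  # e.g., physiological channel
valence, uncertainty = fuse_gaussian_estimates([face, voice, physio])
```

Note that the fused variance is always smaller than the smallest input variance, which is one reason multimodal fusion can tolerate a noisy individual channel; arousal would be fused the same way along its own axis.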

We conducted a within‑subjects study with 30 participants (10 triads) who engaged in collaborative problem‑solving tasks under two conditions: (1) Baseline—with static, non‑affective agents—and (2) ECAA—agents driven by live emotion feedback. Quantitative measures included Social Presence (SP) and User Engagement (UE) scales, each comprising multiple Likert‑type items, as well as system latency and classification accuracy metrics. Qualitative data were gathered through post‑session semi‑structured interviews.

Results revealed that sessions with ECAAs significantly outperformed baseline in SP (mean increase of 22%, p < .001) and UE (mean increase of 18%, p < .001). The multimodal recognition pipeline achieved 87% overall accuracy in valence classification with average end‑to‑end latency of 48 ms. Thematic analysis of interviews indicated that users perceived ECAAs as more empathetic, facilitating smoother turn‑taking and deeper group cohesion. Participants reported that empathic avatar prompts (e.g., supportive nods when frustration was detected) mitigated tension and fostered collaboration.

These findings confirm that emotion‑aware agents can substantially enrich multi‑user VR by bolstering social presence and engagement without imposing prohibitive computational overhead. We discuss implications for remote teamwork, virtual training, and therapeutic group interventions, and outline directions for scaling to larger participant counts, expanding affect taxonomies, and integrating long‑term affective adaptation.


Published

2026-05-02

Section

Original Research Articles

How to Cite

Emotion-Centric AI Agents in Multi-User Virtual Reality Platforms. (2026). World Journal of Future Technologies in Computer Science and Engineering (WJFTCSE), ISSN: 3070-6203, 2(2), 13-23. https://doi.org/10.63345/
