
Abstract

Several big tech companies are currently eager to build the metaverse, mainly through virtual reality experiences. Although immersive, shared virtual environments can make emotionally rich interactions difficult, since currently available headsets and VR applications offer limited possibilities for tracking and sharing emotions. We believe that physiological signal technology could enhance future metaverse applications. In this context, this paper presents a framework for visualizing, recording, and synchronizing experiences in VR with human body signals. To demonstrate the effectiveness of the system, we illustrate a use case and the development of a proof-of-concept scenario. Finally, we present the results of tests conducted on this proof of concept, which confirm the validity of the proposed system. Such a framework could be used to design new emotionally augmented experiences in VR.
