Tag: Perception
-

ICS-MR: Interactive Conversation Scenarios for Assessment of Mixed Reality Communication
We present ICS-MR, a dataset containing three conversational scenarios designed for the evaluation of communication quality in Mixed Reality (MR) systems. Along with detailed descriptions of the conversation tasks, we provide all the materials required to incorporate the tasks into MR user studies. The materials also support applying the scenarios in real-world and video-conferencing contexts, for example in studies that compare immersive systems against reference communication media. Open-source Unity implementations of the scenarios are also made available, supporting direct use of the scenarios in distributed, multi-user experiments. The conversation tasks have all been administered in recent scientific works addressing the evaluation of user experiences in immersive communication systems, enabling analysis and comparison of the behavioral properties each scenario evokes. The ICS-MR dataset therefore contributes valuable resources for further research on communication in immersive systems.
-

Rhythmic Interaction influences Synchrony Perception in VR
Shared synchronised activities, such as dancing and singing, can promote social bonding in both physical and virtual settings, but are susceptible to network latencies in mediated environments such as social virtual reality (VR). Temporal delays can hinder interpersonal entrainment, causing users to feel out of sync and reducing presence. While prior work has examined how stimulus type and frequency affect synchrony perception, the role of rhythmic interaction, central to many interactive synchronised experiences, remains unexplored. We conducted a within-subjects study with 32 participants to systematically investigate how rhythmic interaction shapes synchrony perception of rhythmic audiovisual stimuli in networked VR environments. Motor behaviour and subjective synchrony ratings (n = 32) are analysed, complemented by eye-tracking metrics from a quality-controlled subset (n = 25). Through controlled stimulus onset asynchronies (SOAs) between audio and video stimuli, we simulate network latency scenarios where (1) audio precedes video (as in VR dance environments where music playback is synchronised to a global clock while remote dancer movements appear delayed), and (2) video precedes audio (as in orchestra experiences where a conductor’s movements are perceived first, followed by a delayed auditory response from the orchestra). Participants either synchronised hand movements with a virtual avatar or passively observed it, across two stimulus frequencies (0.5 Hz and 1 Hz) and both audio- and video-leading offsets. Our results show that rhythmic interaction significantly increases tolerance to audiovisual offsets, shifting the left decision criterion (LDC) by up to 66 ms and the point of subjective synchrony (PSS) by up to 31 ms, and widening the window of subjective synchrony (WSS) by up to 59 ms. These effects suggest that rhythmic interaction can reduce sensitivity to asynchronies in audio-leading scenarios, such as VR dance environments.
We also found that temporal offsets influence rhythmic interaction behaviour, and we confirm that recent findings on the influence of stimulus frequency on PSS and of temporal offsets on pupil dilation can, to a certain extent, be replicated in interactive VR.
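The LDC, PSS, and WSS reported above are standard psychometric quantities derived from synchrony-judgment curves. A minimal sketch of how they relate, using entirely hypothetical response proportions and a simple linear-interpolation estimate of the 50% crossings (the paper's actual fitting procedure is not specified here):

```python
import numpy as np

# Hypothetical data: proportion of "in sync" judgments at each
# stimulus onset asynchrony (negative SOA = audio leads, in ms).
soas = np.array([-300.0, -200.0, -100.0, 0.0, 100.0, 200.0, 300.0])
p_sync = np.array([0.10, 0.45, 0.85, 0.95, 0.80, 0.40, 0.05])

def crossing(x, y, level=0.5, rising=True):
    """Linearly interpolate the SOA where p_sync crosses `level`."""
    for i in range(len(x) - 1):
        lo, hi = y[i], y[i + 1]
        if (rising and lo < level <= hi) or (not rising and lo >= level > hi):
            t = (level - lo) / (hi - lo)
            return x[i] + t * (x[i + 1] - x[i])
    return None

ldc = crossing(soas, p_sync, rising=True)    # left decision criterion
rdc = crossing(soas, p_sync, rising=False)   # right decision criterion
pss = (ldc + rdc) / 2                        # point of subjective synchrony
wss = rdc - ldc                              # window of subjective synchrony
```

Under this reading, a rightward shift of the LDC toward more negative SOAs, or a wider WSS, means larger audio-leading offsets still feel synchronous, which is the increased tolerance the study attributes to rhythmic interaction.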