Electrophysiological Measurement for Immersive Audio Experience
Hyunjae Kim, Yoojin Choi, Hyojin Kim, Taein Song, Kyun Myun Lee
Immersive audio is increasingly adopted in audiovisual content, yet its impact on neural responses remains underexplored. Here, we investigated how immersive audio shapes neural dynamics under naturalistic conditions, in contrast to prior EEG studies that relied on brief, repetitive stimuli. Thirty-four participants experienced four musical excerpts and two cinematic clips (each 3–5 minutes) while 24-channel EEG was recorded. Each stimulus was presented in either stereo or 7.1.4 multichannel audio, yielding a counterbalanced within-subject design with two factors: content (music vs. movie) and presentation (stereo vs. spatial). We employed temporal response function (TRF) modeling to estimate ERP-like responses from continuous stimuli. TRF analysis revealed significant main and interaction effects in the occipital P300 component, with reduced amplitudes for spatial audio, especially in the movie conditions. This suggests lower demands for attentional reallocation with spatial audio presentations, where audiovisual information is more perceptually congruent. Additionally, inter-subject correlation (ISC) analysis, indexing shared neural engagement, showed increased beta-band synchrony in the spatial movie condition, indicating enhanced collective engagement. By combining TRF and ISC analyses under ecologically valid conditions, this study highlights how immersive audio modulates both perceptual processing and shared engagement, offering novel insights into the neural impact of spatial sound in real-world experience.
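To illustrate the TRF approach the abstract describes, the sketch below estimates a temporal response function by ridge regression on a lagged stimulus matrix. This is a minimal, generic implementation of the standard technique, not the study's actual analysis pipeline; the function name, parameters, and regularization choice are illustrative assumptions (toolboxes such as mTRF-Toolbox or MNE-Python provide full implementations).

```python
import numpy as np

def estimate_trf(stimulus, eeg, fs, tmin, tmax, lam=1.0):
    """Estimate a temporal response function (TRF) by ridge regression.

    stimulus : (n_samples,) continuous stimulus feature (e.g. amplitude envelope)
    eeg      : (n_samples, n_channels) EEG recording
    fs       : sampling rate in Hz
    tmin, tmax : lag window in seconds (stimulus leading the response)
    lam      : ridge regularization strength (illustrative default)
    """
    lags = np.arange(int(tmin * fs), int(tmax * fs) + 1)
    n = len(stimulus)
    # Build the lagged design matrix: column j is the stimulus shifted by lags[j]
    X = np.zeros((n, len(lags)))
    for j, lag in enumerate(lags):
        if lag >= 0:
            X[lag:, j] = stimulus[: n - lag]
        else:
            X[: n + lag, j] = stimulus[-lag:]
    # Ridge solution w = (X'X + lam*I)^-1 X'y, one weight vector per channel
    XtX = X.T @ X + lam * np.eye(len(lags))
    w = np.linalg.solve(XtX, X.T @ eeg)
    return lags / fs, w  # lag times (s) and TRF weights, shape (n_lags, n_channels)
```

The recovered weight vector per channel plays the role of an ERP-like response to continuous stimulation: convolving the stimulus with the TRF predicts the EEG, so peaks in the TRF (such as a P300-like deflection) can be compared across stereo and spatial presentation conditions.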
Funding
National Research Foundation of Korea: Basic Research Laboratory Program
