The Bitter Taste of Confidence: Exploring Audio-Visual Taste Modulation in Immersive Reality

Pooria Ghavamian, Jan Henri Beyer, Sophie Orth, Mia Johanna Nona Zech, Florian Müller, Andrii Matviienko
IMX 2025
Proceedings of the 2025 ACM International Conference on Interactive Media Experiences
TL;DR
What we did: We conducted a study to explore how augmented reality visual filters and synchronized audio cues influence taste perception in Extended Reality environments.
What we found: We found that while a pink visual filter reduced perceived bitterness alone, it unexpectedly increased bitterness perception when combined with sweet-associated audio cues, highlighting complex crossmodal interactions.
Takeaway: Our findings suggest that incorporating user confidence levels in sensory evaluations can enhance the design of multisensory experiences, revealing new dimensions in user engagement and perceptual reliability.

Abstract

Extended Reality (XR) technologies present innovative ways to augment sensory experiences, including taste perception. In this study, we investigated how augmented reality (AR) visual filters and synchronized audio cues affect gustation through a controlled experiment with 18 participants. Our findings revealed unexpected crossmodal interactions: while a pink visual filter typically associated with sweetness reduced perceived bitterness in isolation, it paradoxically enhanced bitterness perception when combined with a sweet-associated audio cue. Furthermore, we observed an inverse correlation between participants' confidence levels and their perception of taste intensity across multiple dimensions, highlighting confidence as an overlooked factor in sensory experience design. These findings inform the design of nuanced multisensory experiences in immersive media, where subtle crossmodal interactions significantly influence user perception.