Meme, Myself and AR: Exploring Memes Sharing in Face-to-face Conversation Using Augmented Reality

Yanni Mei, Samuel Wendt, Florian Müller, Jan Gugenheimer
CHI 2026
Proceedings of the 2026 CHI Conference on Human Factors in Computing Systems
TL;DR
What we did: We explored how AR memes influence face-to-face conversations through a survey of meme users and a multi-user prototype study in which pairs of friends designed and used their own AR meme visualizations.
What we found: Participants used three archetypes for visualizing memes in Augmented Reality — 'Me in the Meme', 'The Meme on Me', and 'Meme as Visual Reference' — along with two distinct integration patterns: sequential and parallel usage.
Takeaway: Our research highlights the potential of integrating Augmented Reality memes into social interactions, offering a novel form of communication that enhances humor and engagement in face-to-face conversations.

Abstract

Internet memes are central to online communication, yet their visual humor is often lost in face-to-face (F2F) conversations. Augmented reality (AR) offers new ways to bring memes into F2F interactions, but it is unclear how memes can be integrated into F2F conversations using AR, and how they impact conversational dynamics. We surveyed meme users (N=29) to understand motivations and challenges in visualizing memes in F2F conversations. With these insights, we developed an AR meme-sharing prototype and invited 12 pairs of friends to design AR visualizations for their memes and use them in conversations. Our analysis reveals two AR-unique visualizations: merging memes with one's body (The-Meme-On-Me) and situating oneself in the meme's environment (Me-In-The-Meme). We observed two integration patterns: using speech as a setup before a meme punchline, and showing memes simultaneously with speech to amplify humor. We report users' reactions toward AR memes, showing how they enable playful social interaction.