
XR PROJECTS

Virtual Becomes Reality (VBR), 2025

By Virtual Human Interaction Lab (VHIL)

Sound Design by Na-Young Son

VBR is an immersive, self-contained version of the Stanford VHIL's classic lab tour. Designed to highlight why Virtual Reality (VR) is powerful and what research has revealed about its effects, VBR showcases the core concept that VR feels real and can have real-world consequences. The project premiered at the Cannes XR festival and was an official selection of the 2020 Tribeca Film Festival.

I contributed to the development of VBR by implementing its spatial sound design, using Unity scripts and Max/MSP to build an ambisonic sound environment.
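As an illustration of the plumbing involved, the sketch below hand-assembles an OSC message of the kind a Unity script might stream to a Max/MSP patch to update a sound source's position. It is a minimal Python stand-in for the actual Unity C# scripts; the /source/1/xyz address pattern and port 7400 are assumptions for the example, not the project's configuration.

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a multiple of 4 bytes.
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    # An OSC message is: padded address, padded type-tag string, big-endian args.
    addr = osc_pad(address.encode())
    tags = osc_pad(("," + "f" * len(floats)).encode())
    args = b"".join(struct.pack(">f", v) for v in floats)
    return addr + tags + args

# Stream a (hypothetical) source position to a Max/MSP patch over UDP.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(osc_message("/source/1/xyz", 1.5, 0.0, -2.0), ("127.0.0.1", 7400))
```

On the Max/MSP side, a [udpreceive 7400] object can pick up such packets and route the coordinates to an ambisonic encoder.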


Available on SteamVR: link

Earthquake Simulation, 2025

 

Designed and Engineered by Na-Young Son

Led the development of a mixed reality simulation that teaches users effective responses during natural disasters, built with Unity and Max/MSP for real-time audio-visual interaction on a Meta Quest 3 and a 24-channel sound system.

 

Designed and implemented immersive soundscapes to reinforce urgency, realism, and appropriate behavioral cues within the training environment; created interactive environmental effects to simulate high-stress scenarios for experiential learning.
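The lab playback relies on an ambisonic configuration (see the note below), but the underlying problem of distributing a moving sound across a speaker ring can be illustrated with a simpler technique. This Python sketch computes constant-power pairwise panning gains for an evenly spaced 24-speaker ring; it is an illustrative stand-in, not the project's actual decoder.

```python
import math

def ring_pan_gains(source_az_deg: float, n_speakers: int = 24) -> list[float]:
    """Constant-power pairwise panning across an evenly spaced speaker ring.

    Only the two speakers adjacent to the source angle receive signal; the
    cos/sin split keeps total power constant as the source sweeps around.
    """
    spacing = 360.0 / n_speakers
    az = source_az_deg % 360.0
    lo = int(az // spacing)                # speaker just below the source angle
    hi = (lo + 1) % n_speakers             # its neighbour around the ring
    frac = (az - lo * spacing) / spacing   # 0.0 -> all lo, 1.0 -> all hi
    gains = [0.0] * n_speakers
    gains[lo] = math.cos(frac * math.pi / 2)
    gains[hi] = math.sin(frac * math.pi / 2)
    return gains

# A rumble approaching from roughly the north-east lands between two speakers.
print(ring_pan_gains(50.0))
```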

* Sound is unavailable due to its lab-specific ambisonics configuration.

Out of the Cage, 2025

Designed by Na-Young Son


This audiovisual performance illustrates the act of freeing yourself from your own thoughts. Sometimes the path that you thought would lead you to where you want to be traps you. Sometimes rejection opens a door to a new direction that will still lead you to where you want to be, or even somewhere better than you ever imagined. Once you realize that it is your own thoughts keeping you from reaching your dreams, you can free yourself and start living your life to its full potential. I built the live performance in Max/MSP; the piece was performed at CCRMA, Stanford University.

FoodWise Concept Video, 2024

 

Directed by Na-Young Son

Filmed by Na-Young Son

Edited by Na-Young Son

This is a concept video for "FoodWise," a mobile application that transforms eating disorder recovery into an engaging, game-like journey. The app creates a supportive environment where users can progress through their healing while feeling empowered and understood, making recovery more accessible and less intimidating for those seeking help.

Technology and Music, 2023

Designed by Na-Young Son


This is an AR project built in Unity. It narrates the history of synthesizers, starting with the development of FM synthesis, an algorithm developed by Professor John Chowning at the Center for Computer Research in Music and Acoustics (CCRMA). I used the museum space on the first floor of the Knoll to build a geolocated experience. The project outlines how Yamaha adapted FM synthesis for the Yamaha DX7, and how the development of synthesizers expanded the possibilities of music creation and performance.
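Chowning's FM algorithm produces rich, bell-like spectra from just two sine oscillators by modulating the carrier's phase at an audible rate: y(t) = sin(2π·fc·t + I·sin(2π·fm·t)), where I is the modulation index. Here is a minimal Python sketch of that formula that writes a two-second FM tone to a WAV file; the 440 Hz / 110 Hz carrier-modulator pair and index of 5 are arbitrary example values.

```python
import math
import struct
import wave

SR = 44100                           # sample rate in Hz
fc, fm, index = 440.0, 110.0, 5.0    # carrier freq, modulator freq, modulation index

frames = bytearray()
for n in range(int(SR * 2.0)):       # two seconds of audio
    t = n / SR
    # Chowning's formula: y(t) = sin(2*pi*fc*t + index * sin(2*pi*fm*t))
    y = math.sin(2 * math.pi * fc * t + index * math.sin(2 * math.pi * fm * t))
    frames += struct.pack("<h", int(y * 0.8 * 32767))  # 16-bit PCM with headroom

with wave.open("fm_tone.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)                # 2 bytes per sample = 16-bit audio
    f.setframerate(SR)
    f.writeframes(bytes(frames))
```

Raising the modulation index widens the sideband spectrum, which is what lets an instrument like the DX7 morph between mellow and metallic timbres with so few oscillators.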

Cho-Ji-Il-Gwan (初志一貫), 2022

 

Produced by Na-Young Son

Edited by Na-Young Son


I was inspired by the Korean folk tune 'Arirang (아리랑)'. This is a conceptual audiovisual performance that expresses 'Han (한)', a complex feeling of longing and sorrow that is an essential element of Korean identity. I used Logic Pro X to remix Kelly Moran's Helix, adding layers of dreamy vocals and ethereal-sounding synthesizers, and Max/MSP to create the audiovisual live performance. The piece was performed at CCRMA, Stanford University.

Water and Its Flow, 2022

 

Filmed and Edited by Na-Young Son

Composed and Produced by Na-Young Son

This audiovisual piece explores the movement of water. The synthesizer sounds capture the granular nature of water droplets, and arc-like melodies embody its flow. Through reversed footage and calculated distortion, the piece creates cognitive dissonance, prompting the audience to notice their instinctive responses when confronted with imagery that challenges their fundamental understanding of natural physics.

Verifying 3D Models, 2022

 

Starling Lab X Stanford Engineering

This is a research project that aims to design a verification model for 3D photogrammetry models. The team first researched and collected data on how 3D photogrammetric models are used in Web3 and the XR industry. Using the C2PA schema to access the metadata of the individual photos used to create each 3D model, we programmed a blockchain-backed model that attaches a digital signature to photogrammetry models. We also built a prototype of an interactive AR website in JavaScript and CSS that exposes the digital signatures of 3D models.
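To make the signing step concrete, here is a minimal Python sketch of the core idea, independent of the C2PA tooling and blockchain layer the project actually used: hash the model file's bytes, sign the digest, and verify later that the bytes are unchanged. It uses the pyca/cryptography library's Ed25519 keys, and the filename scene.glb is hypothetical.

```python
import hashlib
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def model_digest(path: str) -> bytes:
    """SHA-256 over the raw bytes of a photogrammetry model file."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.digest()

# Sign once when the model is published...
private_key = Ed25519PrivateKey.generate()
signature = private_key.sign(model_digest("scene.glb"))  # hypothetical model file

# ...then anyone holding the public key can verify integrity later.
public_key = private_key.public_key()
try:
    public_key.verify(signature, model_digest("scene.glb"))
    print("model is authentic and unmodified")
except InvalidSignature:
    print("model bytes have changed since signing")
```

Anchoring such signatures on a chain then lets a viewer check a model's integrity without trusting the site that hosts it.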


© Na-Young Son
