Experiments

Interactive Audio & Music Systems – Headphones Recommended

Overview

These examples demonstrate interactive and immersive audio design and implementation for games, XR, and virtual environments, focusing on how sound and music respond to player agency, spatial context, and real-time interaction.

A key component of this work is spatial audio design, including 3D positioning, distance-based attenuation, ambisonic recordings, environmental reflections, occlusion and diffraction modeling, and spatial mixing strategies for both screen-based and XR experiences.
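As a rough illustration of two of these techniques, the sketch below combines a distance-based attenuation curve with a simple occlusion factor to produce a per-source gain. The function names, parameter values, and the 0.8 occlusion scaling are illustrative assumptions, not the API of Unity Audio, Wwise, or FMOD.

```python
def inverse_distance_gain(distance, min_dist=1.0, max_dist=50.0, rolloff=1.0):
    """Inverse-distance rolloff: full gain inside min_dist, attenuating
    toward max_dist, with distance clamped to that range."""
    d = max(min_dist, min(distance, max_dist))
    return min_dist / (min_dist + rolloff * (d - min_dist))

def occluded_gain(gain, occlusion=0.0):
    """Crude occlusion: scale gain down as occlusion rises from 0 to 1.
    A real implementation would also lower a low-pass filter cutoff."""
    return gain * (1.0 - 0.8 * occlusion)

# Example: a source 12 m away, roughly half occluded by geometry.
g = inverse_distance_gain(12.0)
print(round(occluded_gain(g, occlusion=0.5), 3))
```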

The examples also feature interactive music systems that adapt to player behavior and environment using techniques such as state-based composition, dynamic layering, branching structures, adaptive mixes, and music-driven sound effects.
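The sketch below outlines one way a state-based, layered music system can work: each gameplay state defines target gains for a set of stems, and a per-frame update crossfades the live mix toward those targets. The state names, layer names, and fade logic are hypothetical placeholders for illustration, not taken from any specific project.

```python
# Target layer gains per music state; the update loop crossfades between them.
STATE_LAYERS = {
    "explore": {"pads": 1.0, "percussion": 0.0, "lead": 0.0},
    "tension": {"pads": 1.0, "percussion": 0.7, "lead": 0.0},
    "combat":  {"pads": 0.6, "percussion": 1.0, "lead": 1.0},
}

class AdaptiveMusic:
    def __init__(self, fade_time=2.0):
        self.gains = {layer: 0.0 for layer in STATE_LAYERS["explore"]}
        self.targets = dict(STATE_LAYERS["explore"])
        self.fade_time = fade_time

    def set_state(self, state):
        """Switch the target mix when the gameplay state changes."""
        self.targets = dict(STATE_LAYERS[state])

    def update(self, dt):
        """Move each layer's gain toward its target; call once per frame."""
        step = dt / self.fade_time
        for layer, target in self.targets.items():
            g = self.gains[layer]
            self.gains[layer] = min(target, g + step) if g < target else max(target, g - step)

music = AdaptiveMusic()
music.set_state("combat")
for _ in range(60):          # one second at 60 updates per second
    music.update(1 / 60)
print({layer: round(gain, 2) for layer, gain in music.gains.items()})
```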

This work builds on formal training from “Audio for Virtual Reality,” taught by Dr. Spencer Russell (PhD, MIT Media Lab), with a focus on VR-specific mixing considerations, perceptual clarity, and reliable performance on head-mounted displays. Across projects, I move fluently between interactive audio tools (Unity Audio, Wwise, FMOD) and traditional production DAWs such as Ableton Live and Pro Tools, bridging composition, sound design, and technical implementation.