Latest Project
Control Methods for Immersive Audio Systems
(In Progress)
During my senior year at UCSD I began studying concepts and applications of spatial audio, also known as immersive audio. I was particularly interested in 3D audio systems for movies and theaters. It wasn't until I really dug deep into the topic that I began to form questions of my own about control methods for spatial audio systems, especially regarding content creation for these systems and the level of artistic control (or lack thereof) they offer. This led me to the basis of my senior project: develop a gesture- or motion-based panning control for immersive audio systems and applications that allows for a more seamless experience during the creative process.
My reasoning for this project also stems from my background in audio engineering and mixing. Most control options found on large-format mixing consoles comprise a joystick and a couple of buttons. The joystick serves its purpose well when working in stereo, or even in more common surround formats like 5.1 that use only one horizontal plane. However, with the advent of immersive audio systems like Dolby Atmos (see picture below), which use multiple horizontal planes (including overhead), the conventional methods of panning control are no longer effective.
For this project, I began by reading countless research papers, dissertations, forums, and anything else I could get my hands on. Dr. Durand Begault of NASA was a huge help in recommending specific papers that covered many topics related to spatial audio systems. Zachary Seldess of Sonic Arts Research and Development at CalIT2 was also a great help, recommending additional papers to read and sharing invaluable guidance on Max MSP, the programming environment I am using to create my control system.

My Max MSP patch uses Jitter and OpenGL objects to create the virtual room/space and the sphere that represents the audio file and its position within the space. Control of the sphere was originally going to use accelerometer data transmitted from an iPhone. However, the accelerometers did not provide the data stream I needed: control worked only when all three axes (X, Y, and Z) were active and moving, and if even one axis was not in use, the sphere kept jumping back to the center of the space at (0, 0, 0). This led me to abandon the accelerometers and implement motion tracking with a Leap Motion controller instead.

The Leap Motion data is now being captured and transmitted via the OSC protocol. So far the results are encouraging, as the refresh rate and smoothness of the Leap controller are more than adequate. The actual panning of the audio is currently handled by the ICST Ambisonics package available for Max MSP. Other tests are being conducted that use VBAP (Vector Base Amplitude Panning) and DBAP (Distance Based Amplitude Panning). All results and final presentation materials, including the Max patch, will be posted once they are finished.
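To give a sense of the data handling involved, here is a rough sketch (in Python rather than Max, for readability) of the two small transforms the patch performs on incoming hand data: mapping Leap Motion palm coordinates (reported in millimetres, Y pointing up) into normalized room coordinates for the sphere, and smoothing the stream with a simple one-pole filter. The interaction ranges and function names here are my own illustrative assumptions, not values from the Leap specification or from my actual patch.

```python
def leap_to_room(palm_mm,
                 x_range=(-200.0, 200.0),
                 y_range=(50.0, 450.0),
                 z_range=(-200.0, 200.0)):
    """Map a Leap Motion palm position (mm) into room coordinates
    in [-1, 1] on each axis. The ranges are assumed interaction
    bounds chosen for illustration, not Leap Motion specs."""
    def norm(v, lo, hi):
        v = min(max(v, lo), hi)            # clamp to the usable region
        return 2.0 * (v - lo) / (hi - lo) - 1.0
    x, y, z = palm_mm
    return (norm(x, *x_range), norm(y, *y_range), norm(z, *z_range))

def smooth(prev, new, alpha=0.2):
    """One-pole (exponential) smoothing of a 3-D position.
    alpha near 0 tracks slowly but steadily; near 1 it follows
    the raw sensor data closely."""
    return tuple(p + alpha * (n - p) for p, n in zip(prev, new))
```

In the actual patch the equivalent scaling and smoothing happen on the OSC stream inside Max, but the arithmetic is the same idea.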
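For the DBAP tests mentioned above, the core idea can be sketched in a few lines: each speaker's gain falls off with its distance from the virtual source, and the gains are normalized to constant total power. This is a minimal illustrative sketch of the technique, not the ICST or any other package's actual implementation, and the 6 dB rolloff default is an assumption.

```python
import math

def dbap_gains(source, speakers, rolloff_db=6.0):
    """Distance-based amplitude panning sketch: gain per speaker is
    1/d^a, where a is derived from the rolloff in dB per doubling of
    distance, then all gains are scaled so their squares sum to 1
    (constant perceived power as the source moves)."""
    a = rolloff_db / (20.0 * math.log10(2.0))
    dists = [max(1e-9, math.dist(source, s)) for s in speakers]
    raw = [1.0 / d ** a for d in dists]
    norm = math.sqrt(sum(g * g for g in raw))
    return [g / norm for g in raw]
```

With a source at the center of a symmetric speaker ring, every speaker receives equal gain; as the source moves toward one speaker, that speaker's gain dominates. VBAP works differently, weighting only the two or three speakers whose vectors bracket the source direction.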