Audio Design Demo Reel
Custom Software Tools, Instruments & Projects
Gesture Control 3D Sound Panner
In 2012 I began researching concepts and applications of spatial audio, also known as immersive or 3D audio, with a particular interest in 3D audio systems for film and theaters. As I dug deeper into the topic, I began forming questions of my own about control methods for spatial audio systems, especially the level of artistic control (or lack thereof) available during content creation. This led me to the basis of this project: develop a gesture- or motion-based panning control for immersive audio systems and applications that allows for a more seamless experience during the creative process.

My reasoning also stems from my background in audio engineering and mixing. Most panning controls found on large-format mixing consoles comprise a joystick and a few buttons. The joystick serves its purpose well for stereo, or for common surround formats like 5.1 and 7.1 that use only one horizontal plane. However, with the advent of immersive audio systems like Dolby Atmos, DTS:X (MDA), and Auro-3D, which use multiple horizontal planes of speakers (including overhead), these conventional panning controls are no longer effective. The end result was an object-based panning system created in Max MSP and controlled by a Leap Motion controller: it records and plays back hand motions and gestures, which are then spatially rendered through an audio system with customizable speaker mappings of up to 24 channels.
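The core of an object-based panner like this is a function that turns a 3D source position (here, a tracked hand position) into per-speaker gains. As a minimal sketch of the idea, here is a distance-based amplitude panning (DBAP) gain calculation in Python; the speaker layout, rolloff value, and function names are illustrative assumptions, not the actual mapping used in the Max MSP patch:

```python
import math

# Hypothetical 5-speaker layout in metres (x, y, z) — illustrative only.
SPEAKERS = [
    (-2.0, 2.0, 0.0), (2.0, 2.0, 0.0),    # front L/R, ear level
    (-2.0, -2.0, 0.0), (2.0, -2.0, 0.0),  # rear L/R, ear level
    (0.0, 0.0, 2.5),                      # overhead
]

def dbap_gains(source, speakers=SPEAKERS, rolloff_db=6.0):
    """Distance-based amplitude panning: each speaker's gain falls off
    with its distance from the virtual source position."""
    # Convert a rolloff in dB-per-doubling-of-distance into an exponent.
    a = rolloff_db / (20.0 * math.log10(2.0))
    dists = [math.dist(source, s) + 1e-9 for s in speakers]  # avoid div by zero
    raw = [1.0 / (d ** a) for d in dists]
    norm = math.sqrt(sum(g * g for g in raw))  # normalize to preserve total power
    return [g / norm for g in raw]
```

A real renderer would smooth these gains over time to avoid zipper noise as the hand moves, but the per-frame math is essentially this: closer speakers get proportionally more of the object's signal.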
Groove Box: Drum Designer & Rhythm Sequencer
I designed and built this groove box in PureData. The idea behind the patch was an all-in-one solution that lets me design my own kick drum and then program and sequence it alongside other samples and sounds. I also added elements such as backing drum tracks to jam and groove over while triggering samples and adjusting the FX section. The controls were then mirrored in TouchOSC and loaded onto an iPad for wireless use.
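A common way to design a kick drum like this is a sine oscillator whose pitch sweeps down quickly while an amplitude envelope decays. The sketch below renders that idea in Python; the parameter names and default values are assumptions for illustration, not taken from the PureData patch:

```python
import math

def kick(sr=44100, dur=0.5, f_start=150.0, f_end=50.0,
         pitch_decay=0.06, amp_decay=0.25):
    """Render one kick drum hit as a list of samples in [-1, 1]:
    a sine whose frequency sweeps exponentially from f_start down to
    f_end while the amplitude envelope decays."""
    samples = []
    phase = 0.0
    for n in range(int(sr * dur)):
        t = n / sr
        # Exponential pitch sweep toward f_end, then advance the phase.
        f = f_end + (f_start - f_end) * math.exp(-t / pitch_decay)
        phase += 2 * math.pi * f / sr
        amp = math.exp(-t / amp_decay)  # amplitude envelope
        samples.append(amp * math.sin(phase))
    return samples
```

Shortening `pitch_decay` gives a tighter, clickier attack; lengthening `amp_decay` gives a boomier tail, which is essentially what the kick-design controls in a patch like this expose.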
H.S.F (Hear.See.Feel) - An Exercise in Sensorial & Technical Connections
This multi-program patch was designed using Processing, Ableton Live, TouchOSC, and OSCulator. I wanted the iPad to act as a drum machine that triggers various strings and classes in Processing, displaying words descriptive of the drum samples being played. Processing takes the OSC data from TouchOSC on the iPad and passes it to OSCulator, which translates the OSC data into MIDI mapped to a drum rack in Ableton Live. The idea was not only to explore the visual and lexical relationship to the audio samples but also the routing possibilities between programs.
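The OSC-to-MIDI translation step in that chain boils down to mapping a pad's OSC address and value to a MIDI note-on message. Here is a minimal Python sketch of that mapping; the `/pad/N` address scheme and note assignments are assumptions for illustration, not the actual TouchOSC layout or OSCulator configuration:

```python
# Pad number → General MIDI drum note (kick, snare, closed hat, open hat).
PAD_NOTES = {1: 36, 2: 38, 3: 42, 4: 46}

def osc_to_midi(address, value, channel=0):
    """Translate an OSC pad press into MIDI note-on bytes.
    Returns None for releases (value <= 0) or unknown addresses."""
    if not address.startswith("/pad/") or value <= 0.0:
        return None
    pad = int(address.rsplit("/", 1)[-1])
    note = PAD_NOTES.get(pad)
    if note is None:
        return None
    # Scale the 0.0–1.0 OSC value onto MIDI velocity 1–127.
    velocity = max(1, min(127, round(value * 127)))
    return bytes([0x90 | channel, note, velocity])  # note-on status + data
```

A pad hit like `osc_to_midi("/pad/1", 1.0)` yields a note-on for the kick, which Ableton's drum rack then fires; releases fall through as `None`, mirroring how only the press needs to trigger a one-shot sample.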