Daniel Ross - Audio Designer

Project Portfolio

This page contains a sampling of my favorite projects, ones I've been fortunate enough to be a part of over the years. It also features a few custom software tools and instruments I've built as time allowed. Thanks for checking it out!

- Cheers, Daniel

Moses - Beyond The Waves




Summary:
An animated cinematic adventure from director & animator Dina Garatly that follows the fabled tale of Moses.

This project was created as the basis for funding to create a larger cinematic VR experience as well as submission to the Red Sea Film Festival.


Role:
Sound Design & Mix

Robot Recon

Summary:
Robot Recon is a third-person, playable game level created in Unreal Engine 4 with Audiokinetic's Wwise middleware to test various spatial audio rendering solutions and SDKs.

Navigating the character through the environment's different spaces and interacting with both static and moving objects allows for rapid, effective evaluation of how well the target audio solution handles key elements such as sound-object localization, occlusion and obstruction, room portals, reflections, and acoustic space simulation. Solutions tested include Unreal's native audio engine, Wwise Spatial Audio, Dolby Atmos for gaming, Auro 3D, Windows Sonic, Oculus Audio, and Google Resonance. All of the visual assets in the level came from the UE Marketplace.

The process also gave valuable insight into other aspects of Unreal Engine: how visual elements like meshes, materials, skeletons, animations, blend spaces, locomotion, nav meshes, behavior trees, and sequences, among many other things, are created, implemented, and manipulated within the game engine.
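The occlusion/obstruction distinction the level was built to evaluate can be illustrated with a toy model: occlusion (geometry fully blocks the source) attenuates both the direct and reflected paths, while obstruction (an object blocks line of sight only) attenuates just the direct path. The sketch below is a simplified Python illustration of that concept, not the actual API of Wwise or any of the tested SDKs.

```python
def occlusion_filter(dry_gain, occlusion, obstruction):
    """Toy model of two distinct attenuation effects.

    occlusion: geometry fully blocks the source (e.g. a closed door),
    so both the direct path and room reflections are attenuated.
    obstruction: an object blocks line of sight only, so just the
    direct path is attenuated while reflections arrive unchanged.
    Both values range from 0.0 (clear) to 1.0 (fully blocked).
    """
    direct = dry_gain * (1.0 - max(occlusion, obstruction))
    reflections = dry_gain * (1.0 - occlusion)
    return direct, reflections
```

For example, an obstruction of 0.5 halves the direct signal while leaving the reflections untouched, which is exactly the behavioral difference the test level makes audible when walking the character behind a pillar versus into a closed room.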

Role:
Technical Sound Design / Game (Level) Design

Mass Effect: New Earth - A 4D Holographic Journey



Summary:
Mass Effect: New Earth is a theatrical motion simulator amusement park ride at California's Great America in Santa Clara, California. It is an adaptation of the Mass Effect video game series and features recurring characters from the franchise.

In the 4½-minute ride, guests sit in the 80-seat Action Theater in front of a custom 60-foot 3D LED screen with 4K resolution. The motion seats simulate wind and water and are fitted with leg pokers and neck ticklers. The ride features an 80-channel immersive surround sound system with custom audio beamforming speaker arrays built into the railing set pieces for each row. Guests also wear passive 3D glasses.


Role:
Principal Sound Design & Mix / Audio System Spec & Design

The People Upstairs - Joe Garrison and Night People


Summary:
A six-track music album from avant-garde jazz composer and pianist Joe Garrison and his Night People ensemble. The album highlights instruments including flute, flugelhorn, clarinet, bass trombone, French horn, and piano.


Role:
Recording Engineer

Cry Out: The Lonely Whale Experience

Summary:
Cry Out: The Lonely Whale Experience is an underwater virtual reality experience created for the Lonely Whale Foundation, championed by actor Adrian Grenier. The three-minute, 360-degree film transports viewers into the sea to witness underwater life and how pollution has disrupted it.

The underwater VR expedition was created by 3D Live with Dell Precision, Alienware, AMD, and HTC technology, and featured the 360 Spatial Workstation (formerly Two Big Ears) audio software by Facebook.



Role:
Technical Sound Design/Spatial Audio Design & Mix

Custom Software Tools & Instruments


Gesture Control 3D Sound Panner

Summary:
In 2012 I began researching concepts and applications of spatial audio, also known as immersive or 3D audio, with a particular interest in 3D audio systems for movies and theaters. Once I really dug deep into the topic, I began forming questions of my own about control methods for spatial audio systems, especially the level of artistic control (or lack thereof) during content creation for these systems. This led me to the basis of this project: develop a gesture- or motion-based panning control for immersive audio systems and applications that allows for a more seamless experience during the creative process.
My reasoning for this project also stems from my background in audio engineering and mixing. Most control options found on large-format mixing consoles comprise a joystick and a couple of buttons. The joystick serves its purpose well when working with stereo, or even with common surround formats like 5.1 or 7.1 that use only one horizontal plane. However, with the advent of immersive audio systems like Dolby Atmos, DTS:X (MDA), and Auro 3D, which use multiple horizontal planes of speakers (including overhead), the conventional methods of panning control are no longer effective. The end result was an object-based panning system created in Max/MSP and controlled by a Leap Motion controller. It can record and play back hand motions and gestures, which are then spatially rendered through an audio system with customizable speaker mappings of up to 24 channels.
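The core mapping, from a hand position to a panning direction to per-speaker gains, can be sketched in a few lines. This is a simplified Python illustration assuming normalized hand coordinates and an equal-power cosine gain law; the actual system was built in Max/MSP, and the function names and gain law here are illustrative assumptions.

```python
import math

def hand_to_pan(x, y, z):
    """Convert a normalized hand position (x: left/right, y: height,
    z: depth, each in [-1, 1]) to azimuth/elevation in degrees."""
    azimuth = math.degrees(math.atan2(x, z))      # 0 deg = straight ahead
    elevation = math.degrees(math.asin(max(-1.0, min(1.0, y))))
    return azimuth, elevation

def speaker_gains(azimuth, speaker_azimuths):
    """Simple panning law: cosine falloff toward each speaker azimuth,
    normalized so total power stays constant (equal-power panning)."""
    raw = []
    for sp in speaker_azimuths:
        diff = math.radians((azimuth - sp + 180) % 360 - 180)
        raw.append(max(0.0, math.cos(diff / 2)))
    norm = math.sqrt(sum(g * g for g in raw)) or 1.0
    return [g / norm for g in raw]
```

Recording a gesture then amounts to storing the (azimuth, elevation) stream with timestamps and replaying it through the same gain stage, with the speaker azimuth list swapped out per room layout.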

Role:
Audio Software Developer

Pan Wiz

Summary:
Pan Wiz was developed for Jon Anderson (lead vocalist of Yes) and Emmy-nominated composer Sean McKee to use on their upcoming album project, which uses Dolby Atmos spatial audio technology. It's a simplified version of the Gesture Control 3D Sound Panner (above) that takes hand-tracking data from the Leap Motion controller and converts it to MIDI data, which can then be easily mapped to the Atmos Music Panner plug-in inside their DAW.
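The conversion itself is conceptually simple: scale each tracked hand axis to the 0-127 range of a MIDI Control Change value and pack it into a CC message the DAW can MIDI-learn. A minimal Python sketch of the idea, with hypothetical function names (the actual tool was built around the Leap Motion controller's tracking data):

```python
def hand_to_cc(value, lo=-1.0, hi=1.0):
    """Scale one hand-tracking axis (lo..hi) to a 7-bit MIDI CC value."""
    clamped = max(lo, min(hi, value))
    return round((clamped - lo) / (hi - lo) * 127)

def cc_message(cc_number, value, channel=0):
    """Pack a MIDI Control Change message as (status, controller, value).
    Status byte 0xB0 is Control Change on the given channel."""
    return (0xB0 | channel, cc_number, value)
```

One axis per panner parameter (e.g. left/right, front/back, height), each on its own CC number, is enough for the plug-in's MIDI mapping to pick up.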


Role:
Audio Software Developer

Groove Box: Drum Designer & Rhythm Sequencer

Summary:
I designed and created this Groove Box using Pure Data. The idea behind the patch was an all-in-one solution that would let me create and design my own kick drum, then program and sequence it alongside other samples and sounds. I also added elements such as backing drum tracks that you can jam and groove over while triggering samples and adjusting the FX section. The control elements were then mirrored in TouchOSC and uploaded to an iPad for wireless use.
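The sequencing side of such a patch boils down to a step clock that checks which sounds are active on the current step. A minimal Python sketch of the idea (the actual patch was built in Pure Data, where the same logic lives in metro and select objects):

```python
def sequence(pattern, steps=16):
    """Yield (step, hits) for one bar of a step pattern.

    pattern maps a sound name to the step indices where it triggers,
    e.g. a four-on-the-floor kick on steps 0, 4, 8, 12.
    """
    for step in range(steps):
        hits = [name for name, active in pattern.items() if step in active]
        yield step, hits

# One bar: kick on quarter notes, snare on beats 2 and 4, hats on eighths.
pattern = {
    "kick": [0, 4, 8, 12],
    "snare": [4, 12],
    "hat": list(range(0, 16, 2)),
}
```

Driving this loop from a clock at the desired tempo, and routing each hit to its sample player, is essentially what the patch's sequencer section does.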

Role:
Audio Software Developer


H.S.F (Hear.See.Feel) - An Exercise in Sensorial & Technical Connections
Summary:
This multi-program patch was designed using Processing, Ableton Live, TouchOSC, and OSCulator. I wanted an iPad to act as a drum machine that triggers various strings and classes in Processing, which display words descriptive of the drum samples being played. Processing takes the OSC data from TouchOSC (iPad) and passes it through to OSCulator, which translates the OSC data into MIDI that is then mapped to a drum rack in Ableton Live. The idea was not only about the visual and lexical relationship to the audio samples, but also an exploration of routing possibilities between programs.
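The OSC-to-MIDI translation step in that chain (handled by OSCulator in the actual setup) can be sketched as a lookup from OSC address to MIDI note number. A hypothetical Python illustration of the mapping, not OSCulator's real internals:

```python
def osc_to_midi(address, value, note_map, channel=0, velocity=100):
    """Translate one TouchOSC pad message into a MIDI note message.

    note_map links OSC addresses (e.g. '/drums/pad1') to note numbers;
    a positive pad value becomes note-on, zero becomes note-off.
    """
    note = note_map.get(address)
    if note is None:
        return None  # address not mapped to a drum pad
    if value > 0:
        return (0x90 | channel, note, velocity)  # note-on
    return (0x80 | channel, note, 0)             # note-off
```

Each translated note then lands on a pad of the Ableton drum rack, while the original OSC message also drives the word display in Processing.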
