After presenting a paper on our work at the IEEE International Conference on Multimedia and Expo Workshops in San Jose, California, we took the opportunity to visit DTS, just down the road from the conference in Los Gatos. After a fascinating discussion about the future of object-based audio and a great dinner, the conversation continued by email. Some time later we agreed to demonstrate some of our work, integrated into an object-based broadcast setup, in the DTS demo room at IBC 2015: a great opportunity for us to showcase our research, and a chance for DTS to show what object-based live production could look like.
We also needed a name for our technology, and after an evening of acronym tennis by email we came up with SALSA: Spatial Automated Live Sports Audio.
Here’s a blog post I wrote at the time…
In addition to the Object Based Clean Audio demos that Rob Oldfield and I gave at IBC 2015, we were also showcasing the results of a long-running development project in audio for live sports broadcast. Thanks to some internal funding from the University of Salford’s Staff Innovation Challenge competition, and to some great work from Darius Satonger, we have done further development on research that we started on the EU FP7 FascinatE project: capturing audio objects from a live football match.
During the development work we spent quite a bit of time in outside broadcast trucks with some of the best mixing engineers around. The skill level of these engineers is impressive, and watching them use mixing desk faders to follow the ball around the pitch during a football match made us realise the complexity of their job. The aim of our work was to find ways to assist the mixing engineer in creating a great mix for both conventional and object-based broadcast, and to make the transition to object-based broadcast painless by tailoring our additions to existing workflows.
The software we have developed over the last few years works in real time, matching on-pitch sound events against a database of audio object templates to identify the sounds we want to capture, such as ball kicks and referee whistle blows. Once a sound is identified, the software estimates its location (typically to within around 50 cm), isolates it as a short-duration audio object and tags it with metadata detailing the type of sound, its position on the pitch and its duration.
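To give a flavour of the idea (this is a minimal sketch, not the actual SALSA implementation: the event names, metadata fields, correlation score and threshold are all illustrative assumptions), template matching can be approximated by scoring an incoming audio frame against each stored template with a normalised correlation, and a matched event can be tagged with a small metadata record:

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class AudioObject:
    """Illustrative metadata record for a captured on-pitch sound event."""
    event_type: str        # e.g. "ball_kick" or "whistle" (assumed labels)
    position: tuple        # (x, y) estimate on the pitch, in metres
    start_time: float      # seconds from the start of the match
    duration: float        # length of the isolated snippet, in seconds
    samples: np.ndarray    # the isolated short-duration audio object

def match_template(frame, templates, threshold=0.6):
    """Return the best-matching event type, or None if nothing clears
    the (assumed) threshold. Score is zero-lag normalised correlation."""
    best_type, best_score = None, threshold
    f = (frame - frame.mean()) / (frame.std() + 1e-12)
    for event_type, tpl in templates.items():
        t = (tpl - tpl.mean()) / (tpl.std() + 1e-12)
        n = min(len(f), len(t))
        score = float(np.dot(f[:n], t[:n]) / n)  # 1.0 = perfect match
        if score > best_score:
            best_type, best_score = event_type, score
    return best_type
```

A real system would of course run this continuously over overlapping frames from many microphones, and derive the position estimate from time-of-arrival differences between them.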
Once the tagged audio object is created it is used in three ways:
- firstly, to control the faders on a mixing console (using standard console automation protocols), simplifying the capture of on-pitch events and allowing the mixing engineer to concentrate on the sound design of the game;
- secondly, to route the packaged object to a different bus for each object type (ball kicks to bus 1, whistle blows to bus 2) so that each type can be processed and EQ’d independently;
- thirdly, to create audio objects suitable for new object-based broadcast formats.
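The first two uses above can be sketched in a few lines (again purely illustrative: the bus map matches the example in the text, but the fader envelope timings are assumptions, not the automation actually sent to the console):

```python
# Assumed mapping of event types to console buses, per the example above.
BUS_MAP = {"ball_kick": 1, "whistle": 2}

def route(event_type):
    """Return the console bus for an object type, or None if untyped."""
    return BUS_MAP.get(event_type)

def fader_ramp(duration, attack=0.01, release=0.05):
    """Build (time, gain) automation points for one object: open the
    fader quickly, hold at unity for the object's duration, then close.
    Attack/release times here are illustrative defaults."""
    return [
        (0.0, 0.0),                 # fader closed before the event
        (attack, 1.0),              # fast attack up to unity gain
        (duration, 1.0),            # hold while the object sounds
        (duration + release, 0.0),  # gentler release back to closed
    ]
```

In practice these automation points would be translated into whatever protocol the console's automation interface expects; the point is simply that a tagged object carries enough metadata (type and duration) to drive both the routing and the fader moves.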
Thanks to our friends at DTS and Fairlight, we were able to implement all of this in real time for demonstrations at IBC 2015 as part of a complete outside broadcast production workflow, from capture to reproduction, using DTS’s MDA object-based audio format.
Our SALSA software was modified so that the extracted audio objects were sent to Fairlight’s 3DAW system in their QUANTUM live mixing console, which then packaged the objects into an MDA stream for broadcast.
Many thanks to DTS and Fairlight for such a great collaboration.