Could you tell us about the b<>com technologies coming out at NAB in early April?
For the first time, we're presenting a complete workflow at NAB, including a series of 360 audio plugins based on HOA (Higher-Order Ambisonics) technology for producing premium immersive content. At the booth, we'll show how to record, mix, and broadcast audio soundtracks for virtual reality videos, and we'll demonstrate live 3D soundscape capture using microphone arrays.
Specifically, at NAB we will present two new tools for capturing and analyzing 3D audio signals, which have been developed as part of the ORPHEUS European research project. The first is a plugin that converts the signals recorded by a spherical microphone array into HOA signals. This conversion is done using custom filters based on precise calibration of the microphone array.
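To give a rough idea of what such a conversion involves (this is not b<>com's plugin, just a minimal first-order sketch), the simplest case is a tetrahedral microphone whose four capsule signals (A-format) are combined by a fixed matrix into the four first-order ambisonic components (B-format); a real higher-order encoder like the one described above additionally applies frequency-dependent filters derived from calibration of the array:

```python
import numpy as np

# Illustrative A-format -> B-format conversion for a tetrahedral microphone.
# Assumed capsule order: LFU (left-front-up), RFD (right-front-down),
# LBD (left-back-down), RBU (right-back-up). Input shape: (4, num_samples).
def a_to_b_format(a_format: np.ndarray) -> np.ndarray:
    lfu, rfd, lbd, rbu = a_format
    w = lfu + rfd + lbd + rbu          # omnidirectional component
    x = lfu + rfd - lbd - rbu          # front-back
    y = lfu - rfd + lbd - rbu          # left-right
    z = lfu - rfd - lbd + rbu          # up-down
    # A real encoder such as the plugin described above would also apply
    # frequency-dependent filters derived from calibration of the array.
    return 0.5 * np.vstack([w, x, y, z])

# Example: one second of 4-capsule audio at 48 kHz (random placeholder data)
capsules = np.random.randn(4, 48000)
b_format = a_to_b_format(capsules)     # shape (4, 48000): W, X, Y, Z
```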
The second tool is a plugin that analyzes the recorded signals and displays a map of the sound field. This spatial analysis also makes it possible to accentuate or eliminate sounds coming from a particular location in space. On the playback side, we'll be showing a 360 video player compatible with the Oculus Rift that offers high-quality dynamic rendering of HOA soundtracks encoded in MPEG-H.
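As a concrete illustration of how such a map can be built (a minimal sketch under first-order assumptions, not the analysis plugin itself), one can steer a virtual cardioid over a grid of directions and measure the beam energy; peaks in the resulting map indicate dominant sources, and the same steering principle is what allows sounds from a chosen direction to be accentuated or removed:

```python
import numpy as np

def sound_field_map(w, x, y, z, n_az=72, n_el=36):
    """Energy map of a first-order ambisonic (B-format) signal.

    Assumes components W, X, Y, Z with W unscaled (SN3D-style), each a
    1-D array of samples. Returns azimuth/elevation grids and the energy
    of a virtual cardioid steered at each direction.
    """
    azimuths = np.linspace(-np.pi, np.pi, n_az)
    elevations = np.linspace(-np.pi / 2, np.pi / 2, n_el)
    energy = np.zeros((n_el, n_az))
    for i, el in enumerate(elevations):
        for j, az in enumerate(azimuths):
            # Unit vector pointing toward (azimuth az, elevation el)
            ux = np.cos(az) * np.cos(el)
            uy = np.sin(az) * np.cos(el)
            uz = np.sin(el)
            # Virtual cardioid steered toward that direction
            beam = 0.5 * (w + ux * x + uy * y + uz * z)
            energy[i, j] = np.mean(beam ** 2)
    return azimuths, elevations, energy  # visualize with e.g. pcolormesh
```

Higher orders simply add more spherical-harmonic components to the steering weights, giving narrower beams and therefore a sharper map.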
What are the benefits of this technology?
For a long time, audio production for virtual reality content was reserved for specialists. With the b<>com plugins, any sound engineer can now easily produce high-quality immersive audio content.
For consumers, it offers an audio experience that is both accurate and highly immersive, thanks to dynamic audio rendering. The spatial resolution will be much better than with the ambisonic format currently used on web platforms.
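To put that resolution claim in numbers: an ambisonic stream of order N carries (N+1)² audio channels, so the first-order format that web platforms typically use today carries 2² = 4 channels, while a third-order HOA mix carries 4² = 16. The extra channels encode finer spherical-harmonic detail, which translates into sharper localization of sources around the listener.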
For the producer, the use of object-based audio standards (ADM and MPEG-H) facilitates the archiving, exchange, and distribution of content. A video with an HOA soundtrack compressed with the MPEG-H codec can thus be integrated into a standard MP4 container and played back by any compatible video player on a smartphone or PC.
How is this demonstration innovative?
Until recently, there were only two ways to produce audio for immersive content: either use free tools that required a certain degree of expertise and did not ensure an optimal experience, or use proprietary tools that were easier to use and offered better sound quality, but were very expensive and completely locked down.
At NAB, we want to show that there is now a third way that combines the best of both: audio production that relies on open standards and offers a high-end experience at an affordable price.
What content was chosen for the demonstration?
We have the opportunity to present two exclusive pieces of VR content, created in collaboration with studios and content creators. They are exceptional-quality productions in which the soundtrack plays a prominent role in the experience: such an approach is still rare, but it always makes a big difference to the success of VR productions.
"Longing for Wilderness", a short film that is part of the VR NOW program at the Animationinstitut der Filmakademie Baden Württemberg. Directed by Marc Zimmermann (EpicScapes), it will be presented with an audio mix produced by b<>com's teams thanks to our plugins.
"Alteration", a short film directed by Jérôme Blanquet and OKIO Studios, one of the most prominent french studio in the European VR scene. Audio spatialization was performed using the tools of our partner Aspic Technologies*. For b<>com, the chance to demonstrate the total compatibility of our tools because we encoded the soundtrack in MPEG-H and we play it back using our player.
With these two experiences, we intend to make a real difference in immersive sound!
*CinematicVR
b<>com *Spatial Audio Toolbox* will be demonstrated from April 9 to 12 at the NAB Show in Las Vegas. Come meet our team in the Futures Park, North Hall.