Could you tell us a little about the b<>com *Spatial Audio Toolbox* that will be at b<>com's IBC stand?
The "Advanced Media Coding" lab explores new ways of enhancing image and sound so that content creators can offer more immersive, captivating experiences. One example we have been working on for years is spatial audio for 360° video and Virtual Reality (VR): audio is essential to feeling immersed in a virtual environment.
Among the most striking results of this work is the b<>com *Spatial Audio Toolbox*, a suite of plugins covering the entire audio chain, from recording to rendering spatial sound over headphones or speakers. These plugins use the HOA (Higher-Order Ambisonics) format, which can reproduce panoramic sound fields in a way particularly well suited to VR.
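To give a concrete feel for what the ambisonic format does, here is a minimal sketch (not the toolbox's actual implementation, whose internals are not public) of how a mono source is encoded as a first-order ambisonic plane wave. The function name `encode_foa` and the use of the common ACN channel order (W, Y, Z, X) with SN3D normalization are assumptions for illustration:

```python
import numpy as np

def encode_foa(mono, azimuth, elevation):
    """Encode a mono signal as a first-order ambisonic (FOA) plane wave.

    Channel order is ACN (W, Y, Z, X) with SN3D normalization;
    angles are in radians, azimuth measured counter-clockwise from front.
    """
    gains = np.array([
        1.0,                                  # W: omnidirectional
        np.sin(azimuth) * np.cos(elevation),  # Y: left/right
        np.sin(elevation),                    # Z: up/down
        np.cos(azimuth) * np.cos(elevation),  # X: front/back
    ])
    # One row per ambisonic channel, one column per sample.
    return gains[:, None] * np.asarray(mono)[None, :]
```

Higher orders add more spherical-harmonic channels in the same spirit, which is what gives HOA its finer spatial resolution.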
At our IBC stand, we'll be demonstrating how our tools and the HOA format make it easy to create VR content that offers total sound immersion. We'll also be showing that this type of content can be distributed very efficiently in terms of bitrate using the MPEG-H 3D Audio codec. Two pieces of content are being presented at the stand, one on a mobile platform and the other on a PC, encoded and decoded with MPEG-H using Fraunhofer and Qualcomm tools, respectively.
Who are these plug-ins for?
Our tools are primarily for sound engineers producing VR and 360° experiences. Our goal is to provide tools for the different steps of the production and mixing process: processing signals recorded with ambisonic microphones; monitoring, in particular with a visualization of the sound field; transformation effects such as rotating the sound field; and spatial rendering over headphones and speakers.
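The sound-field rotation mentioned above has a simple mathematical core at first order. As a hedged sketch (assuming ACN channel order W, Y, Z, X; the toolbox's own processing is certainly more sophisticated), a rotation about the vertical axis leaves W and Z untouched and mixes X and Y through a standard 2-D rotation:

```python
import numpy as np

def rotate_foa_yaw(foa, angle):
    """Rotate a first-order ambisonic frame (rows W, Y, Z, X in ACN
    order) about the vertical axis by `angle` radians.

    W (omnidirectional) and Z (vertical) are invariant under yaw;
    only the horizontal components X and Y are mixed.
    """
    w, y, z, x = foa
    c, s = np.cos(angle), np.sin(angle)
    return np.array([
        w,
        c * y + s * x,  # Y' = Y cos(a) + X sin(a)
        z,
        c * x - s * y,  # X' = X cos(a) - Y sin(a)
    ])
```

This kind of full-scene rotation, applied per audio block from head-tracking data, is what keeps the sound field stable in the real world as the VR viewer turns their head.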
The two major principles that guided the design of our plugins are ease of use and excellent sound rendering quality, particularly with respect to timbre. To achieve this, we work directly with audio professionals. The goal is to assist sound engineers, not upend their work habits.
What projects are in the works?
Besides 360°, the idea now is to offer audiovisual experiences within which the viewer can move freely. One example would be transporting yourself to a concert with the ability to move around the stage or even among the musicians. This type of experience raises numerous audio questions. Right now we're working on the issue of audio playback formats and on sound-field analysis tools that will assist in creating this sort of content.
Outside the world of music, which Fanfaraï's VR video* (presented exclusively at IBC with a new audio mix) illustrates quite well, we are building new content collaborations. The technology's primary goal must be to serve the creators' work while also improving the viewer's experience. We're working in partnership with François Klein and the team at DV Group on virtual reality storytelling, with the objective of getting the viewer more emotionally invested and enhancing the feeling of immersion. One of our joint efforts, the short film "Vaudeville", will also be on show at our stand.
Find b<>com's teams in the IBC Future Zone, Hall 8, stand G14!
*The video from the band Fanfaraï is a coproduction of Tour’n’Sol Prod and RFI Labo.