Immersive Object-Based Mastering
To date, object-based spatial audio has been the subject of much research, and a few production tools have entered the market. The main difference from the channel-based audio paradigm is that object-based spatial audio enables engineers to create three-dimensional audio scenes independently of any particular loudspeaker setup. To play back an audio scene, a separate rendering process calculates all loudspeaker signals in real time from the metadata of the audio objects in the scene. While the authoring of an audio scene can easily be connected to conventional recording and mixing workflows, the possibilities for mastering audio scenes are still very limited. The main limitation is that, in contrast to channel-based mastering, no loudspeaker channels are available to be altered. However, instead of always altering audio objects individually, the metadata part of audio objects can be used to provide engineers with convenient tools for object-based spatial mastering as well. Beyond mastering only the audio data of a scene, new tools can also alter the metadata of audio objects on a global level for metadata mastering. These new object-based spatial mastering techniques are presented.
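To make the two ideas above concrete, the following is a minimal sketch, not taken from the paper: a toy renderer that derives loudspeaker signals from an object's azimuth metadata using constant-power stereo panning, and a global metadata-mastering operation that widens the whole scene by scaling every object's azimuth. All names (`render_object`, `master_metadata`), the two-loudspeaker layout, and the +/-30 degree azimuth range are illustrative assumptions, not part of any specific rendering standard.

```python
import math

def render_object(samples, azimuth_deg):
    """Toy rendering step: compute loudspeaker signals from object metadata.

    Hypothetical setup: two loudspeakers spanning azimuths -30..+30 degrees.
    The azimuth is mapped to a pan angle in [0, pi/2] and constant-power
    gains are applied, so gain_l**2 + gain_r**2 == 1 for any position.
    """
    theta = (azimuth_deg + 30.0) / 60.0 * (math.pi / 2.0)
    gain_l = math.cos(theta)
    gain_r = math.sin(theta)
    left = [gain_l * s for s in samples]
    right = [gain_r * s for s in samples]
    return left, right

def master_metadata(scene, width=1.5):
    """Toy global metadata mastering: widen the scene.

    Every object's azimuth is scaled by `width` and clipped to the
    assumed +/-30 degree range; audio data is left untouched.
    """
    return [
        {**obj, "azimuth": max(-30.0, min(30.0, obj["azimuth"] * width))}
        for obj in scene
    ]

# Usage: a centered object pans equally to both loudspeakers;
# widening moves off-center objects further out.
left, right = render_object([1.0, 0.5], azimuth_deg=0.0)
scene = master_metadata([{"azimuth": 10.0}, {"azimuth": -25.0}], width=1.5)
```

The key point the sketch illustrates is the separation of concerns stated in the text: `render_object` runs at playback time for a given loudspeaker setup, while `master_metadata` operates purely on the scene description, which is what makes loudspeaker-independent mastering possible.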