Object-Based Audio in Large Scale Live Sound Reinforcement Controlled by Motion Tracking
This work presents a detailed application of an optical tracking system to control the positioning of sound sources in an object-based audio reproduction system for live sound reinforcement. This need arises from live performances with moving actors, such as operas, musicals, or spoken theater. State-of-the-art object-based audio reproduction systems make it possible to distribute virtual sound sources for improved sound localization within the audience area. To cope with highly complex applications, automated auxiliary systems such as motion tracking provide valuable control data and thus enhance the usability of such systems. The presented approach offers a solution with a focus on the interfaces between systems and devices.
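The core interface task described above, turning a tracked stage position into control data for an object-based renderer, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the coordinate convention, the OSC-style address scheme, and the function names are assumptions for the example.

```python
import math

def track_to_polar(x, y, origin=(0.0, 0.0)):
    """Convert a tracked stage position (meters, hypothetical coordinate
    frame) to azimuth/distance relative to a reference point.
    0 degrees = straight ahead along the +y axis."""
    dx, dy = x - origin[0], y - origin[1]
    azimuth_deg = math.degrees(math.atan2(dx, dy))
    distance_m = math.hypot(dx, dy)
    return azimuth_deg, distance_m

def position_message(object_id, x, y):
    """Build an OSC-style address/argument pair for a virtual sound
    source; the address layout is illustrative, not a product API."""
    az, dist = track_to_polar(x, y)
    return (f"/source/{object_id}/position", [round(az, 1), round(dist, 2)])

# Example: an actor tracked at stage position (3 m right, 4 m upstage)
addr, args = position_message(1, 3.0, 4.0)
```

In a real deployment such messages would be sent over a network protocol like OSC to the rendering engine; here only the mapping itself is shown.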