Sladeczek, Christoph; Seideneck, Mario; Lorenz, Wolfgang; Pursche, Katrin; Schneider, Benjamin
Record dates: 2022-07-05; 2024-03-14; 2022-07-05; 2022-06-08
Handle: https://publica.fraunhofer.de/handle/publica/418607
Scopus ID: 2-s2.0-85135393727

Title: Object-Based Audio as Platform Technology in Vehicles
Type: journal article
Language: en
Keywords: automotive audio; object-based audio; in-car sound system; Sound Field Control

Abstract: Current studies show that vehicle interiors will change more in the coming years than they have in decades, driven by several simultaneously occurring megatrends. Autonomous driving in particular allows the driver to pay less attention to the road. This enables new usage concepts that shift the focus to the interior experience, which in turn places completely new demands on sound systems. The availability of immersive entertainment technologies supporting new comfort functions and mobile working will be essential. With this changed focus of attention, not only personalized sound staging itself but also its correct spatial mapping takes on new importance. This applies to a wide variety of functions such as interior staging, driving safety, well-being, and communication. In this context, the effort required to create the audio content becomes significant, calling for a new unified interface for spatial presentation in order to limit production effort. This paper describes how object-based audio (OBA) can be used as a platform technology to meet these requirements. Based on specific use cases, a new workflow is presented that has been implemented for use in series production. The concepts for the rendering technology, the audio tuning process, and the implementation on resource-limited hardware are described.