Authors: Berghi, D.; Stenzel, H.; Volino, M.; Hilton, A.; Jackson, P.J.B.
Date: 2022-03-14
Year: 2020
URI: https://publica.fraunhofer.de/handle/publica/408567
DOI: 10.1109/VRW50115.2020.00184
Abstract: Immersive audio-visual perception relies on the spatial integration of both auditory and visual information, which are heterogeneous sensing modalities with different fields of reception and spatial resolution. This study investigates the perceived coherence of audio-visual object events presented either centrally or peripherally with horizontally aligned/misaligned sound. Various object events were selected to represent three acoustic feature classes. Subjective test results in a simulated virtual environment from 18 participants indicate a wider capture region in the periphery, with an outward bias favoring more lateral sounds. Centered stimulus results support previous findings for simpler scenes.
Language: en
Classification: 621006
Title: Audio-Visual Spatial Alignment Requirements of Central and Peripheral Object Events
Type: conference paper