Title: Multi-baseline Disparity Fusion for Immersive Videoconferencing
Authors: Schreer, O.; Atzpadin, N.; Feldmann, I.; Kauff, P.
Type: Conference paper
Year: 2009
Deposited: 2022-03-11
Language: en
DDC: 621
Handle: https://publica.fraunhofer.de/handle/publica/365877

Abstract: The European FP7 project 3DPresence is developing a multiparty, high-end 3D videoconferencing concept that addresses the problem of conveying a feeling of physical presence in real time to multiple remote locations in a transparent and natural way. Traditional set-top camera videoconferencing systems still fail to meet the 'telepresence challenge' of providing a viable alternative to physical business travel, which today entails unacceptable delays, costs, inconvenience, and an increasingly large ecological footprint. Even recent high-end commercial solutions, while removing some of these shortcomings, still scale poorly, are expensive to deploy, do not use life-sized 3D representations of the remote participants, and support eye contact and gesture-based interaction only in very limited ways. One of the many challenges in this project is to compute depth information for many different views in order to synthesise novel views that provide eye contact. In this paper, we present a multi-baseline disparity fusion scheme for improved real-time disparity map estimation. The advantages and disadvantages of different camera configurations are discussed, and theoretical considerations regarding disparity resolution and baseline are presented. Together with experimental investigations, these observations lead to a multi-baseline configuration that takes advantage of small- and wide-baseline stereo cameras as well as trifocal camera configurations.
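The abstract's reference to "theoretical considerations regarding disparity resolution and baseline" points at the standard rectified-stereo relation: depth Z = f·B/d, so a fixed disparity error translates into a depth error that grows roughly as Z²/(f·B) and shrinks as the baseline B widens. The following is a minimal illustrative sketch of that trade-off, not code from the paper; the focal length, working distance, and baseline values are assumptions chosen only to show the effect.

```python
# Illustrative sketch (not from the paper): depth resolution vs. baseline
# for a rectified stereo pair. With Z = f*B/d, a disparity error of
# disp_err pixels maps to a depth error of roughly Z^2 / (f*B) * disp_err.

def depth_from_disparity(f_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth (metres) corresponding to a disparity (pixels)."""
    return f_px * baseline_m / disparity_px

def depth_error(f_px: float, baseline_m: float, depth_m: float,
                disp_err_px: float = 1.0) -> float:
    """Approximate depth error caused by a disparity error of disp_err_px."""
    return depth_m ** 2 / (f_px * baseline_m) * disp_err_px

if __name__ == "__main__":
    f_px = 1000.0     # assumed focal length in pixels
    depth_m = 2.0     # assumed distance of a participant from the cameras
    for name, baseline_m in [("small baseline", 0.05), ("wide baseline", 0.30)]:
        disparity = f_px * baseline_m / depth_m
        err = depth_error(f_px, baseline_m, depth_m)
        print(f"{name}: disparity ~ {disparity:.1f} px, "
              f"depth error per pixel ~ {err * 100:.1f} cm")
```

The sketch shows why no single baseline is ideal: a wide baseline yields finer depth resolution but larger disparities and more occlusions, while a small baseline makes matching easier at the cost of coarser depth. This is consistent with the abstract's rationale for fusing small- and wide-baseline stereo pairs and trifocal configurations.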