November 26, 2024
Conference Paper
Title
A Novel Approach for Remote Rendering and Streaming in XR
Abstract
This paper introduces and compares two approaches for remotely rendering 3D environments in Extended Reality (XR) applications. The first approach uses a single 360-degree video stream, effectively eliminating motion-to-photon latency during head rotation and enabling scalable deployment across various Head-Mounted Displays (HMDs) for non-interactive XR environments. However, this approach compromises display quality and reduces 3D immersion. In contrast, the second approach employs two virtual cameras to generate stereo video, leading to better bandwidth efficiency, higher image fidelity, and enhanced 3D immersion. Yet it introduces greater implementation complexity, requires HMD-specific camera configurations, and may increase motion-to-photon latency. Implementation specifics are detailed for 3D environments created in Unity and apply similarly to Unreal Engine. Both rendering pipelines are integrated into a game engine-agnostic remote rendering framework based on WebRTC. Both approaches are evaluated using quantitative metrics and qualitative assessments to determine their efficacy.
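To illustrate the dual-camera approach described above, the following is a minimal Unity (C#) sketch of a stereo capture rig: two virtual cameras, offset by an eye separation, each rendering into a RenderTexture from which a streaming layer (for example, a WebRTC video track) could read. The component name StereoRigCapture, the eye separation value, and the per-eye resolution are illustrative assumptions, not details from the paper.

```csharp
using UnityEngine;

// Hypothetical sketch of a two-camera stereo capture rig.
// Names and values are illustrative assumptions, not the paper's implementation.
public class StereoRigCapture : MonoBehaviour
{
    [SerializeField] private float eyeSeparation = 0.063f; // assumed IPD in metres; HMD-specific in practice
    [SerializeField] private int eyeWidth = 1832;          // assumed per-eye resolution
    [SerializeField] private int eyeHeight = 1920;

    public RenderTexture LeftEyeTexture { get; private set; }
    public RenderTexture RightEyeTexture { get; private set; }

    private void Awake()
    {
        // One render target per eye; the streaming layer reads from these textures.
        LeftEyeTexture  = new RenderTexture(eyeWidth, eyeHeight, 24);
        RightEyeTexture = new RenderTexture(eyeWidth, eyeHeight, 24);

        CreateEyeCamera("LeftEye",  -eyeSeparation * 0.5f, LeftEyeTexture);
        CreateEyeCamera("RightEye",  eyeSeparation * 0.5f, RightEyeTexture);
    }

    private Camera CreateEyeCamera(string name, float xOffset, RenderTexture target)
    {
        // Child camera offset horizontally from the rig's origin to form one eye of the stereo pair.
        var go = new GameObject(name);
        go.transform.SetParent(transform, false);
        go.transform.localPosition = new Vector3(xOffset, 0f, 0f);

        var cam = go.AddComponent<Camera>();
        cam.targetTexture = target;
        return cam;
    }
}
```

In such a setup, the eye separation and per-eye resolution would need to be configured per HMD, which reflects the HMD-specific camera configuration noted in the abstract.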
Author(s)