2025
Conference Paper
Title
Multimodal multiview photogrammetry system for on-site 3D digitization of low-textured objects
Abstract
Traditional multi-view photogrammetry (MVP) faces significant challenges in reconstructing low-textured objects, often resulting in artifacts or incomplete 3D models. Our work introduces a novel multimodal MVP approach that combines near-infrared (NIR) pattern projection with synchronized RGB-NIR imaging to overcome these limitations. By projecting dense, invisible NIR patterns (830–850 nm) onto low-textured surfaces, our method enables robust feature detection while preserving the object's original color texture. A handheld dual-camera rig, comprising an RGB camera and an NIR-sensitive camera, captures both modalities synchronously, allowing dynamic, real-world scanning without surface preparation. Key innovations include a pre-calibrated camera setup for pose alignment and a customized photogrammetric workflow that fuses NIR-derived geometry with RGB texture data. The use of compact diffractive optical element (DOE) NIR projectors ensures scalability for diverse applications, from e-commerce to predictive maintenance. Experiments demonstrate that our method increases tie point density by up to 60 times compared to classic RGB MVP, reducing artifacts and achieving sub-0.2 mm accuracy against structured-light reference models. Practical validation on industrial and cultural heritage objects (e.g., a high-pressure turbine) confirms our system's ability to digitize complex, low-textured surfaces while maintaining portability and cost-efficiency. Our work advances handheld, dynamic MVP systems by addressing critical gaps in handling uncooperative surfaces, offering a non-invasive solution for high-fidelity digital twins. Our results underscore the potential of multimodal sensing to enhance 3D reconstruction accuracy and realism, paving the way for broader adoption in industrial and heritage preservation contexts.
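The texture-fusion step summarized in the abstract — coloring NIR-derived geometry with RGB data via pre-calibrated cameras — can be sketched as follows. This is a minimal illustration under standard assumptions (pinhole camera model, known RGB intrinsics `K`, and a pre-calibrated rigid transform `R, t` from the NIR to the RGB camera frame); all function names are hypothetical and not from the paper.

```python
import numpy as np

def project_points(points_nir, R, t, K):
    """Project 3D points reconstructed in the NIR camera frame into
    the RGB image, using pre-calibrated extrinsics (R, t) and RGB
    intrinsics K. Returns an (N, 2) array of pixel coordinates."""
    cam = points_nir @ R.T + t       # NIR frame -> RGB camera frame
    uv = cam @ K.T                   # apply pinhole intrinsics
    return uv[:, :2] / uv[:, 2:3]    # perspective divide

def sample_texture(image, uv):
    """Nearest-neighbour color lookup in the RGB image; points that
    project outside the image receive a zero color."""
    h, w = image.shape[:2]
    px = np.round(uv).astype(int)
    valid = ((px[:, 0] >= 0) & (px[:, 0] < w) &
             (px[:, 1] >= 0) & (px[:, 1] < h))
    colors = np.zeros((len(uv), image.shape[2]), dtype=image.dtype)
    colors[valid] = image[px[valid, 1], px[valid, 0]]
    return colors
```

In a full pipeline the geometry would come from NIR-image tie points via bundle adjustment and dense matching; this sketch only shows why a rigid pre-calibration between the two cameras is enough to transfer RGB color onto NIR-derived geometry without any per-scan alignment.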