2025
Journal Article
Title

GrapeSLAM: UAV-based monocular visual dataset for SLAM, SfM and 3D reconstruction with trajectories under challenging illumination conditions

Abstract
SLAM (Simultaneous Localization and Mapping) is an efficient method for robots to perceive their surroundings and make decisions, particularly in agricultural scenarios. Automated perception and path planning are crucial for precision agriculture. However, few public datasets are available for implementing and developing robotic algorithms in agricultural environments. We therefore collected the "GrapeSLAM" dataset, which comprises video data recorded in vineyards to support agricultural robotics research. Data collection involved two primary methods: (1) an unmanned aerial vehicle (UAV) capturing videos under different illumination conditions, and (2) UAV trajectories recorded during each flight by RTK and IMU. The UAV used was a Phantom 4 RTK equipped with a high-resolution camera, flying at around 1 to 3 meters above ground level.
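
The record does not specify the dataset's file layout, so the following Python snippet is an illustrative sketch only: it shows how frames from one of the vineyard videos might be sampled and ORB features extracted as a typical front end for monocular visual SLAM or SfM, alongside parsing a hypothetical RTK/IMU trajectory log. The file names (grapeslam_flight01.mp4, grapeslam_flight01_rtk.csv) and CSV columns are assumptions, not taken from the published dataset.

    # Illustrative sketch only: file names, CSV columns, and the frame stride
    # are assumptions, not part of the published GrapeSLAM record.
    import csv

    import cv2  # OpenCV for video decoding and ORB feature detection

    VIDEO_PATH = "grapeslam_flight01.mp4"           # hypothetical video file
    TRAJECTORY_PATH = "grapeslam_flight01_rtk.csv"  # hypothetical RTK/IMU log
    FRAME_STRIDE = 10                               # keep every 10th frame


    def sample_frames(video_path, stride):
        """Yield (frame_index, grayscale_frame) pairs at the given stride."""
        cap = cv2.VideoCapture(video_path)
        index = 0
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            if index % stride == 0:
                yield index, cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            index += 1
        cap.release()


    def load_trajectory(csv_path):
        """Read an assumed RTK/IMU log with timestamp, lat, lon, alt columns."""
        with open(csv_path, newline="") as f:
            return [
                {
                    "timestamp": float(row["timestamp"]),
                    "lat": float(row["lat"]),
                    "lon": float(row["lon"]),
                    "alt": float(row["alt"]),
                }
                for row in csv.DictReader(f)
            ]


    if __name__ == "__main__":
        orb = cv2.ORB_create(nfeatures=2000)  # ORB is a common SLAM/SfM front end
        for idx, gray in sample_frames(VIDEO_PATH, FRAME_STRIDE):
            keypoints, _descriptors = orb.detectAndCompute(gray, None)
            print(f"frame {idx}: {len(keypoints)} ORB keypoints")

        trajectory = load_trajectory(TRAJECTORY_PATH)
        print(f"loaded {len(trajectory)} RTK/IMU trajectory samples")

The RTK/GNSS trajectory can then serve as a georeferenced ground-truth or initialization signal against which SLAM or SfM estimates are compared.
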
Author(s)
Wang, Kaiwen (Wageningen University & Research)
Vélez, Sergio (Fraunhofer-Institut für Solare Energiesysteme ISE)
Kooistra, Lammert (Wageningen University & Research)
Wang, Wensheng (Chinese Academy of Agricultural Sciences)
Valente, João (Consejo Superior de Investigaciones Científicas)
Journal
Data in Brief  
DOI
10.1016/j.dib.2025.111495
Language
English
Fraunhofer Institute
Fraunhofer-Institut für Solare Energiesysteme ISE
Keyword(s)
  • Precision agriculture
  • RTK GNSS data
  • UAV
  • Visual SLAM
  • Woody crop
