Title: DustNet++: Deep Learning-Based Visual Regression for Dust Density Estimation
Authors: Michel, Andreas; Weinmann, Martin; Küster, Jannick; AlNasser, Faisal; Gomez, Tomas; Falvey, Mark; Schmitz, Rainer; Hinz, Stefan; Middelmann, Wolfgang
Type: Journal article
Date: 2025-04-08; Year: 2025
DOI: 10.1007/s11263-025-02376-9
Handle: https://publica.fraunhofer.de/handle/publica/486303
Scopus ID: 2-s2.0-85218699627
Language: English
Keywords: Airborne dust detection; Attention; Machine learning; Visual regression

Abstract: Detecting airborne dust in standard RGB images presents significant challenges. Nevertheless, the monitoring of airborne dust holds substantial potential benefits for climate protection, environmentally sustainable construction, scientific research, and various other fields. To develop an efficient and robust algorithm for airborne dust monitoring, several hurdles have to be addressed. Airborne dust can be opaque or translucent, exhibit considerable variation in density, and possess indistinct boundaries. Moreover, distinguishing dust from other atmospheric phenomena, such as fog or clouds, can be particularly challenging. To meet the demand for a high-performing and reliable method for monitoring airborne dust, we introduce DustNet++, a neural network designed for dust density estimation. DustNet++ leverages feature maps from multiple resolution scales and semantic levels through window and grid attention mechanisms to maintain a sparse, globally effective receptive field with linear complexity. To validate our approach, we benchmark the performance of DustNet++ against existing methods from the domains of crowd counting and monocular depth estimation using the Meteodata airborne dust dataset and the URDE binary dust segmentation dataset. Our findings demonstrate that DustNet++ surpasses comparative methodologies in terms of regression and localization capabilities.
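The window and grid attention mechanisms mentioned in the abstract can be illustrated with a minimal partitioning sketch, assuming a MaxViT-style scheme: window (block) attention groups spatially adjacent tokens, while grid attention groups tokens strided across the whole feature map, yielding a sparse global receptive field at linear cost. This is an illustrative sketch under that assumption, not the authors' implementation; all function names are hypothetical, and the attention computation itself is omitted.

```python
import numpy as np

def window_partition(x, w):
    """Split an (H, W, C) map into (H*W // w**2, w*w, C) local windows.

    Each group holds one contiguous w x w patch, so attention within a
    group is purely local.
    """
    H, W, C = x.shape
    x = x.reshape(H // w, w, W // w, w, C)
    return x.transpose(0, 2, 1, 3, 4).reshape(-1, w * w, C)

def grid_partition(x, g):
    """Split an (H, W, C) map into (H*W // g**2, g*g, C) dilated grids.

    Each group gathers one token from every (H//g) x (W//g) stride, so
    attention within a group spans the whole map sparsely.
    """
    H, W, C = x.shape
    x = x.reshape(g, H // g, g, W // g, C)
    return x.transpose(1, 3, 0, 2, 4).reshape(-1, g * g, C)

# Toy 8x8 single-channel feature map with value = row*8 + col.
feat = np.arange(8 * 8, dtype=np.float32).reshape(8, 8, 1)
windows = window_partition(feat, 4)  # 4 local windows of 16 tokens each
grids = grid_partition(feat, 4)      # 4 global groups of 16 tokens each
print(windows.shape, grids.shape)    # (4, 16, 1) (4, 16, 1)
```

Because every attention call operates on a fixed-size group of tokens (here 16), the total cost grows linearly with the number of pixels rather than quadratically, which is the property the abstract refers to.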