2022
Conference Paper
Title
Assessing the Impact of Transformations on Physical Adversarial Attacks
Abstract
The decisions of neural networks are easily shifted at an attacker's will by so-called adversarial attacks. Initially only successful when applied directly to the input, recent advances allow attacks to breach the digital realm, leading to over-the-air physical adversarial attacks. During training, some physical phenomena are simulated through equivalent transformations to increase the attack's success. In our work, we evaluate the impact of the selected transformations on the performance of physical adversarial attacks. We quantify their performance across diverse attack scenarios, e.g., multiple distances and angles. Our evaluation shows that some transformations are indeed essential for successful attacks, no matter the target class. These also appear to be responsible for creating shapes within the attacks that are semantically related to the target class. However, they alone do not ensure physical robustness. The choice of the remaining transformations appears to be context-dependent, e.g., some are more advantageous for long-range attacks but not for close-range ones. With our findings, we not only provide useful information on generating physical adversarial attacks, but also help research on defenses understand their weaknesses.
CCS Concepts
• Security and privacy → Software and application security
• Computing methodologies → Neural networks
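To make the transformation-based training the abstract refers to concrete, the following is a minimal, hypothetical sketch in the spirit of Expectation over Transformation: each optimization step samples random image transformations that stand in for physical phenomena (viewing angle, distance, lighting) before the adversarial patch is applied. The function name `eot_patch_step`, the specific transformations, and the fixed patch placement are illustrative assumptions, not the paper's implementation.

```python
# Hypothetical sketch of transformation-based ("EOT-style") patch training,
# NOT the paper's implementation: each step samples random transformations
# that approximate physical phenomena before the patch is applied.
import torch
import torch.nn.functional as F
import torchvision.transforms as T

def eot_patch_step(model, patch, images, target_class, optimizer):
    """One optimization step of a targeted adversarial patch under randomly
    sampled transformations. All names here are illustrative assumptions."""
    h, w = images.shape[-2:]
    ph, pw = patch.shape[-2:]

    # Simulate physical conditions with differentiable random transformations.
    simulate_physics = T.Compose([
        T.RandomRotation(degrees=20),                          # viewing angle
        T.RandomResizedCrop(size=(ph, pw), scale=(0.5, 1.0)),  # distance / scale
        T.ColorJitter(brightness=0.3, contrast=0.3),           # lighting
    ])
    transformed = simulate_physics(patch)

    # Paste the transformed patch into the top-left corner of every image
    # (a real attack would also randomize the placement).
    pad = (0, w - pw, 0, h - ph)                   # (left, right, top, bottom)
    canvas = F.pad(transformed, pad)
    mask = F.pad(torch.ones_like(transformed), pad)
    patched = images * (1 - mask) + canvas * mask

    # Targeted attack: maximize the probability of the attacker's class.
    logits = model(patched)
    target = torch.full((images.size(0),), target_class,
                        dtype=torch.long, device=logits.device)
    loss = F.cross_entropy(logits, target)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    with torch.no_grad():
        patch.clamp_(0.0, 1.0)                     # keep the patch a valid image
    return loss.item()
```

In this sketch, the composition of transformations is the knob the paper studies: adding, removing, or re-parameterizing entries in `simulate_physics` corresponds to choosing which physical effects the attack is trained to survive.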
Author(s)