November 13, 2023
Paper (Preprint, Research Paper, Review Paper, White Paper, etc.)
Title
Transferrability of Adversarial Attacks from Convolutional Neural Networks to ChatGPT4
Title Supplement
Preprint
Abstract
This research evaluates the ability of adversarial attacks, primarily designed for CNN-based classifiers, to target the multimodal image captioning tasks executed by large multimodal models (LMMs), such as ChatGPT4. The study included several attacks, with a particular emphasis on the Projected Gradient Descent (PGD) attack, considering various parameters, surrogate models, and datasets. Initial but limited experiments support the hypothesis that PGD attacks are transferable to ChatGPT. Subsequently, results demonstrated that PGD attacks could be adaptively transferred to disrupt the normal functioning of ChatGPT. On the other hand, other adversarial attack strategies showed a limited ability to compromise ChatGPT. These findings provide insights into the security vulnerabilities of emerging neural network architectures used for generative AI. Moreover, they underscore the possibility of cost-effectively crafting adversarial examples against novel architectures, necessitating the development of robust defense mechanisms for LMMs in practical applications.
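The abstract describes crafting adversarial examples with PGD on a CNN surrogate and then checking whether they transfer to the image-captioning behaviour of an LMM such as ChatGPT4. The following is a minimal sketch of an L-infinity PGD attack against a surrogate classifier, assuming PyTorch; the ResNet-50 surrogate and the eps/alpha/steps values are illustrative assumptions, not the paper's reported settings.

import torch
import torchvision.models as models

# Sketch only (not the paper's code): craft an L-infinity PGD adversarial
# example on a CNN surrogate; the perturbed image would then be uploaded to
# an LMM such as ChatGPT4 to test whether the attack transfers.

def pgd_attack(model, image, label, eps=8 / 255, alpha=2 / 255, steps=10):
    """Projected Gradient Descent under an L-infinity budget of eps."""
    loss_fn = torch.nn.CrossEntropyLoss()
    adv = image.clone().detach()
    adv = (adv + torch.empty_like(adv).uniform_(-eps, eps)).clamp(0, 1)  # random start

    for _ in range(steps):
        adv.requires_grad_(True)
        loss = loss_fn(model(adv), label)
        grad = torch.autograd.grad(loss, adv)[0]
        with torch.no_grad():
            adv = adv + alpha * grad.sign()               # gradient ascent step
            adv = image + (adv - image).clamp(-eps, eps)  # project back into the eps-ball
            adv = adv.clamp(0, 1)                         # keep a valid image
    return adv.detach()

# Usage sketch with placeholder inputs (values are assumptions).
surrogate = models.resnet50(weights=models.ResNet50_Weights.DEFAULT).eval()
x = torch.rand(1, 3, 224, 224)   # placeholder image tensor in [0, 1]
y = torch.tensor([207])          # placeholder ImageNet class index
x_adv = pgd_attack(surrogate, x, y)
# x_adv would then be saved as an image and submitted to the LMM's captioning interface.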
Author(s)
Bunzel, Niklas  
Fraunhofer-Institut für Sichere Informationstechnologie SIT  
Open Access
File(s)
Download (5.03 MB)
Rights
CC BY 4.0: Creative Commons Attribution
DOI
10.24406/publica-2166
Language
English
Keyword(s)
  • Adversarial Attack
  • ChatGPT4
  • Multimodal Models
  • CNNs
  • Transferred Black Box Attack