2025
Poster
Title

Is data-efficient learning feasible with quantum models?

Title Supplement
Poster presented at the 9th International Conference on Quantum Techniques in Machine Learning, QTML 2025, Singapore, Singapore, November 16-21, 2025; also presented at the Munich Quantum Valley (MQV) Annual Meeting, September 30, 2025, Munich
Abstract
The importance of analyzing nontrivial datasets when testing quantum machine learning (QML) models is becoming increasingly prominent in the literature, yet a cohesive framework for understanding dataset characteristics remains elusive. In this work, we concentrate on the size of the dataset as an indicator of its complexity and explore the potential for QML models to demonstrate superior data efficiency compared to classical models, particularly through the lens of quantum kernel methods (QKMs). We provide a method for generating semi-artificial, fully classical datasets, on which we show some of the first evidence of classical datasets where QKMs require less data during training. Additionally, our study introduces a new analytical tool to the QML domain, derived for classical kernel methods, which can be used to investigate the classical-quantum gap. Our empirical results reveal that QKMs can achieve low error rates with less training data than their classical counterparts. Furthermore, our method allows for the generation of datasets with varying properties, facilitating further investigation into the characteristics of real-world datasets that may be particularly advantageous for QKMs. We also show that the performance predicted by the analytical tool we propose - a generalization metric from the classical domain - aligns closely with the empirical evidence, filling a gap that previously existed in the field. We pave the way for a comprehensive exploration of dataset complexities, providing insights into how these complexities influence QML performance relative to traditional methods. This research contributes to a deeper understanding of the generalization benefits of QKM models, and potentially of a broader family of QML models, setting the stage for future advancements in the field.
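For readers unfamiliar with quantum kernel methods, the sketch below illustrates the general idea in PennyLane: a fidelity-based quantum kernel evaluated on classical feature vectors and plugged into a classical support vector machine via a precomputed Gram matrix. This is an illustrative example only, not the setup used in the poster; the embedding, qubit count, and toy data are placeholder assumptions.

```python
# Illustrative sketch only (not from the poster): a fidelity-based quantum
# kernel in PennyLane combined with a classical SVM. The embedding, qubit
# count, and toy data below are placeholder assumptions.
import numpy as np
import pennylane as qml
from sklearn.svm import SVC

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def kernel_circuit(x1, x2):
    # Encode x1, then apply the adjoint encoding of x2; the probability of
    # the all-zeros outcome equals the fidelity |<phi(x2)|phi(x1)>|^2.
    qml.AngleEmbedding(x1, wires=range(n_qubits))
    qml.adjoint(qml.AngleEmbedding)(x2, wires=range(n_qubits))
    return qml.probs(wires=range(n_qubits))

def quantum_kernel(A, B):
    # Gram matrix between two sets of samples.
    return np.array([[kernel_circuit(a, b)[0] for b in B] for a in A])

# Hypothetical toy data standing in for a (semi-artificial) classical dataset.
rng = np.random.default_rng(0)
X_train = rng.uniform(0, np.pi, size=(16, n_qubits))
y_train = np.sign(X_train.sum(axis=1) - X_train.sum(axis=1).mean())

K_train = quantum_kernel(X_train, X_train)
clf = SVC(kernel="precomputed").fit(K_train, y_train)
```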
Author(s)
Sakhnenko, Alona
Fraunhofer-Institut für Kognitive Systeme IKS  
Mendl, Christian B.
Technische Universität München  
Lorenz, Jeanette Miriam
Fraunhofer-Institut für Kognitive Systeme IKS  
Project(s)
QACI-K7
Funder
Bayerisches Staatsministerium für Wirtschaft, Landesentwicklung und Energie  
Conference
International Conference on Quantum Techniques in Machine Learning 2025  
File(s)
Download (503.88 KB)
Rights
Use according to copyright law
DOI
10.24406/publica-6648
Language
English
Institute
Fraunhofer-Institut für Kognitive Systeme IKS
Fraunhofer Group
Fraunhofer-Verbund IUK-Technologie  
Keyword(s)
  • quantum machine learning
  • QML
  • quantum kernel method
  • data-efficient learning
  • data-efficiency
  • QKM
  • generalization metric
  • QML model