Date
May 2024
Document Type
Conference Paper
Title

ILLUMINER: Instruction-tuned Large Language Models as Few-shot Intent Classifier and Slot Filler

Abstract
State-of-the-art intent classification (IC) and slot filling (SF) methods often rely on data-intensive deep learning models, limiting their practicality for industry applications. Large language models, on the other hand, particularly instruction-tuned models (Instruct-LLMs), exhibit remarkable zero-shot performance across various natural language tasks. This study evaluates Instruct-LLMs on popular benchmark datasets for IC and SF, emphasizing their capacity to learn from fewer examples. We introduce ILLUMINER, an approach that frames IC and SF as language generation tasks for Instruct-LLMs, with a more efficient SF-prompting method than prior work. A comprehensive comparison with multiple baselines shows that our approach, using the FLAN-T5 11B model, outperforms the state-of-the-art joint IC+SF method and in-context learning with GPT-3.5 (175B), particularly in slot filling, by 11.1-32.2 percentage points. Additionally, our in-depth ablation study demonstrates that parameter-efficient fine-tuning requires less than 6% of the training data to yield performance comparable to traditional full-weight fine-tuning.
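The abstract describes framing intent classification and slot filling as language generation tasks for an instruction-tuned model. Below is a minimal sketch of that idea, assuming a Hugging Face FLAN-T5 checkpoint; the prompt wording, intent labels, and slot names are illustrative assumptions, not the prompts or datasets used in the paper.

```python
# Illustrative sketch: intent classification (IC) and slot filling (SF)
# posed as text generation with an instruction-tuned seq2seq model.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# The paper reports results with FLAN-T5 11B; a small checkpoint is used
# here only to keep the example lightweight.
model_name = "google/flan-t5-base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

utterance = "book a table for two at an italian restaurant tomorrow"

# IC as generation: ask the model to choose one label from a candidate set.
ic_prompt = (
    "Classify the intent of the utterance. "
    "Possible intents: book_restaurant, play_music, get_weather.\n"
    f"Utterance: {utterance}\nIntent:"
)

# SF as generation: ask for the value of one slot type, or 'none'.
sf_prompt = (
    "Extract the value of the slot 'cuisine' from the utterance, "
    "or answer 'none' if it is not mentioned.\n"
    f"Utterance: {utterance}\nCuisine:"
)

for prompt in (ic_prompt, sf_prompt):
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=16)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```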
Author(s)
Mirza, Paramita
Sudhi, Viju (Fraunhofer-Institut für Intelligente Analyse- und Informationssysteme IAIS)
Sahoo, Soumya Ranjan (Fraunhofer-Institut für Intelligente Analyse- und Informationssysteme IAIS)
Bhat, Sinchana Ramakanth (Fraunhofer-Institut für Intelligente Analyse- und Informationssysteme IAIS)
Published in
The 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation, LREC-COLING 2024. Main Conference Proceedings
Conference
International Conference on Computational Linguistics 2024  
International Conference on Language Resources and Evaluation 2024  
Language
English
Institute
Fraunhofer-Institut für Intelligente Analyse- und Informationssysteme IAIS
Keyword(s)
  • instruction-tuned models
  • intent classification
  • parameter-efficient fine-tuning
  • slot filling
  • Computational linguistics
  • Deep learning
  • Integrated circuits
  • Learning systems
  • Zero-shot learning
  • Data intensive
  • Filling methods
  • Fine tuning
  • Instruction-tuned model
  • Intent classification
  • Language model
  • Parameter-efficient fine-tuning
  • Slot filling