2023
Conference Paper
Title

Towards Linguistically Informed Multi-objective Transformer Pre-training for Natural Language Inference

Abstract
We introduce a linguistically enhanced combination of pre-training methods for transformers. The pre-training objectives include POS-tagging, synset prediction based on semantic knowledge graphs, and parent prediction based on dependency parse trees. Our approach achieves competitive results on the Natural Language Inference task compared to the state of the art. Specifically for smaller models, the method results in a significant performance boost, emphasizing that intelligent pre-training can make up for fewer parameters and help build more efficient models. Combining POS-tagging and synset prediction yields the overall best results.
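The abstract describes combining several token-level linguistic objectives during pre-training. As a minimal illustrative sketch (not the authors' implementation), the snippet below shows one way such a multi-objective setup could look in PyTorch: separate classification heads for POS tags, synsets, and dependency parent indices on top of a generic transformer encoder, with their losses summed. The head names, the unweighted loss sum, and the label encoding are assumptions for illustration only; the paper's actual architecture and objective weighting may differ.

```python
# Illustrative sketch only, assuming pre-computed linguistic labels
# (POS tags, synset ids, dependency head indices) aligned to tokens.
import torch
import torch.nn as nn


class MultiObjectiveHeads(nn.Module):
    def __init__(self, hidden_size, n_pos_tags, n_synsets, max_len):
        super().__init__()
        self.pos_head = nn.Linear(hidden_size, n_pos_tags)    # POS-tagging objective
        self.synset_head = nn.Linear(hidden_size, n_synsets)  # synset prediction objective
        self.parent_head = nn.Linear(hidden_size, max_len)    # dependency parent index objective
        self.loss = nn.CrossEntropyLoss(ignore_index=-100)    # -100 marks padded positions

    def forward(self, hidden_states, pos_labels, synset_labels, parent_labels):
        # hidden_states: (batch, seq_len, hidden_size) from any transformer encoder.
        # CrossEntropyLoss expects (batch, classes, seq_len), hence the transpose.
        pos_loss = self.loss(self.pos_head(hidden_states).transpose(1, 2), pos_labels)
        syn_loss = self.loss(self.synset_head(hidden_states).transpose(1, 2), synset_labels)
        dep_loss = self.loss(self.parent_head(hidden_states).transpose(1, 2), parent_labels)
        # Unweighted sum of the three objectives; a real setup might weight them.
        return pos_loss + syn_loss + dep_loss
```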
Author(s)
Pielka, Maren  
Fraunhofer-Institut für Intelligente Analyse- und Informationssysteme IAIS  
Schmidt, Svetlana
Fraunhofer-Institut für Intelligente Analyse- und Informationssysteme IAIS  
Pucknat, Lisa
Fraunhofer-Institut für Intelligente Analyse- und Informationssysteme IAIS  
Sifa, Rafet  
Fraunhofer-Institut für Intelligente Analyse- und Informationssysteme IAIS  
Mainwork
Advances in Information Retrieval. 45th European Conference on Information Retrieval, ECIR 2023. Proceedings. Pt.II  
Conference
European Conference on Information Retrieval 2023  
DOI
10.1007/978-3-031-28238-6_46
Language
English