Title: Zero-Shot Text Matching for Automated Auditing using Sentence Transformers
Authors: Biesner, David; Pielka, Maren; Ramamurthy, Rajkumar; Dilmaghani, Tim; Kliem, Bernd; Loitz, Rüdiger; Sifa, Rafet
Type: Conference paper
Year: 2022
Added to repository: 2023-06-06
Handle: https://publica.fraunhofer.de/handle/publica/442560
DOI: 10.1109/ICMLA55696.2022.00251
Scopus ID: 2-s2.0-85152214393
Language: en
Keywords: NLP; Transfer Learning; BERT; Text Classification

Abstract: Natural language processing methods have several applications in automated auditing, including document and passage classification, information retrieval, and question answering. However, training such models requires large amounts of annotated data, which are scarce in industrial settings. At the same time, techniques such as zero-shot and unsupervised learning allow models pre-trained on general-domain data to be applied to unseen domains. In this work, we study the efficiency of unsupervised text matching using Sentence-BERT, a transformer-based model, by applying it to the semantic similarity of financial passages. Experimental results show that the model is robust to documents from both in- and out-of-domain data.
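The abstract describes unsupervised, zero-shot text matching: passages are embedded with a pre-trained encoder (Sentence-BERT in the paper) and ranked by semantic similarity, with no task-specific training. The following is a minimal sketch of that matching step, not the authors' code; the `embed` function here is a deliberately simple bag-of-words stand-in for a real sentence encoder such as `SentenceTransformer.encode` from the `sentence-transformers` library.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Stand-in for a real sentence encoder (e.g. Sentence-BERT):
    # a sparse bag-of-words vector keyed by lowercased tokens.
    return Counter(text.lower().split())

def cosine(u: Counter, v: Counter) -> float:
    # Cosine similarity between two sparse vectors.
    dot = sum(u[t] * v[t] for t in u)
    norm = (math.sqrt(sum(c * c for c in u.values()))
            * math.sqrt(sum(c * c for c in v.values())))
    return dot / norm if norm else 0.0

def match(query: str, passages: list[str]) -> list[tuple[str, float]]:
    # Zero-shot matching: rank candidate passages by similarity to the
    # query using only the fixed, pre-trained embedding function.
    q = embed(query)
    return sorted(((p, cosine(q, embed(p))) for p in passages),
                  key=lambda pair: pair[1], reverse=True)
```

With a dense neural encoder the `embed` and `cosine` pieces change, but the matching logic stays the same: no labels from the target (e.g. financial) domain are needed, which is what makes the approach attractive when annotated data is scarce.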