Combining Variational Autoencoders and Transformer Language Models for Improved Password Generation

Authors: Biesner, David; Cvejoski, Kostadin; Sifa, Rafet
Type: conference paper
Language: English
License: CC BY 4.0
Issued: 2022-08-23; available: 2022-12-13
Handle: https://publica.fraunhofer.de/handle/publica/429936
DOI: 10.1145/3538969.3539000; 10.24406/publica-632 (https://doi.org/10.24406/publica-632)
Scopus ID: 2-s2.0-85136927710
Keywords: language models; latent variable models; neural networks; passwords; text generation; transformers

Abstract: Password generation techniques have recently been explored using deep-learning natural language processing (NLP) algorithms. Previous work has raised the state of the art for password guessing significantly by approaching the problem with either variational autoencoders built on CNN-based encoder and decoder architectures, or transformer-based architectures (namely GPT-2) for text generation. In this work we aim to combine both paradigms, introducing a novel architecture that leverages the expressive power of transformers together with the natural sampling approach to text generation of variational autoencoders. We show that our architecture achieves state-of-the-art password-matching performance across multiple benchmark datasets.
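
To make the abstract's high-level description concrete, below is a minimal sketch of one way a transformer-based VAE for character-level password modeling could look. This is an illustration under assumptions, not the authors' published architecture: the class names, hyperparameters, mean-pooled latent bottleneck, and latent-as-memory decoding are all hypothetical choices built from standard PyTorch transformer modules.

```python
# Hypothetical sketch of a character-level transformer VAE for passwords.
# Not the paper's architecture; all names and hyperparameters are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TransformerVAE(nn.Module):
    def __init__(self, vocab_size=100, d_model=128, nhead=4,
                 num_layers=2, latent_dim=64, max_len=16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.pos = nn.Parameter(torch.zeros(1, max_len, d_model))  # learned positions
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers)
        self.to_mu = nn.Linear(d_model, latent_dim)
        self.to_logvar = nn.Linear(d_model, latent_dim)
        self.from_z = nn.Linear(latent_dim, d_model)
        dec_layer = nn.TransformerDecoderLayer(d_model, nhead, batch_first=True)
        self.decoder = nn.TransformerDecoder(dec_layer, num_layers)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, tokens):
        # tokens: (batch, seq_len) integer character ids
        x = self.embed(tokens) + self.pos[:, :tokens.size(1)]
        h = self.encoder(x).mean(dim=1)                 # pool sequence to one vector
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterize
        memory = self.from_z(z).unsqueeze(1)            # latent code as decoder memory
        L = tokens.size(1)
        mask = torch.triu(torch.full((L, L), float("-inf")), diagonal=1)  # causal mask
        dec = self.decoder(x, memory, tgt_mask=mask)    # teacher-forced decoding
        return self.out(dec), mu, logvar

def vae_loss(logits, targets, mu, logvar, beta=1.0):
    # Next-character reconstruction (shifted by one) plus KL regularizer.
    rec = F.cross_entropy(logits[:, :-1].reshape(-1, logits.size(-1)),
                          targets[:, 1:].reshape(-1))
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return rec + beta * kl

# Toy usage: one training step on a random batch of encoded passwords.
model = TransformerVAE()
tokens = torch.randint(0, 100, (32, 16))
logits, mu, logvar = model(tokens)
vae_loss(logits, tokens, mu, logvar).backward()
```

At generation time such a model would sample z from the standard normal prior and decode characters autoregressively from it; that latent-sampling step is the VAE property the abstract contrasts with purely autoregressive GPT-2-style generation.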