August 23, 2022
Conference Paper
Title
Combining Variational Autoencoders and Transformer Language Models for Improved Password Generation
Abstract
Password generation has recently been explored using deep-learning natural language processing (NLP) techniques. Previous work significantly raised the state of the art for password guessing by approaching the problem with either variational autoencoders built on CNN-based encoder and decoder architectures or transformer-based architectures (namely GPT-2) for text generation. In this work we combine both paradigms, introducing a novel architecture that pairs the expressive power of transformers with the natural sampling-based approach to text generation of variational autoencoders. We show that our architecture achieves state-of-the-art password-matching performance across multiple benchmark datasets.
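The abstract's core idea is a VAE whose encoder and decoder are transformers, so that passwords can be generated by sampling latent vectors rather than only by autoregressive prompting. The sketch below illustrates what such a transformer-based VAE could look like in PyTorch; every module name, dimension, and design choice (mean-pooling the encoder states, feeding the latent vector to the decoder as a one-token memory) is an assumption for illustration, not the authors' actual implementation.

```python
# Minimal illustrative sketch of a transformer-based VAE for password
# generation. All names, sizes, and design choices are assumptions for
# illustration; this is not the paper's implementation.
import torch
import torch.nn as nn


class TransformerVAE(nn.Module):
    def __init__(self, vocab_size, d_model=256, latent_dim=64,
                 nhead=4, num_layers=2, max_len=16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.pos = nn.Parameter(torch.zeros(1, max_len, d_model))
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers)
        # Project the mean-pooled encoder states to the latent Gaussian's
        # mean and log-variance.
        self.to_mu = nn.Linear(d_model, latent_dim)
        self.to_logvar = nn.Linear(d_model, latent_dim)
        # The decoder cross-attends to the latent vector as a single
        # "memory" token.
        self.from_z = nn.Linear(latent_dim, d_model)
        dec_layer = nn.TransformerDecoderLayer(d_model, nhead, batch_first=True)
        self.decoder = nn.TransformerDecoder(dec_layer, num_layers)
        self.out = nn.Linear(d_model, vocab_size)

    def encode(self, tokens):
        h = self.encoder(self.embed(tokens) + self.pos[:, :tokens.size(1)])
        pooled = h.mean(dim=1)
        return self.to_mu(pooled), self.to_logvar(pooled)

    def reparameterize(self, mu, logvar):
        # Standard VAE reparameterization trick: z = mu + sigma * eps.
        return mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)

    def decode(self, z, tokens):
        memory = self.from_z(z).unsqueeze(1)  # (batch, 1, d_model)
        tgt = self.embed(tokens) + self.pos[:, :tokens.size(1)]
        # Causal mask so each position only attends to earlier characters.
        mask = nn.Transformer.generate_square_subsequent_mask(tokens.size(1))
        h = self.decoder(tgt, memory, tgt_mask=mask)
        return self.out(h)  # logits over the password character vocabulary

    def forward(self, tokens):
        mu, logvar = self.encode(tokens)
        z = self.reparameterize(mu, logvar)
        logits = self.decode(z, tokens)
        # Training would minimize reconstruction loss plus the KL
        # divergence between N(mu, sigma) and the standard normal prior.
        return logits, mu, logvar
```

At generation time one would sample z from the standard normal prior and decode autoregressively, which is the sampling-based approach to text generation that the abstract attributes to variational autoencoders.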
Open Access
Rights
CC BY 4.0: Creative Commons Attribution 4.0 International
Language
English