Language modeling using the Semantic Web & neural networks
Language models are an integral part of many applications, such as speech recognition, machine translation, and natural language processing. State-of-the-art N-gram models have dominated this field for decades, but neural network language models now show better results. Preparing the data has a great impact on the performance of a language model, and much research has been carried out on retaining the semantic properties of the data. The goal of this thesis is to use linked data for language modeling with advanced neural network modeling techniques. A comparison of the results shows that the neural network language model outperforms the state-of-the-art N-gram model. Such improved language models can further improve the performance of applications like speech recognition by reducing the word error rate.
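As a minimal sketch of the N-gram baseline referred to above, the following shows a bigram language model with add-one (Laplace) smoothing. The toy corpus and function names are illustrative assumptions, not the data or code of the thesis:

```python
from collections import Counter

# Toy corpus (hypothetical); real training data would be far larger.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
]

# Collect unigram and bigram counts, padding sentences with boundary markers.
unigrams = Counter()
bigrams = Counter()
for sentence in corpus:
    tokens = ["<s>"] + sentence.split() + ["</s>"]
    unigrams.update(tokens)
    bigrams.update(zip(tokens, tokens[1:]))

vocab_size = len(unigrams)

def bigram_prob(prev, word):
    """P(word | prev) with add-one (Laplace) smoothing."""
    return (bigrams[(prev, word)] + 1) / (unigrams[prev] + vocab_size)

def sentence_prob(sentence):
    """Probability of a sentence as a product of bigram probabilities."""
    tokens = ["<s>"] + sentence.split() + ["</s>"]
    p = 1.0
    for prev, word in zip(tokens, tokens[1:]):
        p *= bigram_prob(prev, word)
    return p

# A word sequence seen in training scores higher than an unseen reordering.
print(sentence_prob("the cat sat on the mat") > sentence_prob("the mat sat on the cat"))
```

Neural network language models replace these count-based estimates with learned continuous word representations, which is what allows them to generalize better to unseen word sequences and, as the thesis reports, to outperform this kind of baseline.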
Köln, TH, Master Thesis, 2016