2020
Conference Paper
Title
Multiple run ensemble learning with low dimensional knowledge graph embeddings
Abstract
Knowledge graphs (KGs) represent facts about a domain in a structured form. Although KGs can be quantitatively huge and consist of millions of triples, their coverage is usually still only a small fraction of the available knowledge. Among the most promising recent approaches for tackling this incompleteness problem is link prediction using knowledge graph embedding models. Various embedding models have been proposed so far, among which the RotatE model is reported to obtain state-of-the-art performance in such link prediction tasks. However, RotatE mainly outperforms other models when using a high embedding dimension (e.g. 1000). In this paper, we simulate such scenarios by studying the performance of different models trained at multiple low dimensions in repeated runs of the same model. For example, our studies show better results when, instead of training a model once with a high dimension of 1200, we train the model 6 times in parallel with a dimension of 200 and then combine the 6 models; this can improve results while keeping the overall number of adjustable parameters the same. To justify our findings, we perform experiments on various models, including TransE, DistMult, RotatE and ComplEx. Experimental results on standard benchmark datasets show that multiple low-dimensional models outperform a single high-dimensional model while the overall number of parameters remains the same.
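The combination step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the ensemble is formed by averaging the link-prediction scores that each independently trained low-dimensional model assigns to the same candidate entities (the function name `ensemble_scores` and the random scores are hypothetical).

```python
import numpy as np

def ensemble_scores(model_scores):
    """Combine link-prediction scores from k independently trained
    low-dimensional models by averaging them per candidate entity.

    model_scores: list of k arrays, each of shape (n_candidates,).
    Returns one array of shape (n_candidates,).
    """
    return np.mean(np.stack(model_scores, axis=0), axis=0)

# Hypothetical example: 6 low-dimensional models (e.g. dimension 200 each,
# matching one 1200-dimensional model in parameter count) each score
# 4 candidate tail entities for the same (head, relation, ?) query.
rng = np.random.default_rng(0)
scores_per_model = [rng.random(4) for _ in range(6)]

combined = ensemble_scores(scores_per_model)
predicted_tail = int(np.argmax(combined))  # rank candidates by averaged score
```

Other combination rules (e.g. rank aggregation or a learned weighting of the runs) would fit the same interface; score averaging is shown here only as the simplest choice.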
Author(s)
Mainwork
Ceur Workshop Proceedings
Conference
2020 International Workshop on Knowledge Representation and Representation Learning, KR4L 2020