  • Publication
    Climate Bot: A Machine Reading Comprehension System for Climate Change Question Answering
    (2022-07-01)
    Rony, Md Rashad Al Hasan; Zuo, Ying; Kovriguina, Liubov
    Climate change has a severe impact on the entire global ecosystem, including humankind. This demo paper presents Climate Bot, a machine reading comprehension system for question answering over documents about climate change. The proposed Climate Bot provides an interface for users to ask questions in natural language and get answers from reliable data sources. The purpose of Climate Bot is to spread awareness about climate change and help individuals and communities learn about its impact and challenges. Additionally, we open-sourced an annotated climate change dataset, CCMRC, to promote further research on the topic. This paper describes the dataset collection, annotation, system design, and evaluation.
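    As an illustration of the extractive machine reading comprehension setup described above, a minimal question-answering sketch with an off-the-shelf model might look as follows. The checkpoint name and example passage are placeholders, not the authors' Climate Bot or the CCMRC dataset.

        # Minimal extractive QA sketch (not the authors' Climate Bot pipeline).
        from transformers import pipeline

        # Any extractive QA checkpoint works here; this one is a common public example.
        qa = pipeline("question-answering", model="deepset/roberta-base-squad2")

        context = ("Rising global temperatures increase the frequency of extreme "
                   "weather events such as droughts, floods, and heat waves.")
        result = qa(question="What do rising global temperatures increase?",
                    context=context)
        print(result["answer"], result["score"])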
  • Publication
    DialoKG: Knowledge-Structure Aware Task-Oriented Dialogue Generation
    (2022-07)
    Rony, Md Rashad Al Hasan
    Task-oriented dialogue generation is challenging because the underlying knowledge is often dynamic and effectively incorporating it into the learning process is hard. It is particularly challenging to generate responses that are both human-like and informative in this setting. Recent research has primarily focused on various knowledge distillation methods in which the underlying relationships between the facts in a knowledge base are not effectively captured. In this paper, we go one step further and demonstrate how the structural information of a knowledge graph can improve the system’s inference capabilities. Specifically, we propose DialoKG, a novel task-oriented dialogue system that effectively incorporates knowledge into a language model. Our proposed system views relational knowledge as a knowledge graph and introduces (1) a structure-aware knowledge embedding technique and (2) a knowledge graph-weighted attention masking strategy to help the system select relevant information during dialogue generation. An empirical evaluation demonstrates the effectiveness of DialoKG over state-of-the-art methods on several standard benchmark datasets.
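    The knowledge graph-weighted attention masking described above can be pictured roughly as folding a per-fact relevance weight into the attention logits, so that tokens belonging to less relevant facts receive less attention. This is a minimal sketch of the general mechanism under that assumption, not DialoKG's actual masking strategy; all tensor names are hypothetical.

        # Rough sketch of knowledge-weighted attention masking (not DialoKG's exact method).
        import torch
        import torch.nn.functional as F

        def weighted_attention(q, k, v, fact_weights):
            """q: (L, d); k, v: (N, d); fact_weights: (N,) relevance values in (0, 1]."""
            logits = q @ k.t() / q.size(-1) ** 0.5      # raw attention scores, (L, N)
            logits = logits + torch.log(fact_weights)   # down-weight low-relevance facts
            return F.softmax(logits, dim=-1) @ v

        q = torch.randn(4, 16)
        k = v = torch.randn(10, 16)
        out = weighted_attention(q, k, v, torch.rand(10).clamp_min(1e-3))
        print(out.shape)  # torch.Size([4, 16])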
  • Publication
    Improving Inductive Link Prediction Using Hyper-Relational Facts (Extended Abstract)
    (2022-07-01)
    Berrendorf, Max; Thost, Veronika; Ma, Tengfei; Tresp, Volker
    For many years, link prediction on knowledge graphs (KGs) has been a purely transductive task, not allowing for reasoning on unseen entities. Recently, increasing effort has been put into exploring semi- and fully inductive scenarios, enabling inference over unseen and emerging entities. Still, all of these approaches only consider triple-based KGs, whereas their richer counterparts, hyper-relational KGs (e.g., Wikidata), have not yet been properly studied. In this work, we classify different inductive settings and study the benefits of employing hyper-relational KGs on a wide range of semi- and fully inductive link prediction tasks powered by recent advancements in graph neural networks. Our experiments on a novel set of benchmarks show that qualifiers over typed edges can lead to absolute performance gains of 6% in the Hits@10 metric compared to triple-only baselines. Our code is available at https://github.com/mali-git/hyper_relational_ilp.
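    For readers unfamiliar with the Hits@10 numbers quoted above, the sketch below computes Hits@k from a score matrix; the random scores stand in for a real link-prediction model.

        # Hits@k over a batch of link-prediction queries (random scores as placeholders).
        import numpy as np

        def hits_at_k(scores, true_idx, k=10):
            """scores: (num_queries, num_candidates); true_idx: (num_queries,)."""
            true_scores = scores[np.arange(len(true_idx)), true_idx]
            ranks = (scores > true_scores[:, None]).sum(axis=1) + 1  # 1-based ranks
            return float((ranks <= k).mean())

        rng = np.random.default_rng(0)
        print(hits_at_k(rng.normal(size=(100, 500)), rng.integers(0, 500, size=100)))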
  • Publication
    RoMe: A Robust Metric for Evaluating Natural Language Generation
    (2022-05)
    Rony, Md Rashad Al Hasan; Kovriguina, Liubov; Chaudhuri, Debanjan
    Evaluating Natural Language Generation (NLG) systems is a challenging task. Firstly, the metric should ensure that the generated hypothesis reflects the reference’s semantics. Secondly, it should consider the grammatical quality of the generated sentence. Thirdly, it should be robust enough to handle various surface forms of the generated sentence. Thus, an effective evaluation metric has to be multifaceted. In this paper, we propose an automatic evaluation metric incorporating several core aspects of natural language understanding (language competence, syntactic and semantic variation). Our proposed metric, RoMe, is trained on language features such as semantic similarity combined with tree edit distance and grammatical acceptability, using a self-supervised neural network to assess the overall quality of the generated sentence. Moreover, we perform an extensive robustness analysis of the state-of-the-art methods and RoMe. Empirical results suggest that RoMe correlates more strongly with human judgment than state-of-the-art metrics when evaluating system-generated sentences across several NLG tasks.
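    The multifaceted scoring described above, which combines semantic similarity, tree edit distance, and grammatical acceptability, can be pictured as a small regressor over a fixed-length feature vector. The sketch below uses hypothetical, pre-computed feature values and is not RoMe's actual network or training procedure.

        # Toy quality scorer over hand-crafted NLG evaluation features (not RoMe itself).
        import torch
        import torch.nn as nn

        class FeatureScorer(nn.Module):
            def __init__(self, n_features=3):
                super().__init__()
                self.net = nn.Sequential(nn.Linear(n_features, 16), nn.ReLU(),
                                         nn.Linear(16, 1), nn.Sigmoid())

            def forward(self, feats):
                return self.net(feats).squeeze(-1)  # overall quality in [0, 1]

        # Each row: [semantic similarity, normalized tree edit distance, acceptability].
        feats = torch.tensor([[0.91, 0.12, 0.88], [0.40, 0.55, 0.30]])
        print(FeatureScorer()(feats))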
  • Publication
    Time-aware Entity Alignment using Temporal Relational Attention
    (2022-04-25)
    Xu, Chengjin; Su, Fenglong; Xiong, Bo
    Knowledge graph (KG) alignment aims to match entities across different KGs and is important for knowledge fusion and integration. Temporal KGs (TKGs) extend traditional KGs by associating static triples with specific timestamps (e.g., temporal scopes or time points). Moreover, open-world KGs (OKGs) are dynamic, with new entities and timestamps continually emerging. While entity alignment (EA) between KGs has drawn increasing attention from the research community, EA between TKGs and OKGs remains unexplored. In this work, we propose a novel Temporal Relational Entity Alignment method (TREA) which is able to learn alignment-oriented TKG embeddings and represent new emerging entities. We first map entities, relations and timestamps into an embedding space, and the initial feature of each entity is obtained by fusing the embeddings of its connected relations and timestamps as well as its neighboring entities. A graph neural network (GNN) is employed to capture intra-graph information, and a temporal relational attention mechanism is used to integrate the relation and time features of links between nodes. Finally, a margin-based full multi-class log-loss is used for efficient training, and a sequential time regularizer is used to model unobserved timestamps. We use three well-established TKG datasets as references for evaluating temporal and non-temporal EA methods. Experimental results show that our method outperforms state-of-the-art EA methods.
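    The initialization step described above, which fuses the embeddings of an entity's connected relations, timestamps, and neighboring entities, can be sketched as simple mean pooling. TREA's actual fusion and attention layers are more involved; the names and shapes here are placeholders.

        # Rough sketch of initializing an entity feature from its temporal neighborhood.
        import torch

        def init_entity_feature(rel_emb, time_emb, neigh_emb):
            """rel_emb, time_emb, neigh_emb: (n_links, d) embeddings of the entity's links."""
            fused = torch.cat([rel_emb, time_emb, neigh_emb], dim=0)
            return fused.mean(dim=0)  # simple pooling as a stand-in for learned fusion

        d = 32
        feat = init_entity_feature(torch.randn(5, d), torch.randn(5, d), torch.randn(5, d))
        print(feat.shape)  # torch.Size([32])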
  • Publication
    A Simulated Annealing Meta-heuristic for Concept Learning in Description Logics
    (2022)
    Vahdati, Sahar
    Ontologies, which provide an explicit schema for the underlying data, often serve as background knowledge for machine learning approaches. Similar to ILP methods, concept learning utilizes such ontologies to learn concept expressions from examples in a supervised manner. This learning process is usually cast as a search through the space of ontologically valid concept expressions, guided by heuristics. Such heuristics usually try to balance the explorative and exploitative behavior of the learning algorithm. While exploration ensures good coverage of the search space, exploitation focuses on those parts of the search space likely to contain accurate concept expressions. However, at their extremes, both paradigms are impractical: a totally random explorative approach will only find good solutions by chance, whereas a greedy but myopic, exploitative attempt can easily get trapped in local optima. To combine the advantages of both paradigms, different meta-heuristics have been proposed. In this paper, we examine the Simulated Annealing meta-heuristic and how it can be used to balance the exploration-exploitation trade-off in concept learning. In different experimental settings, we analyse how and where existing concept learning algorithms can benefit from the Simulated Annealing meta-heuristic.
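    The exploration-exploitation balance discussed above rests on the standard Simulated Annealing acceptance rule: a worse candidate is still accepted with probability exp(delta / T), where delta is the (negative) change in quality and T decreases over time. The generic sketch below uses placeholder neighbor and quality functions rather than an actual concept-learning search.

        # Generic simulated-annealing loop (toy search space, not a concept learner).
        import math
        import random

        def simulated_annealing(start, neighbor, quality, t0=1.0, cooling=0.95, steps=1000):
            current, best, t = start, start, t0
            for _ in range(steps):
                candidate = neighbor(current)
                delta = quality(candidate) - quality(current)
                # Always accept improvements; accept worse moves with prob. exp(delta / t).
                if delta >= 0 or random.random() < math.exp(delta / t):
                    current = candidate
                if quality(current) > quality(best):
                    best = current
                t *= cooling  # geometric cooling schedule
            return best

        # Toy example: maximize -(x - 3)^2, i.e. find x close to 3.
        print(simulated_annealing(0.0,
                                  neighbor=lambda x: x + random.uniform(-0.5, 0.5),
                                  quality=lambda x: -(x - 3.0) ** 2))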
  • Publication
    Dihedron Algebraic Embeddings for Spatio-Temporal Knowledge Graph Completion
    (2022)
    Nayyeri, M.; Vahdati, S.; Khan, M.T.; Alam, M.M.; Wenige, L.; Behrend, A.
    Many knowledge graphs (KGs) contain spatial and temporal information. Most KG embedding models follow a triple-based representation and often neglect the simultaneous consideration of spatial and temporal aspects. Encoding such higher-dimensional knowledge necessitates the consideration of true algebraic and geometric aspects. Hypercomplex algebras provide well-defined mathematical systems, among which the Dihedron algebra, with its rich framework, is suitable for handling multidimensional knowledge. In this paper, we propose an embedding model that uses Dihedron algebra for learning such spatial and temporal aspects. The evaluation results show that our model performs significantly better than other adapted models.
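    As background for the Dihedron-based embeddings above: the Dihedron algebra is a four-dimensional hypercomplex number system; assuming the split-quaternion multiplication rules (i^2 = -1, j^2 = k^2 = +1, ij = k), its product can be computed as below. This only illustrates the algebra itself, not the paper's embedding model or score function.

        # Product of two dihedron numbers p = a + b*i + c*j + d*k, assuming the
        # split-quaternion rules i^2 = -1, j^2 = k^2 = +1, ij = k.
        def dihedron_mul(p, q):
            a, b, c, d = p
            w, x, y, z = q
            return (a * w - b * x + c * y + d * z,   # real part
                    a * x + b * w - c * z + d * y,   # i component
                    a * y + c * w - b * z + d * x,   # j component
                    a * z + d * w + b * y - c * x)   # k component

        print(dihedron_mul((1, 2, 0, 0), (0, 0, 1, 0)))  # (1 + 2i) * j = j + 2k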
  • Publication
    Context Transformer with Stacked Pointer Networks for Conversational Question Answering over Knowledge Graphs
    (2021)
    Plepi, J.; Kacupaj, E.; Thakkar, Harsh
    Neural semantic parsing approaches have been widely used for Question Answering (QA) systems over knowledge graphs. Such methods provide the flexibility to handle QA datasets with complex queries and a large number of entities. In this work, we propose a novel framework named CARTON (Context trAnsformeR sTacked pOinter Networks), which performs multi-task semantic parsing for handling the problem of conversational question answering over a large-scale knowledge graph. Our framework consists of a stack of pointer networks as an extension of a context transformer model for parsing the input question and the dialogue history. The framework generates a sequence of actions that can be executed on the knowledge graph. We evaluate CARTON on a standard dataset for complex sequential question answering, on which CARTON outperforms all baselines. Specifically, we observe performance improvements in F1-score on eight out of ten question types compared to the previous state of the art. For logical reasoning questions, an improvement of 11 absolute points is achieved.
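    The stacked pointer networks mentioned above boil down to attention distributions over a candidate set, e.g. entities or predicates of the knowledge graph. A single generic pointer-attention step might look like the sketch below; it is not CARTON's actual architecture, and all dimensions are placeholders.

        # One generic pointer-network step: point from a decoder state to one candidate.
        import torch
        import torch.nn as nn
        import torch.nn.functional as F

        class Pointer(nn.Module):
            def __init__(self, d):
                super().__init__()
                self.w_state = nn.Linear(d, d)
                self.w_cand = nn.Linear(d, d)
                self.v = nn.Linear(d, 1)

            def forward(self, state, candidates):
                """state: (d,); candidates: (N, d) -> distribution over the N candidates."""
                scores = self.v(torch.tanh(self.w_state(state) + self.w_cand(candidates)))
                return F.softmax(scores.squeeze(-1), dim=-1)

        dist = Pointer(64)(torch.randn(64), torch.randn(20, 64))
        print(dist.argmax().item(), dist.sum().item())  # chosen candidate index, ~1.0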
  • Publication
    Space Efficient Context Encoding for Non-Task-Oriented Dialogue Generation with Graph Attention Transformer
    (2021)
    Galetzka, Fabian; Rose, Jewgeni; Schlangen, David
    To improve the coherence and knowledge retrieval capabilities of non-task-oriented dialogue systems, recent Transformer-based models aim to integrate fixed background context. This often comes in the form of knowledge graphs, and the integration is done by creating pseudo-utterances through paraphrasing knowledge triples, which are added to the accumulated dialogue context. However, the context length is fixed in these architectures, which restricts how much background or dialogue context can be kept. In this work, we propose a more concise encoding for background context structured in the form of knowledge graphs, by expressing the graph connections through restrictions on the attention weights. The results of our human evaluation show that this encoding reduces space requirements without negative effects on the precision of the reproduced knowledge or the perceived consistency. Further, models trained with our proposed context encoding generate dialogues that are judged to be more comprehensive and interesting.
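    One simple way to "express the graph connections through restrictions on the attention weights", as described above, is an additive mask that blocks attention between nodes that are not linked in the knowledge graph. The sketch below shows only that masking idea, not the paper's full encoder.

        # Additive attention mask derived from a graph adjacency matrix (sketch only).
        import torch
        import torch.nn.functional as F

        def masked_attention(q, k, v, adjacency):
            """q, k, v: (N, d); adjacency: (N, N) with 1 where attention is allowed."""
            logits = q @ k.t() / q.size(-1) ** 0.5
            logits = logits.masked_fill(adjacency == 0, float("-inf"))  # block unlinked pairs
            return F.softmax(logits, dim=-1) @ v

        n, d = 6, 32
        adj = (torch.rand(n, n) > 0.5).long() | torch.eye(n, dtype=torch.long)  # keep self-loops
        out = masked_attention(torch.randn(n, d), torch.randn(n, d), torch.randn(n, d), adj)
        print(out.shape)  # torch.Size([6, 32])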
  • Publication
    Knowledge Graph Representation Learning using Ordinary Differential Equations
    (2021)
    Nayyeri, Mojtaba; Xu, Chengjin; Hoffmann, Franca; Alam, Mirza Mohtashim; Vahdati, Sahar
    Knowledge Graph Embeddings (KGEs) have shown promising performance on link prediction tasks by mapping the entities and relations of a knowledge graph into a geometric space. The capability of KGEs to preserve graph characteristics, including structural aspects and semantics, highly depends on the design of their score function as well as on the abilities inherited from the underlying geometry. Many KGEs use Euclidean geometry, which renders them incapable of preserving complex structures and consequently causes wrong inferences by the models. To address this problem, we propose a neuro-differential KGE that embeds nodes of a KG on the trajectories of Ordinary Differential Equations (ODEs). To this end, we represent each relation (edge) in a KG as a vector field on several manifolds. We specifically parameterize ODEs by a neural network to represent complex manifolds and complex vector fields on the manifolds. Therefore, the underlying embedding space is able to assume various geometric forms to encode heterogeneous subgraphs. Experiments on synthetic and benchmark datasets using state-of-the-art KGE models justify the ODE trajectories as a means to enable structure preservation and, consequently, to avoid wrong inferences.
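    The core idea above, placing entities on ODE trajectories, can be pictured with a neural vector field integrated by a simple Euler scheme: the relation determines the field, and the tail embedding is expected near the end of a trajectory started at the head. This is a conceptual sketch under those assumptions, not the paper's model or score function.

        # Conceptual sketch: a relation as a neural vector field integrated with Euler steps.
        import torch
        import torch.nn as nn

        class RelationField(nn.Module):
            """A small MLP acting as the vector field dh/dt = f(h) for one relation."""
            def __init__(self, d):
                super().__init__()
                self.f = nn.Sequential(nn.Linear(d, d), nn.Tanh(), nn.Linear(d, d))

            def trajectory_end(self, head, steps=10, dt=0.1):
                h = head
                for _ in range(steps):
                    h = h + dt * self.f(h)  # explicit Euler integration step
                return h

        d = 16
        field = RelationField(d)
        head, tail = torch.randn(d), torch.randn(d)
        score = -torch.norm(field.trajectory_end(head) - tail)  # higher = more plausible
        print(score.item())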