The paper, titled “Message Passing Query Embedding”, was accepted at the ICML 2020 Workshop on Graph Representation Learning and Beyond (GRL+). It is authored by two of our lab members: Daniel Daza and Michael Cochez. The paper proposes a novel architecture for embedding knowledge graph queries, offering important advantages over previous work.

Abstract:

Recent works on representation learning for Knowledge Graphs have moved beyond the problem of link prediction, to answering queries of an arbitrary structure. Existing methods are based on ad-hoc mechanisms that require training with a diverse set of query structures. We propose a more general architecture that employs a graph neural network to encode a graph representation of the query, where nodes correspond to entities and variables. The generality of our method allows it to encode a more diverse set of query types in comparison to previous work. Our method shows competitive performance against previous models for complex queries, and in contrast with these models, it can answer complex queries when trained for link prediction only. We show that the model learns entity embeddings that capture the notion of entity type without explicit supervision.
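
To give a concrete picture of the idea in the abstract, here is a minimal, hypothetical sketch in PyTorch: a query is represented as a small graph whose nodes are either known entities (looked up in an embedding table) or variables (a shared learned vector), a simple message-passing network propagates information along relation-typed edges, and the node states are pooled into a single query embedding that can be scored against all entities. Every name and design choice below (the diagonal relation transform, mean pooling, the shared variable embedding) is an illustrative assumption, not the paper's actual implementation.

```python
import torch
import torch.nn as nn

class MessagePassingQueryEmbedder(nn.Module):
    """Hypothetical sketch of a GNN-based query encoder.

    Nodes are KG entities or variables; edges carry relation types.
    This is not the paper's architecture, only the general recipe.
    """

    def __init__(self, num_entities, num_relations, dim):
        super().__init__()
        self.entity_emb = nn.Embedding(num_entities, dim)
        self.var_emb = nn.Parameter(torch.randn(dim))          # shared embedding for variable nodes (assumption)
        self.rel_transform = nn.Embedding(num_relations, dim)  # diagonal per-relation message transform (assumption)
        self.update = nn.Linear(2 * dim, dim)

    def forward(self, node_ids, is_var, edges, edge_rels, num_layers=2):
        # node_ids:  (N,) entity ids (ignored where is_var is True)
        # is_var:    (N,) bool mask marking variable nodes
        # edges:     (E, 2) [source, target] indices into the N nodes
        # edge_rels: (E,) relation ids
        h = torch.where(is_var.unsqueeze(-1),
                        self.var_emb.expand(len(node_ids), -1),
                        self.entity_emb(node_ids))
        for _ in range(num_layers):
            # relation-modulated messages flow along each edge's direction
            msgs = h[edges[:, 0]] * self.rel_transform(edge_rels)
            agg = torch.zeros_like(h)
            agg.index_add_(0, edges[:, 1], msgs)  # sum incoming messages per node
            h = torch.relu(self.update(torch.cat([h, agg], dim=-1)))
        # pool node states into one query embedding (mean pooling is an assumption)
        return h.mean(dim=0)

# Usage: embed the query "?x such that related(e7, ?x)" and rank answers.
model = MessagePassingQueryEmbedder(num_entities=100, num_relations=5, dim=32)
node_ids = torch.tensor([7, 0])            # node 0 = entity e7; the id for the variable slot is a dummy
is_var = torch.tensor([False, True])       # node 1 is the variable ?x
edges = torch.tensor([[0, 1]])             # e7 --relation--> ?x
edge_rels = torch.tensor([2])
q = model(node_ids, is_var, edges, edge_rels)
scores = model.entity_emb.weight @ q       # score every entity as a candidate answer
```

Because the encoder only sees the query graph's structure, the same model can in principle embed queries of any shape, which is what lets it be trained on simple link-prediction edges and still produce embeddings for more complex queries.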