EL Embeddings: Geometric construction of models for the description logic EL++

Abstract
An embedding is a function that maps entities from one algebraic structure into another while preserving certain characteristics. Embeddings are used successfully for mapping relational data or text into vector spaces, where they can be applied to machine learning, similarity search, and related tasks. We address the problem of finding vector space embeddings for theories in the Description Logic EL++ that are also models of the TBox. To find such embeddings, we define an optimization problem that characterizes the model-theoretic semantics of the operators in EL++ within ℝⁿ, thereby solving the problem of finding an interpretation function for an EL++ theory given a particular domain Δ. Our approach is mainly relevant to large theories and knowledge bases such as the ontologies and knowledge graphs used in the life sciences. We demonstrate that our method can be used for improved prediction of protein--protein interactions when compared to semantic similarity measures or knowledge graph embeddings.
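The geometric construction described in the paper embeds each class as an n-ball (a center vector and a radius) in ℝⁿ and penalizes a subsumption axiom C ⊑ D unless the ball for C lies inside the ball for D. A minimal sketch of such a loss term is shown below; the function name `subclass_loss` and the `margin` parameter are illustrative, not taken from the authors' code:

```python
import numpy as np

def subclass_loss(c_c, r_c, c_d, r_d, margin=0.0):
    """Penalty for the axiom C ⊑ D under an n-ball embedding.

    Each class is an n-ball: a center vector and a non-negative radius.
    The ball for C is contained in the ball for D exactly when
    ||center_C - center_D|| + radius_C <= radius_D, so the hinge below
    is zero for containment and grows with the violation.
    """
    return max(0.0, float(np.linalg.norm(c_c - c_d)) + r_c - r_d - margin)

# A ball of radius 1 at the origin is inside a ball of radius 2 at the
# origin, so the axiom is satisfied and the loss is zero.
loss_sat = subclass_loss(np.array([0.0, 0.0]), 1.0, np.array([0.0, 0.0]), 2.0)

# Moving C's center to (3, 0) breaks containment and yields a positive loss.
loss_violated = subclass_loss(np.array([3.0, 0.0]), 1.0, np.array([0.0, 0.0]), 2.0)
```

Summing such hinge terms over all normalized TBox axioms gives an optimization objective whose minima correspond to geometric models of the theory.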

Citation
Kulmanov, M., Liu-Wei, W., Yan, Y., & Hoehndorf, R. (2019). EL Embeddings: Geometric Construction of Models for the Description Logic EL++. Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence. doi:10.24963/ijcai.2019/845

Publisher
International Joint Conferences on Artificial Intelligence

Conference/Event Name
28th International Joint Conference on Artificial Intelligence, IJCAI 2019

DOI
10.24963/ijcai.2019/845

arXiv
1902.10499

Additional Links
https://www.ijcai.org/proceedings/2019/845

Version History
Version 2 (selected), 2020-02-02 09:46:56: Published with DOI
Version 1, 2019-11-11 09:42:47