IJCAI 2019

Learning Relational Representations with Auto-encoding Logic Programs

Conference Paper · Understanding Intelligence and Human-level AI in the New Machine Learning era · Artificial Intelligence

Abstract

Deep learning methods capable of handling relational data have proliferated in recent years. In contrast to traditional relational learning methods, which leverage first-order logic to represent such data, these methods re-represent symbolic relational data in Euclidean space. They offer better scalability, but can only approximate rich relational structures and are less flexible in terms of reasoning. This paper introduces a novel framework for relational representation learning that combines the best of both worlds. The framework, inspired by the auto-encoding principle, uses first-order logic as a data representation language, and the mapping between the original and latent representations is done by means of logic programs instead of neural networks. We show how learning can be cast as a constraint optimisation problem for which existing solvers can be used. The use of logic as a representation language makes the proposed framework more accurate (as the representation is exact, rather than approximate), more flexible, and more interpretable than deep learning methods. We experimentally show that these latent representations are indeed beneficial in relational learning tasks.
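
As a rough, self-contained illustration of the auto-encoding idea described above (not the authors' implementation), the toy Python sketch below maps a small set of ground facts to latent facts via hand-written clause-like rules and decodes them back, measuring reconstruction error. The predicate names (parent, latent1, latent2) and clause shapes are hypothetical examples chosen for this sketch only.

    # Toy sketch of auto-encoding over relational facts; illustrative only.
    # Original relational data: ground facts over the predicate parent/2.
    facts = {
        ("parent", "ann", "bob"),
        ("parent", "bob", "carl"),
        ("parent", "ann", "dora"),
    }

    def encode(facts):
        # Hypothetical "encoder" logic program, unrolled as plain Python joins:
        #   latent1(X,Y) :- parent(X,Y).
        #   latent2(X,Z) :- parent(X,Y), parent(Y,Z).
        latent = set()
        for (_, x, y) in facts:
            latent.add(("latent1", x, y))
        for (_, x, y) in facts:
            for (_, y2, z) in facts:
                if y == y2:
                    latent.add(("latent2", x, z))
        return latent

    def decode(latent):
        # Hypothetical "decoder" logic program: parent(X,Y) :- latent1(X,Y).
        return {("parent", x, y) for (p, x, y) in latent if p == "latent1"}

    reconstructed = decode(encode(facts))
    # Reconstruction error: facts lost or hallucinated by the encode/decode round trip.
    error = len(facts ^ reconstructed)
    print("latent facts:", len(encode(facts)), "reconstruction error:", error)

In the framework proposed in the paper, the encoder and decoder clauses are not hand-written as above but are learned, with the search cast as a constraint optimisation problem handled by existing solvers, as the abstract states.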

Authors

Keywords

  • Special Track on Understanding Intelligence and Human-level AI in the New Machine Learning era: Knowledge representations for Learning (Special Track on Human AI and Machine Learning)
  • Special Track on Understanding Intelligence and Human-level AI in the New Machine Learning era: Learning knowledge representations (Special Track on Human AI and Machine Learning)

Context

Venue: International Joint Conference on Artificial Intelligence
Archive span: 1969-2025
Indexed papers: 14,525
Paper id: 571012725356926151