AIJ 2026, Journal Article
From monotonic graph neural networks to Datalog and back: Expressive power and practical applications
- David Tena Cucala
- Bernardo Cuenca Grau
- Boris Motik
- Egor V. Kostylev
Many tasks over knowledge graphs, such as link prediction, can be conceptualised as the problem of learning a transformation of sets of relational facts. Machine learning models such as graph neural networks (GNNs) can be used to realise this transformation, allowing the transformation to be learned from examples. However, it is often difficult to formally verify the properties of such a transformation, or to understand why it derives a specific fact. Alternatively, such a transformation can be realised using a set of rules expressed in a knowledge representation language such as Datalog. Formal properties of such a transformation can be verified using symbolic means, and each derived fact can be justified by a rule; however, writing and curating the rules is costly and requires expertise in both the application domain and the formal language. To bridge the gap between these two approaches, in this paper we study the relationship between transformations realised by monotonic max-sum GNNs, a subclass of GNNs with nonnegative weights and max and sum aggregation functions, and transformations realised by Datalog rules. First, we provide an algorithm that can verify whether a given Datalog rule is sound for a network, in the sense that the GNN always derives all consequences of the rule on any input dataset. Second, we provide an algorithm that allows us to justify any fact derived by a GNN by computing a rule that is sound for the GNN and that derives the fact. Third, we study the expressive power of monotonic max-sum GNNs and show that, for each such GNN, one can compute a Datalog program where applying the GNN to any dataset produces the same facts as a single round of application of the program's rules to the dataset; we also sharpen our result to the subclass of monotonic max GNNs, which use only the max aggregation function, and identify a corresponding class of Datalog programs.
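To make the monotonicity property concrete, the following is a minimal sketch of a single GNN layer combining max and sum aggregation, written from the abstract's description alone (the function name, matrix names, and use of ReLU are illustrative assumptions, not the paper's exact definition). Because all weight matrices are constrained to be nonnegative and the activation is nondecreasing, adding facts (edges) to the input can never remove a derived output, which is what makes a sound Datalog reading of the network possible.

```python
import numpy as np

def monotonic_max_sum_layer(features, edges, W_self, W_sum, W_max, bias):
    """One layer of a (hypothetical) monotonic max-sum GNN.

    features : (n, d) array of node feature vectors.
    edges    : list of directed (source, target) pairs.
    All weight matrices are required to be nonnegative, and the ReLU
    activation is nondecreasing, so enlarging the input dataset can
    only increase (never decrease) any output value.
    """
    assert (W_self >= 0).all() and (W_sum >= 0).all() and (W_max >= 0).all()
    n, _ = features.shape
    out = features @ W_self.T + bias  # per-node transformation
    for v in range(n):
        nbrs = [u for (u, w) in edges if w == v]  # in-neighbours of v
        if nbrs:
            agg_sum = sum(features[u] for u in nbrs)              # sum aggregation
            agg_max = np.max([features[u] for u in nbrs], axis=0)  # max aggregation
            out[v] += W_sum @ agg_sum + W_max @ agg_max
    return np.maximum(out, 0.0)  # ReLU preserves monotonicity
```

Under this reading, monotonicity can be checked empirically: applying the layer to a dataset and to a superset of that dataset yields outputs where the second pointwise dominates the first.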
Finally, we carry out a practical evaluation and show that monotonic max-sum GNNs can be successfully trained in practice on common knowledge graph tasks, and that extracting rules from max-sum GNNs is practically feasible.