ICLR 2024 Conference Paper
LogicMP: A Neuro-symbolic Approach for Encoding First-order Logic Constraints
- Weidi Xu
- Jingwei Wang
- Lele Xie
- Jianshan He
- Hongting Zhou
- Taifeng Wang
- Xiaopei Wan
- Jingdong Chen
Integrating first-order logic constraints (FOLCs) with neural networks is a crucial but challenging problem, since it involves modeling intricate correlations to satisfy the constraints. This paper proposes a novel neural layer, LogicMP, which performs mean-field variational inference over a Markov Logic Network (MLN). It can be plugged into any off-the-shelf neural network to encode FOLCs while retaining modularity and efficiency. By exploiting the structure and symmetries in MLNs, we theoretically demonstrate that our well-designed, efficient mean-field iterations greatly mitigate the difficulty of MLN inference, reducing inference from sequential calculation to a series of parallel tensor operations. Empirical results on three kinds of tasks over images, graphs, and text show that LogicMP outperforms advanced competitors in both performance and efficiency.
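To make the abstract's central claim concrete, here is a minimal, hypothetical sketch (not the authors' implementation) of how a mean-field update for one grounded first-order rule can become a single parallel tensor operation. It assumes a toy MLN with one weighted rule A(x) → C(x), i.e. the clause ¬A(x) ∨ C(x) with weight `w`, and neural "evidence" logits for every ground atom; the update for each grounding factorizes, so all groundings are processed at once with vectorized NumPy.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mean_field(logits_A, logits_C, w=2.0, iters=10):
    """Toy mean-field iterations for the weighted clause ~A(x) v C(x).

    logits_A, logits_C: per-grounding evidence logits (e.g. from a neural net).
    Each iteration updates all ground-atom marginals in parallel:
      - C=1 always satisfies the clause; C=0 satisfies it with prob (1 - qA),
        so C's log-odds gain w * qA.
      - A=0 always satisfies the clause; A=1 satisfies it with prob qC,
        so A's log-odds lose w * (1 - qC).
    """
    qA, qC = sigmoid(logits_A), sigmoid(logits_C)
    for _ in range(iters):
        qC = sigmoid(logits_C + w * qA)          # one vectorized op per rule
        qA = sigmoid(logits_A - w * (1.0 - qC))  # no per-grounding loop
    return qA, qC
```

For example, when the network is confident that A(x) holds (large `logits_A`) but neutral about C(x) (`logits_C = 0`), the iterations push the marginal of C(x) above 0.5, softly enforcing the implication. This is only an illustration of the parallel-tensor-operation idea; the paper's actual updates exploit MLN structure and symmetries far more generally.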