
AAAI 2025

Unified Graph Neural Networks Pre-training for Multi-domain Graphs

Conference Paper · AAAI Technical Track on Data Mining & Knowledge Management I · Artificial Intelligence

Abstract

Graph Neural Networks (GNNs) have proven effective and typically benefit from pre-training on accessible graphs to enhance performance on tasks with limited labeled data. However, existing GNNs are constrained by the "one-domain-one-model" limitation, which restricts their effectiveness across diverse graph domains. In this paper, we tackle this problem by developing a method called Multi-Domain Pre-training for a Unified GNN Model (MDP-GNN). This method is based on the philosophical notion that everything is interconnected, suggesting that a latent meta-domain exists that encompasses the diverse graph domains and their interconnections. MDP-GNN seeks to identify and exploit this meta-domain to train a unified GNN model through three core strategies. First, it integrates node feature semantics from different domains to create unified representations. Second, it employs a bi-level learning strategy to build a domain-synthesized network that identifies latent connections and thereby facilitates cross-domain knowledge transfer. Third, it uses the Wasserstein distance to map diverse domains into the common meta-domain for graph distribution alignment. We validate the effectiveness of MDP-GNN through theoretical analysis and extensive experiments on four real-world graph datasets, demonstrating its superiority in enhancing GNN performance across diverse domains.
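To give intuition for the third strategy, the sketch below is a minimal, hypothetical illustration (not the paper's implementation): for one-dimensional empirical distributions with equal sample counts, the 1-Wasserstein distance reduces to the mean absolute difference between sorted samples. An alignment objective of the kind the abstract describes would minimize such a quantity between per-domain embedding distributions and a shared meta-domain; the function name and toy data here are assumptions for illustration only.

```python
def wasserstein_1d(xs, ys):
    """1-Wasserstein distance between two equal-size 1-D samples.

    For 1-D empirical distributions with the same number of points,
    the optimal transport plan matches sorted samples pairwise, so the
    distance is the mean absolute difference of the sorted values.
    """
    assert len(xs) == len(ys), "toy version assumes equal sample counts"
    xs, ys = sorted(xs), sorted(ys)
    return sum(abs(a - b) for a, b in zip(xs, ys)) / len(xs)

# Two toy "domains": identical distributions give distance 0.
print(wasserstein_1d([0.0, 1.0, 2.0], [2.0, 0.0, 1.0]))  # 0.0
# A uniformly shifted distribution gives a distance equal to the shift.
print(wasserstein_1d([0.0, 1.0, 2.0], [0.5, 1.5, 2.5]))  # 0.5
```

In practice such a loss would be computed over multi-dimensional GNN embeddings (typically via a sliced or entropy-regularized approximation) and back-propagated through the encoder, driving every domain's embedding distribution toward the common meta-domain.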

Authors

Keywords

No keywords are indexed for this paper.

Context

Venue
AAAI Conference on Artificial Intelligence
Archive span
1980-2026
Indexed papers
28718
Paper id
516032832491186460