
AAAI 2025

Deep Hypergraph Neural Networks with Tight Framelets

Conference Paper · AAAI Technical Track on Machine Learning III

Abstract

Hypergraphs provide a flexible framework for modeling high-order (complex) interactions among multiple entities, extending beyond the pairwise correlations of traditional graph structures. However, deep hypergraph neural networks (HGNNs) often suffer from oversmoothing as depth increases, similar to issues in graph neural networks (GNNs). While oversmoothing in GNNs has been extensively studied, its counterpart in hypergraphs remains underexplored. This paper addresses the gap by first theoretically examining the causes of oversmoothing in deep HGNNs. Our analysis suggests that a spectral-based hypergraph convolution equipped with both low-pass and high-pass filters can mitigate these effects. Motivated by these findings, we introduce FrameHGNN, a framework built on framelet-based hypergraph convolutions that integrate tight framelet transforms with both low-pass and high-pass components, together with two strategies commonly used in deep GNN design: initial residual connections and identity mappings. Experimental results on diverse benchmark datasets demonstrate that FrameHGNN outperforms several state-of-the-art models, effectively reducing oversmoothing while improving predictive accuracy. Our contributions advance the theoretical understanding of deep hypergraph learning and provide a practical spectral-based approach for HGNNs, emphasizing the design of multi-frequency channels.
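The abstract's ingredients can be sketched numerically. The snippet below is a simplified illustration, not the paper's actual FrameHGNN: it replaces the tight framelet transform with just two spectral channels, a low-pass filter (I − L) and a high-pass filter (L) of the standard normalized hypergraph Laplacian, and combines them with GCNII-style initial residual and identity mappings. The function names, the mixing scalar `gamma`, and the fixed hyperparameters are illustrative assumptions.

```python
import numpy as np

def hypergraph_laplacian(H):
    # Normalized hypergraph Laplacian (unit hyperedge weights assumed):
    # L = I - Dv^{-1/2} H De^{-1} H^T Dv^{-1/2}
    dv = H.sum(axis=1)                      # vertex degrees
    de = H.sum(axis=0)                      # hyperedge degrees
    Dv_is = np.diag(1.0 / np.sqrt(np.maximum(dv, 1e-12)))
    De_inv = np.diag(1.0 / np.maximum(de, 1e-12))
    theta = Dv_is @ H @ De_inv @ H.T @ Dv_is
    return np.eye(H.shape[0]) - theta

def two_channel_layer(X, X0, L, W, alpha=0.1, beta=0.5, gamma=0.5):
    # Two spectral channels: (I - L) smooths features (low frequencies),
    # L preserves feature differences (high frequencies).
    n, d = X.shape
    low = (np.eye(n) - L) @ X
    high = L @ X
    mixed = low + gamma * high              # multi-frequency combination
    # Initial residual (alpha * X0) plus identity mapping on the weights
    # ((1 - beta) I + beta W), as in GCNII-style deep architectures.
    Z = (1.0 - alpha) * mixed + alpha * X0
    return np.maximum(Z @ ((1.0 - beta) * np.eye(d) + beta * W), 0.0)

# Toy hypergraph: 5 vertices, 3 hyperedges (incidence matrix H).
rng = np.random.default_rng(0)
H = np.array([[1, 0, 0],
              [1, 1, 0],
              [0, 1, 1],
              [0, 1, 1],
              [0, 0, 1]], dtype=float)
L = hypergraph_laplacian(H)
X0 = rng.standard_normal((5, 4))            # initial node features
W = rng.standard_normal((4, 4)) * 0.1       # shared "learned" weights
X = X0
for _ in range(8):                          # stack 8 layers
    X = two_channel_layer(X, X0, L, W)
print(X.shape)
```

The high-pass channel and the initial residual both counteract the collapse of node features toward a common value, which is the oversmoothing behavior the paper analyzes.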


Context

Venue
AAAI Conference on Artificial Intelligence
Paper id
364578803671150060