Arrow Research search

Author name cluster

Renda Han

Possible papers associated with this exact author name in Arrow. This page groups case-insensitive exact name matches and is not a full identity disambiguation profile.

6 papers
2 author rows

Possible papers

6

AAAI Conference 2026 · Conference Paper

Causally-Aware Attribute Completion for Incomplete Federated Graph Clustering

  • Jingxin Liu
  • Wenxuan Tu
  • Haotian Wang
  • Renda Han
  • Haoyi Li
  • Junlong Wu
  • Xiangyan Tang

Node-level federated graph clustering allows multiple unlabeled subgraph holders to collaboratively train on node-level tasks without sharing private information. Existing methods usually assume that the node attributes are complete and have achieved promising progress under this assumption. However, in Federated Graph Learning (FGL) scenarios, this assumption is overly strict due to failures in data collection devices. Consequently, most existing FGL frameworks struggle to extract useful features from attribute-incomplete graphs for clustering, yet the issue remains underexplored. To bridge this gap, we propose a causally-aware attribute completion method for Incomplete Federated Graph Clustering (IFedGC), which constructs a reliable global causal structure that incorporates clustering-friendly information to guide attribute completion for each subgraph. Specifically, in the attribute completion step, we first construct a causal structure to extract the causal relationships between initialized features, and then upload it to the server. Subsequently, we integrate the multiple uploaded causal structures into a global one to achieve cross-client attribute completion. Moreover, to support reliable clustering, we first collect high-confidence cluster centroids from each subgraph using a Graph Neural Network (GNN) model and subsequently aggregate these centroids on the server. These two steps are seamlessly integrated into a unified FGL framework to obtain a clustering-oriented causal structure, which is sent back to each client to promote high-quality attribute completion for better clustering. Extensive results on five benchmark datasets demonstrate the effectiveness and superiority of IFedGC against its competitors.
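The abstract's pipeline — clients estimate a structure over features, the server merges the structures, and the merged structure guides attribute completion — can be illustrated with a toy sketch. Nothing below is from the paper: the correlation-based "structure", the averaging aggregation, and the weighted imputation are simple stand-ins for IFedGC's causal discovery and server-side integration.

```python
import numpy as np

def local_causal_structure(X, mask):
    # Client: estimate feature-feature relationships from observed entries
    # only (a correlation proxy, standing in for causal discovery).
    Xo = np.where(mask, X, 0.0)
    C = Xo.T @ Xo
    n = np.linalg.norm(Xo, axis=0, keepdims=True) + 1e-8
    return C / (n.T @ n)

def aggregate(structures):
    # Server: merge client structures into one global structure
    # (plain averaging here, for illustration only).
    return np.mean(structures, axis=0)

def complete(X, mask, G):
    # Client: impute each missing attribute from the observed ones,
    # weighted by the global structure G.
    X_hat = X.copy()
    for i in range(X.shape[0]):
        obs, miss = mask[i], ~mask[i]
        if miss.any() and obs.any():
            w = G[np.ix_(miss, obs)]
            denom = np.abs(w).sum(axis=1) + 1e-8
            X_hat[i, miss] = (w @ X[i, obs]) / denom
    return X_hat
```

With two perfectly correlated features, the global structure learned on one client lets another client recover a missing second attribute from the observed first one.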

AAAI Conference 2026 · Conference Paper

Federated Graph-level Clustering Network with Attribute Inference

  • Renda Han
  • Junlong Wu
  • Wenxuan Tu
  • Jingxin Liu
  • Haotian Wang
  • Jieren Cheng

With the rise of vertical segmentation in real-world data, federated graph-level clustering has gained significant attention in recent years. However, the inherent missing attributes in graph datasets held by certain clients lead to suboptimal local parameter updates and misaligned global parameter consensus. This results in knowledge shifts during negotiation that ultimately impair overall clustering performance, an issue that remains largely underexplored in current research. To bridge this gap, we propose a novel deep learning network called Federated Graph-level Clustering Network with Attribute Inference (FedAI), which utilizes high-confidence prior knowledge from each domain and multi-party collaborative optimization to achieve efficient inference of unknown features. Specifically, on the client, high-confidence graph samples are projected into a latent space, from which we extract and upload irreversible path digest information and attribute-oriented inference signals. On the server, we first identify affinity relationships hierarchically via an improved graph kernel method, and then infer the features of clients lacking node attributes through a prior-structure-guided recovery operator, facilitating inter-client knowledge transfer for better clustering. Experimental results on 15 cross-dataset and cross-domain non-IID graph datasets demonstrate that FedAI consistently outperforms existing methods.

AAAI Conference 2026 · Conference Paper

Personalized Federated Graph-Level Clustering Network

  • Jingxin Liu
  • Wenxuan Tu
  • Renda Han
  • Junlong Wu
  • Haotian Wang
  • Guohui Liu
  • Xiangyan Tang
  • Yue Yang

In the federated clustering task, structural heterogeneity across clients inevitably impedes effective multi-source information sharing. To solve this issue, Personalized Federated Learning (PFL) has emerged as a potentially effective solution for image and text clustering. Unlike Euclidean data, however, graph-structured data exhibits diverse and fragile local patterns that are widespread in real-world scenarios. Multi-graph data analysis in the federated learning setting is thus challenging and important, yet remains underexplored. This motivates us to propose a novel PERsonalized Federated graph-lEvel Clustering neTwork (PERFECT), which generates a specialized aggregation strategy for each client by uploading key model parameters and representative samples without sharing private information. Specifically, for each client, we first reconstruct privacy-preserving representative samples in a min-max optimization manner and then upload these samples to the server for subsequent personalized parameter aggregation. On the server, we first extract graph-level embeddings from the uploaded data, and then estimate affinities among the learned embeddings to formulate a personalized aggregation strategy for each client. Subsequently, to help each local model better identify cluster boundaries, we utilize clustering-wise gradients to update the key components of the personalized model parameters from the server. Extensive experimental results demonstrate the effectiveness and superiority of PERFECT over its competitors.
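The server-side step — estimate affinities among client embeddings, then build a per-client aggregation strategy — is the part that lends itself to a compact sketch. The cosine-similarity affinity and softmax weighting below are generic choices for illustration; the paper's actual affinity estimation and parameter selection are not specified here.

```python
import numpy as np

def personalized_aggregate(params, embeddings, temp=1.0):
    # params: one parameter vector per client; embeddings: one graph-level
    # embedding per client. Each client receives its own aggregate,
    # weighted by a softmax over embedding affinities.
    E = np.stack(embeddings).astype(float)
    E = E / (np.linalg.norm(E, axis=1, keepdims=True) + 1e-8)
    sim = E @ E.T                       # cosine affinities between clients
    W = np.exp(sim / temp)
    W = W / W.sum(axis=1, keepdims=True)  # row-normalized mixing weights
    P = np.stack(params).astype(float)
    return W @ P                        # row k = personalized params of client k
```

A client whose embedding is close to a peer's pulls its personalized parameters toward that peer, while a structurally dissimilar client stays closer to its own local model — the intended effect of personalization.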

AAAI Conference 2026 · Conference Paper

V-Pruner: A Fast and Globally-informed Token Pruning Framework for Vision Transformer

  • Guangzhen Yao
  • Jiayun Zheng
  • Zezhou Wang
  • Wenxin Zhang
  • Renda Han
  • Chuangxin Zhao
  • Zeyu Zhang
  • Runhao Liu

Vision Transformer (ViT) has become one of the cornerstones of the computer vision field, demonstrating exceptional performance. However, its inherent high computational complexity and inference latency still pose significant obstacles to deployment in resource-constrained environments. Token pruning, which removes less informative tokens, offers an effective strategy to reduce computational overhead. However, existing pruning methods largely rely on static or local token importance scores. This myopic approach fundamentally overlooks the sequential dependency of pruning decisions and fails to capture the interaction effects between pruning decisions across layers, neglecting the global interactions between mask variables. To address this limitation, we propose V-Pruner, a fast and globally-informed token pruning framework for Vision Transformer. V-Pruner first leverages Fisher information to perform an initial assessment of token importance, providing a principled prior for pruning decisions. Building on this, V-Pruner introduces a Reinforcement Learning (RL) algorithm based on Proximal Policy Optimization (PPO), recasting token pruning as a global sequential decision process. The algorithm uses a composite reward signal that incorporates both model performance and computational cost to guide policy exploration, effectively evaluating the long-term impact of different combinations of pruning decisions on global model performance. Extensive experiments on ViT-L, DeiT-B, DeiT-S, and DeiT-T demonstrate that V-Pruner achieves a better balance among accuracy, GFLOPs, inference speed, and training time, surpassing existing mainstream ViT pruning algorithms in overall performance.
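The first stage of the pipeline — score tokens with a Fisher-information proxy, then keep only the highest-scoring fraction — admits a minimal sketch. Note the second function is a static greedy selection for illustration only; it replaces, and does not implement, the paper's PPO-based sequential decision policy.

```python
def fisher_scores(grads):
    # Fisher-information proxy: squared gradient magnitude per token.
    # grads: per-token gradient vectors (one list of floats per token).
    return [sum(g * g for g in tok) for tok in grads]

def prune_tokens(scores, keep_ratio):
    # Keep the top fraction of tokens by score; return the kept indices
    # in their original order so positional structure is preserved.
    k = max(1, int(len(scores) * keep_ratio))
    top = sorted(range(len(scores)), key=lambda i: -scores[i])[:k]
    return sorted(top)
```

A PPO policy would instead decide per layer which mask bits to flip, receiving a reward that trades accuracy against GFLOPs; the Fisher scores above would serve as its initial prior.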

AAAI Conference 2025 · Conference Paper

Federated Graph-Level Clustering Network

  • Jingxin Liu
  • Jieren Cheng
  • Renda Han
  • Wenxuan Tu
  • Jiaxin Wang
  • Xin Peng

Federated graph learning (FGL), which excels at analyzing non-IID graphs while protecting data privacy, has recently emerged as a hot topic. Existing FGL methods usually train the client model using labeled data and then collaboratively learn a global model without sharing local graph data. However, in real-world scenarios, the lack of data annotations impedes the negotiation of multi-source information at the server, leading to sub-optimal feedback to the clients. To address this issue, we propose a novel unsupervised learning framework called Federated Graph-level Clustering Network (FedGCN), which collects the topology-oriented features of non-IID graphs from clients to generate global consensus representations through multi-source clustering structure sharing. Specifically, on the client, we first preserve the prototype features of each cluster from the structure-oriented embedding through clustering, and then upload the learned prototypes, which are hard to reconstruct into the raw graph data. On the server, we generate consensus prototypes from the multiple condensed structure-oriented signals through Gaussian estimation; these are subsequently transferred to each client to strengthen the encoding capacity of the local model for better clustering. Extensive experiments across multiple non-IID graph datasets demonstrate the effectiveness and superiority of FedGCN against its competitors.
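The client-upload and server-consensus steps can be sketched as follows. Taking the per-slot mean across clients is the maximum-likelihood estimate of a Gaussian mean, used here as a minimal stand-in for the paper's Gaussian estimation; cluster assignments and embeddings are assumed given.

```python
import numpy as np

def client_prototypes(Z, labels, k):
    # Client: one mean embedding per cluster. Prototypes summarize the
    # clustering structure but are hard to invert to the raw graphs.
    return np.stack([Z[labels == c].mean(axis=0) for c in range(k)])

def consensus_prototypes(all_protos):
    # Server: stack per-client prototypes (clients, k, d) and fit a
    # Gaussian per cluster slot; its mean is the consensus prototype.
    return np.stack(all_protos).mean(axis=0)
```

The consensus prototypes would then be sent back to each client as targets that sharpen the local encoder's cluster structure.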

ICML Conference 2025 · Conference Paper

Federated Node-Level Clustering Network with Cross-Subgraph Link Mending

  • Jingxin Liu 0006
  • Renda Han
  • Wenxuan Tu
  • Haotian Wang
  • Junlong Wu
  • Jieren Cheng

Subgraphs of a complete graph are usually distributed across multiple devices and can only be accessed locally because the raw data cannot be directly shared. However, existing node-level federated graph learning suffers from at least one of the following issues: 1) heavily relying on labeled graph samples that are difficult to obtain in real-world applications, and 2) partitioning a complete graph into several subgraphs, which inevitably causes missing links and leads to sub-optimal sample representations. To solve these issues, we propose a novel Federated Node-level Clustering Network (FedNCN), which mends the destroyed cross-subgraph links using clustering prior knowledge. Specifically, within each client, we first design an MLP-based projector to implicitly preserve the key clustering properties of a subgraph in a denoising-learning-like manner, and then upload the resultant clustering signals, which are hard to reconstruct, for subsequent cross-subgraph link restoration. On the server, we maximize the potential affinity between subgraphs stemming from the clustering signals via graph similarity estimation and minimize redundant links via the N-Cut criterion. Moreover, we employ a GNN-based generator to learn consensus prototypes from the mended graph, enabling the MLP-GNN joint-optimized learner to enhance data privacy during data transmission and further promote the local model for better clustering. Extensive experiments demonstrate the superiority of FedNCN.
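The link-mending step — estimate affinity between the clustering signals of two subgraphs and keep only strong cross-subgraph links — can be sketched in a few lines. Cosine similarity with a hard threshold is an illustrative simplification; it is not the paper's graph similarity estimation, and the threshold plays the role of, but does not implement, the N-Cut criterion for discarding redundant links.

```python
import numpy as np

def mend_links(protos_a, protos_b, tau=0.8):
    # Propose cross-subgraph links between cluster prototypes of two
    # clients whose cosine similarity exceeds tau; weaker (redundant)
    # candidate links are dropped.
    A = protos_a / (np.linalg.norm(protos_a, axis=1, keepdims=True) + 1e-8)
    B = protos_b / (np.linalg.norm(protos_b, axis=1, keepdims=True) + 1e-8)
    sim = A @ B.T
    return [(i, j) for i in range(sim.shape[0])
                   for j in range(sim.shape[1]) if sim[i, j] > tau]
```

Matching prototype pairs across clients yields candidate edges that reconnect clusters split by the graph partition, which is the intuition behind mending with clustering prior knowledge.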