Arrow Research search

Author name cluster

Yuling Wang

Possible papers associated with this exact author name in Arrow. This page groups case-insensitive exact name matches and is not a full identity disambiguation profile.

6 papers
2 author rows

Possible papers

6

IJCAI 2025 Conference Paper

HeTa: Relation-wise Heterogeneous Graph Foundation Attack Model

  • Yuling Wang
  • Zihui Chen
  • Pengfei Jiao
  • Xiao Wang

Heterogeneous Graph Neural Networks (HGNNs) are vulnerable to adversarial perturbations, highlighting the need for tailored attacks to assess their robustness and ensure security. However, existing HGNN attacks often require complex retraining of parameters to generate specific perturbations for new scenarios. Recently, foundation models have opened new horizons for the generalization of graph neural networks by capturing shared semantics across various graph distributions. This leads us to ask: can we design a foundation attack model for HGNNs that enables generalizable perturbations across different HGNNs and quickly adapts to new heterogeneous graphs (HGs)? Empirical findings reveal that, despite significant differences in model design and parameter space, different HGNNs surprisingly share common vulnerability patterns from a relation-aware perspective. Therefore, we explore how to design foundation HGNN attack criteria by mining shared attack units. In this paper, we propose HeTa, a novel relation-wise heterogeneous graph foundation attack model. We introduce a foundation surrogate model to align heterogeneity and identify the importance of shared relation-aware attack units. Building on this, we implement a serialized relation-by-relation attack based on the identified relational weights. In this way, the perturbation can be transferred to various target HGNNs and easily fine-tuned for new HGs. Extensive experiments demonstrate the strong attack performance and generalizability of our method.

AAAI 2025 Conference Paper

LightPROF: A Lightweight Reasoning Framework for Large Language Model on Knowledge Graph

  • Tu Ao
  • Yanhua Yu
  • Yuling Wang
  • Yang Deng
  • Zirui Guo
  • Liang Pang
  • Pinghui Wang
  • Tat-Seng Chua

Large Language Models (LLMs) have impressive capabilities in text understanding and zero-shot reasoning. However, delays in knowledge updates may cause them to reason incorrectly or produce harmful results. Knowledge Graphs (KGs) provide rich and reliable contextual information for the reasoning process of LLMs by structurally organizing and connecting a wide range of entities and relations. Existing KG-based LLM reasoning methods only inject KG knowledge into prompts in textual form, ignoring its structural information. Moreover, they mostly rely on closed-source models or open-source models with large parameter counts, which leads to high resource consumption. To address this, we propose LightPROF, a novel Lightweight and efficient Prompt-learning ReasOning Framework for KGQA, which leverages the full potential of LLMs to tackle complex reasoning tasks in a parameter-efficient manner. Specifically, LightPROF follows a “Retrieve-Embed-Reason” process: it first accurately and stably retrieves the corresponding reasoning graph from the KG through a retrieval module. Next, through a Transformer-based Knowledge Adapter, it extracts and integrates factual and structural information from the KG, then maps this information into the LLM’s token embedding space, creating an LLM-friendly prompt to be used by the LLM for the final reasoning. Additionally, LightPROF only requires training the Knowledge Adapter and is compatible with any open-source LLM. Extensive experiments on two public KGQA benchmarks demonstrate that LightPROF achieves superior performance with small-scale LLMs. Furthermore, LightPROF shows significant advantages in input token count and reasoning time.

ECAI 2024 Conference Paper

Hop-based Heterogeneous Graph Transformer

  • Zixuan Yang 0001
  • Xiao Wang 0017
  • Yanhua Yu
  • Yuling Wang
  • Kangkang Lu 0002
  • Zirui Guo
  • Xiting Qin
  • Yunshan Ma 0002

The Graph Transformer (GT) has shown significant ability in processing graph-structured data, addressing limitations of graph neural networks such as over-smoothing and over-squashing. However, applying GTs to real-world heterogeneous graphs (HGs) with complex topology still presents numerous challenges. First, it is difficult to design a tokenizer that is compatible with heterogeneity. Second, the complexity of the Transformer hampers the acquisition of high-order neighbor information in HGs. In this paper, we propose a novel Hop-based Heterogeneous Graph Transformer (H2Gormer) framework, paving a promising path for HGs to benefit from the capabilities of Transformers. We propose a Heterogeneous Hop-based Token Generation module to obtain high-order information in a flexible way. Specifically, to enrich the fine-grained heterogeneous semantics of each token, we propose a tailored multi-relational encoder to encode the hop-based neighbors. In this way, the resulting token embeddings are input to the Hop-based Transformer to obtain node representations, which are then combined with position embeddings to obtain the final encoding. Extensive experiments on four datasets demonstrate the effectiveness of H2Gormer.

IJCAI 2023 Conference Paper

Intent-aware Recommendation via Disentangled Graph Contrastive Learning

  • Yuling Wang
  • Xiao Wang
  • Xiangzhou Huang
  • Yanhua Yu
  • Haoyang Li
  • Mengdi Zhang
  • Zirui Guo
  • Wei Wu

Graph neural network (GNN) based recommender systems have become one of the mainstream trends due to their powerful ability to learn from user behavior data. Understanding user intents from behavior data is the key to recommender systems, which poses two basic requirements for GNN-based recommenders. The first is how to learn complex and diverse intents, especially when user behavior data are often inadequate in practice. The second is that different behaviors have different intent distributions, so how can their relations be established for a more explainable recommender system? In this paper, we present Intent-aware Recommendation via Disentangled Graph Contrastive Learning (IDCL), which simultaneously learns interpretable intents and behavior distributions over those intents. Specifically, we first model the user behavior data as a user-item-concept graph and design a GNN-based behavior disentangling module to learn the different intents. Then we propose intent-wise contrastive learning to enhance the intent disentangling and meanwhile infer the behavior distributions. Finally, coding rate reduction regularization is introduced to make the behaviors of different intents orthogonal. Extensive experiments demonstrate the effectiveness of IDCL in terms of both substantial performance improvement and interpretability.

IJCAI 2022 Conference Paper

Ensemble Multi-Relational Graph Neural Networks

  • Yuling Wang
  • Hao Xu
  • Yanhua Yu
  • Mengdi Zhang
  • Zhenhao Li
  • Yuji Yang
  • Wei Wu

It is well established that graph neural networks (GNNs) can be interpreted and designed from the perspective of an optimization objective. With a clear optimization objective, the deduced GNN architecture has a sound theoretical foundation and can flexibly remedy the weaknesses of GNNs. However, this optimization objective has only been proved for GNNs on single-relational graphs. Can we infer a new type of GNN for multi-relational graphs by extending this optimization objective, so as to simultaneously solve the issues in previous multi-relational GNNs, e.g., over-parameterization? In this paper, we propose a novel ensemble multi-relational GNN by designing an ensemble multi-relational (EMR) optimization objective. This EMR optimization objective derives an iterative updating rule, which can be formalized as an ensemble message passing (EnMP) layer over multiple relations. We further analyze the properties of the EnMP layer, e.g., its relationship with multi-relational personalized PageRank. Finally, a new multi-relational GNN is proposed that effectively alleviates the over-smoothing and over-parameterization issues. Extensive experiments on four benchmark datasets demonstrate the effectiveness of the proposed model.

YNIMG 2020 Journal Article

Bibliometric evaluation of 2000–2019 publications on functional near-infrared spectroscopy

  • Wangwang Yan
  • Kangyong Zheng
  • Linman Weng
  • Changcheng Chen
  • Suparata Kiartivich
  • Xue Jiang
  • Xuan Su
  • Yuling Wang

This study aimed to explore and analyze research trends and frontiers in functional near-infrared spectroscopy (fNIRS) over the past 20 years and to identify collaboration networks. fNIRS-related publications from 2000 to 2019 were retrieved from the Web of Science database; a total of 1727 publications satisfied the search criteria. Bibliometric visualization analysis of active authors, journals, institutions, countries, references, and keywords was conducted. The number of annual related publications increased remarkably over the years. Fallgatter published the largest number of fNIRS-related papers (83). NeuroImage not only published the most papers among the top 10 journals (157 articles) but also had the highest impact factor (2018 IF = 5.812). The University of Tubingen had the highest number of fNIRS-related publications in the past 20 years. The United States ranked first in terms of comprehensive influence in this field. In recent years, burst keywords (e.g., infant, social interaction, and older adult) and a series of references with citation bursts provided clues to research frontiers.