Author name cluster

Shihui Wang

Papers that may be associated with this exact author name in Arrow. This page groups case-insensitive exact name matches; it is not a full identity-disambiguation profile.

2 papers
1 author row

Possible papers (2)

IJCAI 2025 · Conference Paper

DGCPL: Dual Graph Distillation for Concept Prerequisite Relation Learning

  • Miao Zhang
  • Jiawei Wang
  • Jinying Han
  • Kui Xiao
  • Zhifei Li
  • Yan Zhang
  • Hao Chen
  • Shihui Wang

Concept prerequisite relations determine the order in which knowledge concepts in a domain should be learned, which has an important impact on teachers' course design and students' personalized learning. Existing research usually predicts concept prerequisite relations from the knowledge perspective and rarely attends to learners' learning behavior. We propose a Dual Graph Distillation method for Concept Prerequisite Relation Learning (DGCPL). Specifically, DGCPL constructs a dual graph structure from both the knowledge and learning-behavior perspectives, capturing high-order knowledge features and learning behavior features through a concept-resource hypergraph and a learning behavior graph, respectively. In addition, we introduce gated knowledge distillation to fuse the structural information of concept nodes across the two graphs, yielding a more comprehensive concept embedding and enabling accurate prediction of prerequisite relations. On three public benchmark datasets, we compare DGCPL with eight graph-based baselines and five traditional classification baselines. The experimental results show that DGCPL achieves state-of-the-art performance in learning concept prerequisite relations. Our code is available at https://github.com/wisejw/DGCPL.
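
A minimal sketch of the gated fusion step may help make the abstract concrete. This is not the authors' implementation (that lives in the repository linked above); it only illustrates, in PyTorch, how a learned gate could blend a concept's knowledge-view embedding with its behavior-view embedding before scoring a concept pair. All names, dimensions, and the stand-in random tensors are assumptions for exposition.

```python
# Illustrative sketch only, not the DGCPL code from the linked repository.
# Two embeddings per concept stand in for the outputs of the concept-resource
# hypergraph encoder and the learning behavior graph encoder.
import torch
import torch.nn as nn


class GatedFusion(nn.Module):
    """Blend two views of a concept embedding with a learned sigmoid gate."""

    def __init__(self, dim: int):
        super().__init__()
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, h_know: torch.Tensor, h_behav: torch.Tensor) -> torch.Tensor:
        # g in (0, 1) decides, per dimension, how much each view contributes.
        g = torch.sigmoid(self.gate(torch.cat([h_know, h_behav], dim=-1)))
        return g * h_know + (1.0 - g) * h_behav


class PrerequisitePredictor(nn.Module):
    """Score whether concept a is a prerequisite of concept b."""

    def __init__(self, dim: int):
        super().__init__()
        self.fuse = GatedFusion(dim)
        self.scorer = nn.Sequential(
            nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, 1)
        )

    def forward(self, a_know, a_behav, b_know, b_behav):
        za = self.fuse(a_know, a_behav)  # fused embedding of concept a
        zb = self.fuse(b_know, b_behav)  # fused embedding of concept b
        return self.scorer(torch.cat([za, zb], dim=-1)).squeeze(-1)  # logit


if __name__ == "__main__":
    torch.manual_seed(0)
    model = PrerequisitePredictor(dim=64)
    # Random tensors stand in for encoder outputs over 8 concept pairs.
    a_know, a_behav, b_know, b_behav = (torch.randn(8, 64) for _ in range(4))
    print(model(a_know, a_behav, b_know, b_behav).shape)  # torch.Size([8])
```

A per-dimension sigmoid gate lets the model weight the two views differently for each concept, rather than averaging them uniformly.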

AAAI 2025 · Conference Paper

Learning Concept Prerequisite Relation via Global Knowledge Relation Optimization

  • Miao Zhang
  • Jiawei Wang
  • Kui Xiao
  • Shihui Wang
  • Yan Zhang
  • Hao Chen
  • Zhifei Li

Learning concept prerequisite relations helps learners master and build a logically coherent knowledge structure. Many studies use graph neural networks to create heterogeneous knowledge networks that enhance concept representations. However, different types of relations in these networks can influence each other, and existing research often focuses solely on concept relations while neglecting other types of knowledge connections. To address this issue, this paper proposes a novel concept prerequisite relation learning model, the Global Knowledge Relation Optimization Model (GKROM). Specifically, we separately capture the impact of different knowledge relation types on document and concept semantic representations, and then integrate the two. We further introduce multi-objective learning to optimize the knowledge relation network from a global perspective. Through this optimization, GKROM learns richer semantic representations for concepts and documents, improving the accuracy of concept prerequisite relation learning. Extensive experiments on public datasets demonstrate the effectiveness of GKROM, which achieves state-of-the-art performance in concept prerequisite relation learning.
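
As a rough illustration of the multi-objective idea, the sketch below sums a main prerequisite-prediction loss with one auxiliary relation loss under a fixed weight. The loss composition, the single auxiliary objective, and the weight are assumptions made for exposition, not GKROM's actual objective.

```python
# Illustrative sketch only, not GKROM's objective. The main loss scores
# concept prerequisite predictions; the auxiliary loss stands in for another
# knowledge relation type (e.g. concept-document links) optimized jointly.
import torch
import torch.nn.functional as F


def multi_objective_loss(prereq_logits: torch.Tensor,
                         prereq_labels: torch.Tensor,
                         aux_logits: torch.Tensor,
                         aux_labels: torch.Tensor,
                         aux_weight: float = 0.5) -> torch.Tensor:
    main = F.binary_cross_entropy_with_logits(prereq_logits, prereq_labels)
    aux = F.binary_cross_entropy_with_logits(aux_logits, aux_labels)
    return main + aux_weight * aux


if __name__ == "__main__":
    torch.manual_seed(0)
    prereq_logits = torch.randn(16)
    prereq_labels = torch.randint(0, 2, (16,)).float()
    aux_logits = torch.randn(32)
    aux_labels = torch.randint(0, 2, (32,)).float()
    print(multi_objective_loss(prereq_logits, prereq_labels,
                               aux_logits, aux_labels).item())
```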