ICML 2025 Conference Paper
Advancing Personalized Learning with Neural Collapse for Long-Tail Challenge
- Hanglei Hu
- Yingying Guo
- Zhikang Chen
- Sen Cui
- Fei Wu
- Kun Kuang
- Min Zhang
- Bo Jiang
Personalized learning, especially data-driven methods, has garnered widespread attention in recent years, aiming to meet individual student needs. However, many works rely on the implicit assumption that benchmarks are high-quality and well-annotated, which limits their practical applicability. In real-world scenarios, these benchmarks often exhibit long-tail distributions, which significantly degrade model performance. To address this challenge, we propose a novel method called Neural-Collapse-Advanced personalized Learning (NCAL), designed to learn features that conform to the same simplex equiangular tight frame (ETF) structure. NCAL introduces Text-modality Collapse (TC) regularization to optimize the distribution of text embeddings within the large language model (LLM) representation space. Notably, NCAL is model-agnostic, making it compatible with various architectures and approaches, thereby ensuring broad applicability. Extensive experiments demonstrate that NCAL effectively enhances existing methods, achieving new state-of-the-art performance. Additionally, NCAL mitigates class imbalance, significantly improving the model's generalization ability.
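For context on the target geometry, a simplex ETF for K classes is a set of unit-norm prototype vectors whose pairwise inner products all equal -1/(K-1). The sketch below constructs such a frame in NumPy; it illustrates the ETF structure referenced in the abstract, not the paper's actual NCAL implementation, and all function names are illustrative.

```python
import numpy as np

def simplex_etf(num_classes, dim, seed=0):
    """Build a d x K simplex ETF: K equiangular unit vectors in R^d (d >= K).

    Follows the standard construction M = sqrt(K/(K-1)) * U (I - (1/K) 11^T),
    where U has orthonormal columns. This is a generic sketch, not NCAL itself.
    """
    K = num_classes
    rng = np.random.default_rng(seed)
    # Random orthonormal columns U via reduced QR of a Gaussian matrix.
    U, _ = np.linalg.qr(rng.standard_normal((dim, K)))
    # Centering matrix projects out the all-ones direction.
    center = np.eye(K) - np.ones((K, K)) / K
    return np.sqrt(K / (K - 1)) * U @ center  # columns = class prototypes

M = simplex_etf(5, 16)
G = M.T @ M  # Gram matrix: 1 on the diagonal, -1/(K-1) off-diagonal
```

Regularizing class-mean features toward such prototypes is one common way to counter the collapsed, imbalanced geometry that long-tail data induces, since the ETF treats all classes symmetrically regardless of their sample counts.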