
AAAI 2026

Learn from Global Correlations: Enhancing Evolutionary Algorithm via Spectral GNN

Conference Paper · AAAI Technical Track on Machine Learning VI · Artificial Intelligence

Abstract

Evolutionary algorithms (EAs) are optimization algorithms that simulate natural selection and genetic mechanisms. Despite advancements, existing EAs have two main issues: (1) they rarely update next-generation individuals based on global correlations, which limits comprehensive learning; (2) it is challenging to balance exploration and exploitation: excessive exploitation leads to premature convergence to local optima, while excessive exploration makes the search prohibitively slow. Existing EAs also rely heavily on manual parameter settings; inappropriate parameters can disrupt the exploration-exploitation balance and further impair performance. To address these challenges, we propose a novel evolutionary algorithm framework called Graph Neural Evolution (GNE). Unlike traditional EAs, GNE represents the population as a graph, where nodes correspond to individuals and edges capture their relationships, thereby effectively leveraging global information. GNE then uses spectral graph neural networks (GNNs) to decompose evolutionary signals into their frequency components and designs a filtering function to fuse them: high-frequency components capture diverse global information, while low-frequency components capture more consistent information. This explicit frequency-filtering strategy controls global-scale features directly through frequency components, overcoming the limitations of manual parameter settings and making exploration-exploitation control more interpretable and effective. Extensive evaluations on nine benchmark functions (e.g., Sphere, Rastrigin, and Rosenbrock) demonstrate that GNE consistently outperforms both classical algorithms (GA, DE, CMA-ES) and advanced algorithms (SDAES, RL-SHADE) under various conditions, including original, noise-corrupted, and optimal-solution-deviation scenarios. GNE achieves solution quality several orders of magnitude better than the other algorithms (e.g., a mean of 3.07e-20 on Sphere vs. 1.51e-07).
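The core idea in the abstract — view the population as a graph, take a graph Fourier transform via the Laplacian, then filter low and high frequencies differently — can be illustrated with a minimal sketch. This is not the authors' implementation; the Gaussian similarity graph, the median-split filter, and the mixing weight `alpha` are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 8, 2
pop = rng.normal(size=(n, d))            # population: n individuals in d dimensions

# Fully connected similarity graph over individuals (assumed Gaussian kernel)
dist = np.linalg.norm(pop[:, None] - pop[None, :], axis=-1)
W = np.exp(-dist**2)
np.fill_diagonal(W, 0.0)
L = np.diag(W.sum(axis=1)) - W           # combinatorial graph Laplacian

# Spectral decomposition: Laplacian eigenvectors act as graph Fourier modes
evals, evecs = np.linalg.eigh(L)
coeffs = evecs.T @ pop                   # graph Fourier transform of the population

# Illustrative filter: keep low frequencies (consistent, exploitation-like
# signal) at full strength; damp high frequencies (diverse, exploration-like
# signal) by an assumed mixing weight alpha.
alpha = 0.5
h = np.where(evals <= np.median(evals), 1.0, alpha)
new_pop = evecs @ (h[:, None] * coeffs)  # filtered next-generation proposal
```

Because the constant (zero-frequency) mode passes through unchanged while high-frequency deviations are shrunk, the filtered population stays centered on the same mean but contracts toward its smooth, consensus structure; raising `alpha` toward 1 preserves more diversity.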


Context

Venue: AAAI Conference on Artificial Intelligence
Archive span: 1980-2026
Indexed papers: 28718
Paper id: 584740310713538614