
NeurIPS 2025

Gradient Alignment in Physics-informed Neural Networks: A Second-Order Optimization Perspective

Conference Paper Main Conference Track Artificial Intelligence · Machine Learning

Abstract

Physics-informed neural networks (PINNs) have shown significant promise in computational science and engineering, yet they often face optimization challenges and limited accuracy. In this work, we identify directional gradient conflicts during PINN training as a critical bottleneck. We introduce a novel gradient alignment score to systematically diagnose this issue through both theoretical analysis and empirical experiments. Building on these insights, we show that (quasi) second-order optimization methods inherently mitigate gradient conflicts, thereby consistently outperforming the widely used Adam optimizer. Among them, we highlight the effectiveness of SOAP (Vyas et al., 2024) by establishing its connection to Newton's method. Empirically, SOAP achieves state-of-the-art results on 10 challenging PDE benchmarks, including the first successful application of PINNs to turbulent flows at Reynolds numbers up to 10,000. It yields 2–10x accuracy improvements over existing methods while maintaining computational scalability, advancing the frontier of neural PDE solvers for real-world, multi-scale physical systems. All code and datasets used in this work are publicly available at: https://github.com/PredictiveIntelligenceLab/jaxpi/tree/pirate
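The abstract does not spell out how the gradient alignment score is computed. A minimal sketch, assuming the score is the cosine similarity between the gradients of two PINN loss terms (e.g. the PDE-residual loss and the boundary-condition loss, both of which are stand-in toy functions here), written in JAX to match the linked jaxpi codebase:

```python
import jax
import jax.numpy as jnp

# Hypothetical toy stand-in for a PDE-residual loss over network parameters.
def residual_loss(params):
    return jnp.sum((params ** 2 - 1.0) ** 2)

# Hypothetical toy stand-in for a boundary-condition loss.
def boundary_loss(params):
    return jnp.sum((params - 0.5) ** 2)

def alignment_score(params):
    """Cosine similarity between the two loss-term gradients.

    A value near -1 indicates directionally conflicting gradients,
    a value near +1 indicates aligned gradients.
    """
    g_r = jax.grad(residual_loss)(params)
    g_b = jax.grad(boundary_loss)(params)
    return jnp.dot(g_r, g_b) / (jnp.linalg.norm(g_r) * jnp.linalg.norm(g_b))

params = jnp.array([0.2, -0.7, 1.3])
score = alignment_score(params)  # scalar in [-1, 1]
```

Under this reading, a persistently negative score during training would flag the gradient conflicts the paper identifies, and a (quasi) second-order preconditioner such as SOAP would rotate/rescale the updates so the conflict is mitigated; the paper's exact definition may differ.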

Authors

Keywords

No keywords are indexed for this paper.

Context

Venue
Annual Conference on Neural Information Processing Systems
Archive span
1987-2025
Indexed papers
30776
Paper id
481114615359927843