SGNO: Spectral Generator Neural Operators for Stable Long Horizon PDE Rollouts
arXiv:2602.18801v1 Announce Type: new Abstract: Neural operators provide fast PDE surrogates and often generalize across parameters and resolutions. However, in the short train long test setting, autoregressive rollouts can become unstable. This typically happens for two reasons: one step errors accumulate over time, and high frequency components feed back and grow. We introduce the Spectral Generator Neural Operator (SGNO), a residual time stepper that targets both effects. For the linear part, SGNO uses an exponential time differencing update in Fourier space with a learned diagonal generator. We constrain the real part of this generator to be nonpositive, so iterating the step does not amplify the linear dynamics. For nonlinear dynamics, SGNO adds a gated forcing term with channel mixing within each Fourier mode, which keeps the nonlinear update controlled. To further limit high frequency feedback, SGNO applies spectral truncation and an optional smooth mask on the forcing pathway. We derive a one step amplification bound and a finite horizon rollout error bound. The bound separates generator approximation error from nonlinear mismatch and gives sufficient conditions under which the latent $L^2$ norm does not grow across rollout steps. On APEBench spanning 1D, 2D, and 3D PDE families, SGNO achieves lower long horizon error and longer stable rollout lengths than strong neural operator baselines. Ablations confirm the roles of the generator constraint, gating, and filtering. The code is available at https://github.com/lijy32123-cloud/SGNO.
Executive Summary
The article introduces the Spectral Generator Neural Operator (SGNO), an approach to stabilizing long horizon PDE rollouts. SGNO addresses error accumulation and high frequency growth with a residual time stepper: an exponential time differencing update whose learned generator is constrained to a nonpositive real part, plus a gated nonlinear forcing term with spectral filtering. The method achieves lower long horizon error and longer stable rollout lengths than strong neural operator baselines, with the code publicly available for further development and application.
Key Points
- ▸ Introduction of the Spectral Generator Neural Operator (SGNO) for stable long horizon PDE rollouts
- ▸ Use of exponential time differencing update in Fourier space with a learned diagonal generator
- ▸ Incorporation of a gated forcing term with channel mixing and spectral truncation for controlled nonlinear updates
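The update described in the key points can be sketched in a few lines. The following is an illustrative toy, not the authors' implementation: `sgno_step`, `lam_raw`, `forcing`, and `keep_modes` are hypothetical names, and the real model learns the generator and the gated forcing with channel mixing per mode, which is reduced here to a user-supplied callable.

```python
import numpy as np

def sgno_step(u, lam_raw, dt, forcing, keep_modes):
    """One SGNO-style step on a 1D periodic grid (hypothetical sketch).

    u          -- real-valued state sampled on the grid
    lam_raw    -- learned complex per-mode generator parameters
    forcing    -- callable producing a nonlinear correction in Fourier space
    keep_modes -- number of low-frequency modes kept on the forcing pathway
    """
    u_hat = np.fft.rfft(u)
    # Constrain the generator's real part to be nonpositive so that the
    # linear propagator exp(lam * dt) never amplifies any Fourier mode.
    lam = -np.abs(lam_raw.real) + 1j * lam_raw.imag
    prop = np.exp(lam * dt)
    # Nonlinear forcing, with spectral truncation on the forcing pathway
    # to limit high-frequency feedback across rollout steps.
    f_hat = forcing(u_hat)
    f_hat[keep_modes:] = 0.0
    # Residual ETD-style update: exact linear step plus forced correction.
    u_hat_next = prop * u_hat + dt * f_hat
    return np.fft.irfft(u_hat_next, n=u.shape[0])
```

With zero forcing and any `lam_raw`, the constraint guarantees the $L^2$ norm of the state cannot grow from one step to the next, which is the non-amplification property the abstract describes for the linear dynamics.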
Merits
Improved Stability
SGNO's design effectively mitigates the accumulation of one-step errors and controls high frequency component growth, leading to more stable long horizon rollouts.
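The stability issue being mitigated is easy to see in a toy rollout of a single Fourier mode: even a slightly expansive per-step multiplier compounds over many steps, while a nonexpansive one stays bounded. This is an illustrative example of the failure mode, not an experiment from the paper.

```python
import numpy as np

def rollout_amplitudes(mult, steps, a0=1.0):
    """Amplitude of one Fourier mode after repeatedly applying |mult|."""
    amps = [a0]
    for _ in range(steps):
        amps.append(amps[-1] * abs(mult))
    return amps

# Per-step multiplier exp(lambda * dt) with Re(lambda * dt) = -0.02 vs +0.02.
stable = rollout_amplitudes(np.exp(-0.02 + 0.3j), 200)
unstable = rollout_amplitudes(np.exp(+0.02 + 0.3j), 200)
```

After 200 steps the stable amplitude has decayed toward zero while the unstable one has grown by a factor of roughly $e^4$, which is why SGNO's sign constraint on the generator's real part matters over long horizons.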
Theoretical Bounds
The derivation of one-step amplification and finite horizon rollout error bounds provides a solid theoretical foundation for understanding SGNO's performance and behavior.
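The non-amplification claim for the linear part can be sketched with a generic ETD argument under the stated sign constraint; this is a simplified illustration, not the paper's exact bound, which additionally accounts for generator approximation error and nonlinear mismatch.

```latex
% With diagonal generator eigenvalues \lambda_k satisfying \Re\lambda_k \le 0,
% the exact linear propagator over one step \Delta t is nonexpansive:
\[
  \hat u_k^{\,n+1} = e^{\lambda_k \Delta t}\,\hat u_k^{\,n},
  \qquad
  \bigl|e^{\lambda_k \Delta t}\bigr| = e^{\Re\lambda_k\,\Delta t} \le 1,
\]
% so by Parseval the latent L^2 norm cannot grow across rollout steps:
\[
  \|u^{n+1}\|_{L^2}^2
  = \sum_k \bigl|e^{\lambda_k \Delta t}\bigr|^2 \,\bigl|\hat u_k^{\,n}\bigr|^2
  \le \|u^{n}\|_{L^2}^2 .
\]
```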
Demerits
Computational Complexity
SGNO's per-step Fourier transforms, spectral truncation, and mode-wise channel mixing may add computational overhead relative to purely local architectures, potentially impacting efficiency in latency-sensitive applications.
Expert Commentary
The introduction of SGNO represents a significant advancement in the field of neural operators for PDE surrogates, addressing a critical issue of stability in long horizon rollouts. The method's ability to control error accumulation and high frequency component growth through a combination of exponential time differencing and gated forcing terms is particularly noteworthy. Theoretical bounds provided in the article offer a rigorous framework for understanding SGNO's performance, underscoring its potential for applications requiring high accuracy and stability over extended periods.
Recommendations
- ✓ Further investigation into the application of SGNO in diverse PDE families and complex systems
- ✓ Comparison of SGNO with other state-of-the-art methods in neural operator development to assess its relative advantages and limitations