On the Convergence of Single-Loop Stochastic Bilevel Optimization with Approximate Implicit Differentiation

Yubo Zhou, Luo Luo, Guang Dai, Haishan Ye

arXiv:2602.23633v1 (new)

Abstract: Stochastic Bilevel Optimization has emerged as a fundamental framework for meta-learning and hyperparameter optimization. Despite the practical prevalence of single-loop algorithms, which update the lower- and upper-level variables concurrently, their theoretical understanding, particularly in the stochastic regime, remains significantly underdeveloped compared to that of their multi-loop counterparts. Existing analyses often yield suboptimal convergence rates or obscure the critical dependence on the lower-level condition number $\kappa$, frequently burying it within generic Lipschitz constants. In this paper, we bridge this gap by providing a refined convergence analysis of the Single-loop Stochastic Approximate Implicit Differentiation (SSAID) algorithm. We prove that SSAID achieves an $\epsilon$-stationary point with an oracle complexity of $\mathcal{O}(\kappa^7 \epsilon^{-2})$. Our result is noteworthy in two aspects: (i) it matches the optimal $\mathcal{O}(\epsilon^{-2})$ rate of state-of-the-art multi-loop methods (e.g., stocBiO) while maintaining the computational efficiency of a single-loop update; and (ii) it provides the first explicit, fine-grained characterization of the $\kappa$-dependence for stochastic AID-based single-loop methods. This work demonstrates that SSAID is not merely a heuristic approach, but admits a rigorous theoretical foundation with convergence guarantees competitive with mainstream multi-loop frameworks.
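To illustrate the single-loop AID template the abstract refers to, the sketch below runs it on a toy quadratic bilevel problem. This is not the paper's SSAID algorithm (whose details are not in the abstract) but a minimal, deterministic stand-in: one lower-level gradient step and one upper-level hypergradient step per iteration, with the implicit linear solve $[\nabla^2_{yy} g]^{-1} \nabla_y f$ approximated by a truncated Neumann series. The problem instance, step sizes, and function names are all illustrative choices; in the stochastic setting, the exact gradients here would be replaced by minibatch estimates.

```python
import numpy as np

def neumann_inverse_vec(H, v, eta, k):
    """Approximate H^{-1} v with a truncated Neumann series:
    H^{-1} v ~ eta * sum_{i=0}^{k-1} (I - eta*H)^i v,  valid when eta*||H|| < 1.
    This avoids forming an explicit matrix inverse, as in AID-based methods."""
    out = np.zeros_like(v)
    term = v.copy()
    for _ in range(k):
        out += term
        term = term - eta * (H @ term)
    return eta * out

def single_loop_aid_sketch(A, H, b, T=1000, alpha=0.5, beta=0.4, eta=0.4, k=50):
    """Single-loop approximate implicit differentiation on the toy problem
        upper: f(x, y) = 0.5 * ||y - b||^2
        lower: g(x, y) = 0.5 * y^T H y - y^T A x   (strongly convex in y),
    for which y*(x) = H^{-1} A x and the exact hypergradient is
        grad F(x) = A^T H^{-1} (y*(x) - b)."""
    dy, dx = A.shape
    x, y = np.zeros(dx), np.zeros(dy)
    for _ in range(T):
        # One lower-level gradient step (a stochastic estimate in SSAID proper):
        # grad_y g(x, y) = H y - A x.
        y = y - beta * (H @ y - A @ x)
        # Approximate the implicit solve [grad^2_yy g]^{-1} grad_y f.
        v = neumann_inverse_vec(H, y - b, eta, k)
        # Concurrent upper-level step along the approximate hypergradient.
        x = x - alpha * (A.T @ v)
    return x

# Illustrative instance: with A = I the upper objective reduces to
# 0.5 * ||H^{-1} x - b||^2, whose minimizer is x* = H b.
A = np.eye(2)
H = np.diag([1.0, 2.0])
b = np.array([1.0, 1.0])
x_final = single_loop_aid_sketch(A, H, b)  # -> approx [1.0, 2.0]
```

Note the single-loop structure: unlike multi-loop schemes (e.g., stocBiO), the inner variable `y` is advanced by just one step per outer iteration rather than being solved to tolerance, so `y` only tracks $y^*(x)$ approximately; the analysis in the paper is precisely about why this still attains the $\mathcal{O}(\epsilon^{-2})$ rate.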
