Title: Complete Asymptotic Analysis for Projected Stochastic Approximation and Debiased Variants

Conference: Allerton 2023

Tags: asymptotic behaviors, diffusion approximation, linearly constrained optimization, phase transitions and projection complexity

Abstract: In this study, we investigate federated optimization through the lens of linearly constrained optimization problems and analyze the loopless projection stochastic approximation (LPSA) method. LPSA performs a probabilistic projection with probability $p_n$ at the $n$-th iteration to ensure solution feasibility at random. We set $p_n \propto \eta_n^\beta$, where $\eta_n$ is the step size and $\beta \in [0, 1]$ is a tuning parameter. Previous research demonstrated, via diffusion approximation in the asymptotic regime, that LPSA exhibits an intriguing bias-variance trade-off: for $\beta < 0.5$ the bias becomes degenerate, while for $\beta > 0.5$ the bias dominates the variance. In this work, we investigate the critical case $\beta = 0.5$ and discover that the last iterate, after appropriate scaling, converges weakly to a biased Gaussian distribution. We specify the exact form of the incurred bias, which yields a comprehensive asymptotic analysis of LPSA and a complete characterization of its phase transitions. Since this non-zero bias leads to slow convergence, we propose a debiased variant, Debiased LPSA (DLPSA), which effectively reduces projection complexity compared to vanilla LPSA.
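The abstract's description of LPSA can be sketched in a few lines: run a stochastic-gradient step each iteration, and with probability $p_n \propto \eta_n^\beta$ project the iterate back onto the linear constraint set. The toy quadratic objective, step-size schedule, and constraint below are illustrative assumptions for a minimal self-contained demo, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem (assumed for illustration): minimize f(x) = 0.5 * ||x - c||^2
# subject to the linear constraint A x = b.
c = np.array([1.0, -2.0, 0.5])
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([0.0])

def project(x):
    # Euclidean projection onto the affine set {x : A x = b}.
    return x - A.T @ np.linalg.solve(A @ A.T, A @ x - b)

def lpsa(n_iters=20000, beta=0.5):
    """Loopless projection stochastic approximation with p_n = eta_n^beta."""
    x = project(np.zeros(3))  # feasible starting point
    for n in range(1, n_iters + 1):
        eta = 1.0 / n  # step size eta_n (illustrative schedule)
        grad = (x - c) + 0.1 * rng.standard_normal(3)  # stochastic gradient
        x = x - eta * grad
        # Loopless projection: project only with probability p_n ~ eta_n^beta.
        if rng.random() < min(1.0, eta ** beta):
            x = project(x)
    return project(x)  # final projection so the output is feasible

x_star = project(c)   # constrained minimizer of the toy problem
x_hat = lpsa()        # LPSA estimate; close to x_star on this toy instance
```

For $\beta = 0.5$, the abstract's regime of interest, the scaled last iterate carries a non-vanishing bias alongside the Gaussian fluctuation; DLPSA is designed to remove that bias and thereby cut the number of projections required.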