The theory and practice of stochastic optimization have in recent years focused on stochastic gradient descent (SGD), retaining its basic first-order stochastic nature while aiming to improve it via mechanisms such as averaging, momentum, and variance reduction. Improvement can be measured along various dimensions, however, and it has proved difficult to achieve improvements both in terms of nonasymptotic measures of convergence rate and asymptotic measures of distributional tightness. In this work, we consider first-order stochastic optimization from a general statistical point of view, motivating a specific form of recursive averaging of past stochastic gradients. The resulting algorithm, which we refer to as Recursive One-Over-T SGD (ROOT-SGD), matches the state-of-the-art convergence rate among online variance-reduced stochastic approximation methods. Moreover, under slightly stronger distributional assumptions, the rescaled last iterate of ROOT-SGD converges to a zero-mean Gaussian distribution that achieves near-optimal covariance. This is joint work with Wenlong Mou, Martin Wainwright, and Michael Jordan.
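Below is a minimal Python sketch of the recursive gradient-averaging idea the abstract alludes to. The 1 - 1/t weighting, the reuse of the current sample at the previous iterate, and all names (root_sgd_sketch, stoch_grad, draw_sample) are illustrative assumptions for a generic smooth objective, not the exact update from the paper.

import numpy as np

def root_sgd_sketch(stoch_grad, draw_sample, x0, num_steps, eta=0.1):
    # stoch_grad(x, sample): stochastic gradient at x for one data sample.
    # draw_sample(): draws one fresh sample per iteration.
    # The recursion below is an assumed stand-in for the paper's update rule.
    x_prev, x = x0.copy(), x0.copy()
    v = None
    for t in range(1, num_steps + 1):
        sample = draw_sample()
        g_curr = stoch_grad(x, sample)              # gradient at the current iterate
        if v is None:
            v = g_curr                              # first step: plain stochastic gradient
        else:
            g_prev = stoch_grad(x_prev, sample)     # same sample, evaluated at the previous iterate
            v = g_curr + (1.0 - 1.0 / t) * (v - g_prev)  # recursive 1/t-weighted correction
        x_prev, x = x, x - eta * v                  # ordinary first-order step
    return x

# Usage on a toy least-squares problem (illustrative only):
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
stoch_grad = lambda x, i: A[i] * (A[i] @ x - b[i])
draw_sample = lambda: rng.integers(len(b))
x_hat = root_sgd_sketch(stoch_grad, draw_sample, np.zeros(5), num_steps=2000, eta=0.05)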
28 August 2020
11am - 12pm
Where
https://hkust.zoom.us/j/5616960008
Organizer(s)
Department of Mathematics
Contact/Enquiries
mathseminar@ust.hk
Audience
Alumni, Faculty and Staff, PG Students, UG Students
Language
English