FLAG: Fast Linearly-Coupled Adaptive Gradient Method

Title: FLAG: Fast Linearly-Coupled Adaptive Gradient Method
Publication Type: Miscellaneous
Year of Publication: 2016
Authors: Chen, X., Roosta-Khorasani, F., Bartlett, P. L., & Mahoney, M.
Abstract:

Nesterov's celebrated accelerated gradient method offers great speed-ups over the classical gradient descent method, as it attains the optimal first-order oracle complexity for smooth convex optimization. On the other hand, the popular AdaGrad algorithm competes with mirror descent under the best regularizer by adaptively scaling the gradient. Recently, it has been shown that accelerated gradient descent can be viewed as a linear combination of gradient descent and mirror descent steps. Here, we draw upon these ideas and present a fast linearly-coupled adaptive gradient method (FLAG) as an accelerated version of AdaGrad, and show that our algorithm can indeed offer the best of both worlds. Like Nesterov's accelerated algorithm and its proximal variant, FISTA, our method has a convergence rate of 1/T^2 after T iterations. Like AdaGrad, our method adaptively chooses a regularizer, in a way that performs almost as well as the best choice of regularizer in hindsight.
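The abstract describes FLAG as coupling a gradient-descent step with an AdaGrad-style adaptive mirror-descent step through a shared query point. Below is a minimal, illustrative sketch of that coupling in Python. It is not the paper's actual FLAG algorithm: the function name flag_sketch, the coupling weight tau = 2/(k+1), and the mirror step size k/(2L) are assumptions taken from the standard linear-coupling view of acceleration; see the linked arXiv paper for the precise method.

```python
import numpy as np

def flag_sketch(grad, x0, L, T, eps=1e-8):
    """Illustrative linearly-coupled adaptive gradient loop.

    Couples a gradient-descent step with an AdaGrad-style
    (diagonally preconditioned) mirror-descent step through a
    shared query point y. A sketch of the idea in the abstract,
    not the paper's exact FLAG algorithm; the step sizes below
    are the standard linear-coupling choices and are assumptions.
    """
    x = np.asarray(x0, dtype=float).copy()
    z = x.copy()
    g_sq = np.zeros_like(x)               # accumulated squared gradients
    for k in range(1, T + 1):
        tau = 2.0 / (k + 1)               # coupling weight (assumed)
        y = tau * z + (1.0 - tau) * x     # query point: linear coupling
        g = grad(y)
        x = y - g / L                     # gradient step, step size 1/L
        g_sq += g * g
        H = np.sqrt(g_sq) + eps           # AdaGrad-style diagonal regularizer
        z = z - (k / (2.0 * L)) * g / H   # adaptive mirror step (assumed size)
    return x

# Hypothetical usage: minimize f(x) = 0.5 * ||A x - b||^2
rng = np.random.default_rng(0)
A = rng.normal(size=(20, 5))
b = rng.normal(size=20)
L_f = np.linalg.norm(A, 2) ** 2           # smoothness constant of f
x_hat = flag_sketch(lambda x: A.T @ (A @ x - b), np.zeros(5), L_f, T=500)
```

As the coupling weight tau shrinks, the query point leans increasingly on the gradient-step iterate x, while the mirror iterate z carries the AdaGrad-style adaptive scaling; the paper's actual step-size and regularizer choices are more careful than this sketch.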

URL: https://arxiv.org/pdf/1605.08108.pdf
ICSI Research Group: Big Data