Weighted SGD for ℓp Regression with Randomized Preconditioning

Title: Weighted SGD for ℓp Regression with Randomized Preconditioning
Publication Type: Conference Paper
Year of Publication: 2016
Authors: Yang, J., Chow, Y.-L., Ré, C., & Mahoney, M. W.
Published in: Proceedings of the 27th Annual ACM-SIAM Symposium on Discrete Algorithms (SODA)
Page(s): 558-569
Abstract

In recent years, stochastic gradient descent (SGD) methods and randomized linear algebra (RLA) algorithms have been applied to many large-scale problems in machine learning and data analysis. We aim to bridge the gap between these two approaches for solving constrained overdetermined linear regression problems, e.g., ℓ2 and ℓ1 regression. We propose a hybrid algorithm named pwSGD that uses RLA techniques for preconditioning and for constructing an importance sampling distribution, and then performs an SGD-like iterative process with weighted sampling on the preconditioned system. We prove that pwSGD inherits faster convergence rates that depend only on the lower dimension of the linear system, while maintaining low computational complexity. In particular, when solving an ℓ1 regression problem of size n × d, pwSGD returns an approximate solution with ϵ relative error in the objective value in O(log n · nnz(A) + poly(d)/ϵ²) time. This complexity is uniformly better than that of RLA methods in terms of both ϵ and d when the problem is unconstrained. For ℓ2 regression, pwSGD returns an approximate solution with ϵ relative error in the objective value and in the solution vector measured in the prediction norm in O(log n · nnz(A) + poly(d) log(1/ϵ)/ϵ) time. We also provide lower bounds on the coreset complexity for more general regression problems, indicating that new ideas will still be needed to extend similar RLA preconditioning ideas to weighted SGD algorithms for more general regression problems. Finally, the effectiveness of such algorithms is illustrated numerically on both synthetic and real datasets.
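
To make the pipeline concrete, below is a minimal, hypothetical NumPy sketch of the pwSGD idea for unconstrained ℓ2 regression: build a randomized preconditioner from a sketch of A, derive an importance sampling distribution from the row norms of the preconditioned basis, and run weighted SGD on the preconditioned system. All names, defaults, and the step-size schedule are illustrative rather than the paper's exact algorithm; in particular, a dense Gaussian sketch is used for brevity, whereas the nnz(A)-type running times quoted above require sparse sketching transforms.

import numpy as np

def pwsgd_l2(A, b, n_iter=5000, sketch_size=None, step=1.0, seed=0):
    # Hypothetical sketch of the pwSGD idea for unconstrained l2 regression.
    rng = np.random.default_rng(seed)
    n, d = A.shape
    s = sketch_size or 4 * d

    # (1) Randomized preconditioning: obtain R from a QR factorization of S @ A.
    S = rng.standard_normal((s, n)) / np.sqrt(s)
    _, R = np.linalg.qr(S @ A)
    U = A @ np.linalg.inv(R)              # approximately well-conditioned basis

    # (2) Importance sampling distribution from the row norms of U
    #     (leverage-score-like probabilities for the preconditioned system).
    p = np.sum(U**2, axis=1)
    p /= p.sum()

    # (3) Weighted SGD on min_y ||U y - b||_2 with 1/(n p_i) weights,
    #     which keeps each stochastic gradient unbiased.
    y = np.zeros(d)
    for t in range(1, n_iter + 1):
        i = rng.choice(n, p=p)
        g = (U[i] @ y - b[i]) * U[i] / (n * p[i])
        y -= (step / np.sqrt(t)) * g      # illustrative step-size schedule
    return np.linalg.solve(R, y)          # recover x = R^{-1} y

For a well-conditioned test case with n ≫ d, the output of this sketch should track the least-squares solution np.linalg.lstsq(A, b, rcond=None)[0] up to the usual SGD noise.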

URL: http://www.stat.berkeley.edu/~mmahoney/pubs/sgd-rla-soda16.pdf
ICSI Research Group: Big Data