Structured Block Basis Factorization for Scalable Kernel Matrix Evaluation

Title: Structured Block Basis Factorization for Scalable Kernel Matrix Evaluation
Publication Type: Miscellaneous
Year of Publication: 2015
Authors: Wang, R., Li, Y., Mahoney, M. W., & Darve, E.
Abstract

Kernel matrices are popular in machine learning and scientific computing, but they are limited by their quadratic complexity in both construction and storage. It is well known that as one varies the kernel parameter, e.g., the width parameter in radial basis function kernels, the kernel matrix changes from a smooth low-rank matrix to a diagonally dominant and eventually fully diagonal matrix. Low-rank approximation methods have been widely studied, mostly in the first case, to reduce the memory storage and the cost of computing matrix-vector products. Here, we use ideas from scientific computing to propose an extension of these methods to situations where the matrix is not well approximated by a low-rank matrix. In particular, we construct an efficient block low-rank approximation method---which we call the Block Basis Factorization---and we show that it has O(n) complexity in both time and memory. Our method works for a wide range of kernel parameters, extending the domain of applicability of low-rank approximation methods, and our empirical results demonstrate its stability (small standard deviation in error) and its superiority over current state-of-the-art kernel approximation algorithms.

URL: https://arxiv.org/pdf/1505.00398.pdf
ICSI Research Group: Big Data