Faster Linear Algebra for Distance Matrices

NeurIPS

Authors

Piotr Indyk, Sandeep Silwal

Published on

12/04/2022

The distance matrix of a dataset X of n points with respect to a distance function f represents all pairwise distances between points in X induced by f. Due to their wide applicability, distance matrices and related families of matrices have been the focus of many recent algorithmic works. We continue this line of research and take a broad view of algorithm design for distance matrices with the goal of designing fast algorithms, which are specifically tailored for distance matrices, for fundamental linear algebraic primitives. Our results include efficient algorithms for computing matrix-vector products for a wide class of distance matrices, such as the ℓ₁ metric, for which we get a linear runtime, as well as an Ω(n²) lower bound for any algorithm which computes a matrix-vector product for the ℓ∞ case, showing a separation between the ℓ₁ and ℓ∞ metrics. Our upper bound results, in conjunction with recent works on the matrix-vector query model, have many further downstream applications, including the fastest algorithm for computing a relative error low-rank approximation for the distance matrix induced by the ℓ₁ and ℓ₂² functions and the fastest algorithm for computing an additive error low-rank approximation for the ℓ₂ metric, in addition to applications for fast matrix multiplication, among others. We also give algorithms for constructing distance matrices and show that one can construct an approximate ℓ₂ distance matrix in time faster than the bound implied by the Johnson-Lindenstrauss lemma.
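To make the ℓ₁ matrix-vector result concrete: the product decomposes coordinate by coordinate, and each one-dimensional subproblem can be solved with sorting and prefix sums, giving O(nd log n) time instead of the O(n²d) needed to materialize the matrix. The Python/NumPy sketch below (the function name l1_distance_matvec is ours) illustrates this standard decomposition rather than reproducing the paper's algorithm verbatim.

```python
import numpy as np

def l1_distance_matvec(X, z):
    """Compute y = A z for A[i, j] = ||X[i] - X[j]||_1 without forming A.

    For each coordinate k, sum_j z_j * |x_ik - x_jk| is computed for all
    i at once via sorting and prefix sums: O(n d log n) total time.
    """
    n, d = X.shape
    y = np.zeros(n)
    for k in range(d):
        a = X[:, k]
        order = np.argsort(a)
        a_s, z_s = a[order], z[order]
        pz = np.cumsum(z_s)           # pz[i]  = sum of z_j over a_j <= a_i
        pza = np.cumsum(z_s * a_s)    # pza[i] = sum of z_j * a_j over a_j <= a_i
        tz, tza = pz[-1], pza[-1]
        # points below a_i contribute a_i - a_j; points above contribute a_j - a_i
        contrib = (a_s * pz - pza) + ((tza - pza) - a_s * (tz - pz))
        y[order] += contrib           # scatter back to the original ordering
    return y

# Sanity check against the naive O(n^2 d) product.
rng = np.random.default_rng(0)
X, z = rng.normal(size=(300, 5)), rng.normal(size=300)
A = np.abs(X[:, None, :] - X[None, :, :]).sum(axis=2)
assert np.allclose(A @ z, l1_distance_matvec(X, z))
```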
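The downstream low-rank claims come from plugging fast matvecs into matrix-vector query algorithms. As a rough sketch of that pipeline (a generic randomized range finder in the style of Halko, Martinsson, and Tropp, not the paper's exact query-optimal method), the following builds a low-rank approximation of a symmetric matrix from O(rank) calls to a matvec oracle such as l1_distance_matvec above:

```python
import numpy as np

def lowrank_from_matvec(matvec, n, rank, oversample=10, seed=0):
    """Rank-`rank` approximation A ~ U diag(w) U^T of a symmetric matrix,
    accessed only through v -> A v products (no entry of A is ever read)."""
    rng = np.random.default_rng(seed)
    k = rank + oversample
    # Apply A to random test vectors to capture its dominant range.
    Y = np.column_stack([matvec(rng.normal(size=n)) for _ in range(k)])
    Q, _ = np.linalg.qr(Y)                    # orthonormal basis, n x k
    AQ = np.column_stack([matvec(Q[:, j]) for j in range(k)])
    B = Q.T @ AQ                              # small k x k projection of A
    w, V = np.linalg.eigh(B)
    top = np.argsort(np.abs(w))[::-1][:rank]  # keep the largest eigenpairs
    return Q @ V[:, top], w[top]
```

Combined with a linear-time matvec, each query costs O(nd), so the whole approximation runs in roughly O(rank · n · d) time, far below the n² entries of the matrix itself.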
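For the final result, the point of comparison is the Johnson-Lindenstrauss baseline: project to k = O(log n / ε²) dimensions, then compute all pairwise distances in the sketch. Below is a minimal version of that baseline; the constant 8 in the choice of k and the Gaussian projection are illustrative assumptions, and the paper's algorithm is faster than this.

```python
import numpy as np

def jl_l2_distance_matrix(X, eps=0.25, seed=0):
    """(1 +/- eps)-approximate Euclidean distance matrix via a random
    Gaussian projection (Johnson-Lindenstrauss), computed in the sketch."""
    n, d = X.shape
    k = max(1, int(np.ceil(8 * np.log(n) / eps ** 2)))  # sketch dimension
    rng = np.random.default_rng(seed)
    Y = X @ (rng.normal(size=(d, k)) / np.sqrt(k))      # n x k sketch
    sq = (Y ** 2).sum(axis=1)
    D2 = np.maximum(sq[:, None] + sq[None, :] - 2.0 * (Y @ Y.T), 0.0)
    return np.sqrt(D2)
```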

Please cite our work using the BibTeX below.

@inproceedings{indyk2022faster,
  title     = {Faster Linear Algebra for Distance Matrices},
  author    = {Piotr Indyk and Sandeep Silwal},
  booktitle = {Advances in Neural Information Processing Systems},
  editor    = {Alice H. Oh and Alekh Agarwal and Danielle Belgrave and Kyunghyun Cho},
  year      = {2022},
  url       = {https://openreview.net/forum?id=y--ZUTfbNB}
}