Revisiting the Sample Complexity of Sparse Spectrum Approximation of Gaussian Processes

NeurIPS

Authors

  • Quang Minh Hoang
  • Trong Nghia Hoang
  • Hai Pham
  • David P. Woodruff

Published on

11/17/2020

We introduce a new scalable approximation for Gaussian processes with provable guarantees that hold simultaneously over the entire GP parameter space. Our approximation is obtained from an improved sample complexity analysis for sparse spectrum Gaussian processes (SSGPs). In particular, our analysis shows that under a certain data disentangling condition, an SSGP's prediction and model evidence (for training) can well-approximate those of a full GP with low sample complexity. We also develop a new auto-encoding algorithm that finds a latent space in which input coordinates are disentangled into well-separated clusters, a setting amenable to our sample complexity analysis. We validate our proposed method on several benchmarks with promising results supporting our theoretical analysis.
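The sparse spectrum construction underlying SSGPs approximates a stationary kernel with a finite set of random spectral (Fourier) samples, so GP inference reduces to Bayesian linear regression in the resulting feature space. The sketch below is a minimal, generic illustration of that idea for an RBF kernel using NumPy; it is not the algorithm or auto-encoding procedure proposed in the paper, and all names (`rff_features`, the hyperparameter values) are illustrative assumptions.

```python
import numpy as np

def rff_features(X, W, b):
    """Random Fourier features: phi(x) . phi(x') approximates k(x, x')."""
    m = W.shape[0]
    return np.sqrt(2.0 / m) * np.cos(X @ W.T + b)

rng = np.random.default_rng(0)
n, d, m = 50, 1, 2000          # n data points, d input dims, m spectral samples
lengthscale, noise = 1.0, 0.1

X = rng.uniform(-3, 3, size=(n, d))
y = np.sin(X[:, 0]) + noise * rng.standard_normal(n)

# Spectral samples for an RBF kernel: frequencies drawn from N(0, I / lengthscale^2),
# phases drawn uniformly from [0, 2*pi).
W = rng.standard_normal((m, d)) / lengthscale
b = rng.uniform(0, 2 * np.pi, size=m)

Phi = rff_features(X, W, b)        # (n, m) feature matrix
K_approx = Phi @ Phi.T             # approximates the exact RBF Gram matrix

# Exact RBF Gram matrix, for comparing approximation error
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_exact = np.exp(-0.5 * sq / lengthscale**2)
err = np.abs(K_approx - K_exact).max()

# SSGP predictive mean = Bayesian linear regression on the features:
# solve (Phi^T Phi + noise^2 I) w = Phi^T y, then predict phi(x*) . w
A = Phi.T @ Phi + noise**2 * np.eye(m)
w_mean = np.linalg.solve(A, Phi.T @ y)
X_test = np.linspace(-3, 3, 5)[:, None]
mu = rff_features(X_test, W, b) @ w_mean
```

The key cost trade-off this illustrates: the linear solve is in the m-dimensional feature space rather than the n-dimensional data space, and the paper's analysis concerns how small m (the sample complexity) can be while the SSGP still tracks the full GP.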

This paper has been published as a poster at the 2020 Neural Information Processing Systems (NeurIPS) conference.

Please cite our work using the BibTeX below.

@misc{hoang2020revisiting,
      title={Revisiting the Sample Complexity of Sparse Spectrum Approximation of Gaussian Processes}, 
      author={Quang Minh Hoang and Trong Nghia Hoang and Hai Pham and David P. Woodruff},
      year={2020},
      eprint={2011.08432},
      archivePrefix={arXiv},
      primaryClass={cs.LG}
}