Research

Beyond Worst-Case Analysis in Stochastic Approximation: Moment Estimation Improves Instance Complexity

ICML

Authors

Jingzhao Zhang, Hongzhou Lin, Subhro Das, Suvrit Sra, Ali Jadbabaie

Published on

07/23/2022

Categories

ICML

We study the oracle complexity of gradient-based methods for stochastic approximation problems. Although optimal algorithms and tight lower bounds are known for such problems in many settings, these optimal algorithms do not achieve the best performance in practice. We address this theory-practice gap by focusing on instance-dependent complexity instead of worst-case complexity. In particular, we first summarize known instance-dependent complexity results and categorize them into three levels. We identify the domination relation between the different levels and propose a fourth instance-dependent bound that dominates the existing ones. We then provide a sufficient condition under which an adaptive algorithm with moment estimation can achieve the proposed bound without knowledge of the noise levels. Our proposed algorithm and its analysis provide a theoretical justification for the success of moment estimation, as it achieves improved instance complexity.
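To illustrate the idea of moment estimation in this setting, the sketch below shows a hypothetical noise-adaptive SGD variant (not the algorithm analyzed in the paper): the step size is scaled by a running estimate of the gradient's second moment, so the method adapts to the instance's noise level without being told it in advance. All names and parameters here are illustrative assumptions.

import numpy as np

def adaptive_sgd(grad_oracle, x0, base_lr=0.1, beta=0.9, eps=1e-8, steps=1000):
    """Hypothetical sketch: SGD with an online second-moment (noise) estimate.

    grad_oracle(x) returns a stochastic gradient at x.
    """
    x = np.asarray(x0, dtype=float)
    v = np.zeros_like(x)  # running estimate of the gradient's second moment
    for t in range(1, steps + 1):
        g = grad_oracle(x)
        v = beta * v + (1.0 - beta) * g**2           # moment estimation
        lr = base_lr / (np.sqrt(v.mean()) + eps)     # noise-adaptive step size
        x = x - lr * g
    return x

# Usage: noisy quadratic f(x) = ||x||^2 / 2 with Gaussian gradient noise.
rng = np.random.default_rng(0)
oracle = lambda x: x + 0.1 * rng.standard_normal(x.shape)
x_approx = adaptive_sgd(oracle, x0=np.ones(5))

The point of the sketch is only that the algorithm never receives the noise level as an input; it infers it from observed gradients, which is the mechanism the paper's analysis justifies.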

Please cite our work using the BibTeX below.

@InProceedings{pmlr-v162-zhang22r,
  title = 	 {Beyond Worst-Case Analysis in Stochastic Approximation: Moment Estimation Improves Instance Complexity},
  author =       {Zhang, Jingzhao and Lin, Hongzhou and Das, Subhro and Sra, Suvrit and Jadbabaie, Ali},
  booktitle = 	 {Proceedings of the 39th International Conference on Machine Learning},
  pages = 	 {26347--26361},
  year = 	 {2022},
  editor = 	 {Chaudhuri, Kamalika and Jegelka, Stefanie and Song, Le and Szepesvari, Csaba and Niu, Gang and Sabato, Sivan},
  volume = 	 {162},
  series = 	 {Proceedings of Machine Learning Research},
  month = 	 {17--23 Jul},
  publisher =    {PMLR},
  pdf = 	 {https://proceedings.mlr.press/v162/zhang22r/zhang22r.pdf},
  url = 	 {https://proceedings.mlr.press/v162/zhang22r.html},
  abstract = 	 {We study oracle complexity of gradient based methods for stochastic approximation problems. Though in many settings optimal algorithms and tight lower bounds are known for such problems, these optimal algorithms do not achieve the best performance when used in practice. We address this theory-practice gap by focusing on instance-dependent complexity instead of worst case complexity. In particular, we first summarize known instance-dependent complexity results and categorize them into three levels. We identify the domination relation between different levels and propose a fourth instance-dependent bound that dominates existing ones. We then provide a sufficient condition according to which an adaptive algorithm with moment estimation can achieve the proposed bound without knowledge of noise levels. Our proposed algorithm and its analysis provide a theoretical justification for the success of moment estimation as it achieves improved instance complexity.}
}