Proximal Stochastic Recursive Momentum Methods for Nonconvex Composite Decentralized Optimization

Authors

  • Gabriel Mancino-Ball
  • Shengnan Miao
  • Yangyang Xu
  • Jie Chen

Published on

02/14/2023

Categories

AAAI

Consider a network of N decentralized computing agents collaboratively solving a nonconvex stochastic composite problem. In this work, we propose a single-loop algorithm, called DEEPSTORM, that achieves optimal sample complexity for this setting. Unlike double-loop algorithms that periodically require a large batch size to compute the (stochastic) gradient, DEEPSTORM uses a small batch size, which is advantageous in settings such as streaming data and online learning. This is the first method to achieve optimal sample complexity for decentralized nonconvex stochastic composite problems while requiring only an O(1) batch size. We conduct convergence analysis for DEEPSTORM with both constant and diminishing step sizes. Additionally, under proper initialization and a small enough desired solution error, we show that DEEPSTORM with a constant step size achieves a network-independent sample complexity, with an additional linear speed-up with respect to N over centralized methods.
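
For intuition, below is a minimal Python sketch of the kind of single-loop update the abstract describes: each agent maintains a STORM-style recursive momentum gradient estimator built from O(1) mini-batches, mixes iterates with its neighbors via a gossip matrix, and applies a local proximal step for the nonsmooth composite term. This is a simplified illustration under stated assumptions, not the paper's exact method (it omits details such as gradient tracking); the names draw_batch and grad, the step sizes, and the l1 regularizer are all hypothetical.

import numpy as np

def prox_l1(v, lam):
    # Soft-thresholding: proximal operator of lam * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def deepstorm_sketch(x0, draw_batch, grad, W, eta=0.01, beta=0.1, lam=1e-3, T=1000):
    # x0: (N, d) array, one row per agent's local copy of the variables.
    # draw_batch(i): returns an O(1) mini-batch for agent i (hypothetical).
    # grad(i, x, batch): stochastic gradient of agent i's smooth loss at x
    #                    evaluated on that batch (hypothetical).
    # W: (N, N) doubly stochastic mixing matrix of the network.
    N, _ = x0.shape
    x, x_prev = x0.copy(), x0.copy()
    # Initialize each agent's gradient estimator from a single small batch.
    v = np.stack([grad(i, x[i], draw_batch(i)) for i in range(N)])
    for _ in range(T):
        v_new = np.empty_like(v)
        for i in range(N):
            batch = draw_batch(i)
            # Recursive momentum: the same O(1) batch is evaluated at the
            # current and previous iterates to correct the running estimate,
            # so no periodic large-batch gradient computation is needed.
            v_new[i] = grad(i, x[i], batch) \
                + (1.0 - beta) * (v[i] - grad(i, x_prev[i], batch))
        v = v_new
        x_prev = x.copy()
        # One round of gossip averaging, then a local proximal step handling
        # the nonsmooth composite term (an l1 regularizer in this sketch).
        x = prox_l1(W @ x - eta * v, eta * lam)
    return x

In this sketch the momentum parameter beta trades bias against variance in the gradient estimator: beta = 1 reduces it to plain stochastic gradients, while smaller beta reuses more of the running estimate, which is what allows small batches throughout.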

This work was published in AAAI 2023.

Please cite our work using the BibTeX below.

@inproceedings{mancinoball2023deepstorm,
    title={Proximal Stochastic Recursive Momentum Methods for Nonconvex Composite Decentralized Optimization},
    author={Gabriel Mancino-Ball and Shengnan Miao and Yangyang Xu and Jie Chen},
    booktitle={The Thirty-Seventh {AAAI} Conference on Artificial Intelligence, {AAAI} 2023},
    publisher={{AAAI} Press},
    year={2023}
}