
Correlation Clustering in Constant Many Parallel Rounds

ICML

Authors

  • Vincent Cohen-Addad
  • Silvio Lattanzi
  • Slobodan Mitrović
  • Ashkan Norouzi-Fard
  • Nikos Parotsidis
  • Jakub Tarnawski

Published on

07/24/2021

Correlation clustering is a central topic in unsupervised learning, with many applications in ML and data mining. In correlation clustering, one receives as input a signed graph and the goal is to partition it to minimize the number of disagreements. In this work we propose a massively parallel computation (MPC) algorithm for this problem that is considerably faster than prior work. In particular, our algorithm uses machines with memory sublinear in the number of nodes in the graph and returns a constant approximation while running only for a constant number of rounds. To the best of our knowledge, our algorithm is the first that can provably approximate a clustering problem using only a constant number of MPC rounds in the sublinear memory regime. We complement our analysis with an experimental scalability evaluation of our techniques.
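
To make the objective concrete, below is a minimal sketch (not the paper's MPC algorithm) of how disagreements of a clustering on a signed graph are counted. The edge-list representation and the function name count_disagreements are illustrative assumptions, not artifacts of this work.

# Illustrative sketch of the correlation clustering objective.
# Graph representation and function name are hypothetical, not from the paper.

def count_disagreements(signed_edges, clustering):
    """Count disagreements of a clustering on a signed graph.

    signed_edges: iterable of (u, v, sign) with sign in {+1, -1}.
    clustering:   dict mapping each node to its cluster id.

    A '+' edge disagrees if its endpoints lie in different clusters;
    a '-' edge disagrees if its endpoints lie in the same cluster.
    """
    disagreements = 0
    for u, v, sign in signed_edges:
        same_cluster = clustering[u] == clustering[v]
        if sign > 0 and not same_cluster:
            disagreements += 1
        elif sign < 0 and same_cluster:
            disagreements += 1
    return disagreements

# Example: a triangle with two '+' edges and one '-' edge.
edges = [(0, 1, +1), (1, 2, +1), (0, 2, -1)]
print(count_disagreements(edges, {0: 0, 1: 0, 2: 0}))  # single cluster: the '-' edge disagrees -> 1
print(count_disagreements(edges, {0: 0, 1: 0, 2: 1}))  # split off node 2: the '+' edge (1,2) disagrees -> 1

The goal in correlation clustering is to choose the partition minimizing this count; the paper's contribution is computing a constant-factor approximation of that minimum in a constant number of MPC rounds with sublinear memory per machine.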

Please cite our work using the BibTeX below.

@InProceedings{pmlr-v139-cohen-addad21b,
  title     = {Correlation Clustering in Constant Many Parallel Rounds},
  author    = {Cohen-Addad, Vincent and Lattanzi, Silvio and Mitrovi{\'c}, Slobodan and Norouzi-Fard, Ashkan and Parotsidis, Nikos and Tarnawski, Jakub},
  booktitle = {Proceedings of the 38th International Conference on Machine Learning},
  pages     = {2069--2078},
  year      = {2021},
  editor    = {Meila, Marina and Zhang, Tong},
  volume    = {139},
  series    = {Proceedings of Machine Learning Research},
  month     = {18--24 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v139/cohen-addad21b/cohen-addad21b.pdf},
  url       = {https://proceedings.mlr.press/v139/cohen-addad21b.html}
}