Bayesian Nonparametric Federated Learning of Neural Networks
Authors
- Mikhail Yurochkin
- Mayank Agarwal
- Soumya Ghosh
- Kristjan Greenewald
- Trong Nghia Hoang
- Yasaman Khazaeni
Published on
05/28/2019
In federated learning problems, data is scattered across different servers, and exchanging or pooling it is often impractical or prohibited. We develop a Bayesian nonparametric framework for federated learning with neural networks. Each data server is assumed to provide local neural network weights, which are modeled through our framework. We then develop an inference approach that allows us to synthesize a more expressive global network without additional supervision or data pooling, and with as few as a single communication round. We demonstrate the efficacy of our approach on federated learning problems simulated from two popular image classification datasets.
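The paper's inference procedure matches neurons across clients under a Bayesian nonparametric (Beta-Bernoulli process) model. As a loose, simplified sketch of the underlying match-then-average idea only (not the authors' actual algorithm), one might greedily pair each local neuron with its closest global neuron before averaging; all function names here are hypothetical:

```python
import numpy as np

def greedy_match(global_w, local_w):
    """Greedily pair each local neuron (row of local_w) with its
    closest not-yet-used global neuron. Illustrative stand-in for
    the paper's probabilistic matching."""
    # squared Euclidean distance between every (global, local) neuron pair
    cost = ((global_w[:, None, :] - local_w[None, :, :]) ** 2).sum(axis=-1)
    used, mapping = set(), {}
    # assign local neurons in order of how well their best match fits
    for j in np.argsort(cost.min(axis=0)):
        for i in np.argsort(cost[:, j]):
            if i not in used:
                used.add(i)
                mapping[j] = i
                break
    return mapping

def federate(local_weights):
    """Single-round aggregation: initialize the global layer from the
    first client, then fold in each remaining client by matching and
    running-averaging the paired neurons."""
    global_w = local_weights[0].astype(float).copy()
    counts = np.ones(global_w.shape[0])  # how many clients back each global neuron
    for w in local_weights[1:]:
        mapping = greedy_match(global_w, w)
        for j, i in mapping.items():
            global_w[i] = (global_w[i] * counts[i] + w[j]) / (counts[i] + 1)
            counts[i] += 1
    return global_w
```

For example, two clients whose hidden layers are identical up to a permutation of neurons would be merged back into a single layer of that size, whereas naive coordinate-wise averaging would blur the permuted neurons together.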
Please cite our work using the BibTeX below.
@misc{yurochkin2019bayesian,
title={Bayesian Nonparametric Federated Learning of Neural Networks},
author={Mikhail Yurochkin and Mayank Agarwal and Soumya Ghosh and Kristjan Greenewald and Trong Nghia Hoang and Yasaman Khazaeni},
year={2019},
eprint={1905.12022},
archivePrefix={arXiv},
primaryClass={stat.ML}
}