Variational Russian Roulette for Deep Bayesian Nonparametrics

Deep Learning

Authors

Kai Xu, Akash Srivastava, Charles A. Sutton

Published on

01/01/2019

Bayesian nonparametric models provide a principled way to automatically adapt the complexity of a model to the amount of data available, but computation in such models is difficult. Amortized variational approximations are appealing because of their computational efficiency, but current methods rely on a fixed finite truncation of the infinite model. This truncation level can be difficult to set, and it interacts poorly with amortized methods due to the over-pruning problem. Instead, we propose a new variational approximation, based on a method from statistical physics called Russian roulette sampling. This allows the variational distribution to adapt its complexity during inference, without relying on a fixed truncation level, while still obtaining an unbiased estimate of the gradient of the original variational objective. We demonstrate this method on infinite-sized variational autoencoders using a Beta-Bernoulli (Indian buffet process) prior.
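For intuition, the core trick can be sketched independently of the variational details: Russian roulette sampling turns an infinite sum into a finite-work, unbiased estimate by stopping at a random depth and reweighting each term by its survival probability. Below is a minimal sketch of that generic estimator, not the paper's implementation; the function `russian_roulette_estimate`, the `term` argument, and the geometric continuation probability `q` are illustrative assumptions.

```python
import numpy as np

def russian_roulette_estimate(term, q=0.5, rng=None):
    """Unbiasedly estimate S = sum_{k=1}^inf term(k).

    After evaluating each term, continue to the next with probability q,
    so term k is reached with probability q**(k - 1); dividing each term
    by that survival probability makes the randomly truncated sum
    unbiased in expectation (assuming the reweighted series is
    well-behaved enough for finite variance).
    """
    rng = rng or np.random.default_rng()
    total, k, survival = 0.0, 1, 1.0
    while True:
        total += term(k) / survival  # reweight by P(reached term k)
        if rng.random() > q:         # stop with probability 1 - q
            return total
        survival *= q
        k += 1

# Sanity check on a geometric series: sum_{k>=1} 0.5**k = 1.
rng = np.random.default_rng(0)
estimates = [russian_roulette_estimate(lambda k: 0.5**k, q=0.6, rng=rng)
             for _ in range(100_000)]
print(np.mean(estimates))  # should be close to 1.0
```

In the paper's setting, the summands are gradient contributions of an unboundedly deep variational posterior rather than scalar series terms, but the same reweighting argument is what yields an unbiased gradient of the variational objective without a fixed truncation.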

Please cite our work using the BibTeX below.

@inproceedings{DBLP:conf/icml/XuSS19,
  author    = {Kai Xu and Akash Srivastava and Charles A. Sutton},
  title     = {Variational Russian Roulette for Deep Bayesian Nonparametrics},
  booktitle = {ICML},
  pages     = {6963--6972},
  year      = {2019},
  url       = {http://proceedings.mlr.press/v97/xu19e.html}
}