Scalable inference of topic evolution via models for latent geometric structures

NeurIPS

Authors

Mikhail Yurochkin, Zhiwei Fan, Aritra Guha, Paraschos Koutris, XuanLong Nguyen

Published on

12/14/2019

Categories

NeurIPS

We develop new models and algorithms for learning the temporal dynamics of topic polytopes and related geometric objects that arise in topic-model-based inference. Our model is nonparametric Bayesian, and the corresponding inference algorithm is able to discover new topics as time progresses. By exploiting the connection between the modeling of topic polytope evolution, the Beta-Bernoulli process, and the Hungarian matching algorithm, our method is shown to be several orders of magnitude faster than existing topic modeling approaches, as demonstrated by experiments processing several million documents in under two dozen minutes.
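
As a rough illustration of the matching step mentioned in the abstract, the sketch below aligns topic estimates from two consecutive time points with the Hungarian algorithm (via SciPy's linear_sum_assignment). It is not the paper's full nonparametric Bayesian model: the topic matrices, the cosine-distance cost, and the function names are illustrative assumptions, and the Beta-Bernoulli mechanism for discovering new topics is omitted.

import numpy as np
from scipy.optimize import linear_sum_assignment

def match_topics(topics_prev, topics_next):
    """Match rows of two (K x V) topic-word matrices by minimal cosine distance."""
    # Normalize rows so dot products equal cosine similarities.
    a = topics_prev / np.linalg.norm(topics_prev, axis=1, keepdims=True)
    b = topics_next / np.linalg.norm(topics_next, axis=1, keepdims=True)
    cost = 1.0 - a @ b.T  # pairwise cosine distances between topics
    row_ind, col_ind = linear_sum_assignment(cost)  # Hungarian matching
    return list(zip(row_ind, col_ind)), cost[row_ind, col_ind].sum()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy data: 5 topics over a 20-word vocabulary, permuted and perturbed at the next epoch.
    topics_t = rng.dirichlet(np.ones(20), size=5)
    topics_t1 = topics_t[rng.permutation(5)] + 0.01 * rng.random((5, 20))
    pairs, total_cost = match_topics(topics_t, topics_t1)
    print("matched pairs:", pairs, "total cost:", round(total_cost, 4))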

This work was published at NeurIPS 2019.

Please cite our work using the BibTeX below.

@inproceedings{DBLP:conf/nips/YurochkinFGKN19,
  author    = {Mikhail Yurochkin and Zhiwei Fan and Aritra Guha and Paraschos Koutris and XuanLong Nguyen},
  title     = {Scalable inference of topic evolution via models for latent geometric structures},
  booktitle = {NeurIPS},
  pages     = {5949--5959},
  year      = {2019},
  url       = {https://proceedings.neurips.cc/paper/2019/hash/31c0c178a9fc26ffecffd8670e6d746d-Abstract.html},
}