Research

Do Neural Optimal Transport Solvers Work? A Continuous Wasserstein-2 Benchmark

NeurIPS

Authors

  • Alexander Korotin
  • Lingxiao Li
  • Aude Genevay
  • Justin Solomon
  • Alexander Filippov
  • Evgeny Burnaev

Published on

12/14/2021

Categories

NeurIPS

Despite the recent popularity of neural network-based solvers for optimal transport (OT), there is no standard quantitative way to evaluate their performance. In this paper, we address this issue for quadratic-cost transport, specifically the computation of the Wasserstein-2 distance, a commonly used formulation of optimal transport in machine learning. To overcome the challenge of computing ground truth transport maps between continuous measures needed to assess these solvers, we use input-convex neural networks (ICNN) to construct pairs of measures whose ground truth OT maps can be obtained analytically. This strategy yields pairs of continuous benchmark measures in high-dimensional spaces such as spaces of images. We thoroughly evaluate existing optimal transport solvers using these benchmark measures. Even though these solvers perform well in downstream tasks, many do not faithfully recover optimal transport maps. To investigate the cause of this discrepancy, we further test the solvers in the setting of image generation. Our study reveals crucial limitations of existing solvers and shows that increased OT accuracy does not necessarily correlate with better downstream results.
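The core construction is straightforward to prototype: by Brenier's theorem, the gradient of a convex potential is the Wasserstein-2 optimal map from a source measure to its push-forward, so pushing samples through the gradient of an ICNN yields a benchmark pair whose ground-truth OT map is known by construction. The sketch below illustrates this idea in PyTorch; the toy architecture and the names ToyICNN and ot_map are illustrative assumptions and not the benchmark code released with the paper.

# Minimal sketch (not the authors' benchmark code): build a toy ICNN psi,
# then push a Gaussian source through grad(psi) to obtain a target measure
# whose ground-truth Wasserstein-2 OT map is exactly grad(psi).
import torch
import torch.nn as nn

class ToyICNN(nn.Module):
    """Input-convex potential psi(x): convex, non-decreasing activations and
    non-negative hidden-to-hidden weights keep x -> psi(x) convex."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.Wx0 = nn.Linear(dim, hidden)                  # unconstrained input layer
        self.Wx1 = nn.Linear(dim, hidden)                  # unconstrained skip from input
        self.Wz1 = nn.Linear(hidden, hidden, bias=False)   # weights clamped >= 0 in forward
        self.Wx2 = nn.Linear(dim, 1)
        self.Wz2 = nn.Linear(hidden, 1, bias=False)        # weights clamped >= 0 in forward
        self.act = nn.Softplus()                           # convex and non-decreasing

    def forward(self, x):
        z = self.act(self.Wx0(x))
        # clamp hidden-to-hidden weights so the composition stays convex in x
        z = self.act(self.Wx1(x) + z @ self.Wz1.weight.clamp(min=0).T)
        return self.Wx2(x) + z @ self.Wz2.weight.clamp(min=0).T

def ot_map(psi, x):
    """Ground-truth OT map T(x) = grad psi(x), computed by autograd."""
    x = x.clone().requires_grad_(True)
    (grad,) = torch.autograd.grad(psi(x).sum(), x)
    return grad

dim = 16
psi = ToyICNN(dim)
source = torch.randn(1024, dim)   # source measure: standard Gaussian samples
target = ot_map(psi, source)      # target measure: push-forward under grad psi

A neural OT solver can then be scored by comparing the map it recovers against ot_map(psi, x) on held-out source samples, for example via a relative squared error between the two maps.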

Please cite our work using the BibTeX below.

@inproceedings{korotin2021do,
  title     = {Do Neural Optimal Transport Solvers Work? A Continuous Wasserstein-2 Benchmark},
  author    = {Alexander Korotin and Lingxiao Li and Aude Genevay and Justin Solomon and Alexander Filippov and Evgeny Burnaev},
  booktitle = {Advances in Neural Information Processing Systems},
  editor    = {A. Beygelzimer and Y. Dauphin and P. Liang and J. Wortman Vaughan},
  year      = {2021},
  url       = {https://openreview.net/forum?id=CI0T_3l-n1}
}