
Learning Task-Agnostic Embedding of Multiple Black-Box Experts for Multi-Task Model Fusion

ICML 2020

Authors

Nghia Hoang, Thanh Lam, Bryan Kian Hsiang Low, Patrick Jaillet

Published on

07/18/2020

Model fusion is an emerging area of study in collective learning, where heterogeneous experts with private data and learning architectures need to combine their black-box knowledge for better performance. Existing literature achieves this via a local knowledge distillation scheme that transfuses the predictive patterns of each pre-trained expert onto a white-box imitator model, which can then be incorporated efficiently into a global model. This scheme, however, does not extend to multi-task scenarios, where different experts were trained to solve different tasks and only part of their distilled knowledge is relevant to a new task. To address this multi-task challenge, we develop a new fusion paradigm that represents each expert as a distribution over a spectrum of predictive prototypes, which are isolated from the task-specific information encoded within the prototype distribution. The task-agnostic prototypes can then be reintegrated to generate a new model that solves a new task, encoded with a different prototype distribution. The fusion and adaptation performance of the proposed framework is demonstrated empirically on several real-world benchmark datasets.
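To make the local distillation scheme described above concrete, here is a minimal sketch of distilling one black-box expert onto a white-box imitator. Everything in it is an illustrative assumption rather than the paper's implementation: query_expert stands in for an opaque prediction API, and the data, network sizes, and training loop are hypothetical, assuming a PyTorch setup where the expert is reachable only through its predictions.

import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical black-box expert: only its predictions are observable.
def query_expert(x):
    # Stand-in for a pre-trained expert behind an opaque interface;
    # here it just returns random soft predictions over 10 classes.
    with torch.no_grad():
        return F.softmax(torch.randn(x.size(0), 10), dim=-1)

# White-box imitator that absorbs the expert's predictive patterns.
imitator = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10))
optimizer = torch.optim.Adam(imitator.parameters(), lr=1e-3)

# Local knowledge distillation: match the imitator's soft predictions
# to the expert's outputs on (unlabeled) local data via KL divergence.
for step in range(100):
    x = torch.randn(64, 32)                    # stand-in local data batch
    teacher_probs = query_expert(x)            # black-box soft labels
    student_log_probs = F.log_softmax(imitator(x), dim=-1)
    loss = F.kl_div(student_log_probs, teacher_probs, reduction="batchmean")
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

Note that this sketch covers only the single-task distillation step; per the abstract, the paper's multi-task paradigm goes further by representing each expert as a distribution over task-agnostic predictive prototypes rather than as a single imitator.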

Please cite our work using the BibTeX below.

@InProceedings{pmlr-v119-hoang20b,
  title     = {Learning Task-Agnostic Embedding of Multiple Black-Box Experts for Multi-Task Model Fusion},
  author    = {Hoang, Nghia and Lam, Thanh and Low, Bryan Kian Hsiang and Jaillet, Patrick},
  booktitle = {Proceedings of the 37th International Conference on Machine Learning},
  pages     = {4282--4292},
  year      = {2020},
  editor    = {III, Hal Daumé and Singh, Aarti},
  volume    = {119},
  series    = {Proceedings of Machine Learning Research},
  month     = {13--18 Jul},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v119/hoang20b/hoang20b.pdf},
  url       = {https://proceedings.mlr.press/v119/hoang20b.html}
}