Learning to Learn without Forgetting By Maximizing Transfer and Minimizing Interference

Lifelong Learning


  • Matthew Riemer
  • Ignacio Cases
  • Robert Ajemian
  • Miao Liu
  • Irina Rish
  • Yuhai Tu
  • Gerald Tesauro

Lack of performance when it comes to continual learning over non-stationary distributions of data remains a major challenge in scaling neural network learning to more human-realistic settings. In this work we propose a new conceptualization of the continual learning problem in terms of a temporally symmetric trade-off between transfer and interference that can be optimized by enforcing gradient alignment across examples. We then propose a new algorithm, Meta-Experience Replay (MER), that directly exploits this view by combining experience replay with optimization-based meta-learning. This method learns parameters that make interference based on future gradients less likely and transfer based on future gradients more likely. We conduct experiments across continual lifelong supervised learning benchmarks and non-stationary reinforcement learning environments, demonstrating that our approach consistently outperforms recently proposed baselines for continual learning. Our experiments show that the gap between the performance of MER and baseline algorithms grows both as the environment becomes more non-stationary and as the fraction of total experiences stored becomes smaller.
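The core idea above — interleaving replayed examples with incoming ones and applying a Reptile-style meta-update so that gradients across examples tend to align — can be sketched in a few lines. The following is a minimal illustration on a toy 1-D linear model, not the paper's implementation; the hyperparameter names (`inner_lr`, `meta_lr`, `k`) and the reservoir-sampled buffer are standard choices assumed here for concreteness.

```python
import random

def sgd_step(w, x, y, lr):
    # squared-error gradient for a 1-D linear model y ≈ w * x
    grad = 2.0 * (w * x - y) * x
    return w - lr * grad

def mer_update(w, buffer, example, inner_lr=0.03, meta_lr=0.3, k=5):
    """One MER-style step (illustrative sketch): take up to k inner SGD
    steps over the new example interleaved with replayed ones, then move
    the weights only a fraction of the way toward the inner result (a
    Reptile-style meta-update), which implicitly rewards parameter
    updates whose gradients align across examples."""
    w_old = w
    batch = [example] + random.sample(buffer, min(k - 1, len(buffer)))
    random.shuffle(batch)
    for x, y in batch:
        w = sgd_step(w, x, y, inner_lr)
    # meta-update: w_old + meta_lr * (w_inner - w_old)
    return w_old + meta_lr * (w - w_old)

# Usage: a stream of examples from y = 2x, with the replay buffer
# maintained by reservoir sampling so it stays an unbiased sample
# of the stream under a fixed memory budget.
random.seed(0)
w, buffer, capacity = 0.0, [], 50
for t, i in enumerate(range(1, 200)):
    ex = (i * 0.01, 2.0 * i * 0.01)
    w = mer_update(w, buffer, ex)
    if len(buffer) < capacity:
        buffer.append(ex)
    else:
        j = random.randint(0, t)
        if j < capacity:
            buffer[j] = ex
```

After processing the stream, `w` approaches the true slope of 2.0; the point of the sketch is the structure of the update, not the toy task itself.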

Please cite our work using the BibTeX below.

@article{DBLP:journals/corr/abs-1810-11910,
  author    = {Matthew Riemer and
               Ignacio Cases and
               Robert Ajemian and
               Miao Liu and
               Irina Rish and
               Yuhai Tu and
               Gerald Tesauro},
  title     = {Learning to Learn without Forgetting By Maximizing Transfer and Minimizing
               Interference},
  journal   = {CoRR},
  volume    = {abs/1810.11910},
  year      = {2018},
  url       = {http://arxiv.org/abs/1810.11910},
  archivePrefix = {arXiv},
  eprint    = {1810.11910},
  timestamp = {Thu, 12 Sep 2019 14:49:11 +0200},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}
