
Meta-Learning Representations for Continual Learning

Full Text: 8458-meta-learning-representations-for-continual-learning.pdf (PDF)

The reviews had two major concerns: the lack of benchmarking on a complex dataset, and unclear writing. To address these issues, we:

1. Rewrote the experiments section with improved terminology to make the paper clearer. We previously used the term "pretraining" to refer to both a baseline and the meta-training stage; as the reviewers pointed out, this was confusing. We now call the latter "meta-training" and have likewise renamed evaluation to "meta-testing."

2. Added Mini-ImageNet experiments to show that the proposed method scales to more complex datasets. It was also unclear whether the objective we introduced improves over a MAML-like objective that also learns representations, so we added MAML-Rep as a baseline; our method, which minimizes interference in addition to maximizing fast adaptation, performs noticeably better.

We also added pseudo-code for the algorithms to the main paper, as requested by the reviewers, and we contrast our algorithm with MAML to highlight the difference between the two. We believe this makes the current version significantly clearer to anyone who already understands the MAML objective. We have also fixed various minor writing issues and included some missing related work (bengio2019meta, nagabandi19, al2017continuous) discovered since our initial submission. Finally, we thank the reviewers and the meta-reviewer for the feedback, which allowed us to improve the work in several respects.
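The contrast with MAML mentioned above can be sketched very loosely as follows. This is a toy 1-D illustration of the two kinds of meta-objective, not the authors' actual algorithm: the quadratic `loss`, its `grad`, the task stream, and the step size `alpha` are all hypothetical stand-ins.

```python
# Toy 1-D quadratic "tasks": loss_i(w) = (w - c_i)^2, with gradient 2*(w - c_i).
def loss(w, c):
    return (w - c) ** 2

def grad(w, c):
    return 2.0 * (w - c)

def maml_like_meta_loss(w, tasks, alpha=0.1):
    # MAML-style sketch: adapt to each task independently from the same
    # initialization, then sum the post-adaptation losses.
    return sum(loss(w - alpha * grad(w, c), c) for c in tasks)

def interference_aware_meta_loss(w, tasks, alpha=0.1):
    # Sketch of an objective that also penalizes interference: adapt
    # sequentially through the task stream, then evaluate the final fast
    # weights on ALL tasks, so an update that overwrites earlier tasks
    # raises the meta-loss.
    w_fast = w
    for c in tasks:
        w_fast = w_fast - alpha * grad(w_fast, c)
    return sum(loss(w_fast, c) for c in tasks)

print(maml_like_meta_loss(0.5, [0.0, 1.0]))            # per-task adaptation
print(interference_aware_meta_loss(0.5, [0.0, 1.0]))   # sequential adaptation
```

In the second objective, minimizing the meta-loss pushes the initialization toward representations where sequential updates on one task do not undo progress on the others, which is the interference-minimization idea described above.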

Citation

K. Javed, M. White. "Meta-Learning Representations for Continual Learning". Neural Information Processing Systems (NeurIPS), (eds: Hanna M. Wallach, Hugo Larochelle, Alina Beygelzimer, Florence d'Alché-Buc, Emily B. Fox, Roman Garnett), pp 1818-1828, December 2019.

Keywords:  
Category: In Conference
Web Links: NeurIPS

BibTeX

@incollection{Javed+White:NIPS19,
  author = {Khurram Javed and Martha White},
  title = {Meta-Learning Representations for Continual Learning},
  editor = {Hanna M. Wallach and Hugo Larochelle and Alina Beygelzimer and
    Florence d'Alch{\'e}-Buc and Emily B. Fox and Roman Garnett},
  pages = {1818--1828},
  booktitle = {Neural Information Processing Systems (NeurIPS)},
  year = {2019},
}

Last Updated: February 24, 2020
Submitted by Sabina P
