Show simple item record

dc.contributor.author: Li, Yuanpeng
dc.contributor.author: Zhao, Liang
dc.contributor.author: Church, Kenneth
dc.contributor.author: Elhoseiny, Mohamed
dc.date.accessioned: 2020-04-26T23:58:59Z
dc.date.available: 2020-04-26T23:58:59Z
dc.date.issued: 2020-03-11
dc.date.submitted: 2019-09-25
dc.identifier.uri: http://hdl.handle.net/10754/662643
dc.description.abstract: Motivated by humans' ability to continually learn and accumulate knowledge over time, several research efforts have pushed machines to learn constantly while alleviating catastrophic forgetting. Most existing methods focus on continual learning of label-prediction tasks, which have fixed input and output sizes. In this paper, we propose a new continual learning scenario that handles the sequence-to-sequence tasks common in language learning. We further propose an approach that leverages compositionality to apply label-prediction continual learning algorithms to sequence-to-sequence continual learning. Experimental results show that the proposed method significantly outperforms state-of-the-art methods: it enables knowledge transfer and prevents catastrophic forgetting, achieving more than 85% accuracy over up to 100 stages on an instruction-learning task, compared with less than 50% accuracy for baselines. It also shows significant improvement on a machine translation task. This is the first work to combine continual learning and compositionality for language learning, and we hope it will make machines more helpful across a variety of tasks.
dc.description.sponsorship: Work partially done while visiting Baidu Research.
dc.publisher: ICLR
dc.relation.url: https://openreview.net/forum?id=rklnDgHtDS
dc.relation.url: https://iclr.cc/Conferences/2020
dc.rights: Archived with thanks to ICLR
dc.title: Compositional Continual Language Learning
dc.type: Conference Paper
dc.contributor.department: Computer, Electrical and Mathematical Sciences and Engineering (CEMSE) Division
dc.conference.date: April 26 - May 1, 2020
dc.conference.name: International Conference on Learning Representations (ICLR) 2020
dc.conference.location: Virtual conference (originally planned for Addis Ababa, Ethiopia)
dc.eprint.version: Post-print
dc.contributor.institution: Baidu Research
dc.contributor.institution: Stanford University
pubs.publication-status: Published
kaust.person: Elhoseiny, Mohamed
dc.date.accepted: 2019-12-20
refterms.dateFOA: 2020-04-26T23:58:59Z


Files in this item

Name: compositional_language_continual_learning.pdf
Size: 484.8Kb
Format: PDF
Description: Accepted Manuscript
