Continual classification learning using generative model

Lavda, Frantzeska (University of Geneva, Switzerland) ; Ramapuram, Jason (University of Geneva, Switzerland) ; Gregorova, Magda (Haute école de gestion de Genève, HES-SO // Haute Ecole Spécialisée de Suisse Occidentale) ; Kalousis, Alexandros (Haute école de gestion de Genève, HES-SO // Haute Ecole Spécialisée de Suisse Occidentale)

Continual learning is the ability to learn sequentially over time, accommodating new knowledge while retaining previously learned experiences. Neural networks can learn multiple tasks when trained on them jointly, but they cannot maintain performance on earlier tasks when the tasks are presented one at a time; this problem is known as catastrophic forgetting. In this work, we propose a classification model that learns continuously from sequentially observed tasks while preventing catastrophic forgetting. We build on the lifelong generative capabilities of [10] and extend them to the classification setting by deriving a new variational bound on the joint log-likelihood, log p(x, y).
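
For readers unfamiliar with variational bounds of this kind, the sketch below shows a generic evidence lower bound on the joint log-likelihood log p(x, y) with a latent variable z, under the assumption that x and y are conditionally independent given z. It is an illustrative LaTeX fragment only, not the specific bound derived in the paper.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Generic variational (ELBO-style) lower bound on log p(x, y):
% latent variable z with prior p(z), approximate posterior q_phi(z | x, y),
% and decoders p_theta(x | z), p_theta(y | z); x and y are assumed
% conditionally independent given z. Illustrative only.
\begin{align*}
\log p_\theta(x, y)
  &\ge \mathbb{E}_{q_\phi(z \mid x, y)}
       \bigl[\log p_\theta(x \mid z) + \log p_\theta(y \mid z)\bigr]
     - \operatorname{KL}\!\bigl(q_\phi(z \mid x, y)\,\big\|\,p(z)\bigr)
\end{align*}
\end{document}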


Conference Type:
full paper
Faculty:
Economie et Services
School:
HEG - Genève
Institute:
CRAG - Centre de Recherche Appliquée en Gestion
Subject(s):
Computer science
Publisher:
Montreal, Canada, 3-8 December 2018
Date:
2018-12
Pagination:
5 p.
Published in:
Proceedings of the 32nd Conference on Neural Information Processing Systems (NeurIPS) 2018