Topology-Preserving Class-Incremental Learning

Xiaoyu Tao, Xinyuan Chang, Xiaopeng Hong, Xing Wei, Yihong Gong

Abstract


A well-known issue in class-incremental learning is catastrophic forgetting, where the network's recognition performance on old classes degrades severely as new classes are learned incrementally. To alleviate forgetting, we propose preserving old-class knowledge by maintaining the topology of the network's feature space. On this basis, we propose a novel topology-preserving class-incremental learning (TPCIL) framework. TPCIL models the feature space topology with an elastic Hebbian graph (EHG), constructed using the competitive Hebbian learning rule. To maintain this topology, we develop a topology-preserving loss (TPL) that penalizes changes to the EHG's neighborhood relationships during the incremental learning phases. Comprehensive experiments on the CIFAR100, ImageNet, and subImageNet datasets demonstrate that TPCIL continually learns new classes with less forgetting. The code will be released.
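The abstract's two key ingredients can be illustrated with a small sketch. The first function implements the classic competitive Hebbian learning rule (connect the two nodes nearest to each input) used here to build a graph over feature-space nodes; the second is a hypothetical stand-in for a topology-preserving penalty that discourages drift in the pairwise distances between neighboring graph vertices. Function names, the exact form of the loss, and the use of Euclidean distance are assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

def competitive_hebbian_edges(features, nodes):
    """Competitive Hebbian learning rule: for each feature vector,
    insert an edge between its two nearest nodes. Returns a set of
    undirected edges over node indices."""
    edges = set()
    for f in features:
        d = np.linalg.norm(nodes - f, axis=1)
        i, j = np.argsort(d)[:2]       # indices of the two closest nodes
        edges.add((min(i, j), max(i, j)))
    return edges

def topology_drift_penalty(old_nodes, new_nodes, edges):
    """Hypothetical sketch of a topology-preserving penalty: the mean
    squared change in distance between each pair of neighboring
    vertices before vs. after an incremental update."""
    total = 0.0
    for i, j in edges:
        d_old = np.linalg.norm(old_nodes[i] - old_nodes[j])
        d_new = np.linalg.norm(new_nodes[i] - new_nodes[j])
        total += (d_new - d_old) ** 2
    return total / max(len(edges), 1)
```

A quick usage check: with three nodes and one feature near the first two, the rule inserts a single edge, and the penalty is zero when the nodes have not moved.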

Related Material

[pdf]