CoupleFace: Relation Matters for Face Recognition Distillation

Jiaheng Liu, Haoyu Qin, Yichao Wu, Jinyang Guo, Ding Liang, Ke Xu

Abstract


"Knowledge distillation is an effective method to im- prove the performance of a lightweight neural network (i.e., student model) by transferring the knowledge of a well- performed neural network (i.e., teacher model), which has been widely applied in many computer vision tasks, includ- ing face recognition. Nevertheless, the current face recogni- tion distillation methods usually utilize the Feature Consis- tency Distillation (FCD) (e.g., L 2 distance) on the learned embeddings extracted by the teacher and student models for each sample, which is not able to fully transfer the knowl- edge from the teacher to the student for face recognition. In this work, we observe that mutual relation knowledge between samples is also important to improve the discrim- inative ability of the learned representation of the student model, and propose an effective face recognition distilla- tion method called CoupleFace by additionally introducing the Mutual Relation Distillation (MRD) into existing distil- lation framework. Specifically, in MRD, we first propose to mine the informative mutual relations, and then intro- duce the Relation-Aware Distillation (RAD) loss to trans- fer the mutual relation knowledge of the teacher model to the student model. Extensive experimental results on multi- ple benchmark datasets demonstrate the effectiveness of our proposed CoupleFace for face recognition."
