Interpolating Convolutional Neural Networks Using Batch Normalization

Gratianus Wesley Putra Data, Kirjon Ngu, David William Murray, Victor Adrian Prisacariu; The European Conference on Computer Vision (ECCV), 2018, pp. 574-588

Abstract


Perceiving a visual concept as a mixture of learned ones comes naturally to humans, helping them grasp new concepts and reinforce old ones. For all their power and recent success, deep convolutional networks lack this ability. Inspired by recent work on universal representations for neural networks, we propose a simple emulation of this mechanism by repurposing batch normalization layers to discriminate between visual classes, and by formulating a way to combine them to solve new tasks. We show that this approach can be applied to 2-way few-shot learning, where it yields between 4% and 17% higher accuracy than straightforward full fine-tuning, and we demonstrate that it also extends to the orthogonal application of style transfer.
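
The abstract's core idea, class-specific batch normalization parameters that can be mixed to represent a new concept, can be illustrated with a minimal sketch. The following PyTorch snippet is not the authors' implementation; the module name, the use of shared running statistics, and the convex-combination forward pass are assumptions chosen only to show how per-class (gamma, beta) pairs might be interpolated.

```python
import torch
import torch.nn as nn

class ClassConditionalBN2d(nn.Module):
    """Hypothetical batch-norm layer with one (gamma, beta) pair per class.

    Normalization statistics are shared; a new concept is represented by a
    mixture (e.g. a convex combination) of the learned per-class affine
    parameters.
    """
    def __init__(self, num_features, num_classes):
        super().__init__()
        self.bn = nn.BatchNorm2d(num_features, affine=False)  # shared statistics
        self.gamma = nn.Parameter(torch.ones(num_classes, num_features))
        self.beta = nn.Parameter(torch.zeros(num_classes, num_features))

    def forward(self, x, class_weights):
        # class_weights: (num_classes,) mixture coefficients; one-hot selects a
        # learned class, a soft mixture interpolates between classes.
        x = self.bn(x)
        gamma = class_weights @ self.gamma   # (num_features,)
        beta = class_weights @ self.beta
        return x * gamma.view(1, -1, 1, 1) + beta.view(1, -1, 1, 1)


# Usage: represent a new concept as an interpolation of two learned classes.
layer = ClassConditionalBN2d(num_features=64, num_classes=10)
x = torch.randn(8, 64, 32, 32)
w = torch.zeros(10)
w[3], w[7] = 0.6, 0.4            # mixture of classes 3 and 7
out = layer(x, w)
print(out.shape)                 # torch.Size([8, 64, 32, 32])
```

In a few-shot setting, one could freeze the shared convolutional weights and fit only the mixture coefficients (or a fresh gamma/beta initialized from a mixture) on the novel classes; this is a sketch of that idea, not the paper's exact procedure.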

Related Material


[pdf]
[bibtex]
@InProceedings{Data_2018_ECCV,
author = {Data, Gratianus Wesley Putra and Ngu, Kirjon and Murray, David William and Prisacariu, Victor Adrian},
title = {Interpolating Convolutional Neural Networks Using Batch Normalization},
booktitle = {The European Conference on Computer Vision (ECCV)},
month = {September},
year = {2018}
}