ReActNet: Towards Precise Binary Neural Network with Generalized Activation Functions

Zechun Liu, Zhiqiang Shen, Marios Savvides, Kwang-Ting Cheng

Abstract


In this paper, we propose several ideas for enhancing a binary network to close its accuracy gap with real-valued networks without incurring any additional computational cost. We first construct a baseline network by modifying and binarizing a compact real-valued network with parameter-free shortcuts, bypassing all the intermediate convolutional layers including the downsampling layers. This baseline network strikes a good trade-off between accuracy and efficiency, achieving better performance than most existing binary networks at approximately half the computational cost. Through extensive experiments and analysis, we observe that the performance of binary networks is sensitive to variations in the activation distribution. Based on this important observation, we propose to generalize the traditional Sign and PReLU functions, denoted RSign and RPReLU respectively, to enable explicit learning of the distribution reshape and shift at near-zero extra cost. Lastly, we adopt a distributional loss to further enforce the binary network to learn output distributions similar to those of a real-valued network. We show that after incorporating all these ideas, the proposed ReActNet outperforms all state-of-the-art methods by a large margin. Specifically, it outperforms Real-to-Binary Net and MeliusNet29 by 4.0% and 3.6% top-1 accuracy respectively, and reduces the gap to its real-valued counterpart to within 3.0% top-1 accuracy on the ImageNet dataset. Code and models are available at: https://github.com/liuzechun/ReActNet.
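To make the generalized activations concrete, below is a minimal PyTorch sketch of RSign and RPReLU as described in the abstract: Sign and PReLU augmented with per-channel learnable shifts. The parameter shapes, initialization, and module structure are assumptions for illustration (the official repository linked above contains the authors' actual implementation, including the straight-through estimator used to backpropagate through the sign function).

```python
import torch
import torch.nn as nn


class RSign(nn.Module):
    """Generalized Sign: binarizes around a per-channel learnable threshold alpha
    instead of a fixed zero threshold (sketch; shapes are assumed)."""

    def __init__(self, channels: int):
        super().__init__()
        self.alpha = nn.Parameter(torch.zeros(1, channels, 1, 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Shift the input distribution before binarization.
        # A straight-through estimator would be needed for training.
        return torch.sign(x - self.alpha)


class RPReLU(nn.Module):
    """Generalized PReLU: learnable per-channel shifts applied before (gamma)
    and after (zeta) a channel-wise PReLU (sketch; shapes are assumed)."""

    def __init__(self, channels: int):
        super().__init__()
        self.gamma = nn.Parameter(torch.zeros(1, channels, 1, 1))  # input shift
        self.zeta = nn.Parameter(torch.zeros(1, channels, 1, 1))   # output shift
        self.prelu = nn.PReLU(num_parameters=channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.prelu(x - self.gamma) + self.zeta
```

The learnable shifts add only a handful of parameters per channel, which is why the abstract describes the extra cost as near zero relative to the convolutional layers.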
