Symmetry Regularization and Saturating Nonlinearity for Robust Quantization

Sein Park, Yeongsang Jang, Eunhyeok Park

Abstract


"Robust quantization improves the tolerance of networks for various implementations, allowing the maintenance of accuracy in a different bit-width or quantization policy. It offers appealing candidates, especially when the target objective (i.e., energy consumption and performance) is not static and implementations are fragmented. In this work, we perform extensive analyses to identify the sources of quantization error and present three insights to robustify the network against quantization: reduction of error propagation, range clamping for error minimization, and inherited robustness against quantization. Based on these insights, we propose two novel methods called symmetry regularization (SymReg) and saturating nonlinearity (SatNL). Applying the proposed methods during training can enhance the robustness of arbitrary neural networks against quantization on existing post-training quantization (PTQ) and quantization-aware training (QAT) algorithms, obtaining a single weight flexible enough to maintain the output quality at various conditions. We conduct extensive studies and validate the effectiveness of the proposed methods."
