Histogram-equalized quantization for logic-gated residual neural networks

authors

  • Van Thien Nguyen
  • William Guicquero
  • Gilles Sicard

keywords

  • CNN
  • Quantized neural networks
  • Histogram equalization
  • Skip connections
  • Logic-gated CNN

abstract

Adjusting the quantization according to the data or to the model loss seems mandatory to enable high accuracy in the context of quantized neural networks. This work presents Histogram-Equalized Quantization (HEQ), an adaptive framework for linear and symmetric quantization. HEQ automatically adapts the quantization thresholds using a unique step size optimization. We empirically show that HEQ achieves state-of-the-art performance on CIFAR-10. Experiments on the STL-10 dataset further show that HEQ enables proper training of our proposed logic-gated (OR, MUX) residual networks, reaching higher accuracy at lower hardware complexity than previous work.
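To make the framing concrete, below is a minimal NumPy sketch of the general idea of picking a single step size for a linear, symmetric quantizer so that the quantization bins are as equally populated as possible (a histogram-equalization criterion). This is an illustrative assumption, not the paper's actual HEQ algorithm: the function names (`symmetric_quantize`, `bin_entropy`, `heq_step_size`), the entropy objective, and the grid search are all hypothetical.

```python
# Hedged sketch of histogram-equalized step-size selection.
# NOT the authors' exact method; it only illustrates optimizing a unique
# step size so that quantization levels are as equally occupied as possible.
import numpy as np

def symmetric_quantize(x, step, bits):
    """Linear, symmetric quantization to integer levels in [-qmax, qmax]."""
    qmax = 2 ** (bits - 1) - 1
    return np.clip(np.round(x / step), -qmax, qmax)

def bin_entropy(levels, bits):
    """Entropy of the level-occupancy histogram; maximal when bins are equal."""
    qmax = 2 ** (bits - 1) - 1
    counts = np.bincount((levels + qmax).astype(int), minlength=2 * qmax + 1)
    p = counts / counts.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def heq_step_size(x, bits=4, num_candidates=64):
    """Grid-search a single step size that equalizes bin occupancy."""
    candidates = np.linspace(1e-3, 1.0, num_candidates) * np.abs(x).max()
    scores = [bin_entropy(symmetric_quantize(x, s, bits), bits)
              for s in candidates]
    return candidates[int(np.argmax(scores))]

# Example: quantize Gaussian-like activations with an equalized step size.
x = np.random.randn(10000)
step = heq_step_size(x, bits=4)
x_q = step * symmetric_quantize(x, step, bits=4)
```

The single scalar step size mirrors the abstract's "unique step size optimization"; maximizing occupancy entropy is just one plausible way to express the histogram-equalization objective.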
