Fusing input normalization into the first layer
This repository presents a simple implementation of fusing input normalization into the first layer of a network. It can be used to speed up the preprocessing step.
Both input normalization and the first layer are most often linear operations. Input normalization can be written as x_n = y_n ⋅ x + b_n, where y_n is the multiplicative (scale) part and b_n is the additive (bias) part. Let f(y, b, x) denote the first layer of the network. The goal is to substitute f(y′, b′, x) for f(y, b, y_n ⋅ x + b_n), in other words to fuse the normalization into the layer's parameters.
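As a concrete illustration, for a fully connected first layer f(y, b, x) = y @ x + b the fused parameters are y′ = y ⋅ y_n and b′ = y @ b_n + b. The sketch below (a minimal NumPy illustration, not the repository's actual code; the function name `fuse_normalization` is hypothetical) verifies this identity:

```python
import numpy as np

def fuse_normalization(weight, bias, scale, shift):
    # weight: (out, in) matrix of the first linear layer, bias: (out,)
    # scale, shift: per-input-channel normalization parameters, shape (in,)
    # Fusion identity: weight @ (scale * x + shift) + bias
    #                == (weight * scale) @ x + (weight @ shift + bias)
    fused_weight = weight * scale        # y' = y * y_n (broadcast over columns)
    fused_bias = weight @ shift + bias   # b' = y @ b_n + b
    return fused_weight, fused_bias

rng = np.random.default_rng(0)
W, b = rng.normal(size=(4, 3)), rng.normal(size=4)
y_n, b_n = rng.normal(size=3), rng.normal(size=3)  # normalization params
x = rng.normal(size=3)

W_f, b_f = fuse_normalization(W, b, y_n, b_n)
# The fused layer applied to raw x equals the original layer applied to normalized x.
assert np.allclose(W @ (y_n * x + b_n) + b, W_f @ x + b_f)
```

The same idea carries over to a convolutional first layer, where the per-channel scale multiplies the kernel and the shift contributes a summed term to the bias.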
- TensorFlow.
- Performance benchmarks.
Thanks to @lext for the Batch Norm Fusion for PyTorch and to the author of the blog. Funny pictures might be found here.