input_normalization_fusion

Input normalization fused with the first layer

About

This repository presents a simplistic implementation of fusing input normalization into the first layer of a network. It can be used to speed up the preprocessing step.

How it works

Both input normalization and the first layer are most often linear operations. Input normalization can be written as xn = 𝓦n β—‹ x + bn, where 𝓦n is the multiplicative part and bn is the bias, or additive, part. Let 𝓛(𝓦, b, x) be the first layer of our neural network. What we intend to do is to substitute 𝓛(𝓦’, b’, x) for 𝓛(𝓦, b, 𝓦n β—‹ x + bn), in other words, to fuse the normalization into the layer.
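For a fully connected first layer, the substitution above works out to 𝓦’ = 𝓦 β—‹ 𝓦n and b’ = 𝓦 bn + b. The sketch below illustrates this for a PyTorch `nn.Linear`, assuming the common per-feature normalization x_n = (x - mean) / std; the function name `fuse_normalization` is illustrative and not taken from this repository.

```python
import torch
import torch.nn as nn

def fuse_normalization(layer: nn.Linear, mean: torch.Tensor, std: torch.Tensor) -> nn.Linear:
    """Return a Linear layer equivalent to layer((x - mean) / std).

    Hypothetical helper: writes the normalization as
    x_n = w_n * x + b_n with w_n = 1/std and b_n = -mean/std,
    then folds w_n and b_n into the layer's weight and bias.
    """
    w_n = 1.0 / std
    b_n = -mean / std
    fused = nn.Linear(layer.in_features, layer.out_features)
    with torch.no_grad():
        # W' = W * diag(w_n): scale each input column by w_n
        fused.weight.copy_(layer.weight * w_n)
        # b' = W @ b_n + b: absorb the additive part into the bias
        fused.bias.copy_(layer.weight @ b_n + layer.bias)
    return fused

# Sanity check: the fused layer on raw x matches normalize-then-layer.
mean = torch.tensor([0.5, 0.4, 0.3])
std = torch.tensor([0.2, 0.25, 0.3])
layer = nn.Linear(3, 2)
fused = fuse_normalization(layer, mean, std)
x = torch.randn(4, 3)
```

Note that the same idea applies to a first convolutional layer with per-channel normalization, although zero padding then introduces a small discrepancy at image borders, since the original pipeline pads the normalized input rather than the raw one.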

TODO

  • TensorFlow support.
  • Performance benchmarks.

Acknowledgements

Thanks to @lext for the Batch Norm Fusion for Pytorch project and for the author's blog, where funny pictures might be found.
