Instance normalization layer
layer_instance_normalization(
  object,
  groups = 2,
  axis = -1,
  epsilon = 0.001,
  center = TRUE,
  scale = TRUE,
  beta_initializer = "zeros",
  gamma_initializer = "ones",
  beta_regularizer = NULL,
  gamma_regularizer = NULL,
  beta_constraint = NULL,
  gamma_constraint = NULL,
  ...
)
| Argument | Description |
|---|---|
| object | Model or layer object. |
| groups | Integer, the number of groups for Group Normalization. Can be in the range [1, N] where N is the input dimension. The input dimension must be divisible by the number of groups. |
| axis | Integer, the axis that should be normalized (typically the features axis). |
| epsilon | Small float added to the variance to avoid dividing by zero. |
| center | If TRUE, add an offset of `beta` to the normalized tensor. If FALSE, `beta` is ignored. |
| scale | If TRUE, multiply by `gamma`. If FALSE, `gamma` is not used. |
| beta_initializer | Initializer for the beta weight. |
| gamma_initializer | Initializer for the gamma weight. |
| beta_regularizer | Optional regularizer for the beta weight. |
| gamma_regularizer | Optional regularizer for the gamma weight. |
| beta_constraint | Optional constraint for the beta weight. |
| gamma_constraint | Optional constraint for the gamma weight. |
| ... | Additional parameters to pass to the underlying layer. |
A tensor of the same shape as its input.
Instance Normalization is a specific case of `GroupNormalization`, since it normalizes all features of one channel; the group size is equal to the channel size. Empirically, its accuracy is more stable than batch normalization across a wide range of small batch sizes, provided the learning rate is adjusted linearly with the batch size.
[Instance Normalization: The Missing Ingredient for Fast Stylization](https://arxiv.org/abs/1607.08022)
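A minimal usage sketch: the convolutional stem and input shape below are illustrative only, and it assumes the keras and tfaddons R packages are installed and attached.

```r
library(keras)
library(tfaddons)  # provides layer_instance_normalization()

# Illustrative model: a small convolutional stem followed by
# instance normalization on the channels (features) axis.
model <- keras_model_sequential() %>%
  layer_conv_2d(filters = 16, kernel_size = c(3, 3),
                input_shape = c(28, 28, 3)) %>%  # hypothetical input shape
  # Normalize each channel of each sample independently; the group
  # size equals the channel size, i.e. group norm with one channel
  # per group.
  layer_instance_normalization(axis = -1,
                               center = TRUE, scale = TRUE,
                               beta_initializer = "zeros",
                               gamma_initializer = "ones") %>%
  layer_activation("relu")
```

Because normalization statistics are computed per sample rather than per batch, the layer behaves identically at small and large batch sizes, which is why it is a common choice for style-transfer models.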