Definition
Layer normalization (layer norm) normalizes the inputs across the feature dimension for each individual sample, so the normalization statistics are computed per sample, independently of the other samples in the batch. In a fully connected layer, the mean and variance are computed over all the neurons in that layer for each input. In a convolutional layer, layer norm is applied across the channel dimension at each spatial position.
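As a minimal sketch of this definition (using NumPy, and omitting the learnable scale and shift parameters that practical implementations typically add), layer norm computes a mean and variance over the last (feature) axis of each sample:

```python
import numpy as np

def layer_norm(x, eps=1e-5):
    # Mean and variance are computed over the feature axis (last axis)
    # for each sample independently -- no statistics are shared across
    # the batch. eps guards against division by zero.
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

# Fully connected case: shape (batch, features); each row is normalized
# to roughly zero mean and unit variance on its own.
x = np.array([[1.0, 2.0, 3.0],
              [10.0, 20.0, 30.0]])
y = layer_norm(x)

# Convolutional case (channels-last layout, shape (batch, H, W, C)):
# the same function normalizes across the C channels at every
# spatial position.
feat = np.random.randn(2, 4, 4, 8)
out = layer_norm(feat)
```

Because the statistics are per sample, the result for each input does not depend on batch size or batch composition.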