Definition

He initialization is designed for neural networks that use the ReLU activation function.

For the normal distribution, weights are drawn as W ~ N(0, 2/n_in). For the uniform distribution, weights are drawn as W ~ U(-sqrt(6/n_in), sqrt(6/n_in)), where n_in is the number of input nodes of the layer and n_out is the number of output nodes; the standard He formulas scale by n_in.
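The two sampling rules above can be sketched in NumPy; the function names `he_normal` and `he_uniform` are illustrative, not part of any particular library:

```python
import numpy as np

def he_normal(n_in, n_out, rng=None):
    # He normal: draw weights from N(0, 2/n_in),
    # i.e. standard deviation sqrt(2/n_in).
    rng = np.random.default_rng() if rng is None else rng
    std = np.sqrt(2.0 / n_in)
    return rng.normal(0.0, std, size=(n_in, n_out))

def he_uniform(n_in, n_out, rng=None):
    # He uniform: draw weights from U(-limit, limit)
    # with limit = sqrt(6/n_in).
    rng = np.random.default_rng() if rng is None else rng
    limit = np.sqrt(6.0 / n_in)
    return rng.uniform(-limit, limit, size=(n_in, n_out))
```

Note that both formulas depend only on n_in (the fan-in); n_out only determines the shape of the weight matrix.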