Definition

He initialization is designed for neural networks that use the ReLU activation function.

For a Normal distribution: W ~ N(0, 2/n_in)
For a Uniform distribution: W ~ U(-√(6/n_in), +√(6/n_in))

where n_in is the number of input nodes (fan-in) of the layer, and n_out is the number of output nodes (fan-out).
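The two sampling rules above can be sketched in NumPy as follows. This is a minimal illustration, not a reference implementation; the function names he_normal and he_uniform are chosen here for clarity, and a layer's weight matrix is assumed to have shape (n_in, n_out).

```python
import numpy as np

def he_normal(n_in, n_out, seed=None):
    # He normal: draw weights from N(0, 2/n_in),
    # i.e. standard deviation sqrt(2/n_in).
    rng = np.random.default_rng(seed)
    std = np.sqrt(2.0 / n_in)
    return rng.normal(0.0, std, size=(n_in, n_out))

def he_uniform(n_in, n_out, seed=None):
    # He uniform: draw weights from U(-limit, +limit),
    # where limit = sqrt(6/n_in).
    rng = np.random.default_rng(seed)
    limit = np.sqrt(6.0 / n_in)
    return rng.uniform(-limit, limit, size=(n_in, n_out))

# Example: initialize a 512 -> 256 layer.
W = he_normal(512, 256, seed=0)
print(W.shape)          # (512, 256)
print(round(W.std(), 4))  # sample std, close to sqrt(2/512)
```

The variance 2/n_in compensates for ReLU zeroing out roughly half of each layer's inputs, keeping activation variance stable as depth grows.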
