We explore a few different initialization methods, including random, zeros, and He initialization, and investigate how each leads to different results.

Initialization is one of the ways to improve a deep neural network's results. Training a neural network requires specifying initial values for the weights, and a well-chosen initialization method helps the learning process.
A well-chosen initialization can:
- Speed up the convergence of gradient descent
- Increase the odds of gradient descent converging to a lower training (and generalization) error
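The three methods above can be sketched in a single helper. This is a minimal NumPy illustration, not a prescribed implementation: the function name `initialize_parameters`, the `layer_dims` argument, and the `* 10` scaling used to exaggerate the "random" case are assumptions for this example.

```python
import numpy as np

def initialize_parameters(layer_dims, method="he", seed=3):
    """Illustrative weight initialization for a fully connected network.

    layer_dims -- hypothetical list of layer sizes, e.g. [2, 4, 1]
    method     -- "zeros", "random", or "he"
    """
    rng = np.random.default_rng(seed)
    params = {}
    for l in range(1, len(layer_dims)):
        shape = (layer_dims[l], layer_dims[l - 1])
        if method == "zeros":
            # All-zero weights: every unit in a layer computes the same
            # output, so symmetry is never broken and the network fails to learn
            W = np.zeros(shape)
        elif method == "random":
            # Large random values (scaled by 10 here to exaggerate the effect):
            # activations saturate and gradients can vanish or explode
            W = rng.standard_normal(shape) * 10
        elif method == "he":
            # He initialization: scale by sqrt(2 / fan_in), well suited
            # to ReLU activations (He et al., 2015)
            W = rng.standard_normal(shape) * np.sqrt(2.0 / layer_dims[l - 1])
        else:
            raise ValueError(f"unknown method: {method}")
        params["W" + str(l)] = W
        # Biases can safely start at zero with any of these methods
        params["b" + str(l)] = np.zeros((layer_dims[l], 1))
    return params
```

Swapping the `method` argument while keeping everything else fixed is a simple way to compare how each choice affects convergence on the same model and data.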