This notebook walks through the implementation of all the functions required to build a deep neural network. By the end, you will be able to build a deep neural network with as many layers as you want!
- Use non-linear units like ReLU to improve your model
- Build a deeper neural network (with more than 1 hidden layer)
- Implement an easy-to-use neural network class
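For reference, the ReLU non-linearity mentioned above can be sketched in NumPy (the function name here is illustrative):

```python
import numpy as np

def relu(z):
    """ReLU activation: keeps positive values and zeroes out negatives."""
    return np.maximum(0, z)

print(relu(np.array([-2.0, 0.0, 3.0])))  # [0. 0. 3.]
```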
Notation:
- Superscript $[l]$ denotes a quantity associated with the $l^{th}$ layer.
    - Example: $a^{[L]}$ is the $L^{th}$ layer activation. $W^{[L]}$ and $b^{[L]}$ are the $L^{th}$ layer parameters.
- Superscript $(i)$ denotes a quantity associated with the $i^{th}$ example.
    - Example: $x^{(i)}$ is the $i^{th}$ training example.
- Lowerscript $i$ denotes the $i^{th}$ entry of a vector.
    - Example: $a^{[l]}_i$ denotes the $i^{th}$ entry of the $l^{th}$ layer's activations.
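To make the layer notation concrete, here is a minimal NumPy sketch of how the parameters $W^{[l]}$ and $b^{[l]}$ might be stored for an $L$-layer network. The helper name and the small random-scaling choice are illustrative assumptions, not a prescribed implementation:

```python
import numpy as np

def initialize_parameters_deep(layer_dims):
    """Initialize W^[l] and b^[l] for every layer l = 1, ..., L.

    layer_dims -- list of layer sizes [n_x, n_h1, ..., n_y]
    (this helper name and the 0.01 scaling are illustrative choices).
    """
    parameters = {}
    L = len(layer_dims) - 1  # number of layers with parameters
    for l in range(1, L + 1):
        # W^[l] has shape (n^[l], n^[l-1]); b^[l] has shape (n^[l], 1)
        parameters["W" + str(l)] = np.random.randn(layer_dims[l], layer_dims[l - 1]) * 0.01
        parameters["b" + str(l)] = np.zeros((layer_dims[l], 1))
    return parameters

params = initialize_parameters_deep([5, 4, 3])
print(params["W1"].shape, params["b2"].shape)  # (4, 5) (3, 1)
```

Storing the parameters in a dictionary keyed by layer index mirrors the $W^{[l]}$, $b^{[l]}$ notation and makes a variable-depth network easy to handle in a loop.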