Suppose you have found the total cost of the network.
We would like a very low cost for a good neural network and accurate results.
To achieve this, we use a technique that involves calculating the negative gradient of the cost function, which tells us how to change the current weights and biases to decrease the current cost. The algorithm that does this is backpropagation.
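As a minimal sketch of the idea (the quadratic cost, starting weight, and learning rate here are all hypothetical), one gradient-descent step moves a weight in the direction of the negative gradient:

```python
# Gradient descent on a toy one-weight cost (hypothetical example).
def cost(w):
    return (w - 3.0) ** 2  # minimum at w = 3

def grad(w):
    return 2.0 * (w - 3.0)  # derivative of the cost with respect to w

w = 0.0
learning_rate = 0.1
for _ in range(100):
    w -= learning_rate * grad(w)  # step along the negative gradient

print(round(w, 4))  # converges toward 3.0, the cost's minimum
```

Backpropagation is what computes this gradient efficiently for every weight and bias in a real network; the update rule itself is just this subtraction.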
Suppose the gradient of your cost function looks like this: [1.1, 2.2, 3.3, …]. Each entry can be understood as the sensitivity of the cost to a change in the corresponding weight;
i.e. the cost function is 1.1 times as sensitive to a change in the first weight, while the second weight influences the cost function 2.2 times as much.
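This sensitivity reading can be checked numerically: nudging the i-th weight by a small ε should change the cost by roughly gradient[i] · ε. Here is a sketch with a made-up cost whose partial derivatives are exactly [1.1, 2.2, 3.3]:

```python
# Finite-difference check: each gradient entry measures how sensitive
# the cost is to its weight (toy cost, hypothetical numbers).
gradient = [1.1, 2.2, 3.3]

def cost(weights):
    # A toy cost whose partial derivatives are exactly 1.1, 2.2, 3.3.
    return sum(g * w for g, w in zip(gradient, weights))

w = [0.5, 0.5, 0.5]
eps = 0.01
base = cost(w)

sensitivities = []
for i in range(len(w)):
    nudged = w.copy()
    nudged[i] += eps  # nudge only the i-th weight
    sensitivities.append((cost(nudged) - base) / eps)

print(sensitivities)  # approximately [1.1, 2.2, 3.3]
```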
activation of a neuron: a = σ(w₀a₀ + w₁a₁ + … + wₙaₙ + b)
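The formula can be computed directly. A small sketch with the sigmoid as σ (the weights, previous-layer activations, and bias below are made-up numbers):

```python
import math

def sigmoid(z):
    # The squashing function sigma, mapping any real z into (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def activation(weights, inputs, bias):
    # Weighted sum w0*a0 + w1*a1 + ... + wn*an + b, then squash with sigma.
    z = sum(w * a for w, a in zip(weights, inputs)) + bias
    return sigmoid(z)

a = activation([0.5, -0.2], [1.0, 2.0], 0.1)  # hypothetical values
print(a)  # a number strictly between 0 and 1
```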
There are three ways we can increase the activation of a neuron:
1. Increase the bias
2. Increase the weights
3. Change the activations from the previous layer
- Increase the bias:
Increase the bias of the neuron we want to activate more, and decrease the biases of all the other neurons.
- Increase the weights:
Weights are multiplied by activations (wᵢ·aᵢ), so different weights have different effects on the activation; increasing a weight attached to a large positive activation boosts the neuron the most.
- Change the activations from the previous layer:
If we adjust the previous layer's weights and biases, the activation of the current layer changes. So, recursively, the activation of each layer depends on the previous layer.
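The three levers can be sketched side by side (all numbers hypothetical): starting from the same neuron, raising the bias, raising a weight, or raising a previous-layer activation each pushes the activation up, since each increases the weighted sum fed into σ.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def neuron(weights, acts, bias):
    # a = sigma(w0*a0 + w1*a1 + ... + b)
    return sigmoid(sum(w * a for w, a in zip(weights, acts)) + bias)

weights, acts, bias = [0.5, -0.2], [1.0, 2.0], 0.1  # hypothetical values
base = neuron(weights, acts, bias)

more_bias = neuron(weights, acts, bias + 0.5)                    # lever 1
more_weight = neuron([weights[0] + 0.5, weights[1]], acts, bias)  # lever 2
more_act = neuron(weights, [acts[0] + 0.5, acts[1]], bias)        # lever 3

print(base < more_bias, base < more_weight, base < more_act)
```

Note that levers 2 and 3 raise the activation here because w₀ and a₀ are both positive; with a negative weight or activation, the same nudge would push the other way, which is exactly why the gradient's sign matters.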