Say you’ve computed the overall cost of the network.
For a good neural network that gives accurate results, we want this cost to be as low as possible.
To achieve this we use a technique that involves calculating the negative gradient of the cost function, which tells us how to change the current weights and biases to decrease the current cost. The algorithm that computes this gradient is backpropagation.
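As a minimal sketch of how that gradient is used (assuming a generic parameter vector and a gradient already produced by backpropagation, with made-up numbers), one update step could look like this:

```python
import numpy as np

# Minimal sketch of one negative-gradient update.
# Assumed setup: a generic parameter vector (weights and biases flattened together)
# and the gradient of the cost with respect to it, as produced by backpropagation.
def gradient_step(params, grad, learning_rate=0.1):
    # Stepping opposite the gradient decreases the cost, to first order.
    return params - learning_rate * grad

params = np.array([0.5, -1.2, 0.8])   # hypothetical current weights/biases
grad = np.array([0.3, -0.7, 1.2])     # hypothetical gradient from backpropagation
params = gradient_step(params, grad)
print(params)                         # moved slightly downhill on the cost surface
```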
Suppose the gradient of your cost function looks like this: [1.1, 2.2, 3.3, … ]. Each entry can be read as the sensitivity of the cost to a change in the corresponding weight,
i.e. the cost function is 1.1 times as sensitive to a change in the first weight, while the second weight influences the cost 2.2 times as much.
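To make that sensitivity reading concrete, here is a tiny illustration using those toy numbers (not a real network): a nudge of eps to weight i changes the cost by roughly gradient[i] · eps.

```python
# Toy illustration of the sensitivity reading (made-up gradient, not a real network):
# a small nudge eps to weight i changes the cost by roughly grad[i] * eps,
# so here the second weight matters twice as much as the first.
grad = [1.1, 2.2, 3.3]
eps = 0.01
for i, g in enumerate(grad):
    print(f"nudging weight {i} by {eps} changes the cost by about {g * eps:.3f}")
```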
activation of a neuron = σ(w₀a₀ + w₁a₁ + … + wₙaₙ + b)
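As a small sketch of that formula, assuming σ is the sigmoid function and using made-up weights, previous-layer activations, and bias:

```python
import numpy as np

# Sketch of the formula above, assuming σ is the sigmoid and using made-up values.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def neuron_activation(weights, prev_activations, bias):
    # σ(w0*a0 + w1*a1 + ... + wn*an + b)
    return sigmoid(np.dot(weights, prev_activations) + bias)

w = np.array([0.4, -0.6, 0.9])   # hypothetical weights into one neuron
a = np.array([0.2, 0.7, 0.5])    # hypothetical activations from the previous layer
b = 0.1                          # hypothetical bias
print(neuron_activation(w, a, b))
```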
There are 3 ways we can increase the activation of a neuron:
1. Increase the bias
2. Increase the weights
3. Change the activations from the previous layer
- Increase the bias:
Increase the bias of the neuron we want to activate more, and decrease the bias of all other neurons.
- Increase the weights:
Weights are multiplied by activations (wᵢ·aᵢ), so increasing different weights has different effects: increasing a weight attached to a large activation changes the neuron’s activation more than one attached to a small activation.
- Change the activations from the previous layer:
If we change the previous layer’s weights and biases, the activation of the current layer changes. So, recursively, the activation of each layer depends on the previous one (see the sketch below).
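Here is a rough sketch of those three nudges on the same assumed sigmoid neuron as above (hypothetical values): increasing the bias raises the weighted sum directly, nudging each weight in proportion to its incoming activation aᵢ gives the biggest effect where the activation is largest, and nudging each previous activation in proportion to its weight wᵢ is the change we would, recursively, ask the previous layer to make.

```python
import numpy as np

# Sketch of the three nudges on an assumed sigmoid neuron (hypothetical values).
# The weighted sum is z = w·a + b, so:
#   1. increasing b raises z directly;
#   2. increasing w[i] raises z in proportion to a[i], so weights on large activations
#      give the biggest effect;
#   3. increasing a[i] raises z in proportion to w[i], and a[i] itself comes from the
#      previous layer, which is why the adjustment recurses backwards.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.array([0.4, -0.6, 0.9])
a = np.array([0.2, 0.7, 0.5])
b = 0.1
eps = 0.1

base          = sigmoid(np.dot(w, a) + b)
bias_nudged   = sigmoid(np.dot(w, a) + (b + eps))        # 1. increase the bias
weight_nudged = sigmoid(np.dot(w + eps * a, a) + b)      # 2. nudge weights in proportion to a
act_nudged    = sigmoid(np.dot(w, a + eps * w) + b)      # 3. nudge previous activations in proportion to w

print(base, bias_nudged, weight_nudged, act_nudged)      # each nudge increases the activation
```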