I started developing a neural net, but I don't really understand how to update the weights and biases in the hidden layer; or rather, there is one factor I do not understand.
Let's assume we have a net of this structure:
For updating weights in the last layer I have the following formula (example for w5):
sigmoid' stands for sigmoid * (1 - sigmoid), and w5_updated would be w5 - learning_rate * the result of the formula.
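To make it concrete, this is roughly how I picture that update in code (assuming a squared-error loss, a sigmoid output neuron, and that w5 connects hidden neuron h1 to output neuron o1; the numbers and variable names are just my own placeholders):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# values from the forward pass (made-up numbers)
out_h1 = 0.59        # activation of the hidden neuron that w5 starts from
net_o1 = 1.10        # weighted input of the output neuron o1
out_o1 = sigmoid(net_o1)
target_o1 = 0.01     # expected output for o1
learning_rate = 0.5
w5 = 0.40

# dE/dw5 = (out - target) * sigmoid'(net) * out_h1, with sigmoid' = out * (1 - out)
delta_o1 = (out_o1 - target_o1) * out_o1 * (1 - out_o1)
grad_w5 = delta_o1 * out_h1

w5_updated = w5 - learning_rate * grad_w5
```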
So far, everything makes sense to me.
The problem I have is in the hidden layer, as mentioned above. For the weights there I have this formula (example for w1):
And the thing I don't get here is the sum at the end. In this example it makes sense, but if I had two hidden layers instead of one, what would I sum then? I first thought I'd need to sum over the next layer, but that doesn't make sense either, because I would need the expected output of those neurons, and there is no expected output for a neuron in a hidden layer.
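For the single-hidden-layer case, this is how I currently read that sum (again assuming squared error and sigmoid activations everywhere; names like deltas_o are my own):

```python
# values from the forward pass (made-up numbers)
i1 = 0.05                    # the input that w1 multiplies
out_h1 = 0.59                # activation of the hidden neuron h1
outs_o = [0.75, 0.77]        # outputs of the two output neurons
targets_o = [0.01, 0.99]     # their expected values
w_h1_to_o = [0.40, 0.50]     # weights from h1 to each output neuron
learning_rate = 0.5
w1 = 0.15

# delta of each output neuron: (out - target) * sigmoid'(net), with sigmoid' = out * (1 - out)
deltas_o = [(o - t) * o * (1 - o) for o, t in zip(outs_o, targets_o)]

# the sum in question: each output neuron's delta, weighted by its connection to h1
sum_term = sum(d * w for d, w in zip(deltas_o, w_h1_to_o))

# gradient for w1 and the update
grad_w1 = sum_term * out_h1 * (1 - out_h1) * i1
w1_updated = w1 - learning_rate * grad_w1
```

This works here because the next layer is the output layer, where I do have expected values; my confusion is what replaces those expected values once the next layer is itself hidden.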
Let's assume we had this net:
How would I calculate a new weight for w1 for example?
Thank you!
question from:
https://stackoverflow.com/questions/65905039/how-to-calculate-weights-and-biases-in-hidden-layer-in-neural-net