Help with adjusting neural network weights
I am building a neural network in C++, and according to the tutorials I have read, with backpropagation the weight of a connection inside the network is adjusted with the following formula:

weight = learnRate * error * connectionInput + weight

What I'm confused about is "error". I know that it is ExpectedValue - ActualValue, but the expected value of what? The expected value of the neural network's output? And if we have multiple outputs, how do we calculate the error then?
3 Answers
The weights are adjusted each time the network produces an output that differs from the expected value. Say a single-layer neural network with 3 input nodes, P1, P2 and P3, gives the output:
(P1*W1)+(P2*W2)+(P3*W3)
where W1, W2 and W3 are their weights respectively.
From that equation, only a single output is produced each time the network fires, and the weights of P1, P2 and P3 are adjusted accordingly. How did you get multiple outputs?
I thought the output of each neuron in the last layer would be considered a separate neural network output, but if the NN output is the sum of the last layer's outputs, then I guess that answers my question.