This equation is used to compute the error of a hidden neuron from its potential and the weighted sum of the errors of the neurons in the next layer. This is the key idea behind backpropagation and the reason it is computationally feasible.
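Assembling the symbols defined in the table below, the equation plausibly takes the following form. This is a reconstruction, assuming \( i \) runs over the \( L \) neurons of the next layer; the exact index and transpose conventions depend on the notation used elsewhere in this article:

\[
\delta_j = \sigma'(a_j) \sum_{i=1}^{L} W_{ij} \, \delta_i
\]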
| Symbol | Meaning |
| --- | --- |
| \( j \) | This is a secondary symbol for an iterator, a variable that changes value to refer to a series of elements. |
| \( i \) | This is the symbol for an iterator, a variable that changes value to refer to a sequence of elements. |
| \( ' \) | This is the symbol for the derivative of a function. If \( f(x) \) is a function, then \( f'(x) \) is the derivative of that function. |
| \( \sigma \) | This symbol represents the activation function. It maps real values to other real values in a non-linear way. |
| \( \mathbf{W} \) | This symbol represents the matrix containing the weights and biases of a layer in a neural network. |
| \( \delta \) | This is the error of a neuron in a feedforward neural network. |
| \( \sum \) | This is the summation symbol in mathematics; it represents the sum of a sequence of numbers. |
| \( a \) | This is the potential (pre-activation value) of a neuron in a layer of a feedforward neural network. |
| \( L \) | This symbol refers to the number of neurons in a layer. |
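As a concrete illustration, here is a minimal NumPy sketch of this computation. It assumes a sigmoid activation and the shapes described in the comments; the names (`hidden_layer_error`, `W_next`, `delta_next`) are illustrative, not taken from this article, and bias terms are omitted since they do not affect the error propagated back to the previous layer:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_prime(x):
    # Derivative of the sigmoid activation: sigma'(x) = sigma(x) * (1 - sigma(x))
    s = sigmoid(x)
    return s * (1.0 - s)

def hidden_layer_error(a, W_next, delta_next):
    # Computes delta_j = sigma'(a_j) * sum_i W_ij * delta_i for every hidden neuron j.
    #   a          : potentials of the hidden layer, shape (n,)
    #   W_next     : weights of the next layer, shape (L, n); row i holds the
    #                weights feeding next-layer neuron i
    #   delta_next : errors of the L neurons in the next layer, shape (L,)
    weighted_errors = W_next.T @ delta_next   # sum_i W_ij * delta_i, shape (n,)
    return sigmoid_prime(a) * weighted_errors

# Usage: 3 hidden neurons feeding a next layer of L = 2 neurons
a = np.array([0.5, -1.2, 0.3])
W_next = np.array([[0.1, -0.4, 0.2],
                   [0.7, 0.05, -0.3]])
delta_next = np.array([0.08, -0.02])
print(hidden_layer_error(a, W_next, delta_next))
```

Because the entire sum is a single matrix-vector product, the errors of a whole layer are obtained in one pass, which is what makes backpropagation computationally feasible.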