Citation Hunt

The Wikipedia snippet below is not backed by a reliable source. Can you find one?


On the page Neural network (machine learning):

"

Backpropagation is a method used to adjust the connection weights to compensate for each error found during learning. The error amount is effectively divided among the connections. Technically, backpropagation calculates the gradient (the derivative) of the cost function associated with a given state with respect to the weights. The weight updates can be done via stochastic gradient descent or other methods, such as extreme learning machines,[1] "no-prop" networks,[2] training without backtracking,[3] "weightless" networks,[4][5] and non-connectionist neural networks.[citation needed]