© 2000 Todd Neller.  A.I.M.A. text figures © 1995 Prentice Hall.  Used by permission.
Back-Propagation
•Basic idea: Supply a training input, let the computation feed forward, compute the error against the training output, then propagate the error backward to update the weights.
–Start with final layer
–Update that layer's weights according to its output error, as with the perceptron learning rule
–Assign error to the units of the previous layer in proportion to the connecting weights
–Repeat this process backward through the layers
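The steps above can be sketched as follows. This is a minimal illustrative implementation, not from the original slides: a single hidden layer with sigmoid units trained on the XOR problem by gradient descent on squared error. The network shape, learning rate, iteration count, and omission of bias weights are all assumptions made for brevity.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative training set: XOR inputs and targets (assumed example data)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(scale=0.5, size=(2, 4))  # input -> hidden weights
W2 = rng.normal(scale=0.5, size=(4, 1))  # hidden -> output weights
lr = 0.5                                  # learning rate (assumed value)

def mean_squared_error():
    return np.mean((sigmoid(sigmoid(X @ W1) @ W2) - T) ** 2)

initial_error = mean_squared_error()
for _ in range(5000):
    # Computation feeds forward
    H = sigmoid(X @ W1)
    Y = sigmoid(H @ W2)
    # Error computed with training output; start with the final layer
    delta_out = (T - Y) * Y * (1 - Y)
    # Assign error to units of the previous layer according to the weights
    delta_hidden = (delta_out @ W2.T) * H * (1 - H)
    # Weight updates, as with the perceptron learning rule
    W2 += lr * H.T @ delta_out
    W1 += lr * X.T @ delta_hidden
final_error = mean_squared_error()
```

Each iteration performs one forward pass and one backward pass over all four training examples; the error on the training set decreases as the weights are updated.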