CGSC170 Chapter Notes - Chapter 8.3: Backpropagation, Hebbian Theory, Unsupervised Learning


Document Summary

Hebbian learning: an individual unit's weights change directly as a function of the inputs to and outputs from that unit, so the information needed to change the weight of a synaptic connection is directly available to the presynaptic axon and the postsynaptic dendrite. Neural network modelers therefore regard Hebbian learning as much more biologically plausible than backpropagation, which requires both a fixed target for each output unit and a way of spreading an error signal back through the network.

Unsupervised learning: there is no fixed target for each output unit. Instead, the network classifies a set of inputs in such a way that each output unit comes to fire in response to a particular set of input patterns, which requires detecting similarities between different input patterns. Such networks have been used, for example, to model visual pattern recognition: visual recognition can identify the same object from multiple angles and perspectives, and there are several competitive network models of this kind of position-invariant object recognition. In these models there are no connections between units within a single layer, and the output units are allowed to compete with one another (see the sketch below).
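A minimal sketch of these two update rules, written in NumPy under illustrative assumptions (the layer sizes, learning rate, and random input patterns below are not from the notes): a Hebbian update that uses only a unit's own inputs and outputs, and a winner-take-all competitive update in which the output units compete and only the winner's weights move toward the input pattern.

import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_outputs = 4, 3
lr = 0.1                                   # learning rate (assumed value)
W = rng.random((n_outputs, n_inputs))      # one weight row per output unit

def hebbian_update(W, x):
    """Hebbian rule: delta_w = lr * post * pre, using only the unit's own
    inputs (pre) and outputs (post) -- no error signal is spread back."""
    post = W @ x                           # each output unit's activation
    return W + lr * np.outer(post, x)

def competitive_update(W, x):
    """Winner-take-all: the output unit whose weights best match the input
    wins the competition, and only its weights move toward that input."""
    winner = np.argmax(W @ x)              # unit with the strongest response
    W = W.copy()
    W[winner] += lr * (x - W[winner])      # pull the winner's weights toward x
    return W

# Unsupervised training: no fixed target for any output unit; the network
# simply groups similar input patterns onto the same winning output unit.
patterns = rng.random((20, n_inputs))
for x in patterns:
    W = competitive_update(W, x)

print("Winning unit per pattern:", [int(np.argmax(W @ x)) for x in patterns])

Note that both updates use only information locally available at the connection or within the layer, which is the sense in which the notes call these schemes more biologically plausible than backpropagation.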

