
CGSC170 Chapter Notes - Chapter 8.3: Backpropagation, Hebbian Theory, Unsupervised Learning

Cognitive Science (CGSC170), Kaja Jasinka

Local Algorithms
An individual unit's weights change directly as a function of the inputs to and outputs from
that unit
In terms of neurons
Information for changing the weight of a synaptic connection is directly available
to the presynaptic axon and the postsynaptic dendrite
EX- Hebbian Learning
Neural network modelers consider it much more biologically plausible than backpropagation
Used in unsupervised learning
Backpropagation requires very detailed feedback
And a way of propagating an error signal back through the network
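Because a Hebbian update uses only activity local to the connection, it can be sketched in a few lines. This is a minimal illustration assuming the classic rule Δw = η · post · pre ("cells that fire together wire together"); the function name and learning rate η are assumptions for illustration, not from the notes.

```python
import numpy as np

def hebbian_update(weights, pre, post, eta=0.1):
    """Strengthen each connection using only information local to it:
    the presynaptic activity `pre` and postsynaptic activity `post`.
    No error signal is propagated back through the network."""
    return weights + eta * np.outer(post, pre)

# Two output units, three input units, all weights initially zero.
w = np.zeros((2, 3))
pre = np.array([1.0, 0.0, 1.0])   # presynaptic firing pattern
post = np.array([1.0, 0.0])      # postsynaptic firing pattern
w = hebbian_update(w, pre, post)
# Only connections where both sides fired together are strengthened.
```

Contrast this with backpropagation, where updating a hidden weight requires an error signal computed from targets at the output layer and passed back through the network.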
Competitive Networks
Do not require feedback at all
No fixed target for each output unit
No external teacher
Classify a set of inputs in such a way that each output unit fires in response to a
particular set of input patterns
Particularly good at classification tasks
That require detecting similarities between different input patterns
Ex- They have been used for modeling visual pattern recognition
Visual recognition requires identifying the same object from
multiple angles and perspectives
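One step of a competitive network can be sketched as a winner-take-all update: the output unit whose weight vector is closest to the input wins, and only its weights move toward that input, with no target and no external teacher. The distance-based winner rule and all names here are standard-but-assumed illustrations, not taken from the notes.

```python
import numpy as np

def competitive_step(weights, x, eta=0.5):
    """One winner-take-all learning step: pick the output unit whose
    weight vector best matches the input, then pull only that unit's
    weights toward the input. No fixed target, no error feedback."""
    winner = int(np.argmin(np.linalg.norm(weights - x, axis=1)))
    updated = weights.copy()
    updated[winner] += eta * (x - updated[winner])  # move winner toward input
    return updated, winner

# Two output units competing over two-dimensional inputs.
w = np.array([[1.0, 0.0],
              [0.0, 1.0]])
w, win = competitive_step(w, np.array([0.9, 0.1]))  # unit 0 is closest, so it wins
```

Repeating this step over many inputs makes each output unit specialize on one cluster of similar input patterns, which is why such networks suit classification tasks like recognizing the same visual object across views.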