01:830:305 Lecture Notes - Lecture 14: Central Processing Unit, Long-Term Potentiation, Perceptron

10 views · 3 pages
17 Jun 2020

Document Summary

Models of the mind should be based on the actual architecture of the brain. The brain does not have a central processing unit; it has many units working in parallel. Information is not stored in one discrete place; instead it is distributed everywhere in the weights along the connections. All learning follows a common mechanism: modification of connection weights based on experience. In neuroscience, a long-term change in the degree to which a particular synaptic connection conveys excitation or inhibition is called long-term potentiation (LTP). The analog of LTP in an artificial neural network is the weight on the connection between two nodes in the network. Raising an excitatory weight makes it excite more; lowering it makes it excite less. Raising an inhibitory weight makes it inhibit more; lowering it makes it inhibit less. A weight of zero means the connection has no effect.
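The ideas above can be sketched in code. This is a minimal illustration (not from the lecture itself) of a single artificial node: a positive weight is excitatory, a negative weight is inhibitory, a zero weight has no effect, and "learning from experience" is just the classic perceptron rule nudging each weight by the error. The function names and the learning rate are illustrative choices, not anything specified in the notes.

```python
def activate(inputs, weights, threshold=0.0):
    """Fire (return 1) if the weighted sum of inputs exceeds the threshold.

    Each weight is the ANN analog of LTP at a synapse:
    positive = excitatory, negative = inhibitory, zero = no effect.
    """
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total > threshold else 0


def learn(inputs, weights, target, rate=0.1):
    """Perceptron rule: modify each connection weight based on experience.

    The weight change is the output error scaled by that connection's input,
    so only connections that carried a signal get strengthened or weakened.
    """
    error = target - activate(inputs, weights)
    return [w + rate * error * x for w, x in zip(weights, inputs)]


# Excitation (0.6) outweighs inhibition (-0.3); the zero weight contributes nothing.
print(activate([1, 1, 1], [0.6, -0.3, 0.0]))  # → 1

# One learning step: the node failed to fire when it should have (target 1),
# so the weight on the active connection is raised.
print(learn([1], [0.0], target=1))  # → [0.1]
```

Note that knowledge here lives nowhere but in the weight list itself, which is the distributed-storage point the notes make: there is no separate memory location holding what the node has learned.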

