COGS 100 Lecture Notes - Fault Tolerance, Hebbian Theory, Positron Emission Tomography


Document Summary

There were two leading theories of word recognition, and Petersen performed PET scans to decide between these two theories of lexical processing; the cognitive theory came out ahead. Subjects viewed words, listened to words, spoke words, and generated appropriate verbs. The PET scan images arise by subtraction: the average baseline image is subtracted from the average task image.

The lecture then introduces single-unit networks and Boolean functions, introduces (Donald) Hebb's Hebbian learning and the perceptron convergence rule, and explains the limits of learning in single-unit networks. Connectionism developed from the idea that one should model computations in a way inspired by how the brain works. Such networks can be used to model the simultaneous satisfaction of multiple soft constraints, and they are intended as models at the information-processing and algorithmic levels. Functions are mappings from a domain of objects into a range; for Boolean functions the domain is made up of truth values, and for binary Boolean functions the domain is all the different possible pairs of truth values.
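A single-unit network of the kind the lecture describes can be sketched as a threshold unit: it fires (outputs 1) exactly when the weighted sum of its inputs reaches a threshold. The weights and thresholds below are illustrative assumptions, not values from the lecture; they show how one unit can compute binary Boolean functions such as AND and OR.

```python
def threshold_unit(inputs, weights, threshold):
    """Fire (output 1) iff the weighted sum of inputs reaches the threshold."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0

# AND: both inputs must be 1 for the weighted sum (1+1) to reach threshold 2.
def AND(x1, x2):
    return threshold_unit([x1, x2], [1, 1], 2)

# OR: a single active input already reaches threshold 1.
def OR(x1, x2):
    return threshold_unit([x1, x2], [1, 1], 1)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "AND:", AND(x1, x2), "OR:", OR(x1, x2))
```

No choice of weights and threshold for a single unit computes XOR, which is one way to see the limits of learning in single-unit networks mentioned above.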
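The perceptron convergence rule can likewise be sketched in a few lines: after each example, the weights are nudged in proportion to the error between the target and the unit's output. The learning rate, initial weights, and training on Boolean OR are illustrative assumptions; the convergence guarantee holds for any linearly separable function, which is why the same rule never settles on XOR.

```python
def train_perceptron(examples, epochs=20, lr=0.1):
    """Train a single threshold unit with the perceptron learning rule."""
    w = [0.0, 0.0]  # connection weights, initialized to zero (an assumption)
    b = 0.0         # bias term standing in for the threshold
    for _ in range(epochs):
        for (x1, x2), target in examples:
            out = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - out
            # Perceptron rule: adjust each weight by lr * error * input.
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Boolean OR is linearly separable, so the rule converges on it.
OR_examples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_perceptron(OR_examples)

def predict(x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
```

After training, `predict` reproduces OR on all four input pairs.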
