COGS 2160 Lecture Notes - Lecture 11: Artificial Neural Network, Hebbian Theory, Unsupervised Learning


Document Summary

Neurons that are out of sync fail to link. The Hebbian rule has a simple formal expression: Δw12 = ε × a1 × a2, where a1 is the activation level of the first node, a2 is the activation level of the second node, and ε is the learning rate. Hebbian learning also features in more complex learning algorithms, e.g. competitive learning. In a feedforward network, each unit's activation function gives its output activation, which is fed forward and transmits the activity level to the units in the next layer. Under the perceptron convergence rule, if delta is positive, the intended output was bigger than the actual output: the network has undershot, so the weights need to be increased and the threshold needs to be decreased. A single-layer network is not capable of computing all Boolean functions: there are some it cannot compute. XOR is not linearly separable: the network must output 1 when only i2 = 1 (so w2 > t) and when only i1 = 1 (so w1 > t), yet output 0 when both inputs are 1, which would require w1 + w2 ≤ t; no weights satisfy all three constraints. The basic problem: multilayer networks can be constructed to compute any Turing-computable function, but they cannot be trained using the perceptron convergence rule.
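The Hebbian rule can be sketched in a few lines of Python; the function name, learning rate, and activation values here are illustrative assumptions, not taken from the notes:

```python
def hebbian_update(w12, a1, a2, epsilon=0.1):
    """One Hebbian step (a sketch): the weight change is the product of
    the two nodes' activation levels, scaled by a learning rate epsilon.
    All names and values are illustrative."""
    return w12 + epsilon * a1 * a2

# Nodes that are repeatedly active together strengthen their link.
w = 0.0
for _ in range(5):
    w = hebbian_update(w, a1=1.0, a2=1.0)
print(w)
```

Note that the update is symmetric: the weight grows whenever both activations are positive, which is exactly why units that are out of sync (one active, one not) fail to link — the product term is then zero.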
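The perceptron convergence rule and the XOR limitation can also be illustrated with a small sketch (the function names, 0/1 output convention, learning rate, and epoch count are assumptions for illustration): training reaches perfect accuracy on a linearly separable function such as AND, but can never do so on XOR.

```python
def train_perceptron(samples, epochs=25, eta=0.1):
    """Perceptron convergence rule for a two-input threshold unit
    (a sketch): on each error, nudge the weights by eta * delta * input
    and the threshold by -eta * delta, where delta = target - actual."""
    w1 = w2 = 0.0
    t = 0.0
    for _ in range(epochs):
        for (i1, i2), target in samples:
            out = 1 if i1 * w1 + i2 * w2 > t else 0
            delta = target - out
            w1 += eta * delta * i1
            w2 += eta * delta * i2
            t -= eta * delta  # undershoot (delta > 0) -> lower threshold
    return w1, w2, t

def accuracy(samples, w1, w2, t):
    """Fraction of samples the trained unit classifies correctly."""
    return sum((1 if i1 * w1 + i2 * w2 > t else 0) == y
               for (i1, i2), y in samples) / len(samples)

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

acc_and = accuracy(AND, *train_perceptron(AND))  # separable: converges
acc_xor = accuracy(XOR, *train_perceptron(XOR))  # not separable: stays < 1
print(acc_and, acc_xor)
```

Because no single line separates XOR's positive cases from its negative ones, the weights simply oscillate on XOR no matter how many epochs are run, which is the Minsky–Papert limitation the notes describe.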
