PSYC 315 Lecture Notes - Lecture 4: Learning Cycle, Habituation, Bayes Estimator

11 pages

Document Summary

Different activations enhance different ideas and concepts, much like pixels: certain pixels, activated together, make a specific pattern, and the same pixels can take part in many different patterns. Neurons in the brain show this kind of distributed activation. Because the activation function is continuous, its derivative can be taken. There are too many parameters for the user or designer of the network to set by hand, so designing networks is more a matter of art than of following a recipe. The error in the network is a parabola in the size of a connection weight. A fast learning algorithm would take large steps, but with backpropagation large steps can make the weight oscillate across the optimal value instead of solving the problem. Each hidden unit tries to become a feature detector that contributes to the solution, so the units must decide which subtask each will resolve. If subtask a gives off more error signal than subtask b, then all units will converge on a and ignore b: the error due to a decreases, but the error due to b can get worse.
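The oscillation point above can be sketched numerically. Below is a minimal illustration (the quadratic error E(w) = (w - 2)^2, the starting weight, and the learning rates are all hypothetical choices, not from the lecture) showing that a small gradient-descent step converges smoothly on the parabola's minimum, while an overly large step makes the weight jump back and forth across it:

```python
def gradient_descent(lr, steps=20, w=0.0, target=2.0):
    """Minimize the parabola E(w) = (w - target)**2 with a fixed learning rate."""
    trajectory = [w]
    for _ in range(steps):
        grad = 2.0 * (w - target)  # dE/dw for the parabola
        w = w - lr * grad          # gradient-descent update
        trajectory.append(w)
    return trajectory

small = gradient_descent(lr=0.1)  # approaches 2.0 smoothly from one side
large = gradient_descent(lr=1.0)  # oscillates between 0.0 and 4.0, never settling
```

With lr=1.0 the update maps w to 4 - w, so the weight hops from one side of the minimum to the other forever; the error never decreases, which is the oscillation problem the notes describe.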
