PSYC 315 Lecture Notes - Lecture 4: Learning Cycle, Habituation, Bayes Estimator
Document Summary
Different patterns of activation represent different ideas and concepts. This is like an image made of pixels: a particular set of activated pixels forms a specific pattern, and each pixel can take part in many different patterns. Neurons in the brain show this kind of distributed activation. Because the units' activation functions are continuous, we can take derivatives, which is what gradient-based learning requires. A network has far too many parameters for a user or designer to set by hand, so designing networks is more a matter of art than of following a recipe.

The error in the network, viewed as a function of a single connection weight, is (locally) a parabola. A fast learning algorithm would take large steps down this curve, but with backpropagation a step that is too large can make the weight oscillate back and forth across the minimum instead of settling into it.

During training, each hidden unit tries to become a feature detector that contributes to the solution, and the units must implicitly decide which subtask each will handle. If subtask a generates more error signal than subtask b, all the units tend to converge on a and ignore b: the error due to a decreases, but the error due to b can get worse.
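The point about continuous activation functions can be made concrete. A minimal sketch (the choice of the logistic sigmoid here is an illustrative assumption, not something specified in the notes): because the function is smooth everywhere, its derivative exists at every input, and the analytic derivative matches a numerical finite-difference estimate.

```python
import math

def sigmoid(x):
    """Logistic activation: smooth and continuous, so differentiable everywhere."""
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(x):
    """Derivative of the sigmoid, conveniently expressed via the output itself."""
    s = sigmoid(x)
    return s * (1.0 - s)

# Check the analytic derivative against a central finite-difference estimate.
x, h = 0.5, 1e-6
numeric = (sigmoid(x + h) - sigmoid(x - h)) / (2 * h)
print(sigmoid_derivative(x))  # analytic slope at x = 0.5
print(numeric)                # numerical slope -- agrees to many decimal places
```

It is exactly this differentiability that lets backpropagation push error signals through the hidden layers.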
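The oscillation problem described above can be sketched with gradient descent on a toy parabolic error surface E(w) = w² (an illustrative stand-in for the error-versus-weight curve in the notes, not the actual network error). With a small step size the weight decays smoothly toward the minimum; with a large step size it jumps across the minimum and flips sign on every step.

```python
def descend(eta, w0=1.0, steps=8):
    """Gradient descent on E(w) = w**2, whose gradient is 2*w.

    Each update w <- w - eta * 2*w scales w by (1 - 2*eta), so eta controls
    whether the trajectory decays smoothly or oscillates across the minimum.
    """
    w, path = w0, [w0]
    for _ in range(steps):
        w = w - eta * 2 * w
        path.append(w)
    return path

print(descend(0.10))  # small step: smooth, monotone decay toward 0
print(descend(0.75))  # large step: sign flips each step -- oscillation across the minimum
```

With eta = 0.75 the scaling factor is (1 - 1.5) = -0.5, so the weight still shrinks in magnitude but bounces from one side of the parabola to the other, which is the oscillation the notes warn about.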