01:830:305 Lecture Notes - Lecture 14: Central Processing Unit, Long-Term Potentiation, Perceptron
Document Summary
Models of the mind should be based on the actual architecture of the brain. The brain does not have a central processing unit; it has many units working in parallel. Information is not stored in one discrete place; instead it is distributed everywhere, in the weights along the connections. All learning follows a common mechanism: modification of connection weights based on experience. In neuroscience, a long-term change in the degree to which a particular synaptic connection conveys excitation or inhibition is called long-term potentiation (LTP). The analog of LTP in an artificial neural network is the weight on the connection between two nodes in the network. Raising an excitatory weight makes it excite more; lowering it makes it excite less. Raising an inhibitory weight makes it inhibit more; lowering it makes it inhibit less. A weight of zero means the connection has no effect.
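The weight-as-LTP analogy above can be made concrete with a minimal sketch (not part of the lecture; the function name and values are illustrative): a single perceptron-style node sums each input times its connection weight, where positive weights are excitatory, negative weights are inhibitory, and a zero weight contributes nothing.

```python
def node_activation(inputs, weights):
    """Weighted sum of inputs, as in a simple perceptron-style unit."""
    return sum(x * w for x, w in zip(inputs, weights))

# Three equally active input units (illustrative values).
inputs = [1.0, 1.0, 1.0]

# Excitatory (+0.5), inhibitory (-0.3), and disconnected (0.0) weights.
before = node_activation(inputs, [0.5, -0.3, 0.0])

# Raising the excitatory weight (0.5 -> 0.9) increases the node's
# activation, mimicking long-term potentiation of that synapse.
after = node_activation(inputs, [0.9, -0.3, 0.0])

print(before, after)
```

Changing the zero weight to any nonzero value would make that third connection start contributing, just as strengthening a previously silent synapse would.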