COGS 100 Lecture Notes - Lecture 17: Dementia, Parallel Computing, Recurrent Neural Network


Document Summary

Weights (connections) are the things that change over time: they learn and adjust so that the network produces the output you want it to produce. The network has no ready-made, semantically interpretable representations. It is a model of conceptual hierarchy, concept learning, and semantic memory: what things are like and what they are related to. For example, it first differentiates plants from animals, then fish from birds, then robins from canaries, moving from general to more specific distinctions. Activation is in the nodes, but the nodes themselves do not do very much: learning is in the weights, and representation is in the weights. The network builds a model of a particular chunk of the sense-think-act cycle; it is mainly a thinking model, and its processing is parallel rather than serial, passing from layer to layer with many units active at one time.

The network also lacks stable, semantically interpretable representations: earlier in learning a concept is one pattern of activation, and later in learning it is a different pattern. SHRDLU's representations, by contrast, are stable and always represent the same things: language-like rules, e.g. in a symbolic system, how to move an object, how to form sub-goals, and what the current status is. To find out what the network represents, you have to ask (probe) the model; you cannot simply read it off.
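A minimal sketch of these points, not from the lecture itself: a tiny two-layer network in Python/NumPy that maps items to semantic features, loosely in the spirit of the plant/animal, fish/bird, robin/canary example. The specific items, features, layer sizes, learning rate, and epoch counts are illustrative assumptions. Only the weights change during training, and the hidden pattern for an item is different early and late in learning.

```python
# Illustrative sketch (assumed details): "learning is in the weights,
# representation is in the weights" for a small item -> feature network.
import numpy as np

rng = np.random.default_rng(0)

# Four items and five features (living thing, plant, animal, can fly, can swim).
items = ["oak", "rose", "robin", "salmon"]
features = ["living", "plant", "animal", "fly", "swim"]
X = np.eye(4)                      # one localist input unit per item
Y = np.array([[1, 1, 0, 0, 0],     # oak
              [1, 1, 0, 0, 0],     # rose
              [1, 0, 1, 1, 0],     # robin
              [1, 0, 1, 0, 1]])    # salmon

# Weights are the only things that change during learning.
W1 = rng.normal(scale=0.1, size=(4, 3))   # item -> hidden
W2 = rng.normal(scale=0.1, size=(3, 5))   # hidden -> features

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for epoch in range(3001):
    H = sigmoid(X @ W1)            # hidden activations, all units in parallel
    P = sigmoid(H @ W2)            # predicted features
    err = P - Y
    # Backpropagate the error and nudge the weights.
    delta_out = err * P * (1 - P)
    dW2 = H.T @ delta_out
    delta_hid = (delta_out @ W2.T) * H * (1 - H)
    dW1 = X.T @ delta_hid
    W2 -= lr * dW2
    W1 -= lr * dW1
    if epoch in (0, 300, 3000):
        # The hidden pattern for an item is its "representation"; it is one
        # pattern early in learning and a different pattern later.
        print(f"epoch {epoch}: hidden pattern for 'robin' =",
              np.round(sigmoid(X @ W1)[2], 2))

# To find out what the trained network "knows", you have to probe it:
pred_robin = sigmoid(sigmoid(X @ W1) @ W2)[2]
print("robin ->", dict(zip(features, np.round(pred_robin, 2))))
```

Running the sketch prints the robin's hidden pattern at a few points during training, showing that the representation is distributed across weights and shifts as learning proceeds; nothing in the network is a labeled, SHRDLU-style rule that can simply be read off.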
