PSY100H1 (UTSG) Textbook Notes
Instructor: Michael Inzlicht

Chapter 7: Human Memory
-Encoding: forming a memory code (usually requires attention).
-Storage: maintaining encoded information in memory over time.
-Retrieval: recovering information from memory stores.
-Cocktail party phenomenon: attention involves late selection,
based on the meaning of input.
-Working memory: phonological loop (recitation to
temporarily remember something), visuospatial sketchpad
(temporarily hold and manipulate visual images), central
executive system (switching the focus of attention and
dividing attention as needed), episodic buffer (temporary,
limited-capacity store that allows the various components of
working memory to integrate information and that serves as
an interface between working memory and LTM).
-There is no convincing evidence that memories are stored away permanently and that forgetting is entirely a matter of retrieval failure.

-STM and LTM are not separate memory stores. Both
semantic encoding and interference effects have been found
in research on STM.
-Clustering: the tendency to remember similar or related items in groups.
-Conceptual hierarchy: multilevel classification system based on common properties among items.
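As a rough illustration, a conceptual hierarchy can be sketched as a small tree of categories, with items grouped at each level by shared properties (the categories and items below are made up for illustration, not taken from the chapter):

```python
# Toy conceptual hierarchy: a multilevel classification of items
# grouped by common properties at each level.
hierarchy = {
    "minerals": {
        "metals": {"rare": ["platinum", "silver"], "common": ["aluminum", "copper"]},
        "stones": {"precious": ["emerald", "ruby"], "masonry": ["granite", "marble"]},
    }
}

def members(tree):
    """Flatten every item stored anywhere under a node of the hierarchy."""
    if isinstance(tree, list):
        return tree
    return [item for subtree in tree.values() for item in members(subtree)]
```

Organizing items this way mirrors the memory finding the note refers to: recall is better when material is studied in such a hierarchy rather than as an unstructured list.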
-Schema: organized cluster of knowledge about a particular
object or event abstracted from previous experience with the
object or event.
-Semantic network: consists of nodes representing concepts,
joined together by pathways that link related concepts.
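A semantic network can be sketched as a graph of concept nodes with pathways to related concepts; accessing one concept can "spread activation" along its links to nearby concepts. This is a minimal sketch, and the concept names are illustrative, not from the chapter:

```python
# Toy semantic network: each concept node maps to its linked concepts.
network = {
    "bird": ["animal", "wings", "robin"],
    "robin": ["bird", "red breast"],
    "animal": ["bird", "dog"],
}

def spread_activation(start, steps):
    """Return the set of concepts activated within `steps` links of `start`."""
    active = {start}
    frontier = {start}
    for _ in range(steps):
        frontier = {n for node in frontier
                    for n in network.get(node, [])} - active
        active |= frontier
    return active
```

For example, `spread_activation("robin", 1)` activates "bird" and "red breast", which is one way to picture why thinking of one concept primes retrieval of related ones.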
-Connectionist/parallel distributed processing models:
cognitive processes depend on patterns of activation in
highly interconnected computational networks that resemble
neural networks.
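The connectionist idea can be sketched with a toy layer of interconnected units: a "representation" is not a single node but the pattern of activation across all output units, each computing a squashed weighted sum of every input in parallel. The weights and layer sizes below are arbitrary, chosen only to make the sketch runnable:

```python
import math

def activate(inputs, weights):
    """One layer of a toy parallel-distributed network: each output unit's
    activation is a sigmoid-squashed weighted sum of all input activations."""
    return [1 / (1 + math.exp(-sum(w * x for w, x in zip(row, inputs))))
            for row in weights]

# Three input units fully connected to two output units (arbitrary weights).
weights = [[0.5, -1.0, 0.8],
           [1.2, 0.3, -0.4]]

# The distributed pattern across both output units is the "representation".
pattern = activate([1.0, 0.0, 1.0], weights)
```

Because every unit contributes to the pattern, knowledge is distributed across the whole set of connections rather than stored at any single node, which is the contrast with the semantic-network view above.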
-Encoding specificity principle: memory for information is better when the conditions during encoding and retrieval are similar.