
PSYB57H3 Study Guide - Neuroimaging, Ketch, Prefrontal Cortex

Course Code: PSYB57H3
Professor: George Cree

Chapter 6
Working Memory
Working memory replaces the older concept of short term memory. It is best to retire the term
"short term memory" altogether.
Working memory
1. How is working memory used in cognition?
2. How did the modern view of working memory arise?
3. What are the elements of working memory?
4. How does working memory work in the brain?
5. How might views of working memory change in the future?
If working memory were a capacity of a computer, what component might it
correspond to, and why?
Working Memory: the memory system responsible for the short term mental storage and
manipulation of information.
Analogy: In a computer, there are two means by which information is stored:
hard drives and RAM.
Which of these two is most like LTM? WM?
Working memory is most like RAM, while LTM is most like hard drives.
In RAM, information can be brought into a work space temporarily but quickly, to
manipulate and compute, and then when you are finished, you can wipe RAM clear and
put new information in.
A hard drive is slower to access and the representations put there are intended to stay
there, and not be wiped out.
The central executive controls what happens in the different storage systems. RAM has no
analogue of the central executive, so the RAM analogy for working memory is not entirely
correct, because our WM does have a central executive.


Historically, how have people thought about working memory?
Contrast primary memory with secondary memory (James, 1890).
William James distinguished between primary and secondary memory and how each
linked up with consciousness. Primary memory is directly accessible to consciousness (it
is in your mind, you can think about it). Secondary memory is more like long term
memory, a form of crystallized memory. This memory is not conscious to you unless you
bring it up into primary memory temporarily.
In the 1960s people started talking about long term and short term memory instead of
primary and secondary memory.
The Magical Number Seven, Plus or Minus Two (Miller, 1956). This experiment occurred quite a
bit before the commonly recognized birth of cognitive psychology in the late 1960s.
An investigation of the capacity of STM (now termed working memory)
Reported capacity to be approx. 7 items (people seem to have a working memory span of
around 7; some people are a bit better, some a bit worse. The type of content did not seem to
matter in this experiment; the span remained the same)
But what counts as an item? (an individual letter? Or 7 words with many letters?)
Chunking can cram more info into an 'item' (letters can be bound together into a word, which
becomes a single entity; we can therefore still remember about 7 units of information)
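The effect of chunking can be sketched with a toy example. The strings and the counting helper below are illustrative only (they are not part of Miller's experiment); the point is simply that a chunk occupies one slot of the roughly 7-item span no matter how many letters it contains.

```python
# Toy illustration of chunking (not from the lecture): working memory holds
# about 7 items, and a chunk counts as ONE item regardless of its length.

def span_load(items):
    """Number of working-memory 'slots' a list of items consumes."""
    return len(items)

letters = list("CATDOGSUNHAT")        # 12 unchunked letters: C, A, T, D, ...
words = ["CAT", "DOG", "SUN", "HAT"]  # the same 12 letters as 4 chunks

print(span_load(letters))  # 12 items: over the ~7-item limit
print(span_load(words))    # 4 items: comfortably within the limit
```

The same raw material (12 letters) either overflows the span or fits easily, depending entirely on how it is grouped.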
Historically speaking, what was the proposed relationship between STM and LTM?
The following is a diagram that captures main aspects of the memory system as thought of in the 1960s.
Specifically, this is the Atkinson and Shiffrin model (1968) aka the modal model. Input goes into the
system and into sensory memory. The arrows represent a transfer of information. Sensory memory was
meant to be a very short form of memory, and there was a different store for each different sensory
modality (although the focus was mostly visual at this time). Information is not stored for more than a
few hundred milliseconds in this store. Sensory memory is what allows you to see an entire circle when
a firecracker is spun quickly in a circle. There is rapid decay in this store.
Information goes from here into short term memory. This area holds information for a longer period of
time (from 3-6 seconds without rehearsal). If the individual engages in rehearsal, information can stay in
this store indefinitely.
Information can go from here into long term memory, but it is not so much transferred as faxed:
copies exist in both short term memory and long term memory. This is why the arrow is double
ended. When we need information, it can be pulled from LTM into STM to work with. In this model,
short term memory does not include the active processes that our conception of working memory includes.


A problem with this model comes from cognitive neuropsychology. A patient (the classic case is
KF) was able to acquire new long term memories even though his short term memory span was
essentially zero. The above model cannot explain this: if STM is damaged, there should be severe
problems acquiring LTM, because information flows linearly from one store to the next. However,
this was not the case.
What kinds of experiments did people do to test claims about STM?
Famous paradigm used in many studies:
Goal: test how long information remains available in STM before it decays away
Task: Memorize a string of 3 consonants (e.g., TLM), then recall after a delay (from 3 to 18
seconds). Perform a distracter task (counting backwards by 3s) during the delay to ensure no
rehearsal.
The first graph shows that people cannot remember the information after an 18 second delay. This
distracter task is good because it involves numbers instead of words, and therefore will not,
hypothetically, overwrite the letters that people have been asked to remember. The distracter task,
therefore, is only wiping out the ability to rehearse; the memory for the letters is evaporating
naturally, not being replaced. Decay theories of memory were accepted for a long time.
However, passive decay is probably not responsible for the loss of information from memory.
What is really causing the loss? Possibly proactive or retroactive interference.
The second graph represents the first trial of this experiment. The difference between this graph and the
first is drastic: on the first trial, there is not much decay between the 3 and 18 second delays. On the
third trial, however, recall gets worse, even though the delays are the same. What is really happening
is that more letters are being introduced on each trial, creating interference and making the decay
appear large on average. Information from previous trials is simply interfering with performance on
later trials. Therefore the loss of information is not due simply to decay.