Roshan Singh
Mr. Joordens
Chapter 10 Notes
- Our use of language is behaviour. Spoken words are, by themselves, highly organized patterns of motor
activity. This is seen in the case of Carl Bennett, who has Tourette's syndrome: on their own, his movements are
simple behaviours such as twitching and finger tapping, but when organized into language they can
communicate the complex steps of a surgical procedure. Bennett was able to suppress the random words
whenever he needed to communicate clearly and precisely.
Our use of language can be private (talking to oneself), but language developed through social
interactions among our early ancestors. Speaking and writing are social behaviours: we learn them from
other people and use them to communicate with others.
We also use language as a tool in our own remembering and thinking. Language also enables us to think
about very complex and abstract issues by encoding them in words and then manipulating the words
according to logical rules.
Linguists have studied the rules of language and have described precisely what we do when we speak or write.
Psycholinguistics: A branch of psychology devoted to the study of verbal behaviour. Researchers in this
field are more concerned with human cognition than with the particular rules that describe language.
They are more interested in how children acquire language i.e. how verbal behaviour develops and how
children learn to speak from their interactions with adults. They also study how adults use language and
how verbal abilities interact with other cognitive abilities.
Human vocalizations contain enough information that we can recognize individuals from the sounds of
their speech. We can filter out non-speech sounds, such as coughs, within an individual's vocalizations.
The auditory system recognizes the patterns underlying speech rather than just the sounds themselves.
Belin, Zatorre, and Ahad (2002) used fMRI scans and found that some regions of the brain responded
more when people heard human vocalizations (speech and non-speech) than when they heard only natural
sounds. Regions where there was a large difference were located in the temporal lobe, on the auditory
cortex. The auditory area of the left hemisphere showed a greater contrast between natural speech and
speech that had been scrambled in frequency, suggesting that the left hemisphere plays a larger role in
analyzing the detailed information of speech.
Phoneme: The minimum unit of sound that conveys meaning in a particular language, such as /p/.
Voice-onset time: The delay between the initial sound of a consonant (such as the puffing sound of the
phoneme /p/) and the onset of vibration of the vocal cords.
Voicing is the vibration of the vocal cords. The delay in voicing that occurs when you say "pa" is very
slight: only 0.06 seconds.
Phonemic discriminations begin with auditory processing of the sensory differences, and this occurs in
both hemispheres. Scott, Blank, Rosen and Wise (2000) identified some of these areas in the left
hemisphere using PET scans. Some areas responded to both natural and unintelligible speech, while
others responded only to speech that was intelligible, even if it was highly distorted. The latter regions of
the auditory cortex rely on information that transcends the distortions of individual phonemes. Perhaps
this information is based on larger segments of speech such as that provided by syllables. Ganong (1980)
found that perception of a phoneme is affected by the sounds that follow it (the "kiss"/"gift" experiment: an ambiguous sound between /g/ and /k/ is heard as /k/ before "iss" but as /g/ before "ift").
The results suggest that we recognize speech sounds in pieces larger than individual phonemes.
These larger units of speech are established by learning and experience. Sanders, Newport and Neville
(2002) had people listen to a continuous stream of sounds composed of short syllabic
sounds spliced together. They then took some of the sound sequences from the stream, such as dutaba, and
treated them as words. Sanders and
her coworkers found that when people had learned these nonsense sounds as words, the words evoked the N100
response (even though no other auditory cues set them apart).
In addition to learning the units of speech (words), we also learn their content. Context affects the perception
of words through top-down processing.
If we want a listener to understand our speech, we must follow the rules of language. We must use words
with which the listener is familiar and combine them in specific ways. All languages have a syntax, or grammar.
Syntactical rule: A grammatical rule of a particular language for combining words to form phrases,
clauses and sentences.
Syntax provides important information. For example, a linguist could analyze the sentence "a little girl
picked the pretty flowers" and identify the part of speech of every word; but linguists and English
teachers could understand such a sentence even before they learned names such as article and noun phrase.
Our understanding of syntax is automatic: we are not conscious of the process, just as a child learning to
ride a bicycle is not conscious of the laws of physics.
The syntactical rules are learned implicitly. Later, we can be taught to talk about these rules and recognize
their application, but this ability is not needed to speak and understand the speech of others.
Knowlton, Ramus and Squire (1991) found that patients with anterograde amnesia were able to learn an
artificial grammar even though they had lost the ability to form explicit memories. In contrast, Gabrieli,
Cohen and Corkin (1988) found that such patients are unable to learn the meanings of new words. Thus,
word learning and syntax appear to involve different types of memory and, consequently, different brain mechanisms.
A person need not learn to categorize nouns and verbs in order to recognize and use them appropriately.
Regardless of where a word is located in a sentence, we have no trouble identifying what the word
refers to.

Function word: A preposition, article, or other word that conveys little of the meaning of a sentence but
is important in specifying its grammatical structure.
Content word: A noun, verb, adjective, or adverb that conveys meaning.
Affix: A sound or group of letters that is added to the beginning of a word (prefix) or to its end (suffix).
Epstein (1961) presented people with strings of nonsense words; people more easily remembered the
strings that included affixes.
Semantics: The meanings, and the study of the meanings, represented by words.
Just as function words help us determine the syntax of a sentence, content words help us determine its meaning.
Prosody: The use of changes in intonation and emphasis to convey meaning in speech beyond that
specified by the particular words; an important means of communicating emotion.
Deep structure: The essential meaning of a sentence, without regard to the grammatical features
(surface structure) of the sentence that are needed to express it in words.
Surface structure: The grammatical features of a sentence.
People with the language disorder known as conduction aphasia have difficulty repeating words and phrases,
but they can understand them: they retain the deep structure (meaning) of other people's speech, but are
unable to retain its surface structure.
Most psychologists disagree with Chomsky about the particular nature of the cognitive mechanisms
through which deep structure is translated into surface structure.
Script: The characteristics (events, rules, and so on) that are typical of a particular situation; assists the
comprehension of verbal discourse.
Schank and Abelson (1977) suggested that this knowledge of the world is organized into scripts (examples:
the bar and aspirin scenarios).
Mechanisms involved in perceiving, comprehending and producing speech are located in different areas of
the cerebral cortex.
To produce meaningful speech, we must convert perceptions, memories and thoughts into speech. The
neural mechanisms that control speech production appear to be located in the frontal lobes.
Broca's aphasia: Severe difficulty in articulating words, especially function words, caused by damage
that includes Broca's area, a region of the frontal cortex on the left (speech-dominant) side of the brain.
Damage restricted to the cortex of Broca's area does not appear to produce Broca's aphasia; the damage
must extend to surrounding regions of the frontal lobe and to the underlying subcortical white matter.