MIDTERM NOTES: CHAPTER 10 – LANGUAGE

Published on 3 Aug 2012
School: UTSC
Department: Psychology
Course: PSYA01H3
SPEECH AND COMPREHENSION
Perception of Speech
Recognition of Speech Sounds
Recognizing speech means recognizing the patterns underlying it, not just the sounds themselves
Using fMRI, researchers found that some brain regions responded more when hearing human vocalizations
than other natural sounds
When analyzing the detailed information in speech, the left hemisphere plays the bigger role
Analysis of speech begins with phonemes: minimum unit of sound that conveys meaning
Voice onset time: delay between initial sound of a consonant and the onset of vibration of
the vocal cords
Is one distinction between phonemes we can detect; allows us to distinguish between p and b
Delay between the p and ah sounds in pa is 0.06 seconds
Ganong: perception of a phoneme is affected by what follows it. A computer-generated sound between g and k
was perceived as g or k depending on whether it was followed by ift or iss. We recognize speech sounds in pieces larger than phonemes
(Morphemes: smallest units of meaning in language)
Recognition of Words in Continuous Speech: The Importance of Learning and Context
Sanders: when played a continuous line of nonsense syllables, people who had been told to study the “nonsense words”
showed an electrical brain response (N100) at the onset of each word
Context affects perception of words through top-down processing
Context can also be non-textual and non-verbal
Understanding the Meaning of Speech
Syntax
Syntactical rules: the grammatical rules of a particular language for combining words to form
phrases, clauses, and sentences
Understanding of syntax is automatic; we are no more conscious of it than a child is of physics when riding
a bike
fMRI shows that as syntax becomes more ambiguous or complex, the brain becomes more active
Syntactical rules are learned implicitly, but people can later be taught to talk about these rules and recognize their
applications
Knowlton, Ramus, and Squire: patients with anterograde amnesia were able to learn an artificial
grammar even though they had lost the ability to form explicit memories
Gabrieli, Cohen, and Corkin: such patients were unable to learn the meanings of new words
Learning syntax and learning word meanings involve different types of memory and, thus, different brain
mechanisms
Syntactical rules are signalled by word order; word class; function words (prepositions, articles, and other
words that convey little meaning but are important to the grammatical structure of a sentence) and
content words (nouns, verbs, adjectives, and adverbs that carry meaning); affixes; word meanings
(semantics); and prosody (the use of stress, rhythm, and pitch)
Relation Between Semantics and Syntax
Chomsky suggested newly formed sentences are represented in the brain as deep structure (the meaning,
without grammatical features) and are transformed into surface structure (the words actually spoken)
Fromkin: slips of the tongue (an usher saying “may I sew you to your sheet” instead of “show you to your seat”)
Conduction aphasia: difficulty repeating words and phrases, even though they can be understood
Knowledge of the World
Comprehension of speech also involves knowledge of the world and the situations we encounter
Schank and Abelson suggested this knowledge is organized into scripts: the characteristics typical of a
particular situation, which assist comprehension of verbal discourse
Brain Mechanisms of Verbal Behaviour
Brain damage studies and PET studies show that perceiving, comprehending, and producing speech are
located in different areas of the cerebral cortex
Speech Production: Evidence from Broca’s Aphasia
To produce meaningful speech, we must convert perceptions, memories, and thoughts into speech
The neural mechanisms that control speech production are located in the frontal lobes (Broca’s area)
Damage to a region of motor association cortex in the left frontal lobe disrupts the ability to speak
Broca’s aphasia: severe difficulty in articulating words, especially function words