PSY280H1 Lecture Notes - Sound Pressure, Monaural, Sound Localization

Published on 20 Apr 2013
School: UTSG · Department: Psychology · Course: PSY280H1
CH12 Sound Localization and the Auditory Scene
Auditory Location
AUDITORY SPACE when we perceive objects located at different positions based on their sound, w/o visual cues
o projects from the head in all directions, exists whenever there's sound
o AUDITORY LOCALIZATION the ability to locate objects in space based on their sound
o People can locate the position of a sound along three coordinates:
AZIMUTH extends from left to right
ELEVATION extends up and down
DISTANCE of the sound source from the listener
o Major problem for the auditory system in locating a sound source, in comparison w/ vision:
Visual information for relative locations is contained in the image on the surface of the retina
Audio information stimulates the cochlea based on sound frequencies, which determine the sound's pitch and timbre
Does not contain information about relative locations
o LOCATION CUES used by the auditory system to determine sound location, created by the way sound interacts w/ the listener
Binaural Cues for Sound Location
o BINAURAL CUES cues that depend on information from both ears
2 types: interaural time difference & interaural level difference
both based on comparing the sound signals reaching the left/right ears
o INTERAURAL TIME DIFFERENCE (ITD) based on the difference in when a sound reaches the left and right ears
If sound is in front of/behind the listener, the distance to each ear is the same
If sound is off to one side, it will reach the closer ear first
ITD grows larger as the sound source is located more to one side
ITD is an effective cue for locating low-frequency sounds
o INTERAURAL LEVEL DIFFERENCE (ILD) based on the difference in the sound pressure level of the sound reaching the two ears
Difference in level occurs b/c the head blocks sound from reaching the far ear
ILD grows larger as the sound source is located more to one side
Effect is greater for higher frequencies than for lower frequencies
Higher-freq. sound waves are disrupted by the head; low-freq. sound waves are not, b/c their wavelengths are long enough to bend around the head
ACOUSTIC SHADOW decrease in sound intensity on the far side of the head, away from the sound source, due to the disruption of high-freq. sound waves
o Using Binaural Cues for Perceiving Azimuth Locations
ITD and ILD complement each other
ITD low freq.; ILD high freq.
Enable localization along the azimuth coordinate
Provide ambiguous information about elevation
CONE OF CONFUSION for a sound source off to one side, all points (A & B) on this cone produce the same ILD and ITD
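The ITD cue above can be made concrete with Woodworth's spherical-head approximation, ITD = (r / c)(sin θ + θ). This formula is not in the notes; the head radius and speed-of-sound values below are illustrative assumptions.

```python
import math

def itd_woodworth(azimuth_deg, head_radius_m=0.0875, speed_of_sound=343.0):
    """Approximate ITD (seconds) for a distant source at the given azimuth,
    using Woodworth's spherical-head formula: ITD = (r / c) * (sin(theta) + theta).
    azimuth 0 deg = straight ahead, 90 deg = directly to one side."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / speed_of_sound) * (math.sin(theta) + theta)

# ITD is zero straight ahead and grows as the source moves to one side
for az in (0, 15, 45, 90):
    print(f"azimuth {az:3d} deg -> ITD {itd_woodworth(az) * 1e6:6.1f} microseconds")
```

The maximum ITD (source directly to one side) comes out to roughly 650 µs for an average-sized head, which matches the note that ITD grows with azimuth.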
Monaural Cue for Localization
o MONAURAL CUE cue that depends on info from only one ear, important for locating sound along the elevation coordinate
o SPECTRAL CUE primary monaural cue for localization
Information for localization is contained in differences in the distribution (spectrum) of frequencies that reach the ear from different locations
Differences are caused by the sound stimulus being reflected from the head and w/i the various folds of the pinnae before entering the auditory canal
o Ex. 2 sound sources, one 15° above and one 15° below same ITD and ILD
Differences in the way sounds bounce around within the pinna create different frequency spectra for the 2 locations
o Paul Hofman demonstrated that localization is affected by using a mold, worn for several weeks, to change the inside contours of the pinnae
Also measured the effect after the mold was removed
Ex. blue grid = positions of the sound stimuli presented; red grid = average localization performance
Localization performance poor for the elevation coordinate immediately after the mold was inserted
Localization gradually improved as the person learned the associations btwn the new spectral cues and different directions in space
Localization remained excellent right after the mold was removed b/c training w/ the mold created a new set of correlations btwn spectral cues and locations, while the old correlations were still present
Explanation: different sets of neurons are involved in responding to each set of spectral cues, like separate brain areas for processing different languages
Moving the head provides additional ITD, ILD, and spectral information
o Helps minimize the effect of the Cone Of Confusion
o Vision is also helpful in sound localization
Easier to determine the source when it can be seen
The Physiology of Auditory Localization
How is the information in these cues represented in the auditory system, e.g. for ITD?
NARROWLY TUNED ITD NEURONS neurons tuned to respond best to a narrow range of ITDs
o Found in the Inferior Colliculus and Superior Olivary Nuclei
o ITD tuning curves for narrowly tuned neurons
Neurons associated w/ curves on the left (blue) fire when sound reaches the left ear first; ones on the right (red) fire when sound reaches the right ear first form of specificity coding
o Explanation by Lloyd Jeffress
Series of neurons that each respond best to a specific ITD, wired so that each receives signals from both ears
left-ear signal arrives along the blue axon, right-ear signal along the red axon
Sound directly in front reaches the left & right ears simultaneously
Signals from the left/right ears start along their axons together
As each signal travels along its (red/blue) axon, it stimulates each neuron sequentially, but the neurons don't fire unless:
COINCIDENCE DETECTORS these neurons only fire when the signals from both ears arrive at the neuron simultaneously
For sound from the front, the signals coincide at neuron 5, which fires indicates ITD = 0
For sound reaching the right ear first, the signals coincide at neuron 3, which fires
Note: the side of the neuron that fires is opposite to the ear which received the sound first, and its position is proportional to the ITD magnitude
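The Jeffress delay-line above can be sketched as a toy simulation. The discrete delay steps, 9-neuron array, and sign convention here are illustrative assumptions, not values from the lecture figure:

```python
def jeffress_winner(itd_steps, n_neurons=9):
    """Toy Jeffress delay line: neuron i receives the left-ear signal after
    i delay steps and the right-ear signal after (n_neurons - 1 - i) steps.
    A coincidence detector fires only when both signals arrive at the same
    time, so the index of the firing neuron encodes the ITD.
    itd_steps > 0 means the sound reached the LEFT ear first."""
    left_start = max(0, -itd_steps)   # the ear the sound reaches later starts later
    right_start = max(0, itd_steps)
    for i in range(n_neurons):
        left_arrival = left_start + i                      # travels left-to-right
        right_arrival = right_start + (n_neurons - 1 - i)  # travels right-to-left
        if left_arrival == right_arrival:                  # coincidence detection
            return i
    return None  # no neuron fires for this ITD / array size

print(jeffress_winner(0))   # sound straight ahead -> middle neuron fires
print(jeffress_winner(2))   # left ear leads -> winner shifts toward the opposite side
print(jeffress_winner(-2))  # right ear leads -> winner shifts the other way
```

Running it shows the winning neuron sitting in the middle for ITD = 0 and moving away from the center, on the side opposite the leading ear, in proportion to the ITD, matching the note above.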
BROADLY TUNED ITD NEURONS neurons that are broadly tuned to ITD
o Recent research indicates localization can also be based on neurons that are broadly tuned
Ex. neurons in the gerbil's right hemisphere respond best when sound is coming from the left, and vice versa
Location of a sound is indicated by the ratio of responding of these two types of broadly tuned neurons
Form of distributed coding, similar to color vision
Diagrams:
Left = ITD tuning curves for the broadly tuned neurons in the two hemispheres
Right = patterns of response of the broadly tuned neurons for stimuli coming from the left, front, and right
there's evidence for both narrowly & broadly tuned ITD neurons
o unsure exactly which mechanism, or combination of them, is used
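The ratio code above can be sketched with two broadly tuned hemispheric channels. The sigmoid tuning shapes and the slope value are illustrative assumptions, not data from the gerbil study:

```python
import math

def hemisphere_responses(azimuth_deg, slope=0.05):
    """Two broadly tuned channels: the right-hemisphere population responds
    best to sounds from the left (negative azimuths), the left-hemisphere
    population to sounds from the right. Sigmoid tuning is an illustrative
    assumption for this sketch."""
    right_hemi = 1.0 / (1.0 + math.exp(slope * azimuth_deg))
    left_hemi = 1.0 / (1.0 + math.exp(-slope * azimuth_deg))
    return right_hemi, left_hemi

# Location is read out from the relative responding of the two channels,
# not from any single narrowly tuned neuron (a distributed code)
for az in (-60, 0, 60):
    r, l = hemisphere_responses(az)
    print(f"azimuth {az:4d}: right-hemi {r:.2f}, left-hemi {l:.2f}, difference {l - r:+.2f}")
```

For a sound straight ahead the two channels respond equally; as the source moves to one side the balance tips, so the sign and size of the difference carry the location, analogous to opponent coding in color vision.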
Perceptually Organizing Sounds in the Environment
rarely hear isolated single tones in the environment
o usually experience a number of sounds simultaneously
o How can the auditory system separate one sound from another?
Problem: each sound source produces its own signal, but the signals are combined by the time they enter the ears, as when mixed sources are reproduced by one loudspeaker
AUDITORY SCENE ANALYSIS process by which you separate the stimuli produced by each of the sources in the scene into separate perceptions
o AUDITORY SCENE the array of sound sources in the environment
o Ex. can separate a single voice from background noise
Sound sources' positions in space can potentially help separate the sources from one another
PRINCIPLES OF AUDITORY GROUPING a number of heuristics that help us perceptually organize the elements of an auditory scene
o LOCATION sound created by a particular source usually comes from one position in space or from a slowly changing location
Any two sounds separated in space likely two sources
When a source moves, it typically follows a continuous path, doesn't jump erratically from one place to another
Helps us perceive sound sources in motion (cars, planes)
o SIMILARITY OF TIMBRE AND PITCH sounds w/ the same timbre or pitch range often come from the same source
Ex. music from a flute vs. a trombone
Baroque-period music alternates rapidly btwn high/low tones to emulate two separate melodies from 2 sources
AUDITORY STREAM SEGREGATION when the passage is played rapidly, the low notes sound like a melody from one instrument, the high notes like a melody from another instrument
Albert Bregman & Jeffrey Campbell demonstrated this is based on pitch w/ an experiment
alternated btwn high & low tones perceptually grouped into 2 auditory streams based on pitch
Bregman and Alexander Rudnicky experiment
Listener presented w/ 2 standard tones X and Y; when presented alone, easy to perceive their order
When presented w/ distractor tones (D), difficult
When presented w/ distractor tones and captor tones (C) that are the same pitch as the distractor tones, easy again
Captor tones capture the distractors, forming a stream separate from tones X and Y
Two streams of sound: one a constant repeating note (red), the other a scale that goes up (blue)
Listeners perceive both streams at the same time
At first they're separate, but when the stimuli become similar in pitch, grouping by similarity merges the streams, causing a galloping rhythm
SCALE ILLUSION (MELODIC CHANNELING) present two sequences of notes simultaneously, one to the right ear, the other to the left
Alone, each sequence has alternating pitches
Together, listeners perceive a smooth sequence of notes: the higher sequence in the right ear, the lower sequence in the left ear, even though each ear receives both high and low notes
Which ear is perceived as carrying the higher/lower sequence is determined by the initial note the right ear started w/ the higher note
o PROXIMITY IN TIME
ONSET TIME if two sounds start at different times, likely they're from different sources
For stream segregation by similarity of timbre or pitch, tones w/ similar timbres or frequencies have to occur close together in time
o AUDITORY CONTINUITY sounds that stay constant or that change smoothly are often produced by the same source
Resembles the Gestalt principle of good continuation for vision
Demonstrated by Richard Warren:
Presented bursts of tone interrupted by gaps of silence listeners perceived the tones as stopping during the silence
Presented bursts of tone interrupted by white noise listeners perceived the tone as continuing behind the noise
o EXPERIENCE effect of past experience on the perceptual grouping of auditory stimuli
Ex. demonstrated by presenting the melody of a familiar song w/ notes jumping btwn octaves
First time it's played, listeners find it difficult to identify the song
But after hearing the song as it's meant to be played, the octave-jumping melody becomes easier to identify
MELODY SCHEMA representation of a familiar melody that is stored in a person's memory
When listeners don't know the melody is present, they have no access to the schema, nothing w/ which to compare the unknown melody
When they know the melody is present, they can compare what's being heard to the stored schema and perceive it
