Lecture

NROC64: Lec 6 - Multisensory perception (nearly word-for-word what was said in lecture).doc


Department
Neuroscience
Course Code
NROC64H3
Professor
Niemier

NROC64: Lec 6: Multisensory Perception
Slide 5: Intro
Multisensory illusions = how diff senses actually work together
How multisensory integration is implemented in the brain (2 mechanisms)
oWe have areas where diff unimodal areas flow together (cross-talk);
significant cxns btn very early brain areas, between diff modalities
oThe cerebral cortex also has multimodal areas; neurons there have receptive
fields for both types of stimuli
Oscillations that swing together (synchronize) are one way that connectivity can be
established on an ongoing basis; can actually change depending on need
Optimizing perception
oMaximum likelihood estimation = statistical method = integrating info from diff
sources in the best possible way (info comes from diff modalities, and each of
these modalities is imperfect --> combining the 2 modalities in some way gives
a chance of getting something better than either one alone)
oSimple statistical procedure to estimate the best possible solution
--> this closely matches what humans actually do = humans
integrate multiple senses in a near-optimal fashion (although not perfect)
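The maximum likelihood idea can be sketched numerically (a minimal illustration with made-up numbers, not lecture code): each modality's estimate is weighted by its reliability (inverse variance), and the combined estimate ends up more reliable than either cue alone.

```python
# Minimal sketch of maximum likelihood cue combination (illustrative only).
# Each sense gives a noisy estimate of the same quantity (e.g., a location);
# the optimal combined estimate weights each cue by its reliability (1/variance).

def combine_cues(est_a, var_a, est_b, var_b):
    """Optimally combine two independent Gaussian cue estimates."""
    w_a = (1 / var_a) / (1 / var_a + 1 / var_b)  # reliability-based weight
    w_b = 1 - w_a
    combined_est = w_a * est_a + w_b * est_b
    combined_var = 1 / (1 / var_a + 1 / var_b)   # always <= min(var_a, var_b)
    return combined_est, combined_var

# Vision is precise (variance 1), audition is noisier (variance 4):
est, var = combine_cues(10.0, 1.0, 14.0, 4.0)
# vision gets weight 0.8, audition 0.2 -> estimate 10.8, variance 0.8
```

Note the combined variance (0.8) is smaller than either input variance, which is why integrating two imperfect senses beats relying on the better one alone.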
Slide 6: Intro
It makes sense to integrate across diff modalities b/c events and objects convey
several forms of changes in energy that go together; not just one form of NRG
that changes
oAn object may emit light and at the same time give off sound waves; these 2
events go together so it makes sense to integrate them --> organizes and
makes it simpler
oAlso helps to improve the estimate of the direction something is coming from
oOr complementary effect; can see something in front of you but you can’t
see behind you, but you hear something --> something dangerous, can turn
around and add vision to the audition; diff senses will complement each
other
Synergy: the whole is greater than the sum of its parts
oStill possible to understand the whole by studying individual parts
oGraphs:
Visual event as a fxn of time; light is turned off and then it is on
What was recorded in superior colliculus neuron; respond to
visual/auditory/multisensory stimuli
Spike = AP
You get some activity, simply count the # of APs that are
caused by the light turning on
Same thing for audition; have an auditory event = click
Same thing
Have both things at the same time
Light turns on and there’s a click

Get up to ~1200% more APs; neuron responds much more strongly than
the sum of the two unimodal responses = called superadditive
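The spike-count logic in the graphs can be written out as a simple check (a sketch with hypothetical counts, not the lecture's data): a response is superadditive when the multisensory spike count exceeds the sum of the two unimodal counts.

```python
# Hypothetical spike counts for one SC neuron (illustrative numbers only):
visual_only = 4      # APs when only the light turns on
auditory_only = 3    # APs when only the click is played
multisensory = 40    # APs when light and click occur together

def is_superadditive(v, a, m):
    """Multisensory response exceeds the sum of the unimodal responses."""
    return m > v + a

# Enhancement relative to the unimodal sum, in percent:
unimodal_sum = visual_only + auditory_only
enhancement = 100 * (multisensory - unimodal_sum) / unimodal_sum
```

With these made-up counts the combined response is several hundred percent above the unimodal sum, so the neuron would count as superadditive.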
An old mystery: How do our senses work together?
Slide 7:
We have brain areas specialized for vision, sound, somatosensation etc;
But there are illusions that are crossmodal and multisensory that suggest there are
strong interactions between the diff senses; how is it possible to have crossmodal
illusions despite separate brain areas:
oSolution 1: feedforward from unimodal areas to later, multisensory
(multimodal/heteromodal) areas
oSolution 2: have cross-talk btn unimodal areas (lateral connections)
Slide 9: Multisensory illusions; McGurk effect
Under ideal conditions: hear dah dah
Close eyes: hear bah bah
Turn off sounds: see gah gah
Slide 10: McGurk effect
Audio-visual illusion where the sound is bah
These are freq plots of energy (spectrograms); sound analysis
Sound ‘bah’ in terms of freq is plotted on vertical axis; horizontal axis = time; color
coded = power
Get bands of higher energy (formants) which are very typical for human voices = can
tell apart vowels and consonants that way
3 bands
what it illustrates is that these sounds are actually generated by a synthesizer
= a continuum between ‘bah’ at one end and ‘gah’ at the other --> no one is saying
gah, but the video indicates ‘gah’; in the middle is ‘dah’
since produced w/ a synthesizer, each time you hear a sound --> you will have a
very clear opinion of what you’re hearing, even at the transitions --> this is
categorical perception of speech sounds
in audition, we have this categorical perception = depends on language you grew up
w/; we have a way of sorting sounds into discrete categories even if it is on a
continuum
McGurk effect: auditory info = bah and visual info = gah; mix that together --> get
something in the middle --> you don’t say that something weird is going on, you
say you perceive ‘dah’ (in the middle)
It also depends on how clear the 2 modalities are; depending on that, there may be
more weight for auditory or visual info
Slide 11: Oculogravic (gravity) illusion
Another multi-sensory/cross-modal effect
Something that pilot would experience, esp on jet

Illusion caused by linear acceleration/deceleration, which gives a feeling of false
climb/descent respectively; caused by the pilot’s misinterpretation of signals from
the otolith organs
Perceiving gravity uses the same sense organs as perceiving linear acceleration =
otolith organs of the vestibular sys
If you’re stationary, gravity = perceived as arrow pointing down; acceleration is
sensed as an inertial force pointing opposite to it = inertia
Jet that pilot is flying is going horizontally, orthogonal to gravity
Adding the 2 vectors together = get something in between; so pilot gets impression
of flying up even though that’s actually not true
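The vector addition described above can be made concrete (a sketch with assumed numbers): the otoliths sense the combination of gravity and the inertial reaction to acceleration, and the tilt of that combined vector away from vertical is what the pilot misreads as a climb.

```python
import math

# Gravity points down; during forward acceleration the otoliths also sense an
# inertial reaction pointing backward, opposite to the acceleration direction.
g = 9.81       # m/s^2, gravitational acceleration
accel = 5.0    # m/s^2, forward acceleration of the jet (assumed value)

# The otoliths report the combined gravito-inertial vector; its tilt away
# from vertical is misread as the aircraft pitching nose-up:
tilt_deg = math.degrees(math.atan2(accel, g))
# roughly 27 degrees of illusory nose-up pitch for a 5 m/s^2 acceleration
```

The stronger the acceleration, the further the combined vector tilts, and the stronger the false sense of climbing.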
Cross-talk between vision and vestibular sense
Slide 12: Rubber hand illusion
False sense that some rubber hand is part of your body
Recalibration of your body schema; what belongs to our own body and what
doesn’t can actually change
Participant’s left hand covered by screen, there’s a rubber hand; experimenter
brushes over rubber hand and participant’s left hand --> someone attacks rubber
hand w/ a fork, participant w/draws his own left hand
Participant tries to protect the rubber hand because had illusion that rubber hand is
part of his body (natural reflex)
We can readjust what we perceive to be part of our own body (very easily);
something as artificial as rubber hand
Tool use is another example; tools can become incorporated into the body schema
Slide 15: Multisensory neurons in the SC
Superior colliculus = one of the structures involved in visual pathways, relay station
for oculomotor control
SC = computing orientation, move eyes/head (shift attention) --> direct your
processing towards certain location to better perceive from that direction
SC = has multisensory neurons
oSC = dominated by the visual sys; but clearly audition and somatosensation also
have a role here
oMost SC neurons have receptive fields NOT only in the visual domain but
also in auditory and somatosensory
Superadditive effect: if have visual input only or auditory input only, then response
from neuron will be much less than if the neuron were stimulated by both
modalities
oWorks best if there’s spatial congruency (same direction); if object/event is
sending NRG in terms of light AND sound, will be sent from same location
oMost of the time, comes from same direction
As the eyes move, things become more complicated
oEye moves = retinas move also; visual receptive fields of SC neurons move w/ the
eye; these RFs move away somewhere else
oEars don’t necessarily move when you move your eyes
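The complication can be sketched as a coordinate remapping (a toy example with assumed numbers, not a model of actual SC circuitry): visual RFs move with the eye (eye-centered), while a sound's direction is initially head-centered, so aligning the two requires taking current eye position into account.

```python
# Toy remapping of an auditory location into eye-centered coordinates
# (illustrative only; real SC remapping is far more complex).

def auditory_to_eye_centered(sound_azimuth_head_deg, eye_position_deg):
    """Shift a head-centered sound direction by current eye position so it
    lines up with eye-centered (retinal) visual coordinates."""
    return sound_azimuth_head_deg - eye_position_deg

# A sound 20 deg right of the head while the eyes look 15 deg right:
# in eye-centered coordinates the sound is only 5 deg to the right.
```

With the eyes straight ahead the two frames coincide; any eye movement pulls them apart, which is why the SC's auditory and visual maps cannot stay aligned without such a correction.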