PSY274 Lecture 5 (Oct 8)
Department: Psychology
Course: PSY274H5
Professor: Craig Chambers
Semester: Fall
PSY274 Lecture 5 - Gestures in human communication

- Starting points
  o Notion of "channels" of communication
     Vocal-auditory (speech) OR manual-visual (sign)?
- Structure of topics
  o Standing features
     Appearance, clothes, bodily adornment
  o Broader aspects of "body language"
     Proxemics, haptics
  o Eyes and faces
     Attentional cueing and eye gaze
     Emotional expression
     Audiovisual integration in speech processing
  o Hands
     Emblems
     Co-speech gesture
- Standing features of interaction
  o Appearance, clothes, bodily adornment
     Jewelry, makeup, and tattoos are examples of adornment
  o What do/can these "communicate"?
  o Are they iconic? Indexical? Symbolic?
- Aspects of "body language": proxemics, haptics
  o Parameters for the spacing between individuals, or rules governing touch
     How close you are expected to be to someone
     Violations of a comfort zone or personal space
     Just as there are rules for spacing, there are also boundaries for touch
  o Usually depend on cultural and social factors, though with some individual-level variation
- Eyes and faces
  o Eye gaze
     Possible because of characteristics of the human eye
     Human eye: the sclera surrounding the pupil and iris lacks pigmentation; the visible portion of the eye is comparatively large for body size
     Not true of other primates; contrast with the gorilla: pigmented sclera, big body yet smaller eyes
     Makes it easy to see where people are gazing, because more of the eye is visible
  o Upshot: only the human eye gives easy information about where a person is looking
  o "Cooperative eye hypothesis"
     The human eye evolved in a way that optimizes our ability to track the gaze of others
- Tests of gaze-following in humans and nonhuman primates (Tomasello et al., 2007)
  o Gorilla, bonobo, chimpanzee, human infant
  o Viewed videos of faces in different conditions:
     Face and eyes straight ahead (control)
     Eyes closed, head tilts upward
     Head stable, eyes look upward
     Both eyes and head look upward
  o Human infants: tend to shift their own gaze only following changes in viewed EYE position
  o Great apes: tend to shift their own gaze in response to changes in viewed HEAD position
     Coarse cues
     Suggests that other primates don't use eye-gaze cues the way humans do
- Are we "wired" to shift our attention in response to the gaze shifts of others? ("reflexive/automatic")
  o Attention-cueing task: press the L or R button depending on where a "+" appears; reaction time (RT) is measured
  o A schematic face with schematic eyeballs sits in the middle of the screen, and the eyeballs sometimes shift to the left or right
  o The eye-gaze cue is either valid (predictive of the L/R position) or invalid
  o Results: valid-cue condition: RT faster compared to baseline
  o Invalid-cue condition: RT slower compared to baseline
  o Important point: the effect occurs even after many trials, when the participant should have learned that the gaze cue is not predictive overall (50% valid, 50% invalid). Automatic?
     For a time, researchers were convinced it was automatic
  o Beyond the lab: gaze-driven attentional shifts are now NOT believed to be "automatic"
     E.g., when people are thinking about something, they sometimes look away for a second, but we don't look in that direction because we know they're just thinking, so it is not truly automatic
  o …but there is still a strong link between gaze and communicative behaviour
- Eyes and faces
  o Facial expression
     Emotions
- Research by Paul Ekman
  o Surprising degree of cross-cultural similarity in how basic emotions are expressed on the face
     E.g., anger, disgust, fear, joy, sadness, surprise
  o Contradicted earlier work arguing for culture-specific patterns
  o The set of basic emotions is recognizable even by individuals in isolated cultures shown faces of unfamiliar-looking humans
- Eyes and faces
  o Audiovisual integration in speech perception
     Not only do we see facial expressions and gaze; we can also see the mouth
     Visual information about the articulation of speech accompanies what we hear
- "McGurk effect"
  o Reflects automatic synthesis of visual and auditory information
  o Procedure: create videos with edited audio tracks
     Video image taken from a recording of a speaker repeating a nonsense syllable (e.g., ga-ga, ga-ga)
     Audio track replaced with the same speaker saying a different syllable (e.g., ba-ba, ba-ba)
     Perceptual experience?
  o Visual information changes the way we perceive the speech input
     Lips indicate the sound is a /g/ (or maybe a /d/)
     Ears indicate the sound is a /b/
     Perceptual experience: in this case, neither
  o The effect is very robust, and even children are affected by it
  o Is this "joint processing" of auditory and visual information reflected in the brain?
     Imaging reveals that a silent lip-reading task, like a listening task, activates primary auditory and auditory-association cortices
     The area of overlap is where speech processing occurs in humans
     Suggests there is something deeply rooted underlying the McGurk effect
- Communication and the hands
  o Silent gestures
     Pantomime: charades; French-style mime telling a story
     Pointing
     Specialized conventions: usually specific to a profession or particular task - sports signals, waving a plane in to the gate, astronauts, scuba divers, lifeguards
     "Emblems": more commonly known; not specific to a profession or task, but the meaning is culturally specified - the "hi"/"bye" wave, "crazy", thumbs up, crossed fingers for luck, rude gestures
- Video on emblems
  o Note: what the presenter refers to as "illustrators" are what we will later call co-speech gestures
- Co-speech gesture
  o Gestures that accompany speech; tend to reflect patterns that are comparatively similar across cultures
  o Meaning/function can be understood only in relation to speech
  o Only recently have these become the topic of focused research using a wide variety of experimental methodologies
  o Today:
     Some broad observations on the link with speech
     Classification: different types of co-speech gesture
     For whose benefit do we gesture?
     Relationship to information conveyed by speech
- Links between co-speech gesture and speech
  o Development
     Initial state: infants produce communicative gestures, no speech
     Then: communicative gestures with unrelated speech
     Then: related speech and communicative gestures produced together; temporal synchrony is achieved just before the '2-word stage'
     Gesture and speech express the same meaning at the same time around when children become able to combine words; perhaps a developmental shift
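The gaze-cueing task described above (valid cues speed responses, invalid cues slow them, relative to baseline) can be illustrated with a minimal simulation. This is a hypothetical sketch, not code from any actual study: the mean RT of 350 ms, the ±25 ms cue effect, and the noise level are all assumed values chosen only to reproduce the qualitative pattern valid < baseline < invalid.

```python
import random

random.seed(1)  # reproducible simulated data

# Assumed (illustrative) parameters, not values from the lecture or study
BASE_RT = 350.0                                        # baseline mean RT in ms
CUE_EFFECT = {"valid": -25.0, "baseline": 0.0, "invalid": +25.0}

def simulate_trial(cue_type):
    """One simulated reaction time (ms) for a given cue condition."""
    noise = random.gauss(0, 15)        # trial-to-trial variability
    return BASE_RT + CUE_EFFECT[cue_type] + noise

def mean_rt(cue_type, n_trials=200):
    """Average simulated RT over many trials of one condition."""
    return sum(simulate_trial(cue_type) for _ in range(n_trials)) / n_trials

if __name__ == "__main__":
    for cond in ("valid", "baseline", "invalid"):
        print(f"{cond:9s} mean RT = {mean_rt(cond):.0f} ms")
```

Running the sketch prints a faster mean RT for valid cues and a slower one for invalid cues, mirroring the result pattern in the notes; the "many trials" point corresponds to the effect persisting even when valid and invalid trials are equally frequent.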