Box 5.3: The Neural Substrates of Auditory–Visual Integration During Speech Perception: A Combined fMRI and TMS Study of the McGurk Effect
- McGurk effect: illusion in face-to-face speech perception; brain fuses auditory and visual signals
- Method: present an audio recording of the syllable /ba/ w/ a video recording of a face/mouth pronouncing /ga/
- Result: most ppl perceive a blend, the syllable /da/
- Explanation: the brain integrates the 2 competing sensory signals by adopting an intermediate interpretation
- Alveolar /da/ is midway b/w labial /ba/ (sound) and velar /ga/ (visual)
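The "intermediate place of articulation" idea above can be made concrete in a tiny sketch. This is illustrative only: the front-to-back ordering of articulation places is standard phonetics, but the averaging rule is just a toy model of the fusion, not how the brain computes it.

```python
# Toy sketch of the McGurk fusion: places of articulation ordered
# front-to-back in the mouth.
places = ["labial", "alveolar", "velar"]

heard = "labial"  # auditory /ba/
seen = "velar"    # visual /ga/

# The fused percept /da/ sits at the intermediate place:
fused = places[(places.index(heard) + places.index(seen)) // 2]
print(fused)  # alveolar
```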
A Double Dissociation Between Comprehension and Repetition: Initial Evidence for Separate Streams of Speech Processing
- Visual processing begins in the occipital lobe, then splits into 2 channels
- Channel 1: enters the ventral temporal cortex
- Aka the “what” path; provides info on shape, color, and texture to recognize the object
- Channel 2: runs dorsally through the superior parietal cortex to the premotor cortex
- The “how” path: responsible for visual-motor transformations; helps coordinate bodily interaction w/ objects
- Ex. reach out and grasp apple
- Evidence
- Study: double dissociation
- 2 visual streams can be selectively impaired
- Damage to the “what” path disrupts the ability to perceive and identify visually presented objects
- Ex. Patient DF cannot say whether a pencil is oriented vertically or horizontally
- But she can reach out and grasp it
- Damage to the “how” stream disrupts the ability to act appropriately on visually presented objects, though patients can still recognize them
- Ex. patient w/ optic ataxia
- They aim in the wrong direction when reaching to grasp objects
- But they can recognize the object perfectly
- double dissociation cases in speech processing
- Ex. focal brain damage can selectively impair comprehension (knowing “what” is said: the content) or repetition (knowing “how” it is said: the vocal action)
- Ex. patients w/ transcortical sensory aphasia: can’t understand the meaning of words and sentences, but can repeat them perfectly
- Ex. patients w/ conduction aphasia: understand perfectly, but are terrible at repetition
- This suggests that after early cortical stages of speech perception, info is further processed in 2 separate streams
- Route 1: links phonological representations w/ the lexical-semantic system
- Route 2: links phonological representations w/ the motor-articulatory system
- IOW: Dual Stream Model
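As a memory aid, the two routes and the aphasia syndromes from the notes above can be laid out in a small data structure. The field names are mine, and pairing each syndrome with a route is my inference from which ability each lesion impairs; this is a study sketch, not standard terminology.

```python
# Hypothetical summary table of the Dual Stream Model, built from the
# notes above. Key/field names are invented for illustration.
dual_stream = {
    "route 1 (lexical-semantic)": {
        "links": ("phonological representations", "lexical-semantic system"),
        "lesion syndrome": "transcortical sensory aphasia",  # inferred pairing
        "impaired": "comprehension",
        "spared": "repetition",
    },
    "route 2 (motor-articulatory)": {
        "links": ("phonological representations", "motor-articulatory system"),
        "lesion syndrome": "conduction aphasia",  # inferred pairing
        "impaired": "repetition",
        "spared": "comprehension",
    },
}

for route, info in dual_stream.items():
    print(f"{route}: {info['lesion syndrome']} impairs {info['impaired']}")
```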
- Double dissociation studies of auditory comprehension tasks vs. auditory monitoring tasks
- Auditory comprehension task: word-pic matching
- Auditory monitoring task: discriminate and identify phonemes
- Ex. Miceli et al. (1980)
- Gave auditory comprehension and monitoring tasks to 70 aphasia patients
- Comprehension task: match words/pics
- 6 pictures:
- the target
- a semantically related distractor
- a phonologically related distractor
- 3 unrelated distractors
- Discrimination/monitoring task: make same-different judgments on pairs of syllables
- Set: prin, trin, krin, brin, drin, grin
- Results: double dissociation
- Some were perfect at both tasks
- Some were impaired at both tasks
- Some were good at the comprehension task but poor at the discrimination task
- Some were good at the discrimination task but poor at the comprehension task
- Summary: the two abilities can dissociate in both directions; e.g. some ppl do well on the auditory comprehension task but poorly on the auditory monitoring task
- Ex. a patient matches “cat” w/ a “cat” picture, but can’t tell apart “cat” vs “cot”
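The four outcome groups in the Miceli-style results can be expressed as a small classifier. The cutoff value and the example scores below are entirely hypothetical; the sketch just makes the four-way double-dissociation pattern concrete.

```python
# Hypothetical sketch: sorting aphasic patients into the four outcome
# groups of a Miceli-style double-dissociation study.
# The 0.80 cutoff and the example scores are invented for illustration.

def classify(comprehension_acc: float, discrimination_acc: float,
             cutoff: float = 0.80) -> str:
    """Label a patient by which of the two auditory tasks is spared."""
    comp_ok = comprehension_acc >= cutoff
    disc_ok = discrimination_acc >= cutoff
    if comp_ok and disc_ok:
        return "both spared"
    if comp_ok:
        return "comprehension spared, discrimination impaired"
    if disc_ok:
        return "discrimination spared, comprehension impaired"
    return "both impaired"

# Example: a patient who matches "cat" to the cat picture (good
# comprehension) but cannot tell "cat" from "cot" (poor discrimination).
print(classify(comprehension_acc=0.95, discrimination_acc=0.55))
# prints "comprehension spared, discrimination impaired"
```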