4- Multi-sensory processing Flashcards

1
Q

What are multisensory interactions?

A

interactions in which the brain combines bits of information across multiple sensory modalities. Useful illusions (e.g. the McGurk effect, ventriloquism) illustrate that the brain combines information in this way, and studying them reveals how we put that information together.

2
Q

Sensory processing:

Historically, how were senses viewed?

A

as separate ‘modules’,

-each operating independently to provide us with unique information about the external world.

This modular view has been fuelled by findings of distinct cortical areas specialised for processing different types of sensory input.

-But there are also areas of overlap between what the cortical areas represent (e.g. the spatial position of objects and the timing of events)

3
Q

Finish the sentence:

Our perception is actually shaped by interactions between different … modalities from different cortical areas of the brain.

A

sensory

4
Q

Name all of the sensory modality areas:

A
5
Q

Multisensory interactions:

Which effect occurs when the auditory component of one speech sound is paired with the visual component of another, leading to the perception of a third, distinct sound?

e.g. if you hear the sound “ba” while watching a video of someone saying “ga”, you might perceive the sound as “da” or “tha”.

What you hear depends on what you see.

A

the McGurk effect

6
Q

Multisensory interactions in spatial perception:

Which effect occurs when visual and auditory stimuli are presented simultaneously at different spatial locations, such that the perceived location of the auditory stimulus is shifted towards the visual location?

Ventriloquists exploit this to create the impression that their voice is emanating from the dummy’s mouth. We also experience it whenever we go to the cinema or watch television.

A

Spatial ventriloquism

-location of sound is pulled towards location of visual event

6
Q

In the ventriloquist effect, perceived auditory location shifts towards the visual location, but perceived visual location is largely unaffected by the auditory stimulus. What does this tell us about the effect?

A

the effect is typically asymmetric

7
Q

Multisensory interactions in temporal (time) perception:

What term is used to describe this process:

The perceived timing of visual stimuli is biased towards that of asynchronously presented auditory stimuli
The visual stimuli have little or no effect on the perceived timing of auditory stimuli

So the direction of the interaction is reversed

(e.g. sequences of brief auditory tones/clicks presented alongside flashing lights: the observer experiences the timing of each visual event as being pulled towards the nearest auditory event)

A

Temporal ventriloquism

7
Q

Which hypothesis holds that when there are discrepancies between sensory estimates (e.g. spatial or temporal estimates that don’t agree, as in ventriloquism), the conflict should be resolved in favour of the modality that is most appropriate for the task at hand (i.e. we rely on whichever sense is better at making that particular judgement)?

A

Modality appropriateness hypothesis

8
Q

According to the Modality appropriateness hypothesis:

… is typically much better than auditory spatial acuity, so vision ‘captures’ the spatial location of auditory stimuli.

A

Visual spatial acuity

8
Q

According to the Modality appropriateness hypothesis:

… is typically much better than visual temporal acuity, so audition ‘captures’ visual stimuli in time.

A

Auditory temporal acuity

9
Q

Which sensory modality performs well in both domains, with spatial and temporal acuity sitting between those of audition and vision?

A

Touch

10
Q

What rules govern multisensory interactions?

Recent research has shown that these rules are quite flexible, and depend on the balance of … for a given judgement

A

unimodal sensitivities

-For example, Alais & Burr (2004) measured participants’ ability to localise auditory clicks and visual blobs of different sizes
-Depending on blob size, visual sensitivity was either better than (4°), equivalent to (32°) or worse than (64°) auditory sensitivity
-Next they measured the perceived location of pairs of spatially discrepant auditory and visual stimuli (i.e. the ventriloquist effect)
-Found that the perceived location of the bimodal stimulus was determined by vision when small blobs were used (4°), but by audition when the visual stimulus was large and difficult to localise (64°). When sensitivities were matched (32°), perceived location was mid-way between the two.

10
Q

Is how we weight the different sensory cues fixed?

A

No
-the weighting is not fixed by modality (e.g. vision always dominating space); it also depends on the sensitivity of each cue at that particular moment in time

11
Q

Which theory proposes that we form a weighted average of what our senses are telling us, providing a framework for understanding the integration of sensory cues?

-proposes that the brain forms a weighted average of the estimates obtained from each sensory modality, with weights depending on the sensitivity of each modality (e.g. weighting vision more heavily when its spatial estimate is more reliable than audition’s)

-the theory can predict performance over a range of different tasks

A

Maximum Likelihood Estimation

e.g. when an auditory stimulus is presented at one location, we make an estimate of the sound’s position, but that estimate will have some uncertainty associated with it.
If we also present a small visual stimulus, we make a visual estimate, but generally its uncertainty is lower (we point to its location more consistently across trials).

  • Using the maximum likelihood formula, we can predict exactly where the combined auditory/visual stimulus will be localised when the two are presented together: the combined estimate lies closer to the visual location, because the visual estimate carries less uncertainty than the auditory one (which was harder to localise)
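The weighted-average computation described above can be sketched in code. This is a minimal illustration of maximum likelihood cue combination; the estimate positions and variances are hypothetical values, not data from the lecture.

```python
def mle_combine(est_a, var_a, est_v, var_v):
    """Combine auditory and visual location estimates (degrees).
    Each cue's weight is inversely proportional to its variance (uncertainty)."""
    w_a = (1.0 / var_a) / (1.0 / var_a + 1.0 / var_v)
    w_v = 1.0 - w_a
    combined = w_a * est_a + w_v * est_v
    # The combined variance is lower than either unimodal variance,
    # capturing the precision benefit of integration.
    combined_var = 1.0 / (1.0 / var_a + 1.0 / var_v)
    return combined, combined_var

# Hypothetical trial: uncertain auditory estimate at 10 deg (variance 4),
# precise visual estimate at 0 deg (variance 1).
loc, var = mle_combine(est_a=10.0, var_a=4.0, est_v=0.0, var_v=1.0)
# loc is pulled strongly towards the reliable visual cue,
# and var is smaller than either unimodal variance.
```

Note how the same formula reproduces the Alais & Burr pattern: with equal variances the combined estimate falls mid-way, and whichever cue has the lower variance dominates.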
12
Q

What is the use of the maximum likelihood formula?

A

It allows us to predict exactly where a combined auditory/visual stimulus will be localised when its components are presented together:

the best estimate lies nearer to the cue with the lower uncertainty

13
Q

Maximum Likelihood Estimation:
Finish the sentence:

Larger weight is assigned to estimates with … uncertainty (when sensitivity is high).

(weighting is inversely proportional to uncertainty)

A

low uncertainty

14
Q

Maximum Likelihood Estimation:
Finish the sentence:

Smaller weight is assigned to estimates with … uncertainty (when sensitivity is low).

(weighting is inversely proportional to uncertainty)

A

high uncertainty

14
Q

Benefits of multisensory integration:

When a single event gives rise to sensory cues in different modalities, what are the two main benefits of combining these cues?

A

1) resolves discrepancies associated with internal (neural) and external (environmental) noise, thus helping to maintain a unified percept of the world

(sensory modalities pick up noisy information from the environment; integration effectively filters out small discrepancies, e.g. we are not usually aware that cues from a single event reach us at slightly different times)

2) increases the precision (i.e. reduce the uncertainty) of perceptual judgements

(the combined estimate we end up with is more precise and less uncertain compared to individual estimates themselves)

14
Q

What is the name for the delays that arise because light and sound take different amounts of time to reach (and be processed by) our senses?

A

intrinsic delays

15
Q

Why do we need to be careful when combining information across multiple senses?

A

Combining sensory cues is only appropriate if the cues relate to a common cause out there in the world

When sensory cues relate to different sources, integrating them would not be beneficial (and could be hazardous in some scenarios).

eg. Visual estimate of dog location
but Auditory estimate of meow location

=separate estimates should be formed based on each sensory modality.

15
Q

In the scenario of having a visual estimate of dog location but having an auditory estimate of meow location, separate estimates should be formed based on each sensory modality.

But how does the brain decide when to integrate and when to segregate?

A

By assessing how different those estimates are, balancing the benefits and costs of integration.

-Integration is restricted to multisensory signals that occur close together in space and time

15
Q

Fill in the sentence:

Multisensory interactions occur when conflicting information presented to one sensory modality alter … presented to another sensory modality

A

perception of a stimulus

16
Q

Fill in the sentence:

The … effect = visual lip movements alter the perception of auditory speech

A

McGurk effect

17
Fill in the sentence: ... occurs when the perceived location of one stimulus (e.g. an auditory click) shifts towards the location of another stimulus presented at the same time (e.g. a visual flash)
Spatial ventriloquism
18
Fill in the sentence: ... occurs with asynchronous presentations, when the perceived timing of one stimulus (e.g. a visual flash) shifts towards that of another (e.g. an auditory click)
Temporal ventriloquism
19
Fill in the sentence: The direction and magnitude of multisensory interactions depends on the relative reliability of the sensory ... from each modality (modality appropriateness hypothesis / maximum likelihood integration models), and the degree of ... conflict
estimates, spatial and/or temporal
19
Multisensory recalibration (info slide): What happens if there is a consistent discrepancy between sensory estimates? (adaptation findings from barn owls, also applied to humans)
Neurons in the optic tectum of the barn owl respond to both auditory and visual stimuli and have co-localised spatial receptive fields. A main way the owl localises a sound source (e.g. a mouse) is via interaural time differences:
-Depending on the location of the sound source, sound arrives at the owl’s two ears at slightly different times (and slightly softer at the far ear)
-The owl uses these differences in timing to localise the mouse (the stimulus)
-Within the optic tectum there are neurons tuned to particular time differences, with spatial receptive fields that overlap the corresponding visual location
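The interaural time difference cue can be illustrated with a simple geometric sketch. This uses a far-field, straight-path approximation, and the ear-separation value is an illustrative assumption rather than a barn-owl measurement.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air

def itd_seconds(azimuth_deg, ear_separation_m=0.05):
    """Approximate extra travel time to the far ear for a distant source.

    azimuth_deg: 0 = directly ahead, positive = off to one side.
    Far-field approximation: path difference = separation * sin(azimuth).
    """
    return ear_separation_m * math.sin(math.radians(azimuth_deg)) / SPEED_OF_SOUND

# A source directly ahead arrives at both ears simultaneously (ITD = 0),
# matching the ~0 tuning of most optic tectum neurons; off-centre sources
# produce ITDs on the order of tens of microseconds.
```

Under these assumed values, a source straight ahead gives an ITD of 0, and one at 30° gives roughly 70 microseconds, which is the kind of tiny timing difference the tectal neurons are tuned to.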
19
Multisensory recalibration (info slide, owls): Optic tectum neurons are tuned approximately to a time difference of 0 (a source directly ahead), but what happens when juvenile barn owls are made to wear prism goggles?
The goggles shift the visual image laterally. The owl’s auditory receptive fields shift so as to counteract the introduced audio-visual discrepancy, remapping auditory tuning to bring the two senses back into alignment.
19
Multisensory adaptation (spatial recalibration): Can multisensory recalibration occur over shorter (i.e. adaptation) timescales?
-Period of adaptation: after exposure to a sequence of audio-visual stimuli with a consistent spatial offset (the sound always off to the left of the visual image, though stimulus positions varied across trials), the perceived locations of auditory stimuli are shifted in the direction of the conflict
-Our perception of auditory space changes to minimise that offset: during adaptation, the perceived location of the sound starts to be pulled to the right
-After adapting to an 8° offset, all of the auditory location estimates had shifted. This can happen within minutes (our brains are constantly monitoring and adjusting for discrepancies between the sensory modalities)
19
Repeated exposure to pairs of auditory-visual stimuli with a consistent lag changes the perceived timing of subsequently presented stimuli. (eg. downloading a film on TV and it doesn't match up to the sound correctly). You are really aware of this difference at first, but over time we become less aware of that difference. Is this an example of temporal or spatial recalibration?
temporal recalibration
-For example, following adaptation sequences in which a visual flash precedes an auditory click by 100 ms, a small visual lead (e.g. 20 ms) might then be required for perceived simultaneity
-This shifts our perception of what we experience as simultaneous towards what we experienced during adaptation
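The size of this shift can be sketched numerically. This is a toy model under an assumed proportional-recalibration rule; the 0.2 gain is hypothetical, chosen only to reproduce the 100 ms → 20 ms example from the card above.

```python
def pss_shift(adapted_lag_ms, gain=0.2):
    """Shift in the point of subjective simultaneity (PSS) after adaptation.

    Toy model (assumption): the PSS moves towards the adapted lag by a fixed
    fraction (gain). Positive lag = visual stimulus leads the auditory one.
    """
    return gain * adapted_lag_ms

# After adapting to a 100 ms visual lead, perceived simultaneity now
# requires a small visual lead (under this assumed gain, 20 ms).
```

Real recalibration data are not exactly proportional (see the SOA-dependence discussed below), so this only captures the qualitative direction of the effect.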
19
What are the mechanisms of temporal recalibration?
-Asynchrony adaptation changes the speed at which sounds are processed: reaction times to auditory stimuli speed up to counteract a visual lead, and slow down to counteract an auditory lead
-This account predicts a uniform recalibration of perceived timing (our perception of not just what is synchronous, but of all temporal relationships between sounds and visual events, would be shifted by the same amount)
-(It is the speed at which we process auditory information that changes: e.g. with a constant visual lead we speed up auditory processing to bring the senses into alignment, and with a visual lag we slow auditory processing down)
20
Mechanisms of temporal recalibration: What did researchers find when they measured the effects of asynchrony adaptation on the perception of a wide range of stimulus onset asynchronies (SOAs)?
Rather than being uniform, changes in perceived timing vary systematically as a function of the difference between the adapted and tested SOA (how much perception changes depends both on the timing you adapted to and on the timing of the test stimulus)
20
Mechanisms of temporal recalibration: One theory suggests that the relative timing between two crossmodal events (such as audio-visual stimuli) is represented by the activity of a population of neurons, each tuned to a specific temporal offset between the two events. What is this theory known as?
The population coding account
-this type of explanation is also used for tilt and direction aftereffects, suggesting that multisensory timing might be processed in a manner analogous to simple unisensory properties
21
What type of recalibration is this? Adaptation to a consistent audio-visual ... discrepancy results in shifts in perceived auditory position. These changes act to recalibrate the correspondence between auditory and visual space.
Spatial recalibration
21
What type of recalibration is this? Adaptation to a consistent audio-visual ... discrepancy changes the perception of simultaneity. Current explanations for this effect include a change in auditory processing latency, or selective adaptation of neurons tuned to different audio-visual delays.
Temporal recalibration
22
Neural mechanisms of multi-sensory processing: Multisensory integration has been most thoroughly studied in which sub-cortical area of the brain?
Superior colliculus (SC) -a midbrain structure that receives inputs from ascending visual, auditory and somatosensory pathways as well as descending projections from the cortex (combining info across senses)
23
Neural mechanisms of multi-sensory processing: Superior colliculus (SC) Cells in the superficial (outside) layers are purely ... , but cells in the deep layers are often ... (primarily audio-visual and visual-somatosensory) and sometimes ... (audio-visual-somatosensory)
visual, bimodal, trimodal (degrees of multisensory responsiveness, i.e. responding to two or three sensory modalities)
24
Neural mechanisms of multi-sensory processing: Superior colliculus (SC) A hallmark of multisensory neurons in the SC is super-additivity. What does this mean?
the response to bimodal (eg. auditory and visual stimuli at same time) or trimodal input is often greater than the sum of the unimodal responses.
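Super-additivity is easy to express as a comparison of firing rates. The sketch below uses hypothetical spike rates (not data from any study), and the enhancement index follows the common convention of comparing the bimodal response against the best unimodal response.

```python
def is_superadditive(bimodal, resp_a, resp_v):
    """True if the bimodal response exceeds the sum of the unimodal responses."""
    return bimodal > resp_a + resp_v

def enhancement_pct(bimodal, resp_a, resp_v):
    """Multisensory enhancement relative to the best unimodal response."""
    best = max(resp_a, resp_v)
    return 100.0 * (bimodal - best) / best

# Hypothetical SC neuron: weak unimodal responses (5 and 4 spikes/s)
# yielding a bimodal response of 20 spikes/s -- well above their sum.
# Large enhancement from weak unimodal inputs is exactly what the
# inverse effectiveness rule (below) describes.
superadd = is_superadditive(20.0, 5.0, 4.0)  # True
enh = enhancement_pct(20.0, 5.0, 4.0)        # 300.0
```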
25
Neural mechanisms of multi-sensory processing: Superior colliculus (SC) Name the 3 main rules that studies of multisensory integration in the SC have established regarding super-additivity.
1- Spatial rule 2- Temporal rule 3- Inverse effectiveness rule
26
Neural mechanisms of multi-sensory processing: Superior colliculus (SC) Which rule of multisensory integration in the SC (for achieving a super-additive response) is this describing? The receptive fields of the different sensory inputs to a given multisensory neuron in the SC are spatially aligned with one another, so the maximum response is achieved only with co-located stimuli (stimuli co-aligned in space, falling within the same region of the overlapping receptive fields).
Spatial rule
26
Neural mechanisms of multi-sensory processing: Superior colliculus (SC) Which rule of multisensory integration in the SC (for achieving a super-additive response) is this describing? The inputs must also be approximately aligned in time.
Temporal rule
26
Neural mechanisms of multi-sensory processing: Superior colliculus (SC) Which rule of multisensory integration in the SC (for achieving a super-additive response) is this describing? Super-additivity is strongest when the responses to modality-specific input are weak. (The weaker the unimodal responses, the greater the super-additive enhancement.)
Inverse effectiveness rule
27
Neural mechanisms of multi-sensory processing: Multisensory integration has been identified in which other cortical area of the brain?
Association cortex
28
Neural mechanisms of multi-sensory processing: Which brain region receives converging inputs from multiple sensory modalities, mainly in: 1) posterior portions of the superior temporal sulcus (STS) 2) regions of posterior parietal cortex, particularly the lateral and ventral intraparietal areas (LIP, VIP)?
Association cortex
28
Which 2 brain areas contain neurons with co-localised receptive fields (multisensory neurons whose regions of sensitivity overlap, so they respond to stimuli from the same spatial location)?
Superior colliculus (SC) Association cortex
29
Why is it beneficial for the Superior colliculus (SC) and Association cortex to have co-localised receptive fields?
This enhances the precision and accuracy of sensory perception by combining inputs from multiple neurons
30
Neural mechanisms of multi-sensory processing: primary sensory cortex Between which 2 cortices are there direct anatomical connections (they are not distinct modules; they interact and relay information)?
Primary visual and primary auditory cortices
30
Neural mechanisms of multi-sensory processing: primary sensory cortex Functional imaging results suggest that which cortex is activated during silent lipreading?
auditory cortex
31
Neural mechanisms of multi-sensory processing: primary sensory cortex When people are deprived of input from one sensory modality, cross-modal recruitment occurs: which cortex is recruited by the remaining senses in the blind (with the auditory cortex similarly recruited in the deaf)?
occipital visual cortex
31
Multisensory neurons receiving input from peripheral sensory systems have been identified and extensively studied in the superior colliculus. What is this describing? A- Subcortical B- Association cortex C- Primary sensory cortex
Subcortical
32
A range of multisensory areas have been identified in cerebral cortex, particularly near the junctions of ‘unisensory‘ cortices e.g. superior temporal sulcus (STS) and ventral and lateral intraparietal areas (VIP, LIP). What is this describing? A- Subcortical B- Primary sensory cortex C- Association cortex
Association cortex
32
Some evidence has also been found for direct multisensory inputs to primary sensory cortices. What is this describing? A- Subcortical B- Association cortex C- Primary sensory cortex
Primary sensory cortex