Lecture 8 - Low Level Hearing Flashcards

1
Q

sound waves

A
  • auditory perception begins with sound waves (variations in air pressure)
  • waveform graph = follows the increases and decreases in sound pressure over time. This info is picked up by the ear and enters through the outer ear
  • the physical properties of the sound wave determine its perceptual qualities
  • amplitude = the difference between peaks and valleys; determines perceived loudness
  • frequency = the number of pressure cycles per second (Hz); higher frequency = higher perceived pitch
2
Q

fourier analysis

A
  • everyday waveforms are not sinusoidal. any complex sound waveform can be built from a finite number of sinusoids added together
  • the auditory system has to break the waveform down to represent it in these simpler terms
  • waveforms (simple and complex) are often plotted as frequency spectra (the waveform is converted to show the amplitude of each sinusoidal component), i.e. the level of each frequency present in the sound
  • the lowest frequency present is the fundamental frequency (f0)
  • harmonics are integer multiples of f0 and are found in natural sounds (worked example below)
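A minimal sketch in Python/NumPy (the 200 Hz fundamental and the component amplitudes are made up for illustration) of building a complex waveform by summing sinusoids and reading the frequency spectrum off the FFT:

```python
import numpy as np

fs = 8000                      # sample rate in Hz (arbitrary choice for the sketch)
t = np.arange(0, 1.0, 1 / fs)  # 1 second of time samples

# Complex waveform = sum of sinusoids: a 200 Hz fundamental (f0)
# plus harmonics at integer multiples of f0, with decreasing amplitudes.
f0 = 200
wave = sum(amp * np.sin(2 * np.pi * f0 * k * t)
           for k, amp in [(1, 1.0), (2, 0.5), (3, 0.25)])

# Frequency spectrum: the amplitude of each sinusoidal component in the waveform.
spectrum = np.abs(np.fft.rfft(wave)) / len(wave) * 2
freqs = np.fft.rfftfreq(len(wave), 1 / fs)

# Peaks sit at f0 and its harmonics.
print(freqs[spectrum > 0.1])   # -> [200. 400. 600.]
```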
3
Q

spectrograms

A
  • spectrograms have time on the x-axis and frequency on the y-axis (with level shown by colour/intensity) - they show how the frequency content of a sound changes over time (example below)
  • changes in the frequency content over time change the perceptual experience of pitch
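A small sketch (Matplotlib's built-in spectrogram; the rising tone is invented for illustration) of plotting a spectrogram - time on x, frequency on y, level as colour:

```python
import numpy as np
import matplotlib.pyplot as plt

fs = 8000
t = np.arange(0, 2.0, 1 / fs)

# A tone whose frequency rises from ~300 Hz to ~1200 Hz over 2 s,
# so the spectrogram shows the frequency content changing over time.
sweep = np.sin(2 * np.pi * (300 * t + 0.5 * 450 * t ** 2))

plt.specgram(sweep, NFFT=512, Fs=fs, noverlap=256)
plt.xlabel("time (s)")
plt.ylabel("frequency (Hz)")
plt.show()
```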
4
Q

auditory path

A
  • three stages by which the brain interprets sound waves:
    1. delivering the sound stimulus to the receptors - involves the outer ear, the middle ear and part of the inner ear
    2. converting the physical stimulus into an electrical signal - done in the inner ear (cochlea), which then feeds the CNS
    3. the CNS making sense of the information and inferring perceptual qualities (loudness and pitch)
5
Q

outer and middle ear

A
  • translates air vibrations into liquid vibrations
  • the outer ear (pinna) reflects sound and directs it into the ear canal, which amplifies it on its way to the eardrum
  • the eardrum's vibrations are transmitted by the ossicles of the middle ear to the oval window at the entrance to the inner ear
  • up to this point the vibrations take place in air
6
Q

cochlea (inner ear)

A
  • the cochlear partition contains the basilar membrane, which reacts to liquid vibrations in the scala vestibuli and scala tympani
  • the ossicles knock on the oval window, which sets the fluid of the scala vestibuli and scala tympani vibrating
  • the sound vibrations now travel along this liquid
  • in the middle of the cochlear partition is the organ of Corti (containing the structures that translate vibrations into neural impulses)
  • the basilar membrane vibrates up and down, causing the hair cells' cilia to move laterally; this opens ion channels and lets K+ in, causing depolarisation
7
Q

theories of pitch perception

A
  • suggests the pitch of a complex sound corresponds to its fundamental frequency (f0)
  • removing f0 does not change the perceived pitch - the sound sounds the same = evidence that pitch perception cannot be explained purely by which part of the basilar membrane is vibrated most
  • pitch is still determined by f0 even when f0 is absent
  • we can engage in pattern recognition and reconstruct the missing fundamental from the harmonics (sketch below)
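A hedged sketch (NumPy, values invented for illustration) of the "missing fundamental": a complex with components at 400, 600 and 800 Hz contains no energy at 200 Hz, yet it still repeats with the 200 Hz period, which is what pattern recognition can recover as the pitch:

```python
import numpy as np

fs = 16000
t = np.arange(0, 0.1, 1 / fs)

# Harmonics 2-4 of a 200 Hz fundamental, with the 200 Hz component itself absent.
missing_f0 = sum(np.sin(2 * np.pi * f * t) for f in (400, 600, 800))

# The spectrum shows no peak at 200 Hz...
spec = np.abs(np.fft.rfft(missing_f0))
freqs = np.fft.rfftfreq(len(missing_f0), 1 / fs)
print(freqs[spec > spec.max() / 10])     # -> [400. 600. 800.]

# ...but the waveform still repeats every 1/200 s (the period of the absent f0).
period_samples = fs // 200
print(np.allclose(missing_f0[:-period_samples],
                  missing_f0[period_samples:]))   # -> True
```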
8
Q

tonotopic cochlea

A
  • Georg von Békésy (1960) artificially stimulated ex vivo cochleae to reveal how the cochlea converts vibrational info
  • the basilar membrane responds with a travelling wave - different parts respond differently to different frequencies
  • the wave travels from base to apex
  • the physical properties of the membrane mean different places along it respond preferentially to different frequencies (sketch below):
    > the base is stiff/narrow = activated by high frequencies
    > the apex is flexible/wide = activated by low frequencies
  • so different frequencies are represented by different physical parts of the auditory system
  • as preferred frequency increases, nerve fibres also show wider tuning curves - they respond over a broader range of frequencies, requiring differing amounts of sound level away from their preferred frequency
  • what is the perceptual consequence of increasingly wide tuning curves?
    > higher frequencies are more perceptually similar to one another = a bigger difference between them is needed to tell them apart
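The frequency-to-place conversion is often summarised with the Greenwood function; the sketch below uses its published human constants (an assumption of this example - the formula is not part of the lecture notes) to map position along the basilar membrane onto preferred frequency:

```python
import numpy as np

def greenwood_place_to_freq(x):
    """Greenwood (1990) map for the human cochlea (published constants, used
    here purely as an illustration). x is the proportion of basilar-membrane
    length measured from the apex (0 = apex, 1 = base); returns the
    characteristic frequency in Hz at that place."""
    A, a, k = 165.4, 2.1, 0.88
    return A * (10 ** (a * x) - k)

# Apex (flexible/wide) prefers low frequencies; base (stiff/narrow) prefers high.
for x in (0.0, 0.25, 0.5, 0.75, 1.0):
    print(f"x = {x:.2f} from apex  ->  ~{greenwood_place_to_freq(x):.0f} Hz")
# runs from ~20 Hz at the apex up to ~20700 Hz at the base
```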
9
Q

is pitch derived from time encoding?

A
  • in addition to frequency specificity, auditory nerve fibres are phase locked to the stimulus: an action potential (AP) corresponds only to an increase in sound pressure, so you get a specific pattern of activity
  • each nerve fibre only responds at the peaks, because an AP only occurs when there is an increase in sound pressure, so (sketch below):
    > time intervals between APs are integer multiples of the period of the waveform (the period being the time between peaks)
    > the population of responses across many fibres conveys the sound's frequency
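A minimal sketch (NumPy, idealised fibre with an invented firing probability) of phase locking: spikes occur only at one phase of the cycle and some cycles are skipped, so interspike intervals come out as integer multiples of the period:

```python
import numpy as np

rng = np.random.default_rng(0)

f = 250                 # stimulus frequency in Hz, so the period is 4 ms
period = 1.0 / f

# Idealised phase-locked fibre: on each cycle it fires at the pressure peak
# with probability 0.4, otherwise it skips that cycle entirely.
cycles = np.arange(200)
fires = rng.random(cycles.size) < 0.4
spike_times = cycles[fires] * period       # every spike locked to the same phase

intervals = np.diff(spike_times)
print(np.unique(np.round(intervals / period)))   # -> integer multiples: [1. 2. 3. ...]
```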
10
Q

is pitch derived from place encoding?

A
  • phase locking is a favourable explanation because it is reliable up to about 5 kHz, and human pitch perception breaks down above 5 kHz - the sense of pitch above this is lost
  • the ability of APs to keep up with the rises and falls in sound pressure also breaks down around 5 kHz
11
Q

two theories of how perceptual experience of pitch emerges

A
  1. place theory
    - there is a frequency-to-place conversion in the cochlea
    - the perceived pitch is related to the place of maximum response on the basilar membrane, or is recovered by pattern recognition
  2. time theory
    - the time pattern of neural impulses reflects the frequency of the stimulus (phase locking)
    - the perceived pitch is related to the time intervals between nerve spikes
  • in reality both contribute to pitch perception (Yost, 2009)
12
Q

sound localisation

A
  • sounds from all locations are combined into a single waveform at the ear
  • so the auditory system needs additional cues to localise them
  • sounds can be localised across 3 dimensions:
    1. azimuth (left-right)
    2. elevation (up-down)
    3. distance
  • each requires different cues - some use info from one ear, some from both
13
Q

representing sound in the azimuth

A
  • relies on binaural cues
  • interaural level difference (ILD)
    > sound reaching the contralateral ear is attenuated (its amplitude is lower because it is shadowed by the head)
    > works better for higher frequencies
  • interaural time difference (ITD)
    > sound reaching the contralateral ear arrives later because it travels further
    > works better for lower frequencies
    > the onset is later in one ear
  • both of these cues use both ears, so they require central processing in the CNS to integrate them (sketch below)
  • Wightman & Kistler (1992) manipulated the ITD of broadband sounds, so that the sound at the left ear arrived later than, or at the same time as, at the right ear, and asked participants to locate the sounds; ILD was held constant. At baseline, judged position can be plotted against actual position. When ITD was fixed at 0 for all sounds, people gave the same response and heard the sounds as clustered at 0 (so ITD, not ILD, mattered here). When ITD was fixed at +/-45, the slope effect was lost and clusters were seen = a response bias
  • so perceptual judgement of azimuth is determined by ITD
  • subjects become more accurate when low frequencies are removed from the sounds = a dual mechanism:
    1. ITDs are dominant when low frequencies are present
    2. ILDs are dominant when (only) high frequencies are present
    - manipulation of ITD has its greatest effect on the perception of lower frequencies
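A rough sketch of the ITD cue using the common spherical-head (Woodworth) approximation - the head radius, speed of sound and the formula itself are textbook assumptions rather than lecture material:

```python
import numpy as np

HEAD_RADIUS = 0.0875    # metres, roughly an average adult head (assumed)
SPEED_OF_SOUND = 343.0  # metres per second in air

def itd_seconds(azimuth_deg):
    """Interaural time difference for a spherical head (Woodworth approximation):
    the path to the far ear is longer by r * (sin(theta) + theta)."""
    theta = np.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (np.sin(theta) + theta)

for az in (0, 30, 60, 90):
    print(f"azimuth {az:2d} deg  ->  ITD ~ {itd_seconds(az) * 1e6:4.0f} microseconds")
# grows from 0 straight ahead to roughly 650 microseconds at 90 deg; the ILD cue
# instead reflects the head-shadow level drop, which is large only when the
# wavelength is shorter than the head (i.e. at high frequencies)
```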
14
Q

elevation

A
  • spectral cues from the pinna and head-related transfer functions work together
    > the pinna and head affect how sound waves are absorbed, reflected and diffracted - this creates frequency-specific distortions that vary depending on the elevation of the sound source
  • the head and pinna filter sound, and not all sound frequencies are treated the same (sketch below)
  • everyone's head and pinna are a different size and shape, so the filtering process is individual and the auditory system has to learn it
  • Hofman et al. (1998) gave people "new ears", changing the shape of the pinna (so elevation info was diminished), and showed that:
    > elevation perception was impaired
    > elevation perception could be relearned over a period of days
    > horizontal judgements were not affected
  • this shows neural plasticity in the auditory system
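A very rough sketch of the idea that the pinna imposes elevation-dependent, frequency-specific filtering: a single notch whose centre frequency shifts with elevation stands in for a real head-related transfer function (the notch frequencies and Q are invented for illustration):

```python
import numpy as np
from scipy import signal

fs = 44100

def toy_pinna_filter(sound, elevation_deg):
    """Illustrative stand-in for an HRTF: attenuate a narrow frequency band
    whose centre frequency moves with elevation (values invented for this sketch)."""
    notch_hz = 6000 + 40 * elevation_deg          # pretend the notch shifts with elevation
    b, a = signal.iirnotch(notch_hz, Q=8, fs=fs)  # narrow band-stop filter
    return signal.lfilter(b, a, sound)

# The same broadband noise "heard" at two elevations ends up with notches at
# different frequencies - the kind of spectral difference the system must learn.
noise = np.random.default_rng(0).standard_normal(fs)
at_0_deg = toy_pinna_filter(noise, 0)
at_45_deg = toy_pinna_filter(noise, 45)
```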
15
Q

distance

A
  • the brain combines several contextual cues to infer the distance of a sound:
    > sound level - decreases with distance, changing loudness
    > sound spectrum - high frequencies are attenuated more with distance, so far-away sounds contain relatively more low frequencies
    > direct-to-reverberant energy ratio - decreases with distance
  • Mershon & Bowers (1979) - when the source is closer the sound path is shorter and more direct; the further you are from the source, the more environmental reverberation is present
  • direct sound level decreases as a function of distance while reverberant energy increases with distance (the sound bounces more before reaching you), especially indoors (sketch below)
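A back-of-envelope sketch (all levels invented for illustration) of how two of these cues behave: the direct sound follows the inverse-square law (about 6 dB lost per doubling of distance) while indoor reverberant energy stays roughly constant, so the direct-to-reverberant ratio falls with distance:

```python
import numpy as np

def direct_level_db(distance_m, level_at_1m_db=70.0):
    """Inverse-square law: the direct sound loses ~6 dB per doubling of distance.
    (The 70 dB reference at 1 m is an arbitrary illustrative choice.)"""
    return level_at_1m_db - 20 * np.log10(distance_m)

REVERB_LEVEL_DB = 55.0   # assume a roughly constant reverberant level indoors

for d in (1, 2, 4, 8):
    direct = direct_level_db(d)
    d_to_r = direct - REVERB_LEVEL_DB        # direct-to-reverberant ratio in dB
    print(f"{d} m: direct {direct:.0f} dB, D/R ratio {d_to_r:+.0f} dB")
# 1 m: 70 dB, +15 dB ... 8 m: 52 dB, -3 dB - both cues signal increasing distance
```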
16
Q

auditory pathway

A
  • the main central pathway includes:
    > cochlear nucleus - deals with info from one ear; there is one on each side
    > superior olivary nucleus - can make binaural comparisons; the first stage to combine left and right input, so it supports azimuth localisation
    > inferior colliculus - info passes on here; maps auditory space
    > medial geniculate nucleus - similar to the IC but contains more advanced maps & is modulated by attention
    > primary auditory cortex
17
Q

ITD neurons

A
  • algorithmic models of ITD coding consist of coincidence detectors
  • Jeffress (1948) model: a neural circuit consisting of numbered coincidence detectors; each preferentially responds to a sound somewhere along the horizontal azimuth (toy sketch below)
  • the axon from each ear spreads out and connects to every neuron, forming a delay line - it takes longer for a neural impulse to reach neuron 9 than neuron 1; the same happens for the right ear in the opposite direction
  • the neuron in the middle of the array receives input from both ears at the same time = it can fire
  • if a sound comes from the right it reaches the right ear first and travels along the right-ear axon before the left ear's signal can, so the left ear's AP occurs later
  • a neuron only fires when it receives info from each ear at the same time
  • Konishi et al. (1988) found these in owls - the owl equivalent of the superior olivary nucleus is set up in the same way, with different paths and axon lengths, similar to the Jeffress model
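A toy sketch of a Jeffress-style delay line; the number of detectors, the per-step conduction delay and the coincidence window are all invented for illustration. Which detector receives both ears' spikes at the same time indicates the ITD, and hence the azimuth:

```python
import numpy as np

N_DETECTORS = 9        # detectors 0..8 across the array (arbitrary size)
STEP_DELAY = 0.0001    # extra conduction delay per detector along an axon: 100 us (assumed)
WINDOW = 0.00002       # spikes within 20 us of each other count as coincident (assumed)

def best_detector(itd):
    """Index of the detector where left- and right-ear spikes coincide.
    itd > 0 means the sound reached the right ear first, so the left-ear
    spike starts itd seconds later. The left ear's delay line runs from
    detector 0 to 8, the right ear's in the opposite direction."""
    left_start, right_start = max(itd, 0.0), max(-itd, 0.0)
    left_arrival = left_start + STEP_DELAY * np.arange(N_DETECTORS)
    right_arrival = right_start + STEP_DELAY * np.arange(N_DETECTORS)[::-1]
    mismatch = np.abs(left_arrival - right_arrival)
    return int(np.argmin(mismatch)) if mismatch.min() <= WINDOW else None

print(best_detector(0.0))      # straight ahead -> the middle detector (index 4)
print(best_detector(0.0002))   # from the right (left ear 200 us late) -> index 3
```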
18
Q

tonotopy

A
  • the inferior colliculus is the next critical hub - here we get a representation of auditory space
  • it integrates binaural and spectral info to form a spatial map
  • the inferior colliculus also encodes frequency in a tonotopic map - neurons respond more to certain sound frequencies
  • Schreiner & Langner (1997) studied this in the cat - deeper neurons represent higher sound frequencies
  • the human primary auditory cortex (A1) contains a tonotopically organised map of frequency-sensitive neurons (analogous to V1's maps of space, orientation preference etc.), but there is no auditory spatial map here
  • Formisano et al. (2003) - playing sounds at different frequencies produces different responses across A1
  • preferred frequency follows a U shape across A1: lower in the middle and higher at the ends