Hearing Flashcards
(58 cards)
Pinna
Outer, visible part of the ear
- responsible for funnelling sound waves into the ear canal and for elevation localization
- unique to every human
Tympanic membrane
Membrane at the end of the ear canal; it vibrates when stimulated by pressure waves and in turn moves the ossicles.
Ossicles
Bones of the middle ear
- transfer sound energy from the air medium to the fluid medium (impedance matching)
- malleus, incus, stapes
Oval window
- also called vestibular window
- between middle and inner ear
- covered by membrane
- vibrations get translated to the fluid in our inner ear
Inner ear
- 3 semicircular canals
- cochlea
- cochlear nerve
- vestibular nerve
- round/cochlear window
Auditory transduction
Vibrations of the oval window -> travelling wave along the basilar membrane -> movement of the cilia, which “rub” against the tectorial membrane -> change in the firing rates of auditory nerve cells
Cilia
Hair-like projections at the tips of the hair cells
Organ of Corti
- receptor organ of hearing
- inside of the cochlea, on the basilar membrane
- made up of hair cells and auditory nerve endings
Tonotopic map
- The cochlea/basilar membrane is organized so that high-frequency sounds are detected towards the base and low frequencies at the apex of the structure -> tonotopic map, with similar frequencies adjacent to one another.
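The place-frequency relationship above can be sketched with Greenwood's place-frequency function; the constants below are the commonly cited fits for the human cochlea and should be read as illustrative assumptions, not exact anatomy.

```python
def greenwood_frequency(x):
    """Greenwood's place-frequency function for the human cochlea.

    x: position along the basilar membrane as a fraction
       (0 = apex, 1 = base).
    A, a, k are the commonly cited human fits (illustrative values).
    """
    A, a, k = 165.4, 2.1, 0.88
    return A * (10 ** (a * x) - k)

# Low frequencies map to the apex, high frequencies to the base:
print(greenwood_frequency(0.0))  # apex  -> roughly the low end of hearing
print(greenwood_frequency(1.0))  # base  -> roughly the high end of hearing
```

Note how the exponential form packs similar frequencies next to each other along the membrane, matching the tonotopic map on this card.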
Place-code
Different frequencies cause activity at specific locations in the auditory system.
Timing-code
Different sound frequencies trigger correspondingly different firing rates. -> Neurons have V-shaped tuning curves.
Volley principle
Neurons are “phase-locked”. If the frequency is too high for a single neuron to fire on every cycle, other neurons locked to the same phase fire during the times when that neuron is in its refractory period.
Phase locking
Means that a neuron is tuned to fire every time the pressure wave is at a certain stage of its cycle.
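The volley principle above can be sketched as a toy simulation; the frequency, firing-rate ceiling, and whole-cycle timing below are illustrative assumptions, not physiological measurements.

```python
import math

def cycles_covered(freq_hz, n_neurons, max_rate_hz=500):
    """Fraction of wave cycles (over one second) that get at least one spike.

    Each neuron is phase-locked: it may only fire at the same phase of every
    cycle, but must wait out a refractory gap between spikes. max_rate_hz is
    an assumed ceiling on a single neuron's firing rate.
    """
    gap = math.ceil(freq_hz / max_rate_hz)  # cycles a neuron waits between spikes
    next_free = [0] * n_neurons             # cycle index when each neuron is ready
    covered = 0
    total = int(freq_hz)                    # one phase-locked moment per cycle
    for c in range(total):
        for i in range(n_neurons):
            if next_free[i] <= c:           # neuron is out of its refractory period
                next_free[i] = c + gap
                covered += 1
                break                       # one spike per cycle is enough
    return covered / total

print(cycles_covered(2000, 1))  # a single neuron misses most cycles
print(cycles_covered(2000, 4))  # a volley of four covers every cycle
```

The point of the sketch: no single neuron can follow a 2 kHz tone at a 500 Hz firing ceiling, but several neurons locked to the same phase jointly cover every cycle.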
What happens if a sound gets very loud?
Because tuning curves are broad (the preferred frequency is only “preferred”, not exclusive), neurons tuned to other frequencies also start to fire.
-> We can’t distinguish between frequencies very well if the sound gets too loud.
- In the cochlea, the travelling wave runs all the way to the apex and winds back around, exiting through the round window.
Auditory Processing streams
As with the visual pathways, there are two auditory processing streams:
Where? -> Dorsal
What? -> Ventral
-> Both end up in the prefrontal cortex
Physical and Psychological properties of sound
Physical -> Psychological
Frequency -> Pitch
Amplitude -> Loudness
Complexity (waveform shape) -> Timbre
Audible spectrum
20 - 20k Hz
Minimum audibility curve
Curve that plots how many decibels are necessary at each frequency for a human to perceive the sound
-> lowest threshold at ~2.75 kHz
-> extreme frequencies need more dB to be audible
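The decibel values on the audibility curve (and the ~140 dB pain threshold) are usually dB SPL, computed relative to a reference pressure of 20 µPa; a minimal sketch of the conversion:

```python
import math

P_REF = 20e-6  # reference pressure: 20 micropascals (approx. threshold of hearing)

def db_spl(pressure_pa):
    """Sound pressure level in dB SPL relative to 20 micropascals."""
    return 20 * math.log10(pressure_pa / P_REF)

print(db_spl(20e-6))  # the reference pressure itself -> 0 dB SPL
print(db_spl(2e-3))   # 100x the reference pressure  -> 40 dB SPL
```

Because the scale is logarithmic, every tenfold increase in pressure adds 20 dB, which is why the audible range from threshold to pain spans such a huge pressure ratio.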
Does the Fourier analysis include phase information?
No - the amplitude spectrum keeps only the magnitude at each frequency; the phase information is discarded.
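A minimal numpy check of this card: two tones that differ only in phase have identical amplitude spectra but different waveforms (sampling rate and tone frequency are illustrative choices).

```python
import numpy as np

fs = 1000                                   # sampling rate in Hz (assumed)
t = np.arange(fs) / fs                      # one second of samples
a = np.sin(2 * np.pi * 10 * t)              # 10 Hz tone
b = np.sin(2 * np.pi * 10 * t + np.pi / 2)  # same tone, phase-shifted

mag_a = np.abs(np.fft.rfft(a))              # amplitude spectra: phase discarded
mag_b = np.abs(np.fft.rfft(b))

same_spectrum = bool(np.allclose(mag_a, mag_b))
same_waveform = bool(np.allclose(a, b))
print(same_spectrum, same_waveform)         # spectra match, waveforms do not
```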
Pain threshold of sound
~ 140dB
Three representations of sound
- Time domain / Waveform
- Frequency domain / Spectrum
- Time-frequency domain / Spectrogram
Time domain / Waveform
X-Axis: Time
Y-Axis: Amplitude
Frequency domain / Spectrum
X-Axis: Frequency
Y-Axis: Amplitude
Spectrogram
X-Axis: Time
Y-Axis: Frequency
Color scale: Amplitude (red = high; blue = low)
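The three representations on these cards can be sketched with numpy; the sampling rate, tone frequency, and window length below are illustrative assumptions.

```python
import numpy as np

fs = 8000                                  # sampling rate in Hz (assumed)
t = np.arange(fs) / fs                     # one second of time axis
x = np.sin(2 * np.pi * 440 * t)            # waveform: a 440 Hz tone

# 1) Time domain / waveform: amplitude over time (the array x itself).

# 2) Frequency domain / spectrum: amplitude per frequency via the FFT.
spectrum = np.abs(np.fft.rfft(x)) / len(x)
freqs = np.fft.rfftfreq(len(x), d=1 / fs)
peak_hz = freqs[np.argmax(spectrum)]       # the spectrum peaks at the tone

# 3) Time-frequency domain / spectrogram: FFTs of short, successive windows,
#    giving amplitude as a function of both time (rows) and frequency (cols).
win = 256
frames = x[: len(x) // win * win].reshape(-1, win)
spectrogram = np.abs(np.fft.rfft(frames, axis=1))

print(peak_hz)
print(spectrogram.shape)
```

The trade-off the spectrogram card implies shows up here directly: longer windows give finer frequency resolution but coarser time resolution, and vice versa.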