Similarities and differences between senses - Flashcards

PSY3431 Comparative Approaches in the Study of Brain and Behaviour

Flashcards in Similarities and differences between senses Deck (33)
1

structure of the mammalian ear

see notes

2

structure of the mammalian ear research

Knudsen (2004)

3

Knudsen (2004)

Experience exerts a profound influence on the brain and, therefore, on behavior. When the effect of experience on the brain is particularly strong during a limited period in development, this period is referred to as a sensitive period. Such periods allow experience to instruct neural circuits to process or represent information in a way that is adaptive for the individual. When experience provides information that is essential for normal development and alters performance permanently, such sensitive periods are referred to as critical periods. Although sensitive periods are reflected in behavior, they are actually a property of neural circuits. Mechanisms of plasticity at the circuit level are discussed that have been shown to operate during sensitive periods. A hypothesis is proposed that experience during a sensitive period modifies the architecture of a circuit in fundamental ways, causing certain patterns of connectivity to become highly stable and, therefore, energetically preferred. Plasticity that occurs beyond the end of a sensitive period, which is substantial in many circuits, alters connectivity patterns within the architectural constraints established during the sensitive period. Preferences in a circuit that result from experience during sensitive periods are illustrated graphically as changes in a ‘‘stability landscape,’’ a metaphor that represents the relative contributions of genetic and experiential influences in shaping the information processing capabilities of a neural circuit. By understanding sensitive periods at the circuit level, as well as understanding the relationship between circuit properties and behavior, we gain a deeper insight into the critical role that experience plays in shaping the development of the brain and behavior.

4

tonotopic arrangement of hair cells

see notes

• Sensitivity the same across the basilar membrane
- Deeper sounds = greater wavelength

5

location on basilar membrane defines which hair cells (auditory receptor cells) respond to different sound frequencies

• Cross section of the Organ of Corti (inner ear): ca 20,000 hair cells along basilar membrane
• Inner hair cells - 95% of afferent projections
- Tallest stereocilia in contact with tectorial membrane

see notes

Fettiplace and Hackney (2006)
• Stereocilia displaced, K+ channels stretch open, influx of K+ into hair cell
• Depolarisation (= receptor potential)
- Opening of Ca2+ channels, influx of Ca2+ triggers neurotransmitter release to first-order auditory interneuron

see notes

6

Tympanic ears evolved at least 5 times in the vertebrate line (Schnupp and Carr, 2009; Ladich and Schulz-Mirbach, 2016)

see notes

7

Highly schematic representation of the amniote phylogenetic tree over 400 million years to illustrate the approximate time of origin of particular features of auditory systems

• Mammals - IHC/OHC - inner/outer hair cells
• Lizards - high/low freq hair cells
• Birds, crocs - THC/SHC - tall/short hair cells
• Parallels THC/SHC and IHC/OHC:
○ THCs and IHCs less specialised and receive strong afferent innervation
○ OHCs innervated by few afferent fibres (5%), SHCs receive no afferent innervation
• Amniotes arose from earliest tetrapods early in the Palaeozoic and inherited from them simple hearing organ with cochlear amplifier in stereovillar bundles
• Apart from lineages to turtles and Tuatara, that remained primitive, 3 main lineages to modern amniotes distinguished
• Splitting off first were mammalian ancestors, which gave rise to both egg-laying monotremes and marsupial-placental line
• Later, archosaur line originated and led to dominant land organisms of the Mesozoic
• Only crocodile-alligator and bird groups survived to modern times
• The last group to split off was the lizards and snakes within the lepidosaurs
• The tympanic middle ear originated independently in all groups during Triassic, initiating the evolution of unique configs of papillae, with all groups showing papillar elongation and hair-cell specialisations
• Because the hair-cell popns in the monotreme and marsupial-placental mammal groups are so similar, they arose before lineages diverged
• Same applies to birds and Crocodilla
• In lizards there are family-specific variations, suggesting that these hair-cell popns arose soon after Triassic
- Because monotremes don’t have coiled cochlea, coiling developed in marsupial-placental lineage

see notes

8

comparing vision and hearing

• Červeny et al. (2011)
- Red foxes hunting small animals show a specific behaviour known as ‘mousing’. The fox jumps high, so that it surprises its prey from above. Hearing seems to be the primary sense for precise prey location in high vegetation or under snow where it cannot be detected with visual cues. A fox preparing for the jump displays a high degree of auditory attention. Foxes on the prowl tend to direct their jumps in a roughly north-eastern compass direction. When foxes are hunting in high vegetation and under snow cover, successful attacks are tightly clustered to the north, while attacks in other directions are largely unsuccessful. The direction of attacks was independent of time of day, season of the year, cloud cover and wind direction. We suggest that this directional preference represents a case of magnetic alignment and enhances the precision of hunting attacks.

9

hearing (Konishi, 1973)

• Sound - movement of air particles set in motion by vibrating structure (e.g. string of instrument, membranes/other structures in the body)
• Wave chars of sound - alternate waves of compression and rarefaction of air, molecules move back and forth from regions of high pressure to low pressure - higher freq = shorter wavelengths
• Measures of sound - freq (inversely proportional to wavelength: f = c/λ) and amplitude (measured in decibels) - pressure of air on tympanum
• Most birds hear up to 5-6kHz and the barn owl has exceptional high-freq hearing, with char freqs of 9-10kHz
- More than half of auditory neurons sensitive in range of 5-10kHz
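The frequency-wavelength relation above can be sketched numerically. A minimal Python sketch, assuming the standard speed of sound in air (~343 m/s at 20 °C), showing that deeper sounds have longer wavelengths:

```python
# Relation between frequency and wavelength for sound in air.
SPEED_OF_SOUND = 343.0  # m/s, assumed round value at 20 degrees C

def wavelength(freq_hz: float) -> float:
    """Wavelength in metres for a tone of the given frequency."""
    return SPEED_OF_SOUND / freq_hz

# Lower-pitched sounds = longer wavelengths:
print(round(wavelength(100), 2))   # 100 Hz -> 3.43 m
print(round(wavelength(9000), 4))  # 9 kHz (barn owl's high range) -> 0.0381 m
```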

see notes

Heffner and Heffner (2007)

• Audiograms are measured behaviourally - report if hear sound or not
• Threshold for tone when correctly selected > 50%
• SPL - sound pressure level (curve set to 0 at 1kHz)
• Curve normalised to standard value where at 1 kHz value set to 0 decibels - thresholds can have neg values
• High- and low-freq limits of hearing are defined by the decibel level chosen as cut-off, so the freq limits depend on that cut-off threshold
- Used in diff comparisons or to characterise changes in hearing over age or between indvs/ in many other applications
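The decibel scale used for audiograms is logarithmic. A minimal sketch of the standard dB SPL formula, with the conventional reference pressure of 20 µPa (note the audiogram normalisation in the notes is a separate step - setting the 1 kHz threshold to 0 dB - which is why thresholds can be negative):

```python
import math

P_REF = 20e-6  # reference pressure in pascals (20 uPa, standard for dB SPL)

def db_spl(pressure_pa: float) -> float:
    """Sound pressure level in decibels relative to 20 uPa."""
    return 20 * math.log10(pressure_pa / P_REF)

print(round(db_spl(20e-6)))  # at the reference pressure -> 0 dB
print(round(db_spl(2e-3)))   # 2 mPa (100x the reference) -> 40 dB
```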

see notes

10

psychoacoustics: a subfield of psychophysics (Dent, 2017)

• Audiograms: most common assessment of animal hearing
• Train using classical and operant conditioning
• Measurement of detection thresholds: stim varied in freq and intensity played back to animal - if responds in majority of trials correctly, stim above threshold
• E.g. budgerigar (Melopsittacus undulatus) learns to peck key to start variable waiting interval - trained with rewards and range of loud signal to respond correctly (shaping) - during testing phase, other signal variations interspersed - when can hear signal should peck right key - if not withhold response
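The detection-threshold criterion above (correct response in the majority of trials) can be illustrated with a toy sketch. The data and the simple "lowest level above criterion" rule are hypothetical, not from Dent (2017):

```python
def detection_threshold(levels_db, proportion_correct, criterion=0.5):
    """Lowest tested level at which responses are correct more often
    than the criterion (here: the majority of trials)."""
    for level, p in sorted(zip(levels_db, proportion_correct)):
        if p > criterion:
            return level
    return None  # criterion never exceeded at the tested levels

# Hypothetical budgerigar data: proportion of correct key pecks per level (dB)
levels = [0, 10, 20, 30, 40]
p_correct = [0.1, 0.3, 0.55, 0.9, 0.95]
print(detection_threshold(levels, p_correct))  # -> 20
```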

• Scale set at 0 for humans; other audiograms adjusted accordingly
• Can compare whether animal has more sensitive hearing
• E.g. whether it can hear noises at a much lower intensity threshold (barn owl or cat compared to humans), or whether humans have more sensitive hearing than, for example, turtles, whose hearing is much less sensitive than ours
• Also judge width of function and steepness
- Max, min - make a lot of comparisons between audiograms

see notes

11

Electrophysiology: AEP measurements as non-invasive method for studying hearing functions (Mann et al., 2007)

• AEP: auditory evoked potentials to determine sensitivity threshold for diff sound freqs - electrode on top of head - used for marine mammals and fish - sound played back at diff intensities; record shown here for 400 Hz indicates when a reliable signal is obtained
• Faster and no need to train animal to auditory stim
• Audiograms generated from AEPs instead of ratios of correct behav responses
• Hearing in 8 Canadian freshwater fish: best in fish with connection between inner ear and swim bladder
• Diff impacts of anthropogenic noise pollution
- Either get very strong or no signal at chosen intensities - variation in diff species of fish

see notes

12

Owl moves its head to face a visual or sound target (Hill et al., 2010)

• Movement in space can be represented by angular deviation in 2 directions
○ Horizontal (azimuth) and vertical (elevation) - characterise how the head moves in space
• Align head to face sound source
• Trained to sit on perch
- Coil on head so in electromagnetic field all movements measured

see notes

13

Owl moves its head to face a visual or sound target (Hill et al., 2010) research

Knudsen and Konishi (1978)

14

Knudsen and Konishi (1978)

Auditory units that responded to sound only when it originated from a limited area of space were found in the lateral and anterior portions of the midbrain auditory nucleus of the owl (Tyto alba). The areas of space to which these units responded (their receptive fields) were largely independent of the nature and intensity of the sound stimulus. The units were arranged systematically within the midbrain auditory nucleus according to the relative locations of their receptive fields, thus creating a physiological map of auditory space.

15

how do owls localise sound source (Knudsen et al., 1979)

• Search coil on top of owl's head lies at intersection of horizontal and vertical magnetic fields - movement induces current in search coil
• First viewing direction fixated with sound from zeroing speaker - play the sound back from position
• Head movement towards sound from target speaker measured and accuracy determined
• Head flick delay - 100ms, but sounds of 75ms also elicit flick (open-loop condition)
• Done in dark and soundproof chambers
• Rewarded if locates sound so turns head towards origin of sound
- 1. The dynamics and accuracy of sound localization by the barn owl (Tyto alba) were studied by exploiting the natural head-orienting response of the owl to novel sound stimuli. Head orientation and movement were measured using an adaptation of the search coil technique which provided continuous high resolution azimuthal and elevational information during the behavior. 2. The owls responded to sound sources with a quick, stereotyped head saccade; the median latency of the response was 100 ms, and its maximum angular velocity was 790°/s. The head saccade terminated at a fixation point which was used to quantify the owl's sound localization accuracy. 3. When the sound target was located frontally, the owl's localization error was less than 2° in azimuth and elevation. This accuracy is superior to that of all terrestrial animals tested to date, including man. 4. When the owls were performing open-loop localization (stimulus off before response begins), their localization errors increased as the angular distance to the target increased. 5. Under closed-loop conditions (stimulus on throughout response), the owls again committed their smallest errors when localizing frontal targets, but their errors increased only out to target angles of 30°. At target angles greater than 30°, the owl's localization errors were independent of target location. 6. The owl possesses a frontal region wherein its auditory system has maximum angular acuity. This region is coincident with the owl's visual axis.

see notes

• Location accuracy as function of position of target speaker
• Target speaker in front - error less than 2 degrees
• 0 = position directly in front
• Y = number of errors across trials
• Degree by which animal manages to accurately locate speaker
• Further away sideways, with the target speaker at 70 degrees to the side of the animal, the speaker is located less accurately - error in range of 10 degrees, by which the owl misses accurately pinpointing and facing the speaker - facing direction shifted by 10 degrees
- The closer the speaker is frontally, in the frontal hearing field, the smaller the error - around 2 degrees of error in vertical and horizontal direction

see notes

16

precise prey localisation requires both ears

• Auditory space in front of owl
• (L/R - degrees of azimuth, +/- degrees of elevation)
• Close one of ears - systematic shift
• Position and angle of accuracy
- Ears not quite symmetrical

see notes

Knudsen (2002)

• Sound waveform in right ear delayed and attenuated relative to that in left - reach left ear sooner
• Correspondence of interaural timing diff ITD (b) and interaural level diff ILD (c) values with locations in space for 6 kHz sound - sufficient to detect where in space, relative to owl, the sound source is located
• Plot interaural timing difference (ITD) required to detect stim depending on where in hearing space it is
• Right in front - sound reaches both ears at same time
• Higher up but close to central line = v small
• Further away, more sideways sound source is in hearing space longer delay becomes
• Same for ILD - sound travels over longer distance, some attenuation - tiny diff in intensity between sound arriving at one ear and other ear - systematically varies across entire aural field - no diffs if sound source located directly in front of animal
- A bird sings and you turn to look at it — a process so automatic it seems simple. But is it? Our ability to localize the source of a sound relies on complex neural computations that translate auditory localization cues into representations of space. In barn owls, the visual system is important in teaching the auditory system how to translate cues. This example of instructed plasticity is highly quantifiable and demonstrates mechanisms and principles of learning that may be used widely throughout the central nervous system
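The way ITD grows from zero (source straight ahead) to a maximum (source at the side) can be illustrated with the classic Woodworth spherical-head approximation. This model and the head radius used here are illustrative assumptions, not from Knudsen (2002):

```python
import math

def itd_seconds(azimuth_deg: float, head_radius_m: float = 0.02,
                speed_of_sound: float = 343.0) -> float:
    """Approximate ITD from the Woodworth spherical-head model:
    ITD = (r / c) * (theta + sin theta).
    The head radius is a hypothetical, roughly owl-sized value."""
    theta = math.radians(azimuth_deg)
    return head_radius_m / speed_of_sound * (theta + math.sin(theta))

print(round(itd_seconds(0) * 1e6, 1))   # source straight ahead -> 0.0 us
print(round(itd_seconds(90) * 1e6, 1))  # source at the side -> ~149.9 us
```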

see notes

17

precise prey localisation requires both ears research

Schnupp (2009)

18

Schnupp (2009)

Although ears capable of detecting airborne sound have arisen repeatedly and independently in different species, most animals that are capable of hearing have a pair of ears. We review the advantages that arise from having two ears and discuss recent research on the similarities and differences in the binaural processing strategies adopted by birds and mammals. We also ask how these different adaptations for binaural and spatial hearing might inform and inspire the development of techniques for future auditory prosthetic devices.

19

optic tectum is located in the midbrain of the bird brain

• Sensory info conveyed through midbrain to thal and further into cerebrum
• Auditory midbrain located on inner side of optic tectum (MLD - mesencephalicus lateralis dorsalis)
• Green = songbird's cortex, which dominates bird brain anatomy, functions similarly to human cortex - outer brain shell responsible for controlling perception and some aspects of complex behav - used to think songbirds had only thin and small cortex - believed nearly entire green region controlled only instinctual behav - bird's brain thought to be nearly all instinct driven
• Dark and light blue regions = brain stem, which sit toward back of bird's neck and regulate unconscious behavs - serve as relay stations to cerebral regions (green, brown and orange) - darker blue = midbrain - processing station between thal (light blue), which collects and distributes sensory info, and cerebrum, responsible for higher brain functions such as vocal syntax - midbrain also transmits info between thal and spinal cord
• Yellow = cerebellum, which regulates fine movement controls
- Orange and brown (brain cut in half lengthways) = basal ganglia, which (with cortex) control learning and sequencing of movements - previously believed that primitive operations of this small region extended throughout green area - Jarvis believes that bird's basal ganglia also involved in memory and general learning, and suggests these functions may soon be added to the widely accepted view

see notes

• Ear with hair cells located along basilar membrane; sound impinging on tympanum is amplified and leads to vibrations in inner ear
• In cochlea, basilar membrane is deflected at apex or base depending on whether sound is low- or high-pitched
• Hair cells don't have axons; they synapse directly onto first-order neurons, whose cell bodies form the cochlear ganglion and whose axons project into the cochlear nucleus
• Number of connections shown - into the superior olive, into the lateral lemniscus and from there into the MLD in the midbrain
- From the MLD auditory info passed on to other brain areas inc. cerebrum

20

Measuring the interaural time difference (ITD) in the cochlear nucleus

• Jeffress model: sound location computed from diffs in delay and intensity between 2 ears (Jeffress, 1948)
• Carr and Konishi (1988) confirmed with studies of barn owl basic premises of model
○ Interaural time difference is an important cue for sound localization. In the barn owl (Tyto alba) neuronal sensitivity to this disparity originates in the brainstem nucleus laminaris. Afferents from the ipsilateral and contralateral magnocellular cochlear nuclei enter the nucleus laminaris through its dorsal and ventral surfaces, respectively, and interdigitate in the nucleus. Intracellular recordings from these afferents show orderly changes in conduction delay with depth in the nucleus. These changes are comparable to the range of interaural time differences available to the owl. Thus, these afferent axons act as delay lines and provide anatomical and physiological bases for a neuronal map of interaural time differences in the nucleus laminaris.
• Cochlear nucleus contains imp structure - coincidence detection mechanism
• A neuron listens to signals coming from both ears - each signal takes time to travel - fed into branches acting as delay lines - if spikes arrive at same time, location of sound source can be coded
• Located in the hind brain
• ITD coded in cochlear nucleus using coincidence mechanism - each neuron's output is segregated and can be traced back to correspond to a diff ITD
- Info mapped onto structures in MLD in tectum
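The Jeffress-style place code described above can be sketched as an array of coincidence detectors with graded internal delays, where the detector whose delay cancels the external ITD responds most. The response function and delay values here are illustrative, not from Carr and Konishi (1988):

```python
def best_itd(itd_us: float, delays_us=range(-200, 201, 25)):
    """Toy Jeffress model: each coincidence detector applies an internal
    delay to one ear's input. The detector whose internal delay cancels
    the external ITD receives coincident spikes and responds maximally;
    which detector wins encodes the source location."""
    # Response falls off with mismatch between internal delay and ITD
    responses = {d: -abs(d + itd_us) for d in delays_us}
    return max(responses, key=responses.get)

# A sound arriving 50 us earlier at one ear maximally excites the
# detector whose delay line compensates that lead:
print(best_itd(50))  # -> -50
print(best_itd(0))   # -> 0
```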

see notes

21

Measuring the interaural time difference (ITD) in the cochlear nucleus research

Smith and Price (2014)

22

Smith and Price (2014)

Sound source localization is critical to animal survival and for identification of auditory objects. We investigated the acuity with which humans localize low frequency, pure tone sounds using timing differences between the ears. These small differences in time, known as interaural time differences or ITDs, are identified in a manner that allows localization acuity of around 1° at the midline. Acuity, a relative measure of localization ability, displays a non-linear variation as sound sources are positioned more laterally. All species studied localize sounds best at the midline and progressively worse as the sound is located out towards the side. To understand why sound localization displays this variation with azimuthal angle, we took a first-principles, systemic, analytical approach to model localization acuity. We calculated how ITDs vary with sound frequency, head size and sound source location for humans. This allowed us to model ITD variation for previously published experimental acuity data and determine the distribution of just-noticeable differences in ITD. Our results suggest that the best-fit model is one whereby just-noticeable differences in ITDs are identified with uniform or close to uniform sensitivity across the physiological range. We discuss how our results have several implications for neural ITD processing in different species as well as development of the auditory system.

23

Locations of sound sources are mapped in 2 dimensions onto the MLD

• Auditory space in front of owl
• (L/R - degrees of azimuth, +/- degrees of elevation)
• Inner part of auditory region
• Tonotopic mapping of interneurons (according to freq tuning)
• Outer part - interneurons tuned to 6-8 kHz, but sensitive to spatial location of sound
- Move electrode along neural structures, records tonotopically mapped responses - correspond to particular positions in hearing field of owl

see notes

24

Locations of sound sources are mapped in 2 dimensions onto the MLD research

Heffner and Heffner (2016)

Knudsen and Konishi (1978)

25

Heffner and Heffner (2016)

The ability to locate the source of a sound too brief to be either scanned or tracked using head or pinna movements is of obvious advantage to an animal. Since most brief sounds are made by other animals, the ability to localize such sounds enables an animal to approach or avoid other animals in its immediate environment. Moreover, it can be used to direct the eyes, thus bringing another sense to bear upon the source of the sound. Given the value of sound localization to the survival of an animal, it is not surprising that the need to localize sound has been implicated as a primary source of selective pressure in the evolution of mammalian hearing (Masterton et al. 1969; Masterton 1974).

26

Knudsen and Konishi (1978)

○ 1. The influence of sound location and sound frequency on the responses of single units in the midbrain auditory area (MLD) of the owl (Tyto alba) were studied using a movable sound source under free-field conditions. With this technique, two functionally distinct regions in MLD have been identified: a tonotopic region and a space-mapped region.
○ 2. MLD units were classified according to their receptive-field properties: 1) limited-field units responded only to sound from a small, discrete area of space; 2) complex-field units exhibited two to four different excitatory areas separated by areas of reduced response or inhibition; 3) space-preferring units responded best to a certain area of space, but their fields expanded considerably with increasing sound intensities; 4) space-independent units responded similarly to a sound stimulus regardless of its location in space.
○ 3. Limited-field units were located exclusively along the lateral and anterior borders of MLD. These units were tuned to sound frequencies at the high end of the owl's audible range (5-8.7 kHz). They usually responded only at the onset of a tonal stimulus; but most importantly, the units were systematically arranged in this region according to the azimuths and elevations of their receptive fields, thus creating a physiological map of auditory space. Because of this latter, dominant aspect of its functional organization, this region is named the space-mapped region of MLD.
○ 4. The receptive fields of units in the larger, medial portion of MLD were of the space-independent, space-preferring, or complex-field types. These units tended to respond in a sustained fashion to tone and noise bursts, and these units were arranged in a strict frequency-dependent order. Based on this last property, this region is named the tonotopic region of MLD.
- 5. Because of the salient differences in the response properties of their constituent units, it is argued that the space-mapped region and the tonotopic region are involved in different aspects of sound analysis.

27

Parallel processing of time (interaural time difference ITD) and intensity (interaural level difference ILD)

see notes


• Hearing info processed in 2 parallel pathways, one through magnocellular nucleus and laminar nucleus
• Another that receives input from first-order interneurons of ear passes through angular nucleus - codes for intensity diffs - segregation of ITD and ILD at early stage
• Mapped into ITD and interaural intensity diffs that later converge in MLD in midbrain
- Allows the brain to reconstruct location of sound source as spatial map in auditory space

28

Parallel processing of time (interaural time difference ITD) and intensity (interaural level difference ILD) research

Manley et al. (1988)

29

Manley et al. (1988)

The nucleus ventralis lemnisci lateralis pars posterior (VLVp) is the first binaural station in the intensity-processing pathway of the barn owl. Contralateral stimulation excites and ipsilateral stimulation inhibits VLVp cells. The strength of the inhibition declines systematically from dorsal to ventral within the nucleus. Cells selective for different intensity disparities occur in an orderly sequence from dorsal to ventral within each isofrequency lamina. Cells at intermediate depths in the nucleus are selective for a particular narrow range of interaural intensity differences independently of the absolute sound-pressure level. A simple model of the interaction between inhibition and excitation can explain most of the response properties of VLVp neurons. The map of selectivity for intensity disparity is mainly based on the gradient of inhibition.

30

spatial mapping is projected to cortical areas (Grothe et al., 2010; Yao et al., 2015)

see notes


• Segregate sound info into ITD and ILD
• Pathway similar - have inner ear with superior olivary nucleus, the cochlear nuclei, inferior colliculus in midbrain
• Medial geniculate nucleus (MGN) in thal and auditory cortex
- Projections from both sides interact at level of brainstem to produce comparisons of signal that originate from both ears

Jarvis (2009)

• Evolution of Pallium in birds and reptiles
• Example sensory (auditory) and motor (vocal) pathways in songbirds, in comparison with other vertebrates
• (a) auditory pathway in songbird showing ascending and descending input
• (b) similar auditory pathways, but sometimes with diff nomenclature used for indv nuclei, can be found for all amniotes examined
• Only sub-pathway through cochlea and lateral lemniscal nuclei shown
• once in telencephalon, parallels can be found in cell type connectivity, although pallial organisations diff and projections in amphibians mostly to striatum
• Layers and serial organisation/hierarchical organisation very similar
• Ear with hair cells, cochlear ganglion as first processing layer, cochlear nuclei, the lemniscal nuclei, corresponding structures - the MLD in the tectum of birds and the inferior colliculus in mammals - with projections to what would be the equivalent of cortex in mammals, which would be the cerebrum in birds
- Distantly related groups of vertebrates showing lots of similarities

see notes

• Light and sound propagate as waves that differ in freq and intensity
• Whilst light is absorbed by photoreceptor as quanta, sound vibrates internal structures of ear
• Spatial r'ships of stim in outer world coded through retinotopic mapping in visual pathways - in hearing pathways spatial r'ships largely lost - to use sound for accurate location of sound source, these need to be reconstructed in brain
• Auditory pathways have parallel and serial connections, similar to vision - tonotopic maps result from arrangements of sensory interneurons in cochlea and enable imp binaural comparisons and reconstruction of spatial locations relative to body
• Research in birds contributed fundamental demo of neural mechanisms relevant to human hearing
- Audiograms allow comparisons between species to determine how hearing can be adapted to diff tasks and ecological needs