Audition Lecture Notes
Psy 120.3 Lecture 2023/10/11: Audition and Other Senses

Audition

Different forms of energy require different means of transduction. The human ear is divided into three distinct parts. The outer ear consists of the visible ear (the pinna), the auditory canal, and the eardrum, which responds to sound waves gathered by the pinna and channeled into the canal. The middle ear is a tiny air-filled chamber behind the eardrum containing the ossicles (the hammer, anvil, and stirrup bones), which together form a lever that mechanically transmits and intensifies vibrations from the eardrum to the inner ear. The inner ear contains the spiral-shaped cochlea, a fluid-filled tube that is the organ of auditory transduction.

The basilar membrane of the inner ear is the structure that undulates when vibrations from the ossicles reach the cochlear fluid. Its wave-like movement stimulates thousands of tiny hair cells, specialized auditory receptor neurons embedded in this membrane. The hair cells release neurotransmitter molecules, initiating a neural signal in the auditory nerve that travels to the brain.

Three physical dimensions of sound waves correspond to auditory perception (Table 4.3):

Frequency: perception of pitch, measured in cycles per second (Hertz)
Amplitude: perception of loudness, measured in decibels
Complexity: perception of timbre, which allows us to distinguish two sources with the same pitch and loudness

Memorize Fig. 4.20, Anatomy of the Human Ear, for the next exam, especially the areas of the brain dedicated to auditory perception.

Sound is converted to neural impulses in the inner ear.
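The decibel scale for loudness is logarithmic, which a small worked example can make concrete. A minimal sketch, not part of the lecture: the formula is the standard sound-pressure-level definition, and the 20-micropascal reference value (the approximate threshold of human hearing) is an assumption added here.

```python
import math

# Reference sound pressure (pascals): approximate threshold of human hearing.
P0 = 20e-6

def spl_db(pressure_pa):
    """Sound pressure level in dB relative to the hearing threshold."""
    return 20 * math.log10(pressure_pa / P0)

# Each tenfold increase in pressure adds 20 dB.
print(spl_db(20e-5))  # 10x the reference -> 20.0 dB
print(spl_db(0.02))   # 1000x the reference -> 60.0 dB, roughly conversation level
```

Because the scale is logarithmic, a sound that feels only somewhat louder can carry vastly more physical energy.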
Cochlea: fluid-filled tube containing cells that transduce sound vibrations into neural impulses
Basilar membrane: structure in the inner ear that moves up and down in time with vibrations relayed from the ossicles, transmitted through the oval window

Sound causes the basilar membrane to move up and down in a traveling wave; the frequency of the sound determines where on the basilar membrane the wave is highest. When the frequency is low, the wide, floppy tip of the membrane (the apex) moves the most. When the frequency is high, the narrow, stiff end closest to the oval window (the base) moves the most. This action gives rise to the theory of place code: different frequencies stimulate neural signals at specific places along the basilar membrane. The movement of the basilar membrane causes inner hair cells to bend, initiating a neural signal in the auditory nerve. Axons fire the most in the hair cells along the area of the basilar membrane with the most motion, and the brain processes the information about which axons are most active.

How Do We Perceive Pitch?

From the inner ear, action potentials in the auditory nerve travel to the thalamus and ultimately to an area of the cerebral cortex called area A1, a portion of the temporal lobe that contains the primary auditory cortex. Auditory areas in the left hemisphere analyze sounds related to language; those in the right hemisphere analyze rhythms and music. There is also evidence that the auditory cortex is composed of two distinct streams, roughly analogous to the dorsal and ventral streams of the visual system. Spatial ('where') auditory features, which allow us to locate the source of a sound in space, are processed by areas toward the caudal (back) part of the auditory cortex. Non-spatial ('what') features, which allow identification of a sound, are processed in the ventral (lower) part of the auditory cortex. All of hearing, however, depends on temporal aspects.
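The place code can be sketched numerically with the Greenwood (1990) frequency-position function. This is an illustrative sketch, not material from the lecture; the parameter values below are commonly cited fits for the human cochlea and are an assumption added here.

```python
# Greenwood frequency-position function for the human cochlea (illustrative
# sketch; parameters A = 165.4, a = 2.1, k = 0.88 are commonly cited fits).
def greenwood_hz(x):
    """Characteristic frequency (Hz) at fractional distance x along the
    basilar membrane, measured from the apex (x = 0) to the base (x = 1)."""
    return 165.4 * (10 ** (2.1 * x) - 0.88)

# Low frequencies peak near the floppy apex and high frequencies near the
# stiff base, matching the place-code account.
print(round(greenwood_hz(0.0)))  # near 20 Hz (apex)
print(round(greenwood_hz(0.5)))  # mid-membrane
print(round(greenwood_hz(1.0)))  # near 20,700 Hz (base)
```

Note how the exponential form compresses the entire audible range, roughly 20 Hz to 20,000 Hz, onto a membrane only a few centimetres long.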
Neurons in area A1 respond to simple tones, and successive auditory areas in the brain process sounds of increasing complexity. The human ear is most sensitive to frequencies around 1000 to 3500 Hz. The ear has two mechanisms for encoding sound waves: the place code, used mainly for high frequencies, and the temporal code, used mainly for low frequencies. With the temporal code, the brain uses the timing of the action potentials in the auditory nerve to determine the pitch you hear. Timbre, the complex texture of the sound you hear, depends on the relative amounts of different frequency components in the sound: a mixture of the relative activity of hair cells across the whole basilar membrane.

How do we determine the location of a sound? By using binaural cues. Sounds arrive a little sooner at the closer ear than at the farther ear. This time difference is effective for locating low-frequency sources, even when the source is only a little off to one side. High-frequency components are more intense in the closer ear than in the farther one because the listener's head blocks high frequencies; the farther a sound is off to the side, the greater the between-ear difference in the level of these high-frequency components.

Hearing Loss

Conductive hearing loss arises because the eardrum and ossicles are damaged to the point that they cannot conduct sound waves effectively to the cochlea. The cochlea itself is normal, making this a mechanical problem with the moving parts of the ear (hammer, anvil, stirrup, eardrum). Sensorineural hearing loss is caused by damage to the cochlea, the hair cells, or the auditory nerve. It has two main effects: sensitivity decreases, so sounds must be more intense to be heard, and acuity decreases, so sounds smear together on the basilar membrane, making voices harder to understand, especially when there are competing sounds in the environment.
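As an aside on the binaural timing cue described above: the interaural time difference can be approximated with Woodworth's spherical-head model. A sketch under assumed values, not figures from the lecture; the head radius and speed of sound below are typical textbook numbers.

```python
import math

# Assumed typical values (not from the lecture).
HEAD_RADIUS_M = 0.0875   # average adult head radius, about 8.75 cm
SPEED_OF_SOUND = 343.0   # metres per second in air at room temperature

def itd_seconds(azimuth_deg):
    """Interaural time difference for a distant source at the given azimuth,
    using Woodworth's spherical-head approximation: (r/c) * (theta + sin(theta))."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS_M / SPEED_OF_SOUND) * (theta + math.sin(theta))

# A source directly ahead gives no time difference; a source 90 degrees to
# one side arrives roughly 0.66 ms sooner at the closer ear.
print(round(itd_seconds(90) * 1000, 2))  # prints 0.66
```

Sub-millisecond differences like this are what the auditory system resolves when it localizes low-frequency sounds.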
Sensorineural hearing loss can be caused by genetic disorders, premature birth, infections, medications, and accumulated damage from exposure to intense sounds. A cochlear implant may offer some help; it is a device that replaces the function of the hair cells. The external parts of the device include a microphone and a speech processor, worn behind the ear. The implanted parts include a receiver just inside the skull and a thin wire containing electrodes, inserted into the cochlea to stimulate the auditory nerve. Sound picked up by the microphone is transformed into electronic signals by the speech processor; the signal is then transmitted to the implanted receiver, which activates the electrodes in the cochlea.

What matters is age at implantation (see 'Big Technology for Little Ears'). Infants who have not yet learned to speak are especially vulnerable because they may miss the critical period for language learning. Without auditory feedback at this time, normal speech is nearly impossible to achieve. Early use of cochlear implants has been associated with improved speech and language skills in hearing-impaired children.

Haptic Perception

Memorize Fig. 4.24 for the next exam: the tactile receptive field, a small patch of skin that conveys information about pain, pressure, texture, pattern, or vibration to a receptor. Haptic perception is the active exploration of the environment by touching or grasping objects with our hands. Four types of receptors located under the skin's surface enable us to sense pressure, texture, pattern, or vibration. Thermoreceptors are nerve fibers that sense cold and warmth. Each touch receptor responds to stimulation within its receptive field, and the long axons enter the brain via the spinal cord or cranial nerves. Pain receptors populate all body tissues that feel pain; they are distributed around muscles, bones, and internal organs as well as the skin. Sensory signals from the body travel to the somatosensory cortex.
More of the tactile area of the brain is devoted to the parts of the skin surface where sensitivity to fine spatial detail (acuity) is greatest, such as the fingertips and lips. There is mounting evidence (again from DTI) for a distinction between 'what' and 'where' pathways in touch, analogous to those in vision and hearing. The 'what' system provides information about the properties of surfaces and objects; the 'where' system provides information about a location in external space that is being touched, or a location on the body that is being stimulated. fMRI evidence suggests that the 'what' and 'where' touch pathways involve areas in the lower and upper regions of the parietal lobe, respectively. Tissue damage is transduced by A-delta fibres, which register the initial sharp pain, and by C-fibres, which transmit the longer-lasting, duller pain.

Perceiving Pain

There are two pain pathways. One sends signals to the somatosensory cortex, identifying where the pain is occurring and what sort of pain it is. The second sends signals to the emotional and motivational centres of the brain, such as the hippocampus and the amygdala, as well as to the frontal lobe. fMRI evidence shows that we respond to others' pain, particularly in our frontal lobes. Referred pain occurs when sensory information from internal and external areas converges on the same nerve cells in the spinal cord. An interesting side note: the agony of a heart attack can spread down both arms and to the jaw, head, or back, and some people have reported tooth pain or a headache as a major symptom of a heart attack.

Gate-Control Theory

Pain intensity cannot always be predicted from the extent of the injury; pull out a nose hair right now and see for yourself. A more accurate predictor is the amount of neural 'real estate' in the somatosensory cortex dedicated to that body part.
That is why damage to the torso and legs is not as painful as damage to the lips or nose. Gate-control theory holds that signals arriving from pain receptors in the body can be stopped (gated) by interneurons in the spinal cord via feedback from two directions. Pain can be gated, for example, by rubbing the affected area, which activates skin receptors. Pain can also be gated from the brain by modulating the activity of pain-transmission neurons. This neural feedback is elicited not by the pain itself but by activity deep within the brain: it comes from a region in the midbrain called the periaqueductal grey (PAG). Under extreme conditions such as high stress, naturally occurring endorphins can activate the PAG to send inhibitory signals to neurons in the spinal cord, which then suppress pain signals to the brain. The PAG is also activated by opiate drugs such as morphine. There is also a pain-facilitation signal that increases the sensation of pain when we are ill (for example, with influenza). Finally, it should be noted that gate-control theory is challenged by evidence that pain is a two-way street: bottom-up (e.g., from the skin surface) versus top-down (from the brain).

Kinesthesis

One aspect of sensation and perception is knowing where the parts of your body are at any given moment, or proprioception. Receptors in the muscles, tendons, and joints signal the position of the body in space, whereas information about balance and head movement originates in the inner ear. Muscle, joint, and tendon feedback about how your arms are moving can be used to improve performance through learning (primarily in the cerebellum). Maintaining balance depends primarily on the vestibular system: the three fluid-filled semicircular canals and adjacent organs located next to the cochlea in each inner ear. The semicircular canals are arranged in three perpendicular orientations and studded with hair cells that detect movement when the head moves or accelerates.
The bending of hair cells generates activity in the vestibular nerve, which is then conveyed to the brain. Vision also helps us keep our balance. Bertenthal et al. (1997) experimented with the visual aspect of balance by placing people in rooms that can be tilted forward and backward. If the room tilts enough, people will topple over as they try to compensate for what their visual system is telling them. When a mismatch occurs between the information provided by visual cues and vestibular feedback, motion sickness can result. But that is not nearly as interesting as the emerging phenomenon of cyberpuke. Check out: https://www.livescience.com/54478-why-vr-makes-you-sick.html

The Chemical Senses, Our Most Primitive

Olfaction is the least understood sense and the only one connected directly to the forebrain and amygdala; remember that the other senses connect through the thalamus. This indicates how ancient this neural pathway must be, and the mapping indicates that smell has a close relationship with areas involved in emotional and social behavior. Terms to remember: olfactory receptor neurons (ORN); odorant molecules; olfactory epithelium; glomerulus. Groups of ORNs send their axons from the olfactory epithelium into the olfactory bulb, a brain structure located above the nasal cavity beneath the frontal lobes. Humans possess about 350 different ORN types, allowing us to discriminate among some 10,000 odorants, as each one produces a unique pattern of neural activity. Some dogs have 100 times more ORNs than humans do. Humans can sense some smells in extremely small concentrations, such as mercaptan at 0.0003 ppm. Odour perception includes both information about the identity of the odour and the emotional response to it. The object-centred approach suggests that information about the identity of the 'odour object' is quickly accessed from memory, triggering an emotional response.
The valence-centred approach suggests that the emotional response comes first, providing a basis for determining the identity of the odour. Research presently suggests that odour perception is guided first by memory and then by emotion. Smell also exhibits sensory adaptation: smells fade after a few minutes, and this reduced sensitivity allows us to detect new smells after the initial evaluation. Top down: fMRI evidence indicates that the orbitofrontal cortex responds more strongly to smells labeled as 'pleasant' than to those labeled 'unpleasant'.

Sense of Taste

The tongue is covered with thousands of papillae, within each of which are hundreds of taste buds. Memorize the 'Taste Bud' figure for the next exam. Each taste bud contains 50 to 200 taste receptor cells. Taste perception fades with age; half of the taste receptors are lost by age 20. The taste system contains only five main types of taste receptors: salt, sour, bitter, sweet, and savoury (high-protein foods such as meat and cheese; Yamaguchi, 1998). Microvilli react to tastant molecules: salt receptors to NaCl; sour receptors to acids; bitter receptors, which have 50 to 80 distinct binding sites; and sweet receptors to sugars and other compounds. Savoury (umami) receptors respond to glutamate, an amino acid in protein foods. Glutamate is a major excitatory neurotransmitter, which is why monosodium glutamate (MSG) is often used to flavour Asian foods. Taste and smell collaborate to produce flavour. Taste experiences also vary widely across individuals. About 50% of people (tasters) report a mildly bitter taste from caffeine, for example, whereas 25% (non-tasters) do not. Bartoshuk (2000) reported that the remaining 25% are super-tasters, who find dark green vegetables bitter to the point of being inedible. Super-tasters also tend to avoid fatty, creamy foods.