George Matsaridis

Lesson 8 - Perception, Action, Cognition and Emotions

(Previously hosted on the Weebly platform, these are my personal notes from the Coursera course "Synapses, Neurons and Brains", taught by Prof. Idan Segev, The Hebrew University of Jerusalem (2013)).

What are the neuronal processing mechanisms for?

  • Sensation: The transformation of external events into neural activity.

  • Perception: Processing of sensory information. We believe that the end result is a useful representation of the external objects that produced the sensations.

  • Actions: Organisms use the representation of the world in order to act on it, maximizing rewards and minimizing punishments.

  • Emotions

The story of a sound

So how does sound become electricity? We have hair cells, which reside in the ear. They actually do have hair-like protrusions. Hearing comes about because of the movement of these hairs: the motion of a hair bundle opens or closes "mechanosensitive ion channels" located on the hairs. This lets current flow into the hair cell, and the current depolarizes it. The depolarization acts on a synapse and modulates the amount of transmitter released between the hair cell and the auditory nerve fiber, which in turn changes the spiking activity of the auditory nerve. One end of this nerve contacts the hair cell while the other end reaches the brain, so the brain gets to know about the motion of the hairs.

How does sound cause vibration of the hairs? Hair cells sit on top of a membrane, the basilar membrane, which moves in response to sounds. So the next question is: how does the basilar membrane move in response to sound? The basilar membrane itself sits inside a long, coiled, snail-like tube called the cochlea (again, excuse some etymology: in Greek, "cochlea" means snail; the actual Greek word for snail is /kochlias/, with the stress on the "i").

Propagation of mechanical energy in the cochlea

The sound gets in through our actual, external ear, then travels along the ear canal, which ends at the tympanic membrane. Vibration of the air in the canal causes vibration of the tympanic membrane, and these vibrations are transferred to the fluid that fills the cochlea; the vibrations of the fluid eventually make the basilar membrane vibrate. One important feature of the cochlea, discovered already in the 19th century, is that it is mechanically inhomogeneous, so different frequencies affect different places along the cochlea.

  • 19th century, Hermann von Helmholtz (1821-1894): the cochlea performs a frequency decomposition of sounds, like a set of strings set into sympathetic vibration by sounds (see the filterbank sketch below).
  • Early 20th century, Georg von Bekesy (1899-1972): the traveling wave (Nobel prize, 1961).
  • Late 20th century and ongoing, many researchers: the cochlear amplifier and the outer hair cells.

To summarize, auditory transduction is performed by specialized cells, sitting in a specialized organ that is coupled to the physical stimulus.
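To make Helmholtz's picture a bit more concrete, here is a minimal Python sketch (my own, not from the lecture) that caricatures the cochlea as a bank of band-pass filters: each "place" along the basilar membrane is modeled as a filter tuned to a different band, and the filter outputs show which frequencies are present in a sound. The sampling rate, centre frequencies and bandwidths are illustrative assumptions, not physiological values.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

# Caricature of the cochlea as a bank of band-pass filters: each "place"
# along the basilar membrane responds best to a different frequency band.
# All numbers here are illustrative choices, not physiological values.

fs = 16_000
t = np.arange(0, 0.2, 1 / fs)
# Test sound: a mixture of a 300 Hz tone and a 2 kHz tone.
sound = np.sin(2 * np.pi * 300 * t) + 0.5 * np.sin(2 * np.pi * 2000 * t)

# Centre frequencies spaced logarithmically, as along the basilar membrane.
centres = np.geomspace(200, 4000, 8)
for fc in centres:
    low, high = fc / 1.3, fc * 1.3                     # band around fc
    sos = butter(2, [low, high], btype="bandpass", fs=fs, output="sos")
    band = sosfiltfilt(sos, sound)                     # output of this "place"
    energy = np.sqrt(np.mean(band ** 2))               # RMS response
    print(f"{fc:7.1f} Hz place: RMS = {energy:.3f}")
```

Running this, the "places" whose bands contain 300 Hz and 2 kHz respond strongly while the others stay near zero, which is the sense in which the inhomogeneous cochlea decomposes a sound into its frequency components.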

Early processing of sensory information: the case of auditory localization

Identifying the direction of a sound involves comparing the signals arriving at the two ears, so auditory localization requires computation. How do we know the direction of a sound? What physical cues can we use to identify the direction of a sound source? These cues are called binaural because they involve both ears. When a sound wave reaches the head it produces two kinds of physical differences between the ears: first, it reaches the near ear before the far ear, so there is a time shift; second, the head acts roughly as a shadow, so the sound is louder at the near ear and softer at the far one. The timing differences between the two ears are called Interaural Time Differences (ITDs). These differences, measured at the peak of each sound wave, are really small (hundredths of a millisecond, i.e. tens to hundreds of microseconds).

Phase locking

The discharges of cochlear nerve fibres in response to low-frequency sounds are not random; they occur at particular times within the sound's cycle (phase locking).

Detecting interaural time differences by coincidence detection:

  • High fidelity of spike processing in the cochlear nucleus: no temporal summation, no spatial summation; requires specialized neurons.
  • Coincidence detection in the medial superior olive (a toy sketch of this scheme follows the summary below): ITDs of tens of μs against spike widths of hundreds of μs; requires specialized neurons in specialized circuits, short time constants, and an important use of dendrites to resolve some of the computational problems.

Summary

  • Specialized needs for each sensory system (hearing, seeing, smelling, etc.): comparing timing between the two ears, visual motion detection, early processing of smells.
  • Specialized circuits: the auditory brainstem, the retina in the eye, the olfactory bulb.
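To illustrate the coincidence-detection idea, here is a toy Python sketch (my own, in the spirit of the Jeffress delay-line model, not code from the lecture): two phase-locked "auditory nerve" spike trains are generated with a known ITD, and a bank of detectors with different internal delays counts coincident spikes; the detector whose internal delay compensates the ITD responds most strongly. All parameters (tone frequency, ITD, spike probability) are arbitrary illustrative choices.

```python
import numpy as np

# Toy Jeffress-style coincidence detection (illustrative parameters only).

fs = 100_000                      # sampling rate, Hz (10 µs resolution)
freq = 500.0                      # low-frequency tone, Hz
true_itd = 300e-6                 # 300 µs delay at the far ear
dur = 0.5                         # seconds of "sound"
t = np.arange(0, dur, 1 / fs)

def phase_locked_spikes(delay, p_spike=0.6, rng=np.random.default_rng(0)):
    """Spikes may occur only in a narrow window around each positive peak
    of the (delayed) tone -- a crude stand-in for phase locking."""
    tone = np.sin(2 * np.pi * freq * (t - delay))
    near_peak = tone > 0.999
    return near_peak & (rng.random(t.size) < p_spike)

left = phase_locked_spikes(0.0)
right = phase_locked_spikes(true_itd)

# Bank of detectors, each with a different internal delay on the right input.
candidate_delays = np.arange(-600e-6, 601e-6, 20e-6)
coincidences = []
for d in candidate_delays:
    shift = int(round(d * fs))
    shifted = np.roll(right, -shift)          # internal delay line
    coincidences.append(np.sum(left & shifted))

best = candidate_delays[int(np.argmax(coincidences))]
print(f"true ITD = {true_itd*1e6:.0f} µs, estimated ITD = {best*1e6:.0f} µs")
```

Note the design point the lecture emphasizes: what matters for resolving delays of tens of microseconds is the relative timing of the two inputs, not the width of the spikes themselves, which is why the brainstem circuit needs fast, specialized neurons rather than narrower spikes.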

How sensory information guides motion: the case of auditory-guided head turns

The model animal for these experiments is not a mammal but a bird, the owl. It is a fascinating animal that can hunt in complete darkness. When owls hunt, they can hear the noises of a mouse, and the information that reaches their ears tells them both the direction and the distance of the mouse. The main cue for detecting the azimuth of the sound source is the ITD. It was shown that vision actually teaches the auditory system where things are in the world: prism experiments that shifted an owl's visual field by 20 degrees also made sounds "be heard" (or better, be perceived) as coming from the same shifted direction. (Many parts of this lecture omitted...)
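As a back-of-the-envelope check on the ITD cue, here is a small Python sketch under a highly simplified model: the two ears are treated as two points separated by a distance d, so the extra path length to the far ear is d·sin(θ) and the ITD is d·sin(θ)/c. The ear separation below is an assumed value, not a measurement from an owl, and real heads add diffraction effects that this ignores.

```python
import numpy as np

# Simplified two-point model of the head (assumed values, illustration only).
c = 343.0          # speed of sound in air, m/s
d = 0.05           # assumed effective distance between the ears, m

def itd_from_azimuth(theta_deg):
    """ITD for a source at azimuth theta, straight-line path difference."""
    return d * np.sin(np.radians(theta_deg)) / c

def azimuth_from_itd(itd):
    """Invert the model to recover azimuth from a measured ITD."""
    return np.degrees(np.arcsin(np.clip(itd * c / d, -1.0, 1.0)))

for theta in (0, 10, 30, 60, 90):
    itd = itd_from_azimuth(theta)
    print(f"azimuth {theta:>2} deg -> ITD {itd*1e6:6.1f} µs "
          f"-> recovered {azimuth_from_itd(itd):5.1f} deg")
```

With these assumed numbers, a 10-degree change in azimuth corresponds to an ITD of roughly 25 µs, consistent with the earlier point that the relevant time differences are tens of microseconds, far shorter than a spike.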
