Last week we talked about the chemical responses inside our brain. This week we discuss the auditory system: its basic functions, some interesting ear trivia, and sound localization. We chose the last topic because the lecture sparked our interest in how sound is localized and why it is a beneficial topic of research.
Sound can be described as a vibration that propagates as an acoustic wave through a transmission medium such as a solid, liquid or gas. When an object moves towards a patch of air, it compresses it, increasing the local density of air molecules; when it moves away, the density decreases. These disturbances travel away from the source (or speaker) at about 343 m/s, known as the speed of sound in air at room temperature.
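These numbers can be played with in a few lines of code. The sketch below uses a common empirical approximation for the speed of sound in dry air (about 331.3 m/s at 0 °C, rising roughly 0.6 m/s per degree Celsius) to compute the wavelength of a tone; the exact coefficients are approximations, not measured values from this post.

```python
def speed_of_sound(temp_c: float) -> float:
    """Approximate speed of sound in dry air (m/s) at temp_c degrees Celsius."""
    return 331.3 + 0.606 * temp_c

def wavelength(frequency_hz: float, temp_c: float = 20.0) -> float:
    """Wavelength (m) of a pure tone at the given frequency and air temperature."""
    return speed_of_sound(temp_c) / frequency_hz

# At room temperature (20 C) we recover roughly the 343 m/s quoted above.
print(f"{speed_of_sound(20.0):.1f} m/s")     # ~343.4 m/s
print(f"{wavelength(440.0):.2f} m")          # concert pitch A4 spans ~0.78 m
```

Higher frequencies have proportionally shorter wavelengths, which matters later for localization: wavelengths shorter than the head are shadowed by it, producing interaural intensity differences.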
The Auditory System
As shown in the diagram above, the ear has three parts: the outer ear, middle ear and inner ear. The auditory canal, about 2.5 cm long, together with the external skin constitutes the outer ear. First, a sound wave hits the tympanic membrane, universally known as the eardrum. Next, the vibration travels through a series of delicate bones known as the ossicles, which act as a lever system: together with the area difference between the large eardrum and the much smaller oval window, they amplify the pressure delivered to the inner ear. Finally, the ossicles pass these vibrations to the oval window, which moves the fluid in the cochlea. This movement generates neural signals that the brain interprets as hearing sensations.
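The size of this amplification can be estimated with commonly cited textbook figures. The areas and lever ratio below are approximate assumptions, not values from this post, but they give the often-quoted gain of roughly 22x (about 27 dB):

```python
import math

# Rough estimate of the middle ear's pressure amplification.
# All three constants are approximate textbook values (assumptions).
EARDRUM_AREA_MM2 = 55.0      # effective area of the tympanic membrane
OVAL_WINDOW_AREA_MM2 = 3.2   # area of the oval window
OSSICLE_LEVER_RATIO = 1.3    # mechanical advantage of the ossicular lever

pressure_gain = (EARDRUM_AREA_MM2 / OVAL_WINDOW_AREA_MM2) * OSSICLE_LEVER_RATIO
gain_db = 20 * math.log10(pressure_gain)

print(f"pressure gain ~{pressure_gain:.0f}x ({gain_db:.0f} dB)")
```

Without this impedance matching, most of the sound energy arriving from air would simply reflect off the cochlear fluid instead of moving it.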
Fascinating facts about ears
The vestibular system, located inside the inner ear, helps us balance. Most cases of vertigo originate in this part of the auditory system.
The temporal bone, the hardest bone in the body, surrounds the inner ear, while the smallest bones, the hammer (malleus), anvil (incus) and stirrup (stapes), form the ossicles of the middle ear. Hence, the ear houses both the hardest and the smallest bones.
The ‘Eary’ Defense
The ear works around the clock. While we sleep, our brain filters out familiar sounds around us and alerts us only when there is a sudden or loud noise.
The left and the right
From birth until puberty, in people with normal hearing the right ear performs better at auditory processing than the left. However, according to a study, music is processed better by the right hemisphere, which receives most of its input from the left ear.
Right Diet == Better Hearing
Foods rich in omega-3 fatty acids, such as salmon and other fish, strengthen the muscles of the inner ear. Antioxidants, especially folic acid, found in vegetables such as spinach and broccoli, help prevent hearing loss caused by exposure to loud noise.
Squeaky Clean Ears
The ear wax produced by the ear canal protects the middle ear against dust, dirt and bacterial or fungal infections. Secreted in excessive amounts, it can cause blockages that affect hearing. Generally, however, it moves outward on tiny hairs as we move our lower jaw to talk or eat.
Sound Localization

In general, we use different techniques for locating sound sources in the horizontal plane (left-right) and the vertical plane (up-down). Good horizontal localization requires comparing the input from both ears, whereas good vertical localization does not. Horizontal localization relies on the interaural time delay: sound takes longer to reach the ear farther from the source, unless the sound comes from straight ahead, in which case there is no delay. Sounds from the left produce delays opposite in sign to those from the right. For humans, it is much easier to localize sudden sounds than continuous or constant background sounds, because the latter are always present in both ears.
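The size of this delay can be sketched with Woodworth's classic spherical-head approximation, ITD = (r/c)(θ + sin θ), where r is the head radius, c the speed of sound, and θ the azimuth of a distant source (0 = straight ahead). The head radius below is an assumed average value:

```python
import math

HEAD_RADIUS_M = 0.0875   # average adult head radius (assumption)
SPEED_OF_SOUND = 343.0   # m/s

def itd_seconds(azimuth_deg: float) -> float:
    """Interaural time difference under the Woodworth spherical-head model."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS_M / SPEED_OF_SOUND) * (theta + math.sin(theta))

print(f"{itd_seconds(0) * 1e6:.0f} us")    # straight ahead: no delay
print(f"{itd_seconds(90) * 1e6:.0f} us")   # directly to one side: maximum delay
```

The maximum delay comes out to roughly 650-700 microseconds, which matches the tiny timing differences the auditory system is able to resolve.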
Neither the interaural delay nor the interaural intensity changes when a sound moves in the vertical plane. This means we do not need both ears to localize a sound on that plane. In theory, to seriously impair vertical sound localization one could place a tube into the auditory canal to bypass the pinna. This would degrade the reflections of sound waves that the pinna normally funnels into the ear canal, resulting in impaired localization.
Knowledge of how sound localization works has multiple applications in the field of engineering, one of them being robotics. Sound localization is, and will remain, important for building more human-like robots that can perform different kinds of tasks based on the sounds of their environment. One active research task is sound source localization, which has many potential applications, such as a service robot locating a human speaker. It could also be used to map unknown acoustic environments or in rescue tasks where visual data is impossible to obtain.
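A robot can mimic the interaural time delay with two microphones: find the lag that best aligns the two recorded signals, then convert it to a direction. The sketch below is a minimal illustration with made-up signals and an assumed microphone spacing; real systems use more robust methods such as GCC-PHAT.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s
MIC_SPACING_M = 0.2      # distance between the two microphones (assumption)
SAMPLE_RATE = 16000      # samples per second

def cross_correlation_lag(left: list, right: list, max_lag: int) -> int:
    """Return the lag (in samples) by which `right` trails `left`."""
    best_lag, best_score = 0, float("-inf")
    n = len(left)
    for lag in range(-max_lag, max_lag + 1):
        score = sum(left[i] * right[i + lag]
                    for i in range(max(0, -lag), min(n, n - lag)))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# Synthetic example: the same click arrives 4 samples later at the right mic.
left = [0.0] * 32
right = [0.0] * 32
left[10] = 1.0
right[14] = 1.0

lag = cross_correlation_lag(left, right, max_lag=8)
tdoa = lag / SAMPLE_RATE
# Far-field approximation: sin(angle) = TDOA * c / mic spacing.
angle = math.degrees(math.asin(max(-1.0, min(1.0, tdoa * SPEED_OF_SOUND / MIC_SPACING_M))))
print(f"lag = {lag} samples, angle ~ {angle:.0f} degrees")
```

Like the human system, a two-microphone array can only resolve the left-right angle; vertical localization needs more microphones or direction-dependent filtering analogous to the pinna.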
This week we returned to studying the different senses and their operating mechanisms. Hearing is a mechanical sense to a surprising extent, and for that reason easier to understand. This chapter also made us think about sensory prosthetics, especially cochlear implants, which we touched on in exercise 3.4. It would be interesting to learn how cochlear implants could be further developed to restore hearing for all patients.
- Li X, Shen M, Wang W, Liu H. Real-Time Sound Source Localization for a Mobile Robot Based on the Guided Spectral-Temporal Position Method. International Journal of Advanced Robotic Systems. September 2012. DOI: 10.5772/51307
- Caleb Rascon, Ivan Meza. Localization of sound sources in robotics: A review. Robotics and Autonomous Systems. Volume 96. 2017. Pages 184-210. ISSN 0921-8890. DOI: 10.1016/j.robot.2017.07.011.