Learning diary update 13-11

This week's lecture was on campus for the first time. It was certainly nice to be there in person; it allowed for much better concentration on the lecture, and it felt like there was a better connection with the professor than via Zoom. This week's lecture covered the auditory and vestibular systems. It was quite interesting to see how intricately the auditory system is designed. One thing from the beginning that is quite fascinating and practical, something one can even feel in daily life, is the attenuation reflex, where the small muscles of the middle ear contract to dampen loud sounds.

One question we had was about coincidence detection in the auditory system. This detection happens when two signals, one from each ear, converge on the same neuron in the superior olive. Our question was: how do these signals know which neuron they must travel to in order to coincide and summate with each other? Wouldn't a signal simply enter the nearest neuron and generate an action potential? We assume there is a threshold so that only inputs of a certain size can activate the neuron, but wouldn't that mean that this type of detection doesn't work on quieter sounds, which send out a smaller signal?
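To think this question through, we tried writing a tiny sketch of the classic Jeffress delay-line model, which is one textbook answer to how coincidence detection could work (the model choice and all the constants here are our own assumptions, not something given in the lecture). The idea is that each detector neuron receives input from both ears through axons of different lengths, so the neuron whose built-in delays exactly cancel the interaural time difference is the one where the two spikes coincide:

```python
import numpy as np

# Jeffress-style delay-line model (a sketch under our own assumptions,
# not the exact circuit from the lecture). Each detector neuron gets
# input from both ears through axons of different lengths; the neuron
# whose built-in delays cancel the interaural time difference (ITD)
# sees the two spikes coincide and responds the most.

SOUND_SPEED = 343.0   # speed of sound in air, m/s
HEAD_WIDTH = 0.20     # assumed distance between the ears, m
N_DETECTORS = 21      # number of coincidence detectors in the toy map

def itd_from_angle(angle_deg):
    """ITD in seconds for a sound source at the given azimuth angle."""
    return HEAD_WIDTH * np.sin(np.radians(angle_deg)) / SOUND_SPEED

def detector_activity(itd):
    """Response of each detector: strongest where delays cancel the ITD."""
    max_itd = HEAD_WIDTH / SOUND_SPEED
    preferred = np.linspace(-max_itd, max_itd, N_DETECTORS)
    window = 50e-6  # assumed coincidence window of ~50 microseconds
    # Gaussian tuning: the better the spikes line up, the bigger the response
    return np.exp(-((preferred - itd) ** 2) / (2 * window ** 2))

activity = detector_activity(itd_from_angle(30.0))
print("most active detector:", int(np.argmax(activity)))
```

What this sketch suggests about our own question is that the detection runs on spike *timing* rather than spike size: a quieter sound would give weaker but equally well-timed spikes, so the location code could survive. Whether that is the full story in the real superior olive, we are not sure.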

We had another question about the tonotopic organisation of the auditory cortex. If hearing is centralised in this brain area, could this mean that damage to that part would make someone deaf even if nothing is wrong with their ears? Could a neurodegenerative disease like Alzheimer's make someone deaf by deterioration of the auditory cortex?
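To picture what such damage might do, we made a toy model of the tonotopic map (the logarithmic layout is the standard textbook picture; the exact numbers are our own illustrative assumptions). On this picture a local lesion would not cause total deafness but would silence one band of frequencies:

```python
import math

# Toy tonotopic map (our own illustrative assumption, using the standard
# logarithmic layout): frequencies are laid out from low to high along
# the cortex, so a local lesion should silence one frequency band rather
# than wiping out hearing entirely.

F_LOW, F_HIGH = 20.0, 20_000.0  # approximate human hearing range, Hz

def cortical_position(freq_hz):
    """Normalised position (0 = low-frequency end, 1 = high-frequency end)."""
    return math.log(freq_hz / F_LOW) / math.log(F_HIGH / F_LOW)

def hit_by_lesion(freq_hz, start, end):
    """True if a lesion covering positions [start, end] affects this frequency."""
    return start <= cortical_position(freq_hz) <= end

# A lesion over the middle fifth of the map: only mid frequencies go silent
for f in (100, 1_000, 4_000, 15_000):
    print(f"{f:>6} Hz affected: {hit_by_lesion(f, 0.4, 0.6)}")
```

Of course this is only geometry; whether real cortical deterioration degrades hearing this cleanly is exactly what our question was about.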

The best comparison to what we have already learned is with the visual system, especially through the audiovisual integration that happens in the brain. Here we learned that not only are these senses connected and integrated with each other, they are integrated so well that one can fool the other, as in the McGurk effect, where what you see someone's lips doing changes what you hear them say. We believe this is a perfect example of the relationship between vision and hearing.
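One common way to model how one sense can pull the other around is maximum-likelihood cue combination, where each sense provides a noisy estimate and the brain weights the two by reliability. This is our own illustrative sketch, not something presented in the lecture; because vision locates things far more precisely than hearing, the fused estimate gets dragged toward the visual cue, which is essentially the ventriloquism effect:

```python
# Maximum-likelihood cue combination, a standard textbook model we are
# using as our own illustration of how one sense can fool another: each
# sense gives a noisy location estimate, and the brain weights the two
# by reliability (inverse variance). Vision is usually far more precise
# about location, so the fused percept is pulled toward the visual cue.

def fuse(x_vis, var_vis, x_aud, var_aud):
    """Reliability-weighted average of a visual and an auditory estimate."""
    w_vis = (1 / var_vis) / (1 / var_vis + 1 / var_aud)
    return w_vis * x_vis + (1 - w_vis) * x_aud

# Ventriloquism: the sound really comes from 0 degrees, the puppet's mouth
# moves at 10 degrees; the fused estimate lands almost on the puppet.
print(fuse(x_vis=10.0, var_vis=1.0, x_aud=0.0, var_aud=16.0))  # ~9.4
```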