Week 9

This week, we concluded our foray into the sensory systems of the human brain, at least as far as the tentative course schedule is to be trusted. Feels a little incomplete without a week focusing on the sense of touch, but then again, with all we’ve learned so far it isn’t that hard to intuit how it probably works.

The subject this week was hearing and balance, two senses closely connected in their operating mechanism and in the physical location of their sensory organs, if not in purpose. Both operate by the movement of hair cells activating neural responses that propagate into the brain, though what causes the hair cells to move is naturally quite different between the two senses. It makes sense that both share the same general mechanism, though, as both are fundamentally about sensing motion. The olfactory and gustatory senses have their similarities as well, as both are about sensing chemicals. In this framework, the sense of sight is a bit of an odd duck, being the only sense that detects radiation. It makes me wonder how some birds can apparently detect the earth’s magnetic field, something fundamentally different from motion, chemicals or radiation. Perhaps the structure of such a sense is completely different from any of ours?

From a biomedical engineering perspective, the mechanical nature of hearing and balance makes it sound like damage to them could be more easily fixed with devices than damage to the other senses. Unlike the other senses, where we basically go straight to neurons, the ear has a chain of mechanisms that propagate the signal before it ever reaches the neural level. As long as the underlying neural connections are still functional, engineering artificial ear parts sounds difficult but doable – at least more doable than interfacing electronics with neurons such that sensory information properly propagates into the brain, something that would likely be necessary for, say, fixing blindness-inducing damage in an eye.

Despite the sophisticated mechanisms involved in generating the signal in the ear, the actual signal processing is quite similar to that of the other senses, with some additional hoops to jump through when it comes to locating where a sound comes from. It makes sense that the fundamental process would be similar, though – in the case of every sense, we have a distribution of input data with some kind of topographic coding, be it location or frequency or whatnot, and we need to determine relevant things about the world around us from that information. If one general structure works well for this task, it is simply efficient to reuse it across multiple senses.
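To put a rough number on that localization trick: one of the brain's main cues for horizontal direction is the tiny difference in arrival time between the two ears. A quick back-of-the-envelope sketch, using a simple path-length model with an assumed head width of 0.20 m (the exact figures and the model are my own assumptions, not from the course material):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, dry air at roughly 20 °C
HEAD_WIDTH = 0.20       # m, rough ear-to-ear distance (assumed)

def interaural_time_difference(azimuth_deg: float) -> float:
    """Approximate interaural time difference (seconds) for a sound source.

    Simple path-length model: the wavefront reaches the far ear later by
    (d * sin(theta)) / c, where 0 degrees means straight ahead and
    positive angles are toward the right ear.
    """
    return HEAD_WIDTH * math.sin(math.radians(azimuth_deg)) / SPEED_OF_SOUND

# A source 90 degrees to the side gives the maximum delay:
# 0.20 / 343 ≈ 583 microseconds – tiny, yet the brainstem resolves
# differences well below that.
```

Even this crude model makes it clear why the processing needs those extra hoops: the usable signal is a sub-millisecond timing difference, far finer than the timescale of an individual action potential.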