On February 13th, 2018 we learned about

The way we sense sound is strangely tied to our sight

If you’re having a hard time hearing something, try looking more carefully. Despite what every kid learns by kindergarten, there’s mounting evidence that our hearing isn’t just up to our ears. Multiple studies are finding aspects of hearing that are shaped at least partially by our eyes, and that the two sensory systems are actually intertwined in our brains. Of course, that doesn’t mean it’s all perfectly clear, as some of the dynamics in this relationship remain a bit confusing, starting with our eardrums.

Ears move with eyes

The tiniest bones in your body are in your ear. While you probably can’t wiggle your ears as well as some other animals, you do still have some control over this anatomy. Normally, the three bones, or ossicles, of the middle ear move to help or hinder sounds passing through your ear, modulating their volume, and this movement has been understood as a reaction to outside sounds. However, researchers using very sensitive microphones in a quiet space were able to listen to the vibrations created by a person’s ossicles, and found that they move in tandem with the eyes. In fact, they seem to start vibrating in the ear before eye movement can even be detected, suggesting that both bits of anatomy are being triggered by the same motor signals in the brain.

The ossicles’ movement doesn’t appear to be completely arbitrary either. The bones will move inward or outward according to the direction the eyes are looking, and they continue vibrating until the eyes stop moving. It’s unclear at this point why this occurs, although it’s thought that our brains may be merging the visual and auditory information gathered during this activity to build a more cohesive sense of the space around us.

Looking to listen

This kind of cross-sensory augmentation was a bit clearer in a second study that linked visual cues to sound clarity. Previous work had found that around 25 percent of the auditory cortex in the brain is actually responsive to light, despite the fact that light is… generally silent. It now appears that this sensory overlap is a way for our brains to pick which sounds are most deserving of our attention.

One experiment monitored the brain activity of ferrets while they were presented with visual and auditory stimuli. To simulate a noisy environment, the ferrets were made to listen to multiple sounds layered on top of each other. While the ferrets listened, a light would flash at different rhythms, sometimes syncing up with one particular set of sounds or another. This synchronization was apparently picked up by the ferrets’ brains, which then started filtering the competing noises to prioritize that particular sound. When the flashing light matched a sound’s rhythm, that sound was processed in a way that made it easier to perceive.

While you’ve probably never noticed your eardrums wiggling, you may have noticed this second phenomenon at a party. By watching someone’s lips as they spoke, you were seeing visual information that was synced to the timing of their voice. Your brain could then prioritize sounds on that same rhythm, helping you make sense of what was said. However, the ferrets demonstrate that this wasn’t necessarily because you were lip-reading, as the furry critters experience a similar effect without anything resembling human speech. Instead, it seems that many of the relationships between our hearing and our sight likely evolved in a distant ancestor, helping them make sense of the clattering sounds to their left, right, or right in front of them.

Source: Visual cues amplify sound by University College London, Medical Xpress