How Vision Captures Sound Now Somewhat Uncertain

Duke study offers new insights into how neurons respond to visual and auditory stimuli

Neurons in the eye-movement region of the brain use two different strategies for signaling the locations of sights and sounds. For visual stimuli, the neurons form a map akin to a "zone defense" in basketball. The location of a visual stimulus can be inferred from a "hill" of activity in this brain region, called the superior colliculus. For sounds, neurons throughout the region respond with a broad plateau of activity that isn't location-dependent, but varies in intensity. The "height" of the plateau acts like the dial on a meter to signal sound location. Plateau height on opposite sides of the brain may perform a neural game of tug-of-war to indicate where a sound is.

When listening to someone speak, we also rely on lip-reading and gestures to help us understand what the person is saying.

To link these sights and sounds, the brain has to know where each stimulus is located so it can coordinate processing of related visual and auditory aspects of the scene. That's how we can single out a conversation when it's one of many going on in a room.

While past research had suggested that the brain creates a similar code for vision and hearing to integrate this information, Duke University researchers have found the opposite: neurons in a particular brain region respond differently depending on whether the stimulus is visual or auditory.

The finding, published Jan. 15 in the journal PLOS ONE (http://tinyurl.com/kqtuwjg), provides insight into how vision captures the location of perceived sound.

The prevailing idea among brain researchers has been that neurons in a brain area known as the superior colliculus employ a "zone defense" when signaling where stimuli are located. That is, each neuron monitors a particular region of an external scene and responds whenever a stimulus -- either visual or auditory -- appears in that location. Through teamwork, the ensemble of neurons provides coverage of the entire scene.

But the study by Duke researchers found that auditory neurons don't behave that way. When the target was a sound, the neurons responded as if playing a game of tug-of-war, said lead author Jennifer Groh, a professor of psychology and neuroscience at Duke.   

"The neurons responded to nearly all sound locations. But how vigorously they responded depended on where the sound was," Groh said. "It's still teamwork, but a different kind. It's pretty cool that the neurons can use two different strategies, play two different games, at the same time."

Groh said the finding opens up a mystery: if neurons respond differently to visual and auditory stimuli at similar locations in space, then the underlying mechanism of how vision captures sound is now somewhat uncertain. "Which neurons are 'on' tells you where a visual stimulus is located, but how strongly they're 'on' tells you where an auditory stimulus is located," said Groh, who conducted the study with co-author Jung Ah Lee, a postdoctoral fellow at Duke. 

"Both of these kinds of signals can be used to control behavior, like eye movements, but it is trickier to envision how one type of signal might directly influence the other." 

The study assessed the responses of neurons in the rostral superior colliculus of the midbrain as two rhesus monkeys moved their eyes to visual and auditory targets.

The sensory targets -- light-emitting diodes attached to the fronts of nine speakers -- were placed 58 inches in front of the animals. The speakers spanned 24 degrees left to 24 degrees right of the monkeys in 6-degree increments.

The researchers then measured the monkeys' neural responses to bursts of white noise from the speakers and to the illumination of the lights.
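For reference, the target geometry can be reconstructed from these numbers. The short sketch below assumes the speakers sat on an arc at a constant 58 inches from the animal, which the article implies but does not state explicitly.

```python
# Illustrative reconstruction of the speaker positions: nine azimuths from
# 24 degrees left to 24 degrees right in 6-degree steps, 58 inches away.
# The constant-distance (arc) layout is an assumption about the rig.
import math

DISTANCE_IN = 58.0
azimuths_deg = range(-24, 25, 6)      # negative = left of the monkey

for az in azimuths_deg:
    rad = math.radians(az)
    x = DISTANCE_IN * math.sin(rad)   # lateral offset (inches)
    y = DISTANCE_IN * math.cos(rad)   # straight-ahead distance (inches)
    print(f"{az:+3d} deg -> x = {x:+6.1f} in, y = {y:5.1f} in")
```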

Groh said that the way the brain takes raw input in one form and converts it into another "may be broadly useful for more cognitive processes."

"As we develop a better understanding of how those computations unfold it may help us understand a little bit more about how we think," she said.

The study was funded by the National Institutes of Health, No. R01 NS50942.

A copy of the journal article is available through Duke Space, a university open-access archive of Duke research.