As a rapid stream of impulses arrives from the hair cells in the ear, the auditory system filters out a few simple, discrete aspects of complex sounds. Information about how high- or low-pitched a sound is, how loud it is, and how often it is heard is then channeled along separate nerve pathways to higher-order processing centers in the brain, where millions of auditory neurons assemble the raw data into a recognizable sound pattern.
The hair cells themselves contribute to this filtering process by responding to different frequencies at different locations along the basilar membrane. The cells at the bottom of the membrane respond more readily when they detect high-frequency sound waves, while those at the top are more sensitive to low-frequency sounds.
David Corey compares the arrangement to the strings of a grand piano, with the high notes at the base of the cochlea, where the basilar membrane is narrow and stiff, and the bass notes at the apex, where the membrane is wider and more flexible.
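This frequency-to-place mapping along the basilar membrane can be described quantitatively. A minimal sketch using the Greenwood function with its commonly cited human-cochlea parameters (an assumption for illustration; the article is describing the general arrangement, and the owl's cochlea has its own fit):

```python
def greenwood_frequency(x):
    """Greenwood function for the human cochlea:
    f(x) = A * (10**(a*x) - k), where x is the fractional distance
    along the basilar membrane from the apex (x=0, low frequencies)
    to the base (x=1, high frequencies).
    A=165.4 Hz, a=2.1, k=0.88 are the standard human parameters."""
    A, a, k = 165.4, 2.1, 0.88
    return A * (10 ** (a * x) - k)

greenwood_frequency(0.0)  # roughly 20 Hz at the wide, flexible apex
greenwood_frequency(1.0)  # roughly 20 kHz at the narrow, stiff base
```

The exponential form captures the piano-like layout Corey describes: each equal step along the membrane multiplies the characteristic frequency, just as each octave on a keyboard doubles it.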
Hair cells also convey basic information about the intensity and duration of sounds. The louder a sound is at any particular frequency, the more vigorously hair cells tuned to that frequency respond, while their signaling pattern provides information about the timing and rhythm of a sound.
Konishi hypothesized that such timing and intensity information was vital for sound localization. So he placed microphones in the ears of owls to measure precisely what they were hearing as the portable loudspeaker rotated around their head.
He then recorded the differences in time and intensity as sounds reached each of the owl's ears. The differences are extremely slight. A sound that originates at the extreme left of the animal will arrive at the left ear about 200 microseconds (millionths of a second) before it reaches the right ear. (In humans, whose sound localization abilities are keen but not on a par with those of owls, the difference between a similar sound's time of arrival in each ear would be about three times greater.)
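The scale of these time differences follows from simple geometry. A sketch using Woodworth's spherical-head approximation, with illustrative head radii chosen to reproduce the article's figures (the radii and function name are assumptions, not measurements from Konishi's experiments):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees C

def itd_microseconds(azimuth_deg, head_radius_m):
    """Approximate interaural time difference (ITD) from Woodworth's
    spherical-head model: ITD = (r/c) * (theta + sin(theta)),
    where theta is the sound's azimuth measured from straight ahead."""
    theta = math.radians(azimuth_deg)
    itd_s = (head_radius_m / SPEED_OF_SOUND) * (theta + math.sin(theta))
    return itd_s * 1e6  # convert seconds to microseconds

# Illustrative radii (assumed values):
owl = itd_microseconds(90, 0.026)     # near 200 microseconds at the extreme side
human = itd_microseconds(90, 0.0875)  # roughly three times larger
```

A source directly ahead (azimuth 0) gives an ITD of zero, matching Konishi's observation that the time differences shrink as the speaker moves toward the center of the owl's head.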
As the sound source was moved toward the center of the owl's head, these time differences diminished, Konishi observed. Differences in the intensity of sounds entering the two ears occurred as the speaker was moved up and down, mostly because the owl's ears are asymmetrical: the left ear is higher than eye level and points downward, while the right ear is lower and points upward.
Based on his findings, Konishi delivered signals separated by various time intervals and volume differences through tiny earphones inserted into the owls' ear canals. Then he observed how the animals responded.
Because owls' eyes are fixed in their sockets and cannot rotate, the animals turn quickly in the direction of a sound, a characteristic movement. By electronically monitoring these head-turning movements, Konishi and his assistants showed that the owls would turn toward a precise location in space corresponding to the time and intensity differences in the signals. This suggested that owls fuse the two sounds that are delivered to their two ears into an image of a single source (in this case, a phantom source).
"When the sound in one ear preceded that in the other ear, the head turned in the direction of the leading ear. The longer we delayed delivering the sound to the second ear, the farther the head turned," Konishi recalls.
Next, Konishi tried the same experiment on anesthetized owls to learn how their brains carry out binaural fusion. Years earlier, he and Knudsen had identified space-specific neurons in the auditory area of the owl's midbrain that fire only in response to sounds coming from specific areas in space. Now Konishi and his associates found that these space-specific neurons react to specific combinations of signals, corresponding to the exact direction in which the animal turned its head when phantom sounds were played. "Each neuron was set to a particular combination of time and intensity difference," Konishi recalls.
Konishi then decided to trace the pathways of neurons that carry successively more refined information about the timing and intensity of sounds to the owl's midbrain. Such information is first processed in the cochlear nuclei, two clusters of neurons that receive signals from the inner ear. Working with Terry Takahashi, who is now at the University of Oregon, Konishi showed that one of the nuclei in this first way station signals only the timing of each frequency band, while the other records intensity. The signals are then transmitted to two higher-order processing stations before reaching the space-specific neurons in the owl's midbrain.
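The timing pathway is widely modeled as a bank of coincidence detectors (the Jeffress model): each neuron receives input from both ears through axonal delay lines and responds most strongly when its internal delay cancels the external time difference, so that spikes from the two ears arrive together. A minimal sketch under those assumptions; the delay values, response function, and names are illustrative, not taken from Konishi's recordings:

```python
def best_matching_delay(itd_us, internal_delays_us):
    """Each model neuron adds a fixed internal delay (in microseconds)
    to the input from one ear. The neuron whose internal delay most
    nearly cancels the external ITD sees coincident spikes and fires
    hardest; return that neuron's internal delay."""
    # Response falls off as the residual timing mismatch grows.
    responses = [1.0 / (1.0 + abs(itd_us - d)) for d in internal_delays_us]
    best = max(range(len(responses)), key=responses.__getitem__)
    return internal_delays_us[best]

# A map of neurons tuned to ITDs from -200 to +200 microseconds,
# spanning the owl's full range, in 25-microsecond steps:
delay_map = list(range(-200, 201, 25))
best_matching_delay(120, delay_map)  # -> 125, the closest tuned neuron
```

In this scheme the identity of the most active neuron, not its firing rate alone, encodes where the sound came from, which is consistent with Konishi's finding that each space-specific neuron is set to a particular combination of cues.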
One more experiment proved conclusively that the timing and intensity of sounds are processed along separate pathways. When the researchers injected a minute amount of local anesthetic into one of the cochlear nuclei (the magnocellular nucleus), the space-specific neurons higher in the brain stopped responding to differences in time, though their response to differences in intensity was unchanged. The converse occurred when neurons carrying intensity information were blocked.
"I think we are dealing with basic principles of how an auditory stimulus is processed and analyzed in the brain. Different features are processed along parallel, almost independent pathways to higher stations, which create more and more refined neural codes for the stimulus," says Konishi. "Our knowledge is not complete, but we know a great deal. We are very lucky."
Konishi has been able to express the mechanical principles of the owl's sound localization process as a step-by-step sequence. He has collaborated with computer scientists at Caltech in developing an "owl chip" that harnesses the speed and accuracy of the owl's neural networks for possible use in computers.
At Stanford University, Eric Knudsen has been conducting experiments on owls fitted with prism spectacles to determine whether distortions in their vision affect their sound localization abilities. Despite their exceptionally acute hearing, he has found, the owls trust their vision even more. When they wear distorting prisms, their hunting skills deteriorate over a period of weeks as their auditory systems try to adapt to the optical displacement of the prisms.
"The visual system has ultimate control and basically dictates how the brain will interpret auditory localization cues," Knudsen says.
He is also examining a particular network of neurons in the animals' brains where he believes auditory and visual system signals converge. "This network makes it possible for the owls to direct their eyes and attention to a sound once it's heard," Knudsen explains. His research is part of a new wave of studies that focus not just on single sensory pathways, but on how the brain combines information it receives from many different sources.