"We don't live in just an auditory environment. We typically have multiple senses interacting, with very important and very rich cues coming from visual input."
— Aaron Mitchel, assistant professor of psychology, Swanson Fellow in the Sciences and Engineering
Try to hold a conversation in a noisy restaurant, and you'll understand the premise of Aaron Mitchel's research: We depend on more than sound to figure out what someone is saying.
The field of speech perception has traditionally focused on cues in the auditory signal. However, the way we react when we don't comprehend what someone is saying demonstrates that more is going on. When we confront background noise, an unfamiliar accent, or a new language, we shift our gaze from the speaker's eyes to the lips.
"We don't live in just an auditory environment," Mitchel says. "We typically have multiple senses interacting, with very important and very rich cues coming from visual input." His research is focused on teasing out exactly what those visual cues are and how we use them.
One of the most difficult aspects of understanding speech is segmentation — finding where the breaks fall between words. "Speech is a constant flow, so figuring out where a word begins and ends is actually very difficult," Mitchel says. "I've found that we can use facial information to help find those boundaries."
Visual cues also affect how we make sense of variability between different speakers. "I call this the 'you say tomato' problem," Mitchel says. "No two people say the same sounds the exact same way."
How do we know when to ignore variability between speakers and recognize that different sounds carry the same meaning? By asking test subjects to distinguish ambiguous sounds associated with different speakers, Mitchel has found that we adjust how we perceive speech depending on who we think is producing the sound.
Understanding how visual cues inform speech perception has several potential applications. For example, speech segmentation is particularly difficult for individuals with hearing impairments, such as those with cochlear implants, and visual cues may help to overcome this challenge.
Training parents to accentuate the important visual cues or training children with implants to pay close attention to those cues could help cochlear-implant recipients better interpret what they hear. "First we have to do the basic research to know what the important visual cues are," Mitchel says.
Posted October 2012