High visuo-auditory integration performance in deaf subjects with cochlear implants

Pascal Barone, Cerveau et Cognition, CNRS UMR 5549, Toulouse, France

Abstract
Visual information derived from lipreading improves speech comprehension in noisy environments. Because patients who have received a cochlear implant (CI) are highly sensitive to the presentation of speech in noise, we investigated the role of visuo-auditory interactions in speech intelligibility. In a previous study, we showed that CI patients achieve greater word recognition with lipreading than controls, an advantage that remains stable several years after implantation. We then compared the visuo-auditory performance of CI patients with that of normal-hearing subjects (NHS) tested under paradigms involving different acoustic degradations of the auditory stimuli. In a "masking" protocol, auditory (A) and audiovisual (AV) words were combined with a white noise signal. In a "simulation" protocol, we applied the acoustic processing of a cochlear implant, using speech-spectrum-shaped noise in which the global temporal and amplitude information of the envelope signal is preserved but the fine temporal cues within each spectral component are removed. Our results show that, when subjects are compared at similar levels of auditory performance, CI patients exhibit a higher visuo-auditory gain than NHS throughout the degraded protocols. These results suggest that CI patients have developed specific skills in visuo-auditory interaction, leading to optimized integration of visual temporal cues in the absence of fine spectro-temporal information.
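For illustration only, a minimal sketch of a channel-vocoder of the kind used in such "simulation" protocols is given below, written in Python with NumPy/SciPy; the band count, filter order, and cutoff frequencies are assumptions for the example, not the parameters used in the study. Each spectral band's temporal envelope is extracted and used to modulate band-limited noise, so the fine temporal structure within each band is discarded while the global envelope is preserved.

```python
# Illustrative noise-vocoder sketch (assumed parameters, not the study's exact processing).
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def vocode(signal, fs, n_bands=8, f_lo=100.0, f_hi=7000.0):
    """Keep per-band temporal envelopes, replace fine structure with noise.

    Assumes fs >= 16 kHz so that f_hi stays below the Nyquist frequency.
    """
    edges = np.geomspace(f_lo, f_hi, n_bands + 1)   # log-spaced band edges (assumption)
    rng = np.random.default_rng(0)
    out = np.zeros(len(signal), dtype=float)
    for lo, hi in zip(edges[:-1], edges[1:]):
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfiltfilt(sos, np.asarray(signal, dtype=float))     # analysis band
        env = np.abs(hilbert(band))                                  # temporal envelope
        noise = sosfiltfilt(sos, rng.standard_normal(len(band)))     # band-limited noise carrier
        out += env * noise                           # envelope preserved, fine cues removed
    return out / (np.max(np.abs(out)) + 1e-12)       # normalize to avoid clipping
```

In this reading, the "masking" protocol simply corresponds to adding white noise to the unprocessed words at a chosen signal-to-noise ratio, whereas the "simulation" protocol passes the words through a processing stage of the kind sketched above.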

