Seeing silent speech - cortical correlates in relation to visual processing requirements
Ruth Campbell, Department of Human Communication Sciences, University College London
Abstract
Seeing silent speech activates cortical regions that support the processing of heard speech, including auditory cortex, posterior parts of the superior temporal sulcus (including but not limited to Wernicke's area) and inferior frontal regions (including but not limited to Broca's area). However, the conditions of presentation (from stilled speech images to natural speech movement) and the task conditions (online speech processing or offline categorisation of material) appear to affect the relative dominance of different parts of the circuit. This review explores the idea that two visual processing streams may contribute to the processing of visual speech: an anterior (ventro-frontal) stream for finely specified mouth information (lip and tongue position, mouth shape), which makes use of the object-processing capacities of the visual ventral stream and of the inferior frontal regions' capacity to specify details of facial gesture, and a posterior (dorsal) stream, especially sensitive to activation in V5, which makes use of the dynamic aspects of the display more directly via inferior parieto-frontal pathways. The streams may converge in p-STS and may have distinct frontal components.