Closing Keynote Lecture

Crossmodal interactions in perception, memory and learning: on the scene and behind the scene

 

Ladan Shams, Ph.D.
UCLA Psychology Department

What are the principles that govern crossmodal interactions? By comparing human observers’ multisensory perception with that of a Bayesian observer, we have found that human multisensory perception is consistent with Bayesian inference, both in determining when to combine crossmodal signals and in how to combine them. The former problem is a type of causal inference. Causal inference, which has been studied largely in the context of cognitive reasoning, is in fact a critical problem in perception. Our Bayesian causal inference model accounts for a wide range of phenomena, including numerous multisensory illusions as well as counter-intuitive effects such as partial integration and negative gain. Because it accounts both for perception of objects in the environment and for perception of one’s own body, our findings suggest that the same computational principles govern perception of the world and of the self.
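
To make the idea concrete, here is a minimal sketch of Bayesian causal inference for audiovisual localization, in the spirit of the model described above. All specifics are illustrative assumptions, not the lab's fitted values: Gaussian likelihoods for the visual and auditory measurements, a zero-mean Gaussian prior over source locations, a fixed prior probability of a common cause, and model averaging as the readout (one common choice among several possible decision rules).

```python
import numpy as np

# Illustrative parameters (assumed, not fitted): sensory noise and prior
# standard deviations in degrees, and the prior probability of a common cause.
sigma_v, sigma_a, sigma_p = 2.0, 10.0, 20.0
p_common = 0.5

def gauss(x, mu, var):
    """Gaussian density with mean mu and variance var."""
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

def posterior_common(x_v, x_a):
    """p(C=1 | x_v, x_a): posterior probability that the visual and
    auditory signals arose from a single common cause."""
    vv, va, vp = sigma_v**2, sigma_a**2, sigma_p**2
    # Likelihood of both signals given one source, with the source
    # location integrated out analytically.
    denom = vv * va + vv * vp + va * vp
    like_c1 = np.exp(-0.5 * ((x_v - x_a)**2 * vp + x_v**2 * va + x_a**2 * vv)
                     / denom) / (2 * np.pi * np.sqrt(denom))
    # Likelihood given two independent sources.
    like_c2 = gauss(x_v, 0.0, vv + vp) * gauss(x_a, 0.0, va + vp)
    return like_c1 * p_common / (like_c1 * p_common + like_c2 * (1 - p_common))

def estimate_visual(x_v, x_a):
    """Model-averaged visual location estimate."""
    vv, va, vp = sigma_v**2, sigma_a**2, sigma_p**2
    # Reliability-weighted fusion, optimal if there is a single cause.
    s_fused = (x_v / vv + x_a / va) / (1 / vv + 1 / va + 1 / vp)
    # Vision-only estimate, optimal if the causes are independent.
    s_alone = (x_v / vv) / (1 / vv + 1 / vp)
    pc = posterior_common(x_v, x_a)
    # Weighting the two estimates by the causal posterior produces
    # partial integration for intermediate discrepancies.
    return pc * s_fused + (1 - pc) * s_alone

# Example: a flash at 5 deg paired with a beep at 15 deg is pulled only
# partway toward the sound, because the discrepancy lowers p(C=1).
print(estimate_visual(5.0, 15.0))
```

Note how the behavior falls out of the inference: small audiovisual discrepancies yield near-complete fusion, large discrepancies yield near-complete segregation, and intermediate discrepancies yield the partial integration mentioned above.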

Crossmodal interactions also play an important role in various types of learning and memory. We have found that multisensory experience enhances and accelerates unisensory perceptual learning, instantaneously recalibrates unisensory representations, and improves unisensory episodic memory. These findings show that crossmodal interactions not only affect perception when signals from multiple modalities are present, but also influence subsequent unisensory processing. In fact, our recent findings show that in some cases crossmodal interactions can aid learning even in the absence of multisensory experience: training in the auditory modality produced substantial visual learning where visual training failed to produce any significant learning. In other words, outsourcing the training to a different modality was key to learning. I will discuss the variety of ways in which crossmodal interactions can benefit learning and memory. Altogether, these findings suggest that crossmodal interactions influence multisensory and unisensory perception, memory, and learning in a robust and rapid fashion.



About the Speaker: Ladan Shams is a professor of Psychology, BioEngineering, and Neuroscience at UCLA, and the director of the Multisensory Perception Laboratory at UCLA. Dr. Shams received her Ph.D. in Computer Science at USC and her postdoctoral training in Cognitive Neuroscience at Caltech. Her research focuses on multisensory perception and learning in humans. She has served as Associate Editor of Frontiers in Integrative Neuroscience and Frontiers in Human Neuroscience, as an Action Editor of Psychonomic Bulletin & Review, and is on the editorial board of Multisensory Research. She is a member of the National Science Foundation College of Reviewers, the Society for Neuroscience, the Vision Sciences Society, and the International Multisensory Research Forum. She was featured by The Chronicle of Higher Education as one of “five scholars to watch” and is frequently consulted as an expert by media outlets such as NPR, the BBC, and CNN.

 

Event Timeslots (1)

Sunday, June 17