Closing Keynote Lecture

Crossmodal interactions in perception, memory and learning: on the scene and behind the scene

Ladan Shams, Ph.D., UCLA Psychology Department

What are the principles that govern crossmodal interactions? Comparing human observers' multisensory perception with that of a Bayesian observer, we have found that human multisensory perception is consistent with Bayesian inference, both in determining when to combine crossmodal information and in how to combine it. The former problem is a type of causal inference. Causal inference, which has largely been studied in the context of cognitive reasoning, is in fact a critical problem in perception. Our Bayesian causal inference model accounts for a wide range of phenomena, including many multisensory illusions as well as counter-intuitive effects such as partial integration and negative gain. In accounting for both perception of…
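The Bayesian causal inference idea described in the abstract can be illustrated with a minimal sketch: an observer receives two noisy cues (say, an auditory and a visual location estimate), computes the posterior probability that both arose from a single cause, and blends the fused and segregated location estimates accordingly (model averaging). All parameter names and values below are illustrative assumptions, not the speaker's actual model or fits; Gaussian likelihoods and a zero-mean Gaussian spatial prior are assumed throughout.

```python
import math

def causal_inference_estimate(x_a, x_v, sigma_a, sigma_v,
                              sigma_p=10.0, p_common=0.5):
    """Toy Bayesian causal inference for two noisy cues.

    x_a, x_v      : auditory and visual cue measurements (e.g., degrees)
    sigma_a/v     : cue noise standard deviations (assumed)
    sigma_p       : SD of a zero-mean Gaussian prior over location (assumed)
    p_common      : prior probability that both cues share one cause (assumed)
    Returns (posterior probability of a common cause, auditory estimate).
    """
    # Likelihood of the cue pair under a single common cause (C = 1),
    # obtained by integrating out the shared source location:
    var_sum = (sigma_a**2 * sigma_v**2
               + sigma_a**2 * sigma_p**2
               + sigma_v**2 * sigma_p**2)
    like_c1 = math.exp(-0.5 * ((x_a - x_v)**2 * sigma_p**2
                               + x_a**2 * sigma_v**2
                               + x_v**2 * sigma_a**2) / var_sum) \
        / (2 * math.pi * math.sqrt(var_sum))

    # Likelihood under two independent causes (C = 2):
    like_a = math.exp(-0.5 * x_a**2 / (sigma_a**2 + sigma_p**2)) \
        / math.sqrt(2 * math.pi * (sigma_a**2 + sigma_p**2))
    like_v = math.exp(-0.5 * x_v**2 / (sigma_v**2 + sigma_p**2)) \
        / math.sqrt(2 * math.pi * (sigma_v**2 + sigma_p**2))
    like_c2 = like_a * like_v

    # Posterior probability of a common cause (Bayes' rule):
    post_c1 = (like_c1 * p_common) / (like_c1 * p_common
                                      + like_c2 * (1 - p_common))

    # Precision-weighted estimates under each causal structure:
    fused = ((x_a / sigma_a**2 + x_v / sigma_v**2)
             / (1/sigma_a**2 + 1/sigma_v**2 + 1/sigma_p**2))
    seg_a = (x_a / sigma_a**2) / (1/sigma_a**2 + 1/sigma_p**2)

    # Model averaging: blend the two estimates by the causal posterior.
    estimate_a = post_c1 * fused + (1 - post_c1) * seg_a
    return post_c1, estimate_a
```

Nearby cues yield a high posterior for a common cause, so the auditory estimate is pulled toward the visual cue (full integration); widely separated cues yield a low posterior, so integration is only partial or absent, which is how such a model can capture both multisensory illusions and breakdown of integration at large discrepancies.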

Opening Keynote Lecture

Multisensory Flavour Perception: Insights for/from the Spatial Senses

Prof. Charles Spence, Head of the Crossmodal Research Laboratory, Oxford University

Food is both fundamental to our survival and fun to study. Furthermore, there is nothing that gets the brain going quite like the sight or smell of one's favourite food when hungry.[1] And, as the eminent British biologist J. Z. Young once noted, it is perhaps no coincidence that the mouth and the brain lie so close together in most species.[2] No wonder, then, that the brain rapidly estimates the energy density of potential food sources in the environment and devotes our limited attentional resources accordingly.[3] At the same time, however, it is much harder, practically speaking, to study flavour (i.e., the chemical senses) than it is to study the spatial senses of vision, hearing, and touch. This…