Perception occurs through multiple sensory channels. The brain uses this rich information to make guesses about the objects and situations encountered in the environment. In this process, information from different sensory modalities is often combined to improve recognition. Sometimes, the use of different modalities even leads to multisensory illusions such as the ventriloquist effect or the McGurk-MacDonald effect, which show that out of various inputs the brain tends to produce a unified, sometimes misleading, percept.
The team is interested in better understanding the integration of multisensory information in cortical circuits, and how it might lead to a unified percept.
To do so, we use state-of-the-art recording techniques such as two-photon calcium imaging in mice to observe the activity of large neuronal populations during multisensory perception. From these observations we hope to derive the principles by which a given sensory modality influences the neuronal representation of another modality, which will help us to understand the phenomenon of sensory perception in its full natural context.
Other interests of the team are:
- The principles of sensory discrimination learning. We combine theoretical models of learning with mouse behavior, using state-of-the-art automated behavioral setups, to understand how animals adapt to different situations.
- The mechanisms of auditory perception in mice and their relation to human hearing. For this project we combine behavioral tasks in mice with two-photon calcium imaging to understand how the brain processes time-varying sounds and how these often non-linear mechanisms affect perception. This project is performed in collaboration with the Perception and Sound Design team at IRCAM (Institut de Recherche et Coordination Musique/Acoustique, Paris).