Faced with the bewildering complexity of the nervous system, researchers in sensory neuroscience tend to separate individual modalities conceptually for isolated analysis. Indeed, sensory neuroscientists frequently identify themselves as investigators of a single modality, for example, vision or olfaction. While this reductionist approach is powerful in many respects, it entails a risk of oversimplification. Though more practical, reductionist analysis of sensory neurobiology contrasts sharply with the nervous system’s everyday operation, which generally integrates multiple crossmodal cues. Therefore, with advances in our modality-specific understanding and the emergence of novel experimental and analytical tools, it is time to address the interplay between different sensory modalities. We believe that this approach (MultiSenses) will be key to a more holistic understanding of sensory-guided behaviors.

Multisensory processing has traditionally been studied at the single-cell level, revealing several basic principles of crossmodal integration. Neural circuits, however, exhibit an additional layer of integration that transcends the complexity of any given cell, with dynamic characteristics whose analysis requires sophisticated computation. Ultimately, knowledge derived from cellular, network, and systems-level analyses needs to be combined to gain a realistic understanding of sensory-guided behaviors, adding yet another layer of complexity. To meet this multi-scale challenge and gain insights into the principles of crossmodal coding, this RTG will add