The aim of this project is to explore the properties of human behavior in ecological VR settings using psychophysics, kinematics, electromyography (EMG) and autonomic responses such as heart rate, all recorded simultaneously while participants experience immersive virtual environments. Our first step into this complex and fascinating research avenue concerns the translation and adaptation of traditional cognitive assessments into a virtual-reality (VR)-based platform, thereby generating multimodal data (e.g., combining cognitive, sensorimotor and affective stimulation). Existing cognitive assessment tools have poor ecological validity, generally failing to comprehensively assess patients' everyday behaviors. VR tests more vividly mimic relevant real-life activities, while providing a rich basis for exploring the multimodal interactions that give rise to human behavior with unprecedented depth and precision. Notably, this approach may also have major clinical implications, as it may dramatically improve the detection of subtle impairments of cognition and of cognitive-motor interaction, as well as unravel how affective states influence performance. For instance, we found that VR-based cognitive tests yielded greater differentiation between participant age groups, and by analyzing hand-movement kinematics during a cognitive task we can tap into the cognitive processes underlying the participant's performance. Our ongoing studies on hand kinematics and cognitive-locomotion interdependencies will shed new light on the individual behavioral patterns accounting for cognitive capabilities and disabilities.
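
To illustrate the kind of kinematic measures such recordings afford, the sketch below derives simple hand-movement features (peak speed and its timing) from a 3D position trace, as might be streamed from a VR controller or hand tracker. This is a hypothetical example, not the project's actual analysis pipeline; the function name, the 90 Hz sampling rate, and the finite-difference approach are all illustrative assumptions.

```python
import numpy as np

def kinematic_features(positions, fs=90.0):
    """Derive simple hand-kinematic features from a 3D trajectory.

    positions : (n, 3) array of hand positions in metres, sampled
                at fs Hz (90 Hz is a typical VR tracking rate).
    Returns (peak speed in m/s, time of peak speed in seconds).
    Illustrative sketch only, not the project's analysis code.
    """
    positions = np.asarray(positions, dtype=float)
    # Finite-difference velocity along the time axis, then
    # Euclidean speed per sample.
    velocity = np.gradient(positions, 1.0 / fs, axis=0)
    speed = np.linalg.norm(velocity, axis=1)
    peak_idx = int(np.argmax(speed))
    return speed[peak_idx], peak_idx / fs

# Example: a straight reach along x at a constant 1 m/s.
fs = 10.0
t = np.arange(0.0, 1.0, 1.0 / fs)
pos = np.stack([t, np.zeros_like(t), np.zeros_like(t)], axis=1)
peak_speed, t_peak = kinematic_features(pos, fs=fs)
```

Features like these (and richer ones, such as submovement counts or trajectory curvature) can then be aligned with EMG, heart-rate and task-event streams for multimodal analysis.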