Funder: EC Project Code: 101002711
Overall Budget: 1,998,610 EUR
Funder Contribution: 1,998,610 EUR
Partners: Carlos III University of Madrid
Our physical body is the interface between ourselves and the world around us. The way we perceive our body, including its appearance, configuration and motor abilities, shapes our behaviour, emotion and social functioning. New sensor-based and bodily sensory feedback devices, such as those brought by immersive virtual reality, make it possible to create Body Transformation Experiences (BTE). These perceptual illusions, such as being in a child’s body, change how one experiences one’s own body. Beyond entertainment, the emerging field of BTE engineering promises a significant leap for wellbeing and health applications, as well as for the embodiment of robotic devices, virtual avatars and smart clothing. However, the neuroscience of core sensory-driven changes in Mental Body Representations (MBR) currently consists of piecemeal experiments in tightly controlled settings that restrict body movement. This hinders the transfer of the basic principles discovered to complex real-world environments and practical problems. Through systematic and iterative research, BODYinTRANSIT will establish a theoretical ‘BTE design framework’ for individualized sensorial manipulation of MBR with long-lasting effects in everyday use contexts. The framework will stand on four scientific pillars to induce, measure, support, personalize and preserve body transformations: 1) the neuroscience of multisensory body perception; 2) data modelling of the links between MBR, behaviour, emotion and social functioning; 3) wearable-based embodied multisensory interaction design; and 4) field studies in real-life, on-the-move contexts with physically inactive users, somatic practitioners and users with body image concerns. Tangible outcomes will include design principles, data formats, models, measures and paradigms enabling and guiding BTE engineering innovations. BODYinTRANSIT envisions personalized BTE technologies able to measure MBR and adapt bodily sensory feedback to modify it: individually, online and while on the move.