Through approaches in immersive mixed reality, telerobotics, and biomechanical simulation and control, our objective is to advance the understanding of how humans interact with remote environments.
We study how the combination of immersive mixed reality (MR) interfaces, intuitive control devices, and real-time data from remote sensors (RGB-D cameras, microphones, F/T sensors, etc.) can enable a high-fidelity perception-action loop, offering the human user a real-time, immersive interaction experience in telerobotics applications, as sketched below.
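To make the perception-action loop concrete, the following is a minimal, self-contained sketch of how remote sensor streams, an MR interface, a control device, and a remote robot could be tied together at a fixed rate. All names here (SensorFrame, RemoteSensors, ControlDevice, RemoteRobot, MRInterface, run_loop) are hypothetical placeholders, not an existing API; a real system would replace the stubs with actual sensor drivers, a networked robot interface, and an MR rendering pipeline.

```python
import time
from dataclasses import dataclass, field


@dataclass
class SensorFrame:
    """One synchronized snapshot of the remote sensor streams."""
    timestamp: float
    rgbd: list = field(default_factory=list)   # placeholder for an RGB-D image
    audio: list = field(default_factory=list)  # placeholder for an audio buffer
    wrench: tuple = (0.0,) * 6                 # placeholder force/torque reading


class RemoteSensors:
    """Stub standing in for real remote sensor drivers."""
    def read(self) -> SensorFrame:
        return SensorFrame(timestamp=time.time())


class ControlDevice:
    """Stub for an intuitive control device (e.g. a hand controller)."""
    def read_command(self) -> tuple:
        return (0.0, 0.0, 0.0)  # a real device would return the user's motion command


class RemoteRobot:
    """Stub for the remote robot receiving user commands."""
    def send_command(self, command: tuple) -> None:
        pass  # a real implementation would transmit the command over the network


class MRInterface:
    """Stub for the immersive MR display and feedback."""
    def render(self, frame: SensorFrame) -> None:
        pass  # a real implementation would update the headset view and haptics


def run_loop(rate_hz: float = 60.0, duration_s: float = 1.0) -> None:
    """Run the perception-action loop at a fixed rate for a short demo period."""
    sensors, device = RemoteSensors(), ControlDevice()
    robot, interface = RemoteRobot(), MRInterface()
    period = 1.0 / rate_hz
    end = time.time() + duration_s
    while time.time() < end:
        frame = sensors.read()           # perception: gather remote sensor data
        interface.render(frame)          # present it immersively to the user
        command = device.read_command()  # action: capture the user's intent
        robot.send_command(command)      # forward it to the remote robot
        time.sleep(period)               # keep the loop close to the target rate


if __name__ == "__main__":
    run_loop()
```

The fixed-rate structure reflects the emphasis on real-time fidelity: perception, rendering, and command forwarding all happen within each cycle of the loop.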
VR is a promising technology for safety training, providing risk-free, immersive learning, among other benefits. We explore how xR technologies, combining VR with spatially contextualized physical interaction, can make training sessions more effective for the acquisition of safety behaviour while increasing the trainee's engagement.
The overarching objective of our research is to improve our knowledge of how humans interact with remote environments, real or virtual. We investigate how predictive full-body biomechanical simulations and biophysiological parameter tracking, combined with xR, can help us understand a human user's behaviour during their interaction with remote environments; a simple example of aligning physiological data with xR interaction events is sketched below.
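As a minimal, hypothetical illustration of combining biophysiological tracking with xR interaction data, the sketch below time-aligns a heart-rate stream with logged interaction events so that each event can be analysed against the user's physiological state. The function name, data, and event labels are illustrative placeholders, not a description of our actual pipeline.

```python
from bisect import bisect_left


def nearest_sample(timestamps: list[float], values: list[float], t: float) -> float:
    """Return the sample value whose timestamp is closest to t."""
    i = bisect_left(timestamps, t)
    if i == 0:
        return values[0]
    if i == len(timestamps):
        return values[-1]
    before, after = timestamps[i - 1], timestamps[i]
    return values[i] if after - t < t - before else values[i - 1]


# Example: heart-rate samples (seconds, bpm) and xR interaction events (seconds, label).
hr_times = [0.0, 1.0, 2.0, 3.0, 4.0]
hr_values = [72.0, 74.0, 80.0, 85.0, 78.0]
events = [(1.2, "grasp_object"), (3.4, "collision_warning")]

for t, label in events:
    print(label, "at", t, "s ->", nearest_sample(hr_times, hr_values, t), "bpm")
```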