At the PILOT station, the RT-UI subsystem shall use multimodal methods of user interaction, exploiting the potential of augmented and virtual reality to create systems that assist the operator in the task, are easy to learn and use, and account for operator ergonomics requirements. The ultimate goal is to develop intuitive teleoperation/telepresence systems that use a new generation of robots for interventions involving locomotion and manipulation in unstructured scenarios. This subsystem will adopt a plug-in architecture; a main computer shall be responsible for the communications between the PILOT station and the FIELD robot. The software, hardware, and communication interfaces for the different modules will be implemented in this main computer. The bidirectional interfaces shall manage the data flow of the teleoperation scheme between the PILOT and FIELD systems.
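As an illustration only, the plug-in architecture described above can be sketched as a lightweight message hub running on the main computer, to which the individual modules (VR interface, haptics, video, point cloud) register as subscribers. All class and message names below are hypothetical, not part of the actual RT-UI design:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple


@dataclass
class CommandMsg:
    """Hypothetical PILOT -> FIELD command message."""
    pose: Tuple[float, ...]   # desired end-effector pose (x, y, z, qx, qy, qz, qw)
    gripper: float            # gripper opening command in [0, 1]


@dataclass
class FeedbackMsg:
    """Hypothetical FIELD -> PILOT feedback message."""
    wrench: Tuple[float, ...]  # measured force/torque, used for haptic rendering
    frame_id: int              # video / point-cloud frame this feedback refers to


class MainComputer:
    """Routes messages bidirectionally between registered plug-in modules
    and the FIELD robot link (publish/subscribe pattern)."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable]] = {}

    def register(self, topic: str, handler: Callable) -> None:
        """A plug-in module subscribes a handler to a named topic."""
        self._subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic: str, msg) -> None:
        """Deliver a message to every module subscribed to the topic."""
        for handler in self._subscribers.get(topic, []):
            handler(msg)
```

In such a scheme, adding or removing a module (e.g. a new haptic master device) would only require registering or deregistering its handlers, without modifying the other modules.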
This subsystem shall be developed within the Advanced Robotics (ADVR) lab of the IIT. The proposed interface combines the following capabilities:
- VR interface for intuitive real-time bilateral teleoperation in the simulated and real worlds,
- Motion mapping between master and remote robot, agnostic to the user viewpoint,
- Control commands through the VR motion controllers as well as commercial haptic master devices,
- Haptic feedback from the remote environment based on vision and sensing feedback,
- Physics plug-in from the simulation environment, with real-time scene updates,
- Real-time point-cloud streaming for augmented-virtuality-based 3D scene reconstruction,
- Real-time video streaming for first-person view of the scene.
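To illustrate the viewpoint-agnostic motion mapping listed above, the sketch below transforms a motion increment expressed in the operator's headset frame into the remote robot's base frame. It is a simplified, planar (yaw-only) version under assumed frame conventions; a real implementation would use full SO(3) rotations, and the function name and arguments are hypothetical:

```python
import math


def map_motion(delta_user, headset_yaw, robot_base_yaw):
    """Map a translation increment from the operator's (headset) frame into
    the remote robot's base frame, so that "forward" on the motion controller
    always means "forward" for the robot regardless of where the operator is
    looking in VR.

    delta_user:     (dx, dy, dz) increment in the headset frame (x forward)
    headset_yaw:    headset heading in the world frame, radians
    robot_base_yaw: robot base heading in the world frame, radians

    Illustrative only: rotation about the vertical axis (yaw), z passed through.
    """
    # Rotate user-frame increment into the world frame, then into the
    # robot base frame: R(-robot_yaw) * R(headset_yaw) = R(headset_yaw - robot_yaw).
    yaw = headset_yaw - robot_base_yaw
    c, s = math.cos(yaw), math.sin(yaw)
    dx, dy, dz = delta_user
    return (c * dx - s * dy, s * dx + c * dy, dz)
```

For example, if the operator has turned 90 degrees in VR while the robot base has not moved, a "push forward" increment on the controller is rotated accordingly before being sent to the robot, keeping the mapping consistent with the operator's current viewpoint.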