Complex skills involving dexterous object manipulation are key technologies for assistive robotics research and application. Because of the design complexity of robot controllers even for simple manipulation tasks, robots currently in use are mostly limited to specific tasks within known environments. In this project, we aim to develop an AI-empowered, general-purpose robotic system for dexterous manipulation of complex and unknown objects in rapidly changing, dynamic, and unpredictable real-world environments: the “Assistive Robotic System for Various Dressing Tasks through Robot Learning by Demonstration via Sim-to-Real Methods” (Learn-Assist). This will be achieved through intuitive embodied robotic demonstration between a human operator equipped with a motion-tracking device and a robot controller empowered with AI-based vision and learning skills. The primary use case for such a system is assistance with daily object-manipulation tasks, e.g., dressing in various clothes, for bedridden patients or elderly people with limited physical ability. To make such an embodied-demonstration robotic system for complex manipulation possible, without any assumptions about the object to be manipulated or the operating environment, the Learn-Assist project features unique innovations across several applied AI domains simultaneously: robotics, computer vision, and robot learning. Specifically, this project will make use of a dual-arm mobile robot and aims to achieve breakthroughs in the following research topics: object recognition and data-efficient learning, demonstration with embodiment technologies, learning by demonstration, and sim-to-real reinforcement learning. As such, the Learn-Assist project fits the objectives of this call for proposals on AI perfectly, specifically in “advancing the state of the art in AI in order to accomplish complex tasks for robots” and “allowing high-level interactions with human users”, and it contributes to core AI technologies.
Assistive Robotic System for Various Dressing Tasks through Robot Learning by Demonstration via Sim-to-Real Methods