Active Perception and Robot Interactive Learning
The Active Perception and Robot Interactive Learning (APRIL) laboratory focuses on the co-evolution of artificial intelligence and robotic technologies, driving breakthrough research that enables robots to perform complex tasks in real-world domains such as manufacturing, logistics, healthcare, and agri-food. Robots should be able to learn new skills by interacting with humans and perceiving their environment using modern robot vision and learning techniques.
The overall research theme is the creation of robot perception and manipulation systems that augment robots' ability to work in complex and dynamic environments. This includes research in areas such as robot active perception, robot learning and manipulation, deep reinforcement learning, and Sim2Real learning, as well as the development of novel mobile manipulation systems, including smart end-effectors and sensing modules.
The APRIL Laboratory is part of the Advanced Robotics Department and is equipped with a range of state-of-the-art hardware that supports our research and stimulates the curiosity of our researchers. The equipment includes customized mobile manipulators, a Franka Emika robot arm, a Kinova GEN3 robot arm, a Schunk robot arm, robot grippers, dexterous robot hands, F/T sensors, RGB and depth cameras, and servers with high-performance GPU farms. The researchers come from multi-disciplinary backgrounds, including computer science, robotics, mechatronics, and electronics.
The following pictures show the mobile manipulators the APRIL Lab used in the past (KUKA MIIWA), is using at the moment (Rolling Panda), and will use in the future (a new quadruped mobile manipulator).
Research Topics
1. Learning control for robot dexterous manipulation tasks
2. Sim2Real deep reinforcement learning for robot manipulation
3. Object detection, segmentation and tracking for manipulation
4. Action-perception coupled whole-body control for mobile manipulators
5. One-shot/Few-shot learning for robot grasping and picking
Projects
1. AutoMAP: AutoMAP (EU FP7 EUROC AutoMAP) addresses applications of robotic mobile manipulation in unstructured environments such as those found at CERN. The project is based on use-case operations to be carried out on CERN’s flagship accelerator, the Large Hadron Collider. The main objective is to perform maintenance work with a remotely controlled mobile manipulator, reducing maintenance personnel’s exposure to hazards in the LHC tunnels, such as ionising radiation and oxygen deficiency. A second goal is to enable the robot, through robot learning technologies, to autonomously carry out the same tasks on collimators in the assembly facility during their initial build and quality assurance as it does in the tunnel.
2. Learn-Real: LEARN-REAL (EU H2020 CHIST-ERA Learn-Real) proposes to learn manipulation skills in simulation across variations of object, environment and robot, with an innovative toolset comprising: i) a simulator with realistic rendering of variations, allowing the creation of datasets and the evaluation of algorithms in new situations; ii) a virtual-reality interface to interact with the robots within their virtual environments and teach them object manipulation skills in multiple configurations of the environment; and iii) a web-based infrastructure for principled, reproducible and transparent benchmarking of learning algorithms for object recognition and manipulation by robots. AI-oriented technologies (deep reinforcement learning, deep learning, Sim2Real transfer learning) are the main focus of this project.
3. VINUM: VINUM (Grape Vine Recognition, Manipulation and Winter Pruning Automation) aims to apply state-of-the-art mobile manipulation platforms and systems, namely a wheeled mobile platform with a commercial full-torque-sensing arm and multiple sensors, and a quadruped mobile platform under development with a customized robotic arm and multiple sensors (in collaboration with Dr. Claudio Semini), to various maintenance and automation tasks in the vineyard, e.g., pruning and inspection, to tackle the shortage of skilled workers. VINUM targets robust solutions for outdoor applications that must deal with all kinds of natural objects. Two activities are carried out under the project using our mobile manipulators:
People
Lead Researcher: Fei Chen (contact: fei.chen -at- iit.it)
PhD Student (D2): Sunny Katyara (enrolled in UniNa, co-supervision with Prof. Fanny Ficuciello, Prof. Bruno Siciliano)
PhD Student (D1): Carlo Rizzardo (enrolled in UniGe, co-supervision with Prof. Darwin Caldwell)
PhD Student (D1): Tao Teng (enrolled in UniCatt, co-supervision with Prof. Matteo Gatti, Prof. Stefano Poni)
Fellow: Miguel Fernandes
Partners and Collaborations
We welcome all kinds of collaborations to speed up the development of intelligent robots for real-world applications, and we are always open to new and challenging opportunities to work with other groups and institutions. We also welcome interested students and scholars who would like to spend a short visit or a longer stay in the group for a PhD or Post-doc. In the past years, we have collaborated in robotics with: the German Aerospace Center (Germany), the European Organization for Nuclear Research (CERN, Switzerland), the Prisma Lab @ Università degli Studi di Napoli Federico II (Italy), the Robot Learning & Interaction Group @ Idiap Research Institute (Switzerland), the Department of Sustainable Crop Production @ Università Cattolica del Sacro Cuore (Italy), the Robotics Institute @ Shenzhen Academy of Aerospace Technology (China), and the Department of Mathematics @ École Centrale de Lyon (France).