Like humans, robots can learn to walk

Following a comparable path, robots could learn to move and walk as human beings do. The goal of the MEMMO project is to develop a unified approach to motion generation for complex robots with arms and legs.

How do we learn to move and walk as a child? We have a memory of motion, we go through a lot of trial and error, and, of course, we use our senses! The MEMMO – Memory of Motion – project relies on a similar approach, built on three innovative components. First, a massive set of pre-computed optimal motions, or trajectories, is generated offline and compressed into a "memory of motion". Then, during execution, these trajectories are recovered and adapted to new situations with real-time model predictive control, generalizing to dynamically changing environments. Finally, sensors such as vision, inertial and haptic sensors are exploited for feedback control that goes beyond the basic robot state, with a focus on robust and adaptive behaviour.
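The first two components can be illustrated with a minimal sketch: trajectories are precomputed offline for a set of situations, stored, and at run time the closest one is retrieved and locally adapted to the new situation. All names here are illustrative, the "trajectories" are simple 1-D interpolations standing in for the output of a real trajectory optimizer, and this is not MEMMO's actual implementation.

```python
# Offline: precompute trajectories for a grid of goal positions.
# Each trajectory is a list of waypoints linearly interpolated to the goal;
# in a real system these would come from a trajectory optimizer.
def precompute_memory(goals, n_steps=10):
    memory = {}
    for g in goals:
        memory[g] = [g * t / (n_steps - 1) for t in range(n_steps)]
    return memory

# Online: retrieve the closest precomputed trajectory and cheaply adapt it
# (here: progressively shift the waypoints toward the new goal). In practice
# the retrieved trajectory would warm-start a model predictive controller.
def retrieve_and_adapt(memory, new_goal):
    nearest = min(memory, key=lambda g: abs(g - new_goal))
    traj = memory[nearest]
    offset = new_goal - nearest
    n = len(traj)
    return [x + offset * i / (n - 1) for i, x in enumerate(traj)]

memory = precompute_memory(goals=[0.0, 1.0, 2.0, 3.0])
traj = retrieve_and_adapt(memory, new_goal=2.4)
print(round(traj[-1], 2))  # adapted trajectory ends at the new goal: 2.4
```

The point of the design is that the expensive optimization happens offline; the online step is only a lookup plus a cheap correction, which is what makes real-time control feasible.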

Humanoid robots, exoskeletons and quadruped robots

The project is organized around applications designed by the end-user partners. These include a humanoid robot performing advanced locomotion and industrial tooling tasks for aircraft assembly, an advanced exoskeleton paired with a paraplegic patient demonstrating dynamic walking, and a quadruped robot performing inspection tasks on a construction site. To achieve these goals, the project unites nine European partners and the Idiap Research Institute.

Faster encoding of movements

Idiap is responsible for the research aspects related to the representation and encoding of movements. Our institute's objective is to compress motion data for fast recognition and adaptive motion synthesis. A probabilistic approach can be used to generalize movements to new situations, including new environments and initial conditions. To do so, models are developed to facilitate the integration between learning and control, with trajectory distributions that are adapted to the current situation and can be used to quickly generate trajectory samples for further optimization.
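One simple way to realize this idea is to encode demonstrated trajectories together with their context (e.g. a goal position) as a joint Gaussian, then condition on a new context to obtain an adapted trajectory distribution from which samples can be drawn. The sketch below uses a single Gaussian and synthetic data for brevity; the probabilistic models developed in the project are richer, so treat the data, variable names, and model choice as illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic demonstrations: a 1-D context (a goal coordinate) and a
# 5-waypoint trajectory that depends linearly on it, plus noise.
contexts = rng.uniform(0.0, 2.0, size=(200, 1))
phases = np.linspace(0.0, 1.0, 5)
trajs = contexts * phases + 0.01 * rng.standard_normal((200, 5))

# Encode the joint distribution p(context, trajectory) as one Gaussian.
data = np.hstack([contexts, trajs])
mu = data.mean(axis=0)
sigma = np.cov(data.T)

def condition(mu, sigma, context, d_in=1):
    """Gaussian conditioning: p(trajectory | context)."""
    mu_i, mu_o = mu[:d_in], mu[d_in:]
    s_ii = sigma[:d_in, :d_in]
    s_oi = sigma[d_in:, :d_in]
    s_oo = sigma[d_in:, d_in:]
    gain = s_oi @ np.linalg.inv(s_ii)
    mu_c = mu_o + gain @ (context - mu_i)
    sigma_c = s_oo - gain @ s_oi.T
    return mu_c, sigma_c

# Adapt to a new situation and draw candidate trajectories
# that could seed a downstream optimizer.
mu_c, sigma_c = condition(mu, sigma, np.array([1.5]))
samples = rng.multivariate_normal(mu_c, sigma_c, size=3)
print(mu_c[-1])  # mean final waypoint, close to the 1.5 context
```

Because conditioning a Gaussian is a closed-form matrix operation, adapting the distribution to a new situation and sampling from it is fast enough to run inside a control loop.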

From January 28th to 31st, Idiap hosted the MEMMO Winter School. This internal event gathered 40 participants for four days of lectures, labs and invited talks. A great way to foster human interaction!

More information:

Idiap’s partners in the MEMMO project: LAAS-CNRS, University of Edinburgh, Max Planck Institute Tuebingen, University of Oxford, PAL Robotics, Airbus Group, Wandercraft, CMPR Pionsat and Costain Group