Calinon, S., Evrard, P., Gribovskaya, E., Billard, A. and Kheddar, A. (2009)
Learning collaborative manipulation tasks by demonstration using a haptic interface
In Proc. of the Intl Conf. on Advanced Robotics (ICAR), Munich, Germany, pp. 1-6.
Abstract
This paper presents a method by which a robot can learn through observation to perform a collaborative manipulation task, namely lifting an object. The task is first demonstrated by a user controlling the robot's hand via a haptic interface. Learning extracts the statistical regularities across the examples provided during training by using Gaussian Mixture Regression and a Hidden Markov Model. Haptic communication carries more than pure dynamic information on the task: it also includes communication patterns, which result from the two users constantly adapting their hand motions to coordinate their respective movements in time and space. We show that the proposed statistical model can efficiently encapsulate typical communication patterns across different dyads of users that are stereotypical of collaborative behaviours between humans and robots. The proposed learning approach is generative and can be used to drive the robot's retrieval of the task, ensuring a faithful reproduction of the overall dynamics, namely by reproducing the force patterns needed both to lift the object and to adapt to the human user's hand motion. This work shows the potential that teleoperation holds for transmitting both dynamic and communicative information about the task, which classical methods for programming by demonstration have traditionally overlooked.
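To make the regression step concrete, the following is a minimal sketch of Gaussian Mixture Regression in Python. It conditions a joint Gaussian mixture (e.g., over time and hand position/force) on the input dimensions to predict the expected output; the function and its parameters are illustrative assumptions, not the paper's implementation, which additionally relies on a Hidden Markov Model to encode the temporal structure of the task.

import numpy as np

def gmr(priors, means, covs, x_in, in_idx, out_idx):
    # Gaussian Mixture Regression (sketch): condition a joint GMM on
    # the input dimensions (in_idx) to predict the expected value of
    # the output dimensions (out_idx). All names are illustrative.
    K = len(priors)
    h = np.zeros(K)                  # responsibility of each component
    y = np.zeros((K, len(out_idx)))  # per-component conditional means
    for k in range(K):
        mu_i = means[k][in_idx]
        mu_o = means[k][out_idx]
        S_ii = covs[k][np.ix_(in_idx, in_idx)]
        S_oi = covs[k][np.ix_(out_idx, in_idx)]
        d = x_in - mu_i
        # likelihood of the input under component k, weighted by its prior
        h[k] = priors[k] * np.exp(-0.5 * d @ np.linalg.solve(S_ii, d)) \
               / np.sqrt(np.linalg.det(2.0 * np.pi * S_ii))
        # conditional mean of the outputs given the input
        y[k] = mu_o + S_oi @ np.linalg.solve(S_ii, d)
    h /= h.sum()
    return h @ y                     # responsibility-weighted average

Queried along the time dimension, such a model retrieves a smooth expected trajectory of positions and forces from the set of demonstrations.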
BibTeX reference
@inproceedings{Calinon09ICAR,
  author    = "S. Calinon and P. Evrard and E. Gribovskaya and A. Billard and A. Kheddar",
  title     = "Learning collaborative manipulation tasks by demonstration using a haptic interface",
  booktitle = "Proc. Intl Conf. on Advanced Robotics ({ICAR})",
  year      = "2009",
  month     = "June",
  address   = "Munich, Germany",
  pages     = "1--6"
}
Video
Learning of a collaborative manipulation skill with HRP-2 by
showing multiple demonstrations of the skill in slightly different
situations (different initial positions and orientations of the
object). The robot stands in a half-sitting posture during the
experiment, and the 7 DOFs of its right arm and torso are used.
The robot's built-in stereoscopic vision tracks colored patches,
and a 6-axis force sensor at the right wrist measures the
interaction forces with the environment during task execution.
The robot is teleoperated through a Phantom Desktop haptic device
from Sensable Technologies, and an impedance controller is used to
control the robot.
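As a hedged illustration of the impedance controller mentioned above, here is a minimal sketch of a Cartesian impedance law in Python: the commanded force follows a spring-damper behaviour around the desired trajectory. The function name and gain values are assumptions for illustration, not the controller actually running on HRP-2.

import numpy as np

def impedance_force(x, dx, x_des, dx_des, K, D):
    # Cartesian impedance law (sketch): command a force proportional to
    # the position error (stiffness K) and velocity error (damping D).
    return K @ (x_des - x) + D @ (dx_des - dx)

# Example with assumed gains (not the values used on the real robot):
K = np.diag([500.0, 500.0, 500.0])  # stiffness, N/m
D = np.diag([40.0, 40.0, 40.0])     # damping, N*s/m
f = impedance_force(np.zeros(3), np.zeros(3),
                    np.array([0.0, 0.0, 0.05]), np.zeros(3), K, D)

Because a force, rather than a rigid position, is what is regulated, the robot can comply with the human partner's motion while still tracking the learned trajectory.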
This work was carried out in collaboration with the Joint Japanese-French Robotics Laboratory (JRL) at
AIST, Tsukuba, Japan.