Idiap has new openings for 3 PhD positions in perception, robot skill learning and HRI

The Perception and Activity Understanding group (Jean-Marc Odobez, http://www.idiap.ch/~odobez/) and the Robot Learning and Interaction group (Sylvain Calinon, http://calinon.ch) seek 3 PhD candidates for a project funded by the Swiss National Science Foundation, aiming to study robot skill acquisition through active learning and social interaction strategies (ROSALIS, see below). The 3 PhD students will collaborate closely, along 3 axes of research:

1) skill learning (Calinon)
2) perception (Odobez)
3) interaction modeling (Odobez+Calinon).
Applicants must state their preferred axis or axes of research in their application.

The project will start in early 2018, but the positions can start earlier. The ideal PhD candidates should hold a Master's degree in computer science, engineering, physics or applied mathematics, with a background in statistics, linear algebra, signal processing and programming. The positions are for 4 years, conditional on successful progress, and should lead to a dissertation. The selected candidates will become doctoral students at EPFL, provided they are accepted by the EPFL Doctoral School. The annual gross salary ranges from 47,000 CHF (first year) to 50,000 CHF (last year).

Interested candidates should submit a cover letter, a detailed CV, and the names of three references (or recommendation letters) through the Idiap online recruitment system.

Interviews will start on September 1st, 2017. Late applications will be considered depending on whether the positions have already been filled.

ROSALIS project description:

Most efforts in robot learning from demonstration are directed toward developing algorithms for the acquisition of specific skills from training data. While such developments are important, they often do not take into account the social structure of the process; in particular, the interaction with the user and the selection of the different interaction steps can directly influence the quality of the collected data. In ROSALIS, we propose to rely on natural interactions for skill learning, involving queries about the skills, and demonstrations by both the human and the robot, with the robot showing what it has learned.

The research will advance on several fronts. First, for skill representation, the robot learners will require an appropriate level of plasticity, allowing them to adapt, refine or freeze a skill primitive currently being learned. Active learning methodologies will also be developed, relying on heterogeneous sources of information (demonstrations, feedback labels, properties), allowing the system to form hypotheses about skill invariants and to suggest demonstrations or queries. Second, to enable natural interactions, we will design perception algorithms that provide a higher-level understanding of people's behaviors and intentions, including gaze information and multimodal action recognition and segmentation. These different mechanisms will be integrated into a model of the interaction, which implies coordinating (selecting and timing) the different interaction units.

We target applications of robots in both manufacturing (with Baxter and Franka robots) and home/office environments (with the Pepper robot), both of which require efficient and personalized re-programming.

About Idiap:   

To apply for these positions, click on the following link: 3 PhD positions in perception, robot skill learning and HRI