Idiap has a new opening for a PhD position in multimodal perception for robotics and gesture learning

The Perception and Activity Understanding group (Jean-Marc Odobez, http://www.idiap.ch/~odobez/) seeks one PhD candidate for a Swiss NSF-funded project aiming to study robot skill acquisition through active learning and social interaction strategies (ROSALIS, see below). In the project, the PhD candidate will work on the multimodal perception of persons and objects, and will collaborate with two other PhD students working on skill learning and interaction modeling.

The project will start in April 2018, but the position can start earlier. The ideal PhD candidate should hold an MS degree in computer science, engineering, physics, or applied mathematics. S/he should have a good background in statistics, linear algebra, signal processing, machine learning, and programming. The successful applicant will have good analytical skills, strong written and oral communication skills, and the ability to work in a multidisciplinary team.

The position is for 4 years, subject to successful progress, and should lead to a dissertation. The selected candidate will become a doctoral student at EPFL, provided acceptance by the EPFL Doctoral School. The annual gross salary ranges from 47,000 CHF (first year) to 50,000 CHF (last year).

Interested candidates should submit a cover letter, a detailed CV, and the names of three references (or recommendation letters) through the Idiap online recruitment system.

Applications will be reviewed upon reception, and interviews will be held until the position is filled.

---
About ROSALIS and the PhD position.

Most efforts in robot learning from demonstration are directed toward developing algorithms for the acquisition of specific skills from training data. While such developments are important, they often do not take into account the social structure of the process; in particular, the interaction with the user and the selection of the different interaction steps can directly influence the quality of the collected data. In ROSALIS, we propose to rely on natural interactions for skill learning, involving queries about the skills and demonstrations made by both the human and the robot to show what it has learned.

PhD position: Besides research on skill representation and active learning methodologies relying on heterogeneous sources of information (demonstrations, feedback labels, properties), the project will investigate novel perception algorithms to allow natural interactions between the robot and the teacher. The aim is to provide a higher-level understanding of the teacher's behaviors and intentions through audio, gaze, and gesture (arm, body, head) analysis, in relation to the (unknown) skill she is teaching to the robot. This implies understanding (and distinguishing) her communication signals (yes, no, explanations), including feedback given about and during the demonstrations made by the robot, as well as the multimodal demonstrations (partial or global) she is making of the skill to be learned.

The different mechanisms (skill learning, active learning, perception) will be integrated into a global model of interaction, implying the coordination (selection, timing) of different smaller interaction units. We target applications of robots in both manufacturing (with Baxter and Franka robots) and home/office environments (with the Pepper robot), both of which require efficient and personalized re-programming.

ROSALIS is an SNSF-funded project involving both the Perception and Activity Understanding group (Jean-Marc Odobez) and the Robot Learning and Interaction group (Sylvain Calinon) at the Idiap Research Institute.

To apply for this position, click on the following link: PhD position in multimodal perception for robotics and gesture learning