Rémy Siegfried

About myself

I am from Saint-Maurice in Valais. I studied at EPFL, where I obtained a Bachelor's degree in Microengineering (2014) and then a Master's degree in Robotics and Autonomous Systems (2016).

Current work

I am a PhD student at EPFL (EDEE) and I am currently working at Idiap as a research assistant in the Perception and Activity Understanding group under the supervision of Jean-Marc Odobez.

As part of the MuMMER project, I am working on modeling and inferring attention in Human-Robot Interactions. Exploiting color and depth images as well as audio data, my goal is to estimate the individual attention of each member of a group of people interacting with the robot. This information will allow the robot to better understand conversation dynamics and to react appropriately in social interactions. To achieve this, I have explored several topics and tasks, such as unsupervised calibration of gaze estimation and eye movement recognition, among others.

I am also interested in the application of gaze and attention estimation technologies in other fields, such as communication analysis, psychological studies (e.g., the formation of first impressions), or medical diagnosis (e.g., detection of depression cues).

Past projects

  • During my studies, I carried out two semester projects: one on the implementation of safety behaviours in quadrotor formations (at DISAL, EPFL) and a second on the design of legs for a quadruped robot (at BIOROB, EPFL).
  • I worked for seven months at senseFly (in Cheseaux-sur-Lausanne) on the motor control of their quadrotor and on the development of an interface between a new camera and a fixed-wing drone.
  • I did my Master's project at LSRO (EPFL) under the supervision of Francesco Mondada, in the field of learning analytics with mobile robots. I developed methods that use logs collected during a robot programming lecture to provide useful feedback to teachers and students, in order to improve the learning outcome of lectures.
  • I was then hired for six more months to continue this work and develop a tool that, based on the results of my Master's project, provides on-line hints to students learning robot programming.


Datasets

  • ManiGaze dataset 
    The ManiGaze dataset was created to evaluate gaze estimation from remote RGB and RGB-D (standard vision and depth) sensors in Human-Robot Interaction (HRI) settings, and more specifically during object manipulation tasks. The recording methodology was designed to let users behave freely and to encourage natural interaction with the robot, as well as to collect gaze targets automatically, since a posteriori annotation of gaze is nearly impossible.


Events and media



Tel: +41 27 721 7707
Office: 308-6