Idiap Research Institute
Centre du Parc
Rue Marconi 19
PO Box 592
CH - 1920 Martigny
- During my studies, I completed two semester projects: one on implementing safety behaviours for a quadrotor formation (at DISAL, EPFL) and a second on designing legs for a quadruped robot (at BIOROB, EPFL).
- I worked for seven months at senseFly (in Cheseaux-sur-Lausanne) on the motor control of their quadrotor and on the development of an interface between a new camera and a fixed-wing drone.
- I did my Master's project at LSRO (EPFL) under the supervision of Francesco Mondada, in the field of learning analytics with mobile robots. I developed methods that use the logs recorded during a robot programming lecture to provide useful information to teachers and students, in order to improve the learning outcomes of lectures.
- I was then hired for six more months to continue this work and to develop a tool that, building on the results of my Master's project, provides online hints to students learning robot programming.
- Improved mobile robot programming performance through real-time program assessment
R. Siegfried, S. Klinger, M. Gross, R. W. Sumner, F. Mondada and S. Magnenat
ACM Conference on Innovation and Technology in Computer Science Education (ITiCSE), Bologna, July 2017
- Supervised Gaze Bias Correction for Gaze Coding in Interactions
R. Siegfried and J.-M. Odobez
Communication by Gaze Interaction Symposium (COGAIN), Wuppertal, August 2017
- Towards the Use of Social Interaction Conventions As Prior for Gaze Model Adaptation
R. Siegfried, Y. Yu and J.-M. Odobez
ACM International Conference on Multimodal Interaction (ICMI), Glasgow, November 2017
I am from Saint-Maurice in Valais. I studied at EPFL, where I obtained a Bachelor's degree in Microengineering (2014) and then a Master's degree in Robotics and Autonomous Systems (2016).
Within the MuMMER project, I am working on modeling and inferring attention in human-robot interactions. Exploiting color and depth images as well as audio data, my goal is to estimate the individual attention of each person in a group interacting with the robot. This information will allow the robot to better understand humans and to react appropriately in social interactions.