Multimodal gesture, visual attention and interaction activity recognition for autism diagnosis

The Perception and Activity Understanding group seeks one highly motivated PhD candidate to work within the AI4Autism project, which aims to improve the digital phenotyping of children with Autism Spectrum Disorder (ASD).

The PhD candidate will work on the multimodal perception of young children engaged in free-play activities as well as in social interactions with adults. In particular, they will investigate deep learning methods and models for recognizing gestures and visual attention events, including the modeling of their coordination, from visual data and IoT sensors. Experiments will be conducted on project data (e.g. data from the standard ADOS evaluation protocol involving more than 300 toddlers with partial behavior annotations) as well as on standard computer vision and multimodal datasets for gesture and attention recognition.

The ideal PhD candidate should hold an MSc degree in computer science, engineering, physics, or applied mathematics, with a solid background in statistics, linear algebra, signal processing, machine learning, and programming. Experience in computer vision and deep learning is a definite plus. The successful applicant will have strong analytical skills as well as good written and oral communication skills.

How to apply: Interested candidates are invited to submit a cover letter, a detailed CV, and the names of three references through the Idiap online recruitment system:

Multimodal gesture, visual attention and interaction activity recognition for autism diagnosis