New opening for a PhD position in Visual and Multimodal Sensing for Human-Robot Interaction (HRI)
The research will be conducted in the context of MuMMER (MultiModal Mall Entertainment Robot), a new 4-year project funded under the EU Horizon 2020 programme and involving several leading European institutions in the field.
The project will develop a humanoid robot (based on Aldebaran's Pepper platform) able to engage and interact autonomously and naturally with individuals or groups of people. To support this behaviour, the project consortium will develop and integrate new methods from audiovisual scene processing, social-signal processing, high-level action selection, and human-aware robot navigation. See http://www.edinburgh-robotics.org/news/201508/h2020-funding-award-interaction-lab-heriot-watt-university.
To apply or to get more information about the position, follow this link: Visual and Multimodal Sensing for Human-Robot Interaction (HRI)