TACT-HAND

Improving control of prosthetic hands using tactile sensors and realistic machine learning

Intuitive and robust control of poly-articulated prosthetic hands by amputees is an as-yet unsolved problem, largely due to: (1) inadequate sensorization in the hand and in the human-machine interface; and (2) inadequate machine learning methods to detect the intent of the user. These problems cannot be solved easily, since prosthetic hands are subject to severe constraints on weight, price, size, cosmetic appearance and power consumption: they cannot be equipped with standard robotic sensors, and, at the same time, a practical, reliable intent-detection method is simply not yet available.

In TACT-HAND, we will employ and evaluate a new generation of tactile sensors coupled with realistic machine learning methods merging classification and regression to overcome both problems. "Realistic" here means that the system will work online and can easily be calibrated by its users. Different techniques will be investigated for this purpose, including multilinear algebra for tensor data analysis and online adaptation with multi-kernel learning algorithms.
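
To illustrate the kind of method envisaged, the sketch below is one possible way to merge classification and regression under a multi-kernel model: two RBF kernels are combined with fixed weights, and the same kernel machine both classifies the intended grasp and regresses a continuous activation level from flattened tactile/sEMG feature vectors. All names, kernel widths, mixing weights and data shapes are illustrative assumptions, not the project's actual design.

```python
# Minimal sketch (illustrative only): multi-kernel intent detection merging
# classification (grasp type) and regression (activation level).
import numpy as np

def rbf_kernel(A, B, gamma):
    """RBF kernel matrix between row-vector sets A (n, d) and B (m, d)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

class MultiKernelIntentModel:
    def __init__(self, gammas=(0.1, 1.0), betas=(0.5, 0.5), lam=1e-2):
        self.gammas, self.betas, self.lam = gammas, betas, lam

    def _kernel(self, A, B):
        # Fixed convex combination of RBF kernels at different bandwidths.
        return sum(b * rbf_kernel(A, B, g) for g, b in zip(self.gammas, self.betas))

    def fit(self, X, grasp_labels, activations):
        """X: (n, d) features; grasp_labels: (n,) ints; activations: (n,) in [0, 1]."""
        self.X = X
        self.classes = np.unique(grasp_labels)
        Y_cls = (grasp_labels[:, None] == self.classes[None, :]).astype(float)  # one-hot
        K = self._kernel(X, X) + self.lam * np.eye(len(X))
        # One joint ridge solve: classification and regression targets share the kernel.
        self.alpha = np.linalg.solve(K, np.hstack([Y_cls, activations[:, None]]))
        return self

    def predict(self, Xnew):
        out = self._kernel(Xnew, self.X) @ self.alpha
        grasp = self.classes[np.argmax(out[:, :-1], axis=1)]   # discrete intent
        activation = np.clip(out[:, -1], 0.0, 1.0)             # continuous level
        return grasp, activation

# Usage on synthetic "calibration" data: 16 flattened features, 3 hypothetical grasps.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 16))
labels = rng.integers(0, 3, size=60)
levels = rng.uniform(0, 1, size=60)
model = MultiKernelIntentModel().fit(X, labels, levels)
print(model.predict(X[:5]))
```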

Firstly, we will build a lightweight, wearable human-machine interface based upon high-resolution tactile sensors to augment or substitute traditional surface electromyography. Secondly, we will design and apply fast, incremental and intuitive machine learning methods to fully exploit the tactile interface and provide stable intent detection (see the sketch below). Thirdly, similar sensor technology will be employed to build a tactile dataglove to sensorize a commercially available hand prosthesis. The aim is to advance the state of the art in prosthetic hand control, enabling better grasping and manipulation with higher stability and reliability.
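
To make the incremental-calibration idea concrete, the sketch below shows one way a decoder could be updated sample-by-sample while the user performs calibration movements: a recursive least-squares regressor mapping a flattened tactile frame to per-degree-of-freedom activation commands. The taxel-grid size, number of DOFs and the choice of a linear decoder are assumptions made purely for illustration.

```python
# Illustrative sketch of incremental calibration: recursive least squares (RLS)
# mapping a flattened tactile frame to per-DOF activation commands.
import numpy as np

class IncrementalTactileDecoder:
    def __init__(self, n_features, n_dofs, lam=1.0):
        self.W = np.zeros((n_features, n_dofs))   # decoder weights
        self.P = np.eye(n_features) / lam         # inverse-covariance estimate

    def update(self, frame, target):
        """One calibration sample: frame (taxel grid), target (n_dofs,)."""
        x = frame.ravel().astype(float)
        Px = self.P @ x
        k = Px / (1.0 + x @ Px)                   # RLS gain
        err = target - self.W.T @ x               # prediction error before update
        self.W += np.outer(k, err)
        self.P -= np.outer(k, Px)
        return err

    def predict(self, frame):
        return frame.ravel().astype(float) @ self.W

# Usage: stream calibration frames from a hypothetical 8x8 taxel array, 4 DOFs.
rng = np.random.default_rng(1)
dec = IncrementalTactileDecoder(n_features=64, n_dofs=4)
true_map = rng.normal(size=(64, 4))
for _ in range(200):
    frame = rng.normal(size=(8, 8))
    dec.update(frame, frame.ravel() @ true_map)   # each sample refines the decoder
print(np.round(dec.predict(rng.normal(size=(8, 8))), 3))
```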

During the project, a pool of upper-limb amputees will be continuously monitored and their progress tracked while using the developed devices, in order to assess their effectiveness and practical usability. Three institutions with proven track records in prosthetics and rehabilitation robotics (DLR), advanced sensors and force/impedance control (CITEC), and machine learning (Idiap) will cooperate tightly towards this aim.

Funding: SNSF (D-A-CH Programme)

Start date: 01/04/2015
End date: 31/03/2018