Researchers from EPFL (École Polytechnique Fédérale de Lausanne) have developed a new robotic prosthetic hand that combines human and robotic control: the wearer controls the fingers, while the smart prosthetic handles grasping and object manipulation. The technology merges two concepts from different fields. The first deciphers intended finger movements from muscle activity on the wearer's stump and translates them into individual finger control. The second, from robotics, allows the hand to grab objects and maintain contact with them for a more stable grasp.
“When you hold an object in your hand, and it starts to slip, you only have a couple of milliseconds to react. The robotic hand has the ability to react within 400 milliseconds. Equipped with pressure sensors all along the fingers, it can react and stabilize the object before the brain can actually perceive that the object is slipping.” — Aude Billard, Head of EPFL’s Learning Algorithms and Systems Laboratory
Shared control of the robotic hand relies on an algorithm that learns to decode the user's intention and translate it into finger movement, with the prosthetic's automation taking over when needed. For example, the algorithm tells the hand to close its fingers around an object as soon as the object touches sensors positioned on the fingers and palm.
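To make the division of labor concrete, here is a minimal sketch of what one tick of such a shared-control loop might look like. All names, thresholds, and force values are illustrative assumptions for this article, not EPFL's actual algorithm: the user's decoded finger command sets the baseline grip, while the automated side overrides it on sensor contact or when pressure readings suggest the object is slipping.

```python
from dataclasses import dataclass

# Assumed, illustrative constants (normalized pressure units).
CONTACT_THRESHOLD = 0.2   # pressure above this counts as object contact
SLIP_THRESHOLD = -0.05    # per-tick pressure drop below this counts as slip

@dataclass
class HandState:
    grip_force: float = 0.0

def shared_control_step(user_finger_cmd, pressures, prev_pressures, state):
    """One control tick: user intent sets the baseline, automation overrides."""
    # Human side of shared control: start from the decoded finger command.
    force = user_finger_cmd
    # Robotic side: close around the object once any sensor registers contact.
    if max(pressures) > CONTACT_THRESHOLD:
        force = max(force, 0.5)  # assumed minimum stabilizing grip
    # If pressure is dropping (object slipping), tighten the grip
    # before the wearer could consciously react.
    slip = min(p - q for p, q in zip(pressures, prev_pressures))
    if slip < SLIP_THRESHOLD:
        force = min(1.0, force + 0.3)  # assumed corrective increment
    state.grip_force = force
    return force
```

In a real prosthetic this loop would run at a fixed rate fast enough to meet the 400 ms reaction window quoted above; the sketch only shows the decision logic of one iteration.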
The team conducted successful tests on several amputees and seven healthy subjects and found that the robotic hand functions as intended; however, the scientists note that challenges remain before it can become commercially available. They also believe the technology could serve in other neuroprosthetic applications, such as human-machine interfaces, in addition to bionic hand prostheses.
Author: Cabe Atwell