
FAU Lands $1.2 Million NSF Grant to Transform Prosthetic Hand Control

Prosthetic Hand, Artificial Hands, AI, Artificial Intelligence, Hand Control, Machine Learning, Amputees, Dexterity, Automated Training, Reinforcement Learning, Sensing Technology, Grasp Function

The technology will empower amputees to maximize their individual potential for controlling the full dexterity of artificial hands.


By Gisele Galoustian | 10/26/2022

Most people use their hands seamlessly to perform everyday tasks as well as more complex ones, like playing a musical instrument. But for more than 1.6 million Americans and millions more worldwide who have suffered the loss of a limb, prosthetic hands simply fall short.

Current prosthetic hands have five individually actuated digits, yet only one grasp function can be controlled at a time. As a result, using a screwdriver or a can opener is largely impossible, and more sophisticated tasks remain fodder for science fiction movies.

Researchers from Florida Atlantic University’s College of Engineering and Computer Science are addressing these deficiencies. They have received a four-year, $1.2 million grant from the National Science Foundation to empower amputees to maximize their individual potential for controlling the full dexterity of artificial hands.

The project combines a novel bimodal skin sensor with machine learning algorithms that classify motor intent and with reinforcement learning. Clinicians will interact with 10 study participants over the course of one year, guiding muscle training through a smartphone.
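To make the classification idea concrete, the sketch below shows how windows of muscle-signal data might be mapped to grasp intents with a standard linear discriminant classifier, a common choice for this kind of biosignal work. The sensor channels, window length, features and grasp labels are illustrative assumptions rather than the project's actual pipeline, and the data here is synthetic.

# Illustrative sketch only: classifying hypothetical muscle-signal windows
# into grasp intents. Channels, window size, features and labels are assumed.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
GRASPS = ["rest", "power_grip", "pinch", "key_grip"]  # hypothetical intents
CHANNELS, WINDOW = 8, 200                             # 8 sensor sites, 200-sample windows

def features(window):
    # Common time-domain features per channel: mean absolute value, waveform length.
    mav = np.mean(np.abs(window), axis=1)
    wl = np.sum(np.abs(np.diff(window, axis=1)), axis=1)
    return np.concatenate([mav, wl])

def synthetic_window(label_idx):
    # Fake signal whose per-channel amplitude grows with the grasp index.
    gains = (label_idx + 1) + rng.random(CHANNELS)
    return gains[:, None] * rng.standard_normal((CHANNELS, WINDOW))

X = [features(synthetic_window(i)) for i in range(len(GRASPS)) for _ in range(50)]
y = [g for g in GRASPS for _ in range(50)]
clf = LinearDiscriminantAnalysis().fit(np.array(X), y)
print(clf.predict([features(synthetic_window(2))]))   # e.g. ['pinch']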

“Anything exceeding basic functionality remains elusive for prosthetic hands even though they are mechanically capable of such feats,” said Erik Engeberg, Ph.D., principal investigator, a professor in FAU’s Department of Ocean and Mechanical Engineering and a member of the FAU Stiles-Nicholson Brain Institute. “One major bottleneck limiting more sophisticated functionality is the lack of an intuitive biosignal classification method that can reliably interpret hand motor intent as well as people can do for a broad range of tasks. This is no simple feat.”

Engeberg, co-PI Xiangnan Zhong, Ph.D., an assistant professor in FAU’s Department of Electrical Engineering and Computer Science, and key personnel including an orthopedic surgeon and professor in the Department of Orthopedics at the University of Utah and a professor of plastic surgery and neurosurgery at University of Florida Health, will explore methods to help upper limb-absent people learn advanced control of sophisticated prosthetic hands with an automated training regimen that can be used at home.

“Automating this aspect of health care with remote learning functionality can help disabled people access treatment more quickly, more conveniently and at a lower cost,” said Engeberg. 

Currently, wrist flexor muscles are used to close the prosthetic hand while the extensor muscles open it; the user must then toggle between different grasp types in an unnatural manner. Although there are numerous options for dexterous wearable co-robot assistants, such as prosthetic hands, the dexterity of these devices is rapidly outpacing people’s ability to intuitively control them.
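As a rough illustration of that conventional scheme, the sketch below implements a two-site controller in which flexor activity closes the hand, extensor activity opens it, and a co-contraction of both muscles cycles through grasp types. Co-contraction is one common toggle trigger, and the thresholds and grasp list here are assumptions made only for this example.

# Illustrative sketch of conventional two-site myoelectric control.
# Thresholds, the grasp list and the co-contraction toggle are assumptions.
GRASPS = ["power_grip", "pinch", "key_grip"]
THRESHOLD = 0.3              # normalized muscle-activity threshold (assumed)

class TwoSiteController:
    def __init__(self):
        self.grasp_idx = 0       # currently selected grasp type
        self.aperture = 1.0      # 1.0 = fully open, 0.0 = fully closed

    def step(self, flexor, extensor, dt=0.02):
        # Update hand state from one sample of flexor/extensor activity.
        if flexor > THRESHOLD and extensor > THRESHOLD:
            # Co-contraction cycles the grasp; a real controller would debounce this.
            self.grasp_idx = (self.grasp_idx + 1) % len(GRASPS)
        elif flexor > THRESHOLD:
            self.aperture = max(0.0, self.aperture - flexor * dt)    # close
        elif extensor > THRESHOLD:
            self.aperture = min(1.0, self.aperture + extensor * dt)  # open
        return GRASPS[self.grasp_idx], self.aperture

ctrl = TwoSiteController()
print(ctrl.step(flexor=0.8, extensor=0.1))   # closing in power_grip
print(ctrl.step(flexor=0.6, extensor=0.6))   # co-contraction switches grasp type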

One main source of this problem is the inability to reliably interpret the intentions of the human operator over the course of months and years. Another is that the science behind customizable training programs that empower disabled people to harness the full potential of prosthetic hands has not been deeply explored.

“This non-intuitive functionality is why many amputees reject using artificial limbs, which is unfortunate because of the negative collateral effects at work and for pleasure, which drastically impact their quality of life,” said Engeberg. “The current clinical state-of-the-art has a minimal level of dexterous controllability; overcoming this problem is the goal of our research.”

For each of the 10 amputees recruited for the study, the researchers will 3D scan the residual limb to fabricate a form-fitting prosthetic socket that can adapt to anticipated changes in residual limb musculature over the course of the program. A novel bimodal robotic skin, integrated within each customized socket, will sense biocontrol signals in the residual limb and overcome limitations of current sensing technology. The classification algorithms will be trained on data gathered from in-home experiments performed over the course of one year.
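One way the year of in-home data could feed back into the classifier is sketched below: each at-home session is saved as a labeled recording, and the intent model is periodically refit on everything collected so far, so it can track gradual changes in the residual limb’s signals. The file layout, format and model choice are assumptions for illustration, not the team’s actual training procedure.

# Illustrative sketch: refitting an intent classifier as in-home sessions
# accumulate. Paths, the .npz format and the model choice are assumed.
import glob
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def load_session(path):
    # Assumed format: 'features' (n_windows x n_features) and 'labels' (n_windows,).
    data = np.load(path)
    return data["features"], data["labels"]

def retrain(session_dir="home_sessions"):
    # Refit the classifier on all sessions recorded so far.
    X_parts, y_parts = [], []
    for path in sorted(glob.glob(f"{session_dir}/*.npz")):
        X, y = load_session(path)
        X_parts.append(X)
        y_parts.append(y)
    if not X_parts:
        return None                      # nothing recorded yet
    return LinearDiscriminantAnalysis().fit(np.vstack(X_parts), np.concatenate(y_parts))

model = retrain()
if model is not None:
    print("Grasp classes learned so far:", list(model.classes_))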

The technology also will allow the research team to monitor patients’ usage data from remote locations, which could broadly help connect disabled people around the world with specially trained clinical teams. Clinical practice today requires people in physical therapy programs to visit their therapists for in-person evaluations before progressing in the training regimen, a lengthy and expensive process that is sometimes impractical depending on how far a patient must travel.

“Losing an upper limb has a devastating impact on the ability to perform common daily activities,” said Stella Batalama, Ph.D., dean, FAU College of Engineering and Computer Science. “The uniquely holistic approach developed by professor Engeberg and his colleagues to transform the state-of-the-art for dexterous control of prosthetic hands could break through previously insurmountable barriers.”

In addition, research from this grant will be used to create learning experiences for high school students from low-income households to help educate the next generation of engineers and scientists.

-FAU-