Engineers Use AI to Fine-Tune Robotic Prosthesis for Natural Hand Dexterity

Researchers at the University of Utah used artificial intelligence to improve control of a robotic prosthetic hand, reducing cognitive effort while increasing grip precision and stability.

By RB Team
An AI-powered robotic prosthetic hand equipped with pressure and proximity sensors autonomously adjusts grip to improve dexterity. Photo: Utah NeuroRobotics Lab

Engineers at the University of Utah have developed an artificial intelligence system that significantly improves the dexterity and intuitiveness of robotic prosthetic hands. By combining advanced sensors with machine learning, the researchers enabled a prosthesis to grasp objects in a way that more closely resembles natural human movement. The approach reduces the mental effort required by users while increasing grip precision and reliability.

For many prosthesis users, even simple tasks such as holding a cup or picking up a small object require deliberate finger-by-finger control. This added cognitive burden is one of the main reasons advanced prosthetic devices are often abandoned. The Utah team focused on restoring the subconscious, automatic aspects of grasping that most people take for granted.

Sensors and AI Enable Autonomous Grasping

The researchers modified a commercially available prosthetic hand by equipping it with custom fingertips capable of sensing both pressure and proximity. Optical proximity sensors allow the fingers to detect objects before physical contact, while pressure sensors provide feedback once an object is grasped. Together, these inputs give the prosthesis a form of artificial touch.
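The article does not describe how the two signals are represented in software, but a fingertip that reports both proximity and pressure could be modeled roughly as follows. This is a minimal illustrative sketch; the class, field names, and thresholds are assumptions, not the Utah team's implementation.

```python
from dataclasses import dataclass

@dataclass
class FingertipReading:
    """One sample from a single fingertip (illustrative, not the lab's API)."""
    proximity_mm: float   # optical proximity: estimated distance to the object
    pressure_n: float     # contact force once the object is grasped

    @property
    def in_contact(self) -> bool:
        # Treat any measurable force as contact; the 0.05 N threshold is an assumption.
        return self.pressure_n > 0.05

def describe(reading: FingertipReading) -> str:
    """Summarize which phase of the grasp this fingertip is in."""
    if reading.in_contact:
        return f"holding (force {reading.pressure_n:.2f} N)"
    if reading.proximity_mm < 30.0:
        return f"approaching (object {reading.proximity_mm:.0f} mm away)"
    return "free"

if __name__ == "__main__":
    print(describe(FingertipReading(proximity_mm=12.0, pressure_n=0.0)))
    print(describe(FingertipReading(proximity_mm=0.0, pressure_n=1.8)))
```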

An artificial neural network was trained to map proximity readings from each finger to appropriate grasping postures. This allows the prosthetic hand to autonomously position its fingers at the correct distance to form a stable grip. Because each finger has its own sensor, the system adjusts all digits in parallel, producing precise, adaptable grasping behavior across objects of different shapes and sizes.
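The article does not give the network's architecture, so the following is only a rough sketch of the idea: per-finger proximity in, finger flexion targets out. The layer sizes, activation functions, normalization, and random weights (standing in for trained ones) are all assumptions.

```python
import numpy as np

N_FINGERS = 5  # thumb plus four fingers, each with its own proximity sensor

def init_params(hidden: int = 16, seed: int = 0) -> dict:
    """Randomly initialized weights stand in for a trained network."""
    rng = np.random.default_rng(seed)
    return {
        "W1": rng.normal(0, 0.1, (hidden, N_FINGERS)),
        "b1": np.zeros(hidden),
        "W2": rng.normal(0, 0.1, (N_FINGERS, hidden)),
        "b2": np.zeros(N_FINGERS),
    }

def predict_flexion(proximity_mm, params) -> np.ndarray:
    """Map per-finger proximity readings to flexion targets in [0, 1].

    0 = fully open, 1 = fully closed. In the real system this mapping is
    learned from recorded grasping postures rather than random weights.
    """
    x = np.asarray(proximity_mm, dtype=float) / 100.0   # crude normalization
    h = np.tanh(params["W1"] @ x + params["b1"])         # hidden layer
    return 1.0 / (1.0 + np.exp(-(params["W2"] @ h + params["b2"])))  # sigmoid output

if __name__ == "__main__":
    params = init_params()
    # Object close to thumb and index finger, farther from the other digits.
    print(predict_flexion([8.0, 10.0, 45.0, 60.0, 80.0], params))
```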

In testing, participants using the AI-assisted prosthesis demonstrated greater grip security and precision compared to conventional control methods. They were also able to complete tasks using different grip styles without extensive training, suggesting the system adapts naturally to user intent.

Sharing Control Between Human and Machine

A central design challenge was ensuring that artificial intelligence supported the user rather than competing for control. To solve this, the researchers implemented a bioinspired framework that shares control between the human and the AI system. The prosthesis assists with fine motor adjustments while allowing the user to initiate, modify, or stop actions freely.
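The article describes shared control only at a high level. One common way to express the idea is a weighted blend of the user's command and the machine's suggestion, with the user's input always able to dominate; the blend rule, gain, and activity estimate below are illustrative assumptions, not the framework from the paper.

```python
def shared_control(user_cmd: float, ai_cmd: float, user_activity: float,
                   ai_gain: float = 0.6) -> float:
    """Blend user and AI finger-velocity commands (all values illustrative).

    user_cmd, ai_cmd: desired closing velocity, -1 (open) .. 1 (close)
    user_activity:    0 .. 1, how strongly the user is driving the hand
                      (e.g. estimated from muscle-signal amplitude)
    The more actively the user commands the hand, the less weight the AI
    receives, so the human never has to fight the machine for control.
    """
    ai_weight = ai_gain * (1.0 - user_activity)
    blended = (1.0 - ai_weight) * user_cmd + ai_weight * ai_cmd
    return max(-1.0, min(1.0, blended))

# User is passive: the AI fine-tunes the grasp.
print(shared_control(user_cmd=0.0, ai_cmd=0.4, user_activity=0.1))
# User actively opens the hand: their command dominates.
print(shared_control(user_cmd=-1.0, ai_cmd=0.4, user_activity=0.9))
```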

“What we don’t want is the user fighting the machine for control,” said Marshall Trout, a postdoctoral researcher involved in the work. “Here, the machine improved the precision of the user while also making the tasks easier.”

The system blends rapid reactive responses, such as preventing excessive grip force, with higher-level planning that anticipates how objects should be grasped. This mirrors how humans naturally coordinate instinctive reactions with learned motor patterns.
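As a sketch of how a fast reactive layer can sit underneath slower anticipatory planning, the loop below clamps a planned grip force with a hard safety cap and corrects toward the result each cycle. The force limits, gain, and loop structure are assumptions for illustration, not taken from the study.

```python
def grip_controller(measured_force_n: float, planned_force_n: float,
                    max_force_n: float = 5.0, kp: float = 0.8) -> float:
    """One step of a grip-force control loop (illustrative values).

    The anticipatory layer supplies planned_force_n for the object at hand;
    the reactive layer caps it so the hand never squeezes harder than
    max_force_n, responding immediately to the measured fingertip force.
    Returns a proportional motor command (positive = close harder).
    """
    target = min(planned_force_n, max_force_n)   # reactive safety cap
    error = target - measured_force_n
    return kp * error                            # proportional correction

# Planner asks for 3 N; the fingertip reads 1 N, so keep closing gently.
print(grip_controller(measured_force_n=1.0, planned_force_n=3.0))
# Planner asks for 8 N on a fragile cup; the cap limits the target to 5 N.
print(grip_controller(measured_force_n=5.2, planned_force_n=8.0))
```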

Study Leadership and Future Directions

The study was led by engineering professor Jacob A. George together with Trout at the Utah NeuroRobotics Lab and was published in the journal Nature Communications. The research involved experiments with four transradial amputees, whose amputations occurred between the elbow and wrist.

Participants completed standardized dexterity tests as well as everyday activities requiring fine motor control. Tasks that demand careful force modulation, such as lifting a lightweight plastic cup, became more reliable with AI assistance.

“As lifelike as bionic arms are becoming, controlling them is still not easy or intuitive,” Trout said. “Nearly half of all users will abandon their prosthesis, often citing poor controls and cognitive burden.”

George emphasized that the long-term goal is to embed intelligence directly into prosthetic devices so users can interact with objects more naturally. The team is now exploring how this AI-driven grasping approach could be combined with implanted neural interfaces, enabling thought-based control and the return of tactile sensations. By merging sensing, intelligence, and neural input, the researchers aim to make robotic prostheses feel less like tools and more like natural extensions of the human body.