MIT Ultrasound Wristband Enables Real-Time Robotic Hand Control

Engineers at MIT have developed a wearable ultrasound wristband capable of tracking complex hand movements in real time and wirelessly controlling a robotic hand. The system combines continuous ultrasound imaging of wrist muscles and tendons with an artificial intelligence algorithm that translates those images into detailed finger and palm positions.

The research addresses a longstanding challenge in robotics: replicating the dexterity of the human hand. Human hand motion relies on the coordination of dozens of muscles, joints, tendons, and ligaments. Capturing those subtle movements accurately and continuously has proven difficult with conventional vision systems, sensorized gloves, or surface electromyography (sEMG) measurements.

Imaging the mechanics of dexterity

The MIT team integrated a miniaturized ultrasound transducer into a wristband roughly the size of a smartwatch, with onboard electronics comparable in size to a cellphone. The device continuously images muscles and tendons in the wrist as the wearer moves their fingers.

Because finger motion is driven by tendons that run through the wrist, imaging these structures provides indirect but information-rich insight into hand configuration. According to the researchers, specific regions in the ultrasound images correlate with distinct degrees of freedom in the hand. The fingers and thumb together account for 22 degrees of freedom, representing the many ways they can extend, flex, and angle.
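
As a point of reference, a 22-degree-of-freedom hand pose reduces to a vector of joint angles. The sketch below assumes a flat 22-element layout; the exact joint parameterization used in the paper is not given in the article, so the structure and names here are illustrative only.

```python
import numpy as np

# A 22-DOF hand pose expressed as a flat vector of joint angles.
# Which index maps to which joint is hypothetical; the paper's exact
# parameterization is not specified in the article.
N_DOF = 22

def make_pose(angles_rad) -> np.ndarray:
    """Validate and return a 22-element hand pose (joint angles, radians)."""
    pose = np.asarray(angles_rad, dtype=np.float64)
    if pose.shape != (N_DOF,):
        raise ValueError(f"expected {N_DOF} DOF, got shape {pose.shape}")
    return pose

# Example: a neutral, flat-hand pose with all joint angles at zero.
neutral = make_pose(np.zeros(N_DOF))
```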

To build this mapping, volunteers wore the wristband while performing a wide range of gestures. Multiple cameras recorded the corresponding hand poses. The team then labeled regions of the ultrasound images to match particular degrees of freedom. An AI model was trained on this annotated dataset to recognize image patterns and predict hand positions continuously and in real time.
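
In broad strokes, this is a supervised image-to-pose regression problem. The PyTorch sketch below shows one minimal way such a mapping could be trained; the architecture, input resolution, loss, and training details are assumptions for illustration, not the authors' actual model.

```python
import torch
import torch.nn as nn

# Minimal sketch of an image-to-pose regressor of the kind described:
# a small CNN maps one ultrasound frame to 22 joint angles. The
# architecture and hyperparameters are assumptions, not the MIT model.
class PoseRegressor(nn.Module):
    def __init__(self, n_dof: int = 22):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, n_dof)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, H, W) ultrasound frames
        return self.head(self.features(x).flatten(1))  # (batch, 22) angles

model = PoseRegressor()
loss_fn = nn.MSELoss()  # regress against camera-derived pose labels
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# One illustrative optimization step on a dummy batch.
frames = torch.randn(8, 1, 128, 128)  # stand-in ultrasound frames
labels = torch.randn(8, 22)           # stand-in 22-DOF pose labels
opt.zero_grad()
loss = loss_fn(model(frames), labels)
loss.backward()
opt.step()
```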

From wearable sensing to robotic control

Once trained, the system was tested on eight volunteers with different hand and wrist sizes. Participants performed gestures including the 26 letters of the American Sign Language alphabet and manipulated objects such as a tennis ball, scissors, a plastic bottle, and a pencil. In each case, the wristband predicted hand positions with sufficient precision to reconstruct the gestures.

The researchers demonstrated wireless control of a commercial robotic hand. As a user wearing the wristband mimicked playing a keyboard, the robotic hand reproduced the same finger motions in real time to play a simple tune. The same setup was used to control a desktop basketball game through finger taps. The wristband was also paired with a computer interface, enabling pinch and grasp gestures to manipulate virtual objects smoothly.
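
A real-time teleoperation loop of this kind reduces to repeatedly inferring a pose and streaming it to the robot. The sketch below assumes a UDP/JSON transport and a hypothetical robot endpoint; the actual wireless protocol used by the MIT system is not described in the article.

```python
import json
import socket
import time

# Hedged sketch of a teleoperation loop: infer a pose from the latest
# ultrasound frame and stream it to the robot hand. Transport, message
# format, and the endpoint below are all hypothetical.
ROBOT_ADDR = ("192.168.1.50", 9000)  # hypothetical robot hand endpoint

def stream_poses(predict_pose, rate_hz: float = 30.0) -> None:
    """Send each predicted 22-DOF pose to the robot at a fixed rate."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    period = 1.0 / rate_hz
    while True:
        pose = predict_pose()  # e.g., model inference on the newest frame
        msg = json.dumps({"t": time.time(),
                          "dof": [float(a) for a in pose]}).encode()
        sock.sendto(msg, ROBOT_ADDR)
        time.sleep(period)
```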

Implications for humanoid robots

For humanoid robotics, the most significant contribution may lie beyond teleoperation. The team is collecting hand motion data from a broad range of users with different anatomies and gesture styles. The long-term objective is to build a large-scale dataset of dexterous hand movements.

Such datasets could support training pipelines for humanoid robots that require fine manipulation skills, including complex assembly or even surgical assistance tasks. Compared with camera-based capture systems, a wearable ultrasound band is less sensitive to occlusion and environmental noise. Compared with instrumented gloves, it preserves natural tactile sensation and finger mobility.
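
One way such a dataset could be organized is as per-frame records pairing each stored ultrasound image with its pose label and session context. The schema below is purely hypothetical; the field names are assumptions, not the authors' actual format.

```python
from dataclasses import dataclass, asdict
import json

# Purely hypothetical record schema for a large-scale hand-motion dataset:
# each entry pairs one stored ultrasound frame with its pose label and
# context. Field names are assumptions, not the authors' format.
@dataclass
class HandMotionSample:
    subject_id: str      # anonymized wearer identifier
    frame_path: str      # path to the stored ultrasound frame
    timestamp_s: float   # capture time within the session
    pose_dof: list       # 22 joint angles in radians
    gesture_label: str   # e.g., an ASL letter or manipulation-task tag

sample = HandMotionSample("s01", "frames/s01_000123.png", 12.34,
                          [0.0] * 22, "ASL_A")
print(json.dumps(asdict(sample)))
```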

The researchers describe the system in a paper published in Nature Electronics. Future work includes further miniaturizing the hardware and expanding the AI model to support a wider range of gestures and users.

As humanoid platforms increasingly target tasks that depend on human-level hand dexterity, interfaces and data collection tools such as wearable ultrasound trackers could play a critical role in bridging human demonstration and robotic execution.

Source: todaysmedicaldevelopments.com
