MIT Wristband Lets Users Control Robotic Hands With Ultrasound

MIT researchers have built an ultrasound wristband that reads the motion of tendons and muscles to track hand movements in real time. The system lets wearers control robotic hands and virtual objects with surprising precision.

Scrolling through your phone, tying your shoes, or playing a piano melody all rely on an astonishingly complex choreography of muscles, joints and tendons in the hand. Capturing that level of dexterity in machines has long been one of robotics’ toughest challenges.

Now, MIT engineers say they have taken a major step toward that goal with a new ultrasound wristband that can track a person’s hand movements in real time — and use them to control a robotic hand or manipulate objects in virtual reality.

Image caption: Graduate student Dian Li working with a robotic hand. Credit: Melanie Gonick, MIT

The device, about the size of a smartwatch, uses ultrasound imaging to watch how muscles, tendons and ligaments move inside the wrist as the wearer flexes and bends their fingers. An AI algorithm then translates those internal images into the precise positions of the fingers and palm, capturing 22 different “degrees of freedom” in all.
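To make that pipeline concrete, here is a minimal Python sketch of what a per-frame loop could look like: a wrist image comes in, a trained model returns a 22-element pose vector, and that vector is handed off to whatever is being controlled. Everything here is an illustrative assumption, not code or specifications from the MIT system; the function names, frame size, and pose format are invented for the example.

```python
# Hypothetical sketch of the per-frame pipeline described in the article:
# ultrasound frame -> trained model -> 22-value hand pose. Names and sizes are made up.
import numpy as np

N_DOF = 22  # degrees of freedom reported in the article

def read_ultrasound_frame() -> np.ndarray:
    """Stand-in for the wireless frame stream; returns a synthetic grayscale image."""
    return np.random.rand(128, 128).astype(np.float32)

def predict_hand_pose(frame: np.ndarray) -> np.ndarray:
    """Stand-in for the trained model; maps a wrist image to 22 joint values."""
    return np.zeros(N_DOF, dtype=np.float32)

frame = read_ultrasound_frame()
pose = predict_hand_pose(frame)   # e.g. one value per finger/palm degree of freedom
print(pose.shape)                 # (22,) -> forwarded to a robot hand or a VR application
```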

In lab demonstrations, a person wearing the band could wirelessly control a robotic hand across the room. When the wearer pointed, the robot pointed. When the wearer mimed playing piano, the robot’s plastic fingers tapped out a simple tune on a keyboard. The same setup let users shoot a tiny basketball into a desktop hoop and pinch to zoom and rotate objects on a computer screen.

The work, published in the journal Nature Electronics, is led by Xuanhe Zhao, the Uncas and Helen Whitaker Professor of Mechanical Engineering at MIT.

Zhao noted the technology could quickly change how people interact with both robots and digital worlds.

“We think this work has immediate impact in potentially replacing hand tracking techniques with wearable ultrasound bands in virtual and augmented reality,” Zhao said in a news release. “It could also provide huge amounts of training data for dexterous humanoid robots.”

Why ultrasound at the wrist?

Today’s main approaches to capturing hand motion each have drawbacks.

Camera-based systems can track hands in 3D, but they require careful setup, can be blocked by other objects or people, and often struggle in different lighting conditions. Glove-based systems embed sensors in fabric, but the hardware can feel bulky, interfere with natural movement, and dull the sense of touch.

Another strategy, used in some prosthetics, reads electrical signals from muscles in the forearm or wrist. Those signals can distinguish broad gestures, like opening and closing a hand, but they are noisy and often too crude to capture subtle, continuous motion such as tracing a curve or shaping letters in sign language.

Zhao’s team wondered if they could go straight to the source of movement: the tendons and muscles that actually pull the fingers.

“The tendons and muscles in your wrist are like strings pulling on puppets, which are your fingers,” added first author Gengxi Lu. “So the idea is, each time you take a picture of the state of the strings, you’ll know the state of the hand.”

The group has spent years developing soft, skin-friendly ultrasound “stickers” — miniaturized versions of the probes used in hospitals, paired with a thin hydrogel layer so they adhere comfortably to the body. For this project, they built that imaging technology into a rigid band that wraps around the wrist, and added compact electronics, roughly the size of a cellphone, to drive the ultrasound and send data wirelessly.

Turning wrist images into finger positions

The first challenge was simply to see enough detail. Tests on volunteers showed that the band could produce clear, continuous ultrasound images of the wrist as people made different gestures.

The harder part was teaching a computer to interpret those grainy black-and-white images.

Each finger and the thumb can move in many directions and combinations, giving the hand 22 degrees of freedom. The researchers discovered that specific regions in the ultrasound images corresponded to particular motions — for example, one area changed when the thumb extended, another when the index finger bent.

To map those relationships, they had volunteers wear the band while moving their hands through a wide range of poses and grasps. Multiple cameras recorded the hand from different angles, capturing the exact position of each finger. The team then painstakingly labeled the ultrasound images, linking changes in certain regions to specific finger motions seen on camera.

Doing that kind of matching in real time would be impossible for a human, so the researchers turned to AI. They trained a machine-learning model to recognize patterns in the ultrasound images and associate them with the correct finger positions. When they tested the algorithm on new ultrasound data, it was able to accurately predict the corresponding hand gestures.
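The article does not give the model's architecture or training details, but the general recipe it describes, regressing a 22-value hand pose from a grayscale ultrasound image using camera-derived labels, can be sketched in a few lines of PyTorch. The network, sizes, and training loop below are hypothetical illustrations, not the researchers' published model; the synthetic tensors stand in for the paired ultrasound frames and camera measurements.

```python
# Hypothetical image-to-pose regression sketch: a small CNN maps a wrist ultrasound
# frame to 22 joint values and is trained against camera-derived labels with MSE loss.
import torch
import torch.nn as nn

class PoseRegressor(nn.Module):
    def __init__(self, n_dof: int = 22):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),          # -> 32 channels of 4x4 features
        )
        self.head = nn.Linear(32 * 4 * 4, n_dof)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = PoseRegressor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Synthetic stand-ins for the paired training data described in the article.
frames = torch.rand(64, 1, 128, 128)   # batch of wrist ultrasound images
angles = torch.rand(64, 22)            # finger positions labeled from the camera views

for step in range(5):                  # toy training loop
    optimizer.zero_grad()
    loss = loss_fn(model(frames), angles)
    loss.backward()
    optimizer.step()
```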

Putting the wristband to the test

With the AI system in place, the team tried the wristband on eight volunteers with different hand and wrist sizes. Participants formed a variety of gestures and grips, including all 26 letters of American Sign Language and everyday actions like holding a tennis ball, a plastic bottle, scissors and a pencil.

In each case, the wristband’s predictions closely matched the actual hand positions, suggesting the system can generalize across different users and motions.

To show how this could be used in practice, the researchers wrote a simple computer program that connected to the wristband wirelessly. When wearers pinched their fingers together or spread them apart, the motion smoothly zoomed a virtual object in and out on a screen. Rotating and shifting their hands moved and manipulated the object in real time.
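As a rough illustration of how a tracked hand pose could drive such an interface, the sketch below maps the distance between a predicted thumb tip and index fingertip to a zoom factor relative to the distance when the gesture began. The function names and numbers are invented for the example and are not taken from the researchers' software.

```python
# Hypothetical pinch-to-zoom mapping: fingertip separation relative to the separation
# at the start of the gesture sets the scale of the on-screen object.
import numpy as np

def pinch_distance(thumb_tip: np.ndarray, index_tip: np.ndarray) -> float:
    """Euclidean distance between the two fingertip positions."""
    return float(np.linalg.norm(thumb_tip - index_tip))

def zoom_factor(current: float, at_gesture_start: float, min_sep: float = 1e-3) -> float:
    """Spreading the fingers (> start distance) zooms in; pinching them zooms out."""
    return current / max(at_gesture_start, min_sep)

start = pinch_distance(np.array([0.0, 0.0, 0.0]), np.array([0.04, 0.0, 0.0]))
now = pinch_distance(np.array([0.0, 0.0, 0.0]), np.array([0.08, 0.0, 0.0]))
print(zoom_factor(now, start))  # 2.0 -> the virtual object is scaled up by 2x
```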

They then linked the band to a commercially available robotic hand. As a volunteer mimed playing a piano, the robot’s fingers pressed the keys in the same pattern, producing a basic tune. In another test, the robot copied finger taps to play a desktop basketball game, flicking a tiny ball into a hoop.

Toward everyday robotic helpers

The team is now working to shrink the hardware further and train the AI on a much larger and more diverse set of hand motions from people with different anatomies. The goal is a small, comfortable wristband that almost anyone could put on and immediately use to control robots or virtual tools with fine-grained precision.

“We believe this is the most advanced way to track dexterous hand motion, through wearable imaging of the wrist,” Zhao added. “We think these wearable ultrasound bands can provide intuitive and versatile controls for virtual reality and robotic hands.”

Beyond gaming and VR, the researchers imagine the technology could help teach humanoid robots to perform delicate tasks, from handling fragile objects in factories to assisting in surgery. Because the system can record detailed hand motions from many people, it could generate the massive training datasets that advanced robots need to learn humanlike skills.

Source: Massachusetts Institute of Technology