{"id":35490,"date":"2026-03-25T14:39:00","date_gmt":"2026-03-25T14:39:00","guid":{"rendered":"https:\/\/www.tun.com\/home\/?p=35490"},"modified":"2026-03-25T18:27:43","modified_gmt":"2026-03-25T18:27:43","slug":"mit-wristband-lets-users-control-robotic-hands-with-ultrasound","status":"publish","type":"post","link":"https:\/\/www.tun.com\/home\/mit-wristband-lets-users-control-robotic-hands-with-ultrasound\/","title":{"rendered":"MIT Wristband Lets Users Control Robotic Hands With Ultrasound"},"content":{"rendered":"\n<div class=\"wp-block-group\"><div class=\"wp-block-group__inner-container is-layout-constrained wp-block-group-is-layout-constrained\">\n<div class=\"wp-block-uagb-blockquote uagb-block-e7eb3fc3 uagb-blockquote__skin-border uagb-blockquote__stack-img-none\"><blockquote class=\"uagb-blockquote\"><div class=\"uagb-blockquote__content\">MIT researchers have built an ultrasound wristband that reads the motion of tendons and muscles to track hand movements in real time. The system lets wearers control robotic hands and virtual objects with surprising precision.<\/div><footer><div class=\"uagb-blockquote__author-wrap uagb-blockquote__author-at-left\"><\/div><\/footer><\/blockquote><\/div>\n\n\n\n<div class=\"wp-block-group is-content-justification-space-between is-nowrap is-layout-flex wp-container-core-group-is-layout-0dfbf163 wp-block-group-is-layout-flex\"><div style=\"font-size:16px;\" class=\"has-text-align-left wp-block-post-author\"><div class=\"wp-block-post-author__content\"><p class=\"wp-block-post-author__name\">The University Network<\/p><\/div><\/div>\n<\/div>\n<\/div><\/div>\n\n\n\n<p>Scrolling 
through your phone, tying your shoes, or playing a piano melody all rely on an astonishingly complex choreography of muscles, joints and tendons in the hand. Capturing that level of dexterity in machines has long been one of robotics\u2019 toughest challenges.<\/p>\n\n\n\n<p>Now, MIT engineers say they have taken a major step toward that goal with a new ultrasound wristband that can track a person\u2019s hand movements in real time \u2014 and use them to control a robotic hand or manipulate objects in virtual reality.<\/p>\n\n\n\n<div style=\"height:19px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"700\" height=\"467\" src=\"https:\/\/www.tun.com\/home\/wp-content\/uploads\/2026\/03\/MIT-hand-tracker.jpg\" alt=\"\" class=\"wp-image-35505\" srcset=\"https:\/\/www.tun.com\/home\/wp-content\/uploads\/2026\/03\/MIT-hand-tracker.jpg 700w, https:\/\/www.tun.com\/home\/wp-content\/uploads\/2026\/03\/MIT-hand-tracker-300x200.jpg 300w\" sizes=\"auto, (max-width: 700px) 100vw, 700px\" \/><\/figure>\n<\/div>\n\n\n<p class=\"has-text-align-center\"><em> Caption: <\/em>Graduate student Dian Li working with a robotic hand.<\/p>\n\n\n\n<p class=\"has-text-align-center\"><em>Credit: <\/em>Melanie Gonick, MIT<\/p>\n\n\n\n<div style=\"height:1px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<p>The device, about the size of a smartwatch, uses ultrasound imaging to watch how muscles, tendons and ligaments move inside the wrist as the wearer flexes and bends their fingers. An AI algorithm then translates those internal images into the precise positions of the fingers and palm, 22 different \u201cdegrees of freedom\u201d in all.<\/p>\n\n\n\n<p>In lab demonstrations, a person wearing the band could wirelessly control a robotic hand across the room. When the wearer pointed, the robot pointed. 
When the wearer mimed playing piano, the robot\u2019s plastic fingers tapped out a simple tune on a keyboard. The same setup let users shoot a tiny basketball into a desktop hoop and pinch to zoom and rotate objects on a computer screen.<\/p>\n\n\n\n<p>The work, <a href=\"https:\/\/www.nature.com\/articles\/s41928-026-01594-4\" target=\"_blank\" rel=\"noopener\" title=\"\">published<\/a> in the journal <em>Nature Electronics<\/em>, is led by Xuanhe Zhao, the Uncas and Helen Whitaker Professor of Mechanical Engineering at MIT.<\/p>\n\n\n\n<p>Zhao noted the technology could quickly change how people interact with both robots and digital worlds.<\/p>\n\n\n\n<p>\u201cWe think this work has immediate impact in potentially replacing hand tracking techniques with wearable ultrasound bands in virtual and augmented reality,\u201d Zhao said in a news release. \u201cIt could also provide huge amounts of training data for dexterous humanoid robots.\u201d<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Why ultrasound at the wrist?<\/h3>\n\n\n\n<p>Today\u2019s main approaches to capturing hand motion each have drawbacks.<\/p>\n\n\n\n<p>Camera-based systems can track hands in 3D, but they require careful setup, can be blocked by other objects or people, and often struggle in different lighting conditions. Glove-based systems embed sensors in fabric, but the hardware can feel bulky, interfere with natural movement, and dull the sense of touch.<\/p>\n\n\n\n<p>Another strategy, used in some prosthetics, reads electrical signals from muscles in the forearm or wrist. 
Those signals can distinguish broad gestures, like opening and closing a hand, but they are noisy and often too crude to capture subtle, continuous motion such as tracing a curve or shaping letters in sign language.<\/p>\n\n\n\n<p>Zhao\u2019s team wondered if they could go straight to the source of movement: the tendons and muscles that actually pull the fingers.<\/p>\n\n\n\n<p>\u201cThe tendons and muscles in your wrist are like strings pulling on puppets, which are your fingers,\u201d added first author Gengxi Lu. \u201cSo the idea is, each time you take a picture of the state of the strings, you\u2019ll know the state of the hand.\u201d<\/p>\n\n\n\n<p>The group has spent years developing soft, skin-friendly ultrasound \u201cstickers\u201d \u2014 miniaturized versions of the probes used in hospitals, paired with a thin hydrogel layer so they adhere comfortably to the body. For this project, they built that imaging technology into a rigid band that wraps around the wrist, and added compact electronics, roughly the size of a cellphone, to drive the ultrasound and send data wirelessly.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Turning wrist images into finger positions<\/h3>\n\n\n\n<p>The first challenge was simply to see enough detail. Tests on volunteers showed that the band could produce clear, continuous ultrasound images of the wrist as people made different gestures.<\/p>\n\n\n\n<p>The harder part was teaching a computer to interpret those grainy black-and-white images.<\/p>\n\n\n\n<p>Each finger and the thumb can move in many directions and combinations, giving the hand 22 degrees of freedom. The researchers discovered that specific regions in the ultrasound images corresponded to particular motions \u2014 for example, one area changed when the thumb extended, another when the index finger bent.<\/p>\n\n\n\n<p>To map those relationships, they had volunteers wear the band while moving their hands through a wide range of poses and grasps. 
Multiple cameras recorded the hand from different angles, capturing the exact position of each finger. The team then painstakingly labeled the ultrasound images, linking changes in certain regions to specific finger motions seen on camera.<\/p>\n\n\n\n<p>Doing that kind of matching in real time would be impossible for a human, so the researchers turned to AI. They trained a machine-learning model to recognize patterns in the ultrasound images and associate them with the correct finger positions. When they tested the algorithm on new ultrasound data, it accurately predicted the corresponding hand gestures.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Putting the wristband to the test<\/h3>\n\n\n\n<p>With the AI system in place, the team tried the wristband on eight volunteers with different hand and wrist sizes. Participants formed a variety of gestures and grips, including all 26 letters of the American Sign Language alphabet and everyday actions like holding a tennis ball, a plastic bottle, a pair of scissors and a pencil.<\/p>\n\n\n\n<p>In each case, the wristband\u2019s predictions closely matched the actual hand positions, suggesting the system can generalize across different users and motions.<\/p>\n\n\n\n<p>To show how this could be used in practice, the researchers wrote a simple computer program that connected to the wristband wirelessly. When wearers pinched their fingers together or spread them apart, the motion smoothly zoomed a virtual object in and out on a screen. Rotating and shifting their hands moved and manipulated the object in real time.<\/p>\n\n\n\n<p>They then linked the band to a commercially available robotic hand. As a volunteer mimed playing a piano, the robot\u2019s fingers pressed the keys in the same pattern, producing a basic tune. 
In another test, the robot copied finger taps to play a desktop basketball game, flicking a tiny ball into a hoop.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Toward everyday robotic helpers<\/h3>\n\n\n\n<p>The team is now working to shrink the hardware further and train the AI on a much larger and more diverse set of hand motions from people with different anatomies. The goal is a small, comfortable wristband that almost anyone could put on and immediately use to control robots or virtual tools with fine-grained precision.<\/p>\n\n\n\n<p>\u201cWe believe this is the most advanced way to track dexterous hand motion, through wearable imaging of the wrist,\u201d Zhao added. \u201cWe think these wearable ultrasound bands can provide intuitive and versatile controls for virtual reality and robotic hands.\u201d<\/p>\n\n\n\n<p>Beyond gaming and VR, the researchers imagine the technology could help teach humanoid robots to perform delicate tasks, from handling fragile objects in factories to assisting in surgery. Because the system can record detailed hand motions from many people, it could generate the massive training datasets that advanced robots need to learn humanlike skills.<\/p>\n\n\n\n<div style=\"height:10px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<p><strong>Source:<\/strong> <a href=\"https:\/\/news.mit.edu\/2026\/wristband-enables-wearers-control-robotic-hand-with-own-movements-0325\" target=\"_blank\" rel=\"noopener\" title=\"\">Massachusetts Institute of Technology<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>MIT researchers have built an ultrasound wristband that reads the motion of tendons and muscles to track hand movements in real time. 
The system lets wearers control robotic hands and virtual objects with surprising precision.<\/p>\n","protected":false},"author":3,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"single-no-separators","format":"standard","meta":{"_acf_changed":false,"_uag_custom_page_level_css":"","_monsterinsights_skip_tracking":false,"_monsterinsights_sitenote_active":false,"_monsterinsights_sitenote_note":"","_monsterinsights_sitenote_category":0,"footnotes":""},"categories":[17],"tags":[103],"class_list":["post-35490","post","type-post","status-publish","format-standard","hentry","category-tech","tag-mit"],"acf":[],"aioseo_notices":[],"uagb_featured_image_src":{"full":false,"thumbnail":false,"medium":false,"medium_large":false,"large":false,"1536x1536":false,"2048x2048":false},"uagb_author_info":{"display_name":"The University Network","author_link":"https:\/\/www.tun.com\/home\/author\/funky_junkie\/"},"uagb_comment_info":0,"uagb_excerpt":"MIT researchers have built an ultrasound wristband that reads the motion of tendons and muscles to track hand movements in real time. 
The system lets wearers control robotic hands and virtual objects with surprising precision.","_links":{"self":[{"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/posts\/35490","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/comments?post=35490"}],"version-history":[{"count":12,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/posts\/35490\/revisions"}],"predecessor-version":[{"id":35503,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/posts\/35490\/revisions\/35503"}],"wp:attachment":[{"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/media?parent=35490"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/categories?post=35490"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.tun.com\/home\/wp-json\/wp\/v2\/tags?post=35490"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}