Insider Brief
- MIT researchers developed a wearable ultrasound wristband that tracks hand movements in real time with high precision, using AI to translate wrist imaging into full finger and hand positions with 22 degrees of freedom.
- The system enables wireless control of robotic hands and virtual environments, while offering a new approach to capturing fine motor data beyond the limits of cameras and sensor gloves.
- The team said the technology could serve as a scalable data pipeline for training dexterous humanoid robots, with plans to miniaturize the device and expand datasets across more users and motion types.
Massachusetts Institute of Technology researchers have developed a wearable ultrasound wristband that can track hand movements in real time with high precision. It’s a system the researchers say could improve control in robotics and virtual environments while generating large-scale training data for dexterous AI systems.
The work, published in Nature Electronics and supported by MIT, the U.S. National Institutes of Health, the National Science Foundation, the U.S. Department of Defense and Singapore’s National Research Foundation, points to a new approach for capturing fine motor activity beyond the limits of cameras and sensor gloves.
“We think this work has immediate impact in potentially replacing hand tracking techniques with wearable ultrasound bands in virtual and augmented reality,” noted Xuanhe Zhao, the Uncas and Helen Whitaker Professor of Mechanical Engineering at MIT. “It could also provide huge amounts of training data for dexterous humanoid robots.”
According to MIT, the study finds that ultrasound imaging of the wrist’s muscles, tendons and ligaments can be translated into continuous finger and hand positions by an AI model trained on labeled motion data. The researchers report that the system tracks 22 degrees of freedom in the hand, enabling precise recognition of gestures, grasps and intermediate movements that are difficult to capture with existing techniques. In testing across multiple users, the system accurately mapped hand positions while participants performed tasks including object manipulation and American Sign Language gestures.
The implications extend beyond input devices. The MIT team indicated the wristband could serve as a scalable data collection tool for training humanoid robots in dexterous manipulation tasks, including applications such as surgery or complex assembly. By capturing high-resolution motion data directly from human users, the system could help address one of the core problems in robotics: the lack of large, high-quality datasets for fine motor control.
Methodologically, the researchers combined continuous ultrasound imaging with supervised machine learning. Participants wore the wristband while their hand movements were simultaneously recorded using external cameras, allowing the team to label ultrasound image regions with specific hand positions. The trained model was then able to infer hand motion directly from ultrasound input in real time, enabling wireless control of a robotic hand and interaction with virtual objects.
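The supervised pipeline described above can be illustrated with a minimal toy sketch. This is not the MIT team's model: the paper uses deep learning on ultrasound images, while the snippet below stands in synthetic feature vectors and ordinary least squares to show the same structure — pair each ultrasound frame with a camera-labeled hand pose, fit a mapping, then infer a 22-degree-of-freedom pose from new ultrasound input alone.

```python
import numpy as np

# Toy illustration of the training/inference pipeline (NOT the paper's model).
# Each "ultrasound frame" is a flattened feature vector; each label is a
# 22-dimensional hand pose (one value per degree of freedom), as would be
# recorded by the external cameras during data collection.

rng = np.random.default_rng(0)
N_FRAMES, N_FEATURES, N_DOF = 500, 64, 22

# Synthetic stand-ins for ultrasound-derived features and camera-labeled poses.
X = rng.normal(size=(N_FRAMES, N_FEATURES))
true_map = rng.normal(size=(N_FEATURES, N_DOF))
Y = X @ true_map + 0.01 * rng.normal(size=(N_FRAMES, N_DOF))

# "Training": least-squares fit in place of the paper's neural network.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

def infer_pose(frame_features: np.ndarray) -> np.ndarray:
    """Map one frame's ultrasound features to a 22-DoF hand pose estimate."""
    return frame_features @ W

pose = infer_pose(X[0])
print(pose.shape)  # (22,)
```

Once trained, inference needs only the ultrasound input — which is what lets the real wristband drive a robotic hand or a virtual scene without cameras in the loop.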
Looking ahead, MIT researchers plan to reduce the size of the device and expand the dataset to include a wider range of users and motion types. The team suggested that wearable ultrasound-based tracking could replace existing hand-tracking systems in virtual and augmented reality while also serving as a foundational data pipeline for training next-generation robotic systems.
Image credit: Melanie Gonick, MIT