Once you take a moment to really think about it, the simple act of scrolling through your phone is far from simple. The complexity is hidden in over 34 muscles, 27 joints, and more than 100 tendons and ligaments in your hand – all working together in perfect harmony. Our hands are remarkable and their dexterity has long been a challenge to replicate in robotics and virtual reality.
This is where the researchers at MIT come into the picture. They have developed an innovative ultrasound wristband that can track our hand movements in real time with impressive precision. Once you put it on, the wristband generates ultrasound images of your muscles, tendons, and ligaments in motion. Here's the truly revolutionary part: an AI algorithm then translates these images into the corresponding positions of the fingers and palm. The possibilities are huge!
Imagine controlling robots with your hand gestures, or interacting with virtual environments in real time. That's not the future – MIT has brought it to the present. The team demonstrated how a person wearing the wristband could control a robotic hand. As they gestured or pointed, the robot mimicked these movements, letting the wearer make the robot perform tasks such as playing a simple piano tune or shooting a basketball. The wristband can also let users interact with, enlarge, or minimize virtual objects on computer screens.
On the road to perfection, challenges are inevitable. Numerous methods attempt to capture the intricate movements of our hands, such as cameras or sensor-equipped gloves. These still fall short, however, as they often suffer from occlusion or restrict natural movement. Methods that rely on electrical signals from muscles are frequently disturbed by ambient noise, and these techniques are generally not very sensitive to subtle movements.
The MIT team, led by Zhao, turned to ultrasound. They developed miniature ultrasound stickers that adhere to the skin to capture more dexterous, continuous hand movements. After a successful testing phase that produced clear images of wrist movements, they related those images to specific hand positions: they labelled regions of the wrist images with the corresponding degrees of freedom and trained an AI algorithm to recognize these patterns. The results were impressive, with the wristband successfully tracking and predicting hand positions, covering even complex gestures such as American Sign Language.
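To make the idea concrete, here is a minimal sketch of the general approach described above: learning a mapping from wrist-image features to hand-pose parameters (per-joint angles) with a supervised regressor. This is not MIT's actual pipeline – the synthetic data, feature sizes, joint count, and the simple least-squares model are all illustrative assumptions.

```python
# Hedged sketch: regress hand-pose parameters from flattened ultrasound-image
# features. All sizes and the linear model are illustrative assumptions,
# not the researchers' actual method.
import numpy as np

rng = np.random.default_rng(0)

N_SAMPLES, N_FEATURES, N_JOINTS = 200, 64, 20  # assumed dimensions

# Synthetic stand-ins: each "ultrasound image" is a flattened feature vector,
# and the pose labels (joint angles) come from a hidden linear map plus noise.
true_map = rng.normal(size=(N_FEATURES, N_JOINTS))
images = rng.normal(size=(N_SAMPLES, N_FEATURES))
angles = images @ true_map + 0.01 * rng.normal(size=(N_SAMPLES, N_JOINTS))

# "Training": fit the image -> joint-angle mapping by least squares.
learned_map, *_ = np.linalg.lstsq(images, angles, rcond=None)

# "Inference": predict the hand pose for a new wrist image.
new_image = rng.normal(size=(1, N_FEATURES))
predicted_pose = new_image @ learned_map  # one angle per tracked joint
```

In practice a deep network on raw image frames would replace the linear map, but the supervised structure – labelled wrist images in, degrees of freedom out – is the same.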
The next goal? To make the wristband's hardware even smaller and train the AI software on a broader range of gestures and movements. In the words of Zhao, who led this innovation, “We believe this is the most advanced way to track dexterous hand motion, through wearable imaging of the wrist. We think these wearable ultrasound bands can provide intuitive and versatile controls for virtual reality and robotic hands.”
As we look ahead, it’s clear that wearable tech like this ultrasound band could revolutionize the way we interact with technology, robots, and virtual reality. But it’s not just about controlling humanoid robots or interacting with VR; the potential of this tech goes even further. Imagine using this wristband to train robots on dexterity tasks, or to manipulate virtual objects in design applications and video games. The possibilities are mind-boggling!
It’s an incredible example of the power of AI and how it’s set to redefine the future. This research was supported by MIT, the U.S. National Institutes of Health, the U.S. National Science Foundation, the U.S. Department of Defense, and the Singapore National Research Foundation. Let’s explore more AI opportunities, shall we?
Looking to leverage AI solutions for your company? Check out implementi.ai and discover the potential of AI-driven automation for your business processes.
With this advanced experiment, the scientists have tried to bring us one step closer to the future. Further details are available at MIT News.