Dear colleagues and friends,
A dear colleague shares this article written by Alex Shipps, published on February 20, 2024 in the MIT News bulletin by the Massachusetts Institute of Technology (MIT) and translated by us for this space. Let's see what it's all about...
You've probably met someone who identifies as a visual or auditory learner, but others absorb knowledge through a different modality: touch. Being able to understand tactile interactions is especially important for tasks such as learning delicate surgery and playing musical instruments, but unlike video and audio, touch is difficult to record and transfer.
To tackle this challenge, researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL), in collaboration with other institutions, developed an embroidered smart glove that can capture, reproduce, and transmit tactile instructions.
To complement the portable device, the team also developed a simple machine learning agent that adapts to how different users react to tactile feedback, optimizing their experience. The new system could help teach people physical skills, improve the teleoperation of responsive robots and help with virtual reality training.
On January 29, an open-access article describing the work was published in Nature Communications.
Will I be able to play the piano?
To create their smart glove, the researchers used a digital embroidery machine to seamlessly integrate tactile sensors and haptic actuators (a device that provides tactile feedback) into textiles. This technology is present in smartphones, where haptic responses are activated when touching the touchscreen.
For example, if you press an app on an iPhone, you'll feel a slight vibration coming from that specific part of the screen. In the same way, the new adaptable wearable sends feedback to different parts of the hand to indicate the optimal movements for executing different skills.
The smart glove could teach users to play the piano, for example. In a demonstration, an expert was tasked with playing a simple melody on a section of keys, while the smart glove captured the sequence in which their fingers pressed the keyboard. A machine learning agent then converted that sequence into haptic feedback, which was played back through students' gloves so they could follow the instructions.
With their hands on that same section of the keyboard, students felt the actuators vibrate on the fingers corresponding to the keys below. The system optimizes these cues for each user, taking into account the subjective nature of tactile interactions.
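The capture-and-playback idea described above can be sketched roughly as follows. This is an illustrative reconstruction, not the paper's actual encoding: the key-to-finger mapping, cue format, and function names are all assumptions.

```python
# Hypothetical sketch of the piano demo: an expert's recorded key
# presses are converted into vibration commands for the corresponding
# fingers of a student's glove. Mapping and timing are illustrative.

# Map piano keys (within one hand span) to glove fingers.
KEY_TO_FINGER = {"C4": "thumb", "D4": "index", "E4": "middle",
                 "F4": "ring", "G4": "pinky"}

def capture_sequence(key_presses):
    """Record the expert's presses as (key, duration in seconds) pairs."""
    return [(key, round(duration, 2)) for key, duration in key_presses]

def to_haptic_cues(sequence, intensity=0.8):
    """Convert a recorded sequence into per-finger vibration commands."""
    cues = []
    for key, duration in sequence:
        cues.append({"finger": KEY_TO_FINGER[key],
                     "duration_s": duration,
                     "intensity": intensity})
    return cues

# The expert plays C-E-G; the student's glove receives three cues.
expert_recording = capture_sequence([("C4", 0.5), ("E4", 0.5), ("G4", 1.0)])
cues = to_haptic_cues(expert_recording)
```

In this sketch each cue vibrates exactly one finger; the real system additionally adapts the feedback to each user, as the article explains next.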
“Humans perform a wide variety of tasks by constantly interacting with the world around them,” says Yiyue Luo MS '20, lead author of the paper, a doctoral student in the Department of Electrical Engineering and Computer Science (EECS) and affiliated with CSAIL. “We don't normally share these physical interactions with others. Instead, we often learn by watching their movements, as with piano pieces and dance routines.”
“The main challenge when it comes to transmitting tactile interactions is that everyone perceives haptic feedback differently,” Luo adds. “This obstacle inspired us to develop a machine learning agent that learns to generate adaptive haptics for individuals' gloves, presenting them with a more practical approach to learning optimal movement.”
The wearable system is customized to fit the user's hand using a digital manufacturing method. A computer produces a cutout based on measurements of the user's hand; an embroidery machine then stitches in the sensors and haptic actuators. In 10 minutes, the soft, fabric-based wearable is ready to use. Initially trained on the haptic responses of 12 users, its adaptive machine learning model needs only 15 seconds of data from a new user to personalize the feedback.
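The personalization step can be illustrated with a toy calibration routine. This is a minimal sketch of the general idea, not the paper's model: it assumes a population-level intensity from pretraining and fits a single per-user gain from a few (delivered intensity, perceived strength) pairs gathered in a short calibration session.

```python
# Toy personalization sketch (assumed, not the paper's method):
# adjust haptic intensity so cues feel equally strong to every user,
# using only a handful of calibration samples from a new user.

TARGET_STRENGTH = 0.8  # assumed population-level setting from pretraining

def fit_sensitivity(calibration):
    """Least-squares fit of perceived ~= sensitivity * delivered,
    from (delivered_intensity, perceived_strength) pairs."""
    num = sum(d * p for d, p in calibration)
    den = sum(d * d for d, _ in calibration)
    return num / den

def personalized_intensity(calibration, target=TARGET_STRENGTH):
    """Intensity to deliver so this user perceives roughly `target`,
    clamped to the actuator's [0, 1] range."""
    sensitivity = fit_sensitivity(calibration)
    return min(1.0, target / sensitivity)

# A user who feels cues twice as strongly as average gets gentler cues.
samples = [(0.4, 0.8), (0.5, 1.0)]  # delivered vs. perceived
level = personalized_intensity(samples)
```

A scalar gain is of course far simpler than a learned model, but it captures the core trade-off the article describes: a strong shared prior from the 12 pretraining users, plus a very small amount of new-user data to adapt.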