Insider Brief
- A European Union–funded study found that haptic feedback delivered through wearable robotic exoskeletons improved coordination between human collaborators more effectively than visual or auditory cues alone.
- Researchers tested the system with 20 pairs of violinists and found the strongest performance when touch, sight and hearing were combined, highlighting the role of multisensory input in precise motor tasks.
- The findings suggest applications beyond music, including training, rehabilitation and human-robot collaboration, though further testing is needed outside controlled environments.
Researchers in a European Union–funded project found that violinists connected through wearable robotic exoskeletons achieved better synchronization when haptic feedback was introduced.
The work, coordinated by Italy’s Università Campus Bio-Medico di Roma as part of the CONBOTS project and supported by nearly €5 million in Horizon 2020 funding, explored how physical interaction mediated by wearable robots can enhance human motor coordination.
The study, published in Science Robotics, found that the connected violinists outperformed those relying only on visual and auditory cues, and that the strongest performance occurred when all three sensory inputs (touch, sight and hearing) were combined, according to the university.
To test the concept, researchers equipped pairs of violinists with upper-limb exoskeletons that transmitted force feedback based on differences in their movements. According to researchers, the system allowed each participant to physically sense the other’s motion, effectively simulating direct contact during a task that typically relies on visual and auditory coordination.
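As a rough illustration, force feedback "based on differences in their movements" is often rendered as a virtual spring-damper coupling between the two arms. The sketch below assumes that scheme with a single joint per player; the function name, gains and simplifications are invented for illustration and are not details from the study:

```python
# Hypothetical sketch of a stiffness-based coupling controller. The study only
# states that feedback depended on differences between the players' movements;
# the spring-damper form, gains and single-joint model here are assumptions.

def coupling_torques(q_a: float, q_b: float, dq_a: float, dq_b: float,
                     stiffness: float = 5.0, damping: float = 0.5):
    """Return the torques (N*m) applied to each player's exoskeleton.

    A virtual spring-damper pulls each arm toward the other's motion,
    so each participant physically feels the partner's movement.
    """
    error = q_a - q_b        # position mismatch between the two players (rad)
    d_error = dq_a - dq_b    # velocity mismatch (rad/s)
    torque = stiffness * error + damping * d_error
    # Equal and opposite: player A is pulled toward B, and B toward A.
    return -torque, torque


# Example: player A leads player B by 0.2 rad with matched velocities.
tau_a, tau_b = coupling_torques(q_a=0.30, q_b=0.10, dq_a=0.0, dq_b=0.0)
```

With these assumed gains, the 0.2 rad mismatch produces a 1.0 N·m torque pulling A back toward B and B forward toward A, which is how each player would "sense" the other's motion.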
The experiment involved 20 pairs of musicians, including both amateur and professional performers, who completed tasks under different sensory conditions. Participants were not informed they were physically connected, yet still showed measurable improvements in alignment and timing when haptic feedback was active.
The findings suggest that touch-based feedback operates as an implicit communication channel, enabling faster and more intuitive coordination than visual signals, which require conscious attention. Researchers indicated this could have implications beyond music, including applications in motor learning, rehabilitation and human-robot collaboration.
The study was coordinated by Professor Domenico Formica, and involved the Sant’Anna School of Advanced Studies in Pisa and its spin-off IUVO srl, the National Research Council (CNR), Newcastle University in the UK, and the University of Ghent in Belgium.
“We are entering an era where robots can mediate physical communication between humans in entirely new ways,” noted Formica, who is with the Research Unit of Neurophysiology and Neuroengineering of Human-Technology Interaction (NeXTlab) of Università Campus Bio-Medico di Roma. “This study is a first step toward systems that physically connect people, enhancing their coordination, learning, and rehabilitation.”