Insider Brief
- A new NIH- and BRAIN Initiative-funded study from Carnegie Mellon University demonstrates real-time, noninvasive brain control of individual robotic fingers, marking a major advance in EEG-based brain-computer interfaces (BCIs).
- Published in Nature Communications, the study used deep learning to decode finger-specific intentions from EEG signals, enabling human subjects to control robotic finger motion through thought alone.
- Led by Professor Bin He, the research aims to enable practical applications like brain-driven typing and offers scalable, surgery-free alternatives to invasive BCI systems for people with disabilities.
A new Carnegie Mellon University study, funded in part by the NIH and the BRAIN Initiative, brings noninvasive brain-computer interfaces (BCIs) closer to everyday use by demonstrating real-time control of individual robotic fingers using only brain signals.
According to the study, published in Nature Communications, the work is a step toward more refined applications like brain-powered typing. Carnegie Mellon’s results show the potential of noninvasive BCIs to support people with disabilities and expand the range of physical tasks robots can perform via thought alone.
“Improving hand function is a top priority for both impaired and able-bodied individuals, as even small gains can meaningfully enhance ability and quality of life,” explained Bin He, professor of biomedical engineering at Carnegie Mellon University, who led the study. “However, real-time decoding of dexterous individual finger movements using noninvasive brain signals has remained an elusive goal, largely due to the limited spatial resolution of EEG.”
EEG, short for electroencephalography, measures the brain’s electrical activity through electrodes placed on the scalp. The team used it to decode finger-level movement intentions and drive a robotic hand with what Carnegie Mellon describes as impressive accuracy. Unlike invasive BCI approaches, which require surgical implants, EEG-based systems are noninvasive and potentially more scalable.
In a first for EEG, human subjects controlled two- and three-finger robotic movements simply by imagining finger motion, aided by a deep learning system tuned for fine motor decoding.
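The article does not reproduce the team's network, but as a rough illustration of how this kind of fine motor decoding works, the sketch below shows a compact convolutional classifier (in PyTorch) that maps a short multichannel EEG epoch to an imagined-finger class. The channel count, window length, filter sizes, and three-class output are assumptions made for illustration, not the architecture used in the study.

```python
# Illustrative sketch of an EEG finger-movement decoder (PyTorch).
# All sizes below are assumptions, not the study's actual network.
import torch
import torch.nn as nn

class FingerDecoder(nn.Module):
    """Compact CNN mapping an EEG epoch (channels x samples)
    to one of several imagined finger-movement classes."""
    def __init__(self, n_channels=64, n_samples=256, n_classes=3):
        super().__init__()
        self.features = nn.Sequential(
            # Temporal convolution: learn frequency-like filters over time.
            nn.Conv2d(1, 8, kernel_size=(1, 33), padding=(0, 16), bias=False),
            nn.BatchNorm2d(8),
            # Spatial convolution: mix information across EEG channels.
            nn.Conv2d(8, 16, kernel_size=(n_channels, 1), bias=False),
            nn.BatchNorm2d(16),
            nn.ELU(),
            nn.AvgPool2d((1, 4)),   # downsample in time
            nn.Dropout(0.5),
        )
        self.classifier = nn.Linear(16 * (n_samples // 4), n_classes)

    def forward(self, x):
        # x: (batch, channels, samples) -> add a singleton "image" dim.
        x = x.unsqueeze(1)
        x = self.features(x)
        return self.classifier(x.flatten(start_dim=1))

# Usage: decode a batch of 1-second epochs (64 channels at 256 Hz).
model = FingerDecoder()
epochs = torch.randn(8, 64, 256)         # simulated EEG, not real data
logits = model(epochs)                   # (8, 3) class scores
predicted_finger = logits.argmax(dim=1)  # index of decoded finger
```

In a real-time system, a classifier along these lines would run continuously on a sliding window of incoming EEG samples, with each prediction forwarded as a command to the robotic hand.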
“The insights gained from this study hold immense potential to elevate the clinical relevance of noninvasive BCIs and enable applications across a broader population,” He pointed out. “Our study highlights the transformative potential of EEG-based BCIs and their application beyond basic communication to intricate motor control.”