Study Advances New Finger-Level BCI Robotic Hand Control

Insider Brief

  • A new NIH- and BRAIN Initiative-funded study from Carnegie Mellon University demonstrates real-time, noninvasive brain control of individual robotic fingers, marking a major advance in EEG-based brain-computer interfaces (BCIs).
  • Published in Nature Communications, the study used deep learning to decode finger-specific intentions from EEG signals, enabling human subjects to control robotic finger motion through thought alone.
  • Led by Professor Bin He, the research aims to enable practical applications like brain-driven typing and offers scalable, surgery-free alternatives to invasive BCI systems for people with disabilities.

A new study from Carnegie Mellon University, funded in part by the NIH and the BRAIN Initiative, brings noninvasive brain-computer interfaces (BCIs) closer to everyday use by demonstrating real-time control of individual robotic fingers using only brain signals.

According to the study published in Nature Communications, the goal is to advance toward more refined applications like brain-powered typing. Carnegie Mellon’s breakthrough shows the potential of noninvasive BCIs to support people with disabilities and expand the range of physical tasks robots can perform via thought alone.

“Improving hand function is a top priority for both impaired and able-bodied individuals, as even small gains can meaningfully enhance ability and quality of life,” explained Bin He, professor of biomedical engineering at Carnegie Mellon University, who led the study. “However, real-time decoding of dexterous individual finger movements using noninvasive brain signals has remained an elusive goal, largely due to the limited spatial resolution of EEG.”

EEG, or electroencephalography, records brain activity through electrodes placed on the scalp. The team used EEG signals to decode finger-level movement intentions and drive a robotic hand with impressive accuracy, according to Carnegie Mellon. Unlike invasive BCI approaches, which require surgical implants, EEG-based systems are noninvasive and potentially more scalable.

In a first for EEG, human subjects were able to control two- and three-finger robotic movements just by imagining finger motion, aided by a deep learning system tuned for fine motor decoding. The goal is to advance the research to the point that typing is possible, the researchers indicated.
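The article does not detail the decoding architecture, but the general pipeline it describes (EEG window in, finger-intention prediction out) can be sketched. The following is a minimal, purely illustrative example; the channel counts, window length, log-variance features, and linear softmax decoder are all assumptions standing in for the study's actual deep learning model:

```python
import numpy as np

# Hypothetical sketch of EEG-based finger-intention decoding.
# All shapes and numbers are illustrative, not from the study.
N_CHANNELS = 64   # scalp EEG electrodes (assumed)
N_SAMPLES = 250   # one 1-second window at an assumed 250 Hz sampling rate
N_FINGERS = 3     # e.g. two- and three-finger control targets

rng = np.random.default_rng(0)

def band_power_features(window: np.ndarray) -> np.ndarray:
    """Per-channel log variance, a common crude proxy for band power."""
    return np.log(window.var(axis=1) + 1e-12)

def softmax(z: np.ndarray) -> np.ndarray:
    """Convert decoder scores into per-finger probabilities."""
    e = np.exp(z - z.max())
    return e / e.sum()

# Stand-in random weights; the study trains a deep network instead.
W = rng.normal(size=(N_FINGERS, N_CHANNELS))
b = np.zeros(N_FINGERS)

# One simulated EEG window -> per-finger intention probabilities.
window = rng.normal(size=(N_CHANNELS, N_SAMPLES))
probs = softmax(W @ band_power_features(window) + b)
predicted_finger = int(np.argmax(probs))
print(probs.round(3), predicted_finger)
```

In a real-time system, this prediction step would run continuously on a sliding window of incoming EEG, with each prediction mapped to a robotic finger command.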

“The insights gained from this study hold immense potential to elevate the clinical relevance of noninvasive BCIs and enable applications across a broader population,” He pointed out. “Our study highlights the transformative potential of EEG-based BCIs and their application beyond basic communication to intricate motor control.”
