How Augmented Reality Could Help Control Prostheses

For many people with amputations, robotic prostheses hold the promise of some measure of restored function. Controlling those robotic limbs, however, has been a challenge. Now, a multidisciplinary team of engineers in China hopes to create an augmented reality (AR) interface that would let people control their robotic prostheses with their minds.


Pengfei Chen, a graduate student at the School of Artificial Intelligence and Data Science at the Hebei University of Technology and one of the main contributors to the project, says the team was inspired by the movie Ready Player One and aimed to design a system that used augmented reality to improve on non-invasive brain-computer interface (BCI) systems.


“Our team previously understood and learned about augmented reality technology, and we decided to design our new AR system to integrate virtual data with the real world for this project,” Chen said.


A non-invasive BCI detects brain signals externally using electroencephalography (EEG); those signals are then transmitted to a computer that tells the prosthesis how to perform. The research team, led by Zhiguo Luo of the Defense Innovation Institute of the Academy of Military Sciences in Beijing, opted for a non-invasive approach because it avoids the surgically implanted electrodes that invasive methods require.


The AR-based BCI system consists of a cap-style headset fitted with EEG sensors that respond to visual stimuli. When a user signals the intention to move the prosthesis, the sensors pick up that signal and send it to a computer, which processes the data and transmits it to a robotic arm controller. The controller then instructs the arm on how to execute the action.
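
To make that flow concrete, here is a minimal sketch of the sensor-to-controller pipeline in Python. Everything in it is a hypothetical placeholder invented for illustration: the class names, the stand-in command set (echoing the pick-up, relocate, and release actions described below), and the random "decoder." The article does not describe the team's actual software.

```python
import numpy as np

# Hypothetical command set, echoing the actions the article describes.
COMMANDS = ["grasp", "move", "release"]

class EEGHeadset:
    """Stand-in for the cap-style EEG headset (hypothetical API)."""
    def read_window(self, seconds=2.0, fs=250, channels=8):
        # A real driver would stream voltages from the electrodes;
        # here we return random noise of the right shape.
        return np.random.randn(channels, int(seconds * fs))

def decode_intent(eeg_window):
    """Placeholder decoder: map an EEG window to one command.

    A real system would run a trained classifier here; we simply
    pick the command with the highest made-up score.
    """
    scores = np.random.rand(len(COMMANDS))
    return COMMANDS[int(np.argmax(scores))]

class ArmController:
    """Stand-in for the robotic arm controller."""
    def execute(self, command):
        print(f"arm executing: {command}")

# The loop mirrors the article's description:
# sensors -> computer -> controller -> arm.
headset, arm = EEGHeadset(), ArmController()
window = headset.read_window()
arm.execute(decode_intent(window))
```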


The AR component removes the need for the user to look at a monitor. Instead, the user can focus solely on the arm without distraction, which produces a quicker response.


So far, the interface can direct the robotic arm to pick up an object, relocate it, and release it. But how well it does this depends on how well the user focuses on the action; if the user becomes fatigued, the system's performance drops.


“A critical advantage of our system is that it is a lot more flexible and easier to control for the user,” Chen said. “It also adapts to stimulation according to the user’s current state.”


The team has found that decoding the signals cleanly and precisely is a challenge. EEG data is well suited to conveying simple instructions, but responding quickly to multiple instructions at once slows the system down.


“We have to strike a balance between the frequency of the signal, and the comfort level for the user,” Chen said.
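
The references to visual stimuli and signal frequency suggest a paradigm along the lines of steady-state visual evoked potentials (SSVEP), in which each command flickers at a distinct frequency and the decoder looks for that frequency in the EEG. The article does not confirm this, so the sketch below should be read as an illustration of frequency-based decoding rather than the team's method; the sampling rate, flicker frequencies, and FFT-power approach are all assumptions.

```python
import numpy as np

FS = 250  # sampling rate in Hz (assumed)
# Hypothetical flicker frequencies, one per command.
STIM_FREQS = {"grasp": 8.0, "move": 10.0, "release": 12.0}

def decode_by_frequency(signal, fs=FS):
    """Pick the command whose flicker frequency carries the most power.

    `signal` is a 1-D EEG trace. Production SSVEP decoders typically
    use methods like canonical correlation analysis over several
    channels; plain FFT power keeps the core idea visible.
    """
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    def power_at(f):
        return spectrum[np.argmin(np.abs(freqs - f))]
    return max(STIM_FREQS, key=lambda cmd: power_at(STIM_FREQS[cmd]))

# Synthetic demo: two seconds of a 10 Hz response buried in noise
# should decode as "move".
t = np.arange(2 * FS) / FS
demo = np.sin(2 * np.pi * 10.0 * t) + 0.5 * np.random.randn(t.size)
print(decode_by_frequency(demo))  # -> "move" (with high probability)
```

In SSVEP-style systems, lower flicker frequencies tend to evoke stronger EEG responses but are more visually tiring, which is one reading of the balance Chen describes.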


The team will continue to increase the number of actions the system can convey, working toward a design that can perform even the most subtle movements. Chen envisions a world where the AR-based BCI technology could be adapted to many parts of the body to potentially help those who are disabled regain some control over their bodies.


So far, the system has been tested on ten individuals without disabilities, who were evaluated on how quickly and accurately they could get the robotic arm to move. Almost 95 percent of the movements performed matched the signals the participants had initiated, and on average the arm responded within about two seconds of receiving a signal. These results suggest the interface is not only effective but also relatively accurate and quick compared with other systems under development.
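
For readers curious how such figures are tallied, the short sketch below aggregates per-trial logs into the two statistics reported above, accuracy and mean response time. The trial values here are invented for illustration and are not the study's data.

```python
# Hypothetical per-trial log: (intended command, decoded command, response time in s).
# These values are invented for illustration; they are not the team's raw data.
trials = [
    ("grasp", "grasp", 1.8), ("move", "move", 2.1), ("release", "release", 1.9),
    ("grasp", "grasp", 2.3), ("move", "release", 2.4), ("move", "move", 1.7),
    ("release", "release", 2.0), ("grasp", "grasp", 1.6), ("move", "move", 2.2),
    ("release", "release", 1.9),
]

# Accuracy: fraction of trials where the decoded command matched the intent.
accuracy = sum(intended == decoded for intended, decoded, _ in trials) / len(trials)
# Mean response time across all trials.
mean_latency = sum(rt for _, _, rt in trials) / len(trials)
print(f"accuracy: {accuracy:.0%}, mean response time: {mean_latency:.1f} s")
```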

