Design Engineering

Controlling robots through brainwaves? According to MIT, we’re not far off

Devin Jones   


A robot “mind meld” interface turns EEG/EMG impulses into robot instructions with little training.

 


Baxter can now respond to multiple-choice activities. Photo courtesy of Sharon Vanderkaay

According to the Massachusetts Institute of Technology (MIT), controlling artificial intelligence with our minds is a lot closer to reality than we think.

Building on previous work dating back to 2017, the Computer Science and Artificial Intelligence Laboratory at MIT (CSAIL) has been developing the ability to control robots with the wave of a hand or, quite literally, with specific brainwave patterns.

It works like this: a computer interface closely monitors the user’s brainwave and muscle activity. When the user notices the robot making an error, the system detects that reaction and brings up a set of multiple-choice corrections. From there, the user makes hand gestures to move through the on-screen options, selecting the correct function for the robot to execute.
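To make that loop concrete, here is a minimal Python sketch of the supervision cycle as described: an EEG-derived error flag opens the menu, and EMG-derived gestures scroll and select. Every name, threshold and placeholder classifier below is an illustrative assumption, not CSAIL’s actual code.

import random

OPTIONS = ["target_a", "target_b", "target_c"]  # the multiple-choice tasks

def read_eeg_error_signal():
    """Stand-in for an EEG classifier that fires when the observer
    notices the robot make a mistake (an error-related brain response)."""
    return random.random() > 0.7  # placeholder: flags an error ~30% of the time

def read_emg_gesture():
    """Stand-in for an EMG classifier mapping forearm muscle activity
    to a coarse gesture: 'left', 'right' or 'select'."""
    return random.choice(["left", "right", "select"])

def supervise_robot():
    cursor = 0
    if read_eeg_error_signal():           # brain signal: "that looks wrong"
        while True:                       # gesture through the on-screen menu
            gesture = read_emg_gesture()
            if gesture == "left":
                cursor = (cursor - 1) % len(OPTIONS)
            elif gesture == "right":
                cursor = (cursor + 1) % len(OPTIONS)
            else:                         # 'select' confirms the correction
                return OPTIONS[cursor]
    return None                           # no error detected; let the robot proceed

print(supervise_robot())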


At the core of this is multiple-choice selection, an advance over the binary-choice activities CSAIL has been working with since 2017. Those earlier binary interactions with the robot were fairly simple; think sorting cups into a specific order. Because the data came from an electroencephalography (EEG) monitor that records brain activity, the interaction between human and AI took immense concentration. For example, the robot would only recognize specific brainwaves when the user looked at coloured lights associated with a task the robot had been trained to react to.
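To illustrate the kind of binary, EEG-only detection described above, here is a toy Python sketch that scores an EEG epoch against a fixed “error response” template and thresholds the result. The template shape, threshold and synthetic signals are invented for demonstration; real systems train classifiers on recorded data.

import numpy as np

def detect_error_response(epoch, template, threshold=0.5):
    """Toy detector: cosine similarity between a post-event EEG epoch
    and an assumed error-response template, thresholded to a yes/no."""
    score = np.dot(epoch, template) / (np.linalg.norm(epoch) * np.linalg.norm(template))
    return score > threshold

rng = np.random.default_rng(0)
template = np.sin(np.linspace(0, np.pi, 64))               # stand-in response shape
with_response = template + 0.2 * rng.standard_normal(64)   # epoch containing the response
without = 0.2 * rng.standard_normal(64)                    # epoch without it
print(detect_error_response(with_response, template))  # expected: True
print(detect_error_response(without, template))        # expected: False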

What’s fascinating about the new system is that no training is required, allowing users to “pick up and play” with little instruction needed.

“This work combining EEG and EMG feedback enables natural human-robot interactions for a broader set of applications than we’ve been able to do before using only EEG feedback,” said CSAIL Director Daniela Rus. “By including muscle feedback, we can use gestures to command the robot spatially, with much more nuance and specificity.”

One example the team demonstrated involves the robot moving a power drill to one of three possible targets on the body of a mock plane. Importantly, the robot had never interacted with the user beforehand, demonstrating the system’s ease of use.

In creating the system, CSAIL relied on a network of electrodes placed on the user’s scalp and forearm, using EEG to capture brain activity and electromyography (EMG) to capture muscle activity. Predictably, neither signal is perfect: EMG signals are difficult to map to motions more specific than left and right, and EEG signals aren’t always reliably detectable, leaving gaps in the communicative feedback.
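That left/right limitation can be made concrete with a toy EMG classifier that simply compares signal amplitude on two forearm channels; coarse directions separate cleanly, while finer motions would overlap. The channel layout, margin and fallback behaviour are assumptions made for this sketch, not details of the CSAIL system.

import numpy as np

def classify_emg(left_channel, right_channel, margin=1.5):
    """Toy gesture classifier: compare root-mean-square amplitude on
    two forearm electrode channels and require a clear margin."""
    rms_l = np.sqrt(np.mean(np.square(left_channel)))
    rms_r = np.sqrt(np.mean(np.square(right_channel)))
    if rms_l > margin * rms_r:
        return "left"
    if rms_r > margin * rms_l:
        return "right"
    return "uncertain"  # ambiguous reading: fall back to EEG or re-prompt

rng = np.random.default_rng(1)
flexed = 3.0 * rng.standard_normal(256)   # strongly active muscle channel
resting = 0.5 * rng.standard_normal(256)  # quiet channel
print(classify_emg(flexed, resting))   # expected: left
print(classify_emg(resting, flexed))   # expected: right
print(classify_emg(resting, resting))  # expected: uncertain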

While the system is still being tested in a lab setting, the team at CSAIL believes “Baxter” (a humanoid robot from Rethink Robotics) will eventually be useful to the elderly or to workers with language disorders or mobility issues.
