Brain-Computer Interface Reads Your Mind to Take Over Tasks
Computer performance doubles roughly every two years, but human performance, unfortunately, does not. So in order to stay in control of increasing machine power, we need to outsource part of that control. And who is better suited for the task than the machines themselves?
MIT, Indiana University, and Tufts University have combined their efforts to develop Brainput, a brain-computer interface (BCI) that can read and respond to a user's mental state. It can identify whether a user is multitasking and how well they are coping. When Brainput detects that a multitasking user is reaching a certain stress level, it autonomously delegates tasks to other actors.
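To make that loop concrete, here is a minimal Python sketch of the idea. It is not the real Brainput code: the classifier, the 0.7 threshold, and the robot API are all illustrative assumptions.

```python
# A minimal sketch of the core idea, not the actual Brainput code:
# classify the user's mental workload from windows of brain-sensor
# features and delegate a task when an overloaded multitasker is
# detected. The names and the 0.7 threshold are illustrative.

class Robot:
    def __init__(self, name: str):
        self.name = name
        self.autonomous = False  # starts under manual control

def classify_state(features: list[float]) -> str:
    """Stand-in for the trained classifier: label one feature window."""
    mean_activation = sum(features) / len(features)
    return "overloaded" if mean_activation > 0.7 else "normal"

def brainput_step(features: list[float], robots: list[Robot]) -> None:
    """Hand one task over to a robot when the user appears overloaded."""
    if classify_state(features) == "overloaded":
        for robot in robots:
            if not robot.autonomous:
                robot.autonomous = True  # the machine takes over this task
                break                    # relieve one task at a time

bots = [Robot("red"), Robot("blue")]
brainput_step([0.9, 0.8, 0.75], bots)  # high activation: red goes autonomous
print([(b.name, b.autonomous) for b in bots])
```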
Lead researcher Erin Treacy Solovey and her team created a test environment in which a user remotely controls two robots, a blue bot and a red bot. The bots explore a virtual environment and need to reach a certain location within a given time frame. Because the bots aren't allowed to run into any obstacles while making their way through the maze, the user is constantly multitasking between the two robots.
The user's brain activity is continuously monitored using functional near-infrared spectroscopy (fNIRS), a cheap and portable sensing method. When stress levels go up, Brainput relays this information to the red robot so that it can modify its behavior to better support the user's multitasking. If the situation permits, the bot can switch to autonomous mode, temporarily relieving the user of that task.
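The robot's side of the loop might look something like the sketch below; the mode names and the path_is_clear() check are hypothetical stand-ins for the real bots' navigation behavior.

```python
# An illustrative sketch of the robot's side of the loop. The mode
# names and the path_is_clear() check ("if the situation permits")
# are hypothetical stand-ins, not the study's actual robot behavior.

class MazeBot:
    def __init__(self, name: str):
        self.name = name
        self.mode = "manual"  # fully user-driven by default

    def path_is_clear(self) -> bool:
        """Stand-in for 'the situation permits': no hard choices ahead."""
        return True

    def on_stress_signal(self, user_stressed: bool) -> None:
        """Adapt behavior to the workload signal Brainput relays."""
        if user_stressed and self.path_is_clear():
            self.mode = "autonomous"  # temporarily relieve the user
        elif not user_stressed:
            self.mode = "manual"      # hand control back once stress drops

red = MazeBot("red")
red.on_stress_signal(user_stressed=True)
print(red.name, red.mode)  # -> red autonomous
```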
The Brainput team explores a different approach to interfaces. Instead of using the BCI as a direct input device like a mouse or a keyboard, they use it as a passive input channel that gives the machine more information about the user, and the tools to respond accordingly. Ultimately this could result in human-machine interaction modeled on cooperation rather than command and control. A nice gesture on our part that might convince the machines to abandon the Terminator scenario.
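That distinction between a direct channel and a passive channel is easy to illustrate in a few lines. Everything in this sketch is hypothetical; the point is only that the brain-state stream tunes behavior instead of issuing commands.

```python
# A hypothetical sketch of that distinction. The direct channel
# carries explicit commands; the passive brain-state channel never
# commands anything, it only tunes how much the system helps.

class AdaptiveInterface:
    def __init__(self):
        self.assist_level = 0.0  # how much the machine helps out

    def on_command(self, command: str) -> None:
        """Direct input (mouse/keyboard): do exactly what the user says."""
        print(f"executing: {command}")

    def on_brain_state(self, workload: float) -> None:
        """Passive input: context about the user, not an instruction."""
        self.assist_level = min(1.0, max(0.0, workload))

ui = AdaptiveInterface()
ui.on_command("steer red bot left")  # explicit, always obeyed
ui.on_brain_state(0.85)              # implicit, quietly raises assistance
print(ui.assist_level)
```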
Although the researchers have shown that Brainput works, it is still at an early stage of development. Next, they want to test the system's performance with larger groups of simulated robots, and then move on to physical robots. You can find their paper via the link below.
Brainput: Enhancing Interactive Systems with Streaming fNIRS Brain Input [pdf]
Via: Extremetech.com