Brain-Computer Interfaces are Incoming: Are we prepared?

Humans controlling machines with their minds may sound like something from a sci-fi movie, but it’s becoming a reality through brain-computer interfaces. Understanding this emerging technology now can help ensure that effective policies are in place before BCI becomes a part of everyday life.

Today, with increasingly advanced technology, the “intention” control seen in science fiction movies is gradually becoming reality. BCI technology is the technical basis of this “superpower”: it converts brain signals into control commands for external devices. To make the interaction two-way, an information input channel into the brain has also been added, producing a combined brain-plus-computer intelligence. By connecting the brain to the outside world, the intelligence of the brain and the intelligence of the machine complement each other and work together in a symbiotic relationship.

BCI is already widely used in clinical medicine, educational research, gaming and entertainment, consumer electronics, and many other fields. The technology promises to bridge the gap between people and machines, creating enormous social and commercial value and attracting attention from technology giants and investors alike.

Neuralink, a cutting-edge technology pioneer founded by Elon Musk (with more than $100 million in financing), released a BCI system this year. The system can be used for closed-loop neuromodulation, combining the reading of neuronal signals with electrical stimulation of the brain for the diagnosis and treatment of a range of neurological diseases, including epilepsy, depression, Parkinson’s disease, and Alzheimer’s disease. Facebook is reportedly also investing in this field, with the goal of helping people with impaired language function.

. . .

How does BCI Work?


One of the biggest challenges facing brain-computer interface researchers today is the basic mechanics of the interface itself. The easiest and least invasive method is a set of electrodes — a device known as an electroencephalograph (EEG) — attached to the scalp. The electrodes can read brain signals. However, the skull blocks a lot of the electrical signal, and it distorts what does get through. To get a higher-resolution signal, scientists can implant electrodes directly into the gray matter of the brain itself, or on the surface of the brain, beneath the skull. This allows for much more direct reception of electric signals and allows electrode placement in the specific area of the brain where the appropriate signals are generated. This approach has many problems, however. It requires invasive surgery to implant the electrodes, and devices left in the brain long-term tend to cause the formation of scar tissue in the gray matter. This scar tissue ultimately blocks signals.

Regardless of the location of the electrodes, the basic mechanism is the same: The electrodes measure minute differences in the voltage between neurons. The signal is then amplified and filtered. In current BCI systems, it is then interpreted by a computer program, although you might be familiar with older analog encephalography, which displayed the signals via pens that automatically wrote out the patterns on a continuous sheet of paper.
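
To make the amplify, filter, and interpret pipeline described above concrete, here is a minimal sketch in Python. It assumes a single-channel signal already digitized at 250 Hz and uses a band-pass filter over the 8–30 Hz range often examined in motor-imagery BCIs; the sampling rate, band edges, and the simple band-power “interpretation” step are illustrative assumptions, not a specification of any particular BCI system.

```python
# A minimal sketch of the amplify -> filter -> interpret pipeline described above.
# Assumptions: single-channel EEG sampled at 250 Hz, band of interest 8-30 Hz.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250.0  # sampling rate in Hz (assumed)

def bandpass(raw_uV, low_hz=8.0, high_hz=30.0, order=4):
    """Band-pass filter a single-channel EEG trace (in microvolts)."""
    nyquist = FS / 2.0
    b, a = butter(order, [low_hz / nyquist, high_hz / nyquist], btype="band")
    return filtfilt(b, a, raw_uV)

def band_power(filtered_uV):
    """A simple 'interpretation' step: mean power in the chosen band."""
    return float(np.mean(filtered_uV ** 2))

# Usage: one second of simulated noisy EEG standing in for the amplified signal.
raw = np.random.randn(int(FS)) * 10.0
print(band_power(bandpass(raw)))
```

In a real system the interpretation step is far more elaborate, but the overall flow is the same: digitize, clean up the signal, and reduce it to features a program can act on.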

In the case of a sensory input BCI, the function happens in reverse. A computer converts a signal, such as one from a video camera, into the voltages necessary to trigger neurons. The signals are sent to an implant in the proper area of the brain, and if everything works correctly, the neurons fire, and the subject receives a visual image corresponding to what the camera sees.
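
The sketch below illustrates that reverse direction in the simplest possible terms: a camera frame is downsampled onto a grid of electrodes and pixel brightness is scaled into a stimulation level per electrode. The grid size, the intensity-to-current mapping, and the safety ceiling are all assumptions made for illustration; they do not describe any real implant.

```python
# Purely illustrative: convert a grayscale camera frame into per-electrode
# stimulation levels. Grid size and current limits are assumptions.
import numpy as np

GRID = (16, 16)          # assumed electrode grid (16 x 16 sites)
MAX_CURRENT_UA = 20.0    # assumed per-electrode ceiling, in microamps

def frame_to_stimulation(gray_frame):
    """Average-pool a grayscale frame (values 0-255) onto the electrode grid
    and scale brightness into a stimulation current for each electrode."""
    h, w = gray_frame.shape
    gh, gw = GRID
    pooled = gray_frame[: h - h % gh, : w - w % gw] \
        .reshape(gh, h // gh, gw, w // gw).mean(axis=(1, 3))
    return (pooled / 255.0) * MAX_CURRENT_UA   # brighter pixel -> stronger stimulation

# Usage: a fake 128x128 camera frame.
frame = np.random.randint(0, 256, size=(128, 128)).astype(float)
currents = frame_to_stimulation(frame)
print(currents.shape, currents.max())
```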

Invasive BCI would require surgery. Electronic devices would need to be implanted beneath the skull, directly into the brain, to target specific sets of neurons. BCI implants currently under development are tiny and can engage up to a million neurons at once. For example, a research team at the University of California, Berkeley, has created implantable sensors that are roughly the size of a grain of sand. They call these sensors “neural dust.” Take a look at the so-called neural dust pictured below.
"Neural Dust" Sensors


. . .

So, are we really prepared for BCI?

One of the most exciting areas of BCI research is the development of devices that can be controlled by thoughts. Some of the applications of this technology may seem frivolous, such as the ability to control a video game by thought. If you think a remote control is convenient, imagine changing channels with your mind.

However, there’s a bigger picture — devices that would allow severely disabled people to function independently. For a quadriplegic, something as basic as controlling a computer cursor via mental commands would represent a revolutionary improvement in the quality of life. But how do we turn those tiny voltage measurements into the movement of a robotic arm?


A more difficult task is interpreting the brain signals for movement in someone who can’t physically move their own arm. With a task like that, the subject must “train” to use the device. With an EEG or implant in place, the subject would visualize closing his or her right hand. After many trials, the software can learn the signals associated with the thought of hand-closing. Software connected to a robotic hand is programmed to receive the “close hand” signal and interpret it to mean that the robotic hand should close. At that point, when the subject thinks about closing the hand, the signals are sent and the robotic hand closes.
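
The training loop described above can be sketched as a small decoding problem: labeled trials of brain-signal features (imagined hand-closing versus rest) are used to fit a classifier, which then maps new signals to a robotic-hand command. The synthetic band-power features, the class labels, and the robotic-hand hook below are assumptions for illustration, not the method used by any particular lab.

```python
# A hedged sketch of "training" a BCI decoder: fit a classifier on labeled trials,
# then use it to turn new brain-signal features into a robotic-hand command.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Pretend features: mu-band power over 8 channels, 100 trials per class.
rest_trials  = rng.normal(loc=1.0, scale=0.2, size=(100, 8))
close_trials = rng.normal(loc=0.7, scale=0.2, size=(100, 8))  # motor imagery tends to suppress mu power
X = np.vstack([rest_trials, close_trials])
y = np.array([0] * 100 + [1] * 100)   # 0 = rest, 1 = "close hand"

decoder = LinearDiscriminantAnalysis().fit(X, y)

def on_new_trial(features):
    """Interpret one new trial and drive the (hypothetical) robotic hand."""
    if decoder.predict(features.reshape(1, -1))[0] == 1:
        print("close robotic hand")
    else:
        print("keep hand open")

on_new_trial(rng.normal(loc=0.7, scale=0.2, size=8))
```

The “many trials” in the passage above are what supply the labeled examples; the more consistent the subject’s imagined movement, the easier the classifier’s job becomes.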

. . .

Conclusion:

As with any emerging technology, BCI carries many risks and unknowns. Before BCI matures, it’s important for developers to plan ahead and consider the ethical and policy issues surrounding complicated and potentially frightening scenarios.

For instance, advanced BCI technology could be used to reduce pain or even regulate emotions. What happens when military personnel are sent into battle with a reduced sense of fear? And when they return home, what psychological side effects might veterans experience without their “superhuman” traits? Now may be the perfect time to think through these scenarios and ensure that there are guardrails in place ahead of time.

There can be a knee-jerk reaction to emerging technology: that it will take jobs away or that it will be militarized. But BCI is not that different from the automobile; it can be dangerous, but it can also be very helpful. I wish we had had these policy discussions about artificial intelligence and robotics 20 years ago because, in many ways, people are now being reactive. People fear what they don’t understand. We all need to understand BCI so that we can ensure we’re not reckless with it.

As BCI developers prepare, they should carefully weigh the opportunities against the risks.
