AI Could Hijack Artificially Augmented Brain, Checkmate For Cybernetics?

Brain-Computer Interface (BCI). Anders Sandberg/Wikimedia

The debate over the dangers posed by artificial intelligence has been going on in the tech industry for some time now. One proposed way to keep humans from being made obsolete by intelligent machines is machine augmentation, or cybernetics. However, it seems even that solution has been put in check before it could be implemented, with AIs potentially able to hijack augmented brains.

One of the biggest proponents of augmenting the human brain through a brain-computer interface (BCI) is Tesla CEO Elon Musk; that is what his startup Neuralink is trying to accomplish. However, the Morningside Group, a collective of 27 experts in neuroscience, machine intelligence, and related fields, is warning that this could be a problem.

A BCI basically works by hacking the human brain to give it better abilities, which could potentially include better memory retention, improved reaction time, and sharper focus. However, since the technology relies on computers, it can also be hacked, and an advanced AI could do this with ease.

The group published its findings in the journal Nature, identifying four areas that need attention as BCI technology develops. Much of the concern has to do with privacy, security, and even personal identity.

“Such advances could revolutionize the treatment of many conditions, from brain injury and paralysis to epilepsy and schizophrenia, and transform human experience for the better,” the paper reads. “But the technology could also exacerbate social inequalities and offer corporations, hackers, governments or anyone else new ways to exploit and manipulate people. And it could profoundly alter some core human characteristics: private mental life, individual agency and an understanding of individuals as entities bound by their bodies.”

While these concerns don't necessarily call for halting BCI development, they do present challenges that must be overcome. If they go unaddressed, AIs might find it far easier than before to disarm humans.
