

Brain implants could allow soldiers to fire weapons with their thoughts and turn off fear -- but what about the ethics of all this?

From warfare to entertainment and VR, brain-computer interface development has extended beyond prosthetics for patients with disabilities. Missing is full ethical consideration of the consequences.

Nancy S. Jecker and Andrew Ko
December 2, 2022 @ 6:06 pm


Image credit: Erika Woodrum/HHMI/Nature.

Imagine that a soldier has a tiny computer device injected into their bloodstream that can be guided with a magnet to specific regions of their brain. With training, the soldier could then control weapon systems thousands of miles away using their thoughts alone. Embedding a similar type of computer in a soldier’s brain could suppress their fear and anxiety, allowing them to carry out combat missions more efficiently. Going one step further, a device equipped with an artificial intelligence system could directly control a soldier’s behavior by predicting what options they would choose in their current situation.

While these examples may sound like science fiction, the neurotechnologies behind them are already in development. Brain-computer interfaces, or BCIs, decode brain signals and transmit them to an external device to carry out a desired action. In essence, a user need only think about what they want to do, and a computer does it for them.

BCIs are currently being tested in people with severe neuromuscular disorders to help them recover everyday functions like communication and mobility. For example, a patient can turn on a light switch by visualizing the action while a BCI decodes their brain signals and transmits the command to the switch. Likewise, a patient can focus on specific letters, words or phrases on a computer screen, and a BCI will move a cursor to select them.
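The decoding step described above can be illustrated with a toy sketch. This is not how any clinical BCI actually works; it only shows the general idea of mapping a neural signal to a discrete command. The sampling rate, frequency band, threshold and the `decode_command` function are all illustrative assumptions, and the "EEG" windows are synthetic:

```python
import numpy as np

FS = 250             # assumed sampling rate in Hz
ALPHA = (8.0, 13.0)  # alpha band, one rhythm modulated by motor imagery

def band_power(signal, fs, band):
    """Mean spectral power of `signal` within the frequency `band`."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return power[mask].mean()

def decode_command(signal, fs=FS, threshold=50.0):
    """Map a one-second signal window to a binary command."""
    return "TOGGLE_SWITCH" if band_power(signal, fs, ALPHA) > threshold else "IDLE"

# Synthetic one-second windows: a strong 10 Hz rhythm vs. low-level noise.
t = np.arange(FS) / FS
rng = np.random.default_rng(0)
imagery = 2.0 * np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(FS)
rest = 0.1 * rng.standard_normal(FS)
```

Here `decode_command(imagery)` yields the switch command while `decode_command(rest)` yields idle. Real systems replace the single threshold with trained classifiers over many electrode channels, but the pipeline shape — acquire, extract features, classify, actuate — is the same.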

However, ethical considerations have not kept pace with the science. While ethicists have pressed for more ethical inquiry into neural modification in general, many practical questions around brain-computer interfaces have not been fully considered. For example, do the benefits of BCI outweigh the substantial risks of brain hacking, information theft and behavior control? Should BCI be used to curb or enhance specific emotions? What effect would BCIs have on the moral agency, personal identity and mental health of their users?

These questions are of great interest to us, a philosopher and neurosurgeon who study the ethics and science of current and future BCI applications. Considering the ethics of this technology before it is implemented could help prevent its potential harms. We argue that responsible use of BCI requires safeguarding people’s ability to function in a range of ways that are considered central to being human.

Expanding BCI beyond the clinic

Researchers are exploring nonmedical brain-computer interface applications in many fields, including gaming, virtual reality, artistic performance, warfare and air traffic control.

For example, Neuralink, a company co-founded by Elon Musk, is developing a brain implant for healthy people to potentially communicate wirelessly with anyone with a similar implant and computer setup.

In 2018, the U.S. military’s Defense Advanced Research Projects Agency launched a program to develop “a safe, portable neural interface system capable of reading from and writing to multiple points in the brain at once.” Its aim is to produce nonsurgical BCI for able-bodied service members for national security applications by 2050. For example, a soldier in a special forces unit could use BCI to send and receive thoughts with a fellow soldier and unit commander, a form of direct three-way communication that would enable real-time updates and more rapid response to threats.

To our knowledge, these projects have not opened a public discussion about the ethics of these technologies. While the U.S. military acknowledges that “negative public and social perceptions will need to be overcome” to successfully implement BCI, practical ethical guidelines are needed to better evaluate proposed neurotechnologies before deploying them.

Utilitarianism

One approach to tackling the ethical questions BCI raises is utilitarian. Utilitarianism is an ethical theory that strives to maximize the happiness or well-being of everyone affected by an action or policy.

Enhancing soldiers might create the greatest good by improving a nation’s warfighting abilities, protecting military assets by keeping soldiers remote, and maintaining military readiness. Utilitarian defenders of neuroenhancement argue that emergent technologies like BCI are morally equivalent to other widely accepted forms of brain enhancement. For example, stimulants like caffeine can improve the brain’s processing speed and may improve memory.

However, some worry that utilitarian approaches to BCI have moral blind spots. In contrast to medical applications designed to help patients, military applications are designed to help a nation win wars. In the process, BCI may ride roughshod over individual rights, such as the right to be mentally and emotionally healthy.

For example, soldiers operating drone weaponry in remote warfare today report higher levels of emotional distress, post-traumatic stress disorder and broken marriages compared to soldiers on the ground. Of course, soldiers routinely elect to sacrifice for the greater good. But if neuroenhancement becomes a job requirement, it could raise unique concerns about coercion.

Neurorights

Another approach to the ethics of BCI, neurorights, prioritizes certain ethical values even if doing so does not maximize overall well-being.

Proponents of neurorights champion individuals’ rights to cognitive liberty, mental privacy, mental integrity and psychological continuity. A right to cognitive liberty might bar unreasonable interference with a person’s mental state. A right to mental privacy might require ensuring a protected mental space, while a right to mental integrity would prohibit specific harms to a person’s mental states. Lastly, a right to psychological continuity might protect a person’s ability to maintain a coherent sense of themselves over time.

Person using a brain-computer interface, wearing an EEG cap connected to a laptop
Brain-computer interfaces can take different forms, such as an EEG cap or implant in the brain. oonal/E+ via Getty Images

BCIs could interfere with neurorights in a variety of ways. For example, if a BCI tampers with how the world seems to a user, they might not be able to distinguish their own thoughts or emotions from altered versions of themselves. This may violate neurorights like mental privacy or mental integrity.

Yet soldiers already forfeit similar rights. For example, the U.S. military is allowed to restrict soldiers’ free speech and free exercise of religion in ways that are not typically applied to the general public. Would infringing neurorights be any different?

Human capabilities

A human capability approach insists that safeguarding certain human capabilities is crucial to protecting human dignity. While neurorights home in on an individual’s capacity to think, a capability view considers a broader range of what people can do and be, such as the ability to be emotionally and physically healthy, move freely from place to place, relate with others and nature, exercise the senses and imagination, feel and express emotions, play and recreate, and regulate the immediate environment.

We find a capability approach compelling because it gives a more robust picture of humanness and respect for human dignity. Drawing on this view, we have argued that proposed BCI applications must reasonably protect all of a user’s central capabilities at a minimal threshold. BCI designed to enhance capabilities beyond average human capacities would need to be deployed in ways that realize the user’s goals, not just other people’s.

For example, a bidirectional BCI that not only extracts and processes brain signals but delivers somatosensory feedback, such as sensations of pressure or temperature, back to the user would pose unreasonable risks if it disrupts a user’s ability to trust their own senses. Likewise, any technology, including BCIs, that controls a user’s movements would infringe on their dignity if it does not allow the user some ability to override it.
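The override requirement described above is a design principle, not an existing product feature, but it can be made concrete in a small sketch. Everything here — the `OverridableActuator` class, the veto flag, the command names — is hypothetical; the point is only that the user's veto gates every decoded command before it reaches an actuator:

```python
from dataclasses import dataclass, field

@dataclass
class OverridableActuator:
    """Executes decoded BCI commands, but a user veto always wins."""
    user_veto: bool = False
    log: list = field(default_factory=list)

    def request(self, command: str) -> bool:
        """Attempt a command; record and refuse it if the user has vetoed."""
        if self.user_veto:
            self.log.append(("BLOCKED", command))
            return False
        self.log.append(("EXECUTED", command))
        return True

arm = OverridableActuator()
arm.request("raise_arm")   # executed: no veto in place
arm.user_veto = True       # the user asserts control
arm.request("raise_arm")   # blocked: dignity-preserving override
```

The design choice is that the veto check sits inside the actuator itself rather than in the decoding software, so no upstream component can bypass it.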

A limitation of a capability view is that it can be difficult to define what counts as a threshold capability. The view does not describe which new capabilities are worth pursuing. Yet, neuroenhancement could alter what is considered a standard threshold, and could eventually introduce entirely new human capabilities. Addressing this requires supplementing a capability approach with a fuller ethical analysis designed to answer these questions.

Nancy S. Jecker, Professor of Bioethics and Humanities, School of Medicine, University of Washington and Andrew Ko, Assistant Professor of Neurological Surgery, University of Washington

This article is republished from The Conversation under a Creative Commons license. Read the original article.

