Private thoughts may not be private for much longer, heralding a nightmare world where political views, thoughts, obsessions, and feelings could be interrogated and punished, all thanks to advances in neurotechnology.
Or at least that’s what one of the world’s leading neuroscience legal ethicists believes.
In a new book, The Battle for Your Brain, Duke University biosciences professor Nita Farahany argues that such intrusions into the human mind by technology are so close that public discussion is long overdue, and that lawmakers should immediately put in place brain protections as they would for any other realm of personal liberty.
Advances in hacking and monitoring thought, with Orwellian fears of mind control running just below the surface, are the subject of Farahany's scholarship, along with urgent calls for legislative guarantees of thought privacy, including protections around "cognitive fingerprinting", which fall within an area of ethics widely referred to as "cognitive freedom".
Certainly the field is advancing rapidly. The recent release of ChatGPT and other AI innovations demonstrated that some aspects of simulated thinking, via machine learning, are already here. It has also been widely noted that Elon Musk's Neuralink and Mark Zuckerberg's Meta are working on brain interfaces that can read thoughts directly. A new field of cognition-enhancing drugs, called nootropics, is being developed. And technology that allows people with paralysis to control an artificial limb or write text on a screen just by thinking is already in progress.
But aside from the many benefits, there are clear threats around political indoctrination and interference, police or workplace surveillance, brain fingerprinting, the right to have thoughts, good or bad, the implications for the role of "intent" in the justice system, and so on.
Farahany, who served on Barack Obama's commission to study bioethical issues, believes that advances in neurotechnology mean that intrusions through the brain's privacy door, whether by military programs or by well-funded research laboratories at big tech companies, are close at hand through brain-to-computer innovations such as wearable technology.
“All the major tech companies have massive investments in all-in-one devices that have brain sensors,” Farahany said. “Neural sensors will become a part of our everyday technology and a part of how we interact with that technology.”
Coupled with scientific advances aimed at decoding and rewriting brain functions, such devices are becoming widespread and pose a discernible risk, Farahany argues, requiring urgent action to bring them under agreed controls.
"We have a moment to get this right before that happens, by becoming aware of what is happening and making the critical decisions we must make now about how we use technology in ways that are good, and not misused or oppressive."
The brain, Farahany warns, is the only space we still have for respite and privacy, where people can cultivate a true sense of self and keep their feelings and reactions to themselves. "In the very near future that will not be possible," she said.
In a sense, we already use technology to translate our thoughts and assist our minds. Social networks already offer a kind of mind-reading, for free, through engagement with likes and dislikes, predictive algorithms, predictive text and the like.
But advances in neurotechnologies, which exploit a direct connection to the brain, would offer more precise and therefore potentially dangerous inroads into a hitherto private realm.
“I wrote this book with neurotechnology at the forefront as a wake-up call, but not just neurotechnology, but all the ways that brains can be hacked and traced and are already being hacked and traced,” Farahany said.
Concerns about military-focused neuroscience, sometimes called the sixth domain of warfare, are not new in themselves.
The Defense Advanced Research Projects Agency (Darpa) has been funding brain research since the 1970s. In 2001, the agency launched a program to "develop technologies to augment fighters."
François du Cluzel, project manager at the Nato Act Innovation Hub, issued a report in November 2020 on cognitive warfare that, he said, "is not limited to the military or institutional world. Since the early 1990s, this capacity has tended to apply to the political, economic, cultural, and social fields."
The US government has blacklisted Chinese institutes and companies it believes are working on dangerous “biotech processes to support Chinese military end-uses,” including “alleged brain control weaponry.”
At the end of 2021, the US Department of Commerce added 34 China-based entities to its blacklist, citing some for participating in the creation of biotechnology including "alleged brain control weapons" and for "acting against foreign policy or national security interests" of the US.
Nathan Beauchamp-Mustafaga, policy analyst at the Rand Corporation and author of the China Brief, has warned of an “evolution in warfare, moving from the natural and material domains (land, sea, air and electromagnetic) to the realm of the human mind.”
Farahany argues that societies need to move beyond tackling cognitive warfare or banning TikTok. Legislation is needed to establish brain rights or cognitive freedoms, as well as raise awareness of the risks of intrusion posed by digital platforms embedded with advances in neuroscience.
"Neuro-rights" laws are already being drafted, including protections on the use of biometric data in legal and health settings. Two years ago, Chile became the first nation to add articles to its constitution explicitly addressing the challenges of emerging neurotechnologies. The US state of Wisconsin has also passed laws on the collection of biometric data about the brain.
Most legal protections refer to the disclosure of brain data collection, not to neurological rights themselves.
"There is no comprehensive right to cognitive freedom as I define it, which applies to much more than neurotechnologies: to self-determination over our brains and mental experiences, and to many of the digital technologies we are approaching today," Farahany said.
Or, as Farahany writes in her book: "Will George Orwell's dystopian vision of mind crime become a modern reality?"
The answer could be yes, no, or maybe, but none of that lessens the urgent need for formal brain protections, which lawmakers or commercial interests may be unwilling to put in place, Farahany believes.
She said: “Cognitive freedom is part of a much larger conversation that I think is incredibly urgent given all that is already happening, and the increasing precision with which it will happen, within neurotechnology.”