As EEG and fMRI “smartcaps” make their way into general use, Duke philosopher and law professor Nita Farahany says it’s time to engage in a “battle for our brains.” The 70,000 thoughts we have daily are becoming more and more transparent. It’s time to fight for our neurorights.
I’d like to know how they count them, but Nita Farahany says researchers have determined that we have 70,000 thoughts per day.
I just did the math. That’s just under 3,000 per hour, or about 49 per minute. That sounds about right to me. They’re hard to count on the fly but, as I write, I figure I’m throwing off thoughts at a rate of about one per second, assuming each word counts as a thought, as I suspect it does. That’s worth researching.
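For readers who want to check the arithmetic, the rates follow from simple division of the 70,000-per-day figure:

```python
# Back-of-the-envelope check of the "70,000 thoughts per day" figure.
thoughts_per_day = 70_000

per_hour = thoughts_per_day / 24      # just under 3,000
per_minute = per_hour / 60            # about 49
per_second = per_minute / 60          # roughly one per second (0.81)

print(round(per_hour), round(per_minute), round(per_second, 2))
# → 2917 49 0.81
```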
At any rate, it used to be that criminals could access and steal our thoughts only by hacking into our computers and phones, but now our brains are open season as well. We can strap an EEG machine onto our foreheads and monitor our brains for signs of all kinds of things. Or we can put on a helmet that has an fMRI machine built into it.
Call them “smartcaps,” a new class of powerful brain-decoding devices moving into the brain-computer interface market.
As Farahany recently explained at the World Economic Forum, smartcaps “measure a firing of neurons in your brain and the tiny electrical discharges that occur as a result of those firings.” Such devices can measure the brain’s activity in different states and provide feedback that enables users, and the experts monitoring these signals, to change those states.
Sports teams are starting to use them to track and enhance individual performance, she says. Transportation companies are using them to detect drowsiness in pilots and truckers.
Introducing Kernel Flow
A cutting-edge “time domain functional near infrared spectroscopy” (TD-fNIRS) system called Kernel Flow, introduced in late 2020, is helping researchers better understand “the effect of anything on the brain,” the company claims.
“Weighing a couple of pounds each, the helmets contain nests of sensors and other electronics that measure and analyze a brain’s electrical impulses and blood flow at the speed of thought, providing a window into how the organ responds to the world,” Bloomberg’s Ashlee Vance reported. “The basic technology has been around for years, but it’s usually found in room-size machines that can cost millions of dollars and require patients to sit still in a clinical setting.”
He continues: “The promise of a leagues-more-affordable technology that anyone can wear and walk around with is, well, mind-bending. Excited researchers anticipate using the helmets to gain insight into brain aging, mental disorders, concussions, strokes, and the mechanics behind previously metaphysical experiences such as meditation and psychedelic trips.”
“To make progress on all the fronts that we need to as a society, we have to bring the brain online,” he quotes Kernel founder Bryan Johnson as saying. Johnson spent more than five years and raised about $110 million, half of it his own money, to develop the helmets.
The Prospect of Employer Surveillance
All of this is very exciting, says Farahany, but here’s the problem: “There’s nothing in any country that protects you against this type of information being used against you by an employer.”
Tough questions lie ahead. Consider an airline that wants to protect the public from suicidal commercial pilots who might opt to end their lives by taking down a plane, the fate met by 144 passengers and six crew members on Germanwings Flight 9525, which co-pilot Andreas Lubitz crashed in the French Alps on March 24, 2015. Lubitz had been treated for suicidal tendencies and declared “unfit to work” by his doctor, but Lubitz kept this information from his employer and instead reported for duty.
“Although Mr. Lubitz was a high-profile example of pilot suicide, his was not an isolated case,” The New York Times reported. “Over the past two decades, at least a half-dozen fatal airline crashes have been attributed to deliberate actions by the pilot. Other episodes and close calls have been quietly played down by investigators.”
According to a Harvard study conducted shortly after the crash, “hundreds of pilots currently flying are managing depressive symptoms, perhaps without the possibility of treatment due to the fear of negative career impacts.” Out of 1,837 pilots surveyed, 233 met the depression threshold and 75 reported having suicidal thoughts.
In the wake of the crash, the European Aviation Safety Agency called for the “mandatory and comprehensive psychological screening by a qualified specialist of all prospective pilots either during their initial training or before they are hired.”
A logical next step would be to include periodic brain scans to search for signs of potential problems. The public interest certainly would be served by such a precaution, but what of the concerns pilots will certainly have about having their thoughts surveilled in this way?
We have the Genetic Information Nondiscrimination Act, which protects workers from employers who might want to use their genetic information against them, says Farahany. But we have no “Neurological Information Nondiscrimination Act” or its equivalent.
“Ultimately, this is the question not just about the productivity of our workforce, but about the culture of our workforce and our society,” Farahany says. “We have to ask if there are any limits.”
Conversations in Our Churches
AI-powered technology affects privacy interests across many social vectors, says AI and Faith founder David Brenner, pointing to a package of articles on privacy issues the organization published in its March 2021 newsletter. This is from his introduction:
“Sensibly regulating this technology to protect privacy and encourage human flourishing depends on articulating foundational principles for the value of personal privacy. Finding such principles in the ancient wisdom of faith doctrines and beliefs is a great opportunity for people of faith to contribute to and stabilize the increasingly fluid and ungrounded secular debate around privacy.”
Key principles also are to be found in law, especially in the decisions and other writings of Louis Brandeis. In a remarkable analogy to our modern day, Brandeis (with Samuel Warren) published The Right to Privacy in 1890, in response to the arrival in 1888 of the portable Kodak camera, which brought snapshot photography to the masses. The handheld camera was regarded as a dangerous “thief of privacy,” the revolutionary “social media” of its time.