Nita Farahany (@NitaFarahany) is a law professor at Duke University; a leading expert on the ethical, legal, and social implications of emerging technologies; and the author of The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology.
What We Discuss with Nita Farahany:
- Consumer technology that can track, decode, and even manipulate what goes on in the brain is no longer just a plot device in some far-flung sci-fi novel — it’s already beginning to come to market.
- An ALS patient recently set a record for communicating through a brain implant at 62 words per minute (in comparison, ALS-afflicted physicist Stephen Hawking was only able to communicate at about 15 words per minute by the time of his death in 2018).
- Though still in its infancy, consciously transmitted brain-to-brain communication has proven successful in the laboratory.
- Functional magnetic resonance imaging (fMRI) scans can accurately sense political bias from subjects’ unconscious thoughts.
- Brain scans reveal that a significant percentage of coma patients who can’t speak or move are aware of the world around them and can communicate through electroencephalogram (EEG) sensors.
- And much more…
Like this show? Please leave us a review here — even one sentence helps! Consider including your Twitter handle so we can thank you personally!
What are the ethical, legal, and social implications of new neurotechnologies and their impact on our ability to think and make decisions? This may seem like a question for distant generations yet unborn, but many of these technologies are already on their way to market or emerging from nascency sooner rather than later.
On this episode, Duke University law professor and The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology author Nita Farahany joins us to discuss how these technologies — such as brain-computer interfaces, brain scans, and neurostimulation — have the potential to improve our lives in many ways, but they also raise concerns about privacy, autonomy, and the potential for misuse. Listen, learn, and enjoy!
Please Scroll Down for Featured Resources and Transcript!
Please note that some links on this page (books, movies, music, etc.) lead to affiliate programs for which The Jordan Harbinger Show receives compensation. It’s just one of the ways we keep the lights on around here. We appreciate your support!
This Episode Is Sponsored By:
- Peloton: Learn more at onepeloton.com/row
- Shopify: Go to shopify.com/jordan for a free 14-day trial
- SimpliSafe: Learn more at simplisafe.com/jordan
- Other People’s Pockets: Listen here or wherever you find fine podcasts!
Did you miss our conversation with Stanford neuroscientist David Eagleman about the conscious brain vs. the subconscious brain, exploring new senses, intellectual flexibility, technological brain augmentation, and the umwelt? Catch up with episode 655: David Eagleman | How Our Brains Construct Reality here!
Thanks, Nita Farahany!
If you enjoyed this session with Nita Farahany, let her know by clicking on the link below and sending her a quick shout-out on Twitter:
And if you want us to answer your questions on one of our upcoming weekly Feedback Friday episodes, drop us a line at email@example.com.
Resources from This Episode:
- The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology by Nita A. Farahany | Amazon
- Nita Farahany | Website
- Nita Farahany | Twitter
- Nita Farahany | Facebook
- Nita Farahany | Instagram
- The Most Accurate Smart Ring | Oura Ring
- Measure Alertness. Eliminate Fatigue. | SmartCap
- Inside Facebook Reality Labs: Wrist-Based Interaction for the Next Computing Platform | Tech at Meta
- EEG (Electroencephalogram) | Mayo Clinic
- What is ALS (Amyotrophic Lateral Sclerosis)? | The ALS Association
- Breakthrough Technology for the Brain | Neuralink
- The Leader in Brain-Computer Interface Technology | Blackrock Neurotech
- An ALS Patient Set a Record Communicating through a Brain Implant: 62 Words per Minute | MIT Technology Review
- Stephen Hawking Speaks at MIT (1994) | MIT
- With BrainNet, Three People Play Tetris with Their Minds | Futurity
- Brain Scans Remarkably Good at Predicting Political Ideology | Ohio State News
- Some People Who Appear to Be in a Coma May Actually Be Conscious | Scientific American
- Better Laws. Stronger States | Uniform Law Commission
810: Nita Farahany | Thinking Freely in the Age of Neurotechnology
[00:00:00] Jordan Harbinger: Special thanks to Peloton for sponsoring this episode of The Jordan Harbinger Show.
[00:00:04] Coming up next on The Jordan Harbinger Show.
[00:00:07] Nita Farahany: I'm inviting the reader to imagine the possible ways in which even with the current technology, not having to get into real decoding of complex thought in the brain, it still can interfere with freedom of contract. It still can be misused in the workplace. It still can be misused, but there's already actually a lot of use by governments worldwide of neurotechnology in ways that are good and ways that are bad and nefarious.
[00:00:37] Jordan Harbinger: Welcome to the show. I'm Jordan Harbinger. On The Jordan Harbinger Show, we decode the stories, secrets, and skills of the world's most fascinating people. We have in-depth conversations with scientists, entrepreneurs, spies, and psychologists, even the occasional drug trafficker, money laundering expert, economic hitman, or tech mogul. And each episode turns our guest wisdom into practical advice that you can use to build a deeper understanding of how the world works and become a better thinker.
[00:01:03] If you're new to the show or you want to tell your friends about the show, our starter packs are how you do it. These are collections of favorite episodes organized by topic, and they help new listeners get a taste of what we do here on the show — topics like persuasion and influence, crime and cults, China, North Korea, abnormal psychology, and more. Just visit jordanharbinger.com/start or search for us in your Spotify app to get started. Of course, you can use our AI chatbot search thingamajig to go and find anything from any episode of the show, including promo codes, Feedback Friday answers and advice, et cetera. jordanharbinger.com is where you can find it.
[00:01:36] Today, on the show, we'll explore some new technologies that can actually read our brainwaves and see into us like nothing ever before. It sounds like science fiction. In fact, I thought it was many years away when I first started researching this, but in fact, this technology is already here. The good news, lots of medical innovations and new science on the horizon. The bad news, social media companies, governments, and police are actually some of those on the forefront of adoption and development of these new discoveries, so wah, wah, sad trombone. But today we will uncover how this tech works, what it can and cannot do, and how we've got a special opportunity right now to shape how it's used before it's too late. I think the tech and conversation were fascinating in this one, and I know you'll be into it as well.
[00:02:18] So here we go with Nita Farahany.
[00:02:24] Your book almost starts off like science fiction, and I think we talked about this pre-show where I said, "Oh yeah, 10 years from now—" and you're like, "No, hold on, let me stop you right there." The idea that somebody can read our brainwaves and possibly see what we're thinking — I know it sounds impossible or so far in the future that it's not worth worrying about, but here's some stuff to worry about, Jordan.
[00:02:45] Nita Farahany: First, I would say most people, when I start to talk about the coming age of consumer brain wearables, the fact that we can track and hack your brain, they're all like, "Oh yeah, okay, I've got a lot of things to worry about, but that's not something I'm going to worry about because that's far off into the future." No, it's here. It's arrived. There are already consumer brain sensors. Let me just, I'll back up even a step further.
[00:03:05] Jordan Harbinger: Sure.
[00:03:05] Nita Farahany: People are very familiar with, at this point, having a heart rate sensor in their watch. They're very familiar with a Fitbit that they wear that tracks their movements.
[00:03:13] Jordan Harbinger: I got a ring on right now that does that, right?
[00:03:15] Nita Farahany: Yeah.
[00:03:15] Jordan Harbinger: No big surprise.
[00:03:16] Nita Farahany: Probably picks up like mood and changes in, you know, temperature.
[00:03:19] Jordan Harbinger: Mood? It's not a mood ring. This isn't 1991.
[00:03:22] Nita Farahany: No, no, no, no, but there are. Do you have the Oura?
[00:03:26] Jordan Harbinger: I have the Oura Ring. Yeah, they're a sponsor of the show, so you can mention them by name. Shout out to Oura Ring.
[00:03:30] Nita Farahany: Okay.
[00:03:30] Jordan Harbinger: I don't know if we have a code, but whatever. Check the deals page. Yeah, I think it does body temperature, whether I'm sleeping, although I'm always sort of like, how does it know if I'm actually asleep? Sometimes it's a little bit wrong, but it's kind of hard to complain. It's amazing. It's a ring and it still knows that I'm moving around or that I have a fever. It's incredible.
[00:03:47] Nita Farahany: It's very small and it tells a lot of information about you and it's making inferences based on how much you're moving and what your temperature is and how temperature changes. All of those things, it's making inferences, but it's pretty good. People are used to GPS location being tracked pretty much at all times now through their phones and other devices that they carry around with them. But when I tell them that, okay, and now it's the era of brain sensors doing the same thing, they're startled. Like it just hadn't occurred to them that we would soon have sensors that you could wear on your scalp and in your ears that would pick up your brain activity in much the same way that all of the rest of our bodily information is being decoded.
[00:04:25] And those sensors are already being used by millions of people worldwide in headsets that they wear for meditation, and they are already being used in workplaces, where more than 5,000 companies worldwide are using just one of these companies' devices. That company, called SmartCap, puts these sensors in baseball caps, on a headband, or inside of a hardhat to pick up brain activity. And here it's the electrical activity in the brain that it's picking up, to try to decode if a person is wide awake or asleep. And launching literally within the next few weeks, and also in the next few months, are a bunch of products that are now multifunctional ones. By that I mean, you know, we both have earbuds in right now.
[00:05:11] Jordan Harbinger: Mm-hmm.
[00:05:11] Nita Farahany: And those earbuds can allow us to take conference calls and do podcasts and listen to music and the next generation of them have brain sensors in them that also pick up the electrical activity in the brain and allow us to track a lot of what's happening there. Likewise, headphones, major headphone manufacturers have partnered with neurotech companies so that the soft cups that go around your ears have a bunch of sensors that are put into them that pick up brainwave activity as well. And so the era of tracking and decoding, and even manipulating and changing the brain has arrived. And it's about to go mainstream as the huge tech companies like Meta and Google and Apple and Microsoft are all in on developing and integrating these devices into their devices.
[00:05:57] Jordan Harbinger: If I didn't know who you were and we hadn't talked and clicked in the way that we did, I would almost think this is one of those kooky things, like she's reading into stuff, like, oh my God, they're putting microchips in the vax, right? It's almost at that level of impossible-sounding. How is it possible? How are my brainwaves coming out of my head into earbuds that don't exist yet, and being read by somebody? It just sounds fake.
[00:06:22] Nita Farahany: Yeah.
[00:06:22] Jordan Harbinger: I know it's not because I read the book, but I can't wrap my tiny brain around this yet.
[00:06:28] Nita Farahany: Yeah. Well, I mean, so first of all, I would say one of the challenges of being in the space that I'm in is that there are a whole bunch of people who believe that their brains are being tracked, which they're not.
[00:06:41] Jordan Harbinger: They wear tinfoil hats. Like, are those people right, now? Have we come that far?
[00:06:46] Nita Farahany: I mean, maybe in the future, right? But like there are a whole bunch of people who have believed for a very long time that there have been chips placed in their brains and that their brains are being hacked and tracked by other people. I'm talking about a very mainstream, real technology that every major tech company is investing in developing, and many are already on the market. So Meta, for example, just had a big announcement a couple of weeks ago. You know, they've had huge layoffs across the board.
[00:07:15] Jordan Harbinger: Yeah.
[00:07:15] Nita Farahany: It's unclear exactly what's happening with the metaverse for them, but it's very clear that they are doubling down on their AR investment, on augmented reality. And one piece of Reality Labs, and probably their biggest acquisition, one of the earliest things that I write about in chapter one of the book, is their acquisition of this company called CTRL-labs. And CTRL-labs was like my aha moment. I've been thinking about neurotechnology forever, but it was hard to see how it could go mainstream, both because there wasn't, like, the killer app to make it mainstream, and because it was a separate device you'd have to wear. It wasn't a sensor that you could embed into everyday technology.
[00:07:51] And I saw this presentation by CTRL-labs at the time where they were talking about taking neural sensors, brain sensors, and putting them into basically the form of a watch to pick up brain activity as it goes from your brain, down your arm, and through motor neurons that tell your hand to type or to move or to swipe. And they were like, this could be the universal control over all of the rest of our technology. We can integrate this into a watch. And I was like, wow. That blew my mind. First of all, the idea that you could type from your mind, that you could swipe from your mind, that you could interact with the rest of your technology, rather than just what were then niche applications, things like meditating with neural feedback. So I was like, okay, Apple is going to buy them because that's going to go right into the Apple Watch.
[00:08:36] Jordan Harbinger: Yeah.
[00:08:37] Nita Farahany: And when a year later, Meta announced that they had acquired them for at least half a billion dollars, I was like, wow, this is about to get incredibly real because—
[00:08:48] Jordan Harbinger: Yeah.
[00:08:48] Nita Farahany: —Meta has decided that this is part of its suite. And so a couple of weeks ago they announced that they're launching, in early 2025, their neural interface watch together with AR, and that it will be the way in which you'll navigate: the joystick, the controller. And that's a big deal. Other companies are already launching their earbuds and their headphones, but the fact is that the major tech players are all in on this. This isn't science fiction. I'm not some crazy person.
[00:09:17] Jordan Harbinger: Mm-hmm.
[00:09:17] Nita Farahany: I'm telling people about technology that somehow the mainstream has just missed that the target of all of the major tech companies, the next big thing in technology is the quantification, the decoding, the tracking and hacking of our brains.
[00:09:32] Jordan Harbinger: How accurate is this stuff? Because it seems like, okay, maybe I can turn a switch on and off, or I can look at something and click on it but it seems like how — how detailed are these things able to get? Am I able to, you know, like when I dictate on my phone, I'm talking out loud, do I just not have to do that anymore? Or is it more like, okay, no, you can open your web browser and click on something and that's pretty much it? The rest of it is the same.
[00:09:57] Nita Farahany: Yeah, it's a great question. So it depends on the application. First, EEG is what we're talking about right now. There are different kinds of sensors, and EEG means electroencephalography. What that picks up is brainwave activity. So as you think, as you do anything, you have neurons firing in your brain. They give off tiny electrical discharges. Hundreds and thousands and millions of neurons are firing as you think and do things. And those happen in patterns, characteristic patterns that can be picked up by EEG, by these different sensors. And then AI, as it gets better and better and better, can decode what those patterns mean.
[00:10:32] So first, it depends on the sensors. Like, are we talking about one in each ear, which is not picking up a huge amount of brain activity? Are we talking about 16, like the ones that are going to be in your headphones, so you're going to have more activity that's being picked up? There's some noisiness, like interference from muscle movements and eye twitching and other things. Also, like, I have a lot of hair. The hair can interfere with the signal quality. And so I'd say, you know, these are not perfect in picking up data to begin with, but the AI has gotten much better. The sensors have gotten much better, and the ability to filter out noise and filter out muscle twitches has gotten much better, so that at this point, brain states can be picked up. You can't literally decode complex thoughts like my inner monologue. Simple EEG in my ears is not going to pick that up. But don't let that fool you into thinking there's nothing to worry about, because there's a lot of information you can probe for. The simplest things you can pick up: are you paying attention, are you engaged, are you bored, are you happy, are you sad, are you tired? So brain states like that.
[00:11:36] But just as you described, left and right you can pick up. And translating that kind of brain activity into moving around the screen, or on/off for your light switches, those kinds of things you can already do. Then getting to more complex things, you know, navigating around the screen, replacing your mouse, replacing your keyboard, that's coming. And decoding your intention to type, that's part of what CTRL-labs has been working on and seems to be making huge advances on, which is: instead of typing on an actual keyboard, you can think about typing once you've trained it. So like you practice a couple of times, it picks up what those signals are, then it remembers those are the signals, and you think about doing the same thing and it decodes that activity.
[00:12:19] Jordan Harbinger: See, that's really interesting just from a medical perspective. I'm terrified of getting something like ALS, you know, the one where your body just sort of slowly shuts down. It's just a nightmare, and I pray that they just fix this or figure out how to cure this. But in the meantime, if somebody can't type, or even if you're just quadriplegic or something like that, wouldn't it be amazing if, instead of having to say everything out loud, which must get exhausting after a while, you could just sort of look at something on your screen and swipe, and the thing would move for you? And then maybe you dictate a few things here and there, but the rest of the navigation is all done with your eyes. It might not even need that. It might just be able to know what you're looking at. But the less effort, the better, depending on your level of ability, I suppose.
[00:13:01] Nita Farahany: Even, not necessarily depending on your level of ability. But yeah, first of all, I would say the possibilities for people who have locked-in syndrome, who have ALS, who have the inability to communicate, who've lost their ability to speak, who've had a stroke, who are suffering paralysis and no longer have the physical movements that they once had to be able to regain their independence and ability to communicate with other people through brain-computer interface is extraordinary.
[00:13:28] And there's implanted technology. This is what people have most heard about. It's stuff like Neuralink or—
[00:13:34] Jordan Harbinger: Mm-hmm.
[00:13:35] Nita Farahany: Blackrock Neurotech. There are only about 40 people in the world right now who have those kinds of implanted technology, and those are all part of clinical trials, but still the advances are unbelievable in that group.
[00:13:47] I mean, just a couple of weeks ago, there was a report out of a research lab with an ALS patient who was able to communicate at 62 words per minute, who otherwise could not speak just from brain-computer interface technology.
[00:14:03] Jordan Harbinger: Wow.
[00:14:03] Nita Farahany: I mean, just by comparison, Stephen Hawking at the end of his life had the best technology available to him. He didn't have a brain-computer interface, right? But people were creating custom technology just for him to enable him to speak, and he was only able to speak at 15 words per minute. For most of his life, he was at one word per minute.
[00:14:20] Jordan Harbinger: Oh gosh.
[00:14:21] Nita Farahany: Like that's extraordinary. The possibility is amazing. But I mean, think about it for you too, Jordan. Like you're driving. I know you would never text while driving.
[00:14:30] Jordan Harbinger: I actually don't do that.
[00:14:31] Nita Farahany: Good.
[00:14:32] Jordan Harbinger: You basically just need one close call.
[00:14:34] Nita Farahany: Yeah.
[00:14:34] Jordan Harbinger: And you're like, oh my God. Or you have to have kids and then you realize you probably shouldn't do that. Yeah.
[00:14:39] Nita Farahany: But suppose you need to send a quick message and you would never do that while driving. Some people would.
[00:14:46] Jordan Harbinger: Mm-hmm.
[00:14:46] Nita Farahany: They shouldn't. You just think about it and you're like, "Oh, you know what, I'm just going to think about sending off the message now." Some people do that by voice, you know? Like, my car has a little button you can push, and it's really inaccurate, and most of the time I just get so frustrated I give up on trying to actually send the text message. But you can do that: you can type by thinking about typing. And the average person, I think, will find that interacting with other technology has friction that we haven't even realized is friction. It's a little weird that we have to use a mouse to move around a screen.
[00:15:16] Jordan Harbinger: It is.
[00:15:16] Nita Farahany: It's weird that we're on a QWERTY keyboard. There are these like steps between what you're thinking and what you're communicating with the rest of the world that are actually a lot of friction in getting your thoughts on communication out. And so I think the idea of a lot of this is eventually it could make our interactions with other technology much more seamless. You just think about moving and you move or you move your hands when you're in virtual reality and you end up just moving in exactly that same natural way. You don't have to use a joystick that says go left. You just think about going left or you go left or you swipe your hand left and you're there.
[00:15:51] Jordan Harbinger: I don't know if it's the inverse, but the other direction of this would be, it sort of follows that the technology would be able to speak back to you. Like, I'm looking at you on a screen, so if I wanted to bring this computer setup with me, I have to bring a freaking screen, a keyboard, and a mouse. So if they figure out the keyboard and the mouse, great, but I still need this damn screen. If the technology can communicate back into my brain, in a very similar way perhaps, then all I need is the computer. I don't know what size the computer is in my phone, but it's at least the size of my phone or smaller. But there's a screen there too, and there's a lot of battery tech in there. So what if we didn't need the battery to be as big because we didn't need the screen, the touchscreen and the display and the lights and all that stuff? Maybe the phone could be half the size or twice the power or whatever it is, and I'm just interfacing with it while it's in my pocket using my brain.
[00:16:41] Nita Farahany: Yeah, entirely possible. I think the possibilities are limited only by our imagination for what could eventually be made possible. And there are researchers I talk about in a chapter I call Beyond Human in the book. I talk about some of the research that's already been done to try to enable brain-to-brain communication, and I give this example of this experiment, which kind of blew my mind. Really we're going to look back on it one day and think like, oh, how primitive was that? But they wanted to see if they could put three different people in three different rooms and have them play a game of Tetris with each other, where the three different people would only communicate with each other brain to brain, and they would only communicate with the Tetris game by brain.
[00:17:26] And so one person, who was the receiver, as they called them, would get messages from the two different senders. The two different senders would see the full screen. They'd see the block falling from the top, they'd see the bottom of the screen, and whether or not the piece needed to be turned, and if it needed to be turned, whether to turn it left or right in order for it to fit at the bottom of the screen and clear the line. And so there would be one person who would send the message by brain using one of these EEG headsets. The person who was receiving it would have an EEG headset, and then they had something that writes to the brain called transcranial direct current stimulation. So they'd get like a little flash in their brain, which would help them visualize and give the kind of yes-no message to their brain directly.
[00:18:09] And they do this, they play it, they play for a while. They did this, I think, like 16 different times with a number of different groups. And the groups were able to do this, winning at like a rate of 88 percent of the time.
[00:18:22] Jordan Harbinger: Wow.
[00:18:22] Nita Farahany: Just by this brain-to-brain communication. And there have even been research studies where people have done brain-to-brain text communication. Like, think of a text message you want to send, and then transmit it to the other person by brain-to-brain communication and have them visualize it. And that's pretty extraordinary stuff too. So we might not need screens.
[00:18:44] Jordan Harbinger: But they're not looking at a screen, they're just seeing the text in their head.
[00:18:47] Nita Farahany: The person sees the text in their head, right? They have a screen in front of them, they see the message and then transmit it not through the computer, but through brain-computer interface to the other person. The other person sees a screen.
[00:19:04] Jordan Harbinger: Oh, okay. Because I was like, holy crap, that's amazing, even more incredible. Okay. So no, they see a screen. That makes more sense.
[00:19:04] Nita Farahany: No, no. But the first experiment is more incredible because they are seeing—
[00:19:07] Jordan Harbinger: That's right.
[00:19:07] Nita Farahany: —seeing the yes, no in their brain through a light.
[00:19:10] Jordan Harbinger: Yeah. That is cool.
[00:19:11] Nita Farahany: Like, it's literally like a flash of light that is helping, like giving them the message directly to their brains.
[00:19:17] Jordan Harbinger: So this is already sort of happening. We're just at the level of smoke signals in some way or low resolution.
[00:19:23] Nita Farahany: For that part, yeah.
[00:19:24] Jordan Harbinger: Yeah.
[00:19:24] Nita Farahany: Yeah. I mean, brain-to-brain communication, we are at the earliest stages of it, right? We're in kind of infancy, scratching the surface, but conceptually, people are already playing it out.
[00:19:34] Jordan Harbinger: Crazy. That's amazing.
[00:19:36] Nita Farahany: It is crazy.
[00:19:37] Jordan Harbinger: Does it only work for conscious thought? So I was thinking when we were talking about the medical stuff. Okay, great. I can communicate when I'm unable to type or speak. That's amazing. Where are we if somebody's in a coma and it's like, "What's going on in their brain?" Can we do that yet? Or is that stuff buried under layers that we aren't able to access?
[00:19:56] Nita Farahany: Yeah, I mean, that's such a good question and it's such a hard question, so I'm going to break it down in two ways. One is what we've just been talking about is consciously transmitted—
[00:20:07] Jordan Harbinger: Right.
[00:20:07] Nita Farahany: —communication and thought, but you can pick up a lot of pre-conscious signals, and that's part of some of the scary stuff that I write about in the book, is kind of probing the brain for information. So I could show you an image like I really want to know if you react positively or negatively to images of Republicans and Democrats. And so, I show you a bunch of pictures and I don't want you to tell me your political affiliation. I just want to guess it.
[00:20:30] Jordan Harbinger: Sure.
[00:20:30] Nita Farahany: And so I show you a bunch of different Democrats and I show you a bunch of different Republicans. And then I look at the unconscious signals in your brain, both recognition, but also emotional responses that you have. And then I say, okay, I can tell with a pretty high degree of accuracy what your feelings are with respect to—
[00:20:46] Jordan Harbinger: Right.
[00:20:46] Nita Farahany: —these candidates and these parties. I can even do that with messages.
[00:20:49] Jordan Harbinger: That's interesting. Like the machine's broken, it says you can't stand any of these people. Hmm. I don't know. I have to recalibrate it.
[00:20:55] Nita Farahany: Right. Well, that's why I then shifted, and maybe I'm going to show you some messaging as well.
[00:20:59] Jordan Harbinger: Right.
[00:20:59] Nita Farahany: That, like, aligns better with the parties. That might help me too. But I can pick up that kind of unconscious information. Now, the second part you asked about, the person who's in a coma, this is really hard stuff, because there was a long period of time where we thought people who are in a persistent vegetative state (that's a medical and diagnostic classification) have no more conscious thought. That's gone; there's nothing happening in that way there. And part of that was an inability to access their thoughts. And so there were researchers who started to take people who had been classified as in a persistent vegetative state and put them into large neuroimaging machines, functional magnetic resonance imaging machines. It's basically like an MRI, but it's looking at brain activity. They would ask them a series of yes-no questions, and they would say: if the answer to the question is yes, think about walking through the rooms of your house; if the answer to the question is no, think about playing tennis. And this would engage different parts of the brain, motor activity or spatial kinds of reasoning. And they found that for a few of the patients that they put in, they got responses, and they got accurate responses, and they came to believe that somewhere between, like, 20 to 40 percent of the people who'd been diagnosed as being in a persistent vegetative state were actually minimally conscious or had locked-in syndrome.
[00:22:20] Jordan Harbinger: Oh my God, that's such a nightmare. So they're trapped in an unconscious body.
[00:22:24] Nita Farahany: Right.
[00:22:25] Jordan Harbinger: Oh my God.
[00:22:26] Nita Farahany: That is like most of our worst nightmare, right? A different project that I've been working on for the past few years — I've been working with something called the Uniform Law Commission. And the Uniform Law Commission has representatives from every state, appointed usually by the governor of that state, to work on model laws where uniformity of law is useful in the United States. And one of those laws that the Uniform Law Commission has written is called the Uniform Determination of Death Act, which has two parts. The first part is death as most of us understand it or know it, which is when the heart stops beating and you stop breathing, which we call circulatory and respiratory death. But for a limited number of people — maybe you have an in-hospital heart attack or something like that, and you come into the hospital and you're put on mechanical ventilation — some of those people have had such catastrophic brain injury that they have no more brain activity, right? They're never going to wake up again. They're never going to breathe on their own again. They have no brainstem reflexes, and those people are classified as being brain dead — that's kind of the shorthand for it, right? — or death by neurologic criteria. So we're working on updating that definition because the legal definition has kind of fallen out of step with the medical practice. And that's a tricky category if you think about the question you asked, because the more we learn about the brain, you know, the more the questions are like, how can you know for sure that there's complete loss of brain activity, that it's irreversible loss of brain activity? Those are some of the kind of complex questions that people grapple with in that.
[00:23:59] Jordan Harbinger: Oh man, it's so upsetting to think that there's people right now on breathing machines who are like, oh my God, wake me up or kill me. This is awful. I can't move, I can't do anything. But I know like the time is passing and they can't do — oh God, absolutely a nightmare.
[00:24:17] Nita Farahany: We just don't know. We don't have access. I mean, so you are assuming all of that is happening. The even more terrifying part, just to layer on top of that, is we don't know what's happening, right? We have no way to actually decode what they are thinking or feeling, even if we can get to some basic yes-no answers. But even that — are they thinking, I wish you would turn off the machines?
[00:24:38] Jordan Harbinger: Sure.
[00:24:38] Nita Farahany: Are they able to think in that full way? Like, have they suffered so much brain injury that that doesn't happen? And I should just distinguish: that's different than the people who would be classified as brain dead. Those people have literally no — I mean, all functions of the entire brain have stopped. They don't even have brainstem reflexes. The people who are in a persistent vegetative state don't satisfy that classification.
[00:24:57] Jordan Harbinger: Yeah, just, I'm thinking about somebody who, like, feels pain and can't do anything about it. It's just — oh God, it's a new nightmare unlocked, unfortunately. That's just awful.
[00:25:06] Nita Farahany: I do hear that from some people, by the way, reading the book — that it is a new nightmare unlocked in multiple ways.
[00:25:17] Jordan Harbinger: You're listening to The Jordan Harbinger Show with our guest Nita Farahany. We'll be right back.
[00:25:21] This episode is sponsored in part by Shopify. Hear that? It's every entrepreneur's favorite sound: another sale on Shopify. Shopify is the commerce platform revolutionizing millions of businesses worldwide. Whether you're selling towels or t-shirts, scrunchies or munchies, Shopify simplifies selling online and in person so you can focus on successfully growing your business. Shopify covers every sales channel, from an in-person POS system to an all-in-one e-commerce platform. It even lets you sell across social media marketplaces like TikTok, Facebook, and Instagram. Packed with industry-leading tools ready to ignite your growth, Shopify gives you complete control over your business and your brand without having to learn any new skills in design or code. And thanks to 24/7 help and an extensive business course library, Shopify is there to support your success every step of the way. What's incredible to me about Shopify is how, no matter how big you want to grow, Shopify is there to empower you with the confidence and control to take your business to the next level. Now, it's your turn to get serious about selling and try Shopify today. This is possibility powered by Shopify.
[00:26:21] Jen Harbinger: Sign up for a one-dollar-per-month trial period at shopify.com/jordan in all lowercase. Go to shopify.com/jordan to take your business to the next level today, shopify.com/jordan.
[00:26:33] Jordan Harbinger: This episode is also sponsored by SimpliSafe. If you binge on the show — a horror binger, so to speak, see what I did there? I didn't even think of that one myself. How pathetic is that? — you'll know that I highly recommend SimpliSafe home security, and I'm not the only one. US News recently named SimpliSafe the Best Home Security System of 2023. CNET recently awarded them their Editors' Choice for Home Security. Remember CNET? Damn, they're still around. Well, you won't have to worry about still being around because SimpliSafe is going to keep you around for a long time. Uh, I'll stop. SimpliSafe is awesome though. It's really easy to set up, even for a cranky old fart like me, who hates when things don't pair or click or work automatically. Every time I run into setup glitches, I just lose my freaking mind.
[00:27:14] Soundbite: Ain't nobody got time for that.
[00:27:15] Jordan Harbinger: So I can attest that SimpliSafe setup is smooth and simple. We love it. We set our alarm every night. I know, but the bar is low, people. But it is priceless to have peace of mind knowing SimpliSafe's 24/7 professional monitoring agents are ready to dispatch police if anything happens. You know what I really like though? The ability to lock and unlock our doors remotely and let strangers into my house. I access our cameras and arm and disarm our system from anywhere — usually bed. We have some construction going on in our backyard, and I want to know when the workers are coming in and out of our yard, and I want to know if somebody else is coming in and out of the yard. We have cameras monitoring their progress, and the cameras also keep an eye on the tools they leave out there, as well as the stuff that my kids leave out there. Now that we've had SimpliSafe for years, frankly, I'd feel naked without it — like using a phone without a phone case, but not in a stylish way. So check out SimpliSafe if you want to secure your home.
[00:28:04] Jen Harbinger: Customize the perfect system for your home in just a few minutes at simplisafe.com/jordan. Go today and claim a free indoor security camera plus 20 percent off your order with interactive monitoring. That's simplisafe.com/jordan. There's no safe like SimpliSafe.
[00:28:22] Jordan Harbinger: If you're wondering how I managed to book these folks for the show, it's always about networks, warm introductions, and people sending them to me. And of course, I'm showing you how to build your network for free, whether you have a podcast or not, of course. It's for personal and business reasons, anything you want, jordanharbinger.com/course. It's all about improving your relationship-building skills in a non-cringey, non-schmoozy way, very down to earth, nothing awkward in here. Just practical exercises that are going to make you a better connector, a better friend, a better peer, a better colleague, jordanharbinger.com/course.
[00:28:51] Now back to Nita Farahany.
[00:28:55] It is impressive, though, what we can see in the brain. I know you mentioned in the book smart football helmets that can detect concussions. That's really incredible and sounds really useful. I know you kind of just mentioned before that you can tell if somebody's a conservative or a liberal, or any number of, I guess, yes-no kind of questions, or emotional-response questions. It seems like we're quickly heading to a place where people in the government could maybe look inside our heads, depending on the resolution of the machinery here, and I think that's one of the scariest parts of all of this. You said that this is already happening in China. What's going on with it in China? It probably doesn't surprise anybody that they're experimenting with this first.
[00:29:35] Nita Farahany: Right. There are smart helmets, there are possibilities for detecting epilepsy — there are all kinds of things that, in our own hands, neurotechnology can enable us to do. It can help us make brain health just as important as all the rest of our health; it can help us quantify and access it. But anything we can access, right, presumably other people can access as well. And if the devices are issued by a company, then the company has access to that brain data as well. In many instances already, governments have gained access to the wearables that people are using — from heart rate to your phone, to your email, to everything else that you use, all the rest of the technology that's quantifying and tracking things. And in China, they're already using neurotechnology to track workers and students. And it sounds like they're doing so in some dystopian ways, which should surprise no one, but—
[00:30:27] Jordan Harbinger: Yeah.
[00:30:27] Nita Farahany: —you know, it was still startling to me, because in the same way of, you know, oh, is this just some person who's talking about science fiction? I would imagine it like, oh gosh, imagine if an authoritarian government started to use this technology in ways that would interfere with people's freedom of thought — and then, boom, news articles. It's happening in China. And so here are a couple of things that are happening there already. One is China has already required workers to wear brain sensors during their workday — to track fatigue levels for conductors or commercial drivers; for workers on factory floors, their attention, whether their mind is wandering. There are even reports that if their emotional instability or just their feelings show that they could be disruptive to the workplace, they're sent home.
[00:31:07] Jordan Harbinger: Hmm.
[00:31:07] Nita Farahany: Which is not a good thing if you're in China and told, "You're not doing your job. You need to go home." The Wall Street Journal ran a report a number of years ago about how, in a province in China, there was an elementary school that was requiring all of the students to wear EEG headsets that monitored their attention levels during their school day. And then there was a console at the front of the classroom where the teacher could look at the students and see who was paying attention and whose mind was wandering. And that information was even sent to the parents and to the state as well. And as you think about a system like a social credit system that's built on whether you're paying attention enough in school or your mind is wandering, that's terrifying enough. But you add on top of that — you know, we've been talking about what the technology can do, but it almost doesn't matter what the technology can do if the government requires you to wear brain sensors and says, "We are tracking your brain throughout the day. And if you think any thoughts that are divergent or dissident, we will know." When you have that kind of informational asymmetry and power asymmetry, I worry about the chilling effect on the ability to think freely, to dare to think freely, and the conformity and, you know, kind of inner silence that can result from that. And so, one, you can already decode a lot — a lot more than people think — and two, even if you couldn't, the mandated use of it is terrifying in an environment like that.
[00:32:34] Jordan Harbinger: It's like the TV that looks at you and listens in 1984, only it's in your brain, which is way worse.
[00:32:42] Nita Farahany: Yes.
[00:32:42] Jordan Harbinger: Right?
[00:32:43] Nita Farahany: No, I agree. I mean, I went back and read 1984 recently, and I was like, oh, wow, like, yeah.
[00:32:48] Jordan Harbinger: We have this.
[00:32:49] Nita Farahany: Yes.
[00:32:50] Jordan Harbinger: We have this.
[00:32:50] Nita Farahany: We have this like, yes, exactly.
[00:32:52] Jordan Harbinger: The chapter where the TV was blocked by a bunch of stuff. They don't have to worry about that because it's in your head now.
[00:32:57] Nita Farahany: That's right.
[00:32:57] Jordan Harbinger: It seems like this could be used to sell us things in the future. And I kind of assume that's where Meta is going to start with this. Like, if you think your ads are targeted now, wait until it knows what you're thinking in that moment — about how you're fat and you got a wedgie and need pants with a more comfortable waistline. Which is a random example, people. Just random.
[00:33:15] Nita Farahany: Okay.
[00:33:15] Jordan Harbinger: But like, they're going to know, they're going to know. Oh man, I'm bored. You know, I should go on a vacation. Look at the weather outside and it's like, oh, are you bored and you want to go on vacation? Look how nice Turkey is right now.
[00:33:26] Nita Farahany: I mean, how about this? Like, you're hungry as you're walking past a store and—
[00:33:31] Jordan Harbinger: Mmm.
[00:33:32] Nita Farahany: You know, between your phone and your brain sensors, we're able to pick up and give you real time alerts of like, "Hey, 20 percent discount to walk into this store right here and get yourself a nice bar of chocolate." The precision will be startling. So companies are already using brain activity information to market to individuals. There's a whole field called neuromarketing and I write about this as well, which is a field which is really designed to see how brains react to marketing. People are terrible about reporting their actual preferences. I shouldn't say terrible. People report it, it just doesn't actually reflect what they do and what they buy.
[00:34:08] Jordan Harbinger: Yeah. It's not right.
[00:34:08] Nita Farahany: Right. So they're very willing to report it. It's just not that helpful in actually marketing to people, is what marketers have found over time. And so instead, marketers started to wonder, could we look at how brains react to information instead? So could we show them a marketing advertisement and see, do they show engagement? Do they show enthusiasm? Do they show boredom? Does their mind start to wander in the middle of the commercial? And, you know, does the mind wander to some different area? None of this stuff happens in isolation — eye tracking and all of the other metrics that have been used for a long time are added to brain activity — but it turns out brain activity adds a lot to the marketing information. It tells how engaged you are and at which point you're the most engaged. It tells, you know, how you feel.
[00:34:52] When Avatar — the original Avatar — first launched, James Cameron was approached by a neuromarketing company. He had said something in a magazine interview about how he thought people would be so deeply engaged by the Na'vi world and the imagery and the cinematography that their brains would light up, or something. And a neuromarketing company was like, "Hey, look, we can actually show you the brain lighting up. Can we offer you our services?" So he partnered with them, and they trialed different trailers for Avatar using neuromarketing, seeing the extent to which people responded favorably or unfavorably, how engaged they were with the different ones, and how emotionally charged they were in response. And they went with the trailer that had the strongest neuromarketing features. And I mean, you know, you could attribute it to the neuromarketing or to many other things, but that was one of the most successful movies of all time — hence it being remade and relaunched and everything that's happening with it right now.
[00:35:49] So that's already happening, but that's happening with volunteers who are choosing to go into labs, or at home using headsets, to help marketers figure out how our brains react. But who needs test subjects when we're all wearing brain sensors in our earbuds and in our headphones while we're at work, while we're scanning social media? You come across an ad — Facebook already knows how long you're spending on that ad. They know whether or not you're looking at it. They know whether or not you've clicked on it. But now they know how your brain reacts to it as well. And they can commodify and sell that information to give much more precise feedback and targeted information to companies.
[00:36:28] So that is, I think, the reason Meta is at least partly really interested in commodifying brain data. In a recent interview on their AR product launching with the neural interface in 2025, they said as much — that, of course, the neural interface watch is going to be integrated with WhatsApp and social media and everything else. It's not hard to connect the dots to figure out that that means our brains are going to be connected to all of that, and our brain-based reactions to all the advertisements are going to be sold and commodified to target us.
[00:37:00] Jordan Harbinger: It's really incredible. I mean, you think clickbait is bad now? It's going to be really bad when AI can tailor it directly to you based on things that you are thinking that you don't necessarily even express. That's really creepy.
[00:37:14] Nita Farahany: You know, it's interesting. So I went on to ChatGPT the other day. In my chapter on mental manipulation, I write about brain heuristics — the shortcuts that put our brains in danger of misinformation and manipulation. And I was just curious, like, "Okay, ChatGPT, what are all of the brain heuristics that you know about? What are all the shortcuts that make humans susceptible to disinformation or misinformation or manipulation?" It was able to just spit them all out for me really quickly, and I said, "Okay, great. And how could generative AI be used to manipulate or tap into all of that to make disinformation worse?" And it was very precise in being able to show me all the ways in which generative AI could do so. And I said, "Great. And so then how does neurotech make that more profound and dangerous?" And it had a very thoughtful response about the brain-based response to all of the generative AI and how it could be a generative process — which is like, here's A/B testing, but A/B testing with your brain. Here's a sensational headline with false information that has pseudoscience attached to it, which taps into all of the shortcuts in your brain that make you think quickly, share information, and not actually stop to really verify it. And you just put two side by side and see how the brain reacts based on the brain sensors, and then push out the information that is the most likely to lead to the greatest spread. So it's a—
[00:38:39] Jordan Harbinger: It's crazy.
[00:38:40] Nita Farahany: —terrifying possibility. Yes.
[00:38:42] Jordan Harbinger: It's like a virus that's tailored to you, that adapts to you, like it lands on you and goes, "Mmm, this isn't going to work, but here's this other thing that's going to work." And then, you push it out to your family and friends, then it lands on them and does the exact same thing. Crazy.
[00:38:54] Nita Farahany: Yep.
[00:38:54] Jordan Harbinger: I know that employers can maybe monitor our brain data for productivity. Might keep us honest at work. The truck driver example that you give, where they say, "Hey, this person's tired. They shouldn't be driving" — that seems like a good use of this. But of course, then we go all the way down the rabbit hole and we end up with our employers just knowing what we're thinking at work, which is really not what anybody signed up for.
[00:39:16] Nita Farahany: Yeah. I like that. Lull people into thinking like, "Yeah, that might be okay." And then, "Aw. No!"
[00:39:21] Jordan Harbinger: Just kidding. Yeah.
[00:39:22] Nita Farahany: Yeah. Well, okay, so the tricky thing is this stuff is nuanced, right? It's not all good. It's not all bad. And so one of the examples that I give in the book is fatigue management. So right now, long-haul truckers are subject to a whole lot of surveillance because of the cargo that they're carrying, and the fact that they're on the road and much more likely to be dangerous than you or I in, like, a small car, which is unlikely to do as much damage to other people if we get into an accident or if we're tired and fall asleep at the wheel. So in order to monitor drivers, there's all kinds of technology on the truck to track their driving and driving behavior. There are also in-cab cameras trained on their faces to look and see if they're tired or if they're awake, and other in-cab features to do that — which is all to say, they have very little privacy right now in the workplace, in their jobs.
[00:40:21] Brain activity monitors — brain sensors that monitor their fatigue levels, which have been shown to be really very accurate, maybe even more accurate than some of these other ways of trying to see whether or not they're wide awake or asleep, like, you know, checking the steering wheel or the car — those types of sensors are already being used by thousands of companies to track truckers, and I don't think that's a particularly bad use of it, as long as it's picking up only fatigue levels from the brain. I actually think that you don't have a strong interest in mental privacy, if you are a truck driver, in how sleepy you are while driving on the job. The challenge is just limiting it to that, right? Which is, if an employer is monitoring a person's brain for whether or not they're tired or awake, they could have access to a whole lot more information about what's happening in the brain. And they shouldn't, right? We should extract just the piece of information we want, overwrite the rest of the brainwave data, and not use it for those other purposes.
[00:41:16] And so I show that as an example to say there's a way we could do this that could be beneficial to society, with a minimal impact on individual privacy. And there may be some instances in which we think a person doesn't have an absolute right to keep other people out of their brains. But at the other end of that — probing a person's brain for, are they thinking about a startup? Are they thinking about unionization? Are they working with people you want to find out are collaborating to rise up against unfair practices in the workplace? Are they happy or sad when you call them up and offer them a raise, to try to hijack the process of negotiation and make it more favorable to the employer? None of that should be on the table. This, as a tool of widespread employee surveillance, is, I think, a way to take workplaces that are already introducing far too much surveillance and make them truly dystopian, truly unpleasant places that kind of strip us of the humanity and dignity of work.
[00:42:15] Jordan Harbinger: Yeah, it seems like the negotiation example, which is in the book, is pretty simple to understand, right? Your boss offers you a crappy raise and sees your brain light up, and then you try to negotiate and they're like, "Well, we already know that you're interested in the two percent raise we offered, so you're going to take that, so we're just not going to budge." And you go, "Well, I need five percent." "Well, we know you're lying because we saw your reaction in your brain waves. You can't hide that." That super sucks. And I didn't realize that you could detect people who wanted collective action, but I suppose if we are doing some of this clunking around in the brain, you can sort of figure out what people want to do or what they're thinking about at work. Even if it's not like "I want to start a union" emblazoned over your head, like in The Sims, they can still say, "Oh, when this person goes to these meetings, there's a ton of brain activity. Or when they're meeting with other people who also have these preferences, stated or otherwise, their brains collectively light up. Let's not allow them to work together, or let's fire some of them, or whatever."
[00:43:19] Nita Farahany: Yeah. So I mean, you know, one of the ways in which I imagined we could see that kind of unionization-busting behavior happen, there's this really cool feature of working together when we're working collaboratively together, which is you start to see synchronization of brainwave activity between individuals.
[00:43:37] Jordan Harbinger: Mm-hmm.
[00:43:38] Nita Farahany: And that's great. It helps people collaborate better. It helps them literally get onto the same wavelength.
[00:43:43] Jordan Harbinger: Mm-hmm.
[00:43:44] Nita Farahany: But, you know, if you were an entrepreneurial employer who is trying to figure out who is having synchronization of brain activity that shouldn't have synchronization of brain activity, right — these people aren't working together on projects, they don't have any reason to actually be collaborating — and you start to see that kind of sync-up between brains, you know, it gives you a tool of surveillance to look into it and try to figure it out. And you start seeing that at scale, and you begin to wonder, is there some kind of unionization, is there some kind of collective action that's starting to form?
[00:44:14] I'm sort of inviting the reader to imagine the possible ways in which, even with the current technology, right? Not having to get into real decoding of complex thought in the brain, it still can interfere with freedom of contract. It still can be misused in the workplace. It still can be misused a lot by the government. And we haven't even touched on it. I mean, we talked about China, but there's already actually a lot of use by governments worldwide of neurotechnology in ways that are good and ways that are bad and nefarious.
[00:44:46] Jordan Harbinger: Yeah. I mean, okay. Understanding that a truck driver is asleep behind the wheel of an 18-ton rig — good. Being able to tell if employees are actually working or jerking around on the Internet all day — medium to good. But the amount of stress added to a job because my employer can monitor my brain — and I'm already, let's say, an Amazon worker trying to put a billion things into little bins at a speed that's constantly increasing. It's already miserable. Now I can't even freaking go to the bathroom without my boss monitoring what I'm thinking while I'm in there, while I'm on my 15-minute break, or even just while I'm walking between bins. No, thank you. The tech does have to be made useful without being made totally oppressive.
[00:45:27] Nita Farahany: Yeah, I think that's incredibly well said. The real question for us right now is not whether the tech is here or not — it has arrived. It's not whether the tech is good or bad — it is both. It is, how do we make it empowering for people and not introduce the most oppressive kind of Orwellian vision of what this could ultimately be? And I already give lots of examples of how seeds of that are starting to be planted. How do we turn the tide to make this a technology that just becomes something you're happy to have — your ring on, your brain sensors on — because you're able to focus on your own brain health and wellness and well-being, and you're able to quantify and see things that you couldn't before? You're able to take your brain health as seriously as you take the rest of your physical health, without worrying that your employer is, you know, trying to use it to make a dystopian workplace, without worrying that the government's going to subpoena all of your brain data and use it against you in whatever context they want to use it against you for.
[00:46:28] Jordan Harbinger: Yeah, that's a good point. And I'm just sort of in my head almost joking, like, this place already sucks enough. I'm already walking between, running between bins, trying to put someone's stupid fidget spinner in the robot machine before I get dinged and demerited, and now I can't even think about being somewhere else while I'm doing this. I mean, like, screw you Bezos, or whoever's in charge of this.
[00:46:50] Nita Farahany: Let me underscore that for you for a moment, which is—
[00:46:53] Jordan Harbinger: Yeah.
[00:46:53] Nita Farahany: —I can't even think. So people have been willing to give up their privacy in so many ways. They say things like, "Well, I don't have anything to hide, so it doesn't really matter. Or, I like getting the personal targeted advertisements. I bought something from my feed last week that was something I didn't even know that I needed." But I want people to really focus on "I can't even think freely" because it's not about whether or not you have something to hide or not. It's not whether or not there's something that could be used that is used against you by the government. It's whether or not you have a space of mental reprieve where you can think freely, whatever you want to think. Anything, right? Imagine, fantasize, have a bad thought. We all have bad thoughts. Have a safe space where that can happen. And that's the space I want us to make sure we preserve and keep because we say that in jest, but that's what's at risk here, is that ability to actually think freely.
[00:47:50] Jordan Harbinger: Well, it seems like thought crime, or whatever they had in Minority Report, is not impossible. And again, it sounds dystopian, but okay, maybe I do want people planning to murder other people to be stopped and put into prison. But then again, they haven't actually done anything yet. So good luck drawing that line in the sand. Like, it's not like I haven't thought about gutting somebody who's a terrible person, and then, you know — but not really. I'm not really going to do it. I just sort of want you to get pushed into traffic right now because you're a terrible person. I mean, maybe I'm the only one with intrusive thoughts like that, but I assume I'm not, right? You've been pissed off before. Do I need to get arrested for that?
[00:48:26] Nita Farahany: You have a one-year-old and a three-year-old, I think you said something like that.
[00:48:29] Jordan Harbinger: Right.
[00:48:29] Nita Farahany: So yeah, so I have a three-year-old and an eight-year-old and I adore my children. They are my life.
[00:48:35] Jordan Harbinger: Sure.
[00:48:35] Nita Farahany: But when they're both having temper tantrums at exactly the same time, I'm exhausted and want to go to sleep and they refuse to go to sleep.
[00:48:42] Jordan Harbinger: Mm-hmm.
[00:48:42] Nita Farahany: There are times where I think not the best thoughts about—
[00:48:45] Jordan Harbinger: Sure, yeah.
[00:48:45] Nita Farahany: —you know, even my kids. And that's okay. We're human. That's part of what it means to be human, is cursing out and imagining for a moment something terrible happening to the person who just honked in front of you or cut you off—
[00:48:59] Jordan Harbinger: Right.
[00:48:59] Nita Farahany: —in line.
[00:49:00] Jordan Harbinger: I could make this look like an accident. Oh my God, well, I would never do that. Well, too late. It's in the thought crime machine.
[00:49:06] Nita Farahany: Yes. I don't want us to have to censor our thoughts more than we already do. And I'll say this: I'm glad we filter our thoughts. I'm glad we don't act on our bad impulses. That's part of thinking freely — having that space where something, you know, pops up and you're just like, "Oh God, you know what? I'm a terrible person. Of course I don't want anything terrible to happen to that person in front of me, but wow, they really pissed me off." We go through all of this different kind of filtering, and that's part of us deciding what we want to share with the rest of the world and what we don't want to share with the rest of the world. I don't want those filters to be taken off involuntarily, without consent, and interpreted. Where you have biases but you're working actively to overcome them, and your biases can be picked up and detected through neurotechnology — like, that's none of anybody's business. And how are they going to be able to tell the difference between a bias which you genuinely hold versus an implicit bias that you are working on overcoming? That's the kind of stuff where I just want people to have the space to work through being a better human, to being the best version of themselves they want to be, to be whoever they want to be — but to have the space where they can do that, figure that out for themselves.
[00:50:16] Jordan Harbinger: That's a really good point. Like maybe you just have a strong aversion to Persian people because your neighbor growing up was a terrible human being and now every Persian you meet reminds you of them, but then you just pause and you realize that they're not.
[00:50:28] Nita Farahany: Yes.
[00:50:29] Jordan Harbinger: That little flaw being broadcast every single time would definitely not be good in any way.
[00:50:35] Nita Farahany: It wouldn't be good and it would be, it would change how other people perceive you. It would change your ability to work on yourself, but also change your incentive. Like, yeah, you grew up next to somebody, that's how you felt. But every time you encounter somebody new like me, who's Persian, you think like—
[00:50:48] Jordan Harbinger: Mm-hmm.
[00:50:48] Nita Farahany: —this is a human being, I'm going to take them on their own. I recognize I have that bias and I'm working against it. I'm working to put that away. That's part of the exercise of my growth as a human being. But instead, I encounter you and I see that little flash that says you're biased against me. And I just say, "I want nothing to do with you. You have a bias against me." And you're like, "No, no, no, I'm working on it. Like, this was an early childhood exposure." Don't care.
[00:51:13] Jordan Harbinger: Right.
[00:51:13] Nita Farahany: Your brain tells me the objective truth. That's not the future we want to live in. That's not how we want to interact with one another. We want the space to continue to grow and change and have that space of mental reprieve.
[00:51:24] Jordan Harbinger: Also, just as a guy, if I have any sort of flash of anything, like, I don't want every woman I talk to to be like, "Oh, he's one of those." It's like, I am, but I'm trying not to be.
[00:51:35] Nita Farahany: Yes.
[00:51:35] Jordan Harbinger: I am one of those, but really I'm trying not to be.
[00:51:38] Nita Farahany: Right.
[00:51:39] Jordan Harbinger: At least not in an inappropriate situation.
[00:51:39] Nita Farahany: I can't imagine it would be a good thing for most relationships either, you know?
[00:51:43] Jordan Harbinger: No.
[00:51:43] Nita Farahany: I think I give the example that the tamer version of that is you walk into your friend's house and they have an ugly — and this really happened to me but not the brain part of it, right? Well, you walk into a person's house and they bought a new couch that they're so proud of and it is like a hideous mustard yellow. One of my friends is probably listening now and they're like, "Oh my God, she's talking about my couch."
[00:52:03] Jordan Harbinger: Yeah, you got to be careful.
[00:52:04] Nita Farahany: I got to be careful. And you know, they said, "Do you like my new couch?" And you say, "Oh, I love it." And what you were thinking is, "Wow. I mean, that is the worst couch I've ever seen."
[00:52:13] Jordan Harbinger: Right, in a circus tent. Well, you are Persian, so we know about all the decor.
[00:52:19] Nita Farahany: Well said, we like our gold and we like our bright yellows and things like that. But in any event—
[00:52:24] Jordan Harbinger: Sure.
[00:52:24] Nita Farahany: —I walked in and I think it's hideous but did I say that couch is hideous? Did I register my disgust? No, I said, "Oh, it's lovely."
[00:52:34] Jordan Harbinger: Yeah. It would be right at home in one of Saddam Hussein's palaces.
[00:52:37] Nita Farahany: It's just, it's perfect. It's perfect. So, you know, those little white lies, those are okay to tell too.
[00:52:45] Jordan Harbinger: Mm-hmm.
[00:52:45] Nita Farahany: I mean, we can have these thoughts of "No, your yellow couch is hideous," and we don't need to have them broadcast against our will to other people. We don't need it. And in that case, you wouldn't hear through neurotechnology an output that said, "No, Nita thinks that it is a hideous couch," but my disgust would register on these brain sensors. And just like we share location data, if we start to share emotional data, and my friend says, "What do you think of my couch?" and then checks the emotion data that I've shared with everybody in her friend group, and she sees the disgust that registered, that changes the nature of interactions with other human beings.
[00:53:23] Jordan Harbinger: This is The Jordan Harbinger Show with our guest, Nita Farahany. We'll be right back.
[00:53:28] This episode is sponsored in part by Peloton. Trying a new workout is like learning a new skill. It can be overwhelming, and the uncertainty can be a major barrier to actually getting started. Peloton's approach to convenience is very helpful for people who are looking to take on a new fitness skill or routine. Everything is designed to be as simple and streamlined as possible, from the easy-to-use touchscreen interface to the wide range of class options and personalized recommendations. You can access a variety of live and on-demand classes, including cycling, running, and strength training. Now, there's an incredible rower, which I really enjoy, all from the comfort of your own home. Rowing is great as a full-body workout, which means you'll be engaging multiple muscle groups at once, including your legs, core, arms, and back. This will help you burn more calories, of course. It will help you build more strength especially, and improve your overall fitness. Correct rowing form isn't intuitive, at least it certainly wasn't for me, and doing it correctly is harder than it sounds, especially once you start getting tired because, of course, your form always breaks down when you get tired. Form Assist shows you a figure of yourself as you row, and when you screw up with a portion of your body, that part of the figure turns red. That's a good way to avoid getting super, super injured or tweaking something and not being able to work out, which stops a lot of people who are diving in either for the first time or getting back into it after a long time. So try Peloton Row risk-free with a 30-day home trial. New members only. Not available in remote locations. See additional terms at onepeloton.com/home-trial.
[00:54:52] This episode is sponsored in part by Other People's Pockets. Have you ever wondered how on earth your friend bought their home or why your coworker meticulously splits the tab down to the last freaking Diet Coke? Other People's Pockets is a show about other people's money. Host Maya Lau asks people from all walks of life to get radically transparent about their personal finances, to learn more about who we are, what makes us tick, and to level the playing field a little bit. You'll hear from a dominatrix who gets paid to bully men at the ATM, an irreverent astrologer making bank against all odds, a business prodigy who flipped his services from drugs to dumbbells, and more. You can find Other People's Pockets wherever you get your podcasts.
[00:55:29] If you like this episode of the show, I would love it if you do what other smart and considerate listeners do, which is take a moment and figure out how you might support the show. I don't need your cash, but our sponsors sure do, and all of them are listed at jordanharbinger.com/deals. You can find any sponsor on that page. Also, use our AI chatbot on the website at jordanharbinger.com. Search for any promo code from anyone who's ever supported the show. Thank you so much for supporting those who support us.
[00:55:52] Now for the rest of my conversation with Nita Farahany.
[00:55:57] I know people are like, "Okay, well, I would just turn that feature off." But it's not about that. It's about brain hacking, data leaks in the future. We thought the Equifax credit breach was bad. And for people who don't remember that, I think like a hundred million social security numbers and credit scores or whatever were leaked from Equifax. Maybe it wasn't quite that many, but it was a massive, massive breach. What would a hack or data leak of brain data look like? I mean, that could be much more damaging. So what? You have my social security number. That's annoying. But what if you know a bunch of stuff about me that even I don't necessarily realize?
[00:56:33] Nita Farahany: Yeah. So first, I mean, I should say there's some really interesting research that I talk about in the book on functional biometrics. And I'm going to start here in response to that, because it's useful to know that brain data alone can identify a person.
[00:56:49] Jordan Harbinger: So it's biometric or whatever that's called.
[00:56:51] Nita Farahany: Yeah. Do you have a favorite song? I mean, you're probably listening to a lot of nursery school rhymes right now.
[00:56:56] Jordan Harbinger: Yeah, I was just thinking like the wheels on the bus go around and around.
[00:56:58] Nita Farahany: Yeah, exactly.
[00:56:59] Jordan Harbinger: That's the current top play.
[00:57:01] Nita Farahany: Exactly. So take Wheels on the Bus, which I hear too often and you hear too often. And we could both just sing it. Like right now, we're both singing it in our heads for a second.
[00:57:10] Jordan Harbinger: Mm-hmm.
[00:57:10] Nita Farahany: It's playing in our head. So you use a brain sensor while intentionally recording yourself singing "the wheels on the bus go round and round" in your head.
[00:57:18] Jordan Harbinger: Mm-hmm.
[00:57:19] Nita Farahany: Okay. It creates a neural signature, a functional biometric. And then you could use that just like you could use your face to unlock your phone. You could use singing "the wheels on the bus go round and round" in your head—
[00:57:32] Jordan Harbinger: Huh?
[00:57:33] Nita Farahany: —to unlock the phone. Which means that if you have a whole bunch of brain data stored in the cloud, and you have people's functional biometric tokens, right, each with a unique neural signature, you can re-identify de-identified or aggregated data in the cloud. Like, if here's an individual's storage and this is all of Jordan's data, you can say, that's Jordan's data. You can re-identify it.
[00:57:59] Jordan Harbinger: Oof.
[00:57:59] Nita Farahany: Okay, so we'll start there, which is, it's not so easy to just keep everything anonymous. Re-identification of data has turned out to be very easy over time. The second is there's what we can do today, and then there's the fact that AI is going crazy, gangbusters fast at pattern recognition, and there is raw brainwave data. So the kind of data that we're collecting from brains is kind of like full-resolution, big data. Instead of extracting just that one little piece of information, "are you fatigued or not," I'm recording a lot of data. And I shouldn't say I; scientists can record a whole bunch of data, companies can record a whole bunch of data, and then they can extract the piece of information that they want from it.
[00:58:39] So if you store all that data and then later want to go back and mine it, you can go back and mine it over time, and if a hacker gets access to it, they can mine it for all kinds of things. From simple things like if I want to figure out if you're suffering from cognitive decline over time to the scarier things, like, you know, the fact that already researchers have shown that you can take brain sensors and probe a person's brain for information like their pin number or their address. Like you can put different images into the environment, subliminal priming into their computer screen, for example, while they have brain sensors. And then probe for recognition of like, "Is this your pin number? Is it this combination? Is it this combination? Is it this combination?"
[00:59:20] Jordan Harbinger: Wow.
[00:59:20] Nita Farahany: So you could actually accurately then get what the information is that's sort of brain jacking as opposed to the information in the cloud, which you could mine for a bunch of insights about the person.
[00:59:29] Jordan Harbinger: Thoughts and emotions are definitely not unique. What about the data itself is unique? What's the fingerprint-like part of my thoughts or brain?
[00:59:38] Nita Farahany: If you and I both put on a headset, we have to train the headset for a moment using software to our unique brain patterns. The exact way your neurons fire to sing the Wheels on the Bus depends on slight differences in the wiring of your brain, slight differences in who you are, and all the environmental inputs that led to the slight differences in how your brain actually looks and acts. So you train the device and kind of pair the device to your brain. The software has like a baseline program. You do the baseline program, and after a couple of minutes, it's then customized to you. And that's because while Wheels on the Bus has the same lyrics, how your brain processes it and how my brain processes it are just slightly different, right? So we can see both people are singing a song in their head and decode that both people are singing a song in their head, but then pick out the differences based on how you're singing it versus how I'm singing it, based on how our neurons fire in the brain.
[01:00:42] And so that's why the best way to think about this is authenticating people rather than identifying people. And what I mean by that is in order to match your face against you, at some point you gave an image of your face, and then every future scan of your face is matching it to that original image. It's authenticating you. And the same as you would give a baseline recording of you singing a song, and then every future attempt would try to match that song to the baseline of you singing the song. And that's how we authenticate people through biometrics and brain biometrics are unique in that way.
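[Editor's illustration] The enroll-then-authenticate flow Nita describes (store a baseline recording, then match every future attempt against it) can be sketched in a few lines. This is a toy sketch with made-up feature vectors, an arbitrary correlation threshold, and invented function names, not any vendor's actual algorithm:

```python
import numpy as np

def enroll(baseline_trials):
    """Average several recordings of the user imagining the cue song
    to form a stored baseline template (the enrollment step)."""
    return np.mean(baseline_trials, axis=0)

def authenticate(template, new_trial, threshold=0.8):
    """Match a fresh recording against the stored template using
    Pearson correlation, analogous to matching a face scan against
    the originally enrolled image."""
    r = np.corrcoef(template, new_trial)[0, 1]
    return bool(r >= threshold)

# Toy data: each "trial" stands in for a feature vector extracted
# from EEG while the user sings the song in their head.
rng = np.random.default_rng(0)
user_signature = rng.normal(size=64)  # the stable part of this user's pattern
trials = np.array([user_signature + 0.2 * rng.normal(size=64) for _ in range(5)])
template = enroll(trials[:4])

print(authenticate(template, trials[4]))            # same user: accepted
print(authenticate(template, rng.normal(size=64)))  # impostor: rejected
```

Real systems would extract stable features (for example, band power per channel) before matching; the threshold trades off false accepts against false rejects.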
[01:01:16] Jordan Harbinger: That's really interesting. And it makes sense. So somebody can identify me via my brainwaves, probably in the future even if I'm not trying to think about something specific as instructed. So, will I someday be able to, or be subject to, identity checks? Not with facial recognition or identification on a card, but because of my brain output? You know, do I go to the airport and then CLEAR or TSA says, "Don't worry, Mr. Harbinger, we got you when you walked in."
[01:01:42] Nita Farahany: I think so. I mean, you know, one of the things that I thought was crazy was that more than a decade ago, TSA, or its kind of precursor, plus what was Northwest Airlines at the time, were in a partnership to try to figure out if they could remotely pick up brain activity signals as a way to authenticate people in airports. And that didn't go anywhere as far as I know because it's so noisy, right? Remotely picking up brain activity is right now just sort of a science fiction idea because you need something, a sensor or something that's actually touching your head, to pick up that brain activity. But once you're wearing those sensors all the time, it should be a lot easier to do.
[01:02:19] And governments from the US to China, worldwide, are investing heavily in this idea of brain biometrics, and part of the reason why is that static biometrics are easier to fake. Static meaning they don't change over time. Your face, that changes some, but other than like a facelift or facial reconstruction, it's still static. Functional biometrics, like how you move, you know, changes in your brain activity, those are a lot harder to fake. And so they're considered to be more secure ways of actually authenticating people. So I think passwords will be a thing of the past, and at some point, I believe brain biometrics will be used at wide scale.
[01:03:01] Jordan Harbinger: It seems like that could also be misused. Like you mentioned, you can't really put a mask over your brain, or now we're back to tinfoil hats. Maybe those are becoming actually useful. But you can't stop identification, like, I don't have a choice but to identify myself when I'm out and about. It reminds me of when China, this might sort of be apocryphal, but I guess China had this big concert from an artist, and they used facial recognition to capture a bunch of people who were criminals that just happened to attend this concert, because they had every single person identified and they're like, "Oh, here's this guy that's on the run. We knew he liked this artist and we were looking for him." Or maybe they weren't. Maybe they just found a huge crowd of people and went, "Oh, here's three criminals. Let's go arrest these guys."
[01:03:41] Nita Farahany: Yeah. Look, anytime we give government access to the brain, I worry about it.
[01:03:46] Jordan Harbinger: Mm-hmm.
[01:03:46] Nita Farahany: And, you know, whatever the benefits that could be realized from government access to the brain, I think that the downside potential and the interference with freedom of thought with government access of individual brains is too profound to make it worth it. Like I'd rather have a different kind of biometric like I'm willing to take my insecure password over giving government access to the brain because of the likelihood of interference.
[01:04:10] And, you know, we already see it happening. You mentioned Minority Report earlier. I teach first-year students criminal law, and I usually start the first day by talking with them about Minority Report. Most of them at this point haven't seen it and have never even heard of it.
[01:04:25] Jordan Harbinger: It came out before they were born or something. Yeah.
[01:04:27] Nita Farahany: Yeah, exactly. But I'm like, okay, you've got to go watch it because I'm going to make a lot of references to it in this class, because it is, at least for a certain generation, a really profound movie in making us think about the concept of thought crime. And so I'm catching at least a few in this generation up on seeing the movie. But the idea of interrogating people's brains for crime, that was based on pre-crime. Now, in a bunch of countries, there's brain fingerprinting technology that's being used to interrogate criminal suspects' brains to try to use what they find there to convict them. That's not Minority Report. That's real. It's happening. That's testing for pre-conscious, unconscious signals in the brain of recognition of a murder weapon, or a suspect who shouldn't be there, or congruence between two different facts. Like, is the body in the lake? Is the body in the river? Is the body in the house? Right? And then seeing if the brain shows the body is actually in the river because it responds in a particular fashion that can be probed, and is being probed, by law enforcement worldwide.
[01:05:32] Jordan Harbinger: I wonder if being Iranian or Persian has anything to do with your interest in this area, because Iran, for people who don't know, is one of the most oppressive countries in the world in terms of the Islamic regime over there, and any government having access to our brains is terrifying. But a government like Iran, which is authoritarian and a theocracy, is just unimaginably horrible because their power is unchecked, and they would absolutely misuse this technology at the first possible opportunity in order to cement their power even further. I just have zero doubt about any government doing this, but especially ones where the power is concentrated into a handful of old dudes.
[01:06:05] Nita Farahany: Yeah, old religious—
[01:06:08] Jordan Harbinger: Old religious fanatics.
[01:06:09] Nita Farahany: Pseudo-religious, you know, zealots. So absolutely, the fact that I'm Iranian American has such a huge influence in how I think about this. I grew up with parents who grew up in Iran. All of my extended family, 17 first cousins, aunts and uncles, all of them still live in Iran. My mother's brothers were in the Shah's military. One of them was kept in prison and threatened with daily execution regularly after the revolution. My parents had intended to go back to Iran. They had just come here for my father to finish his schooling and were unable to go back because of the revolution. So I grew up in a household where the stories that I heard, the daily conversations were probably very unlike a lot of other Americans who aren't as attuned to what's happening in an authoritarian regime and don't have family members who are directly affected by it and don't firsthand both talk about it regularly and see it happening.
[01:07:10] As I've tried to talk with family members there over the years, you know, when the Green Revolution was happening, they are so afraid to talk about it for fear of censorship and my mother's brother's family literally are under surveillance almost all of the time still. But, you know, they're so afraid to talk about it that in a couple of times that I've been to Iran, even being in a car with them, like, you'll bring up something political and they'll say, "No, no, no, no, don't, don't talk about it. Don't talk about it in here," on the assumption that at all times in all places, they're being listened to, even when they're not, right? The chilling effect that that has on people is so much more profound than you can really understand and appreciate if you haven't been exposed to it. Because it doesn't matter how good the technology is, the control and the brutality and the randomness with which that control and power is exercised has such a psychological hold on people that it is chilling.
[01:08:11] And so when I think about a technology like this, which somebody else might just look at and think like, "Oh, you know what? She's being sensationalist. Like, this is all good. It's all exciting. We should just be looking at the upside of technology." You know, my mind instantly goes to how is Iran going to use it and what is it going to do to the one space that people have. They're afraid to talk to me. What's going to happen if their thoughts could be monitored as well, or even if they just believe that their thoughts are being monitored? And so, you know, I think I've always approached technology with that greater degree of alarm and skepticism and concern of looking at the possibilities and then seeing them play out.
[01:08:52] And that doesn't mean that we in the United States should kind of rest easily thinking like, "Well, okay, but we're not there. And so the government isn't going to misuse it against us." I give a lot of examples of bad misuses that have happened here in the United States too.
[01:09:05] Jordan Harbinger: Well, plus the sort of 20 questions game you mentioned earlier with what sounds like neural spyware, where you can show people images or numbers or something. I mean, yeah, you could get somebody's PIN number. It seems like a long way to go to rob somebody. But what if you wanted to find out somebody's secrets, like their sexuality if they weren't open about that, or something they did? That all seems quite possible with this technology.
[01:09:28] Nita Farahany: Or what if you're a politician and you really want to know whether or not you can figure out who's going to vote and how you can push them and what things push their buttons in order to do so? I mean, we already saw an example of that with Cambridge Analytica. What if you have much more exacting information that you can buy and commodify? And if you know what's being sold by companies to politicians is brain-based reactions of information. It is not hard to imagine manipulation of campaigns in democracy in modern times here in the United States with misuse of this kind of data.
[01:10:00] Jordan Harbinger: Tell me about this P300 brain wave, that seems like a lie detector almost.
[01:10:07] Nita Farahany: Yeah, it's not that, but it's certainly been touted as one.
[01:10:10] Jordan Harbinger: Sure.
[01:10:10] Nita Farahany: It's a great question. So this goes back a very long way in neuroscience research, to the discovery that a person's brain reacts before they're consciously aware of it. And the example is that there was an experiment where people were told, "Here's a left-click button and here's a right-click button. Whenever you have the urge, click left or click right. But don't tell me when it is. Just whenever you have the urge to click, click the button and make a note of the time as you're clicking. So here's a little stopwatch in front of you; just mentally make a note of that time." And what they found was that before a person was consciously aware of their urge to click and could record that time, about 200 to 400 milliseconds beforehand, their brain registered a response, this kind of wave, this P300 wave. And so some entrepreneurial researchers thought, "I wonder if we could use that for other things. I wonder if we could use that kind of pre-conscious signaling to have it light up in recognition of something." And so they showed people a series of images and saw across a lot of different contexts that the P300 shows up.
[01:11:19] So there was a researcher by the name of Larry Farwell who decided that he wanted to figure out if he could create something called Brain Fingerprinting technology, which is to take that P300 signal and apply it in the criminal context, like go to a police file, find information that hadn't been shared with the public and shouldn't otherwise be known, and show a series of those images or words or prompts to a person and see whether or not their brain showed recognition or lack of recognition. And the signals look different. The amplitude of the wave that is kind of read out looks different if you recognize something or if you don't recognize something. And so the idea of using the P300 was to show, like if I show you things that you shouldn't know from a crime scene and your brain shows recognition as opposed to lack of recognition, that tells us something about your likelihood of being guilty or being innocent.
[01:12:14] Jordan Harbinger: Right. Like if I've seen the body and nobody else but the police has seen the body, I might be the guy who put the body there.
[01:12:20] Nita Farahany: Yeah, exactly. Now, I mean, if you've seen the body because it is somebody you know, you're going to have recognition of it regardless. And so you have to be careful about how you design the probe. And that's been part of the challenge, is that it's kind of an art form rather than a science to design those probes in ways that really show recognition that matters to us, recognition of something you shouldn't otherwise know. So like something that's a unique piece of clothing at the crime scene that you shouldn't have seen, there's no reason you should recognize it, or images of the crime scene, right? Say you describe the body, the mangled body that you should definitely not have any recognition of because you haven't seen it in that way. If you have recognition of that, that may say you were there or you did it.
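[Editor's illustration] The recognition test Nita describes, comparing the brain's response to crime-scene "probe" items against irrelevant items, can be sketched with simulated data. Everything here is a hypothetical illustration: the time window, the threshold ratio, and the signal shapes are all invented, and real concealed-information tests use proper statistical analysis rather than a fixed amplitude ratio:

```python
import numpy as np

def mean_p300_amplitude(trials, fs=250, window=(0.3, 0.6)):
    """Average amplitude in a post-stimulus window (roughly where a
    P300 would appear) across repeated presentations of one item."""
    lo, hi = int(window[0] * fs), int(window[1] * fs)
    return float(np.mean(np.array(trials)[:, lo:hi]))

def shows_recognition(probe_trials, irrelevant_trials, ratio=1.5):
    """Crude concealed-information-test logic: flag recognition when
    the probe item's P300-window amplitude clearly exceeds the
    amplitude for irrelevant items."""
    p = mean_p300_amplitude(probe_trials)
    i = mean_p300_amplitude(irrelevant_trials)
    return p > ratio * max(i, 1e-9)

# Toy simulation: a recognized item evokes a bump ~400 ms after onset.
fs, n = 250, 250  # 1 second of simulated EEG per trial
t = np.arange(n) / fs
bump = 5.0 * np.exp(-((t - 0.4) ** 2) / (2 * 0.05 ** 2))  # simulated P300
rng = np.random.default_rng(1)
probe = [bump + rng.normal(0, 0.5, n) for _ in range(20)]       # recognized item
irrelevant = [rng.normal(0, 0.5, n) for _ in range(20)]         # unfamiliar item

print(shows_recognition(probe, irrelevant))  # the probe item stands out
```

Averaging over many presentations is what makes the effect visible: single-trial EEG is far too noisy, which is one reason probe design matters so much.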
[01:13:02] There's other signals like this in the brain. There's something called the N400 signal. I mentioned congruence. This is the idea that these two things go together or these two things don't go together. And so you could say, I want to know if these two things go together. And so I say, the body and dead, or the body and alive, or your wife and dead, or your wife and alive, and you're trying to see which things pair in a person's brain and which things don't pair in a person's brain. And then make inferences again to try to probe the brain for information. And that is a technique that lots of places are investing a whole lot of money in. DARPA has a big program right now, the Defense Advanced Research Projects Agency, which is always investing in the cutting edge and sometimes really scary and sci-fi-seeming things—
[01:13:53] Jordan Harbinger: Like the Internet.
[01:13:53] Nita Farahany: They have a program that's focused on that, like the Internet. Yeah. So that P300 task he introduced, it got a ton of funding here in the United States. It's happened in a few cases here, but hasn't taken off, and the scientists really have been very down on the P300. Nevertheless, in what really could be a movie of its own, he sold the technology to this company called Brainwave Sciences that had Michael Flynn and this ex-KGB spy on the board of it. And they started selling it across the world to other law enforcement, and there are reports of police regularly using this technique to convict people of murder.
[01:14:35] Jordan Harbinger: What should we expect in terms of how this tech deploys and unfolds over the next few years? How close are we to reality for stuff both like Neuralink or the earbuds that you mentioned actually being useful and when can I upload my brain to the Internet?
[01:14:52] Nita Farahany: All good questions. I'm going to do one thing first, which is, I'm going to say, I think it's here, and we can expect within the next two to five years for it to be really widespread, as common as the other kinds of technology that we have. But we have a moment, like a millisecond, where I think we could do it differently. We've talked about a lot of the more frightening aspects of this, and the biggest reason I wrote this book, besides helping people to really understand the technology and where it is and where it's going, is to give a pathway forward, right? To say there is a pathway forward by recognizing what I call a right to cognitive liberty. And I lay out what that idea is and how we can use it in the book. And that pathway, I think, would change the terms of service and change our relationship with the technology in ways that could empower people and make the scary downside less likely to occur. But we only have a little bit of time because, to your question of when is it coming? Just this spring, in the next few weeks, the earbuds, the multifunctional earbuds, are launching. Headphones are launching this spring from major headphone manufacturers that have the brain sensors embedded in them.
[01:16:03] You can already pick up a neurotech headset from, you know, dozens and dozens of companies. Neuro gamers are already playing video games using these headsets. It's just going to become more ubiquitous. Meta is launching their neural interface in early 2025. The question of when is it here? It's here. How widespread is it going to be in the next couple of years? It's going to be quite widespread by five years. I expect it to be pervasive across society.
[01:16:28] As for uploading your brain, you've got a little bit more of a wait. We really have to have the likes of Neuralink and other companies end up with their products being really safe and effective, and people wanting to implant them in the brain, to get to the point where we have such high resolution that you get to upload the entirety of your brain. And then you've got to ask the question: why would you possibly do so?
[01:16:51] Jordan Harbinger: Yes. Exactly. What are you thinking? Well, we know what you're thinking I guess at that point.
[01:16:56] Nita Farahany: Yes.
[01:16:57] Jordan Harbinger: Yeah. Nita, thank you so much. Really, really fascinating. Really interesting. And I think in a few years people will look back at this and say, "Wow, she was right about probably everything." And hopefully, we do sort of miss, they miss us with the totalitarian control element of this and we stick to the good stuff. I guess time will tell.
[01:17:14] Nita Farahany: I hope that's right. I hope we move quickly toward adopting a right to cognitive liberty so that when we look back on this in a few years, we say, whew, we dodged a bullet, and we're all enjoying that and we just share that with each other, brain to brain.
[01:17:28] Jordan Harbinger: Thank you very much.
[01:17:29] Nita Farahany: Thank you for really thoughtful questions and a fun conversation and keeping it light and interesting and also really thorough and comprehensive.
[01:17:37] Jordan Harbinger: We've got a trailer of our interview with Reid Hoffman, founder of LinkedIn and an investor at one of Silicon Valley's top VC firms. He drops by the show to discuss how we can tell when we're informing our intuition with the best available data or if we're just procrastinating to avoid making important decisions, why "never give up" is terrible advice, and how to separate our winning instincts from our losing ideas. That's coming right up after the show.
[01:18:03] Reid Hoffman: A piece of advice I most often give entrepreneurs is, don't just work on the product, work on your go-to market. It's a huge world. It's eight billion people, right? How do you stand out against eight billion people? Actually, in fact, that's kind of challenging.
[01:18:16] Jordan Harbinger: Yeah, that's a good, are we at eight already?
[01:18:17] Reid Hoffman: Yes.
[01:18:18] Jordan Harbinger: Oh my gosh.
[01:18:19] Reid Hoffman: Yeah. The, "Oh, I built this thing in a corner." No one sees it, and maybe it's the best thing ever, but no one sees it, so it's never used. That's the problem on the entrepreneurship side. So network is one key component. Another one is plans: you have plan A, you have plan B, which is how to think about, "Well, if A is not working out, maybe B will work, or maybe B will be a different path," or, you know, that kind of thing. And then you have a Z plan, which is, "It's not working out at all. What's my lifeboat plan? I'm going to row to a different set of plan A and plan Bs."
[01:18:47] There's always luck, there's always timing. The game is not so much, can I be one of the heroes that's written about in the next hundred years? The game is, can I do something where, from where I started, I can make something interesting? You're playing your own game. Yes, your passion's important, but you should be paying attention to market realities. You should say, "Well, what do the opportunities look like? What does competition look like? What's the best match for me to what the opportunity landscape looks like?" You could always say, well, more data is useful. The test is, what's the minimum set of data that you would actually, in fact, make this decision on?
[01:19:24] Jordan Harbinger: We need to separate our winning intuition or instincts from our losing ideas.
[01:19:30] Reid Hoffman: More often than not, greater than 50 percent of the time, you're going to have to give up on that idea. Everyone loves to tell these narratives of, "Well, when I was two I knew what I was going to do when I was 40."
[01:19:41] Jordan Harbinger: Yeah, it sounds good.
[01:19:42] Reid Hoffman: And it was a, yeah, and it was a straight line that was kind of smooth sailing. The wind was at our backs. It was kind of unproblematic. It's always fiction.
[01:19:50] Jordan Harbinger: For more with Reid Hoffman in a two-part mashup that includes cameos by the founder of Spotify, the CEO of Yahoo, and more, check out episode 207 of The Jordan Harbinger Show.
[01:20:03] Really interesting. I did not lie in the intro when I said this was fascinating. Look, I'll say this right now, I will never willingly cede my brain data to corporations. But I have an addendum to this: what I just said is a big fat lie. I definitely want to be able to use the products and services that are going to require me to cede this data, at least in part, and it's going to be virtually unavoidable, unless I want to be the equivalent of the guy who refuses to use a freaking cell phone in 2023. What I'm saying is, I think you're going to have to do this in some way in order to participate in modern society, because otherwise, you're not going to be able to control or use devices or anything at some point.
[01:20:38] I mean, imagine you're watching TV and it's like, oh, you don't want to change the channel with your brain. You want to use a remote? Good luck. You have to buy one. It's after-market. It's going to be like that at some point, because it's going to be so much easier to just look at the TV and say, I want it louder. I want it quieter. I want to change the channel. I want to watch something else. It's going to be how you unlock your phone. You're going to log into your bank, you're going to pay for things like that. Is this farfetched? Maybe a little bit, but I feel like it's certainly down the line. They already use facial recognition for payments and bank loans in China. This is also not rumor-mill crap from Twitter. My Chinese teacher told me that she uses this herself. This is not a sketchy tabloid thing. This is a real thing that is very mainstream in China right now.
[01:21:22] Now, what about the medical uses of this technology? Early disease detection, yay. Companies finding out first and using that data against us, boo. Detecting seizures and getting alerts up to an hour beforehand, this is all pretty amazing stuff. I'm really looking forward to it. I just think the downside is very, very real. And if social media is any indication, we're going to see the downside almost more than we see the upside, and that's what we have to be careful of here.
[01:21:47] And remember, it's not just thoughts and cognition that can be read or monitored. It can also be emotions and other data that never quite reaches the level of consciousness for most of us, most of the time. It sounds like whoever has this data will actually know our brains better than we know our brains, which means they have a pretty distinct advantage when it comes to persuading or influencing us. That is a little bit worrisome as well. So we got to figure out how to usher in this era, this inevitable era in a way that doesn't result in total dystopia, and that is what Nita is on the forefront of with her work. We need a right to mental privacy. We need to store data locally, not on company servers. We need cognitive and mental privacy policies, and frankly, we should have a right to access and keep and view our own data. Not only get it through the filter or lens of a company or a doctor. It's not something that should be left up to other people to decide for us, in my opinion.
[01:22:41] As for the work stuff, the employee monitoring, maybe the complex processing is done by a third party who handles the brainwave collection and data processing, as opposed to just giving raw data to the employer, who could misuse it or use it for things that are illegal or unethical. In other words, some random third party might see that an anonymized employee has a gambling addiction and is doing it at work. As opposed to that person's manager knowing about that specific issue, they just get a report that the employee is distracted at work, and maybe they're legally barred from asking exactly what the cause is, just like they are for medical issues right now. It's kinda like HIPAA for your brain, kind of. Not that the supplier would never do anything unethical on behalf of a client, but they do have to go a little bit further to do so. And I'm old enough to remember the accounting scandal where one of the big accounting firms lied on behalf of, I think it was Enron. But you know, adding a layer of accountability never really hurts, does it?
[01:23:32] The government stuff is really what should scare all of us. Even now, and here's a benign use of this, the government has used fitness tracker data to see if people were moving at the time of crimes or actually asleep. I mean, imagine somebody accuses you of murder and you can prove that you were asleep because you were wearing your Oura Ring or an Apple Watch at the time, and it said that you were out cold in your bed. This is just another step in that direction, so it's not wildly unprecedented that the government, prosecutors, judges, defense attorneys, et cetera, even juries, would have access to our brain data in a trial.
[01:24:04] I also wonder, will there end up being a gap between those who can afford these devices, this technology, and cognitive enhancement, and those who cannot? Is this fair? Is it unfair? Is this steroids in sports, or is this kids who can afford to have an SAT tutor versus those who can't? And is that inherently unfair? Lots to chew on, and this set of issues is certainly not going anywhere.
[01:24:28] Big thank you to Nita Farahany. All things Nita will be in the show notes at jordanharbinger.com. Speaking of things that kind of want to look inside your brain, you can use our AI chatbot to find anything we've ever done here on the show. Transcripts in the show notes, videos on YouTube. Advertisers, deals, discount codes, all ways to support the show at jordanharbinger.com/deals. Please consider supporting those who support us. I'm at @JordanHarbinger on both Twitter and Instagram. You can also connect with me on LinkedIn.
[01:24:55] And I'm teaching you how to connect with great people and manage relationships using the same system, software, and tiny habits that I do. That's our Six-Minute Networking course. That course is free. I use that stuff every single day, jordanharbinger.com/course. And many of the guests on the show subscribe to this course. Come join us, you'll be in smart company.
[01:25:13] This show is created in association with PodcastOne. My team is Jen Harbinger, Jase Sanderson, Robert Fogarty, Millie Ocampo, Ian Baird, and Gabriel Mizrahi. Remember, we rise by lifting others. The fee for this show is you share it with friends when you find something useful or interesting. If you know somebody who's into the brain or maybe a little bit of sci-fi stuff, they might really dig this one. The greatest compliment you can give us is to share the show with those you care about. In the meantime, do your best to apply what you hear on the show, so you can live what you listen, and we'll see you next time.
[01:25:44] Special thanks to Peloton for sponsoring this episode of The Jordan Harbinger Show.