Julia Galef (@juliagalef) is the host of the Rationally Speaking podcast, co-founder of The Center for Applied Rationality, and author of The Scout Mindset: Why Some People See Things Clearly and Others Don’t.
What We Discuss with Julia Galef:
- How to spot bad arguments and faulty thinking — even when the source is you.
- The difference between having a soldier mindset that defends whatever you want to be true, and a scout mindset that’s motivated to seek out the truth regardless of how unpleasant it might be (and which you should try to cultivate).
- How to tell if you’re making reasonable mistakes or foolhardy leaps of faith that carry consequences far outweighing the value of the lesson.
- The best ways to manage and respond to uncertainty.
- How your brain matches arguments you misunderstand with ones you’ve already decided you don’t agree with — and what to do about it.
- And much more…
Like this show? Please leave us a review here — even one sentence helps! Consider including your Twitter handle so we can thank you personally!
How often have you caught yourself defending something you wanted so badly to believe was true, only to discover that reality painted a far less flattering picture than that desired belief? We’ve all been there. Wishful thinking is as much a part of human nature as loving and wanting to be loved. But being able to identify when we’re believing in and defending things that just aren’t true means we can find the truth and confront it no matter how uncomfortable it may be. It’s what today’s guest calls the soldier mindset versus the scout mindset.
On this episode, The Scout Mindset: Why Some People See Things Clearly and Others Don’t author and Rationally Speaking podcast host Julia Galef joins us to discuss how to ensure we’re keeping the defensive soldier mindset in check while nurturing the curiosity of the truth-seeking scout mindset. We tackle reasonable mistakes versus negligent decisions, managing and responding to uncertainty, untethering misunderstood arguments from incorrect preconceptions, and much more. Listen, learn, and enjoy!
Please Scroll Down for Featured Resources and Transcript!
Please note that some of the links on this page (books, movies, music, etc.) lead to affiliate programs for which The Jordan Harbinger Show receives compensation. It’s just one of the ways we keep the lights on around here. Thank you for your support!
Sign up for Six-Minute Networking — our free networking and relationship development mini course — at jordanharbinger.com/course!
This Episode Is Sponsored By:
GoodRx helps millions of Americans save up to 80 percent on prescription drug costs with an all-in-one Rx pharmacy discount and medication price comparison app. Go to goodrx.com/jordan to find out more and make use of this 100 percent free service!
Better Help offers affordable, online counseling at your convenience. If you’re coping with depression, stress, anxiety, addiction, or any number of issues, you’re not alone. Talk with a licensed professional therapist for 10 percent off your first month at betterhelp.com/jordan!
Simple Mobile was founded on the idea that there is a better way to do wireless. Unlimited, no-contract plans starting at $25. Just text BYOP to 611611 to see if your phone is compatible, and visit simplemobile.com to find out more!
Nuun is a refreshingly effervescent hydration beverage, enhanced with electrolytes, minerals, carbs, and vitamins, and bursting with a natural, low-calorie fruity flavor. Go to nuunlife.com and enter code JORDAN for 25 percent off your first order!
What time is Bourbon Time? Join us in reclaiming 6-7 pm as the happiest hour to do whatever it is that makes you happy — and if that involves a glass of bourbon, remember to drink Knob Creek responsibly!
Saving money on your car insurance is easy with Progressive. It’s an average savings of $699 a year for customers who switch and save! Get your quote online at Progressive.com and see how much you could be saving today!
Miss the episode we did with Habits Academy’s James Clear? Catch up by listening to episode 108: James Clear | Forming Atomic Habits for Astronomic Results!
Thanks, Julia Galef!
If you enjoyed this session with Julia Galef, let her know by clicking on the link below and sending her a quick shout-out on Twitter:
Click here to thank Julia Galef on Twitter!
Click here to let Jordan know about your number one takeaway from this episode!
And if you want us to answer your questions on one of our upcoming weekly Feedback Friday episodes, drop us a line at friday@jordanharbinger.com.
Resources from This Episode:
- The Scout Mindset: Why Some People See Things Clearly and Others Don’t by Julia Galef | Amazon
- Rationally Speaking Podcast
- Center for Applied Rationality
- Julia Galef | Website
- Julia Galef | Twitter
- Julia Galef | YouTube
- The Dreyfus Affair | My Jewish Learning
- Metaphors We Live By by George Lakoff and Mark Johnson | Amazon
- The Map and the Territory by Michel Houellebecq | Amazon
- Rational Irrationality | Wikipedia
- Bryan Caplan | Twitter
- Spock | Star Trek
- Geek’s Guide to the Galaxy
- Letter to General Ulysses S. Grant | Abraham Lincoln Online
- Karen Ruth Levy | Twitter
- Obama’s Yes-Man Test Will Help You Get Honest Feedback by Nabil Alouani | Medium
- Conformity Bias | Ethics Unwrapped
- 2 Types of Uncertainty and 3 Ways to Respond to Them by Julia Galef | Fast Company
- Julia Galef: Why You Think You’re Right — Even If You’re Wrong | TED
- The Power of Intuition: How to Use Your Gut Feelings to Make Better Decisions at Work by Gary Klein | Amazon
- Gary Klein | Twitter
- When Beliefs Become Identities, Truth-Seeking Becomes Hard | Rational Animations
- Silicon Valley | Prime Video
- Keep Your Identity Small | Paul Graham
- The Ideological Turing Test: More Important Than Ever? by Josh Hamilton | Medium
- Megan McArdle | Twitter
- Annie Duke | How to Make Decisions Like a Poker Champ | Jordan Harbinger
- Daniel Kahneman | When Noise Destroys Our Best of Choices | Jordan Harbinger
- De Minimis | Wikipedia
Julia Galef | Why Some People See Things Clearly and Others Don’t (Episode 536)
Jordan Harbinger: Coming up next on The Jordan Harbinger Show.
[00:00:02] Julia Galef: What I advocate for is just trying to hold your identity more lightly, trying to maintain a little bit more emotional distance from your beliefs. And so you can still call yourself, you know, a feminist or whatever you are, but try to hold that more lightly. So that involves things like reminding yourself that label is contingent. So yes, I'm a feminist now because that label just best describes my beliefs, but if I were to come to the conclusion that feminism is actually wrong or harmful, then I would not be a feminist anymore. So essentially, you're trying to make the label feel like just a label and not like a flag that you're waving or a badge that you're wearing.
[00:00:45] Jordan Harbinger: Welcome to the show. I'm Jordan Harbinger. On The Jordan Harbinger Show, we decode the stories, secrets, and skills of the world's most fascinating people. We have in-depth conversations with people at the top of their game, astronauts, entrepreneurs, spies, psychologists, even the occasional Russian chess Grandmaster, rocket scientist, or undercover agent. And each episode turns our guests' wisdom into practical advice that you can use to build a deeper understanding of how the world works and become a better critical thinker.
[00:01:13] If you're new to the show or you're looking for a handy way to tell your friends about it, we've got episode starter packs. These are collections of your favorite episodes organized by popular topics. That'll help new listeners get a taste of everything we do here on the show. Just visit jordanharbinger.com/start to get started or to help somebody else get started. Which of course is always, always appreciated.
[00:01:33] Now, today, we all think we're rational thinkers, even though we know that we're not. Well, most of us at least should know that we are not. Often on the show, we cover cognitive bias and other ways that our decisions can be impacted by outside influences or inside ones. One of my favorite things to do is just find out where I'm consistently wrong and then figure out a way to stop that mistake from happening.
[00:01:55] Today, my friend Julia Galef is going to teach us how to spot some of our own habits when it comes to bad thinking and give us some strategies to mitigate them. She's the co-founder of The Center for Applied Rationality, which sounds pretty fancy and impressive to me. She also wrote a new book called The Scout Mindset, which is what we're going to talk about today. So get ready to improve your decision-making skills and learn to spot some bad arguments and bad thinking even when the source is you.
[00:02:19] And if you're wondering how I managed to book all of these great authors, thinkers, creators every single week, it is because of my network. And I'm teaching you how to build your own network for free, not for podcast guests mostly, but for career stuff, personal reasons, whatever you need it for. You got to dig that well before you're thirsty, people. You just never know. Most of the guests, they subscribed to the course already. Come join us, you'll be in smart company where you belong. You can find it all for free over at jordanharbinger.com/course. Now, here's Julia Galef.
[00:02:49] Let's talk about why some people see things clearly and others don't. That's sort of the premise of the book, right? Is that kind of where we're going with this?
[00:02:57] Julia Galef: Ah, yeah, that's right. That's a simplification, of course, that fits in a book subtitle on the cover.
[00:03:03] Jordan Harbinger: It is one sentence.
[00:03:04] Julia Galef: Yeah.
[00:03:05] Jordan Harbinger: Yeah. You did write a whole book, not just one sentence.
[00:03:07] Julia Galef: I had a longer version of it, which was like, why some people in some contexts see things somewhat more clearly on average than other people in those contexts, but my editor nixed that, so we went with the one that you saw.
[00:03:21] Jordan Harbinger: I see why that didn't work as well as the one that you eventually came up with. The scout mindset though, seeing things as they are, not how we wish they were, is sort of deceptively simple. Because I think a lot of people I've met — look, myself included years ago, before doing a decade and a half's worth of shows about cognitive bias and neuroscience — people think they see things as they are. And I routinely see this argument online, right? If someone's arguing on Twitter, "I'm just telling you the facts." And it's like, well, okay, you've got your feeling and your political belief and then your religious doctrine. And then, like, some set of facts that you've kind of misunderstood. And now you think, "I'm just telling you what's right in front of all of our eyes," and it's never the same. You know, we can get as philosophical as we want about how our brains construct reality, but at the end of the day, nobody really seems to see things how they are. All we can do is get close, right?
[00:04:10] Julia Galef: Yeah. I encounter three — well, at least three big kinds of reaction to the idea of trying to see things more clearly. One is the kind of people that you're describing, who basically think they are already seeing everything as it is. And they're just perceiving unfiltered reality. There's no lens of interpretation or bias or error going on there. That's a common reaction. Another category are the people who I think are a little bit more self-aware who do agree that, "Yes, I'm sure I'm wrong about some things. I'm sure I have some biases," because everyone does. But there's never a moment where they're like, "Oh, yes, this is a thing that I might be biased about."
[00:04:48] On any particular issue, they feel like, well, you know, the right answer on this topic is clear. And so there's never a particular case in which they think they might actually be biased. And then the third category are people who don't actually think people should try to see things as they are. And I think a lot of people, maybe the majority of people, feel this way about at least some topics: that, for example, no, you shouldn't try to have an unfiltered, objective view of yourself and your strengths and weaknesses, or the probability of success of your startup. You should try to be as positive as possible regardless of the facts, because that will be a self-fulfilling prophecy and yada yada yada.
[00:05:26] So these are three different kinds of reactions to the idea that we should try to see things more clearly. And I kind of try to address all of them in the book.
[00:05:35] Jordan Harbinger: That makes sense. It's kind of like, "Oh, I'm sure I have some bias, but not on this particular thing that we're arguing about right now on this one. I'm definitely right."
[00:05:43] Julia Galef: Exactly. Which, of course, I sympathize with. When you're wrong or when you're biased, it never actually feels like you are in the moment. So it takes some real kind of trust in the outside view that like, even though it feels like I'm definitely right, there is at least some chance that I'm not because everyone's wrong about lots of things and I've been wrong in the past. And so you have to kind of do that abstract leap in your brain to override your certainty that in this case you're definitely right.
[00:06:07] Jordan Harbinger: The capacity to fool ourselves knows no bounds really, right? I mean, we can always convince ourselves of something. And often we don't really know that we're doing that. And the book starts with the Dreyfus Affair. And I feel like a lot of folks have heard about this. I'd heard about it before, but I didn't really know what it was. What is the Dreyfus Affair? What does this tell us about bias and motivated reasoning?
[00:06:26] Julia Galef: Yeah, so I started — I love the story. I just find it so striking and moving. And I started my book with it because it kind of encapsulates these two different modes of thinking, which I call soldier mindset and scout mindset, on opposite ends of the spectrum. So what happened was, this was in France in the late 19th century. Essentially, there was this crumpled-up memo that was discovered in a wastebasket in the German embassy. But it was found by a French woman who was working as a spy for the French. And so she pulled it out of their wastebasket and brought it back to the general staff of the French army, and they read it, and it revealed that someone in their ranks, in the French army, very high up, was leaking military secrets to the Germans, which was alarming.
[00:07:10] And so they started this investigation to figure out who's the mole or who's the leak. And they pretty quickly anchored on to this man in the high ranks of the army named Alfred Dreyfus. What was unique about him was that he was the only Jewish member of the general staff of the army. And anti-semitism was extremely common in France. It seems extremely likely that his Jewishness was a large part of why the investigators anchored on him so quickly. So they started investigating him, trying to dig up any suspicious stories about his character or his past. And they found a lot of rumors that they thought painted him in a bad light.
[00:07:47] They searched his home and they didn't find any evidence, but they figured, well, that's probably just because he hid it all really carefully. They took a sample of his handwriting to get it analyzed by a handwriting expert to see if it matched the memo that had been found. They got kind of mixed reviews. Their in-house analyst was like, "Yes, sure. This is a match." But the outside analyst was like, "Ah, this is not really very close," but they discounted that opinion because they figured, "Well, that handwriting analyst, he works for the Bank of France and there's a lot of Jews in the Bank of France. And so probably there's a conflict of interest, so we can ignore him."
[00:08:19] So anyway, I'm giving you all these details to kind of give a sense of the type of reasoning that the investigators were doing, where they looked really hard and were really eager to believe any, even vague unsubstantiated rumors about Dreyfus. And on the other hand, they worked really hard to discount any failure to find evidence or any dissenting opinions about whether the handwriting matched and things like that. So they ended up convicting Dreyfus and he was kicked out of the army and he was sentenced to solitary confinement for life on this barren island called Devil's Island. And he kept insisting he was innocent, but it didn't do anything.
[00:08:54] And so that would have been the end of his story, except that a year later, this new investigator named Colonel Picquart was put in charge of the department that had done the investigating. And he started going through the old files and he had assumed that the case against Dreyfus was really strong. But actually when he looked into it, he realized, "Man, we just have this really flimsy, circumstantial evidence. This is not a strong case at all." And so he started asking around and confronting his colleagues in the department with this terrible evidence. And he was shocked to learn that no one else was all that concerned about the risk that they might've actually convicted an innocent man. Oh, and I should mention there was more spying happening. They found more memos after Dreyfus was sentenced to solitary confinement. And so of course, Picquart was like, "Clearly, we haven't actually caught the spy. So we should be really worried that we convicted Dreyfus unfairly." And the response he got from his colleagues and superiors was, "Well, probably Dreyfus taught someone else to write in his handwriting and that's a new spy, but Dreyfus is also a spy."
[00:09:51] Jordan Harbinger: Crafty Jews, right? That's what they're thinking in France over there at that time.
[00:09:57] Julia Galef: I know. And so, you know, to us as outsiders, over a century removed, and also to Picquart, this seemed like obviously a convoluted excuse or rationalization to justify their conviction of Dreyfus, but no one else would admit that. And I should mention that Picquart himself was also pretty anti-semitic. He was, like, on the record making anti-Jewish jokes, and he didn't even like Dreyfus personally, but he cared so much more about what was actually true. That motivation pushed him to keep trying to figure out if Dreyfus was actually innocent, even though he had nothing to gain personally from exonerating Dreyfus. And in fact, he had a lot to lose; like, he was making enemies. He got threatened by other people in the army. They even sent him, Colonel Picquart, to jail for, I don't know, months or over a year for his investigations, but he still pushed forward because he was like, "No, the truth matters. We don't want to convict an innocent man." And so eventually, fortunately, Picquart triumphed and Dreyfus was exonerated. And the real spy, this guy named Esterhazy, was caught and, I think, fled the country and died in infamy in exile. And so this is now called the Dreyfus Affair.
[00:11:03] And what I love about it is that it shows these two extremes of human reasoning. On the one hand, we have the capacity to be in what I call soldier mindset, where our motivation is just to defend what we want to be true against any evidence that might threaten it. And so we work really hard to accept things that we want to believe and to reject things we don't want to believe. And that's what the investigators who convicted Dreyfus were doing. And then on the other hand, we have this other mode that we can often get into that I call scout mindset, in which our motivation is just to figure out what's actually true, whether or not that's convenient or flattering or pleasant. We just want to know what's actually out there so that we can act on that. And so I think Colonel Picquart is a shining example of someone in scout mindset.
[00:11:46] Jordan Harbinger: So why call it the scout mindset? Did you come up with this? Is this like a cleverly branded mindset that you've come up with?
[00:11:52] Julia Galef: Well, it is my metaphor, but it's like a chimera of a few different metaphors that were already out there that I kind of combined into one. So to start with soldier mindset, that metaphor comes from the fact that our language, the way that we talk about reasoning and argument and belief, is very militaristic, which is something that was first pointed out, to my knowledge at least, by a linguist named George Lakoff in his book Metaphors We Live By. So he talks about how our language about beliefs is as if they're buildings that are supposed to be as strong and impenetrable as possible. So we'll talk about, like, buttressing a position or, you know, supporting evidence or a belief resting on firm foundations. And then conversely, when we talk about arguments, it's as if we're soldiers fighting on a battlefield, so we try to shoot down opposing arguments or, like, poke holes or find the weak points in someone's logic.
[00:12:41] And so to me, that made it make sense to call that mode of thinking soldier mindset. And then scout mindset is just a natural mirror image of that, because the scout's role is not to attack or defend but to go out and see what's actually there and form as accurate a map of reality as possible, so that you can navigate the situation as well as possible. And that too, I kind of borrowed that metaphor from — it's a metaphor from a mid-century philosopher, the map and the territory, which is just, it sounds very simple, but it's something that we have a hard time keeping in mind, as you brought up: that our perceptions of the world are like a map that we're drawing, and it's not the same as the actual territory, actual reality.
[00:13:20] There's always some disconnect. Your map is always going to be simplified or imperfect or, you know, missing a lot of information. And we sort of acknowledge that in theory, but it's really hard to keep that in mind in practice, as you pointed out. So I kind of mashed that up with the metaphor about beliefs and arguments being a military endeavor and came up with the soldier versus the scout.
[00:13:41] Jordan Harbinger: We're obviously a mixture of scout and soldier. Why are we sometimes more scout-ish and other times more soldier-ish? That's the question, right? So sometimes we can be more cool, detached, objective, whatever. And other times we're just, like, wrapped up in the emotions and the bias and things like that. And like we mentioned at the top of the show, most of us think we're more scout than we are, and we end up being more soldier than we'd like.
[00:14:04] Julia Galef: Yeah. So to some extent there are differences between people, where some people are just — they're more committed to the idea of seeing things accurately than others are. And so they try harder to double-check their intuitive judgments, or to look for ways that their views might be wrong or flawed. So there are definitely individual differences, but also, within a given person, they'll fluctuate between scout and soldier mindset depending on the topic or the day, even. I think I'm often pretty well into scout mindset, but there are some conversations with some people that just really push me towards the soldier end of the spectrum. And in retrospect, at least, I can notice I was not actually listening to them. I was just reaching for defenses of my argument and for flaws in theirs. And there's something about the way they challenge me that puts me into that mode.
[00:14:55] And, you know, it depends on context as well. So you could imagine, for example, a financial trader who is in scout mindset at work and really wants to figure out, "Which of my assumptions about the market were wrong?" — you know, he tries to test his assumptions — and then comes home and is very much in soldier mindset when it comes to his personal relationships, and is unwilling to entertain the possibility that someone else's view of things might be right, or, you know, the idea that there might be problems in his relationships, things like that. So it really depends on your mood and sort of how much emotion or ideological baggage you have on a particular topic.
[00:15:33] Jordan Harbinger: Tell me about the rational irrationality hypothesis. This is one of those sort of tongue twister concepts that seems pretty useful.
[00:15:40] Julia Galef: Yeah, it sounds almost paradoxical, rational irrationality, but it does make sense. This is a theory that a lot of economists and evolutionary psychologists have talked about, but the name rational irrationality was coined, I believe, by Bryan Caplan, who's an economist at George Mason University. And the rational irrationality thesis is talking about two different kinds of rationality, called epistemic and instrumental. So epistemic rationality is essentially what I'm talking about with scout mindset, trying to see things as accurately as possible. Whereas instrumental rationality is about achieving your goals, whatever those might be. Being happy, getting rich, helping the world: whatever your goals are, instrumental rationality is pursuing them as effectively as possible.
[00:16:24] And so the theory of rational irrationality is that humans evolved to essentially be epistemically rational only when it helps them, and at other times to kind of automatically be epistemically irrational, to see things inaccurately, to be in soldier mindset, because that's good for their goals. And the idea, the claim, is that we evolved with kind of a good, intuitive sense of when we should have true beliefs and when we should have false beliefs, because that would be more useful. And so this is kind of — it's a big counterargument to what I'm saying in the book. What I'm saying is that we should be in scout mindset much or all of the time and that having more accurate beliefs would be good for us. And so the rational irrationality people say, "No, no, no. We're already at the optimal balance between true and false beliefs. And if we tried to see things more accurately, that would be bad for us." So that's their claim.
[00:17:16] And so in the book, I tried to explain the rational irrationality view and then explain why I don't actually think that makes sense and I think we should be more accurate.
[00:17:25] Jordan Harbinger: Why do we seem to have a bias towards the soldier mindset? Like what is it about humans and human brains and evolution that leads us to be — why can't we be more like, you know, Spock or whatever, where we're just like, "Oh, I'm completely rational and have no emotional—" Why did we evolve this, do you think?
[00:17:42] Julia Galef: Well, as a side note, I actually think Spock is not very rational, epistemically or instrumentally. I think he's kind of a caricature of what people imagine rationality looks like, but he's not rational, but that's a topic for another thread or another day.
[00:17:56] Jordan Harbinger: That's for a Star Trek podcast. This is not a Star Trek podcast. Yeah, like, that's a different argument.
[00:18:02] Julia Galef: Yeah, I've spent a lot of time on that on Geek's Guide to the Galaxy, but I understand if you don't want to spend the next half hour dissecting Star Trek episodes.
[00:18:09] Jordan Harbinger: Yeah, I'm unequipped to deal with that.
[00:18:12] Julia Galef: I mean, to be honest, I'm not as well equipped to dissect Star Trek as many of my listeners are, or my readers. I've gotten a bunch of very esoteric challenges to my Star Trek claims that I can't really defend. Anyway, but your question was actually about why did we evolve to be so often in soldier mindset?
[00:18:30] Jordan Harbinger: Yeah.
[00:18:31] Julia Galef: And so what seems pretty clearly true to me, although there's not, like, a bunch of super high-quality studies out there proving this, but I'm describing what seems pretty obviously true to me and you can judge for yourself, is that soldier mindset benefits us and makes us feel good in the short term. So for example, if you are in soldier mindset and you convince yourself that, "No, actually, that fight I got into wasn't my fault. It was all the other person's fault." Or, "That thing that went wrong at work wasn't my fault. There was nothing I could've done to prevent it." Or, you know, "It was the fault of the people above me who made me do whatever." We all have a ton of excuses that we come up with to justify the belief that makes us feel good or look good, and that does — it feels good in the moment. You feel a sense of relief and self-righteousness, and you feel validated and so on. And it can even make you look good in the short term to be able to convince other people that you weren't at fault and things are going great and the problems aren't the result of anything you did, whatever. So it's not that there are never any benefits that result from soldier mindset, because there are. The problem is just that those benefits tend to be really weighted towards the short term.
[00:19:44] So you feel good now if you convince yourself you didn't make a mistake, but what you're sacrificing is that you're less likely to notice in the future if you're about to make a similar mistake again. Or, you know, just generally, if you refuse to acknowledge that you might be wrong about, I don't know, some political debate or something like that, even though being wrong about some political issue doesn't have any direct bearing on your life, you're losing out on the ability to be more self-aware and more able to notice that you're wrong about things and acknowledge that you're wrong about things in general, which is an important life skill, even well outside the domain of politics.
[00:20:20] And so there are all these benefits to scout mindset, benefits in terms of fixing our flaws and changing things that we don't like about our life, and just generally becoming more self-aware and emotionally resilient, but those benefits take a little while to build up. And unfortunately, one feature of human psychology is that we are much more motivated by the short term than by the long term. And we're much more motivated by the kind of immediately salient, direct benefits, like feeling good or saving face in a conversation that we're having right now. Those are very motivating to us, whereas the longer-term rewards of increased self-awareness and improving your life over the long run, those are less motivating in the moment. And so I think that scout mindset versus soldier mindset is just like a lot of other choices and trade-offs that humans have to make, between sacrificing our comfort in the moment in order to build strength, or working on that thing that we've been putting off instead of leaving it for tomorrow. Those are the kinds of trade-offs that the human brain is not very good at making. And so we end up undermining our own goals over the long run.
[00:21:24] This is, like, the center of my objection to the rational irrationality people, which is: I don't know why you would think that humans are optimally evolved to make these choices when it comes to seeing the world clearly, when we obviously are not optimally evolved to make choices about, you know, exercising now versus putting it off, or breaking our diet versus sticking to it. Like, there are all these ways in which our psychology clearly gets in our way. And I think scout and soldier mindset is well within that category.
[00:21:54] Jordan Harbinger: I mean, how many times, even just — so the fitness one is a good example, but how many times have you had this conversation where you're annoyed with somebody and you're like, "Should I send them an email that's a little bit passive-aggressive or nasty, or, like, really calls this out?" And then you're like, "No, that would be a bad idea." And then you're like, "So I'm going to send an email anyway that kind of does that, but makes me feel like I'm not doing that, but I'm also going to feel satisfied that I totally just did that." And then, like two days later, you're like, "Damn it, that was — I should not have sent that."
[00:22:20] Julia Galef: Yeah, that's a great example. Yeah.
[00:22:23] Jordan Harbinger: We've all done that.
[00:22:25] Julia Galef: So the version of that that I was just thinking about the other day is when I'm in a disagreement with someone, especially if I'm, like, a little irritated that they could possibly think what they think. I often feel like I'm being perfectly polite and reasonable in my responses to them. And then if I kind of step outside of myself and listen to what I'm saying, it's obvious that I'm actually being kind of biting, and I'm maybe strawmanning them a little bit. But it doesn't feel that way; it's so easy to convince yourself that you're being perfectly polite and reasonable. But, you know, you use turns of phrase like — in one argument recently, I remember I started to use the phrase, "So in your mind, blah, blah, blah—" And I told myself at the time I was just trying to, like, understand their point of view, but when you listen to that phrase, like, "so in your mind," that sounds kind of sarcastic or like—
[00:23:15] Jordan Harbinger: So what you're saying is—
[00:23:17] Julia Galef: That would be a better way to say it. That's, like, how you would say it if you weren't annoyed and disdainful of them but—
[00:23:23] Jordan Harbinger: I suppose, yeah.
[00:23:24] Julia Galef: —but the phrase, like, "so in your—" I don't know. If someone said to me, "So in your mind, blah, blah, blah," I would think they were kind of being disdainful. But anyway, this is just an example.
[00:23:32] Jordan Harbinger: Oh, you know what I'm thinking now?
[00:23:34] Julia Galef: Yeah. You just never really hear that phrase followed by something reasonable. It's always followed by like, "So in your mind, this is all a conspiracy," or, you know, something kind of contemptuous like that.
[00:23:47] Jordan Harbinger: You're listening to The Jordan Harbinger Show with our guest Julia Galef. We'll be right back.
[00:23:52] This episode is sponsored in part by GoodRx. Do you know different pharmacies charge different prices for the same drug? I had no idea about this. Imagine if your grocery store acted like a pharmacy, so milk can only be found behind the counter, and you have no idea how much it's going to cost until you check out. At checkout, you find out that this particular brand of milk at this particular store is $105 and you can't price-check it against any other brands. And the weirdest part — at another store you've never heard of, that milk is $5. Welcome to the wild world of prescription drug pricing in America. GoodRx is a neat platform that gathers current prices and discounts to help find the lowest-cost pharmacy for your prescription at over 70,000 pharmacies like CVS, Kroger, Walgreens, Rite Aid, Vons, Walmart, and more. You can save up to 80 percent, which, you know, when drugs are so expensive around here, that's a hell of a lot of money. So don't go to the pharmacy without checking GoodRx first. It's the number one most downloaded medical app. It's free to use. There's no weird membership stuff. My father-in-law recently went to get his prescription filled, and even though he already has good insurance, he pulled up the GoodRx app and showed the pharmacist, and it saved him 30 bucks. And this is a guy who likes saving 30 bucks, let me tell you.
[00:25:01] Jen Harbinger: Start saving up to 80 percent on your prescriptions today. Go to goodrx.com/jordan. That's goodrx.com/jordan. Goodrx.com/jordan. GoodRx is not insurance, but can be used instead of insurance. In 2020, GoodRx users received an average savings of over 70 percent off retail prices.
[00:25:18] Jordan Harbinger: This episode is also sponsored by Better Help online therapy. A lot of us have been stuck in a little COVID hole for a while, and a lot of people are still feeling down and emotionally out of sorts. You might not feel depressed, or be unable to get out of bed, or be at some total loss, but if you are feeling a little bit off or your relationships are suffering, that could be a sign that you should talk to somebody, and I highly recommend therapy. I think it's a great thing to do. The more sane you are, the more you could probably use a little bit of therapy just to stay that way. I highly recommend checking out Better Help: betterhelp.com/jordan. Fill out a questionnaire. It takes just a few minutes. Start communicating with your therapist in under 48 hours. It's video chat, phone chat, live chat, whatever you're comfortable with. It's obviously confidential, and you can change counselors anytime if you need a better fit.
[00:26:02] Jen Harbinger: Online therapy is convenient and more affordable than in-person therapy. And our listeners get 10 percent off your first month of online therapy at betterhelp.com/jordan. That's better-H-E-L-P.com/jordan.
[00:26:14] Jordan Harbinger: At Simple Mobile, you get the no contract advantage. You can get a powerful nationwide 5G network all without a contract. 5G capable devices and SIM required. Actual availability, coverage, and speed may vary. 5G network not available in all areas. 5G upload speeds not yet available. Simple Mobile, out with the old, in with the simple.
[00:26:33] Now back to Julia Galef on The Jordan Harbinger Show.
[00:26:37] It's like those people who say, "I'm not racist, but—" and then 100 percent of the time they follow that up with a semi-racist or a full-on racist comment. Like, nobody starts a sentence like that. "Look, I'm not trying to be rude, but—" or, like, you know, "It's just my—"
[00:26:52] Julia Galef: No offense but—
[00:26:53] Jordan Harbinger: No offense, no offense, but — offensive thing, right? Exactly, offensive statement.
[00:26:58] Julia Galef: Right. I didn't feel like I was doing that in the moment. It was only clear to me when I sort of stepped outside and listened to myself. Anyway, I was agreeing with you, but it's very easy to kind of fool yourself into thinking you're being perfectly objective and reasonable and only realize later that you weren't.
[00:27:11] Jordan Harbinger: Yeah. Story of my life. How do we know if we've got the scout mindset going then? Is there, like, a test we can use? You have some criteria in the book that I think are really useful, right? Like, do we test our claims and try to prove ourselves wrong sometimes? Things along those lines.
[00:27:30] Julia Galef: Right. So precisely because of this problem, where we always feel like we're reasonable and rational even when we're not, I wanted to caution people that it's not enough to say, "Well, of course I'm a scout all the time. I feel perfectly reasonable and rational. And I feel like I care about the truth." You have to kind of look at your track record, your actual behavior, to see if you're acting like a scout instead of just whether you feel like one, because everyone does. So I talked about some behavioral cues of scout mindset that you can look for. One of them is whether you can think of actual concrete examples in recent memory in which you realized someone else was right and told them so.
[00:28:09] I have this example I love of Abraham Lincoln, who was a wonderful scout in so many ways. He kept coming up in my research. He wrote a letter during the Civil War to General Grant, who had executed this military maneuver that Lincoln thought was a terrible idea, but Grant went ahead and did it anyway and it really worked out. They secured this very strategically important city in the war. And so Lincoln was pleasantly shocked and wrote Grant a letter saying, "You know, I just wanted to tell you that you were right and I was wrong." That was his phrase. And I just loved that, because he didn't have to do it. No one was forcing him to admit he was wrong. And so that's kind of an example of the type of thing that I think is a good hallmark of being a scout. You know, unprompted, unforced, do you tell people when you think they were right about something that you disagreed with them about?
[00:28:54] And then another example, and I'll just give you one more behavioral cue of scout mindset: can you name critics of yours who you think are reasonable? We tend to focus on the critics who are unreasonable. And we all have critics: critics of your beliefs, of your political or ideological beliefs, critics of your lifestyle. Like people who think you should definitely have children when you're not planning to, or people who think you should definitely breastfeed when you don't, or the converse, or people who think you shouldn't work in tech because tech is evil, whatever. We all have people who disagree with us and criticize our choices and our beliefs. So I think a good test is: can you point to criticisms, or to people who are critical of you, who you think are, like, smart and well-intentioned, even if you disagree with them? Because the ability to recognize that someone is making a reasonable point, given their premises, takes some emotional strength. It takes a commitment to seeing things clearly in the face of a very natural urge to be defensive.
[00:29:55] So those are two examples of the kinds of things that I think are hallmarks of scouts.
[00:29:59] Jordan Harbinger: I think this makes sense. I think one that I particularly liked that you also wrote about was do you take precautions in order to avoid fooling yourself, right?
[00:30:06] Julia Galef: Yeah.
[00:30:07] Jordan Harbinger: Are we coming up with ways to mitigate bias that we know is going to come into play early on? Are you putting into place systems, or checks and balances, to make sure that you're not just able to rationalize something you actually ended up being wrong about in the first place? And I love what you said about telling people after the fact, later, unprompted, that you realize they were right, because you're not waiting for them to back you into a corner or embarrass you on social media or whatever. You're just saying—
[00:30:30] Julia Galef: Exactly.
[00:30:31] Jordan Harbinger: "Hey, by the way, you were right about that thing," because that shows — well, there's like an ego hit there, right? So it means you sort of value truth over your own ego, which probably not a lot of people are doing that, most of the time.
[00:30:43] Julia Galef: Yeah. I tried to pick criteria that I thought actually do, you know, separate the wheat from the chaff, and that aren't things everyone does all the time anyway because they're easy. And so, yeah, testing your beliefs or noticing that you were wrong, unprompted, unforced, is a big one. It is much easier for people to find cases where they kind of grudgingly admitted they were wrong because they were forced to, because they were backed into a corner. And I don't think you get that many points for that. So yeah, I wanted to make that distinction.
[00:31:11] And a couple of examples, since you brought up that other criterion of putting systems in place to kind of prevent you from fooling yourself. An example of that that I like is: before you start some endeavor, some project like at work, decide ahead of time what would count as a failure versus a success. That kind of checks your ability to rationalize after the fact, you know, "Well, okay, so it didn't quite work out the way I expected, but that's okay. We wouldn't have expected it to work that soon anyway, so we should really give it another year." Or, you know, there are all these ways that we can rationalize, because it's unpleasant to decide, "Well, this thing I was putting so much time and effort into is a failure," or, you know, "The thing that I vouched for to my colleagues actually didn't work." It can be very tempting to come up with rationalizations to keep, like, zombie projects limping along for years past when they should have been scrapped.
[00:32:02] One woman that I talked to and didn't end up writing about in the book, her name is Karen Levy, ran basically an anti-poverty non-profit in Africa, and she would do this: whenever she had meetings with stakeholders before a project launch, she'd say, "Okay, let's decide what would make this project a failure versus a success." And so they have that in writing, and then a year later they can go back and compare the results to the standards they set for themselves. So they can't move the goalposts later. That's the kind of thing that I try to do in my personal life as well.
[00:32:34] Jordan Harbinger: Yeah, I suppose that makes sense, because otherwise you can just rationalize your way into something. I mean, sometimes it makes sense to move goalposts if you go, "Oh, we thought we were going to get a 10 percent ROI, but that is completely unrealistic. We need to shoot for one percent," or, "We've already hit that. It doesn't make any sense to keep this as the goal because we hit it in the first week of our 50-week project here. Let's double it and see if that's a stretch." But other times, yeah, if we're just saying, "Oh, I'm just going to do this and see what happens," it's really easy to rationalize that something was a win. And I see this a lot with, like, stock traders and cryptocurrency traders who spend tens or hundreds of thousands of dollars and lose it, and they go, "Well, on the upside, I learned all about this. And I guess if you look at it this way, then it was better that I did this now because dah, dah, dah," and you're going, "Cool, but you lost five million — no matter what way you want to paint it, you lost five million dollars." Like, you'll learn something? What did you learn? That you already knew you shouldn't have done this, but you did it anyway, and when you break rules that you already had, you pay consequences. I don't know if that's a great lesson that you've learned right there. Like, you learned something you already knew. You just suffered a huge amount of consequences. I see this a lot with people in that position.
[00:33:41] I also think viewing being wrong as an opportunity to hone your prediction skills or your deduction skills is great, but you shouldn't delude yourself so that every time you totally blow something, you're like, "Well, on the upside, I learned a lesson." It's like, maybe you didn't need to learn that lesson in such a manner. You could have just figured out first what it means to be successful, or learned, or done something while paying one percent of the cost that you did to learn that particular lesson. So I think there is value there.
[00:34:08] Julia Galef: Yeah, but I think it's a subtle and really important point, because I do sometimes find myself cringing when I see, like, other people with similar attitudes to mine who are advocating for truth-seeking and intellectual honesty and the value of noticing you were wrong. All of which I think is great, obviously, but sometimes that message can kind of get twisted into, like, "It's never bad to be wrong." And then you always kind of have an all-purpose excuse or justification if you were wildly wrong about something or made a terrible mistake. So it's kind of tough to talk about without erring too much on one side or the other. But the question that I try to ask myself after I've noticed I was wrong about something is: was this a reasonable mistake to make? Like, given the information I had at the time and given the amount of time I had to make this decision, did I make a good call given what I knew at the time?
[00:35:03] And often I think the answer is yes. Like, I was wrong about this, but it was a reasonable guess for me to make at the time. And now I know more and I know better, so I won't make that mistake again. And so I shouldn't feel bad about being wrong. But then other times, you know, even with what I knew at the time, it should have been obvious to me, if I was being a little bit more careful, that this was a bad call. You know, I'll often ask myself, was I negligent? I'm not asking myself, was I wrong? But rather, was I negligent in the way I made this decision? And that's the question that I think should determine — not, like, whether you self-flagellate endlessly, but whether you feel a little bit bad about your error versus whether you don't — and that is whether you were negligent in the way you made that call. Does that make sense?
[00:35:46] Jordan Harbinger: Yeah, it does. So, like, if the decision-making process was good but the result ended up being bad, that's still — you did it right. You just ended up being wrong in this particular instance, or it just ended up not working out in this particular instance. But if you look and you end up being wrong, like, let's say you lose money on a bunch of investments in one year. You have a negative return. Not great, nobody likes that for their investments. But you diversified, everything was great, you put a bunch of money into the S&P 500, and you had all these little diversified baskets of funds, and then the whole economy kind of tanked, and you ended up with a negative return and lost some money. It's like, okay, well, I did everything right, the way the conventional wisdom says to do it. I did it. I lost money. That's the way the cookie crumbles. But if you lose money and you go, "Yeah, I put everything into random penny stocks that this guy on the Internet sold me, and I lost money," well, that was a bad decision, and that's why you ended up losing. Then you should learn a hard lesson. But there's kind of no point, like you said, in self-flagellating if you made a decision that turned out to be wrong but all the pieces were in place and you did the process itself correctly. Like, maybe you can look at it as a normal decision where nine times out of 10, or 99 times out of a hundred, this decision would have yielded a better result. We really got unlucky here.
[00:37:04] Julia Galef: Right, exactly. And of course, there's still some potential to fool yourself and say, like, "Well, the results were bad, but I—" and find some way to justify your decisions. There's always some room to fool yourself. To account for that, I like to kind of zoom out and look at, "Well, do I at least sometimes notice that I've made a decision badly or negligently?" Because if you never find yourself saying, "Yep, I screwed that one up," then there's probably something wrong there, and you're probably not actually being as honest with yourself as you should be. But it's at least the right criterion to follow, even if you're not always following it perfectly.
[00:37:38] Jordan Harbinger: There are some great tests in the book, or thought experiments, I should say. One that I loved was the conformity test, aka the yes-man test. So you pretend you have one view and you see what everyone else thinks, but your view is really the opposite. And you're trying to test and see if people will just agree with you to, what, get you off their case, or agree with you just so maybe you like them more. I suppose this works really well in business, where maybe you're the boss and you want to see if somebody is going to convince you not to do something that you already know is a bad decision.
[00:38:07] Julia Galef: Yeah. Barack Obama says that he did this with some of his subordinates. I think he called it a yes-man test or yes-man radar or something. They would be agreeing with something they thought he believed, and maybe he did believe it. And then he would say, "Well, actually, you know, I'm not sure that's true anymore," or, "Actually, I don't believe that. Can you tell me why you believe it?" And then he would wait to see whether they would still stick to their guns and say they believed the thing, or whether they would say, "Oh, you know what? Yeah, you're right. It actually doesn't make sense," whether they would shift positions with him.
[00:38:41] And I like using a kind of twist on that test to test yourself for whether you're being a yes-man. It kind of works similarly: when you find yourself agreeing with someone, whether that's someone else's opinion in a strategic meeting, a strategic decision-making process about, you know, what your team should do, or whether that's a political opinion or, I don't know, an ethical judgment, just imagine to yourself that the consensus changed. Like, imagine that the person in the meeting who's advocating for, I don't know, scaling up faster told you, "Actually, you know what, I was just being devil's advocate there. I don't necessarily think we should scale up faster." Notice whether your opinion changes too, or whether you think to yourself, "Well, I don't know. I don't care about that. I think scaling up still makes sense." Or on the societal level, if you imagine the consensus about, I dunno, abortion or some other ethical issue changed, and most people disagreed with you, or, like, most people in your social group disagreed with you, would your opinion change too?
[00:39:37] And this isn't necessarily telling you whether you're right or wrong, but it is telling you how much of your opinion was downstream of someone else's opinion, or of the consensus. And that can be valuable information that might make you want to look more critically at what you actually believe.
[00:39:52] Jordan Harbinger: Yeah. There are a lot of good tests in the book. The selective skeptic test was one, the status quo bias test another. I mean, we don't have to go over all of these here, but there's a few more I'd love to cover before we end up wrapping the show. There are three rules for communicating uncertainty. And I think this is particularly important, especially when people expect certainty. And I think this will come in handy for people who have to talk to a boss or somebody on their team where there's maybe an unrealistic expectation of certainty.
[00:40:18] Julia Galef: Yeah, so in the process of writing this book and advocating for scout mindset, I spent a lot of time talking to people to try to understand, what are your hesitations about or objections to the idea of being more of a scout more of the time? Like, what am I arguing against here? I think that's a valuable process.
[00:40:37] Anyway, one of the main hesitations or objections that came up was the idea that if you try to see things realistically, that inevitably involves acknowledging uncertainty, because if you're being intellectually honest, you're not going to be 100 percent certain about everything, especially, like, plans that you've made, a particular business plan or a startup. If you're being honest with yourself, you can't be certain that it's going to work out. There are a lot of unknowns and luck involved, even if you're brilliant and trying really hard. And yeah, being a scout means seeing a lot more uncertainty than you otherwise might be seeing. And the problem with that, people feel, is that being uncertain will make you look weak and wishy-washy and unconfident, especially if you're in a position of leadership, like a manager or CEO, or in the public eye, or if you are supposed to be in a position of authority or expertise. Like, you're a consultant, you know, giving your expert opinion to your clients, or you're a lawyer or a doctor, someone who people look up to for the answer, or you're pitching your startup to investors and they want certainty.
[00:41:40] And so this is a very reasonable concern, but I think people are neglecting all of the ways that exist to talk about uncertainty without seeming weak. And so I collected a bunch of examples of people who I think are unusually good at acknowledging uncertainty in their plans, but still come off as very confident and charismatic and are very influential people. And so I was looking at: what are they doing right? How are they talking about uncertainty without losing people's confidence? I broke it down into a few things. One tip for talking about uncertainty is just to manage people's expectations about how much certainty is even possible in this area.
[00:42:19] So for example, there are a number of lawyers, I read interviews with them, and they've talked about how they communicate uncertainty to new clients about whether they're going to win the case or how much they can expect to be awarded. And bad lawyers will just give a certain-sounding estimate like, "Oh, you're definitely going to win," or, "You're definitely going to get this much money." And they don't want to do that, because that's not intellectually honest, so instead they'll say, "Look, it is not possible at this stage to determine that. And if someone else tells you they know for sure, then you should run in the opposite direction, because that person doesn't know what they're doing."
[00:42:52] And so essentially they're demonstrating confidence about the amount of uncertainty involved in a case like that at that stage. That's very effective for, if not everyone, then a lot of people: it shows them, "Oh, uncertainty is inevitable, and you shouldn't trust someone who says otherwise." So that's one tip: confidently managing people's expectations about how much certainty is possible. And then another one is to give people a plan for how to cope with uncertainty. Because often the reason people don't want to hear uncertainty from you is that it makes them feel like, "Well, I don't know what to do." You know, if I'm a patient and my doctor tells me, "Well, we don't know what's causing your symptoms, or we don't know what's going to happen to you," then I, the patient, feel like, "Well, what do I do? This is really upsetting." And that's why you want someone to just tell you what to do. So an intellectually honest doctor, even if they can't determine exactly what's causing your symptoms, will say something like, "Okay, here's what we're going to do. We're going to do these next tests, or we're going to monitor these symptoms. And, you know, if we still can't figure it out, then we'll do X, Y, Z." And consultants will often do this too. They'll say, "We can't know for sure how this plan is going to work out, but we're going to double back and check how things are going at these stages. And if things go badly, then here's our contingency plan." So they're not promising that things will go well, but they're promising that they've thought a lot about what to do and how to manage the uncertainty. And that helps a lot too.
[00:44:14] And then the last tip for talking about uncertainty while seeming confident is just to find different ways to be inspiring, because often the reason people don't want to acknowledge uncertainty is that they want to inspire people. Like, you know, they're founders talking about their vision for their company to investors or whatever, and they want to be inspiring. But there are a lot of ways to do that without claiming to be more certain than you actually can be. So for example, a friend of mine, who's, I think, an excellent scout and a very probabilistic thinker, very aware of uncertainty, has a startup incubator with a number of startups that he's helping get on their feet. And I asked him, "Does your awareness of uncertainty make it hard for you to inspire employees or investors?" And he said, "No, not at all, because there are lots of ways to do that that don't require certainty." For example, one of his companies is developing apps to help people with depression and anxiety. And so he says, "You know, I talk about people that our apps have helped. I tell these inspiring stories about people who have been helped by our company, and that doesn't require any intellectual dishonesty, and it's very moving." Or he'll talk about why he personally cares, why he's so passionate about this company. Or he'll even paint a vision for what is possible, what the company could achieve in five years, without claiming it's guaranteed, because you can't actually claim that.
[00:45:31] And so I think a lot of what's going on here is just that people haven't really looked very hard for ways to be charismatic and inspiring without being intellectually dishonest. But I think there are a lot of those ways if you just look for them. So those are my three tips: manage people's expectations about how much certainty is possible; give people a plan for how you plan to cope with uncertainty, or how they can cope with it; and find ways to inspire that don't require, you know, claiming to be certain about things you can't be.
[00:46:00] Jordan Harbinger: This is The Jordan Harbinger Show with our guest Julia Galef. We'll be right back.
[00:46:05] This episode is sponsored in part by Nuun. When you work up a sweat, say from dancing at a music festival, I remember those, or from working on that fitness, you lose vital electrolytes and minerals that your body needs in order to keep moving and recover efficiently. Nuun Sport is optimized for hydration and mineral replenishment before, during, and after a workout. You just drop a fizzy tablet into your water bottle to support your hydration anytime, anywhere. Handy at concerts and festivals. Handy if you're sick of drinking water and only water on an airplane somewhere. Nuun Sport is made with only one gram of sugar and carefully sourced premium ingredients that are certified non-GMO, gluten-free, and vegan. Available in 13 delicious flavors, including fan favorite cherry limeade, which has an extra boost of caffeine, if you really want to work out full tilt.
[00:46:48] Jen Harbinger: To get your game-changing hydration, visit nuun.life/jordan. That's N-U-U-N.life/jordan, and enter code JORDAN for 25 percent off your first order.
[00:46:57] Jordan Harbinger: This episode is also sponsored by Bourbon Time. There's no denying that this past year has had us all spending way too much time at home, usually inside. And because a lot of us, including myself, are working from home too, it's made each day string together and feel really exhausting. I don't know about you, but I barely left my kitchen. A lot of us have found ourselves blurring the lines between work and rest, which took a toll on our energy. The folks at Knob Creek see this phenomenon happening, and they want to help us reclaim our evenings. Beat the burnout and take back the hour of 6:00 to 7:00 p.m. That's the one hour where you just let yourself do whatever makes you the happiest. I find myself outside reading, walking around, getting a little sun, popping on an audiobook or a podcast, and just doing a couple laps around the old neighborhood. I'm resting my vocal cords from talking all day and getting those steps in, working on that dad bod.
[00:47:43] Jen Harbinger: We're leaving burnout behind starting now. Join us in reclaiming 6:00 to 7:00 p.m. as the happiest hour, so you can do whatever it is that makes you happy. And if that involves a glass of bourbon, remember to drink Knob Creek responsibly. Jordan, take it away.
[00:47:56] Jordan Harbinger: Knob Creek Kentucky Straight Bourbon Whiskey, Kentucky Straight Bourbon Whiskey with natural flavors, and Straight Rye Whiskey, 45 to 60 percent alcohol by volume. Copyright 2021 Knob Creek Distilling Company, Clermont, Kentucky.
[00:48:05] This episode is also sponsored in part by Progressive. Progressive helps you get a great rate on car insurance, even if it's not with them. They have this nifty comparison tool that puts rates side-by-side, so you choose a rate and coverage that works for you. So let's say you're interested in lowering your rate on your car insurance: visit progressive.com to get a quote with all the coverages you want. You'll see Progressive's rate, and then their tool will provide options from other companies, all lined up and easy to compare, so that all you have to do is choose the rate and coverage that you like. Progressive gives you options so you can make the best choice for you. You could be looking at saving money in the very near future. More money for, say, a pair of noise-canceling headphones, an Instant Pot, more puzzles, whatever brings you joy. Get a quote today at progressive.com. It's just one small step you can do today that can make a big impact on your budget tomorrow.
[00:48:51] Jen Harbinger: Progressive Casualty Insurance Company and affiliates. Comparison rates not available in all states or situations. Prices vary based on how you buy.
[00:48:58] Jordan Harbinger: Hey, thanks so much for listening to this show. We work really hard on it. We try not to cram a bunch of commercialism down your throat, but we do need to keep the lights on somehow. And we love it when you support those who support this show. We put all the codes, all the URLs, everything you need to get discounts on these products. They're all in one place. jordanharbinger.com/deals is where you can find it. So all those codes, everything, you don't have to remember that. I know you're probably running, walking outside, doing a little bench press. You can go to jordanharbinger.com/deals and find all of that in one place. And again, please do consider supporting those who support us.
[00:49:31] Don't forget we've got worksheets for many episodes. If you want some of the drills and exercises talked about during the show, those are also in one easy place. The link to the worksheets is in the show notes. jordanharbinger.com/podcast is where you can find all of those. And now for the conclusion of our episode with Julia Galef.
[00:49:49] Now there's a couple of other really interesting concepts in here that I think are highly useful. One was how we misunderstand arguments and our brains sort of match them with arguments that we've heard before.
[00:50:01] Julia Galef: Yeah.
[00:50:01] Jordan Harbinger: And we've already decided that we disagree with. That is really insightful, because I feel like people do that all the time and don't even notice. It just slides right under the radar.
[00:50:10] Julia Galef: Yeah. I'm so glad you brought that up because I also think that's an important point and no one ever asked me about that. So I appreciate the chance to talk about it. Yeah, so I was talking about that in the context of why it's so hard to learn from disagreements with people with different views from you, because I think we kind of expect it to be easier than it is. By which, I mean, we tend to have this implicit assumption that well, if people are being reasonable and arguing in good faith, then it should be clear who's right, and that person should change their mind. And so if that doesn't happen, then you get frustrated and you feel like, "Well, these people are being unreasonable and they're stubborn and close-minded," and you kind of write them off. That assumption kind of backfires and actually does make it harder for you to learn from disagreements because you end up thinking that everyone is being stubborn and close minded.
[00:51:02] When in reality, it's just really hard, even if everyone's smart and reasonable and arguing in good faith, which is already not a super realistic assumption. But even if those conditions are met, it's really hard. There are a number of reasons why; partly it's because our beliefs are all kind of interconnected. They're this network of related beliefs about, you know, what kind of sources do you trust? Which news sources or pundits do you think are reasonable? And, you know, you have all these background assumptions about how science works or doesn't work. And so if you want to change one particular belief of your own or someone else's, it kind of requires changing all of these related beliefs around it. So that's one big reason it's hard to learn from disagreements, but another reason is the one you brought up: that we tend to map what we hear onto things we're already familiar with.
[00:51:52] I'll give you one example from my life. There's this well-known decision expert named Gary Klein, who's written a number of popular books about decision-making in the real world, and I've learned a lot from Gary Klein. I cite him in the book, and he's also changed my mind about a number of things, including pointing out some of the problems with academic studies of decision-making, which I hadn't noticed before. He's really valuable, but I completely discounted him for years after I first heard of him, because of one of his books. I forget the exact title, but it's something like The Power of Intuition, and in it he talks about the power of intuitive decision-making. I heard that, and I knew that he was critical of academic studies, and so I immediately mapped him onto this archetype of people I was already familiar with, who say things like, "You know, don't trust science, just listen to your gut."
[00:52:43] Jordan Harbinger: Right.
[00:52:43] Julia Galef: Scientists may tell you that vaccines don't cause autism, but if you believe that they do, then you should just trust your own intuition. And I think that's ridiculous. And so I just lumped him in with those people and didn't really listen to what he was saying. And it wasn't until years later, when I read his papers more closely, that I was like, "Oh, this guy's actually being very reasonable. He's not saying what those people are saying." He's just talking about this kind of intuitive pattern matching that our brains do. Kahneman called it System 1 decision-making. It can be very accurate in many contexts, and certainly more useful than slow and deliberate System 2 reasoning in many contexts. It's not perfect, of course, but neither is System 2 reasoning.
[00:53:21] And so Gary Klein was just talking about when intuitive decision-making is valuable, because it often is. And that's a perfectly reasonable thing to say, and it's very different from the people who reject all scientific evidence entirely. But that kind of thing is often in play when you hear someone's argument and immediately write it off as being stupid or unreasonable. I think often what happens is you're pattern matching it to something you're already familiar with that's stupid, and you're not actually hearing the nuances of what that person is saying.
[00:53:51] Jordan Harbinger: There's a tendency to do that, right? Because it's easier instead of saying, "Oh, I have to reevaluate this belief and see if what this person is arguing makes sense. And do I have to change my beliefs?" We just go, "Ah, that's similar to this other thing I've already decided sucks, next."
[00:54:04] Julia Galef: Right.
[00:54:05] Jordan Harbinger: Right.
[00:54:05] Julia Galef: Right. And we don't usually realize we're doing it.
[00:54:08] Jordan Harbinger: No, no, no. Of course that dialogue, or monologue, doesn't happen explicitly in our heads. It's just that it's easier for us to say, "Eeh, I don't really—" Like for me, I spent years not trying things like sushi or other foods — this might even be a good example. A lot of people from my area, Michigan, we don't try sushi. Why? Raw fish? Well, that can't be good. I've smelled raw fish. That's disgusting. Why would I want to eat that? And it's like, well, it's easier to just say raw fish is gross and unhealthy and dangerous than it is to go, "Millions of people eat this, maybe I'm wrong. Let me try it, test it, evaluate whether or not I've been wrong this whole time, and then change my belief accordingly." That's really tough. It's easier to say, "Raw fish, gross, next. Let's go out for burgers and steaks."
[00:54:52] A lot of things also come down to identity, right? Our identity is often wrapped up in beliefs and you touched on this before. I'd love to get some examples of identity wrapped up in beliefs and maybe what we can do about this, because that, well, one it's dividing our entire freaking country right now, this kind of thing. But also I think one of the chief ways that people are wrong is they simply have way too much of their identity wrapped up in their system of belief.
[00:55:17] Julia Galef: Yeah. So the most familiar examples of beliefs that become part of our identity are our politics and religion, being a liberal or conservative, or libertarian, or an atheist or a Baptist. Those beliefs are part of our identities in the sense that we feel proud to be an atheist or a Baptist or a liberal, et cetera. You know, we really take it personally when someone disagrees, and it much more quickly becomes an attack on us as a person, as opposed to an attack on something we happen to believe. And so this is why there's the common etiquette rule not to bring up politics or religion at a social gathering or on a date.
[00:55:55] So those are very recognizable examples, but really any issue can become part of your identity. Your views about whether breastfeeding is beneficial, or your decision to have children or not, or, to take something less baby-related, your views on scientific evidence or on nutrition. Or, I lived in the Bay Area for several years, and as I quickly learned, people's views on programming languages, and which programming languages are better than which other ones, can be very, very identity laden and provoke the same kinds of righteous and defensive reactions.
[00:56:31] Jordan Harbinger: That show Silicon Valley, did you ever see that?
[00:56:34] Julia Galef: You know what? I watched a few episodes of it and I couldn't continue because it was too true to life. That it was actually a little bit painful. So I think it's a great show, but I haven't seen most of it.
[00:56:43] Jordan Harbinger: There's an episode where one of the chief guys, he's one of the founders, he's just like so nerdy and he's hapless/hopeless. And he meets this girl and everyone's like, "This is BS. This girl is attractive. There's no way she likes you." And they're all kind of like, "What's wrong with her if she likes you?" And she's perfect. And she's pretty. And she loves coding and programming. And then they're talking about whether she uses tabs or spaces to, like, do a new line of code or whatever. And she's like, "Well, tabs, it's faster. It's one keystroke." And he's like, "Yeah, but spaces are more precise. And you know, you can map them one to one—" I don't know any programming, so I don't really know. They can't reconcile this difference between tabs or spaces, and they get into these knock-down drag-out fights and end up breaking up over tabs versus spaces.
[00:57:25] Julia Galef: I really wish I'd seen that episode before writing my book because I would have included that as an only slightly caricatured example of what happens.
[00:57:35] Jordan Harbinger: Yeah. I mean, it's only slightly, right? You have people that watch, like, Star Trek and they're like, "Oh, the new one, that's not canon. It's not part of the storyline. It's totally separate." And people are like, "Well, wait a minute, it still has Spock in it." And they're just arguing back and forth on the Internet. Or Star Wars, the same thing: "Oh, this is an offshoot. It doesn't count. It's not part of the same timeline. They should never have created it." Like, any sort of fantasy universe has this. So what do we do about this? We obviously need to figure out a way to maybe not map beliefs to identity, so that we can hold them more lightly. Is that even possible?
[00:58:08] Julia Galef: Yeah. I don't think it's possible to completely detach your identity from your beliefs. In the book, I talk about how there's this famous essay by Paul Graham, the tech investor and entrepreneur and essayist. It influenced a lot of people when it first came out 10, 15 years ago. It's called "Keep Your Identity Small." He's basically pointing out this phenomenon that I've been talking about, and he says, all else equal, you want to let as few things into your identity as possible if you want to be able to think as clearly as possible. And so I, and a lot of people I know, read that and were like, "Yeah, that makes a lot of sense. I'm going to try to avoid labeling myself in any way, so that I don't distort my thinking. So I'm going to avoid political labels. I'm going to avoid calling myself a vegan or an atheist." And the problem is just that this is not logistically feasible. Labels are just practically useful for characterizing your beliefs and your choices. And just emotionally, it's probably not realistic to get rid of all feelings of attachment to your beliefs about things like politics.
[00:59:07] So instead, what I try to do now, and what I advocate for, is just trying to hold your identity more lightly, trying to maintain a little more emotional distance from your beliefs. You can still call yourself, you know, a feminist or whatever you want, but try to hold that more lightly. That involves things like reminding yourself that the label is contingent. So, yes, I'm a feminist now because that label best describes my beliefs, but if I were to come to the conclusion that feminism is actually wrong or harmful, then I would not be a feminist anymore. Essentially, you're trying to make the label feel like just a label, and not like a flag that you're waving or a badge that you're wearing.
[00:59:49] There's some ways to, I think, make that shift happen more easily. One thing that I really like to do is called the ideological Turing test.
[00:59:57] Jordan Harbinger: Okay. Can we explain what a Turing test is? I don't know if everyone knows what a Turing test is.
[01:00:01] Julia Galef: Right, right. So it's named after the Turing test, which was a kind of theoretical experiment proposed by Alan Turing, in which he said, "This is how you could determine whether an artificial intelligence is actually as intelligent as a human: you have a human interact with the AI and also with other humans, and see if the human can tell the difference, like pick out which one of these people they're talking to is actually just an AI. And if humans can't tell, then functionally speaking, that AI is as intelligent as a human." And so the ideological Turing test is a play on that. It was coined by Bryan Caplan, as it happens, who came up with the phrase "rational irrationality." And the idea of the ideological Turing test is that it's a way to tell if you really understand the views of someone you disagree with: can you argue for their side so convincingly that other people couldn't tell whether you actually believe those views or not?
[01:00:54] And so it's usually talked about as kind of a cognitive test, like a test of whether you genuinely understand the other side and it is a cognitive test, but I think it's also an emotional exercise because it can be so unpleasant to force yourself to explain — you know, if you're a liberal, it can be so unpleasant to try to argue for conservative positions without strawmanning them, without caricaturing them or without adding your own editorializing about how evil or stupid they are.
[01:01:23] Jordan Harbinger: Right.
[01:01:23] Julia Galef: And just explain the views as a conservative would, it's actually really hard and people are often resistant to even try it because it feels like giving aid and comfort to the enemy.
[01:01:32] Jordan Harbinger: Sure, because otherwise when they do that, they go, "Oh, I just really don't like poor people and immigrants are bad. We should just throw them in cages." And it's like, "Okay, obviously you don't really believe this. Got it. You're caricaturing the other side."
[01:01:45] Julia Galef: Yeah.
[01:01:46] Jordan Harbinger: But if you can do it and say, "All right, well, no one can tell that I'm just arguing this as part of a Turing test," then it shows that you really understand their arguments. You can make them relatively convincingly. And it's disingenuous to say, "Well, it's impossible to do that because their arguments are never convincing." It's like, "Well, that's not true," but people must say that a lot, right?
[01:02:07] Julia Galef: Yeah. I mean, clearly, they're like convincing to many people.
[01:02:10] Jordan Harbinger: Right.
[01:02:10] Julia Galef: They do, yes. That's why I made that face. Like, I hear that objection a lot. And you know, it's often not feasible to actually do this in front of an audience and have them judge it. That's the ideal, but not often practically feasible. So instead, I treat the ideological Turing test as kind of a north star, a guiding principle to aim at. And the bare minimum version of this, I think, is just to ask yourself, when you're describing something you disagree with, "The way I'm describing it, does this even sound like something someone might genuinely say?" Because if you say something like, "Yeah, poor people are all terrible and deserve to die," you can at least ask yourself, "Does this sound like something a Republican would happily sign his name to? If not, I'm probably caricaturing it."
[01:02:49] Jordan Harbinger: Yeah. Yeah. I mean, you see it online all the time, right? Where people say, like, "Oh yeah, I can argue like a libtard — communism." And you're like, "Has anyone ever posted that in a discussion where we are really, in good faith, discussing these concepts? Not really." Right. So I can see this being more of a thing that you do in your head. And you have to just be really honest with yourself as to whether or not you're being reasonable and honest in this exercise, in this Turing test.
[01:03:14] Julia Galef: So the nice thing about this, I think, is it's a test. Like it's a criterion, but it's also an exercise that I think helps separate your identity from your beliefs. It focuses your attention more on understanding the different ideas involved and less on fighting for your side and putting down the other side and making them look bad.
[01:03:32] Jordan Harbinger: Right.
[01:03:33] Julia Galef: I find that just having this habit of doing this ideological Turing test in your head with views that you disagree with can just help create some of that distance that helps you hold your identity more lightly.
[01:03:42] Jordan Harbinger: Right. This makes sense. So then your identity, and I think you mentioned this before, it's not a flag that you're waving proudly. It's not a tribal belief system. You're maintaining your own beliefs. You're holding your identity a little bit more lightly, like you said, and you're knowing where your identity and your beliefs diverge. And you're okay with this and not going, "Uh-oh, I need to conform my beliefs more strongly to this group that I feel a part of, because if I don't, then I'm an outsider and that's a bad feeling," which causes us to pretend that we believe things or to believe things that we really don't or shouldn't—
[01:04:12] Julia Galef: Or to convince ourselves.
[01:04:13] Jordan Harbinger: And to convince ourselves, yeah, exactly. Okay.
[01:04:16] Julia Galef: I think another really important aspect of holding our identity more lightly is to have things to pride yourself on that aren't beliefs. We're humans. It's almost an emotional necessity to be able to feel proud of things and to have things to aspire to or try to live up to. You just don't want those to be beliefs about the world, because then you have to believe those things or else you'll feel bad about yourself.
[01:04:39] Jordan Harbinger: Things that you've convinced yourself of.
[01:04:41] Julia Galef: Right. And so I think kind of a nice way to use this property of human psychology for good instead of for bad is to pride yourself on being more intellectually honest, on being more of a scout, basically. So congratulate yourself when you notice that you were wrong about something. Feel good about that. Congratulate yourself when you're able to understand the views of someone you disagree with, or when you proactively try to prove yourself wrong about something, or try to look for evidence that might disconfirm your view. Those are all things you can deservedly feel good about that don't tie you to any particular belief. You know, you're setting up the emotional incentives so that you actually reward yourself for getting stronger and more accurate over time, instead of rewarding yourself for sticking to a particular belief. So I think that's a good way to kind of flip this principle.
[01:05:33] Jordan Harbinger: Yeah. I like that. You mentioned in the book, the more your message makes you feel good about yourself, the less likely it is to persuade someone else.
[01:05:41] Julia Galef: That's actually a quote from Megan McArdle who I really like.
[01:05:44] Jordan Harbinger: Okay. Yeah. I love that. I feel like there's probably an exception to that, but it also sort of illustrates your point, right? Like, if you're really waving that belief-identity flag around, do you really have a good logical argument for it? Or are you just saying, "Yo, this is my tribe. I'm repping my tribe right now"? Or do you have a really good set of reasons and a well-thought-out belief system or ideology, right?
[01:06:08] Julia Galef: Well, I mean, it's also just practically speaking if you do want to convince anyone that they're wrong. It's just a bad strategic choice to make your case in a very self-righteous and self-congratulatory way, because even if you are right, even if you do have good reasons for holding your belief, if you say it just in a tone dripping with condescension, then you're not actually going to change anyone's mind. So yeah, I think that's another — people often worry that like, "Well, you have to hold your belief strongly, or you'll never change the world. You'll never win over hearts and minds," but in practice, most of the time holding your beliefs really strongly tied to your identity makes you worse at effecting change because it makes people not want to listen to you.
[01:06:49] Jordan Harbinger: Exactly. Yeah. And I think there's something there. Thank you very much for helping us make better decisions and think more clearly. It's a super important skill. I really appreciate it. The book is called The Scout Mindset. We'll link to it in the show notes. Is there anything that you want to make sure you deliver that I haven't brought out just yet?
[01:07:05] Julia Galef: I guess one thing I want to leave people with is this: I was just talking about the importance of setting up the incentives so that you're rewarding yourself for things that actually help you in the long run instead of hurting you. And I think an important piece of that is, when you notice you're in soldier mindset, you should feel good about that. That sounds paradoxical, because it's bad to be in soldier mindset or whatever, but soldier mindset is just so universal and innate and baked into how the human brain works. If you don't ever notice yourself getting defensive or rationalizing something, then that's not actually a good sign, because odds are you're not the one human who's never in soldier mindset. It's much more likely that you're just really bad at noticing. And so you should reward yourself emotionally, congratulate yourself, for noticing when you're defensive or rationalizing, et cetera, because that makes you better at noticing. And that's actually a step in the right direction.
[01:07:55] Jordan Harbinger: Julia, thank you so much. I love this kind of stuff. The ability to make better decisions, there's kind of no real downside to this. I mean, your ego might take a hit, but your business is going to be better. Your relationships are going to be better: your friendships, marriage, everything. Your relationship with your kids is probably going to be better. I mean, this is kind of one of those skill sets that applies everywhere, both at work and at home, and you can't really ever be too good at this.
[01:08:17] Julia Galef: It's great to hear you say that. You're preaching to the choir, but yeah, thank you. It's been a great conversation. You ask great questions, and it's a pleasure to be on the show. Thank you.
[01:08:27] Jordan Harbinger: I've got some thoughts on this one, but of course, before we get into that: most of us have big goals that we'd like to accomplish, anything from getting in better physical shape, to quitting a lifelong vice, to learning a new language. Habits Academy creator James Clear shares processes and practicals we can use to incrementally change our own lives for the better. Here's a quick bite.
[01:08:46] James Clear: It's not a single one percent change that's going to transform your life. It's a thousand of them. Whenever I feel like giving up, I think about the stonecutter who pounds a stone a hundred times without a crack showing, and then on the hundred and first blow, it splits in two. And I know that it wasn't the hundred and first that did it; it was all the hundred that came before. Newsworthy stories are only about outcomes. When we see outcomes all day long on social and on the news, we tend to overvalue them and overlook the process. Like, you're never going to see a news story that's like, "Man eats salad for lunch today." That's just not a story. It's only a story six months later when a man loses a hundred pounds.
[01:09:21] The real reason habits matter is because they provide evidence for the type of beliefs that you have about yourself. And ultimately, you can reshape your sense of self, your self-image, the person that you believe that you are, if you embody the identity enough. A lot of people watch too much TV or don't want to play as many video games as they do, or whatever. If you walk into pretty much any living room, where do all the couches and chairs face? They all face the TV. So it's like, what is this room designed to get you to do? You could take a chair and turn it away from the television. Or you could also increase the friction associated with the task. So you could take the batteries out of the remote so that it takes an extra five or 10 seconds to start it up each time. And maybe that's enough time for you to be like, "Do I really want to watch something? Or am I just doing this?"
[01:10:01] The point here is, do you want to build a good habit? You've got to make it obvious. If you want to break a bad habit, you just make it invisible. Your entire life, you are existing inside some environment, and most of the time you're existing inside environments that you don't think about, right? And in that sense, you're kind of like the victim of your environment. But you don't have to be the victim of it. You can be the architect of it.
[01:10:21] Jordan Harbinger: For more with James Clear, including what it takes to break bad habits while creating good ones and how to leverage tiny habits for giant outcomes, check out episode 108 on The Jordan Harbinger Show with James Clear.
[01:10:35] You know, surprisingly, I really enjoyed hearing about where I'm wrong and where many of us are wrong. Annie Duke taught us thinking in bets back in episode 40 of the show: how we can use percentages of confidence. We should be asking ourselves, "How confident do I feel that X, Y, Z is the case?" Instead of just choosing one outcome or another, we put percentages to our predictions, and that gives us the ability to evaluate our own confidence and to change our minds as well.
[01:11:02] Also, Julia mentioned in the book that there are ripple effects of bias or misjudgment. This is something I hadn't thought about before, but it is true. Let's say that we don't think we have any bias. We did talk about that a little bit with Daniel Kahneman in episode 518. But when we think that we are not biased and others are, we view ourselves as perfect, and we may be less patient with others. Also, if we have one bias, it can lead to other biases later on down the line. Cascading bias is what this was called, and it's a fascinating concept where being wrong in one way can lead us to be wrong in many other ways that just make us more and more wrong. No surprise there. So the key is to mitigate bad decisions and bad thinking early on in the decision chain or the results chain.
[01:11:44] Also covered in the book is something called de minimis. It's interesting, and I'm not going to get into the whole thing here, but one of the main takeaways for me was that it's easy to explain away conflicting evidence if we're doing it one piece of evidence at a time. Let's say we're involved in an MLM, right? And it's not really working, but we brush that aside because other people around us who are in this multi-level marketing scheme say that it's going to work and it's going to be great. So we brush aside the fact that it's not working for us, and we make an excuse. And then some of those people start to say, "Well, you know what, this isn't working for me either," but we brush that aside too, right? Because it's just a little thing, and it doesn't really make that big of a difference; all these other people believe. And then we start to see that the founders have moved from one MLM to another, and that those past ones had really bad reputations and were shut down, something like that. But we brush it aside because it's just a little thing amidst all these other factors.
[01:12:34] So what we can do here instead is put all of the conflicting evidence together and see if it changes our conclusion. Because if we just get little drips of inconsistencies, of negative evidence, of evidence that something we don't want to be true is true, we might be able to brush each one aside. But if we collect all of it, put it together, and present it to ourselves in one go, would that change our mind? Would that change our decision? The answer is often yes. And this is actually a really common tactic with things like interventions. People say, "Oh, I only missed a couple of days of work," or, "Well, it's only been recently that this is happening." But when you do the intervention right, you take all of these transgressions from all these different people and you stack them together. And sometimes that can be the reason that somebody finally sees an issue, because it's not just their mom complaining one day and their dad complaining another day and their brother or sister complaining on the weekend. It's a bunch of people putting evidence right in your face all at once. This can often sway us. So do it with your own observations and your own decisions. If you have conflicting evidence, collect it. If it's not persuasive one bite at a time, see if it's persuasive when added all together.
[01:13:45] Big thank you to Julia Galef. The book is called The Scout Mindset. Links to her stuff will be in the show notes. Please do use our website links if you buy the book; that always helps support the show. Worksheets for the episode are in the show notes. Transcripts are in the show notes. There's a video of the interview going up on our YouTube channel at jordanharbinger.com/youtube. We've also got a brand new clips channel with cuts that don't make it to the show, or just highlights from the interviews you can't see anywhere else. jordanharbinger.com/clips is where you can find that. I'm at @JordanHarbinger on both Twitter and Instagram, or just hit me on LinkedIn.
[01:14:19] I'm teaching you how to connect with great people and manage relationships using systems and software and tiny habits, the same stuff I've used for years, and I wish I'd used it for twice as long. It's our Six-Minute Networking course. The course is free. Just go to jordanharbinger.com/course and learn how to dig the well before you get thirsty. Most of the guests on the show subscribe to the course. Come join us; you'll be in smart company.
[01:14:40] This show is created in association with PodcastOne. My team is Jen Harbinger, Jase Sanderson, Robert Fogarty, Millie Ocampo, Ian Baird, Josh Ballard, and Gabriel Mizrahi. Remember, we rise by lifting others. The fee for the show is that you share it with friends when you find something useful or interesting. If you know somebody who loves to make better decisions or is really interested in rational thinking, share this episode with them. I hope you find something great in every episode of this show. Please share the show with those you care about. In the meantime, do your best to apply what you hear on the show, so you can live what you listen, and we'll see you next time.