Richard Clarke (@ghsrm) is the former National Coordinator for Security, Infrastructure Protection, and Counter-Terrorism for the United States, and is the co-author of Warnings: Finding Cassandras to Stop Catastrophes. [Note: This is a previously broadcast episode from the vault that we felt deserved a fresh pass through your earholes!]
What We Discuss with Richard Clarke:
- What is the Cassandra Coefficient?
- Discover how to use the Cassandra Coefficient to filter signal from noise when it comes to warnings.
- Understand how to spot your own cognitive biases and what you can do to diminish their effect on your decisions.
- Learn to persuade people to see things from your perspective and motivate them to take action.
- Find out what someone at the top levels of government does when their personal politics don’t agree with those of the current administration.
- And so much more…
Like this show? Please leave us a review here — even one sentence helps! Consider including your Twitter handle so we can thank you personally!
Predicting the future is a tough job, but getting colleagues to agree on what to do about protecting that future is doubly so. Just ask Richard Clarke, who, as National Coordinator for Security, Infrastructure Protection, and Counter-Terrorism, warned us about the inevitability of September 11th — and bin Laden’s likely involvement — when nobody would listen. If you’re wondering how he could have possibly predicted such a tragedy, Richard joins us to talk about the book he recently co-authored, Warnings: Finding Cassandras to Stop Catastrophes.
Listen to this episode in its entirety to learn more about what someone at the top levels of government does when their personal politics don't agree with those of the current administration, what happens when someone in such a position gets "agenda inertia," what Richard means by "First Occurrence Syndrome" and how it makes preventable disasters — from Saddam Hussein's invasion of Kuwait to 9/11 to Fukushima — more likely, how we can use the Cassandra Coefficient to filter signal from noise when it comes to warnings, how to spot our own cognitive biases and what we can do to diminish their effect on our decisions, how to tell a "Cassandra" from a kook, how we can persuade people to see things from our perspective when we find ourselves in the Cassandra position and motivate them to take action, and lots more. Listen, learn, and enjoy!
Please Scroll Down for Featured Resources and Transcript!
Please note that some of the links on this page (books, movies, music, etc.) lead to affiliate programs for which The Jordan Harbinger Show receives compensation. It’s just one of the ways we keep the lights on around here. Thank you for your support!
Sign up for Six-Minute Networking — our free networking and relationship development mini course — at jordanharbinger.com/course!
This Episode Is Sponsored By:
- Adore Me: Shop now at adoreme.com
- BiOptimizers Biome Breakthrough: Get 10% off with code JORDAN10
- Progressive: Get a free online quote at progressive.com
On the Mac Geek Gab Podcast, Dave Hamilton and John F. Braun come together weekly to answer your questions and discuss things of interest to Apple and Mac geeks. Listen here or wherever you find fine podcasts!
Miss the conversation we had with Gift of Fear author and security legend Gavin de Becker? Catch up with episode 329: Gavin de Becker | The Gift of Fear Part One here!
Thanks, Richard Clarke!
If you enjoyed this session with Richard Clarke, let him know by clicking on the link below and sending him a quick shout out at Twitter:
Click here to thank Richard Clarke at Twitter!
Click here to let Jordan know about your number one takeaway from this episode!
And if you want us to answer your questions on one of our upcoming weekly Feedback Friday episodes, drop us a line at friday@jordanharbinger.com.
Resources from This Episode:
- Warnings: Finding Cassandras to Stop Catastrophes by Richard A. Clarke and R.P. Eddy | Amazon
- Good Harbor Security Risk Management
- Good Harbor | Twitter
- The Myth of Cassandra | Greek Mythology
- What Is the Cassandra Coefficient? | Quora
- Debunking the Voter Fraud Myth | Brennan Center for Justice
- Myths and Facts About Immigrants and Immigration | Anti-Defamation League
- Could Sea Level Rise Swamp Cities within a Century? | Scientific American
- Madoff Whistleblower: SEC Failed to Do the Math | Morning Edition, NPR
- Michael Hastings Conspiracy Theories: Web Goes Wild after NSA, CIA Reporter Killed in Crash | New York Daily News
- Did the Oklahoma Bomber Have Help from Al-Qa’ida’s Explosives Expert? | The Independent
620: Richard Clarke | Warnings, Cassandras, and Catastrophes
[00:00:00] Jordan Harbinger: Coming up next on The Jordan Harbinger Show.
[00:00:02] Richard Clarke: When I think about risk management, I always ask the question of what could happen. And when somebody says, "Oh, that couldn't happen." "Say why? What's preventing it from happening?" And they always say, "Well, we have this system in place." The next question is, "Okay, how good is your system that's in place? If I were a malevolent actor and I wanted to make this bad outcome occur, what could I do to your system that's in place preventing it from happening?" And if people are frank, they always admit there's a way.
[00:00:39] Jordan Harbinger: Welcome to the show. I'm Jordan Harbinger. On The Jordan Harbinger Show, we decode the stories, secrets, and skills of the world's most fascinating people. We have in-depth conversations with people at the top of their game, astronauts, entrepreneurs, spies, psychologists, even the occasional Fortune 500 CEO, legendary Hollywood director, or a tech mogul. Each episode turns our guests' wisdom into practical advice that you can use to build a deeper understanding of how the world works and become a better critical thinker.
[00:01:09] If you're new to the show, or you want to tell your friends about the show, we've got our episode starter packs. They're collections of top episodes organized by topic. That'll help new listeners get a taste of everything that we do here on the show. Just visit jordanharbinger.com/start to get started. These playlists are also available on Spotify if you want to have a look for them there. Just search for The Jordan Harbinger Show right there in Spotify.
[00:01:31] Today, Richard Clarke, the former National Coordinator for Security, Infrastructure Protection, and Counter-terrorism for the United States, which to me, honestly, should be three separate jobs, but I guess not. Anyway, he served under the Reagan, Clinton, and both Bush administrations, so he is not new to the game. He predicted the bin Laden attack, you know, September 11th. He's the man who warned us about September 11th when nobody would listen. So this is a guy who historically can spot things, at least disasters, pretty damn well. On this episode, we'll learn how to filter signal from noise when it comes to warnings, using something called the Cassandra coefficient. It's a matrix of cognitive biases and other factors that influence the way we think and make decisions. We'll explore some cognitive biases of our own — you know how I love my cognitive bias talk — and learn how they function as well as what we can do about those biases. And we'll uncover some tabletop exercises to persuade people to see things the way we see them and motivate them to take action. Lots of interesting principles in this one. Enjoy this episode with Richard Clarke.
[00:02:32] And if you're wondering how I managed to book amazing people like this every single week, it's because of my network. And I'm teaching you how to build your network for free over at jordanharbinger.com/course. Now, this course is about improving your networking and connection skills and also inspiring others to develop a personal and professional relationship with you. It'll make you a better networker, a better connector, and a better thinker. That's jordanharbinger.com/course, totally free. And by the way, most of the guests on the show already subscribe and contribute to the course. So come join us, you'll be in smart company where you belong.
[00:03:06] Now, here's Richard Clarke.
[00:03:10] Having left government, do you find now that life is easier because you can use things that you buy in stores instead of having to have them examined?
[00:03:18] Richard Clarke: Life is easier outside of government in general, but if you live in the top-secret code-word world in government, there are all sorts of things you cannot do.
[00:03:29] Jordan Harbinger: What are some of those things? I'm curious. Is it like you can't check your email from an Internet cafe? Obviously, that goes without saying. What's the most inconvenient thing that you've found?
[00:03:39] Richard Clarke: There's a whole list. I mean, when you're in government, let's say you want to go on your vacation to Mexico or Canada, you have to file an application to leave the country. They may or may not get around to approving it before your vacation time rolls around. Little annoyances all the time.
[00:03:58] Jordan Harbinger: Yeah. I suppose if you're going to Russia, you'd probably have to apply two years in advance and they might just say no.
[00:04:03] Richard Clarke: Yeah, they'd probably just say no. In the end, the big difference of being outside of government is you don't have any obligation anymore to worry about what's going on in the world. I still worry about it, but it's not my responsibility. You know, something can go really bad and I don't have to solve it. I don't have to be available at two o'clock in the morning when something goes bad on the other side of the world. If a phone call occurs at two o'clock in the morning now, it's a wrong number.
[00:04:29] Jordan Harbinger: Right. And then you go, "Why do I still have a landline? I don't work for the government anymore. What is this thing even doing in here?"
[00:04:35] Richard Clarke: Yeah. Exactly, exactly.
[00:04:36] Jordan Harbinger: I thought, wow, you know, you don't have to deal with saying something to a president and then they're not listening to you, or the call comes in, like you said, and now you're on the hook for something and everyone's looking at you. That's a relief. On the other hand, you also maybe don't have the same amount of influence you had when you were National Coordinator for Security, when you were on the hook for stuff. And then on the other, other hand, you can also be on the hook for something, have predicted things in advance, told all the right people, and then no one listened to you, which is kind of what Warnings is about.
[00:05:10] Richard Clarke: Yeah, that's certainly true. You can still have influence on the outside. It's obviously a lot less. If you have a responsive president and you're in the job I had, you could walk down the hall, ask to see him, probably see him relatively soon thereafter, tell him something, he'd say yes, and then you had policy and you could go do something. There's no substitute for that, but you can still have influence on the outside.
[00:05:34] And the great thing about not being in government and trying to influence the world is you can say whatever the hell you think. You don't have to clear it with 27 people. You know, when you're a senior White House official and you go on TV or you give a speech or you go to the Congress, you can't say what you believe. You can only say what the party line is. Sometimes that is at odds with what you believe. And every time that occurs you have to ask yourself, "Is this the time I say no?" Is this the time I say, "You can take your job and shove it"? You know, we all knew going into White House positions that would happen. What I tell people now is, when you go into a White House job, know where your red lines are in advance, know what you won't do, know what you won't say in advance, because once you're there, you know, it can easily become a case of situational ethics.
[00:06:27] Jordan Harbinger: Right. So if you don't have that line set up in advance, you try to make the line on the fly. And then, of course, each time you have to do something a little bit shady, you go, "Well, you know, this time, maybe this is different and maybe I won't do this again."
[00:06:41] Richard Clarke: I think you see this now with the National Security Advisor, H.R. McMaster, who's a very good guy by reputation. I don't know him, but a lot of my friends do. An honest guy, a guy who used to tell truth to power, maybe he still does, but he has to say things publicly that clearly make him uncomfortable. His friends say he's making a trade-off. He's deciding, "How much of my personal reputation am I willing to lose in order to stay in the job to prevent something crazy from happening?" So here he is paying a personal price in terms of his place in history, his personal reputation. He's not paying it for himself. He's paying it for the opportunity to stay there down the hall from the president and moderate the president's behavior, and maybe prevent him from doing something crazy when he wakes up at three o'clock in the morning and wants to bomb somebody. That's a very interesting dynamic, because in a way it's something a lot of people in the White House have suffered from. No matter who the president is, no matter how long you serve there, there's always that issue of, "Okay, I'll do this because I'm egotistical enough to think if I continue in the job, I will be able to make things better."
[00:07:56] Jordan Harbinger: So you really have to decide, okay, do I pull the ripcord now? This thing flies wide open possibly in the media and something gets done about it. It's like an undercover agent. Do I blow my cover now and stop this particular crime or this particular attack or this particular whole other thing from happening? Or do I stay here and keep my cover because something even worse might be coming later on down the line that I might be in a position to prevent?
[00:08:20] Richard Clarke: Yeah. I mean, that was the dilemma I had with the Bush 43 administration. I had served Bill Clinton for eight years. I had served Bush 41 before that. So I'd been in the White House for a while and the Bush 43 administration came in and I knew them all from the Bush 41 administration. And they said, "Oh, you're still here. Oh, good, we know you. We like you. We trust you. Can you stay?" And I probably should have said no, but I was arrogant enough again, to think that I could influence things. And I had a number of jobs, including killing bin Laden, which weren't done yet. And so I stayed and it became clear to me by the end of May, they came in January, that they were not going to take terrorism seriously enough. I probably ought not to be there.
[00:09:06] Maybe if I left, they could get one of their own people to do the terrorism job and maybe they'd believe him or her. So I asked in June, at the beginning of June, to be relieved. They were shocked by that. It had somewhat of the effect I wanted to have because it did shock them. And they said, "Why?" I said, "Well, you guys don't understand this issue. You're not paying enough attention to it. Something bad is going to happen. And, you know, I don't want to be the guy on duty when that happens, because you didn't do enough." And they were set back a little by that, but not enough. But they agreed I could be reassigned on October 1st.
[00:09:41] Jordan Harbinger: October 1st, 2001.
[00:09:43] Richard Clarke: Unfortunately, September 11th occurred between the two. I took the reassignment in October and I probably just should have quit the government altogether. But I had some things I wanted to get done. And one of them was to create a cybersecurity campaign, a cybersecurity program in the government. And I did spend about 18 months, two years, putting together the first national strategy on cybersecurity. And about the time I was done with that, it was very clear that Bush was going to invade Iraq no matter what. And so I gave him the national strategy. He signed it, he approved it, and I quit the next day because I did not want to be in the White House, in an administration that was doing something as ridiculously stupid and counterproductive as invading Iraq.
[00:10:29] Jordan Harbinger: You predicted the bin Laden attack. Take us through this. You knew that something was coming. How specific were you able to get, and how come people did not listen? Because actually, your reputation precedes you as a guy who can really get a lot of stuff done in government, kind of by any means necessary, ruffling some feathers here and there. And I read about this particular thing, coincidentally, in a book about cybersecurity, where you're sort of portrayed as, "And then Dick Clarke comes in and pisses off this general and that general." And it's like, but he knew what needed to happen, and people kind of shrugged it off and said, "That's just how he works. And you know, you don't mess with it."
[00:11:09] Richard Clarke: It was because I always was hired by bosses who wanted somebody to bulldoze through the barriers, beginning in the State Department. Once you get hired to break crockery and make things happen and you succeed at that, then your next job will be the same because that's your reputation and people will hire you to do that. So I had three or four jobs in a row where my boss wanted me to shake things up, make the government more effective, make it productive on a particular issue. When you're a bureaucrat, not an elected official, it's not a popularity contest. You don't have to be popular. You just have to get the job done.
[00:11:50] I always tried to be popular with my own troops, my own staff, and I think I succeeded at that, but I didn't particularly care if people didn't like me if I was kicking them in the ass because they weren't doing their job, or firing them, or closing their program, or transferring authority to somebody else and creating a new program. In the government, you don't have a right to a sinecure. I know people think government jobs are sinecures and people don't work hard, but that's not true. In senior-level national security jobs in Washington, people work very hard and no one has a right to those jobs. I did have a reputation for breaking crockery, but I had a reputation for getting things done. And then the Bush administration comes in and says, "Yeah, we'd like you to handle this terrorism thing. We don't know anything about it. We don't have anybody from the Bush campaign who wants the job. So yeah, you do it." But it was clearly, "Yeah, you do it, and don't bother us."
[00:12:43] Jordan Harbinger: Right. Check the box off on the spreadsheet.
[00:12:46] Richard Clarke: Yeah. And four days into the administration, I sent them a memo saying we need an urgent cabinet meeting because something is about to happen and we need to get them before they get us. And that was January and they didn't get around to having that meeting until September 4th. Look, I've struggled with this for years. Why didn't they pay attention?
[00:13:03] One of the things that happens is that you get leadership in any organization, whether it's a company or a university or the federal government, that has its own agenda. They came in with an agenda, with things they wanted to get done. No one takes a senior-level job without having an agenda in mind. And so you get agenda inertia, and when some intervening variable comes in, you know, some fact, some inconvenient fact comes in off the left-field wall, maybe you have to change your agenda. Nobody wants to. Everybody wants to do the agenda that they signed up for.
[00:13:37] The Bush people had an agenda. They wanted to do something about Iraq, and they had issues with China and Russia. They had arms control things they wanted to do with ballistic missiles and whatnot. All of those things were the agenda of the Bush 41 administration, which had been defeated for reelection. And all of those people, with the exception of President Bush himself, had been in that administration. So in a way, it was as though they were picking up where they left off eight years before, trying to complete their agenda.
[00:14:09] Well, in the intervening eight years, things changed. The threat went from the former Soviet Union to non-state actors and multinational issues. And they had missed that somehow. I don't know where they were for eight years. Well, I do. They were off at Halliburton and Exxon and various places making money. So they came in with their agenda. They didn't want to work on anything else.
[00:14:37] Also, in the book Warnings, I talk about first occurrence syndrome. When the thing that you're saying is about to happen has never happened before, when it would be the first occurrence, people tend to disbelieve it. They only know what they've seen, what's in their experience. So, if you say there's going to be a major terrorist attack in the United States, yeah, there was Oklahoma City, but that was a couple of crazy Christians, Americans. You say there's going to be a bunch of crazy Muslims who come here from outside and do something inside the United States, and they don't believe you, because that's not in their experience. Intellectually, it makes no sense, because we all know that things are constantly occurring for the first time.
[00:15:21] And if you say to someone, "Isn't life about things occurring for the first time, isn't history a list of things that occurred for the first time?" They'll all say yes, but when you actually sit them down and say, "Look, this thing is going to happen," in the back of their minds, they're thinking, "Eh, it never happened before."
[00:15:39] Jordan Harbinger: Right. It's like the dikes in New Orleans failing and flooding the whole city.
[00:15:43] Richard Clarke: Or four nuclear power plants melting down at Fukushima, the Bernie Madoff Ponzi scheme, or the Challenger shuttle blowing up. All the things we talk about in the book are first occurrence syndrome. Technically, if you go back, sometimes you find something. Take Fukushima: there had actually been a tsunami at that location, but it was 400 years ago.
[00:16:07] Jordan Harbinger: Wow.
[00:16:08] Richard Clarke: And so when our Cassandra in the book, the Japanese civil engineer, says, "No, no, no, you can't do this. You can't build here. It's a floodplain," and he says in open testimony at the hearing, "There was a tsunami here 400 years ago," the response is, "See, there hasn't been one in 400 years. We don't have to worry about that." Because it wasn't in the experience of the people who were making the decision. Sometimes, it can be in your experience because your grandfather told you about it, you know? So it can be something that's older than you, but it's in your experience, your frame of reference. In the case of Fukushima, the tidal wave 400 years earlier, when there was only a small fishing village there, that's not in anybody's history.
[00:16:50] Jordan Harbinger: Right. And also possibly, "Well, you know, that could be inaccurate, maybe they exaggerated it. Also, we stand to lose a lot of money if we don't build here. So let's just not worry about it."
[00:16:58] Richard Clarke: Particularly, when you're trying to build something, there's always somebody who says you can't build wherever you're trying to build. And this guy says, "Hey, I walked up the hill behind the floodplain and there's an old plaque from 400 years ago, a brass plaque that says, do not build your home below this line."
[00:17:15] Jordan Harbinger: Wow.
[00:17:16] Richard Clarke: Amazing. Right?
[00:17:17] Jordan Harbinger: "We're not building any houses. We're building a nuclear power plant. It'll be fine."
[00:17:21] Richard Clarke: And we'll put up a seven-foot wall.
[00:17:23] Jordan Harbinger: Yeah.
[00:17:23] Richard Clarke: It will be fine.
[00:17:24] Jordan Harbinger: Tell us what Cassandras are, because you reference them a lot in the book and we'll be doing it on the show, and I think some people won't necessarily know what that means.
[00:17:31] Richard Clarke: Yeah. So Cassandra in Greek mythology had a blessing and a curse from the gods. The blessing was that she could accurately see the future, pretty good. The curse was that she could also see disasters coming, and when she told anybody a disaster was coming, no one would believe her. And so she went mad. She went crazy because she kept seeing these disasters and saying, "Oh, we have to stop this. The fall of Troy is going to happen." And the king of Troy laughed at her. And then, of course, Troy fell and everyone got killed. All of her friends and family got killed. So she went mad that no one would listen to her.
[00:18:05] In the book Warnings, we talk about Cassandras as being people who are experts, who are data-driven, but who are outliers. They see things first, before the other experts do. And they're right, we know in retrospect, but they're ignored. We talk about a Cassandra event as being the disaster that was predicted by the expert but happens anyway. And we talk about the Cassandra coefficient, which is our little formula for when you've got an expert who is an outlier telling you something. If you apply our little formula, our little Cassandra coefficient, you might be able to tell whether or not this person is a nut who is crying, "The sky is falling," like Chicken Little, or whether they are someone who's actually predicting something that you should pay attention to.
[00:18:54] Jordan Harbinger: So for leaders who get lots of warnings and all types of advice from all types of people (like you mentioned, there's always somebody who's going to say, "You can't build here"), is the Cassandra coefficient, this formula, how they can filter the signal from the noise?
[00:19:07] Richard Clarke: That's what we're proposing. It's like a 22-box matrix. The four main factors are the nature of the issue, the nature of the person giving the warning, the nature of the decision-maker or the audience, and what the critics are saying, and we've got four or five boxes under each of those four column heads. We applied that to seven case studies where we know these people were right, because the Cassandra event has already occurred, and it worked pretty well with them. And then we took it and applied it to seven more case studies in the book where people are predicting things today. So there are 14 case studies; they're basically stories. Each story is about a person. Seven of those people we know were in fact Cassandras, and seven are potential Cassandras, and you get to judge for yourself whether or not these seven current-day people are Cassandras or not.
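[Note: For readers who'd like to see how a screening rubric like the one Richard describes could be turned into a score, here's a minimal, hypothetical sketch in Python. It is not the book's actual Cassandra Coefficient; the four top-level components follow Richard's description above, while the sub-factor names, the 0-5 scale, and the simple averaging are illustrative assumptions only.]

```python
# Illustrative sketch only -- not the actual Cassandra Coefficient from the book.
# The four top-level components follow Clarke's description in the conversation
# above; the sub-factors and the averaging scheme are assumptions for demonstration.

SUB_FACTORS = {
    "the_warning": ["first_occurrence", "magnitude_if_true", "data_driven"],
    "the_cassandra": ["proven_expert", "has_data", "personal_stake_is_low"],
    "the_audience": ["agenda_inertia", "diffuse_responsibility", "ideological_rejection"],
    "the_critics": ["attack_the_person_not_the_data", "offer_no_counter_data"],
}

def cassandra_score(ratings):
    """Average the 0-5 ratings within each component, then average the components.

    A higher score suggests the warning deserves closer scrutiny; it does not
    prove the warner is right.
    """
    component_scores = []
    for component, subs in SUB_FACTORS.items():
        values = [ratings[component][s] for s in subs]
        component_scores.append(sum(values) / len(values))
    return sum(component_scores) / len(component_scores)

if __name__ == "__main__":
    # Hypothetical ratings for a warning under evaluation (0 = low, 5 = high).
    example = {
        "the_warning": {"first_occurrence": 5, "magnitude_if_true": 5, "data_driven": 4},
        "the_cassandra": {"proven_expert": 4, "has_data": 5, "personal_stake_is_low": 5},
        "the_audience": {"agenda_inertia": 4, "diffuse_responsibility": 3, "ideological_rejection": 4},
        "the_critics": {"attack_the_person_not_the_data": 4, "offer_no_counter_data": 5},
    }
    print(f"Cassandra score: {cassandra_score(example):.2f} out of 5")
```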
[00:20:03] Jordan Harbinger: You're listening to The Jordan Harbinger Show with our guest Richard Clarke. We'll be right back.
[00:20:08] This episode is sponsored in part by adoreme.com. Valentine's Day is just days away. Don't get caught empty-handed. Check out adoreme.com, which offers hundreds of styles of lingerie, sleep, and loungewear. An impressive range of 77 different sizes, which I didn't even know there were that many, but whatever, I guess there are: A cup through I cup, and extra small to 4X. Adore Me is the first lingerie brand to offer extended sizing across all categories. I just got to imagine the back problems with an I cup. That's where my head goes, I guess. They just dropped a Valentine's Day collection with free shipping and returns in case you get the sizing wrong, and how could you not, with 77 sizes? I ordered a grip of stuff for Jen. They shipped it out so fast. The quality is really nice, with incredible pricing and sets that start at just $24.95. Adore Me is also doing their part in sustainability, with matching bra and panty sets made from recycled material (recycled underwear, it gets me every time), sleepwear made from organic cotton, and digitally printed swimwear to save on water and energy. Subscribe with a super flexible VIP membership that gets you 10 bucks off each set, access to buy-one-get-one-free sales, and more perks. Do whatever you want in your Adore Me lingerie. They're here to support you.
[00:21:11] Jen Harbinger: Shop now at adoreme.com.
[00:21:13] Jordan Harbinger: This episode is also sponsored by BiOptimizers. Did you know one of the biggest ways you can boost your immunity is by supporting your gut health? In fact, 70 percent of your immune system is in your gut. I don't know how they know that, but now we all know that gut health is a real deal thing. In fact, if you heard our Bill Sullivan episode, you know, we even think with our gut. Jonathan Jacobs, an MD and professor at UCLA, has said the microbiome and the immune system are critically intertwined. That means if you eat the wrong things, your immune system could suffer. But if you eat the right things, who knows, maybe we can make our immune system stronger. That's the idea anyway. Biome Breakthrough contains powerful probiotics and prebiotics as well as a one-of-a-kind ingredient called IgY Max, a patented egg-based protein.
[00:21:52] Jen Harbinger: The best time to take Biome Breakthrough is first thing in the morning, mix it in eight ounces of water, and drink it on an empty stomach for less gas and bloating. Try Biome Breakthrough risk-free at biomebreakthrough.com/jordan and use JORDAN10 to receive 10 percent off any order with the 365-day money-back guarantee. No questions asked. That's biomebreakthrough.com/jordan.
[00:22:14] Jordan Harbinger: Thank you so much for listening to and supporting the show. I really love conversations like this. It makes me so happy that you enjoy listening to the conversations we create for you. By the way, all those discount codes and URLs, I know those are a pain and they're always different, so we put them all on one page for your convenience. jordanharbinger.com/deals is where you can find it. It works on your phone, at least it's supposed to. If it doesn't, please let me know. It means I've got to fix it again. Please consider supporting those who support this show. jordanharbinger.com/deals.
[00:22:45] Now, back to Richard Clarke.
[00:22:48] Can you take us through an example of one that has already happened and how the formula comes into play? Because I would love to teach people at least a rough outline of how they might use this in their own life.
[00:23:00] Richard Clarke: Well, the first one in the book is a guy named Charlie Allen, who held the title National Intelligence Officer for Warning. He didn't work for the CIA, he worked with it. And his job was to issue warnings when he thought something bad was about to happen in the national security world. There had been other people before him who had held the job. It was basically a job that came out of the review of the Pearl Harbor disaster, because there was a big study after Pearl Harbor, after World War II: why did we miss Pearl Harbor?
[00:23:32] One of the recommendations was that you should have a warning officer. So they had one. No warning officer had ever issued a major warning because nothing had ever happened that rose to that level. And Charlie Allen, one day in July of 1990, was looking at the intelligence about what was going on in Iraq. He saw troop movements, unusual troop movements, and he looked further, ordered additional satellite photography to be collected, ordered additional communications intelligence to be collected. And he came to the conclusion that Saddam Hussein was about to invade Kuwait, occupy it, and make it the 19th province of Iraq.
[00:24:12] And so for the first time in history, the national warning officer sat down and wrote a warning of war, and it goes to the president, a warning of war. Well, they held a meeting. Charlie explained his rationale, and the CIA people said, "No, that's not going to happen." "Why?" "Well, no Arab nation has ever invaded another Arab nation." First occurrence syndrome: it never happened before. "It's summertime. It's July. Arabs don't fight in the desert in July. It's 130 degrees. They won't do it." They tried out all of these objections. But Charlie had the data and he said — what all of our Cassandras say in the book, every one we interviewed said the same thing — "I want to be wrong. I hope I'm wrong. Here's the data. Tell me what's wrong with the data." And of course, they couldn't.
[00:25:02] He was looking at data that showed that these troop movements were not the kind of troop movements you do for an exercise. There were things that you would only do if you were serious about going to war. And if your job is to intimidate the enemy rather than actually go to war, you don't do them in a way that the potential enemy, like Kuwait, is unable to see them; you want him to see all this stuff. But Saddam was doing it in a very stealthy way so that Kuwait wouldn't know it was happening. He was moving things that you would only move in the case of real war.
[00:25:33] Now, we know in retrospect, because Saddam Hussein was much later on detained by the United States and interrogated. And he was asked, "What would you have done in July of 1990 if George Herbert Walker Bush had picked up the phone and called you and said, 'Saddam, looks like you're thinking of going into Kuwait. Don't do that, or we'll have to kick you out.' What would you have done?" And Saddam laughed in the interrogation. And he said, "Are you kidding? If I had gotten that call from Bush, I never would have done it, but it was beyond my wildest imagination that you guys would go to war to kick me out of Kuwait. I had no reason to believe that that would happen." So we now know, if the president had listened to the warning of war and done what Bush 41 did all the time, pick up the phone and call another world leader, if he'd just spent 10 minutes on the phone with Saddam in July of 1990, think how different the world would be today.
[00:26:29] Jordan Harbinger: Geez. The idea that we could mitigate or prevent a disaster because one person asked one more question... it's a thought exercise that's a little bit depressing, frankly.
[00:26:38] Richard Clarke: Yeah. And it's that one more question. So if you fast forward to the second Iraq war, Colin Powell is asked to go up to the UN and give a speech saying that Saddam still has weapons of mass destruction. Colin Powell has a great reputation internationally. He's going to preserve that. So he says, "I'm not going to go up there and give this speech that the White House has provided me unless I can sit down with CIA analysts and they tell me everything I'm going to say is true." So he goes up to CIA, he's doing due diligence. He goes up to CIA and sits down with the real analysts, not just with George Tenet, the director.
[00:27:14] And he goes through line by line, and he says, "Okay. And this line that says that they're building a bioweapon, a biological weapon, what's the source for that?" And the analyst looks it up and says, "That's a report from German intelligence," and Powell says, "Oh, well, German intelligence. They're pretty good. Okay, let's move on." And he doesn't ask that one more question: how do they know? How does German intelligence know? If you'd asked that question, the answer would have been, "Well, they have a source who is the brother-in-law of the Iraqi exile leader, Chalabi."
[00:27:52] If Powell had heard that, and he knew that Chalabi was a liar and was trying to get the United States into a war with Iraq, if he had heard that, he would have said, "Oh, well, I don't believe Chalabi's brother-in-law about biological weapons. What other evidence do you have about biological weapons?" The answer would have been: not much. And throughout Powell's series of questions to the CIA analysts, if he'd had a skeptic with him, if he'd had an intelligence analyst with him who was skeptical and asked that one more question time and time again, he would have realized that the intelligence on WMD was certainly not enough to go to war.
[00:28:30] Jordan Harbinger: It's just unbelievable to go and trace those steps back with 20/20 hindsight and to see these things failing. How can we do a better job making ourselves heard or making sure our message is taken seriously if we are a warning officer, you know, a Cassandra, in some way in our company, or of course in the military, the more obvious example? Because it seems like convincing those in power to take warnings seriously is often, from the look of it and from the book, more about the leader overcoming their own logical biases. Insofar as maybe we've already presented a solid case, we then have to get them to get over their own stuff.
[00:29:07] Richard Clarke: Well, that's true. But the potential Cassandra also has to act rationally, and very often one of two things happens, or both. Either the Cassandra gets more and more agitated because no one's paying attention and no one's believing her, and therefore she gets a little bit more shrill and maybe does things that hurt her cause. That's always a possibility. Also, frankly, it's not true in every case, but in a lot of cases, the person who is that outlier, expert but outlier, who sees things before the other experts do, is sometimes a little different kind of personality. They are sometimes a little on some spectrum or other where they present as somebody who's a little bit unusual. And that just gets worse, of course, when people don't believe them.
[00:30:00] We talked to an Israeli psychiatrist who said, "Yeah, what you're talking about is someone who has sentinel intelligence." I said, "What's that?" And he said, "Well, sentinel intelligence is people with high anxiety, but not so high that it affects their performance. It may affect their performance a little, but they're highly functioning people with high anxiety." They just naturally, genetically, instinctively look for the problems, look for what could go wrong. They're natural risk managers, these people. They look at anything and say, "Okay, what could go wrong?" Whereas most of us don't do that, these people do, and we've got to get the person who is making the case to do it in a convincing way. And maybe if they're not the best presenter, they need to get somebody else to be the presenter. That's the first thing.
[00:30:55] My Israeli psychiatrist said, if you're sitting in the restaurant and the restaurant kitchen catches on fire, you're the first guy to smell the smoke. Not only do you smell the smoke, but you don't wait, you get up and pull the fire alarm because you have that self-confidence that you're right. And that's what we found with all the Cassandras. They weren't worried about the fact that no one agreed with them. They didn't think that made them wrong. They had a lot of self-confidence in the facts, in the numbers, in the data.
[00:31:24] One technique that we suggest people use is a simulation, a tabletop exercise, in which you get the decision makers to agree to play out, for planning purposes or training purposes, some future day. And you walk them through that future day in a way where they suspend incredulity. You walk them through that future day, and, oh, by the way, that future day slips into being the kind of crisis you think is actually going to happen. Then they're playing the game. They're playing themselves, they're playing the role they have. They're seeing themselves now for the first time having to deal with this terrible catastrophe. If you can get them to do that, usually at the end of that simulation exercise, they turn to you and say, "I don't ever want that to happen in the real world. What do I have to do to prevent that?"
[00:32:16] Jordan Harbinger: Right. So essentially we paint the picture and ask, what are you willing to do right now to avert that scenario we've just painted? What are you going to wish you had done to avoid that scenario? All these kinds of hypotheticals that maybe they were avoiding before because of bias.
[00:32:33] Richard Clarke: Yeah. And you know, I tried that in a memo to the national security principals before 9/11. I said, "Imagine a day in the very near future when there are hundreds of Americans dead, lying in the streets. What would you wish you had done to prevent that? You can do that now." That didn't work, but it frequently does if you can get the decision-maker to really get into it. If it's a tabletop exercise or a simulation, you can have fake CNN broadcasts and fake news reports and fake intelligence reports, fake congressional questions, or if you're a private company, fake board of directors questions and fake Wall Street Journal stories or whatever it is, and you see the price of your stock plummeting because of the crisis. If you can get them to really feel, at an instinctive gut level, what it would be like to live through that crisis, then they're going to say to you, "Well, what's really preventing that from happening?" And you say, "Well, A and B are preventing it from happening, but A and B could be overcome and it could happen anyway." And then they start saying, "All right, let's start talking about a mitigation strategy here."
[00:33:46] Jordan Harbinger: So how do leaders then separate charlatans from an actual seer, somebody who's doing a really good job with their predictions, versus somebody who just sounds like a kook? How do they separate that?
[00:33:56] Richard Clarke: One way is to ask: is this person somebody who woke up at two o'clock in the morning with a premonition, you know, they heard the voice of God telling them this was going to happen? Or is this person an internationally recognized expert, someone whose past performance, career, personal life, and standing would make everyone say, "This is not a kook. In fact, this guy is an expert in the field that he is talking about"? All right, so that's the first hurdle.
[00:34:25] The second hurdle is: why are you seeing it and the other experts aren't seeing it? Well, one answer to that is there's always somebody first. There's always some one expert who sees it first, but that person has data. None of these people, the real Cassandras, are making this stuff up. They have gone out and collected data, and that is the data that spoke to them. And so you say, "Okay, let's see. Let's convene a group of experts," not just experts in your field, but experts in related decision-making fields, "and let's look at that data and let's test that data. Let's try to replicate your experiment. Let's try to replicate your data. Let's try to collect more data."
[00:35:09] And if no expert can credibly say that the data is wrong, or if there's room for doubt, then I think you've got a problem. And then you need to start thinking about mitigation strategies and putting the issue under a spotlight, because more data will come in. And if the data that continues to come in confirms the Cassandra's theory, then you've got to increase your mitigation strategy. Basically, this is about risk mitigation and hedging.
[00:35:37] Jordan Harbinger: You had mentioned before a little bit about bias. One of them was first occurrence syndrome, where, "Well, it hasn't happened before. The levees haven't broken before. The terrorist thing hasn't happened on our soil before." What other biases are there? I know in Warnings, you mention things like scientific reticence, not wanting to make a decision without perfect information. What other biases might we see in our own lives that will cause us to maybe not listen to some of the conclusions that these Cassandras are making? One that comes to mind as well is the disaster or the conclusion being outlandish, and this comes up in the book Warnings too: "They're going to crash planes into buildings? You've been reading too many Tom Clancy novels. Get out of my office."
[00:36:22] Richard Clarke: That's a serious problem that a number of our people faced. Literally, what they were proposing had been in a movie, and the only time it had ever occurred was in a movie. We talked to David Morrison, the noted NASA scientist and astrophysicist, and he's had a struggle for 20 years to get people to take seriously the possibility that a giant asteroid might hit the earth. One of the major problems he has is that there were two Hollywood movies about that, pretty bad movies, and in one, Mark Wahlberg or somebody gets in the shuttle and goes up and plants a nuclear bomb on an asteroid. When we're talking about the threat of artificial intelligence and the fact that you could combine that with robotics and there could be a bad result, people think about the movie Terminator. When we talk about the problem of sea level rise, they think about the movie Waterworld, another bad movie.
[00:37:15] All of these outlandish Hollywood-script kinds of things make it hard for people to take your issue seriously. That's part of it. Part of it also is if the solution is going to cost money, a lot of money, or if the solution is going to require big government activity, and you are, let's say, maybe you're a Republican and you don't like to spend money except on subsidizing oil companies, and you're against regulation and think all regulations are bad. Well, then you don't like to admit that there's an issue the only solution to which is that the government launch a big program, spend a lot more money, and come up with regulations to prevent the catastrophe.
[00:38:02] Jordan Harbinger: It sounds like what you're saying is we need better movies.
[00:38:06] Richard Clarke: Well, we need to separate the movies from reality better than we do in some people's minds. What we need is a much more fact-based discussion on everything. And we're going in the opposite direction, not to get too political or partisan about it. But in the last six months, since the Trump administration came into office, we are hearing constantly about problems without much evidence that they exist. And we get proposals from the administration to solve these problems. They jump right to the solution. When you go back and say, "Well, wait a minute. Let's analyze this in a non-partisan, non-prejudiced way. Let's collect the facts and analyze the facts about the nature of the problem," well, then you discover that there wasn't much of a problem. The administration wants to ban people from six countries from coming to the United States in order to stop terrorism in the United States, when never in the history of the United States has anyone from those six countries committed a terrorist attack in the United States. So the solution has nothing to do with the problem.
[00:39:16] Jordan Harbinger: Couldn't we then say, though, "Well, that's just the first occurrence bias that you just talked about"?
[00:39:21] Richard Clarke: Sure. And then the next question is, "The historical data's not there. Show me your data. What reason do we have to believe that there's a risk here? And show me the risk analysis that you've done that leads you to this conclusion?"
[00:39:35] You know, we've got now an election fraud commission that is asking all the states for a huge amount of data, and 44 of the states have said no. The election fraud commission exists because the administration thinks that between three and five million people fraudulently voted in the last election. And you say, "Well, where's the data for this?" Well, there's no data. There's absolutely no data anywhere for that. Maybe we should collect data about voter fraud before we go running out and having a commission that solves the problem, because the data that we've got suggests this is a minor, minor, minor, minor problem. There's nowhere near three to five million people who voted that way.
[00:40:16] And the same with the wall, you know, the great famous wall that Mexico is going to pay for. That was necessitated because people were flooding into the United States and committing crime. Well, first of all, the data suggests that there was a net outflow from the United States over that border in the last couple of years, more people going back to Mexico than coming in. And when you look at the data on crimes committed by illegal aliens, what you discover is that illegal aliens commit crimes in the United States at a rate far lower than American citizens do. So again, we have the solution for a problem that somebody said existed, without looking at the data and analyzing the problem.
[00:40:57] Jordan Harbinger: It sounds like this goes along the lines of ideological response rejection, which was one of the many factors that cause people to reach erroneous conclusions.
[00:41:10] Richard Clarke: Ideological response rejection is: if I were to believe your prediction, it would be incumbent on me to do something to prevent this catastrophe. And the only things that you suggest, the only things that make sense, would be things that would make the government bigger and cause us to spend more money.
[00:41:27] For example, one of the people we talk about in the book as a possible future Cassandra is Jim Hansen, a professor at Columbia, who talks about sea level rise. The UN model of climate change says there could be a meter, three feet or more, of sea level rise between now and the year 2100. Jim Hansen says, "No, you've got that all wrong. It's more likely to be between six and nine meters." He has a set of data and it does suggest that. Now, if you were to believe him, you would have to do something. You'd have to build dams. You'd have to build pumping stations. You'd have to move all sorts of things from coastal cities. You'd probably have to do cap and trade on carbon emissions. All of those things would require a lot of federal expenditure, a lot of federal regulation. "I don't want to do that because I'm a Republican, or I'm a small-government guy, or I'm a fiscal conservative. And if I believe you, I don't have any choice, because the only things that would make sense to mitigate and ultimately respond to massive, rapid sea level rise are things I don't want to do."
[00:42:37] Jordan Harbinger: This is The Jordan Harbinger Show with our guest Richard Clarke. We'll be right back.
[00:42:41] This episode is sponsored in part by the Mac Geek Gab Podcast. It's like a tongue twister; you can't say it fast. Today, I have got something for you Apple users: the Mac Geek Gab Podcast. The show is in its 17th year (but you're not supposed to say that, you've got to listen) of providing tips, cool stuff found, and answers to your questions about anything and everything Apple. Hosts Dave Hamilton and John F. Braun take time each week to actually provide tech support to as many listeners as possible while learning at least five new things weekly themselves. Actually, you think you know your iPhone here, but you don't. If you press and hold the mute button during a call on your iPhone, it'll put that call on hold. I did not know that. And saying "reply with audio" to Siri will let you record an audio message, which is really handy if you're in the car and don't want to dictate the text. I didn't know either of those things, and I've used my phone too much. If you use an iPhone, a Mac, an iPad, an Apple Watch, Apple TV, or are simply a technology enthusiast, you're going to love learning more about your tech with your two new favorite geeks over at Mac Geek Gab. Get your questions answered and have some fun along the way.
[00:43:37] Jen Harbinger: Visit macgeekgab.com or search for Mac Geek Gab on Apple Podcasts, YouTube, Spotify, or wherever you get your podcast. Don't get caught without having Mac Geek Gab in your rotation.
[00:43:48] Jordan Harbinger: This episode is also sponsored by Progressive. Progressive helps you get a great rate on car insurance even if it's not with them. They have a comparison tool that puts rates side-by-side. You choose a rate and coverage that works for you. So let's say you're interested in lowering your rate on your car insurance, visit progressive.com to get a quote with all the coverages you want. You'll see Progressive's rate and their tool will provide options from the other companies all lined up and easy to compare. All you have to do is choose the rate and coverage you like. Progressive gives you options so you can make the best choice for you. You could be looking forward to saving money in the very near future. Money for say a pair of noise-canceling headphones, an Instapot, more puzzles, whatever brings you joy. Get a quote today at progressive.com. It's one small step you can do today that can make a big impact on your budget tomorrow.
[00:44:30] Jen Harbinger: Progressive Casualty Insurance Company and affiliates. Comparison rates not available in all states or situations. Prices vary based on how you buy.
[00:44:38] Jordan Harbinger: By the way, you can now rate the show if you're listening to us on Spotify. I love when you do that, because it does help the show get a little bit more visible in those charts. Just go to jordanharbinger.com/spotify if you need some instructions. But really what you need to do is just open up your Spotify app, search for The Jordan Harbinger Show, there's three dots on the upper right there, click those and give us a good rating.
[00:44:57] And now for the rest of my conversation with Richard Clarke.
[00:45:01] Is this a conscious thing that's happening? They're coming to this process going, "Oh, that's not going to sit well." Or is this something that is, in your opinion, happening at a subconscious level in so many people, even the smartest people at the higher levels of government?
[00:45:14] Richard Clarke: It's both. I think the first occurrence syndrome is definitely happening at a subconscious level, because if you talk about it explicitly, it doesn't make any sense. But at a subconscious level, they're feeling better about their decision to do nothing because it hasn't happened before. Whereas with the regulatory issue and the fiscal issue, spending money, the budgetary issue, I think that's more conscious. I think they realize, "I don't want to spend more money. I was elected to cut the federal budget. I can't go back to my constituents and say, 'We have to increase taxes so that we can give the federal government more money, more of your money to spend, because this thing that has never happened before might happen.'" That's more conscious.
[00:45:53] And also part of the problem that we saw with several of these things, like sea level rise, is: who is the decision maker? On most of these issues, it's not clear that there is a single person or a single organization that you can finger and say, "This is clearly your job, and if there's a failure here, you will personally be held accountable by history." If it's something like sea level rise, well, "Not my job, probably somebody else's," or, "Certainly not my job alone. Why should I be the one taking point on this one? Sea level rise? You know, that's everybody's job, everybody's responsibility."
[00:46:31] Jordan Harbinger: Another bias, you can call it that I guess, though factor is probably more accurate, that I thought was very interesting in the book was that Cassandras often assail highly respected people, while maybe their own reputation as a Cassandra is not as well known, or they're not known to be as knowledgeable. And the example you gave was of the Boston financial analyst who was warning the SEC about Madoff. Can you tell us about this?
[00:46:55] Richard Clarke: So Bernie Madoff, now, of course, is synonymous with Ponzi schemes and stealing billions of dollars from people. But before that became public, Bernie Madoff was a well-known name, at least in New York. He was an icon of the financial sector, of Wall Street. He had been chairman of the NASDAQ stock exchange. He had been giving money to all sorts of charities, and you wanted Bernie Madoff to come to your black-tie dinner for your charity or your organization.
[00:47:25] Harry Markopolos was a nobody. Harry Markopolos was a CPA in Boston. He started doing analysis because he heard about the great returns people were getting by investing with Bernie Madoff. So he started doing analysis, and he said to himself, "You know, this data doesn't make sense. There's no way that you can be getting these kinds of highly successful returns in the stock market year after year after year." When the stock market goes up, Madoff goes up. But when the stock market goes down, Madoff goes up, and this happens all the time. And Madoff says he's taking your money and investing it in the stock market, not in something else, but he won't tell you precisely what he's investing in.
[00:48:08] So this all sounded very fishy, and Markopolos did a very complicated set of mathematical analyses to judge what the probability was, the mathematical probability, the statistical probability, that this was actually happening. And he went to the people who were in charge of this, the Securities and Exchange Commission in Washington, the enforcement arm, and he laid it all out to the SEC lawyers. Most lawyers aren't good with algorithms and numbers, math. That's why they're lawyers, because they did well on the verbal.
[00:48:38] Jordan Harbinger: That's so funny. That's why I became an attorney. I was like, I will not have to do math in this job, most likely.
[00:48:43] Richard Clarke: Yes. You did great verbal scores, right?
[00:48:46] Jordan Harbinger: Absolutely.
[00:48:46] Richard Clarke: Your quantitative scores weren't so good.
[00:48:48] Jordan Harbinger: Nope.
[00:48:49] Richard Clarke: I know.
[00:48:50] So the lawyers looked at him and were like, "No. Bernie Madoff? No, you've got to be kidding. Go away." And he comes back year after year with more numbers and more data, and they don't believe him, and he absolutely had it nailed. It was a very, very good analysis. And ultimately it was his analysis that the SEC used after the Ponzi scheme collapsed.
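[Note: Clarke doesn't walk through Markopolos's actual math, so the sketch below is only a hypothetical illustration of the kind of plausibility check he's describing: returns that are almost never negative, far smoother than the market, and barely tracking the market the manager claims to be trading. The function name, thresholds, and numbers are all invented for this example.]

```python
# Illustrative only: a toy plausibility check in the spirit of Markopolos's
# analysis, not his actual method. Thresholds and data are made up.
import statistics

def plausibility_flags(fund_returns, market_returns):
    """Flag a monthly return series that looks suspiciously smooth."""
    positive_share = sum(r > 0 for r in fund_returns) / len(fund_returns)
    volatility = statistics.stdev(fund_returns)
    correlation = statistics.correlation(fund_returns, market_returns)  # Python 3.10+
    flags = []
    if positive_share > 0.95:   # almost never a down month
        flags.append("returns are positive almost every month")
    if volatility < 0.01:       # far smoother than the market
        flags.append("volatility is implausibly low")
    if abs(correlation) < 0.2:  # claims to trade equities, but doesn't move with them
        flags.append("returns barely track the market the manager claims to trade")
    return flags

# Hypothetical data: a too-good-to-be-true fund versus a noisy market
market = [0.03, -0.04, 0.02, -0.05, 0.06, -0.02, 0.04, -0.03, 0.05, -0.01]
fund = [0.009, 0.010, 0.008, 0.011, 0.009, 0.010, 0.009, 0.011, 0.010, 0.009]
print(plausibility_flags(fund, market))
```

Run on toy data like this, it simply prints whichever red flags it finds. Markopolos's real analysis was far more rigorous, but the underlying idea is the same: the reported numbers couldn't plausibly have come from the strategy Madoff claimed to be running.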
[00:49:12] Jordan Harbinger: Do you think that if he had had a better reputation, either in Boston financial circles or with the folks at the SEC, it may have gone further, or was Bernie Madoff just too unassailable?
[00:49:22] Richard Clarke: No. I think if the lawyers at the SEC had gotten a call from someone at the law firm they used to work with, or some guy who went to law school with them, if somebody they knew had done the introduction and that person said, "You know, I really think you ought to look at this. I've looked at this. It's outlandish. It's really hard to believe, but I've looked at this and this looks good," they would have paid more attention. Markopolos didn't have that kind of introduction. He was easy to blow off.
[00:49:51] Jordan Harbinger: Yeah. It's so interesting.
[00:49:53] Richard Clarke: One of the ways of judging whether or not someone is a legitimate Cassandra is what's in it for them. If they're proposing something, "We have to do X because of this impending disaster," well, if X financially benefits them, "Oh, by the way, we have to do X and I've got the copyright. I've got the patent on that one, or I'm invested in that one," then I think you can be a little bit more dubious, more doubting. If they're just saying, "Look, I don't have any stake in this other than the fact that I'm a citizen," then, you know, that adds to their credibility.
[00:50:25] You mentioned earlier the concept of scientific reticence. What that means is that people are always willing to say, "Well, okay, you know, maybe you're right. But other people say you're wrong. Let's spend some time studying this," which they really don't intend to do, or they intend to do on the slow roll. It's a way of saying, "Yeah, well, I'm covering my ass. You came in and told me the world is going to end. So I'm doing something." And your response is, "We don't have time for more study. I've already done the study. Tell me what's wrong with my study."
[00:51:00] Scientific reticence is particularly interesting in the case of sea level rise. We talked to critics of Professor Hansen and his model for sea level rise, and we said, "What's wrong with his model?" And they said, "Well, we don't know. There might not be anything wrong with his model." The scientific method, of course, is that you do a study, an experiment in the real world, and then you replicate it and see if somebody else can get the same results. And he really hasn't done any of that. So I went back to Hansen and said, "Here's what they say about you." And he said, "Dick, what do you want me to do? Melt Greenland several times?" And he said, "Look, I can't use the complete scientific method to prove this. I can only do it in simulations. They can say my simulation is not dispositive because it's just a simulation. But if we wait to see this happening in the real world, it's too late." So here's a great scientist saying the scientific method is useful, but sometimes if you wait for the scientific method, it's too late.
[00:52:03] Jordan Harbinger: Well, it's like nuclear holocaust, for example. We don't have to actually destroy a country to figure out that it's going to be bad for the environment and cause nuclear winter. That's something that's generally accepted because there's a scientific consensus on it, one that's maybe not being interfered with too much by a political agenda, because we've actually seen some of these things take place in the real world and we can sort of extrapolate from there.
[00:52:28] Which biases and faulty decision-making factors do you find yourself teaching your friends, your kids? What are some of the most important ones that you think people can apply right out of the box in their own lives, even if they don't work for DARPA or something like that?
[00:52:45] Richard Clarke: Well, I think you need to have a function in any organization that looks for Cassandras. When people come to me, as they always do, with conspiracy theories, my first reaction is never to say, "Oh, you're a nut. That's an incredible conspiracy theory." My first reaction is, "I need to disprove this, and if I can't disprove it, then maybe there's cause for concern." I don't automatically say, "This is outlandish. That's really incredible. Boy, that would be inconvenient if that were true." My initial reaction is, "Okay, well, let's look at your data and let's look at your record."
[00:53:22] You know, somebody came to me a while back with the theory that the Rolling Stone author Michael Hastings had been murdered rather than simply dying in that automobile crash. And I tried to get what data I could get, and I couldn't prove it one way or the other, but I couldn't disprove that theory. And that bothered me. If you can't disprove something, then you have to hold the door open to the possibility that it might be true, even if it sounds like a conspiracy theory.
[00:53:53] When I was in government, Oklahoma City happened, the Oklahoma City terrorist attack, and the investigation showed that one of the two Americans involved had spent some time in a city in the Philippines. We looked at that city in the Philippines and realized that at the same time he was there, Ramzi Yousef, the prototype Al-Qaeda guy who tried to blow up the World Trade Center in 1993, was in that same Philippine city. So, all right, that was interesting.
[00:54:24] But then the conspiracy theories came that the two of them met, that they had discussions about how they hated the American government, and that they exchanged theories about bomb building and whatnot. Well, that got into conspiracy theory territory, the outlandish. I tried to disprove it. I couldn't disprove it. So I never said, "That's a conspiracy theory." I never said, "That's a crazy idea." I always said the jury is still out on that, because it could have happened. There's no evidence that says it didn't happen, and there's some evidence that says it could have. I can't prove it one way or the other. It sounds crazy, it sounds like a who-shot-John-Kennedy conspiracy theory, but if you can't prove it wrong, then you can't dismiss it as just a conspiracy theory.
[00:55:11] Jordan Harbinger: What's the most useful piece of advice that you could give someone who wants to learn how to think better? Whatever that means to you.
[00:55:19] Richard Clarke: Well, I think it's all about facts: trying to understand the origin of facts and the veracity of facts, and testing those facts. And when I think about risk management, which is what the book Warnings is ultimately about, I always ask the question: what could happen? And when somebody says, "Oh, that couldn't happen," I say, "Why? Why couldn't it happen? What's preventing it from happening? What law of physics or what system do you have in place that's preventing it from happening?" And they always say, "Well, we have this system in place." The next question is, "Okay, how good is your system that's in place for stopping it from happening? If I were a bad guy, if I were a malevolent actor and I wanted to make this bad outcome occur, what could I do to that system that's preventing it from happening? What could I do to undermine, go around, or destroy your system that's preventing it?" And if people are frank, they always admit there's a way.
[00:56:24] I do this in companies. I go into companies and say to the CEO, "I want permission to get a random set of people from your company, put them in a room, and say, 'The boss is not going to punish you for doing this. In fact, your boss is going to thank you for doing this. Let's sit around the table here and pretend that we hate the company.'" These are insiders who work in the company. "Let's pretend we really don't like the company because the company's done something bad to us. What could we do that would really screw up the company? Let's think dirty. Let's list the things that we could get done." Once they believe that the boss is never going to know it was them who said it, they always say, "Well, you know, I would never do this, but I've always been concerned that somebody could do X." And then you get a remarkable list of risks that the company has that the company never knew it had.
[00:57:17] Jordan Harbinger: This is called red teaming with hackers.
[00:57:21] Richard Clarke: It is a form of red teaming. It's a little bit different from red teaming because it's actually taking the people in the company and getting them into an atmosphere where they can analyze the risks to the company. It's more a matter of generating a risk register based on their experience. Every company has a risk register, whether they call it that or not. What I like to do is have that risk register be as thorough as possible, and then take it to the CEO and say, "Which ones of these would really bother you? Let's categorize these into high, medium, and low risk. And the ones that really are existential for the company, let's figure out how we could double and triple prevent them from happening."
[00:58:06] Jordan Harbinger: And that's different from red teaming in that it's not actually being executed.
[00:58:10] Richard Clarke: Yeah. Red teaming, at least in the cyber world, usually means you get some team outside the company who knows nothing about the company, and they see what they can find externally about the company. Particularly in cyberspace, they try to hack their way into the company. Well, guess what, they always get in. So I find that kind of red teaming not very helpful, because the report comes back to the CEO and says, "Our outside team found a way to get into your network, and they did, and they planted the flag, and here is the one way they used, and you have to fix that one way."
[00:58:44] Jordan Harbinger: Whereas your way, with the risk register, you see myriad ways, because you find people who are intimately acquainted with the systems themselves, and you probably end up with a much longer list rather than just the one random path that happened to work that time.
[00:58:59] Richard Clarke: Right. And you're using the knowledge and experience of people who really do know the company, because they work in it every day. Look for risks. Don't dismiss things out of hand because they seem unusual, because they seem outlandish, because they seem inconvenient. Have a process of looking, and then when you find potential risks, decide whether or not they need to be put under continuous monitoring or periodic surveillance, decide what the mitigation strategy, the hedging strategy, would be, and review the risks periodically to see if you have to change that strategy.
[00:59:35] Don't reject somebody because they are an outlier. Don't reject somebody because you'd really rather not have what they're warning about happen. Don't reject somebody because you want to put your head in the sand and say, "Well, if I don't listen to that, if I don't spend time on that, maybe it won't happen," because there's no correlation between your ignoring an issue and it happening or not. Realize that history is a series of surprises, and those surprises are not inevitable. There's nothing inevitable about these disasters. We can get ahead of them. We can stop some of them, and we can mitigate others. We can really reduce the frequency of catastrophe if we think systematically about it, just as we've reduced the risk of fires in buildings enormously over the last century by applying rigor in our analysis of the problem.
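[Note: To make the process Clarke just described a little more concrete, here's a minimal, hypothetical sketch of a risk register in code: risks collected from insiders, ranked by severity, each paired with a mitigation and a review cadence. The structure, field names, and example entries are all invented for illustration; nothing below comes from Clarke or the book.]

```python
# Illustrative sketch of a toy risk register: collect risks, rank them,
# attach a mitigation, and set a review schedule. All entries are invented.
from dataclasses import dataclass
from enum import Enum

class Severity(Enum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3
    EXISTENTIAL = 4  # the ones leadership wants double- and triple-prevented

@dataclass
class Risk:
    description: str        # as reported by an insider, anonymously
    severity: Severity
    mitigation: str          # prevention or hedging strategy currently in place
    review_every_days: int   # continuous monitoring vs. periodic surveillance

register = [
    Risk("A single admin account can alter payment records", Severity.EXISTENTIAL,
         "Require two-person approval for payment changes", review_every_days=30),
    Risk("Backups are stored in the same building as production", Severity.HIGH,
         "Replicate backups to an offsite location", review_every_days=90),
]

# Surface the risks leadership should see first, worst first.
for risk in sorted(register, key=lambda r: r.severity.value, reverse=True):
    print(f"[{risk.severity.name}] {risk.description} -> {risk.mitigation}")
```

In practice, of course, the value is in the conversation with insiders that produces the list, not in the data structure that stores it.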
[01:00:34] Jordan Harbinger: Richard, thank you so much. This has been really interesting. And the book Warnings is, of course, a great read for those of us who are interested in getting better information and knowing what to listen to and what to ignore.
[01:00:46] Of course, I've got some thoughts on this episode, but before I get into that, here's a sample of my interview with someone with decades of experience in protecting people at every level, from the top levels of government to victims of spousal abuse. Violence is a reality. If you're not prepared for its possibility, you'll be caught off guard by its eventuality. Learn how to hone your sixth sense for danger. Discover how to spot the red flags that signify someone's a likely abuser, con artist, or predator. Here's a bite.
[01:01:15] 16 years ago, when I was 20, I got into a taxi cab in Mexico City and it turned out to be a fake taxi. And the guy was driving me further and further away from my destination, further and further away, and my brain went through this process. It said, "No, it's probably going to be fine. I know he said he was going to ask for directions. But he's a cabbie, he should know that. No, no, no, no, no, no, but I mean, I've never been kidnapped before, so that can't be what's happening." And then I remembered some guy on Oprah in 1994 or something like that, when I was a kid sitting there with my mom, who said, "Never go to the secondary location." And I only realized a decade and a half later, when reading the book The Gift of Fear, that that was you.
[01:01:56] Gavin de Becker: Everybody with a normally functioning mind and body system does have intuition. And what we have in varying degrees is our willingness to honor it and listen to it and learn about it. It's our most extraordinary mental and physical process. The stomach lining, as an example, has a hundred million neurons, a hundred million thought cells. That's more neurons than there are in a dog's brain. When you hear the word "gut," you know, "I had a gut feeling," it's a very accurate description of what's going on. And these two brains, in the gut and in the skull, communicate with each other through the body. And so the whole mind-body system delivers intuition to you, which is knowing without knowing why, knowing without having to stop at all the letters from A to Z on the way, just getting from A to Z automatically.
[01:02:46] It doesn't really matter how a thing should be. It only matters how it is, how it is in terms of reality in this moment. And reality is the highest ground you can get to. That's the place where you can see what's coming. And I'm so glad to hear that story, and that makes my day. That means a lot to me, particularly as I'm about to hear, I hope, how well you prevailed, because I know we're here having a conversation, so you did well.
[01:03:09] Jordan Harbinger: I slid behind the driver's seat and he reached over towards the glove box. And I grabbed him and threw him back to his seat because I figured he had a knife—
[01:03:17] For more, including the most important thing we can do to cut potentially threatening people out of our lives forever, check out episode 329 with Gavin de Becker.
[01:03:28] Now, predicting disasters might seem like a bit of a downer in some ways, but really, I think there's optimism here, especially in the book, that I found as well. Finding problems is kind of like, "Weh, weh, weh," but then you find a way to solve the problem, and as you learn to spot that same problem, like you do when you read these books, it's actually quite encouraging.
[01:03:44] I had an idea when I was listening to him talk, for younger folks looking for what to do, asking me, "What should I do with my life?" Use these warnings to figure out which careers or areas of study are going to be hot in the future. Right? So it's going to be climate, genetic engineering, AI, self-driving cars. I'm sure I'm missing a few here. And then if you get skills in those areas, you are going to be right in the middle of that whole thing when it's actually of paramount interest to humanity, to everyone in the world. Unless, of course, nobody listens to you; then you'll at least be well served in knowing how to clean up the mess caused by the aforementioned industry in which you are now an expert. So either way you win there, I think, even if humanity loses.
[01:04:23] Anyway, great big thank you to Richard Clarke. It's a good read, a lot of practical stuff in there, of course. Links to all things Richard Clarke will be in the show notes. Please do use our website links over at jordanharbinger.com if you buy books from any guests. It helps support the show. Transcripts are in the show notes. I'm at @JordanHarbinger on both Twitter and Instagram, or you can also hit me on LinkedIn.
[01:04:43] I'm teaching you how to connect with great people and manage relationships using systems, software, and tiny habits. The same ones that I use every single day. jordanharbinger.com/course. I'm teaching you how to dig the well before you get thirsty. And most of the guests on the show, they subscribe and contribute to the course. So come join us, you'll be in smart company where you belong.
[01:05:02] The show is created in association with PodcastOne. My team is Jen Harbinger, Jase Sanderson, Robert Fogarty, Millie Ocampo, Ian Baird, Josh Ballard, and Gabriel Mizrahi. Remember, we rise by lifting others. The fee for this show is that you share it with friends when you find something useful or interesting. If you know someone who's in the business of heading disaster off at the pass and they need these kinds of skills, definitely share this episode with them. I hope you find something great in every episode of this show. The greatest compliment you can give us is to share the show with those you care about. In the meantime, do your best to apply what you hear on this show, so you can live what you listen to, and we'll see you next time.