Tristan Harris (@tristanharris) is a former Google design ethicist, primary subject of the acclaimed Netflix documentary The Social Dilemma, co-founder of the Center for Humane Technology, and co-host of the podcast Your Undivided Attention.
What We Discuss with Tristan Harris:
- Why we’re worth more to social engagement platforms as manipulatable slabs of predictable human behavior than as free-thinking individuals.
- How the social networks of the early 2000s so quickly turned from places where we could keep in touch with friends, family, and colleagues into disinformation amplifiers that contribute to the destabilization of democracies.
- Why your algorithm-tailored online experience so radically differs from that of your closest friends and loved ones, and why this is a problem when the public good is cast aside in the interest of keeping us engaged and enraged.
- The unintended consequences of allowing an algorithm to bring people together by what it sees as similar interests, and how this has thrown fuel on the disinformation fire.
- How attempting to outthink a social media algorithm is like trying to play chess against a computer that can look ahead and counter every move you could possibly make.
- And much more…
Like this show? Please leave us a review here — even one sentence helps! Consider including your Twitter handle so we can thank you personally!
Social engagement platforms like Facebook and Twitter started out in the early 2000s as a seemingly innocent way for us to connect and stay in touch with friends, colleagues, and family, but now they’re cited as being instrumental in the polarization of society, the destabilizing of democracy, and even genocide. Did this transformation take place like a Frankenstein’s Monster that accidentally slipped loose from the control of its creators, or was it an intentional evolution devised by well-funded technocrats looking to profit from giving us unfettered access to exactly what we want at any time without restraint, pitting us against one another to create a perpetual motion machine of rage?
On this episode, we talk to Tristan Harris, a former Google design ethicist, primary subject of the acclaimed Netflix documentary The Social Dilemma, co-founder of the Center for Humane Technology, and co-host of the podcast Your Undivided Attention. “I think it’s pretty easy to see that a society in which it’s more profitable for each person to be addicted, narcissistic, distracted, confused about reality, not knowing what’s true — that is not a society that can solve its problems,” he says. “That is not a society that can solve climate change, that is not a society that can escape pandemics, or agree on anything. And that is incompatible with the future that we want to live in.” Here, we discuss how our society got in this jam and what Tristan sees as our way out with the help of humane technology. Listen, learn, and enjoy!
Please Scroll Down for Featured Resources and Transcript!
Please note that some of the links on this page (books, movies, music, etc.) lead to affiliate programs for which The Jordan Harbinger Show receives compensation. It’s just one of the ways we keep the lights on around here. Thank you for your support!
Sign up for Six-Minute Networking — our free networking and relationship development mini course — at jordanharbinger.com/course!
This Episode Is Sponsored By:
Starbucks Tripleshot Energy is an extra-strength coffee beverage in a can with 225 mg of caffeine to keep you going, and it comes in Cafe Mocha, Caramel, French Vanilla, Zero Sugar Black, and Zero Sugar Vanilla flavors. Find it online or at your local store!
Better Help offers affordable, online counseling at your convenience. If you’re coping with depression, stress, anxiety, addiction, or any number of issues, you’re not alone. Talk with a licensed professional therapist for 10 percent off your first month at betterhelp.com/jordan!
What time is Bourbon Time? Join us in reclaiming 6-7 pm as the happiest hour to do whatever it is that makes you happy — and if that involves a glass of bourbon, remember to drink Knob Creek responsibly!
LifeLock gives you all-in-one protection for your identity, devices, and online privacy; there’s a victim every three seconds, so don’t become one of them. Save up to 25% off your first year of LifeLock at lifelock.com/jordan!
Miss the conversation we had with Hollywood leading man and musician Dennis Quaid? Catch up with episode 279: Dennis Quaid | Sharks, a Bear, and a Banjo here!
Thanks, Tristan Harris!
If you enjoyed this session with Tristan Harris, let him know by clicking on the link below and sending him a quick shout out at Twitter:
And if you want us to answer your questions on one of our upcoming weekly Feedback Friday episodes, drop us a line at firstname.lastname@example.org.
Resources from This Episode:
- Your Undivided Attention Podcast
- Center for Humane Technology
- Tristan Harris | Twitter
- Tristan Harris | Facebook
- Tristan Harris | LinkedIn
- Tristan Harris | Instagram
- The Social Dilemma | Netflix
- BJ Fogg | Tiny Habits That Change Everything | Jordan Harbinger
- 1984 by George Orwell | Amazon
- Brave New World by Aldous Huxley | Amazon
- How the Facebook Algorithm Works in 2021 and How to Work With It | Hootsuite
- How Filter Bubbles Distort Reality: Everything You Need to Know | Farnam Street Blog
- The 21st Century Skinner Box | Behavioral Scientist
- Facebook’s Global Expansion No Longer Has Its Mission Statement Standing In the Way | Quartz
- Vaccine Misinformation Has Been Spreading On Facebook and Instagram for Years | Vox
- Garry Kasparov | Deep Thinking for Disordered Times | Jordan Harbinger
- Live Die Repeat: Edge of Tomorrow | Prime Video
- How TikTok’s ‘For You’ Algorithm Works | Wired
- Tank’s Good News
- Stewardship of Global Collective Behavior | PNAS
- Kids in the US and China Have Starkly Different Career Goals | Business Insider
- The Society of the Spectacle: Annotated Edition by Guy Debord | Amazon
- How Taiwan’s Unlikely Digital Minister Hacked the Pandemic | Wired
- Estonia Already Lives Online — Why Can’t America? | The Atlantic
- A Problem Well-Stated Is Half-Solved (with Daniel Schmachtenberger) | Your Undivided Attention
- How Social Media Hacks Our Brains | Center for Humane Technology
- What a Lifetime of Playing Football Can Do to the Human Brain | Vox
- Tristan Harris: How Better Tech Could Protect Us from Distraction | TED
- Digital Democracy Is Within Reach | Your Undivided Attention
- Chris Hughes: It’s Time to Break Up Facebook | The New York Times
- Childhood 2.0 | Prime Video
- Bury the Chains: Prophets and Rebels in the Fight to Free an Empire’s Slaves by Adam Hochschild | Amazon
- Jaron Lanier | Why You Should Unplug from Social Media for Good | Jordan Harbinger
- Nina Schick | Deepfakes and the Coming Infocalypse | Jordan Harbinger
- Renee DiResta | Dismantling the Disinformation Machine | Jordan Harbinger
- Steven Hassan | Combating Cult Mind Control Part One | Jordan Harbinger
- Steven Hassan | Combating Cult Mind Control Part Two | Jordan Harbinger
- Steven Hassan | The #iGotOut Guide to Quitting QAnon | Jordan Harbinger
- Take Control | Center for Humane Technology
- Together, We Can Rebuild the System | Center for Humane Technology
- Here’s My Review of The Social Dilemma: No, Social Media Is Not “Hijacking” Your Brain | Nir Eyal
Tristan Harris | Reclaiming Our Future with Humane Technology (Episode 533)
Jordan Harbinger: Special thanks to Starbucks for sponsoring this episode of The Jordan Harbinger Show.
[00:00:04] Coming up next on The Jordan Harbinger Show.
[00:00:08] Tristan Harris: So long as a human being is worth more as the product than as a customer, then we are worth more when we are addicted, outraged, polarized, anxious, misinformed, validation seeking, and not knowing what's true, because each of those phrases (addiction, distraction, narcissism, polarization, misinformation) is a success case of a business model that was trying to get your attention.
[00:00:34] Jordan Harbinger: Welcome to the show. I'm Jordan Harbinger. On The Jordan Harbinger Show, we decode the stories, secrets, and skills of the world's most fascinating people. We have in-depth conversations with people at the top of their game, astronauts and entrepreneurs, spies, and psychologists, even the occasional Emmy-nominated comedian, rocket scientist, or war correspondent. Each episode turns our guests' wisdom into practical advice that you can use to build a deeper understanding of how the world works and become a better critical thinker.
[00:01:01] If you're new to the show, or you're looking for a handy way to tell your friends about the show, which we always appreciate, we've got the starter packs. Those are at jordanharbinger.com/start. They are collections of your favorite episodes organized by topic. That'll help new listeners get a taste of everything that we do here. Again, jordanharbinger.com/start is where you can find that or help somebody else find what they're looking for here on the show as well.
[00:01:22] Today on the show, Tristan Harris, host of the Center for Humane Technology's podcast, Your Undivided Attention. You've probably seen him on Netflix; The Social Dilemma was the documentary he made there. We were supposed to do this interview like a year ago, but here we are. He's a former Google design and technology ethicist. He studies cults and mind control, a man after my own taste, of course. He used to study some magic as a kid and studied with BJ Fogg, who was also on this show, episode 306. Magic, habits, and cults: a great primer on human behavior here.
[00:01:51] Today, we're talking about apps and social media, how they're designed to have people skyrocket to fame because it gives us those dopamine hits. But is this good for our information ecosystem? No, obviously. We spend billions on border security, but our cyber borders are wide open. Why spend on bombs when you can ignite a culture war domestically? The myth is that technology is neutral and it's all about how you use it. Design has evolved such that you have less and less choice about how to use technology because interaction is predetermined in many ways. So we think we have free choice, but we forget that somebody else is controlling the menu of choices. This didn't exist before at a scale that could beat our own brain and out-compete pretty much everything else for our attention.
[00:02:31] Thus, nowadays we have no common language with many. This is why we see stuff like QAnon and we think, wow, crazy as it gets. It's less crazy when you see the absolute fire hose of misinformation and disinformation that these people are drinking from. Don't get me wrong. It's still crazy. But you start to see how this stuff begins to creep in and surround those folks. And you often can't help but get sucked into it yourself, especially because people look to confirm new beliefs by searching out new evidence for the existing beliefs that they already have. So we're going to go over a lot of social media, what it's doing to our society, and more.
[00:03:10] George Orwell, who wrote 1984, thought that in the future, we just wouldn't be able to get any information and we wouldn't be able to get the truth. And of course, I think it was Aldous Huxley who turned out to be right, who said he was actually afraid that information and truth would simply be drowned in a sea of irrelevance. And that is exactly what happened. That's what's happening right now. I've also got a lot of thoughts and additional information about this topic in the show close. So make sure you stay tuned after the show.
[00:03:35] And if you're wondering how I managed to book all these amazing guests, I use software, systems, and tiny habits, and I'm teaching you how to do that same thing for free in our Six-Minute Networking course. Again, no payment required. And you could find that course at jordanharbinger.com/course. By the way, most of the guests on our show subscribe to the course, contribute to the course. Come join us, you'll be in smart company where you belong.
[00:03:55] Now, here's Tristan Harris.
[00:04:00] How many people have watched The Social Dilemma? I feel like everyone saw that. Do you have any idea of the stats on this now?
[00:04:06] Tristan Harris: I think we estimate that about a hundred million people saw the film. It's like a baseline estimate based on the numbers that we saw. I think 38 million households saw it in the first four weeks alone, and that was households. So that's not individuals. And based on the traffic that we saw, we estimate more than about a hundred million people saw the film. And that's probably still a low number; that's in about 190 countries and 30 languages. So it's definitely — and I think it broke all records on Netflix, maybe like the number two most popular documentary on Netflix in its history, I think.
[00:04:37] Jordan Harbinger: Congrats on that. I mean, it seems like that is the beginning of the antidote a little bit. So a lot of the people in the film, they helped build Facebook, and now you and them are, is this a little bit Dr. Frankenstein trying to kill the creation? Because I know you worked at Google and I know the other folks in the film worked on Facebook and on other elements of social media, but the whole thing has gotten wildly out of control.
[00:04:59] Tristan Harris: Well, I think the word you used first is correct, which is that it's kind of a Frankenstein. I think the point of The Social Dilemma is there isn't some easy set of fixes where Mark Zuckerberg can just go in there and tweak it and then suddenly, hey, we have Facebook as this positive force in society. It's basically run amok and is out of the control of the creators. But the people who are in the film are insiders. Justin Rosenstein, who built the like button. Tim Kendall, who brought the business model and monetization model to Facebook and was also the president of Pinterest. You know, Roger McNamee, who was a former mentor of Mark Zuckerberg when he first started the company. These are all people who came forward because they understand that the risks that are being created in society by this sort of out-of-control machine far outweigh whatever benefits that people can certainly point to in their lives that can come from it.
[00:05:46] You have to ask — you know, in the film, actually, I think I say the thing that's so challenging about this moment of evaluating technology's impact on society is it's confusing because it's simultaneous utopia and dystopia. We have some of the most amazing capacities and things that we could possibly ever celebrate. You know, the fact that just about every bit of information that we would ever want to be able to have access to is basically available, and that's incredible. And it could be the world's Library of Alexandria in educating people and creating all this positive impact in the world. But there's a net set of harms and risks that are sewn into the fabric of society by not just Facebook or social media, but the entirety of the engagement platforms. So like YouTube is an engagement platform, TikTok is an engagement platform, Snapchat is an engagement platform, because what they have in common is preying on human behavior and human attention as a commodity.
[00:06:39] So in other words, we're worth more when we're the product, as dead slabs of human behavior, than we are as free-thinking individuals who are living our lives. And so long as that's true, that's kind of the double bind that the companies are in. But I think I'm a bit off track from the question that you asked originally.
[00:06:53] Jordan Harbinger: No, it's all good because we're definitely going to get into that, like how we get monetized and things like that. And you've been sounding the alarm on social media and especially like the filter bubble stuff, which we'll talk about in a second, since like, I don't know, 2013. I feel like it's some of the first stuff that I found from you.
[00:07:08] How did things go from, "Oh, this will be a great way for friends and family to stay connected," to like...genocide in Myanmar?
[00:07:16] Tristan Harris: Yeah. It's a long, complicated history, but—
[00:07:18] Jordan Harbinger: You don't need to give them all the history, but like just kind of, I guess, we're kind of confirming that that's the direction we went. Because it seemed almost overnight. I remember having Facebook in college or at the end of college.
[00:07:28] Tristan Harris: Yeah.
[00:07:28] Jordan Harbinger: And I was like, oh cool, I can see what your friends look like. Or like what people we have in common, or like what girls I want to meet from your friends' circle. That's like what it was for and they parody that in the movie. And then, now as an adult who doesn't really use Facebook at all, I'm like, "How are people in other countries killing each other over misinformation?" And then I look at what's going on in the United States and I'm like, "Holy sh*t!" I mean, this is a short — it's a shorter path than I think most of us expected if we expected this at all.
[00:07:55] Tristan Harris: So I guess what I hear you asking is, how do we get from this sort of 2004, 2005, 2006 version of Facebook, where it's a place for people in college to see the hot girls and boys in their classes they want to know more about and stalk online, to how it was destabilizing democracies around the world, making it impossible to know what's true, probably inhibiting our progress on climate change, and driving genocides in developing countries with high social fragility. How did that change happen? So there's a sequence of changes to the design of the product, which, I want to say, I think from the very beginning there were problems with what Facebook was. I mean, from the very beginning, I think they knew that they were manipulating certain human weaknesses, and the simplest example of this I can point to: going back to 2004, 2005, they invented the photo tagging feature.
[00:08:45] Were you in college when Facebook came about or—?
[00:08:47] Jordan Harbinger: I was probably — yeah, I was an undergrad and I feel like it was only available to a handful of schools and it was called The Facebook.
[00:08:54] Tristan Harris: Yeah.
[00:08:55] Jordan Harbinger: Now it's just Facebook.
[00:08:56] Tristan Harris: Right.
[00:08:56] Jordan Harbinger: Infamously, they dropped the "The," yeah.
[00:08:58] Tristan Harris: I remember that too. So I was at Stanford at the time. And Stanford was, I think, the third school that got Facebook. And I remember the day that they rolled out the photo tagging feature. Because what was brilliant about that feature wasn't just that you could link all these photos of your friends that you were taking while you were on campus at events, and have your face visible online. As an engine for Facebook, what it meant is that every single day or every other day, you had an email come in saying, "Hey, you've been tagged in a photo by your friend, Jordan." And that is one of the most persuasive things you can throw in front of the nervous system of a human social primate, because essentially in one moment, my social validation and approval is on the line. I know that you, Jordan, tagged me in a photo. So I'm wondering what photo it is that you tagged of me, and you did it, so I want to know. Well, then all the people that you know are going to see that that happened. So how do I make sure that it's okay and how do I look and all that. So we kind of drop everything and we open that up.
[00:09:57] Now, if I'm Facebook, I just found a psychological cocktail that basically gets past your entire defenses, because you are going to stop what you were doing, your homework, your reading, and you are going to open that email right now. And you could ask, as Jeff Seibert says in the film The Social Dilemma, "Why don't they just show you the photo in the email?" It'd be a lot easier to see it without actually opening up Facebook. But that simple feature is basically just a manipulative tactic to keep you coming back — think of this as like a robotic puppet show. You have a puppet master, and it's not even a human being like Zuckerberg. It's a machine that figures out how to basically keep people coming back, manipulating their behavior over and over again.
[00:10:34] Well, let's invent a feature that not only makes it easy for you, Jordan, to tag me in a photo. But actually, what if Facebook highlighted Jordan's friends in the photos he's already taken and said, "Hey, this is Tristan's face. Do you want to tag Tristan?" So now they're actually manipulating you into even tagging me, and then when I get the email, it doesn't say, "Hey, do you want to respond to Jordan being manipulated by Facebook to tag me in this photo?" Instead it says, "Hey, Jordan tagged you in this photo." And again, this is the kind of puppet master you don't see behind the curtain.
[00:11:03] This is the machine that, from the very beginning, to go back to your original question, was manipulating human weaknesses for the purposes of driving growth and engagement. And it worked. I mean, it kept all these college students from basically doing anything productive, getting them sucked into going back into Facebook and then getting sucked into the photo slideshow. And then even there, you have another sort of manipulative technique, right? So there you are with the photo slideshow in the early days, and you would hit the space bar and it would show you the next photo. So you hit space bar, space bar, space bar. So you're tapping like a little rat, right? You don't know what's going to come next. It's like a slot machine. Your hand never has to leave its resting position. I don't know if you know the history of slot machines, but one of the big innovations was that you used to actually put your hand on a lever and then pull down the lever to actually get the slots going. You notice that if you go to Vegas now, you don't have to do that.
[00:11:44] Jordan Harbinger: No.
[00:11:44] Tristan Harris: Your hand sits in a little steady resting position and you just hit with your finger. And you basically can just repeatedly get the slot machine going. And that's the kind of thing. It's like, imagine a world where your whole computer is just a slot machine and you think you're sitting there doing something productive, checking your friend's photos on Facebook, but you're being turned into a rat that's part of a psychological experiment. So that wasn't causing a genocide in Myanmar, but that was the preconditions: how do we manipulate human weaknesses to drive up the thing that we want to happen?
[00:12:13] There's a lot of other decisions along the way. I mean, the other one was personalization of news. So before the Facebook Newsfeed, you had to click around Facebook quite a lot to kind of stalk the cute girl on campus. And then they invented this feature called Newsfeed, which would basically just show you, here's basically a list of all the changes that have occurred throughout Facebook. And that was a more efficient way for you to basically see everything, but to do that, they needed to personalize information to you. So instead of people all seeing kind of some global newsfeed of here's what's popular, it would say, "Well, what are the things that would be most likely to keep you coming back?" And so of the 2,000 things they could show you, they select 20 based on the patterns: you always click on puppy videos, or this other person always clicks on surfboard videos, or this other person always clicks on plane crashes or QAnon or something like that. And it just says, "Oh, we're going to give you more like that." And that personalization drove up what we call the rabbit holes and the filter bubbles that we're now in.
[00:13:05] And that had this effect of reinforcing some of our existing biases and worldviews. Then that set of incentives was able to be manipulated by political actors, who basically realized they would get more attention the more they showed you negative campaigning about the other side, like why you should hate the other, your tribal out-group by identity, the people who are not like you. And you would get clicks infinitely on a treadmill if you did that. So over time, this just kept going and going and going, and that's how you get to kind of the situation today.
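The feed-ranking logic Tristan describes (pick the 20 of roughly 2,000 candidate posts most likely to keep a given user clicking) can be sketched in a few lines. This is an illustrative toy, not Facebook's actual code; the topic labels and the click-count engagement proxy are invented for the example:

```python
def rank_feed(candidate_posts, click_history, k=20):
    """Return the k posts the user is most likely to engage with,
    using past clicks per topic as a crude stand-in for a trained model."""
    def predicted_engagement(post):
        return click_history.get(post["topic"], 0)
    return sorted(candidate_posts, key=predicted_engagement, reverse=True)[:k]

# A user who mostly clicks puppy videos...
history = {"puppy_videos": 40, "surfing": 3}
posts = [
    {"id": 1, "topic": "plane_crashes"},
    {"id": 2, "topic": "puppy_videos"},
    {"id": 3, "topic": "surfing"},
]
feed = rank_feed(posts, history, k=2)
# ...gets a feed of more puppy videos and surfing, and never sees plane crashes.
```

Even in the toy, the feedback loop is visible: whatever dominated yesterday's clicks dominates today's feed, which generates more of the same clicks, the mechanism behind the rabbit holes and filter bubbles discussed here.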
[00:13:32] Jordan Harbinger: So this social media filter bubble has made it so that everyone essentially lives in their own reality, right? I mean, most people are not seeing the same information because of how that algorithm works with the feed. And I'm wondering, is this why, like the other day, I picked up my wife's phone and I looked at the app she had open. I can't remember what it was. I think it was Pinterest, and I don't even normally use something like that. I don't use any social media now, other than looking at my Instagram DMs, but it was something along those lines. And I remember thinking this isn't interesting at all, and she routinely will comment on whatever I would be looking at, and she's like, "Oh, this isn't interesting. You know, what is this? Or what is that?" And it made me realize that our information is so tailored to us personally that it just doesn't even work when you're looking at someone else's. Like, if you're watching CNN, I might look at it and go, "Oh, what's this?" But if you're looking at your phone, there's almost nothing on there that I would go, "Oh, this is interesting to me also. What is that?" because it's so hyper-personalized.
[00:14:29] Tristan Harris: Well, I mean, there's a lot of different factors at play here. When you go to the gym, you exercise a new set of muscles that you haven't exercised before. So let's say you're trying to learn something new, or go start learning chemistry from scratch. Like, how does that feel to a human nervous system that's jumping into some brand new topic that hasn't been resonating before? The reward circuits are not really there. And so we more easily stay with things where we get a quick hit of the reward circuit, and that's one of the phenomena that's going on there.
[00:14:58] The other thing here is you realize that when you are scrolling a newsfeed, you have a supercomputer that's pointed at your brain. It's reverse-engineering you and has seen literally a trillion patterns of what worked on all these other human animals that are sitting in front of the same Skinner box. And it's just known that, like, okay, if you're the kind of person who's clicked on surfer videos before, do you think Facebook, with its three billion users, hasn't seen like a million people just like you who click on surf videos all day long? It knows what you look like. And it knows that for the other people within that group, it's not just surfing videos. There's other things. There's like certain photos of certain kinds of lifestyles that tend to work well at keeping your attention. There's maybe workout equipment or — I don't know exactly what the things would be, but it knows the surrounding constellation of media that would work well for that group.
[00:15:42] So I'll give you an example, in the world that we're facing now in the COVID pandemic. So, you know, Facebook is only looking for these associations that will keep people engaged. And one of the things they did in 2018 was change their mission statement from "Let's make the world more open and connected." If you remember, they changed their mission statement to "Let's bring the world closer together with Facebook groups." Do you remember this change happening?
[00:16:06] Jordan Harbinger: Sort of, yeah. I remember the push for groups, and it was more sort of almost hobby-based. I don't necessarily remember that, but I do vaguely remember the shift going from look at your stuff to look at stuff together.
[00:16:18] Tristan Harris: Yeah. There's actually a few different reasons for that. One of them is that people stopped posting as much; individuals stopped posting as much about their lives. And so you're trying to backfill your content funnels, where you need to like get people to still come back to the trough to keep eating garbage. So instead of other people's garbage, well, what if we could basically backfill it with Facebook groups content? Because that's a way of keeping people — groups have way more content flowing through them than, you know, the likelihood that you and I as individuals are going to post. If you and I burned out on Facebook, there's always some group, like an interesting group, that's going to be posting content more often. So they started elevating groups in the newsfeed. And one of the things that they did to drive up groups behavior was they actually recommended more groups for people to join. So on the right-hand sidebar, it would say here's other groups you might like.
[00:17:01] So Renee DiResta, my colleague who studies disinformation, was a new mom, and she joined one group called the Do-It-Yourself Baby Food group, like organic baby food that, you know, you can make yourself. And that's a perfectly fine use case. But again, this supercomputer pointed at your brain says, "Okay, well, what do people who tend to click on do-it-yourself, organic baby food groups tend to like? And what are the Facebook groups that tend to be really engaging for those kinds of users?" Can you guess what the number one recommendation was?
[00:17:32] Jordan Harbinger: Yes. I'm cheating because Renee has been on the show, but that's how she stumbled across the anti-vax. So yeah, do-it-yourself baby food has mothers that don't want to buy stuff because they can make it themselves or people obsessed with health and it's like, "Oh, you like health. Try this thing that is complete bullsh*t. That is F absolutely terrible for you but also seems health-related," in other words, conspiracy theories about vaccines.
[00:17:54] Tristan Harris: And this is, by the way, I think, in 2015 or so. So this is far before we were questioning, "Okay, what's the safety of mRNA vaccines for COVID, and have we done the safety analysis that's appropriate for a brand new vaccine for a brand new disease, and what are the long-term effects of the spike protein?" This is not that kind of thing, just to be really clear for listeners. So we're, before we get into some polarized debate about—
[00:18:12] Jordan Harbinger: The COVID vaccine.
[00:18:14] Tristan Harris: The COVID vaccine.
[00:18:14] Jordan Harbinger: Yeah.
[00:18:15] Tristan Harris: Let's go back to 2015 and say, for all vaccines, for people who were just skeptical of all vaccines: here's a bunch of moms who joined this one group. And then it says, "Hey, by the way, here's a thousand people, just like you, who also loved this anti-vaccine or moms-against-vaccines group." And as a new mom, you're kind of curious, like, I've kind of heard about that, you know, and I have a new kid, and the last thing I want to do is plunge a needle into the arm of my little baby. I feel like I'm endangering their life. So maybe I should join that group. So it's very persuasive. And again, Facebook is essentially an automated machine, like a robotic puppet master, just figuring out what is the most persuasive thing to our nervous system. And it just finds out that moms against vaccines is really persuasive, and it caused millions of moms to join those groups.
[00:18:55] So again, we can have a separate conversation about, okay, are vaccines safe, or what's the difference between the vaccine schedule in the US versus Denmark: is doing 70 vaccines by the age of two safe and healthy, versus doing a few and then slowing down the vaccine schedule? We could have that conversation, but that's very different than, "Are we recommending millions of new moms to go into this anti-vaccine rabbit hole?" And that's exactly what Facebook did through the effort of supposedly trying to bring the world closer together, when it changed its mission statement and focused on groups over individual users.
[00:19:25] And so, over and over again, this is a good example of the Frankenstein, right? They're making a change, a tweak to how this automated puppet master is roping each of us into these different things — whether it's me responding to that photo tag of you, or a Facebook group that was persuasive. Each time they do that, it sounds like a good idea to recommend other groups you might like, because if you're in a surfing club and it recommends a boogie-boarding club, that's another group I can join. That sounds like a pretty fine thing to recommend to someone. But across the board, it started recommending more conspiracy theory groups to people. And so Renee talks about how once you started with the first anti-vaccine group, it would then recommend flat earth and chemtrails. And then later there was a QAnon sort of contingent, and it just kept going and going and going.
[00:20:08] And then you roll back the tape and ask, over the last 10 years, why does society look so crazy? Well, Facebook and Twitter and YouTube — which all had similar effects, by the way — have become the predominant way that we construct reality and know what is true, what's going on in the world, what we care about. Especially in a COVID world, where we were stuck at home, staring through the binoculars of social media to construct an image of the world.
[00:20:32] I mean, I was talking to a friend just a few days ago who's spending time in Portland. And she said to me, "Oh my God, it's like the most amazing, beautiful city." I was like, "Oh, that's funny. Because in my mind, all I've seen of Portland over the last year is war zones and fighting and rubber bullets." And in my mind, I don't have any reason to believe it should be different than that. And it's because I'm using social media to construct my image of reality. And each of us saw very different things. So I was in Hawaii a few months ago, and I met a bunch of people there — because they're in Hawaii, there really hasn't been much COVID. And so people who knew me and knew my work actually came up to me and said, "Oh, you know, it's so interesting watching The Social Dilemma and these rabbit holes people get into, because over here, we're just so aware that it's a 'shamdemic' — the pandemic hasn't really happened. There's no COVID. And George Floyd was an actor, right?" And that's really what they told me. They thought George Floyd was an actor, and they thought the pandemic hadn't actually happened. And this was a really nice, sweet person that I met in Hawaii.
[00:21:26] So these are the kinds of things that, when you roll back the tape over the last 10 years and ask how we got to this insanity we're currently living in, I think people need to see the degree to which social media has been a primary driver of the distortions and the derangement in how we're seeing everything show up.
[00:21:43] Jordan Harbinger: That is terrifying to hear that there are people who — I mean, look, we've all heard of extremists on many sides, but it's scary when you come across one in the wild, right? Because you kind of think, I thought I was talking to a Russian bot, or to some weird basement dweller in a rural area somewhere. Not, "Your aunt thinks this." And so it gets scary, especially when — and you mentioned this before — Google, Facebook, other algorithm-driven companies are putting a billion dollars in resources into figuring out how they keep us stuck. And it's like playing chess against Garry Kasparov, right? The IBM Deep Blue — it's like 30 moves or whatever, 15 moves ahead. And I'm thinking, "Oh, I'm just talking—"
[00:22:22] Tristan Harris: We're like a million probably.
[00:22:23] Jordan Harbinger: Or a million, yeah, yeah. If they're even — yeah, it's just millions of moves ahead. And I'm going down the little rat maze that it's building for me, and I'm thinking I have free will and agency, that I'm making my own choices, because the computer that's giving me the choices is so far away from me I literally can't even see it. It's an information warfare space, so to speak, that is so vast I can't see the edges, so I don't even think that I'm inside the maze.
[00:22:46] Tristan Harris: Well, I think in any situation, part of ethics is the ethics of symmetry. What makes an interaction between two parties ethical? Let's say a used car salesman knows the car they're selling you is a lemon, and you're the buyer and you don't know that it's a lemon. What makes that interaction unethical is an asymmetry of knowledge — one party knows some information the other party doesn't, and they're participating in a transaction believing there's a fair shared knowledge of what's happening.
[00:23:12] When someone's sitting there using Facebook and they're about to scroll their finger, right? And they think, "Okay, I'm going to scroll one more time and then I'm done." And they think that Facebook is just this neutral thing that shows them their friends' birthdays and some photos and some posts their friends make, and that Facebook doesn't have any ability to manipulate them. Facebook is like the used car salesman. It has all this asymmetric knowledge — not just that the car is a lemon, but more like they know everything about your psychological weaknesses that you don't even know about yourself. And it can quickly predict the things that would work on you, even beyond your knowledge, because again, it's just seen so much. And it's funny—
[00:23:46] I was watching — have you ever seen this film, Edge of Tomorrow, with Tom Cruise?
[00:23:50] Jordan Harbinger: No, I feel — well, I don't know. Now, I don't know. Those things all blur together, but I've heard of that for sure.
[00:23:55] Tristan Harris: Yeah. Maybe it's too niche of a film to mention here, but basically it's about an alien species that keeps playing out history. And the reason it keeps winning the war against the humans is that it basically plays out every version of what happens.
[00:24:08] Jordan Harbinger: Right.
[00:24:09] Tristan Harris: And it knows what you're going to do before you do it. So in a war where one party knows what you're going to do before you do it, who wins?
[00:24:16] Like, if it could make an accurate prediction, let's say 95 percent of the time — I literally know you're going to go left here instead of right, and I've got a gun that's trained on you — who's going to win in that interaction? Well, instead of a gun, and instead of which way you're going to go, left or right, it's your prefrontal cortex, your brain's executive control function. You think you're only going to watch one more video, and you think you're going to make a free choice to stop watching the video one minute and 36 seconds in, but the computer can literally predict that you're going to stop watching at one minute and 36 seconds, and it shows you something else at that exact moment.
[00:24:47] And so when you realize the degree of asymmetry between what the machine knows about us and what we know about ourselves, I think there's a grand humility that's required about the fact that we as a species have developed the technology to reverse engineer our own free will. And this is such a deep point that I think it's so hard for people to accept, because even using this computer with you right now — we're using a podcasting app, and I'm just sitting here in a Chrome browser, and my desktop is sitting there calmly in the background — I don't feel like I'm being manipulated in this moment. Right now, with just this podcast app open, I'm not being manipulated. But if I had TikTok open on my phone and I watched one video and said, "Oh, that's kind of funny," and then scrolled to the next one, I would really experience myself as having agency. And you and I were talking about hypnosis before we started recording — in hypnosis, one of the things you're doing is playing with people's feeling of their own choice, their own sense of agency, shifting from "I'm doing something" to "something is happening to me."
[00:25:39] And I think a metaphor for that is: if you've ever been going into a building, and you put your hand on the door handle, squeeze it, and start pulling the door towards you, you're convinced that you're the one pulling the door towards you. But at that very moment, someone else was actually opening the door as well — they were pushing the door outwards, towards you. So there's this weird thing going on, because it's kind of an intersection: I'm pulling the door towards me, but someone else was opening the door towards me at the same time. And if I wasn't really paying attention, in a certain sense, they could have been doing most of the work in that interaction, and I would have had the experience that I was doing it when really they were. And I think there's that kind of blurring of the lines of agency — who's really the author of the choice? — when Facebook's designing the moment-to-moment menu. And when I say Facebook, find-and-replace that with TikTok, Instagram, et cetera.
[00:26:27] Jordan Harbinger: Sure.
[00:26:27] Tristan Harris: Defining the menu of those moment to moment experiences.
[00:26:32] Jordan Harbinger: You're listening to The Jordan Harbinger Show with our guest Tristan Harris. We'll be right back.
[00:26:37] This episode is sponsored in part by SimpliSafe. Remember the feeling you got as a kid when you were tucked into bed, or the feeling you get now in the arms of somebody you love? Safe and secure. It's a feeling of security that only comes through a human connection. And that's why the people at SimpliSafe Home Security are so important. Of course, SimpliSafe has an award-winning system with all the technology, bells, and whistles you'd expect these days. But the people at SimpliSafe really do take it to the next level. They're there around the clock, anytime you need them. Whether it's a fire, burglary, medical emergency, a burst pipe, or even a problem while you're setting up the system, SimpliSafe has a person with the expertise you need ready to help, 24/7. And when you know there's always somebody there to help, well, it's a feeling you just don't get with any old security system.
[00:27:18] Jen Harbinger: To find out how SimpliSafe can help make you feel safe and secure at home, visit simplisafe.com/jordan today to customize your system and get a free security camera. That's simplisafe.com/jordan today.
[00:27:30] Jordan Harbinger: This episode is also sponsored in part by Better Help online therapy. As we begin to see the light at the end of this COVID tunnel that we're in, a lot of people are still feeling down and emotionally out of sorts. And I understand that feeling. You might not even feel depressed or at a total loss — your life might not be crumbling down — but if you're feeling a little bit off or your relationships are suffering, that could be a sign that you should talk to somebody. Whether you're feeling anxious, struggling in your career, or having trouble sleeping, online therapy can help. Visit betterhelp.com/jordan and fill out a questionnaire that'll help Better Help assess your needs and match you with a professional licensed therapist. You can start communicating in under 48 hours via secure weekly video, phone, or even live chat sessions with your therapist, and Better Help is committed to facilitating great matches. So it's easy and free to switch it up if you just don't click. And online therapy — look, I know you're thinking, "Oh, it's not the same" — but it is convenient, it's more affordable than in-person therapy, and of course, we've got a little deal for you too.
[00:28:28] Jen Harbinger: And our listeners get 10 percent off your first month of online therapy at betterhelp.com/jordan. That's better-H-E-L-P.com/jordan.
[00:28:37] Jordan Harbinger: Now back to Tristan Harris on The Jordan Harbinger Show.
[00:28:41] It seems like technology incentives are just misaligned with the public good. One of the major takeaways from the last couple of years is that one of the major reasons our country is so fragmented is that people can't seem to agree on anything, including basic facts or reality — in part because of the staggering amount of disinformation being crammed down our throats on social media, but also in part because technology incentives are not aligned properly with the public good. We're all competing for finite resources like attention.
[00:29:11] I do it by having interesting people on this show. Others do it by generating fear. And I think you mentioned earlier, at the top of the show, that others do it by making sure you see something that terrifies you at the exact moment they know they're going to lose your attention. So social media does the fear and outrage thing, but using AI at a massive, massive scale, and doing it Edge of Tomorrow-style, with a billion data points on people who behave just like you or just like me. It's an asymmetry we can't really fight against, because we don't even necessarily know we're in the battle to begin with.
[00:29:41] Tristan Harris: Yeah. And to your point — because I think many of us can get caught in the Edge of Tomorrow infinite loop, like infinitely scrolling Facebook — we can get into the infinite loop of infinitely admiring the problem. So we get trapped in a kind of dystopia where, as you said, the incentives of these dominant technology companies make up the center of the information ecosystem for most people living in a country. You can't have the dominant information systems operating on a business model that says, "Let me just give you the next thing that's most likely to keep your attention," because that will select for exactly what deranges your society.
[00:30:19] It selects for the things that are more addicting. It selects for things that are more distracting, polarizing, validation-seeking. So we say: so long as a human being is worth more as the product than as a customer, then we are worth more when we are addicted, outraged, polarized, anxious, misinformed, validation-seeking, and not knowing what's true — because each of those states — addiction, distraction, narcissism, polarization, misinformation — is a success case of a business model that was trying to get your attention. And we don't want a world where the worst version of ourselves is the thing that's profitable, especially in a geopolitical competition with China. Because if I'm China, I'm noticing, "Okay, I'm a digital authoritarian country that's consciously using all the technologies to make a stronger, more effective digital authoritarian society." And it's well run, and it has different values than our society has. But we notice that digital open societies like the US are not saying, "Hey, let's consciously use all the tech to make stronger, healthier, better open societies that work better together." Instead, we've allowed a market process among three or four major tech companies to profit from the degradation of our democratic capacity.
[00:31:28] So now you play those two game-theoretic actors against each other — like the game of Garry Kasparov and the AI. Let's play the US and China against each other in a world where one has tech that basically operates the brain implant of its entire society in a way that dumbs it down, makes it addicted, distracted, polarized, narcissistic, misinformed. And the other society is consciously using tech to expand its economy, be more effective, consciously use AI, et cetera. We just have to do something different. I think that's what we have to get to. Let's not just say, "Let's fix social media to be slightly less bad" — pick up a few more whack-a-mole sticks, whack a few more pieces of misinformation, and call it a day, because now social media is 5 percent less toxic. That's like the head of sustainability at Exxon saying, "Hey, we've increased our rate of clean oil extraction by 5 percent."
[00:32:16] Jordan Harbinger: We only spill half the amount of oil in the ocean now. Yeah.
[00:32:19] Tristan Harris: Right, right, exactly. And by the way, that metaphor is very apt. We often talk about our work at the Center for Humane Technology — the organization that I run, which is the vehicle for our work on this topic in the world — and we talk about this problem as the climate change of culture. So what Exxon is to climate change, I think Facebook is to this climate change of culture. This entire business model is an extractive business model. It's like the Exxon of human anxiety: it pumps human anxiety and drives a profit from turning human beings into predictable behavior, and predictable behavior means sort of the seven deadly sins — the worst of us — as the business model. And so that's the thing we have to change. We can't just make Exxon slightly less bad. We need things that are like the solar, the Tesla — the new thing that makes us a stronger, healthier society, that doesn't prey on or extract from the worst of us.
[00:33:09] Jordan Harbinger: Yeah. That was something I was going to bring up later, and it seems like with more data points, we should be able to do this. Like, why is it that the worst stuff is addicting? It seems like we could optimize our devices and even social media to give us the highest possible return — the stuff that's actually good for us. You know, don't show me the most outrageous crap that's going to piss me off; show me the stuff that's going to make me educated, pump me up, make me happy. Shape the course of my day by getting me to work out, commute safely and in a relaxed way, and then pop into the office with a positive attitude. And I'm wondering, do we just not yet have the technology to, say, measure how my day went after seeing such-and-such video or having such-and-such interaction? Facebook can see when I'm writing an angry comment, but there's no data from my 3D real life coming back to the platform to say, "Hey, there are negative consequences to that," or, "When this guy consumes this, he has a great day. Let's give him more of that so he has a great day."
[00:34:02] Tristan Harris: So one of the things you're bringing up here is how much of this entire problem is due to the inability to measure when the positive thing happens as a different outcome.
[00:34:14] Jordan Harbinger: Right.
[00:34:14] Tristan Harris: We can't measure when Facebook accidentally led to you seeing that a friend happened to be in town and you messaged them — actually, frankly, this is an example where they could.
[00:34:24] Jordan Harbinger: Yeah. I feel like they could do that, right?
[00:34:25] Tristan Harris: That's actually one where they could — in fact, they could probably build a little thing. There'd be a million privacy concerns about this, but just theoretically speaking, if you wanted to lay it out: how often does someone see that someone else is visiting town, and then send that person a message on Facebook Messenger saying, "Oh, are you still in town? What dates are you here? We should meet up." They could actually measure the number of times that happened. Now, that would be a privacy violation, but you can imagine some kind of fiduciary model where there's a protected way in which that kind of scanning could happen. And then you could have Facebook literally measure how often it's leading to face-to-face connection between people who aren't even normally in the same town. So they'd literally know when they are bringing the world closer together, face-to-face. I would call that more an example of humane technology, in the sense that what we're really after in our lives is to spend our time meaningfully and in connection with the things that matter most to us, including people we care about but can't see very often.
[00:35:19] But a lot of the other things are like what you're talking about. When did YouTube lead to genuinely learning about, let's say, game theory — how game theory plays into US-China dynamics or something? You could have watched a video, but was that a video that left you with lasting comprehension, or did it just flow right through you, so you don't actually know what you watched? And now you're onto the next video, and it's a plane crash. Now you're onto a QAnon video. Mostly we're just a sieve, and this stuff is flowing right through us. So the fact is, we don't have a good way to measure: did genuine learning happen here? Is genuine connection happening? Or, you know, could Tinder measure that you met someone and it's the love of your life? Well, they can measure that you stopped using the app, but the point is, they have a lot more signals for measuring when we keep reinstating the problem. They can keep measuring when you keep swiping on photos of people or when you send messages to people. They can't really measure when you find the love of your life. They can keep measuring when people click on more videos. They can't measure when that video changed your life course because it inspired you to take a very different, bold action.
[00:36:24] There's an entire invisible corner of YouTube — not really a dark corner, more like a light corner — that is all these hopeful and inspiring videos that pump people up, right? Videos of life advice, or things that build up people's courage. YouTube could be about creating those moments in our lives if it noticed that we were feeling low self-esteem, or not courageous, and that's what we were seeking. But to do that, human beings have to know ourselves better as well. There are sort of two parts of this equation. One is: how do we better introspect on what's really behind us, behind our own screen? Behind my actions is some deeper set of feelings, anxieties, emotions. And then: how does the tech come into conversation not with our finger — the rat-pellet-clicking, Skinner-box part of us — but with the deeper part of us?
[00:37:12] And I think, in general, we have the wrong parts of ourselves in conversation. It's like when you see your parents fight: we know those two people love each other, but the wrong parts of themselves are in conversation, because they've been triggered into a different part of their identity. Right now, the AI on the screen, behind the slab of glass, is in conversation with the wrong part of us — and we're not in conversation with the right part of ourselves.
[00:37:31] Jordan Harbinger: Yeah, this is interesting. And I think it comes down, in many ways, to the lack of data, as you mentioned before — but also, we want the goals of the persuader to be aligned with the goals of the persuadee, and that seems hard. I'm trying to avoid saying it's impossible, but it's very, very difficult to reconcile. Before we attack that: a lot of people say, "Well, why is it different with social media now? Come on, I grew up with television. I grew up with radio. I'm fine. Society is not great, but it's also not Armageddon. Everyone said TV was going to cause Armageddon, just like they did with radio, and it didn't happen." Why are the mobile phone and social media, on devices that are so personal to us, so much more dangerous?
[00:38:10] Tristan Harris: There's a great new paper — I forget the institution that put it out — I think it's called "Stewarding Collective Human Behavior." They talk about the need for a new discipline; I think they call it a crisis discipline. It's basically this: in the field of engineering, if a bridge that an engineer built collapses, that's a very clear engineering failure mode. But what does it mean to fail when you rewire three billion people's attention and information flows and relationships in a way that doesn't, quote-unquote, collapse? When a bridge collapses, our human sensory input is evolutionarily tuned to recognize that as a failure. But technology that rewires three billion people's brains at a fundamental neurophysiological level — your attention, your relationships, which relationships are top of mind, your choice-making, your sense-making, your information — there's never been something that rewires the entire civilizational brain.
[00:39:10] So TV did change social relationships. It did change attention. It was a pretty profound thing, but it didn't do it in this hyper-personalized, micro-targeted way — exponential tech backed by AI, predicting your weaknesses more deeply than you know your own weaknesses. And it didn't fundamentally redefine your social relationships. So with kids, for example, there's this great — actually kind of really horrible and sad and depressing — survey of what kids aspire—
[00:39:39] Jordan Harbinger: Not great, to be clear — the opposite of great.
[00:39:42] Tristan Harris: Not great — the opposite of great. What I mean by "great" is that it's a really important and insightful survey that was done about what kids most aspire to be. If you'd asked them 20 years ago, it used to be scientist, engineer, doctor, lawyer, things like that. They ask kids, "So what do you most aspire to be?" And the number one result today is an influencer — an Instagram influencer, a TikTok influencer. I think they did the same survey in China, and the number one result was an astronaut, a doctor, a lawyer, a scientist, an engineer. Just right there — which society has any say in the future?
[00:40:12] Jordan Harbinger: Yeah.
[00:40:12] Tristan Harris: The one where everyone wants to be an influencer? Okay, so that's just one little note. The second note: is this happening by accident? Is this just the natural trend of things? Maybe this is just what kids want, and we should respect what they want. Well, TikTok and Instagram both have programs to actively cultivate the influencer lifestyle and make it as attractive as possible, because if I'm TikTok, a society filled with narcissism-seeking, validation-seeking influencers is a way more profitable society. A kid like that is way more profitable than a kid who's not interested in being an influencer. So fundamentally, it's in TikTok's interest to create an influencer society.
[00:40:53] Guy Debord, the French theorist, wrote a book in the 1960s, I think, called The Society of the Spectacle, in which the spectacle — the performance, the human performance — is a more profitable kind of society than a society that is authentic in some way. So we have inauthenticity and alienation being a more profitable way for society to organize than an authentic society. And that has all sorts of downstream effects, let alone the fact that it screws up children at a very deep level. So when you get to your question of what is different or so apocalyptic this time, I think it's pretty easy to see that a society in which it's more profitable for each person to be addicted, narcissistic, distracted, confused about reality, not knowing what's true — that is not a society that can solve its problems. That is not a society that can solve climate change. That is not a society that can escape pandemics or agree on anything. And that is incompatible with the future we want to live in.
[00:41:46] So we know already that we can't really live with these being the business models that guide the way our society works. We can have a more humane, symbiotic relationship between technology and a democracy — like what Taiwan is doing, or what Estonia has done, or what Wikipedia is, as a good example. There are ways technology can be positively in symbiosis with humanity, but it has to be done with different business models. I think that's one of the key things people don't understand about The Social Dilemma, or what they think it's about: this is not about being an anti-technology society. In fact, we don't want this to be heard as anti-technology, because that would mean being left in the stone age while China basically, consciously accelerates all its tech efforts and keeps becoming a better and better digital system.
[00:42:33] So what we really want is a new third attractor — I'm borrowing language from a mentor of mine, Daniel Schmachtenberger, whom we interviewed on our podcast. We need a new third attractor that is not digital authoritarianism — the digital closed society — and not an open, chaotic society based on tech that downgrades humans. We need a third thing: a society that is consciously using tech to make a stronger, healthier, better 21st-century open system. We either do that, or we call the American experiment over, I think.
[00:43:06] Jordan Harbinger: Yeah, it's scary and it sounds sort of apocalyptic, but I understand. It seems like the economy has shifted from spending time doing things or consuming things that deliver the best returns — working out, like you mentioned, reading, hiking, even meditation apps or whatever — to spending time on the most useless sh*t that exists online: clickbait articles, watching viral crap on YouTube that auto-loads from the sidebar while I'm stuffing Flamin' Hot Cheetos into my face hole. Why is it that only the junk-food version of online consumption seems to be addictive and exploding in terms of the level of consumption, and how does this asymmetry radicalize and polarize us instead of just keeping us stuck on FarmVille? Like, I remember when everybody was addicted to FarmVille, and people were like, "Oh, this is terrible." And I'm like, "Eh, whatever, kids play with their grandma, it's fine." Now it's like, "Oh, we could destroy democracy." That's obviously just more dangerous, and it seems to have happened somewhat recently as well. Isn't it still profitable for them to just have us mining fake cucumbers on FarmVille? Why has it become so dangerous — like, really dangerous?
[00:44:10] Tristan Harris: Well, to your question of what's different this time — you mentioned the example of FarmVille, and as addictive and manipulative as FarmVille was — and I have friends who used to work at Zynga on FarmVille, and they knew how bad it was, and they feel bad about it now, and a lot of those people are trying to reform their ways — FarmVille was, I think, the most popular game on Facebook for a very long time. But it was a game that existed the way a game exists in someone's life. It's a tiny part of your life — or it can become a bigger part — but it's a game that fits into a place in your life.
[00:44:41] Social media, though, has become the fabric of what it means to participate in a pandemic world where you're basically stuck at home. People live their lives on Twitter now. Researchers share their knowledge on Twitter. Business runs on Facebook and Instagram, et cetera — you use Facebook ads to do your primary customer demand building. If you're a politician, most of your communication is going to happen through social media. So what's key is that it would have been really bad if FarmVille had become a more and more dominant place where people spend their time — if it was addicting people and bankrupting them, and it did do some of those things, and that's bad — but it wasn't taking over the fundamental brain implant of what it meant to be you. And I think these social media platforms have taken over the brain implant of what it means to be the person that you are, and also the society that we are.
[00:45:30] I mean, if you think about it, our society is less in the physical world now than it has ever been. Most of what makes up the US economy, for example, is increasingly occurring in a digitally mediated space. Like we find out about products online. We buy them from online stores. We communicate online. National security is an online phenomenon, as we can tell now with cyber attacks and things like that. So think about what makes up national security, the economy, kids' education. How much does physical kids' education matter compared to the eight hours a day that your kids spend on TikTok? Which one's going to influence them more: the, you know, cyberbullying that occurs during an eight-hour footprint in a kid's life, or the five hours that the kid spends at school barely paying attention to their teacher?
[00:46:11] So the point here is the primacy of the digital world over the physical world and the sort of debasing of the substrate of the physical world that undergirds that digital world. I think that's a trend that we have to be very watchful of.
[00:46:25] Jordan Harbinger: This is The Jordan Harbinger Show with our guest Tristan Harris. We'll be right back.
This episode is also sponsored by Starbucks. Starbucks Tripleshot Energy is an extra-strength coffee beverage in a can. With 225 milligrams of caffeine, it's the Starbucks coffee that you love, ready to drink, so you have the energy to do the things that matter to you. Great for keeping you energized on a long road trip or getting you amped at a music festival that you've been looking forward to and can finally go to, now that everything's open. I actually looked up some reviews where people said, and I quote, "Don't drink it while cleaning, or you might not stop until your ketchup packets are labeled." Apparently, this product has some unrivaled efficiency. Offered in classic flavors like cafe mocha and caramel, and also with zero-sugar, dairy-free options like black and vanilla. This stuff is wildly popular. It is bananas. I've seen guests drinking this stuff on the show because I'm really boring, I guess. But also there's a whole lot of missing stuff in the fridge of this local store that I go to, and it's always the Starbucks Tripleshot. There's like one left in the back.
[00:47:24] Jen Harbinger: What gives you energy? Find your Starbucks Tripleshot Energy online or at your local store.
[00:47:30] Jordan Harbinger: This episode is also sponsored in part by Bourbon Time. There's no denying that the past year has had us all spending way more time at home. And because a lot of us, including myself, were working from home too, it made each day string together and feel really exhausting. We're all kind of burned out, and a lot of us have found ourselves just blurring the lines between work and rest, which took a toll on our energy. The folks at Knob Creek see this phenomenon happening. They're urging us to reclaim our evenings, beat that burnout, and take back the hour of 6:00 to 7:00 p.m. as just one hour where you let yourself do whatever makes you happiest. I go for a walk during that time. I'm doing some reading during that time. I'm resting my voice during that time. Don't call me between six and seven. I don't want to talk.
[00:48:10] Jen Harbinger: We're leaving burnout behind starting now. Join me in reclaiming 6:00 to 7:00 p.m. as the happiest hour, so you can do whatever it is that makes you happy. And if that involves a glass of bourbon, remember to drink Knob Creek responsibly.
[00:48:21] Jordan Harbinger: Knob Creek, Kentucky Straight Bourbon Whiskey, Kentucky Straight Bourbon Whiskey with natural flavors, and Straight Rye Whiskey, 45 to 60 percent alcohol by volume. Copyright 2021 Knob Creek Distilling Company, Clermont, Kentucky.
This episode is sponsored in part by LifeLock. This year has been unlike any other. We all deserve some summer fun, but be on the lookout for new travel scams designed by cybercriminals to steal your identity. Help protect yourself online by being mindful of online ads, independently verifying deals with the company, and not rushing into giving away info on suspicious websites. It's actually really important to understand how cybercrime and identity theft are affecting our lives, because every day we're putting our information out there at risk on the Internet, and in an instant, a cybercriminal could steal what's yours and harm your finances, your credit, even your relatives. That's one of the reasons that I use LifeLock, one of a few reasons. LifeLock helps detect a wide range of identity threats, like your social security number for sale on the dark web, which I have found, which was a nice fun morning. If they detect your information has been potentially compromised, they send you an alert and they'll let you know what you can do to restore it.
[00:49:25] Jen Harbinger: Join now and save up to 25 percent off your first year at lifelock.com/jordan. That's lifelock.com/jordan for 25 percent off.
[00:49:33] Jordan Harbinger: Thank you so much for listening to and supporting the show. These episodes are a pleasure to make. They do cost us a little bit, and I don't ask for anything. Well, I shouldn't say anything. I don't ask for much. Please do consider supporting those who support us, our advertisers, all those codes and URLs. We make it easy for you. They're all on one page, jordanharbinger.com/deals. All the sponsors are there. All the codes are there. Please, again, do consider supporting those who make this show possible.
[00:49:59] And don't forget, we've got worksheets for many of these episodes. If you want some of the drills and exercises talked about during the show, those are also in one easy place. That link is in the show notes at jordanharbinger.com/podcast. Now for the conclusion of my conversation with Tristan Harris.
[00:50:15] For me, it also seems like some of the most brilliant people on the planet, right? Google, for example, has engineers from Belarus and Serbia living in our rental unit. You know, these people are recruited to come to California not to cure cancer or figure out how to create green energy; they're here to figure out, in large part as a group, not as individuals, of course, how to game the human attention span. So it seems like a huge waste of human capital and potential in many ways. And to be fair, I realize Google does more than sell me ads, but they're really focused on monetizing our eyeballs and hacking our brains, and Google is by no means alone in doing this, so I don't mean to just pick on them. But it's not even just ignoring huge issues like climate change and responding to a global pandemic; now they're actually making it more difficult for us as a nation or as a species to respond to these challenges facing all of humanity. So it's bad enough to neglect the issues, but to now sort of step in between us, even by accident, and obstruct the path from problem to solution, that's just so much worse.
[00:51:08] Tristan Harris: And I think the premise of digging into this apocalyptic terrain together is that no one wants to live in this reality.
[00:51:17] Jordan Harbinger: Right.
[00:51:17] Tristan Harris: No one does. You don't, I don't, your kids don't, politicians don't. It's a road to nowhere. So the premise is that we're all on the same team. We just don't know it yet. Because right now, like you said, the smartest people in Silicon Valley, PhDs, are essentially focused on reverse engineering the human nervous system and finding psychological weaknesses to hijack. You know, there was a narrative that stood that up for a while, that said we are making the world more open and connected, enabling search, letting you broadcast yourself — YouTube, Instagram, life, et cetera. What we're seeing through now is the veneer, and we're realizing that the entire concept is just not good, and we've changed the social norms.
[00:51:59] I think for people who saw The Social Dilemma, one of the things that changed is that grandparents used to say, "Oh, I'm so proud of my kid who works at Facebook. They work on the business analytics team." That's now not a cool thing to say. So I think we've changed the social-status-making and social-norm-making fabric, so that now we all realize this is not the thing that we want. Which means that I think tech workers actually want to go to work thinking, "Wow, I work at a tech company, and here's what I'm doing to accelerate human progress on climate change." Wouldn't that be an inspiring way to come to work, knowing that every day my actions are actually about helping humanity at a global level? Only tech can reach three billion people at the same time and actually offer ways for us to globally coordinate in positive, non-rivalrous, and cooperative ways. I can imagine that would be an exciting way to go to work.
[00:52:44] We don't want to be an anti-tech society. We want to be a tech society with technology that is helping us solve our problems, that is helping us find common ground, that's helping us get longer attention spans, that's actually deepening the development of children and offering space for these other parts of our lives that don't get as much attention. Technology can do all those things. And by the way, the whole premise here is that that's the goal of all this. It's not to drown in, you know, yelling about the problem from our basements. It's to say that we can actually have both regulation and a technology environment and venture capitalists that are funding humane technology, finding that symbiosis with the elevated angels of our nature, and having that be the thing that's as profitable as possible.
[00:53:26] Jordan Harbinger: So how do we change the business model? I mean, it seems like regulation — I hate that as a default solution, so I'm looking for something else if you've got it. It sort of reminds me of when the NFL found out that concussions were causing the players to have all these health problems later in life. They realized that smashing heads together causes all these issues, and then, like one second later, they realized that the smashing of heads together is integral to their business model, and they're like, "How do we avoid smashing heads?" "We don't. So now we need to hide the problem." It seems like that's where we are with tech companies. On the other hand, you are a design ethicist, or you were a design ethicist at Google. The fact that the role even existed is somewhat heartening, right? Like that's maybe good news at some level, that they hired someone to even think about these problems.
[00:54:15] Tristan Harris: I think what you're getting at is the recognition of how we profit from the problem currently. Problems that we obviously want to go away are directly tied into the business model, like you were saying with the football and concussions example: the business model of football is selling concussions on TV, against advertising. Once we discover that, we can't just immediately drop the concussion part of the DNA of the sport; it's just too baked into the sport. That's true of football, but there are different ways of designing technology. Apple could radically change, overnight, the way that our phones work and are incentivized. What I mean by that is, instead of having an App Store where apps are competing for downloads, for use, for payments, for engagement, et cetera, they could say, "We're actually getting rid of apps, and instead we're offering these things called helpers," where helpers are basically things that are competing to help us improve our lives in different ways.
[00:55:07] What I mean by that is, like, right now a meditation app is sitting in an app store along with every other app that's trying to steal your attention. So it's fighting against the wrong kinds of things. It's like broccoli competing against every kind of industrial agriculture, every salt-, sugar-, and fat-maximizing thing. You wouldn't put broccoli right next to those things, because it would just lose in that choice architecture. But imagine we invent that kind of Whole Foods reorientation, where the whole store becomes about what would make us healthy. I mean, I know Whole Foods isn't perfect, but that's kind of the direction that we would need.
[00:55:41] Apple could be changing the way that notifications work, the way that home screens work. I mean, they're doing this already in the new version of their OS. They're actually taking an idea from a TED Talk I gave in 2013 of bi-directional do not disturb. It's making it easier, and making suggestions about when we could say, "Hey, I want to basically not be bothered right now, or take a break from my phone right now," and have it automatically signal that to other people. So now when you try to message me, it'll say, "Oh, by the way, before you hit send on this message, you should know that Tristan is away at work right now," or something like that. So now it creates social signaling, not just that I'm doing do not disturb, which I've been able to do for many years, but social signaling and new social norms. And technology could be instrumenting culture in that way, in a more humane way that makes more room for these boundaries, for how we want to live our lives. On the democracy front and the sense-making and information front, that's a harder challenge.
[00:56:36] I think we need to look to examples like Taiwan. I would really encourage your listeners to listen to a great interview we did on our podcast, Your Undivided Attention, with the digital minister of Taiwan, Audrey Tang, who basically—
[00:56:49] Jordan Harbinger: Brilliant.
[00:56:50] Tristan Harris: —strengthened democracy, used tech plus democracy to make a stronger democracy. And she has just a thousand examples. And this is against the threat of Chinese disinformation. I mean, Taiwan is basically the first target of China's sort of growth and the next phase of its development. And so even under the threat of China's constant disinformation engine and machine, how does Taiwan use technology to make a stronger democracy? Right now in the West, I think we have a vision gap. We think tech plus democracy equals, like, "Oh, we've seen that experiment. It's horrible. It's led to this horrible world. We saw The Social Dilemma. That's not the world we want to live in." I think we have a huge vision gap. If we don't believe that there is a future other than the one that we're heading towards, then that's also a problem. We'll sort of soak ourselves in despair. And that is why I think it's so important people immerse themselves in examples of this kind of work.
[00:57:39] So typically, if you talk about Silicon Valley and you go to other places around the world, you go to Honduras, you go to Nigeria or something like that, people say, "Wow, that's amazing that Silicon Valley works that way, but we could never do that in Nigeria or Honduras." But with this Taiwan example, when you hear about it, and if you listen to the episode, what's funny is I think people say, "That's amazing, but we could never do what Taiwan's doing in Silicon Valley." And it's like, well, hold on a second. We should be able to say we can do these things, but we have to be willing to imagine a different kind of future, and imagine we can escape these bad business models.
[00:58:11] Jordan Harbinger: I want to be respectful of your time. Are you optimistic about our ability to resist this? You seem to be, which is great, but it's really easy nowadays to say, "We're screwed. No one's ever going to get a rein on these tech companies. They're never going to be aligned with us. We're going to turn into human behavior batteries that just program their AI and then die a slow painful death." Right? But you seem, at least so far, to be pretty optimistic that people are figuring this out. You're figuring this out. Taiwan's figuring this out. We can figure this out as a nation or as a society, and we can put this tech to work for us. And then in 50 years, we'll be looking back like, "Man, remember when we let that whole thing get wildly out of control?" "Yeah, that was when I was in college. What a clusterf*ck that was. Now look at it." You know, it seems like you're sort of thinking that's where we're headed.
[00:58:57] Tristan Harris: I'm very pragmatic, and I think that people are right to feel the level of doubt or skepticism that they feel. And I felt it for eight years. I have been working nonstop on just this one topic. I wish we had other things we could work on in our lives, but I have to say, I've also never seen more momentum than I've seen in the last two, three years especially. You know, we live in a world where The Social Dilemma has been seen by more than a hundred million people in 190 countries and 30 languages. I heard that The Social Dilemma is required viewing in parts of the Pentagon. You know, I think we live in a world where US and China great power competition dynamics are getting clearer and clearer to people. And it's very clear that you can't have a West that out-competes China if the West is basically allowing technology to make democracies totally dysfunctional against an opponent who's using technology to make a stronger, more effective autocracy.
[00:59:48] So I think that, you know, we see regulators that are stepping up. We see attorneys general that are making public statements. We see the co-founder of Facebook, Chris Hughes, who I met, you know, years before he had kind of changed his opinion, writing a public op-ed in The New York Times saying it's time to break up Facebook. I've seen more progress and more momentum on these issues in the last few years, which I don't want listeners of your show to take as a reason to step back and say, "I'm so glad this is handled. It's going to heal itself on its own."
[01:00:15] Jordan Harbinger: Right. All set.
[01:00:16] Tristan Harris: It's not going to heal itself on its own.
[01:00:18] Jordan Harbinger: Yeah.
[01:00:18] Tristan Harris: We need every single one of us to be talking about these issues, to be referring our friends who don't believe this to things like The Social Dilemma, or another film, Childhood 2.0, which compares childhoods: children now, to 40-year-olds talking about their childhoods, to 90-year-olds talking about their childhoods, and does that compare and contrast.
[01:00:37] Jordan Harbinger: Wait, is that a documentary?
[01:00:38] Tristan Harris: Yeah, Childhood 2.0 is a documentary, yeah.
[01:00:40] Jordan Harbinger: Is it on Netflix? Where is it?
[01:00:42] Tristan Harris: It's actually on YouTube, I believe.
[01:00:43] Jordan Harbinger: Oh, okay, Childhood 2.0. We're going to link to that in the show notes, but I'm going to watch it tonight.
[01:00:49] Tristan Harris: It's out there. I believe they just made The Social Dilemma available to educators for free, and you can actually just register for an educator link, so anybody who knows people at schools can pass it along. We've seen that The Social Dilemma has already become required viewing in most high schools that, you know, we've heard of, and the LA Unified School District is doing various experiments and trials with, you know, 600,000 students. There are just so many things that are happening, and obviously we all wish it was going about a thousand times faster. But I just want to say that I think more people are realizing that the current road we're walking is not sustainable, and something very dramatic needs to change. We need to get there faster, so we need everyone's help, speaking about these issues at the top of their lungs. And it's going to take a miracle, but something unprecedented has happened before in our history, whether it's things like the Manhattan Project or the Civil Rights Movement. A favorite book of mine is Bury the Chains, which is about the British Empire's abandoning of slavery and how it took a hundred years of advocacy with the Quakers and these networks of people who exposed these cruel ways of treating other human beings over and over again. And we eventually did something that felt, you know, impossible. And obviously we still live with the legacies of all that today, but studying the history of unprecedented human social change is a really important discipline that I certainly wish I was spending more time on, because I think that's where we are. We're looking for the kinds of changes that are unprecedented and dramatic, but thoughtful in terms of what we have to address.
[01:02:15] Jordan Harbinger: Tristan, thank you so much, man. I really appreciate your time and your expertise. The movie's great. We'll link to that in the show notes for people who haven't seen The Social Dilemma, but thank you for your work. I think it's important, and I think now you're right, people are starting to wake up to the fact that, "Hey, this stuff I see on the news is not necessarily real. This stuff I see on social media is designed to piss me off and keep me engaged. And maybe this is bad for our society." I think even the slowest, most technophobic people, at least those who are still on social media, are finally starting to realize this. And as a parent, I think it's up to us to make sure that our kids don't inherit this particular dysfunction of our society. So thank you once again.
[01:02:54] Tristan Harris: Thanks so much, Jordan. And on that thing you just mentioned about being a parent: one of the things that's been most inspiring to me is that there have been very high-level and influential folks, policymakers and others, whose children saw The Social Dilemma and sent it to them. The number of times I hear that story makes me more hopeful that the next generation is also tuning in to some of these things. I'm right there with you. So thank you so much for having me.
[01:03:22] Jordan Harbinger: I've got some thoughts on this episode, but before I get into that, I wanted to give you a preview of my conversation with the legendary Dennis Quaid. We got into rejection, both in Hollywood and outside, and how he brings his characters to life on screen. This is really a fun episode. I think you're going to dig it.
[01:03:38] Dennis Quaid: I didn't know at the time, if I wanted to be an actor. That was back during the time where I wanted to be a veterinarian or a forest ranger, forest ranger.
[01:03:48] Jordan Harbinger: You'd be fighting fires right now.
[01:03:49] Dennis Quaid: Yes, I would. Matter of fact, I've been evacuated from my house right now.
[01:03:51] Jordan Harbinger: Are you really? I saw the smoke when I flew in this morning. You know our flight originally was canceled and I was like, "You got to get me to LA. I got Dennis Quaid coming here and can't stand him up for this bullsh*t fire."
[01:04:02] You use a lot of different accents in many of your films. I'm curious how you learn and practice those.
[01:04:06] Dennis Quaid: My brother and I grew up doing impersonations, like Ed Sullivan and John Wayne, and everybody that was around us. So I picked up on accents, badly even. You know, like in India, I will be talking—
[01:04:19] Jordan Harbinger: Oh man, are you the guy that hears one on TV and then spends the rest of the week annoying everybody in the house?
[01:04:25] Dennis Quaid: I prepared in secret.
[01:04:26] Jordan Harbinger: So like in the shower going, "One more, Jimmy! One more!"
[01:04:32] Dennis Quaid: "I can't give her the gold, Captain!"
[01:04:35] Jordan Harbinger: That's awesome. That's definitely good. There's a reason you get paid the big bucks for these and I don't. That's for sure.
[01:04:41] I know music's a big part of your life. You wrote a few songs for three of your films, been in a band for like 20 years.
[01:04:47] Dennis Quaid: Same guys.
[01:04:47] Jordan Harbinger: Same guys.
[01:04:48] Dennis Quaid: For 19 years, this Halloween.
[01:04:52] Jordan Harbinger: Happy bandiversary.
[01:04:54] Dennis Quaid: Wow, that's really good.
[01:04:56] Jordan Harbinger: You can steal that. I definitely think I just made that up just now.
[01:04:59] Dennis Quaid: Really?
[01:04:59] Jordan Harbinger: Yeah.
[01:05:00] Dennis Quaid: I've never heard.
[01:05:00] Jordan Harbinger: I've also never heard.
[01:05:01] Dennis Quaid: Wow, it just came out.
[01:05:03] Jordan Harbinger: Yeah.
[01:05:03] Dennis Quaid: See what happens when you relax?
[01:05:05] Jordan Harbinger: Is it true that you play with your band in bare feet?
[01:05:08] Dennis Quaid: Yes. When we first started out, it was like The Beastie Boys: they don't wear shirts, I won't wear shoes.
[01:05:15] Jordan Harbinger: For more with Dennis Quaid, including how he uses fear to stay motivated, check out episode 279 right here on The Jordan Harbinger Show.
[01:05:25] This was a really interesting episode as is The Social Dilemma, the documentary on Netflix. You should definitely check that out if you haven't yet.
[01:05:32] I assume that Tristan has scared off a lot of prospective employers. Like, imagine Snapchat wants to hire a guy who's shouting from the rooftops about how social media is bad for us. I don't know. The algorithm, we've noticed, gets more extreme with what it shows when you go down those YouTube rabbit holes, right? It's kind of like action movies that have bigger and bigger explosions and crazier fights until the final showdown, because that's what keeps our attention. It just keeps one-upping us. So we search for info on the war in Afghanistan, and we end up with a Holocaust denial video after like eight steps, or sometimes even less.
[01:06:05] Jaron Lanier talked a lot about this on our show ages ago, that's episode 156. He talked about the filter bubble, how it's constructed, and what we can do about it. So that's another recommendation for you. Right now, it seems like Facebook is the runaway AI, right? This hypothetical machine that we build, where we tell an AI to make paper clips, and it does so until it turns the entire planet and all of the people in it into paper clips. It's just grinding our bones for paper clips. There are no humans anymore, right? This AI has just ground every resource into paper clips. Facebook, social media. This is starting to look a little bit like the runaway AI, and misinformation will only get worse until we do something.
[01:06:44] Deepfakes come to mind, right? Those videos that you see online where it's Tom Cruise showing you a magic trick, except it's totally not him. We talked about deepfakes in depth with Nina Schick; episode 486 is where that is. That is a new frontier, just some terrifying media. You won't be able to believe what you hear. You won't be able to believe what you see. Do we just eventually realize that our own eyes, our own ears, and our own minds are no longer adequate for making sense of the world? That is terrifying. I'd love to know that heads of state and heads of our defense and intelligence services are clued into this and glued onto this, hopefully, and are ideally asking how this might be combated in some form. Furthermore, conspiracy thinking is fueled by a lot of this, and once people believe in one conspiracy theory, it opens up the floodgates to so much more. Kids are coming into schools arguing that the earth is flat or that the Holocaust didn't happen. This is super dangerous. It is so dangerous to see our kids being diseducated at such a young age. It's very hard to undo that kind of damage.
[01:07:44] China and Russia are using these services, these services that we use right here in the United States, the ones they ban in their own countries, to target the divisions within our country. We talked about this with Renee DiResta as well; in episode 420, we do a full breakdown of this. So as you can see, I've been around this topic and adjacent to this topic for a long time now. And if you think it's bad in the United States, consider this: Facebook, just as an example (they are no more or less guilty than anyone else), doesn't have content moderators in the hundreds of dialects in all of the countries on earth.
[01:08:16] Yes, we have it in English. That's the most developed part of the platform when it comes to moderation and that moderation by all accounts is pretty crappy. So just imagine what it's like in Ethiopia, which is a hotbed right now, and very at-risk for civil conflict. What's the moderation like there, especially in some of those rural tribal languages? It's non-existent and yet those platforms can be used to sow division and start armed violent conflict between the peoples who live in that region.
[01:08:42] The deprogramming sounds a lot like the cult deprogramming that I talked about at length with Steven Hassan, who's been on the show several times, episodes 238 and 471. The cult deprogramming stuff is no joke. Just remember: if we're addicted to likes, we're more valuable. If we're insecure and checking incessantly on engagement, we're more valuable. If we're rage-tweeting or commenting on something 40 times because we're in an insane argument with a stranger on the internet, we are more valuable. The worst aspects of us, our base instincts as humans, are more valuable and more profitable to these platforms. So it behooves these platforms to literally program us to do all of these things at the expense of our sanity and our mental health.
[01:09:28] Furthermore, this is a tech arms race, right? If one platform does sketchy stuff with notifications or the algorithm or the feed, then everyone else has to follow suit, or they'll lose those impressions and that time on site to competitors, which is, of course, untenable in business. So it becomes a race to the bottom. Contrast this with other tools that we use, tools that don't have an agenda of what they want from me. Microsoft Word doesn't have an agenda. My squat rack doesn't have an agenda. Social media has an agenda. So stop following outrage media. That means a lot of those lefty-righty sides that get you all worked up: just unfollow. Those influencers that post nonsense stuff to get you alarmed: unfollow.
[01:10:10] And if you don't believe me, you want to see how this looks, how this is all tailored to you to get a rise out of you, watch YouTube or look at Facebook on somebody else's phone, see how their reality looks compared to yours. I guarantee you, it won't be nearly as interesting to you personally. Try it. You'll be amazed at how tailored these things are to you. It knows what gets a rise out of you and what keeps you sucked in. And they will lean into that without mercy.
[01:10:35] Again, I highly recommend watching Tristan's documentary, The Social Dilemma. It's available on Netflix. Also speaking of YouTube, you can watch Childhood 2.0 on YouTube. Also has Tristan in it. Very interesting. It shows different generations of people and what their childhood was like. And I'll leave it to you to see exactly how our current generation of kids is dealing with this influx of social media compared to the older generations. And last but not least check out the Center for Humane Technology's podcast, Your Undivided Attention, starring the one and only Tristan Harris.
[01:11:06] And of course, thank you to Tristan. Links to all of his stuff, and all the stuff I mentioned, are in the show notes. Please use our website links if you buy books from any guests; it helps support the show. Worksheets for episodes are in the show notes. Transcripts are in the show notes. Videos of interviews are on our YouTube channel at jordanharbinger.com/youtube. You can go watch a video of this, and then eight steps later, you can be watching a video of why Mao's Great Leap Forward never happened, why Taiwan is a part of China, and if you don't agree, then you're racist somehow, and also how the Holocaust didn't happen. That's what you get. Hopefully, you can just stick to our channel. None of that nonsense there. We also have a brand new clips channel: cuts that didn't make it to the show, highlights from the interviews you just can't see anywhere else. jordanharbinger.com/clips is where you can find it. I'm @JordanHarbinger on both Twitter and Instagram. You can also hit me on LinkedIn. I always enjoy interacting with y'all online.
[01:11:54] And of course, I'm teaching you how to connect with other great people and manage relationships using the same system, software, and tiny habits that I use. That's our Six-Minute Networking course, which is free over at jordanharbinger.com/course. Dig that well before you get thirsty. Most of the guests on the show subscribe to the course. Come join us, you'll be in smart company where you belong.
[01:12:14] This show is created in association with PodcastOne. My team is Jen Harbinger, Jase Sanderson, Robert Fogarty, Millie Ocampo, Ian Baird, Josh Ballard, and Gabriel Mizrahi. Remember, we rise by lifting others. The fee for the show is that you share it with friends when you find something useful or interesting. If you know somebody who is ringing the bell about social media and why it's bad, or somebody who doesn't really get it, maybe share this episode with them. Let them know the real deal. Hopefully, you find something great in every episode of this show, so please share the show with those you care about. In the meantime, do your best to apply what you hear on this show, so you can live what you listen, and we'll see you next time.