Outrage Machine author Tobias Rose-Stockwell explains how social media fuels our negative emotions and disrupts society — and what we can do about it.
What We Discuss with Tobias Rose-Stockwell:
- How social media inadvertently created what Tobias Rose-Stockwell calls an “outrage machine” that fuels our negative emotions and disrupts society.
- Because it generates more attention (which translates into more money), negative news is more widely shared and discussed than good news, which makes it seem like the world is worse than it actually is.
- The more information we produce and share online, the more powerful the algorithms and AI designed to keep us engaged — and outraged — become.
- The longer people spend on social media, the more likely they are to be politically extreme — and elements on every inch of the political spectrum manipulate this to their advantage.
- The steps we can take to guard against the outrage machine’s control over our emotions and distortion of our reality.
- And much more…
Like this show? Please leave us a review here — even one sentence helps! Consider including your Twitter handle so we can thank you personally!
Outrage Machine author Tobias Rose-Stockwell joins us for a deep dive into how social media has revolutionized our consumption and interpretation of information, often leading to the rampant spread of misinformation and manipulation of moral emotions. Here, we draw stark comparisons between our technologically dependent society and The Matrix. Delving into the role of AI in belief formation, he cautions against the potential use of AI in fueling outrage and polarizing opinions.
The conversation further explores the transformation of modern media under profit-motivated premises, and how it steers public opinion, with potential detrimental effects on democracy. Discussing the theory of semantic drift and the concept of coordination traps, Tobias provides insightful recommendations on becoming better media consumers and adopting a ‘detective’ approach to conversations for mutual understanding and productive dialogue. Listen, learn, and enjoy!
Please Scroll Down for Featured Resources and Transcript!
Please note that some links on this page (books, movies, music, etc.) lead to affiliate programs for which The Jordan Harbinger Show receives compensation. It’s just one of the ways we keep the lights on around here. We appreciate your support!
Sign up for Six-Minute Networking — our free networking and relationship development mini-course — at jordanharbinger.com/course!
This Episode Is Sponsored By:
- ZipRecruiter: Learn more at ziprecruiter.com/jordan
- Progressive: Get a free online quote at progressive.com
- BetterHelp: Get 10% off your first month at betterhelp.com/jordan
- McDonald’s: Share your McDonald’s crew story at mcdonalds.com
- Ten Percent Happier: Listen here or wherever you find fine podcasts!
At a time when the United States seems to be increasingly disunited by political polarization and calls for violence, is it reasonable to wonder if we’re on the cusp of a civil war? Listen to episode 718: Barbara F. Walter | How Civil Wars Start (And How to Stop Them) to find out!
Thanks, Tobias Rose-Stockwell!
If you enjoyed this session with Tobias Rose-Stockwell, let him know by clicking on the link below and sending him a quick shout out at Twitter:
Click here to thank Tobias Rose-Stockwell at Twitter!
Click here to let Jordan know about your number one takeaway from this episode!
And if you want us to answer your questions on one of our upcoming weekly Feedback Friday episodes, drop us a line at friday@jordanharbinger.com.
Resources from This Episode:
- Outrage Machine: How Tech Amplifies Discontent, Disrupts Democracy — And What We Can Do About It by Tobias Rose-Stockwell | Amazon
- Tobias Rose-Stockwell | Website
- Tobias Rose-Stockwell | Twitter
- Tobias Rose-Stockwell | Instagram
- Reconstructing Our Attention in the Era of Infinite Digital Rabbit Holes | Literary Hub
- Ask These Four Essential Questions to Break the “Outrage Machine” | Big Think
- How to Stop Misinformation’s Spread Before It’s Even Shared | WIRED
- Social Media Is Warping Democracy | The Atlantic
- Tristan Harris | Reclaiming Our Future with Humane Technology | Jordan Harbinger
- Nir Eyal | Control Your Attention and Choose Your Life
- Nir Eyal | How to Manage Distraction in a Digital Age | Jordan Harbinger
- Jonathan Haidt | The Danger of Good Intentions and Safe Spaces | Jordan Harbinger
- Tim Urban | What’s Our Problem (And How Do We Solve It)? | Jordan Harbinger
- Jaron Lanier | Why You Should Unplug from Social Media for Good | Jordan Harbinger
908: Tobias Rose-Stockwell | Dismantling the Outrage Machine
[00:00:00] Jordan Harbinger: Coming up next on The Jordan Harbinger Show.
[00:00:02] Tobias Rose-Stockwell: Basically, the world we're living in right now, in a few years, will likely be one in which humans are closer to feed animals for algorithms, and that we are plugged into machines that are just built for extraction, extracting our preferences and information to try to feed these algorithms more and more effectively. And we're already seeing that right now, right? We're already beginning to resemble these strange feed animals that are stuck in these algorithmic channels of extraction.
[00:00:35] Jordan Harbinger: Welcome to the show. I'm Jordan Harbinger. On The Jordan Harbinger Show, we decode the stories, secrets and skills of the world's most fascinating people and turn their wisdom into practical advice that you can use to impact your own life and those around you. Our mission is to help you become a better informed, more critical thinker through long-form conversations with a variety of amazing folks from spies to CEOs, athletes, authors, thinkers, performers, even the occasional arms trafficker, former jihadi, national security advisor, or extreme athlete.
[00:01:04] And if you're new to the show, or you want to tell your friends about the show, I suggest our episode starter packs as a place to begin. These are collections of our favorite episodes on persuasion, negotiations, psychology, geopolitics, disinformation and cyber warfare, crime and cults, and more. That'll help new listeners get a taste of everything we do here on the show. Just visit jordanharbinger.com/start or search for us in your Spotify app to get started.
[00:01:26] Today on the show, Tobias Rose-Stockwell who tells us that social media is designed to make us angry. No big surprise there. It wasn't the original intent, of course, but that's where we are now. In this episode, we'll discover how we got here, and what social media is doing to our ability to think critically or think for ourselves at all. We'll also explore disinformation, viral news, clickbait, emotional contagion, and more with a couple of hopefully worthwhile tangents about our time in Cambodia as well.
[00:01:53] All right, here we go with Tobias Rose-Stockwell.
[00:02:00] All right, I'd love to start with the human superpower analogy because I'd never heard anything explained like this. Did you make this up by the way? Because I thought that was brilliant.
[00:02:09] Tobias Rose-Stockwell: Yeah.
[00:02:09] Jordan Harbinger: Yeah.
[00:02:10] Tobias Rose-Stockwell: The analogy I use is, imagine if one day all of the planet was blanketed by a cosmic blast of radiation. Everyone all over the earth suddenly gets a new superpower, but it's the same new superpower. And that superpower, I start the book with this, which is the superpower of invisibility. So anyone can just instantly snap their fingers and turn themselves invisible. It's a beautiful, amazing thing. People are incredibly surprised and delighted by this, introverts rejoice, it's this beautiful opportunity to mess with your friends, and everyone now is basically a superhero, it's this incredible thing.
[00:02:47] So, that morning, everyone's in awe, but by the evening, basically the world devolves into chaos. There is this looting of malls and banks, society begins to break down because this new superpower has been given to everyone. Eventually, society starts to adapt, but there's this very dark and chaotic period in which a huge portion of the world is just in terror of what they can and cannot see, right?
[00:03:11] You could have someone creeping on you in your bedroom, someone might be stealing something from you without you knowing it. There's this huge governmental and social effort to try to suddenly take care of all these horrible actors that are wreaking havoc on the system at large. So you start to see heat and infrared cameras embedded in cell phones. There's laws that are passed to keep people from using invisibility in certain places. Certain religions ban it outright. But other portions of society actually start to glom onto this thing and use it. Some people have whole relationships with each other because they want their personality to be on display and not the distraction of their physical bodies.
[00:03:45] So I play out this analogy just to kind of show that there is this strange thing that happens when we are given new superpowers. They cause tremendous havoc upon us, upon society writ large, until society adapts to them. So in the first chapter of the book, I go through this kind of analogy of us exploring these superpowers, and it takes a number of years for society to actually emerge from this period of chaos after they are given superpowers.
[00:04:09] And this is basically an analogy for what happens when we are given new technology, right? We all get a new superpower. When we got smartphones, we got an entire new upgrade to the way we communicate, which is akin to telepathy, which I also explore as a teaser in the book, which is we instantly share what we're thinking with everyone on Earth. If we say something the right way, it will expand outward and be this cacophonous, instant point of attention that everyone will pay attention to. But it also causes tremendous chaos in the process, right? All sorts of people start saying things that are false. Huge portions of the planet begin getting confused about what is real, what isn't real.
[00:04:44] And that is, of course, an analogy for social media and what we have now with these tools. And technology has been giving us these new superpowers year over year. And we want the superpowers, right? We ask for them. But the rate at which we are given these new superpowers, we're basically being blasted with cosmic radiation on a monthly basis now, getting new superpowers that are actually changing the organization and functioning of society writ large.
[00:05:08] And there's this period in between, once we get the superpower, until we figure out how to use it. You might call it a dark time, or a dark valley, in which all of society is in chaos, trying to figure out how to actually manage and adapt to this new power that everyone has. So that is, I think, a really important piece of this strange world that we're in now: we're just constantly being upgraded with new, incredible powers. We're basically all superheroes now.
[00:05:32] Jordan Harbinger: Yeah.
[00:05:33] Tobias Rose-Stockwell: We don't think about it, but we're all superheroes on a regular basis. We're getting more and more powers. It doesn't feel like we're in capes or anything, but we are basically just upgrading our abilities because of tech. And each year the risks are getting kind of more and more extreme as we're blasted with these new powers.
[00:05:47] Jordan Harbinger: You give a really good analogy, another one, in the book about the pre-valley, dark valley technology thing. You talk about this paved road in Cambodia. Can you go through that as well? I hate to just be like, give this other thing, but these are really good. And I think they really helped my understanding of what you mean by technology's dark valley.
[00:06:04] Tobias Rose-Stockwell: Totally. When I was much younger, I lived and worked in Cambodia and Southeast Asia. It's a small, very poor country between Thailand and Vietnam.
[00:06:11] Jordan Harbinger: People should know about Cambodia. Do you talk to a lot of people who are like, never heard of it? Because you're indicating that maybe people have never heard of Cambodia. Am I overestimating people's geography skills?
[00:06:21] Tobias Rose-Stockwell: Great question. I think that's a small enough country that a lot of people don't know about it.
[00:06:23] Jordan Harbinger: Huh.
[00:06:23] Tobias Rose-Stockwell: It is an incredible story about what happened there, which we can maybe touch on a little later. Basically, the country went through a pretty terrible civil war in 1975, and the whole country was in very dire shape up until even the early 2000s, with a lot of guerrilla violence, a very broken place in a lot of ways. Beautiful country, amazing people, incredible culture, but average income for farmers living in the countryside is less than two dollars per day, just a very poor country, one of the poorest countries in Southeast Asia.
[00:06:50] Jordan Harbinger: Yeah.
[00:06:50] Tobias Rose-Stockwell: For sure.
[00:06:50] Jordan Harbinger: I don't need to tell you, but it's wild. I've talked about this on the show very little, but I went there. I ended up staying in a hotel just to get the flavor of how wild this place is. This is in Phnom Penh, probably, let's say 2002 or something like that. Maybe around then. There's a guy with like an AK-47 who guards the hotel. And it's this weirdly fancy hotel, but the first floor is just closed. And you just walk upstairs to your room. There's no front desk like there normally would be; the lights are just off and there's like this little barricade. Like, eh, we don't do anything with this. And the guy's outside drunk with this AK-47.
[00:07:23] And at night, if you're late enough, he's like, "Give me some money!" And you're like, "You have a gun, but also you kind of, you're supposed to be protecting me, but you're trying to rob me right now in this weird, semi-intimidating way." And I met these girls there, because I couldn't turn on my shower, and they showed me how to turn on the shower. They were local girls that were also staying in the hotel. And I ended up hanging out with them.
[00:07:41] There was an American guy that one of them was married to, and he was on the run from the Cambodian police. They lived in another town. He had apparently shot at the cops, because he was also a classy piece of work. But he was a hitman that was on the run in Cambodia, lives there, and does his crazy job elsewhere in Southeast Asia. And we're hanging out, as one does, and I told them I was hungry, and they went and they got me a paper bag full of roasted tarantulas.
[00:08:09] Tobias Rose-Stockwell: Yep.
[00:08:09] Jordan Harbinger: And I ate the whole thing.
[00:08:11] Tobias Rose-Stockwell: A great treat. A great treat.
[00:08:12] Jordan Harbinger: They were actually really good. And then after I ate them, I was like, how sick am I going to get from these? Totally fine, by the way. Completely fine.
[00:08:19] Tobias Rose-Stockwell: Very nutritious. They're very nutritious.
[00:08:20] Jordan Harbinger: I bet. I'm like, there's probably a crap load of protein in these things.
[00:08:24] Tobias Rose-Stockwell: Yeah, definitely. Definitely, bags of crickets and—
[00:08:26] Jordan Harbinger: Yeah.
[00:08:27] Tobias Rose-Stockwell: —and other bugs. Really quite nutritious. During the time when you couldn't get a lot of good food, they started frying up insects and turning them into their own protein-rich meal.
[00:08:34] Jordan Harbinger: That makes sense. I hadn't thought about why they eat that kind of thing. I just figured, eh, it's kind of jungly. There's probably a lot of people figured it out. Like I remember going really deep. I mean, I came to Cambodia over land through Vietnam and there was a guy on a motorbike with a giant cage strapped to the back packed so tightly with rats that they couldn't move. And there were just like tails squiggling out of the back.
[00:08:54] And I'm like, "Why does he have those?" And then, you'd see these jungle huts on little stilts because they're not on the ground because there's too many things on the ground. And there'd be like this little box next to it. And inside is a massive snake. And I'm like, "What is that? Why would they have that?" It's not a pet. It's in this tiny cage. And they're like, "Oh, they probably caught it. And they're going to eat it."
[00:09:11] Tobias Rose-Stockwell: Yeah, they also eat snake, yeah.
[00:09:13] Jordan Harbinger: I was just like, "Wow, I'm really in the jungle." I am very much—
[00:09:16] Tobias Rose-Stockwell: Totally.
[00:09:17] Jordan Harbinger: —in the jungle. But to your point, I brought one of the girls on a tour because she'd never been to anything touristy because they lived in another place and she was very poor. So we went to the killing fields. And that was an experience. Cambodia's civil war was so gross. They killed three million or two million people.
[00:09:34] Tobias Rose-Stockwell: Yeah, almost two million people. It was the largest per capita loss of life, I think, in the last century.
[00:09:39] Jordan Harbinger: It's like a third of the population, right, wasn't it?
[00:09:41] Tobias Rose-Stockwell: Huge portion of the population, yeah. And they can't even really properly count it, because they just didn't account for people who got killed constantly if they opposed the Khmer Rouge, which was this very draconian, very extreme regime that was in place from 1975 to 1979. And so they participated in what ended up being auto-genocide, which doesn't happen very often. Usually genocides happen when it's one cultural group against another cultural group. But in Cambodia, it was basically in-group against in-group, because there was such a hyper-paranoid regime. It's almost like the most extreme version of cancel culture.
[00:10:13] Jordan Harbinger: Mm-hmm.
[00:10:13] Tobias Rose-Stockwell: That everyone was looking at each other constantly to try to call people out if they weren't orthodox enough and weren't following the kind of line of the party enough. It was this very extreme and intense version of Maoist communism. So they were constantly looking for spies and people that weren't pure enough. And so they would just take them off and they would actually train children much of the time to go off and be the executioners because the kids just were indoctrinated so heavily into this. And they'd take them off to the killing fields and just kill them. And that, yeah, there's these amazing, horrible, but like really incredible monuments around the country to these atrocities that happened during that time.
[00:10:47] Jordan Harbinger: Yeah.
[00:10:48] Tobias Rose-Stockwell: Amazing place.
[00:10:48] Jordan Harbinger: Yeah. Have you been to Tuol Sleng, that school? I think it's called Tuol Sleng.
[00:10:52] Tobias Rose-Stockwell: Yeah.
[00:10:52] Jordan Harbinger: They turned a school into a prison camp, and it's one of the grossest, scariest places, and they do a really good job of kind of—
[00:10:59] Tobias Rose-Stockwell: Tuol Sleng, I think.
[00:11:00] Jordan Harbinger: Tuol Sleng. Yeah. Sorry, I'm just going off of 20-year-old memory here.
[00:11:03] Tobias Rose-Stockwell: Yeah, totally.
[00:11:04] Jordan Harbinger: They'll just leave all the handcuffs cuffed to the wall where the people were at that point, only a few years prior, and there's all these stains on the walls and they're like, "Yeah, we're just leaving it there for you to see." And then, there'll be a sign that's like baby killing tree, and you're like, "That's got to be a mistranslation." It's like, "No, this is where they killed babies against a tree."
[00:11:21] Tobias Rose-Stockwell: Yeah.
[00:11:22] Jordan Harbinger: And you're like, "Why would they kill babies?" "Oh, because they were kids of intellectuals." "Okay, who's an intellectual?" "Anybody that wore glasses." "That can't be true." "Yeah, literally anybody who needed glasses." "What if you're old?" "Eh, doesn't matter, collateral damage." There was lots of stuff like that. And I just remember walking through these killing fields and this is not like a well-groomed monument. It's a pit and there's clothing stuff coming out of the ground. It was raining when I went, so there's clothing bits and bones. And I stepped down because I had something stuck in my shoe and it was a jawbone with teeth in it.
[00:11:53] Tobias Rose-Stockwell: Man.
[00:11:54] Jordan Harbinger: Yeah, so I will never forget that.
[00:11:56] Tobias Rose-Stockwell: Yeah.
[00:11:57] Jordan Harbinger: Ever, obviously.
[00:11:58] Tobias Rose-Stockwell: Totally. Yeah, really powerful place to visit. Mostly, I think, because it happened so recently. I think we have narratives about how horrible World War II was and other genocides that have happened around the world. We kind of see them through the lens of history, things that maybe our grandparents experienced, but this happened up through the '80s and '90s. There was still some terrible stuff happening there.
[00:12:17] So that was definitely during my lifetime and to recognize that and see that I found really powerful and kind of a centering experience about the importance of democracy and the importance of good governance and the importance of not falling prey to political extremism.
[00:12:31] Jordan Harbinger: Mm-hmm.
[00:12:32] Tobias Rose-Stockwell: We'll definitely circle back around and connect this to technology.
[00:12:34] Jordan Harbinger: Yeah.
[00:12:35] Tobias Rose-Stockwell: Because this is an important story for the narrative here, but in 2003, I was traveling through Asia and I was in Cambodia. I was a backpacker at the time and I met this monk, this Cambodian Buddhist monk who basically invited me out to his village in the middle of nowhere. Basically, he brought me out to this pagoda in the middle of the country. I thought it was just going to be a community visit. And when I got there, there were several hundred people: the commune council, village elders, farmers, all there just to see me. I was there with a friend.
[00:12:59] And they're like, "Thank you for coming. We were waiting for you. We have this project. Thank you for agreeing to help us rebuild our reservoir that broke." And I was like, "I'm sorry, what?"
[00:13:08] Jordan Harbinger: Yeah, oops.
[00:13:08] Tobias Rose-Stockwell: "You have the wrong person. Yeah, what? Why am I here?" And yeah, so basically this community had this very large irrigation system that had been destroyed during the Khmer Rouge that provided about 6,000 people with a second crop of rice annually, and they were looking for help in rebuilding it. And they thought, since we were the only backpackers and foreigners that had ever come out that far to visit them, that we were going to help them and save them?
[00:13:32] Jordan Harbinger: Wow.
[00:13:32] Tobias Rose-Stockwell: And this was a totally insane and ridiculous request. I wasn't a trust fund kid. I didn't have any engineering background or anything like that. But what I did have was the Internet. And so I had spent some time with these monks. I got more and more interested in their cause. I said, "Look, you have the wrong person, but I'll at least be an advocate on your behalf." And so I wrote an email to friends and family back home, telling them about this irrigation system, this project, these monks run a local NGO. They're really trying hard to help this community. Very, very poor community.
[00:13:58] And so I wrote this email to friends. I said, "Look, there's these incredible monks that are working on this really important project. I want to help them. I'm trying to figure it out." And that email went out to, I think, about 50 people at the time. But one of the recipients posted it to a listserv on a site that a friend had built back in California. And again, this is 2003, 2004, pre-Facebook, pre-Twitter. And this platform was a prototype social media platform. One of the friends who built this platform, she went on to be one of the first engineers at Twitter.
[00:14:29] But anyway, this email, which normally would have gone to 50 people, and that would have been it, was connected to this listserv, which had friends of friends connected by profile pictures. It looked exactly like a social media platform. It was a prototype social media platform. And that email ended up going viral, and I ended up with this cascade of interest and support. And all of a sudden I had this support of strangers, random people being like, "I want to support this project. How can I help? What can I do to support it?"
[00:14:52] And this virality suddenly gave me the superpower to help these monks. There were all of a sudden, like, engineers that were reaching out to me. There were people that wanted to fund it. And it ended up being this plausible thing that I could do to help these monks rebuild this reservoir. So I thought it would take two months at the time. They were looking for $15,000 to rebuild this thing. I'm like, that's a lot of money, but it's about the price of a cheap car back home. I think that's a doable thing. I can spend two months of my life working on this thing for $15,000.
[00:15:18] Anyway, long story short, I ended up living in Cambodia working on this project for almost seven years as a result of this viral email. We raised over a quarter million dollars for the reservoir, over a million dollars for the community writ large. I ended up living there to build a small scale NGO to help rebuild this big irrigation system. And that was all empowered by this viral ability that I was given having access to a viral network online. I lived there for quite a long time. We found landmines. We had to demine the entire area.
[00:15:43] Jordan Harbinger: I was going to ask, I wouldn't want to walk around a rural village in Cambodia. Even now I'd be like, uh, are you sure this stuff is all gone?
[00:15:51] Tobias Rose-Stockwell: Yeah, it was scary. We didn't think that there were landmines in the beginning, but there ended up being pretty extensive mining throughout the area.
[00:15:58] Jordan Harbinger: Wow.
[00:15:58] Tobias Rose-Stockwell: So we had to demine the entire area. It was a huge project. An amazing chapter of my life, but all created and allowed and empowered by this new viral ability that I was given—
[00:16:09] Jordan Harbinger: Wow.
[00:16:09] Tobias Rose-Stockwell: —early social media. So I was a little bit early with social media and kind of understanding what the powers were of it. And that was obviously a really amazing use case for it.
[00:16:19] Jordan Harbinger: Sure.
[00:16:19] Tobias Rose-Stockwell: And how it could take this emotional plea from the middle of nowhere and help change the lives of many people. So I was very interested in how it might help that. And the years after, I moved back to Northern California, where I grew up. I grew up in the Bay Area next to Silicon Valley. So I moved back to San Francisco and started working with and around a lot of the early kind of creators of the social web. So a lot of friends that were the first engineers at YouTube, Facebook, Twitter.
[00:16:42] And there was this amazing kind of euphoria in those early days about what these tools could do to actually change the world. And you probably remember what it was like back then, Arab Spring.
[00:16:51] Jordan Harbinger: Mm-hmm.
[00:16:51] Tobias Rose-Stockwell: I literally remember this quote from a friend, which was, "Facebook is so good for democracy." And that was just a clear truth that people would say out loud, right? Such a powerful tool to help us build democracies around the world, help us connect. And yes, there was this very early euphoria around how these tools could actually make a difference. Turns out the same dynamics that give us superpowers to virally build emotional pleas for random Cambodian villagers in the middle of nowhere can also build support for fringe causes anywhere. And you can draw a pretty straight line from the Arab Spring to QAnon and January 6th, using social media as a tool for building viral support for fringe causes. So a very similar thing. And that's what brought me to this book and this project and this whole kind of reevaluation and accounting of how strange social media really is and how powerful it is as an influence on our society writ large.
[00:17:42] Jordan Harbinger: Part of the problem with social media, as you write about in the book, is that fast spreading info tends to be false info. It's not the viral campaign to fund a dam in Cambodia that spreads like wildfire. It's very hard for something like a book or a podcast or something that's well thought out and longer to go viral. I often joke I'd be a lot richer if I could just lie and make sh*t up on the show and entertain every conspiracy theory or kooky ideology under the guise of just asking questions because that stuff does tend to make for good sound bites and go further.
[00:18:11] And this is sort of Daniel Kahneman's system one thinking: when you get emotionally triggered, you want to push something out. And then the algorithm rewards that. But a lot of us know this already, which is why I try to use only my social media inbox, so that fans can communicate. But I get sucked in literally every day, almost every single time I open one of these apps. And at first I felt silly, like I'm lacking willpower, but it's by design. You can't open Instagram to look at your DMs without the first post being hyper-targeted to something that you probably want to see.
[00:18:41] Tobias Rose-Stockwell: Yeah, absolutely. These tools have gotten so good at capturing our attention and keeping us there as long as possible. So good that, yeah, our agency, I think, has started to feel fuzzy, started to be degraded or sanded down as we use these tools. And I think that's the result of three very specific features that were launched with very little fanfare at social media companies. I think it's really important to be precise as to what the features are doing to us, because otherwise, it's social media writ large, it's a problem, and we throw our hands up. But if you look at the actual specific feature sets and how they were deployed, I think you can look at the straight line that got us here to this really weird, outrageous, cacophonous world of everything's on fire.
[00:19:20] Back to, oh, social media is good, like, it's fun, we like it, it's helpful. So yeah, so three features were launched between 2009 and 2012 that collectively changed our relationship with information online. And so those features are, you'd know them right away. The first one is the algorithmic feed. You know, how content is rank ordered by an algorithm and made for engaging us.
[00:19:40] Jordan Harbinger: Mm-hmm.
[00:19:40] Tobias Rose-Stockwell: Made to be sticky, made to keep us scrolling, made to keep us in there. All of our feeds these days are algorithmic. We don't see a lot of reverse chronological these days. Each one of these features came with a real important utility to them. We want these features. As users, we want these features.
[00:19:56] Jordan Harbinger: Mm-hmm.
[00:19:56] Tobias Rose-Stockwell: Designers and developers, they wanted these features. They all made sense to deploy. An algorithmic feed is so important because it gives us access to the stuff that would otherwise be lost because we have too much information. We're just producing too much information. But it turns out that in feeds rank-ordered for engagement, a certain type of content tends to win out: outrageous, controversial content, stuff that is referred to as borderline content. We pay attention to that stuff, right?
[00:20:18] So if an algorithm is looking at our attention throughout the day, you can imagine like an algorithm kind of over your shoulder trying to optimize for engagement. If you're just like not on social media at all, imagine an algorithm is following you around throughout the day and it's like looking at what you're looking at and trying to figure out what to serve you. You're walking down the street, everything's fine, you go to your coffee shop or whatever, you walk outside and there's a car crash in front of you. Your attention goes to 100 percent on the car crash, right?
[00:20:39] Jordan Harbinger: Mm-hmm.
[00:20:40] Tobias Rose-Stockwell: You just like zwoosh on the car crash. And the algorithm is like, oh, well, you like car crashes. So from now on, I'm going to feed you car crashes, right? You can see how a simple algorithm can actually optimize for problematic content like that. And the social equivalent of a car crash, two friends or two acquaintances or two strangers fighting about something contentious, will capture a tremendous amount of engagement from us in the same way. So it makes sense that that stuff began to be really sticky at the top of our feeds during that period. You can see how we got here, right?
[00:21:09] Jordan Harbinger: Yes.
[00:21:09] Tobias Rose-Stockwell: You can see how algorithmic feeds make sense. We make too much information. You don't want to miss the important stuff that your friends posted earlier in the day. You don't want to miss that important conversation, that important topic. But it turns out that stuff that's really contentious, really polarizing, really anger-inducing is actually a great way to capture attention. That was one of the problems with that first feature of algorithmic feeds.
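The ranking dynamic Tobias describes, a feed ordered by predicted engagement rather than by time, can be sketched in a few lines. This is a hypothetical toy, not any platform's actual code: the posts, signals, and scoring weights are all invented for illustration, but the shape of the problem (arguments generate replies, so they float to the top) falls out of the math directly.

```python
# Toy sketch of an engagement-ranked feed. All weights and posts are invented.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    clicks: int          # observed engagement signals
    comments: int
    dwell_seconds: float

def engagement_score(post: Post) -> float:
    # Comments are weighted heavily: contentious posts generate replies.
    return post.clicks + 3 * post.comments + 0.1 * post.dwell_seconds

def rank_feed(posts: list[Post]) -> list[Post]:
    # Reverse-chronological order is replaced by engagement order.
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    Post("Nice sunset photo", clicks=40, comments=2, dwell_seconds=30),
    Post("Two strangers fighting about politics", clicks=55, comments=80, dwell_seconds=400),
    Post("Friend's important life update", clicks=50, comments=10, dwell_seconds=90),
]

for p in rank_feed(posts):
    print(p.text)
```

Even with modest weights, the argument outranks the life update and the sunset photo, which is the "car crash at the top of the feed" effect in miniature.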
[00:21:28] Jordan Harbinger: We've talked about this a little bit on the show with Tristan Harris, episode 533. And I just had a conversation with Anna Lembke, Dopamine Nation, about the intermittent variable rewards that you get from social media. You're not sure if you're going to get likes or something interesting, so you keep pulling the slot machine. And there are counterpoints; I think our mutual friend Nir Eyal talks about this. But it's clear that it's borderline addicting, even if it's not addiction in the classical sense of the term.
[00:21:57] Tobias Rose-Stockwell: Definitely.
[00:21:57] Jordan Harbinger: And you've got personal experience with this, right? Didn't you do viral news at some point? Wasn't this part of what you came up with?
[00:22:04] Tobias Rose-Stockwell: Totally. So after I was working in the Valley, I moved to New York, and I was a management consultant working with a design strategy agency that was working under NDA with a very large news producer in the country. This was in 2016, after the viral web had really come online. And I'll speak about this in the context of the second feature, which I think is really important: social metrics, the ability to see visible numbers of likes and comments and shares at the bottom of your feeds, which also came with a really important utility.
[00:22:35] But I was working with one of the nation's largest news producers, and we were trying to figure out what the heck was wrong with their news business. They produced a majority of the local news throughout the country; they owned a huge portion of America's newspapers. And the bottom had just basically fallen out of their business, right?
[00:22:53] They were not making money. They were losing money, they had huge obligations, and they were also just losing market share. People weren't buying papers anymore. People were going somewhere else, and the other place they were going was social media, right? So social media had taken this entire market of local news, and people's attention, instead of going to their paper to figure out what was happening, would go to Facebook or Twitter.
[00:23:16] And I was really trying to study the engagement mechanisms at work here. I happened to have another client at the time that was optimizing for Facebook and Twitter, building viral campaigns online, so I could see in this kind of split screen what was happening. Old-school news journalism, the world of old-school centrist journalism where objectivity was important, where neutrality was important, was entirely losing the battle for attention on social media to the thing that would get the emotional click, right? The thing that would get the viral hit.
[00:23:51] And everyone, all the BuzzFeeds of the world, the HuffPosts, they were all eating this market that historically had gone to local news, because local news had these slower systems of journalism: verification, corroboration, all the stuff that we had gotten used to over many years. It was just losing completely in the battle for our attention. The reason for this is that optimization was possible for the first time. You could use Facebook basically as a lab for viral content, a virology lab essentially, open to anyone who wanted to make viral content, if you had the right tools.
[00:24:25] Jordan Harbinger: Right. So people like your organization, you're split testing these headlines. So everything is, "You won't believe this one quote," or, "Jordan Harbinger slams Tobias Rose-Stockwell."
[00:24:36] Tobias Rose-Stockwell: Totally.
[00:24:37] Jordan Harbinger: And then it's, "Hey, there's a typo on page 359 of your book." That's the slam. Or it's a story about somebody.
[00:24:42] Tobias Rose-Stockwell: Yeah, totally.
[00:24:43] Jordan Harbinger: Something someone tweeted. And it's like slams. I mean, they said like, Oh, I thought the movie was too long. Oh, okay, yeah, but that's the thing that gets the click.
[00:24:51] Tobias Rose-Stockwell: Exactly. So what you can do is you can take a pretty benign story, right? You can take a nothing burger essentially and you can write dozens of headlines for that nothing burger and make it something that feels emotionally urgent and then you can test each one of those headlines using optimization tools.
[00:25:08] You can A/B split test them and find out how much engagement each one of those headlines will get. And you can do this algorithmically; Meta has a whole bunch of out-of-the-box tools you can do this with, and there's a whole range of third-party apps too. This process of headline packaging tries to take a benign story and make it the most extreme, viral version of itself by putting a wrapper on it, by making it as engaging as possible. You can test against thousands of eyeballs and figure out which of these headlines will actually get the most engagement.
[00:25:37] And when you do that, you can take a basically neutral story about a correction in a book, "Jordan corrects Tobias's book," and turn it into "Jordan Harbinger roasts Tobias" or "smashes Tobias," and that's the stuff that actually gets the most engagement. So you can see quickly how this algorithmic process of efficiently capturing people's eyeballs turns us into spectators. All of a sudden this is very political, or this is very angry and emotional. And it turns out we have a tremendous preference for negatively valenced headlines.
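The headline-testing workflow Tobias describes reduces to a simple comparison: serve each candidate headline to a slice of readers, measure clicks, keep the winner. Here's a toy sketch with invented headlines and click counts; real platform tools would also handle traffic allocation and statistical significance, which this omits.

```python
# Hypothetical headline A/B test. Headlines and numbers are invented examples.

def click_through_rate(clicks: int, impressions: int) -> float:
    return clicks / impressions if impressions else 0.0

def pick_winner(results: dict[str, tuple[int, int]]) -> str:
    # results maps headline -> (clicks, impressions); keep the highest CTR.
    return max(results, key=lambda h: click_through_rate(*results[h]))

results = {
    "Jordan corrects Tobias's book":       (12, 1000),  # neutral framing
    "Jordan Harbinger ROASTS Tobias":      (87, 1000),  # outrage framing
    "You won't believe what Jordan found": (55, 1000),  # curiosity gap
}

print(pick_winner(results))
```

With numbers like these, the negatively valenced "roasts" framing wins, which is the preference Tobias says showed up in the data at the time.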
[00:26:05] So we look for emotionally negative headlines. We actually respond to those more significantly, and we saw that in the data at the time. While traditional journalism was withering in real time, this new salacious, social-media-packaged web journalism was turning into a knock-down-drag-out battle for attention that was getting really ugly, really fast.
[00:26:25] And so you remember these headlines: "You'll be surprised to see," or, "You'll be shocked to see," or, "X makes Y cry," or whatever else, these terrible headlines that we saw during that era. 2017, I think, was really one of the peaks of curiosity-gap headlines, where it's like, "You'll be shocked to see number seven on this list of the most outrageous things going on in this particular space." Think about hydraulic fracking, right? We figured out how to take shale, which was just benign content, and squeeze it and turn it into something more flammable and more financially viable for these companies. And we all witnessed this happening in real time. I think we all remember how crazy it was when we first saw these headlines.
[00:27:04] Jordan Harbinger: Sure.
[00:27:04] Tobias Rose-Stockwell: Whoa, everything is crazy now. What's going on?
[00:27:07] Jordan Harbinger: Every headline was like, "Ebola in New York." And then it's like, oh, but then the guy was fine. And nobody else got it.
[00:27:15] Tobias Rose-Stockwell: Right. Totally.
[00:27:16] Jordan Harbinger: It's just deliberately terrifying. Because news media get paid for clicks, they're incentivized to sensationalize stories even if they just end up being complete bullsh*t. And people have made millions of dollars doing this, and that's the problem: getting the toothpaste back in the tube is really tough because the incentives are not aligned, right? The news organization is incentivized to get the clicks, and the platform, like Facebook, is incentivized to take the news agencies' money for those clicks. So they're not going to be like, ah, we don't allow this anymore. That's tough.
[00:27:44] Tobias Rose-Stockwell: Yeah, there's an interesting quote that I have from a friend who was running a major millennial focused online publication at that point in time. And he said this to me in confidence, "Our job is not to check your biases and our job is not to challenge your political opinions. Our job is to ride those opinions as far as we can take them because that is where the ad money is." And that is what happened in every kind of segment. We just got this process of segmenting our audiences down further and further into these kind of tiny, tiny little ideological chambers where you could extract attention from them in the process.
[00:28:14] So it was a process of optimization: being able to really tightly measure those likes, those shares, those comments, and get as close as possible to those engagement numbers. And it was incredibly valuable for a long period of time. I should say also that this was a period of time before we started to develop some cultural antibodies against it, right?
[00:28:31] So you don't see the same curiosity-gap headlines anymore. You don't see the same "you'll be shocked to see" headlines quite as much anymore. Now they're a little more subtle, and they're actually more partisan a lot of the time. So a lot of old-guard news agencies and news organizations started to segment their audiences by ideology for the first time. They either went out of business in the process and didn't make it, or they started to segment toward their audiences' ideological preferences for the first time, and more tightly than they ever had before.
[00:29:02] Jordan Harbinger: You're listening to The Jordan Harbinger Show with our guest Tobias Rose-Stockwell. We'll be right back.
[00:29:07] This episode is sponsored in part by ZipRecruiter. Big ups to anyone in the hiring trenches. Whether you're at the heart of a budding startup or the brains behind a mega corporation's HR, your job is far from easy. But what if I were to tell you that there was something that can make your whole hiring process faster and easier? It's ziprecruiter.com/jordan. Post your gig and watch it skyrocket across 100-plus job sites. ZipRecruiter's like your personal headhunter shining the spotlight on the rockstar candidates tailored for you. ZipRecruiter's technology doesn't just hunt for a match, it does the heavy lifting by sifting through the crowd, then delivering the best candidates to your digital doorstep. Say goodbye to sifting and searching, ZipRecruiter's got your VIP list ready. With a community of 3.8 million businesses, from fledgling startups to established giants, you're in great company when you use ZipRecruiter.
[00:29:51] Jen Harbinger: Hiring heroes, let ZipRecruiter help make your job easier. Four out of five employers who post on ZipRecruiter get a quality candidate within the first day. See for yourself. Go to this exclusive web address to try ZipRecruiter for free. Ziprecruiter.com/jordan. Again, that's ziprecruiter.com/J-O-R-D-A-N. ZipRecruiter, the smartest way to hire.
[00:30:11] Jordan Harbinger: This episode is also sponsored by Progressive. Hey, listeners, whether you love true crime or comedy, celebrity interviews, news, or even motivational speakers, you call the shots on what's in your podcast queue, right? And guess what? Now you can call the shots on your auto insurance too. Enter the Name Your Price tool from Progressive. The Name Your Price tool puts you in charge of your auto insurance by working just the way it sounds. You tell Progressive how much you want to pay for car insurance, they'll show you a variety of coverages that fit within your budget, giving you options. Now that's something you'll want to press play on. It's easy to start a quote. You'll be able to choose the best option for you fast. It's just one of the many ways you can save with Progressive Insurance. Quote today at progressive.com to try the Name Your Price tool for yourself and join the over 28 million drivers who trust Progressive. Progressive Casualty Insurance Company and Affiliates. Price and coverage match limited by state law.
[00:30:55] If you're wondering how I managed to book all these amazing authors and thinkers for the show, it is because of my network, the circle of people around me that I know, like, and trust, or more importantly, that know, like, and trust me. Liking part is optional. And I'm teaching you how to build the same thing for yourself, for free, over at jordanharbinger.com/course. This course is about improving your relationship building skills, developing relationships with and for other people. And hey, the course does it all in this non-cringy, down-to-earth way. It's not going to make you feel terribly awkward. It's not going to make you look like a knucklehead. It's not cheesy, just practical exercises that'll make you a better connector, a better colleague, a better peer. Six minutes a day is all it takes. I mean, really it's less, it's as much as you want, but five minute networking was taken and many of the guests on our show already subscribe and contribute to the course. So come join us, you'll be in smart company where you belong. You can find the course at jordanharbinger.com/course.
[00:31:45] Now, back to Tobias Rose-Stockwell.
[00:31:49] I'm always fighting with my YouTube team, and I'm like, "Can you stop putting 'Jordan Harbinger confronts' in front of everything?" It's like, "A podcaster confronts mafia hitman," and it's a calm conversation between me and a mafia hitman. And I'm glad when I see the top comment is, "You didn't confront him on anything." And I'm like, "Ugh, I know. Sorry about that. We're working on this." I'm glad that people aren't even paying that much attention to it, but there's that negativity bias of people clicking on it because it's like, "Oh, they're going to fight in this clip."
[00:32:19] A lot of popular YouTubers, the whole thing is, "AI guru predicts AI will murder everyone." And it's a guy from Google who is not in AI at all, saying, yeah, I'm a little bit worried about AI because there are no checks on it. Jesus, man, please just stop creating this garbage. But they won't, because it gets clicks, and it serves a Lexus ad, and they get one cent per view.
[00:32:40] Tobias Rose-Stockwell: Yeah, totally. And there's an unfortunate piece of this, which is that most people that actually see the headline don't click through to see the article. So they just assume that the headline is representative of the article itself when it actually isn't. So they'll walk away from that scroll being like, "Oh my gosh, I guess Jordan is a shock jock. He's like trying to get these fights or whatever."
[00:32:58] Jordan Harbinger: Jordan's fired up against his mafia hitman. Gosh, he really confronts a lot of people on YouTube videos that I never watched.
[00:33:04] Tobias Rose-Stockwell: Exactly. So that's unfortunate because that does shift our perception towards a certain level of collective outrage that is not representative of the actual events.
[00:33:12] Jordan Harbinger: Emotional contagion has to play quite a part here too, which is not good, right? Because if I'm enraged that Hillary Clinton is drinking child blood to stay young or whatever the dumb thing is of the day, the most extreme Facebook posts are going to get the most engagement, but they are also going to get shared the most, which means aren't I going to want other people to be as upset about that same fake non issue as I am in that moment?
[00:33:36] Tobias Rose-Stockwell: Yeah. So emotional contagion is a really important concept to understand when it comes to social media, because it is something that is far more present than you'd expect online. When we get exposed to angering content online, we ourselves will start to reflect back that content, that valence of anger; we'll start to feel angry ourselves. It might not be at the same thing, right? We might just get angry about the fact that they're angry in the first place. But we're hyper emotionally porous social creatures, and we respond to emotion by feeling that emotion in our own bodies, right? So when we're on social media and we see someone super upset, we will likely get upset ourselves.
[00:34:17] And early on in the social media world, back in 2014, when a lot of these features came online, there were some studies done on Facebook that showed that emotional contagion was actually happening already. It was happening all the time on these platforms. And they packaged that study initially like, "Look, we can actually take people who are normally neutral, show them more happy content, and make them happy."
[00:34:41] Jordan Harbinger: That would be great.
[00:34:42] Tobias Rose-Stockwell: It is great. This is great. Everyone's happy now, right? But it turns out that there's actually a strange line of manipulation there, right? If you can make people happy on Facebook by serving a bunch of happy content, you can also make people angry on Facebook by showing them a bunch of angry content. You can actually change their preferences and behavior on Facebook if you show them a bunch of emotional content. So even though the study was packaged as this kind of pro-Facebook, "Look, we're making people happy on here," what it really suggested from the beginning was, "Oh no, these are very deeply influential tools that are emotionally manipulating us."
[00:35:12] Jordan Harbinger: We can make people happy. We're not going to do that, but we could. But we're going to make people angry because it turns out it's five percent more effective or 10 percent more effective. So that's what we're going to do.
[00:35:22] Tobias Rose-Stockwell: This process of watching people lose it, lose the thread, this is such a common thing. I can tell you exactly how we got here, because we know these people. They have been in the public sphere for such a long time. It's the same process of audience capture: finding niche audiences on social media and getting egged into these corners of ideological debate, in which they keep getting this incredible feedback for the most ridiculous commentary and ideas. And unless you have a really solid sense of centeredness and a good bullsh*t detector, it's so easy to get trained into being a more extreme version of yourself. That is very much what's happening with a lot of these online personalities right now. You can see them becoming extreme versions of themselves as a result of social media. And the people that are more online are more susceptible to this kind of radicalization.
[00:36:07] Jordan Harbinger: Yes. Now, that's the scary part that also makes complete sense, right? Not just public personalities. You see crazy Uncle Frank, but you also see people that we all know, and you're like, "Wait a minute. Was this person always like that?" And the answer is no.
[00:36:20] Tobias Rose-Stockwell: The answer is no. Yeah, the answer is no. And this is happening; I should have used this example rather than the more ambiguous one. You can see it happening across different public personalities. People who generally used to have pretty centered, grounded, rational frames for looking at the world all of a sudden find themselves in these cacophonous echo chambers of people that want this one particular type of outrageous commentary from them. And they egg them on, they pull them on. That is the price we pay to gain audiences online. We become more extreme versions of ourselves, and that is a huge fundamental problem with social media.
[00:36:52] Jordan Harbinger: And it's so subtle because you just get nudged in a direction and then suddenly you're talking about how the moon landing is fake.
[00:36:57] Tobias Rose-Stockwell: Yeah, 100 percent. It's such a critically important piece of what's wrong with the world right now, because we're all subject to these same incentives to become the more extreme, problematic versions of ourselves. And I think you do a great job of not doing that. You have a good grounded center point and a good kind of bullsh*t detector. But if you don't have that, it's so easy to get lost.
[00:37:15] Jordan Harbinger: The other thing is I don't tweet, other than just, here's the new episode. I don't look at likes; I turned that off on Instagram. And I don't post on Instagram anyway, I just post stories that are funny, and I engage with fans. And I don't have a TikTok, because I didn't want to play that game where it's, "Oh, the new thing is dumping an ice bucket over your head. I better do that now." I'm just not going to do that. So I was like, I better opt out of the game.
[00:37:38] I tried playing the game in a normal way, where I'm like, I'm just going to post clips of the show. I'm not going to choose the crappiest ones, I'm going to choose the most insightful ones. Oh, those don't get traction? Then I don't want to play your game. I don't want to be in the game. TikTok especially was easy, because everyone was dancing, and I was like, I'm not doing that. So I just opted out, right? I just didn't do it, and I never got any traction there. And then I was like, I don't miss Twitter. And I'm not going to jump into Threads, I don't want to just do another one of those, and Facebook is dead. And then I was like, I like just not doing this.
[00:38:09] Tobias Rose-Stockwell: Yeah, and it shows. It shows in your reasonableness and centeredness and groundedness. But people that don't have existing audiences, unfortunately, need to play some version of this game in order to get ahead. And that is one of the fundamental problems: we've had this viral explosion as a result of all this content being available, but the people that haven't built those audiences now need to play some version of this game to get ahead.
[00:38:30] And it's not all bad, right? There are some good people that are able to get decent traction by being reasonable and focused and things like that. But it's not quite the same as it used to be. People want that viral traction. They're willing to bend over and contort themselves to become what the algorithm wants.
[00:38:45] Jordan Harbinger: That combined with what you called, I'd never heard of this, context collapse, was so interesting. Can you take us through this? Because I'd never heard this term, but it's almost everything that's wrong with a lot of these viral memes and partisan issues that get shared and turn into 7,000 news articles.
[00:39:01] Tobias Rose-Stockwell: Yeah, so when we push content through the boxes of social media, when we take reality and force it through the small frame of social media, we are actually changing that content much of the time. We have to strip certain things out to get to a character limit on Twitter or X or on Facebook. As soon as you put something into a specific medium, that medium changes the message, right? On social media, you'll see this all the time. You'll see an event that happens in the world at large.
[00:39:30] I use this analogy of a guy named Bob. Bob is my analogy punching bag. Bob is basically a dude. He's had a rough day. He's coming home from work, he had a rough day at work, his kid kept him up all night. He's having a hard day, and he goes to the store to pick up some groceries, and he gets in line with his groceries, and someone cuts in front of him in line. And he's just at the end of his rope, he almost got fired today, and he's having a really bad day.
[00:39:55] And so he snaps at the person in front of him saying, "Excuse me," and he yells at her for a second. And the person in line yells back, and it escalates for a minute. Someone pulls out a camera and they start recording that particular event, right, of this altercation between Bob and this woman who's in line. It takes a second or two and the woman says, "Look, I'm actually really sorry. I got out of line because I had to go back to replace the broken eggs. I need to get some eggs. And these eggs were broken. I'm really sorry. I just had to get back in my place in line. I'm late for a thing," and he apologizes, she apologizes, things move on, right?
[00:40:24] But that particular video, the video of the altercation between Bob and this woman, where maybe some nasty things were said in the moment, is placed on social media. It's placed without the initial context, without Bob's bad day, and without the natural resolution, the making up that happened at the end. It's put on social media, and all of a sudden, it's this outburst by a man against a woman, right? There are natural gender power disparities there that people can use as a reference. Someone can look at that and be like, "Oh, this is a problem with men in America. They're just constantly yelling at women and being terrible to them," right? Or if one of them is a Republican and the other a Democrat, you can add additional context to it, right? Which is—
[00:41:06] Jordan Harbinger: She's wearing a MAGA hat and it's, "Oh, look at this."
[00:41:09] Tobias Rose-Stockwell: Oh, yeah, exactly. "Typical Republican, doing something to this poor Democrat woman." It's missing the richness and nuance of real life. All of the pretext and the post-text are gone. The context has been stripped away from the original event, and then people can add context to it themselves. This tweet begins to go viral a little bit, and people start commenting on it, like, "Oh, this is a problem. I can't believe this guy, Bob. He's clearly a terrible guy." And then someone else sees that and adds their own context to it: this is an example of Trump's America, this is an example of the gender disparity. It goes from context collapse, which is the removal of the original context, to context creep, when people start adding their own context to it.
[00:41:46] And all of a sudden, a journalist picks up on this story, they see that it's getting some viral traction, and they write a web article about it. They put it online, saying, this is what people are talking about online, this is why it's important, because it represents this broader cultural issue of men versus women, or of MAGA versus the blue states, or something, right?
[00:42:03] Jordan Harbinger: Yeah. Liberal destroyed for cutting in line at Whole Foods.
[00:42:05] Tobias Rose-Stockwell: Exactly. Yeah. And so it becomes this kind of morality play for the rest of us, an object lesson for society, or for the particular people that view it. And the event itself is a totally benign thing that happens. Someone has a bad day, sh*t happens all the time, right? There's natural resolution, people figure stuff out, they move on with their lives.
[00:42:24] But what we've created is this vehicle for taking these tiny moments that have any hint or smell of power asymmetry and turning them into these broader theses about the world that we are living in. And everyone participates, right? Everyone that sees it can have an opinion about it and chime in. And it becomes this big cultural touchpoint that is actually very divorced from common reality. That's a huge problem for us in making sense of the world we're living in, because we're taking these otherwise benign anecdotes and turning them into major points of moral disgust and shame.
[00:43:03] Jordan Harbinger: It's scary how easily that can happen, because even now that I know about it, I realize I don't look at a video and go, "The context has been removed, and maybe this additional context has been put in there by this poster on Reddit." I really don't even have that level of investigation in a lot of this stuff. Maybe a little bit, like, "Oh, how do we know that's what's happening in the video," but certainly not, "Let me imagine another context in which this wasn't nearly as bad, and let me remove the artificial context that's been placed here and try to look at it neutrally." I just never put that amount of brainpower into examining information.
[00:43:37] Tobias Rose-Stockwell: None of us do, right? We're primed to look at stuff as it is online, right? We look for the emotional hit and that's one of the reasons why we're there in a lot of ways, right? We're looking for the emotional hit. And there's something interesting about this. We're such hyper social animals, right? We want to know what other people think about everything, right?
[00:43:53] Jordan Harbinger: Mm-hmm.
[00:43:54] Tobias Rose-Stockwell: We wouldn't say this, but we're so obsessed with the opinions of others. And this is the result of a very natural human trait as social creatures: gossip. We love gossip, right? You wouldn't necessarily say that you love gossip. But as a hyper-integrated society, we like to know the status of everyone around us and what people are doing.
[00:44:12] So when we watch something like that, we'll be like, "Oh, okay, this is interesting. This is a piece of juicy gossip that I should be paying attention to." If enough people are paying attention to something, we feel obligated to pay attention to it too. And that is a piece of the problem here. If everyone is talking about this power asymmetry between Bob and this woman, you should have an opinion on it too, because if your friends ask you about it later, you should be able to talk about it, right? Even if it's been blown far out of proportion from the original event.
[00:44:37] Jordan Harbinger: That leads to the concept of moral emotions. You mentioned this in the book, how we change, at least how we act on our morals in groups with other people, and how this happens online at scale. Tell me a little bit about that, because I hadn't thought about that, but it is terrifying. It definitely happens. You can almost feel it happening in real time, if you're paying attention.
[00:44:55] Tobias Rose-Stockwell: Yeah. Moral emotions, this is a frame from Jonathan Haidt, who's my collaborator at NYU. He wrote the foreword of the book, and he's a good friend who has been researching moral emotions for a very long time.
[00:45:05] And moral emotions are fundamentally different from other emotions in our bodies, right? A regular emotion is, I'm sad, I need to change my state, or, I'm angry at something that's happening in the world around me. But moral emotions tend to involve other people. They involve righteousness and indignation, shame, disgust. These are all moral emotions, and they concern other people and how people should act in society. And we're all operating with different moral emotions.
[00:45:31] Jon thinks about it as a set of taste buds that we're born with and that develop over time. There's some interplay between the emotions we're born with and how they end up changing over time. Moral emotions do change over time, but we basically have these different tastes, the same way that you might like spicy food and I might like sweet food. He breaks them into six different categories, the six moral foundations: authority, in-group, care, fairness, sanctity, and liberty. Each of us has a different balance of these, and liberals and conservatives tend to have very different but discernible profiles of moral emotions.
[00:46:03] They basically are filters that change how we see events that happen in the world. And these moral emotions are very powerful, much more powerful than you'd expect. An interaction between two people that you might see as deeply disgusting and problematic, someone else might look at and think, seems fine to me, right?
[00:46:21] So we're all coming at these from different perspectives, right? But online, when we see particular moral transgressions, these moral emotions come into play dramatically. We actually end up in these cacophonous pile-ons with other people who share the same moral opinions as us. So there's this moral segmentation happening online. When we're served a lot of this content, and this is actually what optimization does, it will start to filter for content that you specifically find morally outrageous, right?
[00:46:50] So these algorithms will actually begin to serve you content that speaks to your moral emotions specifically, that engages your moral emotions specifically. And that starts to peel us apart into different moral tribes online. This is one of the reasons why we see so many significant identity groups online, why there's so much in-group and out-group behavior, why we feel so tribal online, why you see so much hashtag identity: because you end up
[00:47:14] faced with these moral threats. When we feel a moral threat, we feel this impulse to find our tribe, to find our in-group, and we want to find safety in that in-group. So when there's a threat to the way that the world operates, a threat to us, or a threat to a group that we hold in high esteem, then we feel called to defend them.
[00:47:32] And there's two things that happen there. We denigrate the out-group and we venerate the in-group, right? We look at the out-group and we say those people are terrible because they are threatening my in-group and my moral foundations, and my in-group is great. This is why I feel so good about Democrats and why I hate Republicans.
[00:47:51] Republicans are despicable, Democrats are great, or vice versa, right? That's what happens when we're exposed to moral threats. And that's what social media is very good at doing, and traditional media in concert with social media is very good at doing: serving us moral threats to our way of life, to the way we think about the world, because it is engaging and because it drives revenue.
[00:48:09] Jordan Harbinger: Yeah, it's terrifying how the incentives skew towards outrage because it's profitable. And it works in podcasting too. The biggest podcasts are political and extreme. They're not like, oh, we discuss current events in a bipartisan way. Those are not the biggest podcasts. The biggest podcasts are extreme in one direction or the other.
[00:48:27] And then there's true crime, which is like a whole separate phenomenon. That's basically murder porn, and it triggers fear and thus interest as well. So it's not totally dissimilar, right? It's not, "Learn the story of a bank robber who didn't hurt anyone." It's serial killers who are really gross and terrifying and are still at large. Those are the most popular shows.
[00:48:46] And Jonathan Haidt, by the way, episode 90 of this show, really sharp guy, I've got to have him back at some point. The longer people spend on social media, the more likely they are to be politically extreme. You wrote that. That's terrifying. Correlation's not causation, but there's a good chance that's the drug causing this effect.
[00:49:06] Tobias Rose-Stockwell: Yeah, this is some research by Billy Brady and Molly Crockett, who were at Yale, and they've done some incredible research on the power of moral outrage online to change our behavior. What they found was that if we are exposed to a significant number of outraged pieces of content online, it will actually change our behavior over time.
[00:49:29] I'll take us back for a second to B.F. Skinner, the psychologist who did a bunch of groundbreaking experiments back in the '30s with pigeons in boxes. He would put pigeons into boxes and flash a light, and if the pigeons pressed a lever, they would get a food pellet. In that process, he was basically conditioning the pigeons in a certain way. But if the pigeons were exposed to some randomness, the light was flashing and they sometimes got a food pellet and sometimes didn't, they would actually get obsessed. Coming back to the intermittent variable rewards thing right here: they would get obsessed with it, and they would start pressing that lever on a regular basis to try to get the same response, the food pellet.
[00:50:07] This is what happens with slot machines on a regular basis. We know that you can just put people in a room and have them press the same button over and over again. It hacks a part of our brain and changes our behavior over time. Skinner's apparatus was an operant conditioning chamber, colloquially called a Skinner box, and you see the same dynamic in video games all the time with loot crates, right?
[00:50:26] You don't know what you're going to get, so you keep on playing until you get what you want, right? And people get caught in these loops on a regular basis. Social media operates in very much the same way, and actually trains us in the same way as well. We're all signal-processing machines. If you're posting online, and you post something, it gets five likes; you post another thing, it gets 10 likes. Suddenly, you post something that gets 100 likes. And recognize that outrageous content online will get a 17 percent boost to virality for every moral and emotional word that you use.
[00:50:57] So, if I tweet, I'm stuck in traffic. It'll get average likes, right? But if I tweet, I'm disgusted by the Democratic administration for not dealing with this traffic problem. It is horrible.
[00:51:07] Jordan Harbinger: Biden's America, everybody.
[00:51:09] Tobias Rose-Stockwell: Exactly. Exactly. Biden's America. That will get a 17 percent boost to virality for every moral and emotional word that's used there. As a result, it'll get more traction online, it'll get more likes, it'll get more shares, and I will get more signal as an individual, right? Because of that outrageous content, instead of average content getting five likes, 10 likes, whatever, all of a sudden it's 100 likes. Wow, that's important. And Billy and Molly's work showed that people who make outrage expressions will actually begin to make more outrage expressions over time because of this tremendous signal that they get online.
[00:51:44] So they'll actually start pushing more outrageous content out there on a regular basis because of the signal it gives back to them. And we're all signal-processing machines. We all want the world to see us and to like what we are giving it. But it's actually training us to be more extreme and outrageous in our sharing and our personalities, in what we share with the world.
[00:52:03] Jordan Harbinger: But it's not just random people who are pissed off who are doing this, right? It's almost self-evident. This is going to be really bad when AI can just write you an article. They don't even have to have a writer. It's like the AI wrote the article that would piss Jordan off, because on Tuesdays he's coming from the gym and he's upset about this, and so we're going to combine all his fears from his simulacrum that we have online and create bespoke media outrage content.
[00:52:29] Tobias Rose-Stockwell: Yeah, man, you can just draw this straight line forward, and AI is going to make this stuff so much worse, so I really want to try to get ahead of it. The same problems that we're facing right now with the pipes of how we get information, that's how the AI is going to get us, I think, and pull us apart even further in the future. So fixing social media, making sure that it's a place for democratic sharing and proper information, and making sure that we have the right vehicles for disagreement and the right filters for accurate information, I think that's going to be the whole game for us moving forward.
[00:52:57] Otherwise, we're going to be living either in China or in this Russia-esque chaotic information environment. That's such a problematic frame, especially considering how good these tools are at being bullsh*t creation machines. You can draw a straight line out from where we are and see a very dystopian version of reality without any hand-waving or strange sci-fi projections. Basically, the world we're living in right now, in a few years, will likely be one in which humans are closer to feed animals for algorithms, plugged into machines that are just built for extraction, extracting our preferences and information to try to feed these algorithms more and more effectively.
[00:53:40] And we're already seeing that right now, right? We're already beginning to resemble these strange feed animals stuck in these algorithmic channels of data extraction. You don't have to be a creepy dystopian futurist to see that as a near-term possibility.
[00:53:54] Jordan Harbinger: No, it's like a prequel to The Matrix, right?
[00:53:56] Tobias Rose-Stockwell: Totally. Yeah, totally.
[00:53:57] Jordan Harbinger: Ugh, that's terrifying. Yeah, wow.
[00:54:02] This is The Jordan Harbinger Show with our guest Tobias Rose-Stockwell. We'll be right back.
[00:54:07] This episode is also sponsored by BetterHelp. Ever had one of those nights where you're lying in bed, trying to sleep, and your brain decides to replay embarrassing moments from 10, 20 years ago? That's a classic case of the brain getting in its own way. We all know the feeling: understanding what's beneficial for us, yet somehow we end up wrestling with our thoughts. This is where therapy can be transformative, helping you navigate through these mental roadblocks and aligning your actions with your well-being. Whether you've experienced therapy or not, it's a profound journey towards learning coping mechanisms, establishing boundaries, and ultimately being the best version of yourself. No major trauma required. Considering therapy? BetterHelp is your online gateway to a tailored experience. It's designed around your life with flexibility at its core. A simple questionnaire connects you with a licensed therapist, and you have the liberty to change therapists anytime at no additional cost.
[00:54:53] Jen Harbinger: Make your brain your friend with BetterHelp. Visit betterhelp.com/jordan to get 10 percent off your first month. That's better-H-E-L-P.com/jordan.
[00:55:02] Jordan Harbinger: This episode is also sponsored by McDonald's. Here's a McSurprise. Most people don't know that one in eight people in the US have worked at a McDonald's. It's like they're in an elite McClub — yes, I'm going to be doing this the whole time — where instead of secret codes, members are initiated into the art of the McPerfect Fry. Imagine the McMoments. Generations connecting over the magic of a happy meal. Even my dad, pushing 80, can't resist one. Whether he's chasing the toy or reliving McMemories, he's definitely loving it. Had a McBash birthday party at McDonald's? You've seen the McMagic? Okay, I'm done. And those smiling faces supersized with joy. And there are a lot of cool things McDonald's does as an employer that people might not even know about. They've got an amazing English under the Arches program that helps employees sharpen their English skills. Need a high school diploma? They've got the career online high school, looking to go to college, check out their Archways to Opportunity. They even have success coaches to help you map out your future. I think that's brilliant, especially for people in a position like this. So next time you're munching on those iconic fries, remember McDonald's isn't just feeding people, it's fueling opportunities. McDonald's is now serving much more than orders.
[00:56:04] If you liked this episode of the show, I invite you to do what other smart and considerate listeners do, which is take a moment and support our amazing sponsors. All the deals, discounts, and ways to support the show are at jordanharbinger.com/deals. You can also search for any sponsor using the AI chatbot on the website as well, jordanharbinger.com/AI. And if you really can't find something, email me and I'll dig it up for you. Some people have taken me up on that and I appreciate it when you do. If the codes don't work, you let me know that too. Thank you for supporting those who support the show.
[00:56:31] Now for the rest of my conversation with Tobias Rose-Stockwell.
[00:56:37] I've noticed we're all sort of journalists now slash no one's a journalist now. You ever go online, you try to find something and it's old, an article from the LA Times, 2002, or even the '90s, and it's formatted all weird, but it's still digitized? You read it, and it's almost a completely different animal from a similar article you would read today. Many people now think opinion is journalism. It's not. There used to be this clear line, and if you ask people, "Hey, who's your favorite writer, journalist, author, whatever," they always list these talking heads from whatever news channel as their favorite journalist. I'm like, technically that person might be a journalist, but really they're just somebody who yells at somebody else, and that other person yells back at them, and they're both called journalists, but they're not. They're entertainers pretending to be experts on a topic so they can yell at each other on a TV news show. And that's what the news is now, and it's not true to form.
[00:57:30] Tobias Rose-Stockwell: Yeah, so there's this difference between journalism, straight news, and opinion. People don't recognize that there's actually historically a difference between those things, but in the world of straight news journalism, those are actually entirely different rooms, right?
[00:57:47] So you'd have the editorial wing of the paper, and then you'd have the straight news wing of the paper, essentially, where people are reporting on facts, corroborating stuff, making sure that things are accurate, and just getting the news of the day out there. And opinion, which is mostly what you see on Fox News, on CNN, on MSNBC a lot of the time, is fundamentally different from straight news. Straight news reporting is just-the-facts reporting: what happened, why did it happen, where did it happen, what is going on in the world at this very moment.
[00:58:16] Versus opinion, which is analysis. It's an editorial accounting of what's going on and why it's important. And we all respond to that more strongly. We're all very good at opinion. But we're not so good at the actual process of journalism and proper reporting, which I think is one of the biggest problems we're facing right now: we don't really know the difference, and we've all become journalists on social media.
[00:58:37] And without knowing the precursors and preconditions of being journalists, how journalists actually source information factually, everything is becoming opinion. Everything is becoming editorial. Everything's becoming analysis. Everything's becoming a talking head on a regular basis. We're not parsing information the same way.
[00:58:55] Jordan Harbinger: You give this parable of the island. Is it possible to go through that in a brief way? Because I thought this was very eye-opening in terms of how bad actors take advantage of this information environment.
[00:59:07] Tobias Rose-Stockwell: So imagine an island. And I use this as a kind of example of the most basic, kind of the smallest possible democracy, and what would happen on an island, this little parable here.
[00:59:17] There's five people who get together every year and vote on the policies of the island and the trade rates with other islands. This island is a basic democracy, so they elect an executive every year. They have a single person who goes around and factually reports on what's happening on the island. The island's big enough that people don't actually see each other very much, five big farms, and there's just not a lot of coordination possible among people without having a journalist, a guy who goes around and keeps track of what's going on on the island. He's a straight news journalist. He's able to go around, report on what's wrong, and share that with the executive and the rest of the people living on the island.
[00:59:57] One year, one of the members of the island is digging a well, and he finds oil on the island. There's oil bubbling up out of the ground, but he knows he's going to have to share that with the rest of the island's collective. This guy, we call him the loud man, is kind of a blowhard, always a little bit obnoxious, always saying what he wants, but he's part of the democratic system. So he finds oil on the island, and he tries to think of a way of getting this oil for himself, but he can't, because if he reports it to the reporting man, then all of a sudden he's going to have to share this oil wealth with the rest of the people living on the island.
[01:00:31] A few years pass, and one year there's this trader who comes to the island, and he has an antenna. He has this new thing called a radio, and he puts this radio on the island. He hooks everyone up. Everyone thinks this is a great idea: we have this new radio people can use to communicate in real time. The reporting man loves the radio. Everyone loves the radio. They can share information in real time about what's happening all over the island.
[01:00:53] So the loud man, seeing this opening, starts to use the radio against the reporting man. One year, there's a storm that hits the island and the road goes out around the island. The reporting man is trying to keep track of all the repairs that need to happen. He's reporting it to the executive and the rest of the people on the island, and he gets something wrong. And the loud man calls him out. He says, "Look, the reporting man is clearly lying, because there was a problem with the island and he got it wrong." So he's able to make a stink about what is going on elsewhere on the island and tries to basically reduce confidence in the reporting man.
[01:01:29] Now that he has access to the radio, he can start to make his own messaging, and he can just add confusion to the mix. He can start to say, "Oh, look, I think the reporting man is actually lying. I think he's in cahoots with the executive. I think we shouldn't trust him, and I think you should vote for me in the next election." So he starts to increase the level of chaos and confusion by using this radio antenna to promote his own narrative about what's going on.
[01:01:51] The reporting man keeps trying to keep up, but there's this one black mark on him now because he got something wrong, right? And you see this happen all the time with traditional media. A lot of people will say, "Oh, the media is biased. It's wrong." And you see these small points of infraction in the general media, where a media organization, an old guard news organization, will actually get something wrong. And then a certain political aspirant will look at that and say, "Oh, look, the media is broken. The media is actually lying to you."
[01:02:18] And usually the people who do that have an agenda. In the case of the loud man, he was able, ahead of the next election, to increase the confusion just enough to make a number of people not vote. They don't vote for the executive. They stop paying the reporting man, they stop paying for the newspapers, they stop paying for the media that's actually reporting on what's happening on the island. And all of a sudden, the loud man can capture enough votes to become the executive himself. At that point, he passes a new law banning the reporting man, allowing him to use the radio however he wants, to sell the oil rights in secret, and to make money and exploit the island.
[01:02:54] So, I use this as an example. I'm sure you've heard this Steve Bannon quote, "The real enemy is the media, and to deal with them, we need to flood the zone with sh*t."
[01:03:04] Jordan Harbinger: Yeah, I have heard that. That's crazy though. I didn't know if that was real.
[01:03:07] Tobias Rose-Stockwell: It's a real quote from him. And it's a real strategy that authoritarians have been using for a while now. Russia has turned this into a science in recent years, basically trying to increase the net quantity of misinformation and garbage in the system so that people stop caring. They get cynical. They stop trusting straight news reporting. They stop trusting media in general, and they instead check out from politics, or they go with their gut and vote for the guy with the loudest voice.
[01:03:35] So I use this parable as the smallest possible democracy, to show what happens if you remove this central player, the reporting man, the person who goes around and actually tries to make sure that information is good. If you remove that and put in a new system in which anyone can say anything, and which allows enough confusion to exist, that reduces the coordination of the goodwill actors. It allows a single bad-will actor, or a small minority of bad-will actors, to take the place of good straight news reporting. And when that happens, you end up with authoritarian slide. You end up with people who are willing to take what they can from a democratic system, because the noise and the garbage information in the system is so high that people can't actually figure out what is really happening.
[01:04:19] So Putin has become very good at this. Putin is incredibly good at just spewing misinformation and falsehoods out there so that his population can't figure out what is real and what is not. They're like, I guess you can't know. Truth is unknowable. Reality is unknowable. We might as well go with Putin. He seems to know what he's saying.
[01:04:34] Jordan Harbinger: Because when there's uncertainty, most people feel a little bit threatened or scared because they don't know what's out there. And when we feel like we don't know what's out there, or maybe we feel like our current government isn't doing enough to keep us safe from those unknown outside threats, people seek authoritarian leadership. So there's a lot of power to be gained by people who want to shape media and propaganda and other messaging to trigger this in people so that they vote a certain way.
[01:04:59] And I think this was in your book, and I've seen this elsewhere: an uncomfortably large percentage of people say that they would prefer and/or be okay with autocratic, authoritarian rule as long as it was their guy who was in charge. I wish I knew the percentage. Do you know what I'm talking about? Have you heard this?
[01:05:15] Tobias Rose-Stockwell: Yeah, absolutely. It's close to a quarter of the country, on both sides of the political spectrum right now. They're actually okay with an authoritarian, and that is a direct result of losing faith.
[01:05:26] Jordan Harbinger: Yeah, that's terrifying. That should scare everybody.
[01:05:29] Tobias Rose-Stockwell: In the media, essentially, and losing faith in a central body that might hold potential autocrats to account, that might keep the government accountable, keep all of us accountable. When we feel threatened by external narratives, when things get extremely confusing, we tend to default to this authoritarian disposition, and I understand where it comes from, right? If enough things feel like they're going wrong in society, then it makes sense. You'd want to just go with the guy who seems like he's going to fix it, go with the guy with the simple narrative who's telling you the story you want to hear. And that authoritarian disposition is present in a huge portion of the population now, enough so that America, I think, is a lot closer to throwing up its hands when it comes to democracy and saying, just put the guy in charge, it seems like he's going to do what my party needs done. And it's a sizable fraction of the country.
[01:06:17] Jordan Harbinger: Yeah, that's really scary. And I want to highlight the fact that people usually do bad things because of bad incentives. And you're pretty clear about this in the book. It's not that they suddenly turn evil or that the other side is evil. It's that the incentives are bad, and you give a really good example of standing up in a theater. If somebody decides to stand up in the theater to get a better view for themselves, the people behind them are going to go, well, I can't see the show now, so then they have to stand up, and the people behind them have to stand up. And this is cascading incentives, where everybody suddenly has to break the rules in order to even function in that environment. And that's really a bad way to exist. You can see examples of this all over society, and it's only going to get worse if we allow it to go unchecked, which seems to be what is happening.
[01:07:05] Tobias Rose-Stockwell: Yeah, this is a broad category of things called coordination traps, which are really important to understand. There's a mathematical foundation for why they happen, but basically you can see coordination traps happen all over the place, right? You can find them in nature. There's this thing called a crab bucket: if you're crabbing and you put a bunch of crabs into a bucket, they can't coordinate together. They could climb out if they were able to work together, but every time one of the crabs starts to get to the edge of the bucket, another one will pull it down.
[01:07:34] So it's this collective coordination problem they have. And as social creatures, coordination is actually our greatest superpower, this ability to collaborate to solve problems and work toward solutions that we couldn't reach as individuals. So you can see how coordination traps, or coordination problems, are actually the antithesis of social success, the antithesis of our species' success, right? Because if you make the incentives for doing the wrong thing visible enough, people will do the wrong thing, even though they know it's the wrong thing.
[01:08:04] Another example is a bank run. If someone has bad information about the stability of a bank, and this just happened very recently, you had this kind of viral explosion of misinformation about the solvency of a couple of different banks, all shared via WhatsApp and group text threads saying this bank is maybe going under, and all of a sudden people start pulling their money out. And if you're halfway through, even if you have perfectly good information that the bank is solvent and has no problems whatsoever, if you're in the middle of a bank run, it still makes sense for you to go and pull your money out of that bank, right?
[01:08:39] So this is a collective coordination problem. A small fraction of people have bad information and they start this process, this cascade, and you're caught in the middle. You know it's the wrong thing to do, but you're going to do it because otherwise you're going to lose all your money. And there's a whole category of these issues, these things that happen.
[01:08:54] Yeah, so there's the example of being at a sit-down concert. Someone's playing some nice acoustic music, and someone in front stands up, and the person behind them stands up, and the person behind them needs to stand up, and all of a sudden everyone needs to stand up. Everyone has a worse view of the stage, but you had no choice, you had to stand up, and the person behind you had to stand up. You can see these kinds of coordination traps happen all the time.
[01:09:14] And that is really the origin of a lot of the worst things that happen in society writ large: you end up with situations where the incentives for doing the wrong thing are greater than the incentives for doing the right thing, even if you know consciously that it's the wrong thing to do.
[01:09:29] And so social media is very much a collective coordination trap for us, right? There's a lot about social media that makes it easy for us to do the wrong thing. If you want to get ahead in the attentional environment, if you want to make sure your newspaper or your publication stays alive, then you've got to use this crazy headline packaging in order to make your message heard, and you have to compete in the attentional environment against the loud, obnoxious rage-baiting that happens. Or if you want to get a lot of friends and followers online, you need to compete in the same environment, which forces us down the race to the bottom of the brain stem,
[01:10:02] as Tristan Harris says. You have to force yourself to do things that aren't necessarily in the best interest of everyone, and social media is very good at that. Unfortunately, democracies writ large are prone to a lot of the same problems that come from coordination issues. So it's important to recognize that coordination traps have solutions. Usually, it requires some kind of authority, or a consortium of individual players getting together to act as an authority, to try to reset the game.
[01:10:32] In the case of the concert, it's the person on stage saying, "Excuse me, could everyone please just sit down so everyone can see the stage?" And then, all of a sudden, there's a rule in place, right? There's a suggested norm, everyone sits down, and that resets the game. In the case of a bank run, it's the FDIC, the government coming in and saying, look, we are going to stand behind the deposits of this bank, and we are going to make sure that depositors aren't going to lose their money. It's a trust game writ large.
[01:10:56] And if you don't have these third parties that are willing to come in and help reset the game, then coordination traps will ruin it. And so democracy is very much like a coordination game. You need to have a lot of players that have good information, that have accurate information, in order to make sure that you're getting the right candidates.
[01:11:12] And social media has very much undermined our ability to get good, accurate information on a regular basis. And it's given us tremendous incentives to defect and to break the general coordination function of good democracies.
[01:11:26] Jordan Harbinger: I've got some practicals for how people can consume media better and become better at thinking about these sorts of problems, but there's one I'd love for you to teach us before we wrap here, which I thought was genius. So let's say we're arguing with somebody. How do we reframe our concerns in terms of the other person's values? This is a great rhetorical technique that I'm embarrassed I haven't used my whole life.
[01:11:49] Tobias Rose-Stockwell: Sure. Yes. When it comes to disagreeing with others, disagreement is required for a well-functioning society. We need to disagree about stuff. That's how we get to better truths. That's how we get to better understanding, right? Democracy itself functions like this big outrage machine, right? We're given these tools of functioning government, but we need to be able to see clearly the problems in front of us, and we need to be able to debate clearly about what those problems are.
[01:12:17] And that's the point of the First Amendment, right? It's like when I say something that is contentious, you can call out the ways that I am wrong because you can see through your perspective the ways in which my arguments have flaws, right? So democracies really do require disagreement, public disagreement, debate, compromise, all of these things in order to function correctly, right?
[01:12:37] But when we're disagreeing online, oftentimes there aren't good incentives for actual debate and discussion. When we disagree online, we have followers that have been built up like bleachers alongside our every disagreement, right? And you can think about this: if you and I are having a private conversation about a contentious issue, we're more likely to try to get to the core of it. You and I would be like, okay, cool, let's talk about the central point of disagreement here, the issue at hand. But if there are suddenly dozens and dozens of people around us, some on my side and some on your side, I'm going to want to look as good as possible to the audience. And so I'm going to play to my audience's desires instead of refuting your central point.
[01:13:19] That's a big problem with social media writ large: it pulls us away from this process of collective debate and understanding and pushes us toward rhetorical fighting focused on the desires of our audience.
[01:13:31] So if you are actually disagreeing with people, and this will happen, right? It's part of being human and part of sharing a world together. We need to have better disagreements. And so the point is not to not have disagreements. The point is to try to figure out how to disagree better, not less, just disagree better.
[01:13:46] There's a really helpful frame for this, which you can think about as a secret code. I think about this a lot. I have a lot of friends who are politically diverse, who come from very different backgrounds and have different beliefs. And what I find is that if you start from the general rhetorical battleground narrative, you're explaining the problem, but you're not speaking to the person's emotional foundations. And that's really important: if you want to actually change people's minds, you need to get curious about their value system. You need to understand their moral foundations, the core values they're trying to express.
[01:14:27] Jonathan Haidt uses the analogy of an elephant and a rider. Our conscious brain is essentially the rider, this kind of lawyer on top of an elephant, explaining away the actions of the elephant, which is our emotional, values-driven self. The elephant is trundling along, going with the emotional whims of the day, but the rider on top is basically the lawyer, just explaining away the elephant, right? The emotion.
[01:14:53] So most of the time when I'm making an argument, I'm actually the lawyer on top of my elephant explaining to the lawyer on top of your elephant what is wrong with your argument. And that's not going to change the perspective of the elephant, right? So the point is, if you can figure out what is driving the elephant, what kind of moral foundation the other person is following, then you can change the direction of their elephant.
[01:15:13] You say, okay, cool. Say we're disagreeing about gun rights, for instance, right? And I'm coming at this from the moral foundation of harm, of safety. For me, the issue is about the safety of my kids going to school. I don't want there to be guns in my kids' school, for instance. And you're coming at gun rights in terms of liberty, the liberty to own a firearm. If you want to own a firearm, that's coming from a liberty-based foundation.
[01:15:37] So if we're fighting about this, I'm going to be talking to your lawyer. Instead, I should be talking to your foundation of liberty. I should think, okay, how can I reframe my argument in the frame of liberty? How can I make the safety of my kids an issue of liberty? So: I want to live in a society free enough that my kids are free from having to worry about guns in their school, for instance. I can start to change my messaging and get curious about their moral foundations. And the questions you can ask here are really: What are the values that are important to you around this issue? What are the core emotional feelings you get when you think about what should and should not be happening when it comes to gun rights in this country?
[01:16:16] And if you get curious about that, you can act less like a warrior and more like a detective or a scout: okay, cool, let me understand where you're actually coming from here. You can try to reframe the issue based on that moral foundation, and you'll get much further on the topic of contention. It's almost like ninja skills for conversational debate and disagreement. And you'll get closer to actually changing people's minds and moving the conversation forward.
[01:16:42] Because first of all, you will understand the issue better. You'll understand what the core issues are for the people holding their ground, and they will understand you better too. And you can both start to act in concert and figure out what the foundations of the disagreement are. So definitely don't assume the beliefs of others. I think one of the biggest problems we face right now in political discussion is that we assume the worst of the other side, right? We get this small fraction of an opinion that we assume is representative of the entire opinion of the other side. But if we focus instead on the emotional foundations of the individual, we can get much further in finding common ground.
[01:17:16] Jordan Harbinger: You go over the history of how we all became journalists: starting from the printing press, authoritarian control of beliefs coming down from the church, the fairness doctrine. I mean, there's just a lot going on. It's easier to just go, "Ugh, I don't know. It's so complicated. Who knows what's correct?" I hear people say things like that, and it makes me want to stop and shake them and be like, "No! Apathy is the enemy! That's the enemy! Not the other side."
[01:17:42] Tobias Rose-Stockwell: Yeah.
[01:17:42] Jordan Harbinger: Man, I've got more practicals in the show close, but I have to say, Tobias Rose-Stockwell, you definitely brought three names' worth of value and expertise today. And I appreciate you coming on the show.
[01:17:53] Tobias Rose-Stockwell: Yeah. This is awesome. I'm honored to have the opportunity to unpack this with you. And all of these problems that we're facing are quite solvable. Truth is noble. And I think we can get to a much better future if we're willing to put in the effort.
[01:18:05] Jordan Harbinger: You're about to hear a preview of The Jordan Harbinger Show about the warning signs for civil war.
[01:18:11] Barbara F. Walter: There were times when I was writing that I myself started to get terrified.
[01:18:16] Is this right? Am I getting this right? Because what I'm saying is going to hit people hard. There have been hundreds of studies of civil wars. The group that tends to start these wars are the once dominant groups that are in decline. The group that has been politically, socially, economically dominant since the very beginning of this country, white Christian males for the most part.
[01:18:40] America is going through this radical demographic transition from a white majority country to a white minority country. White working class men have declined on most social and economic measures; that hasn't happened with any other demographic group. And there's a subset of this population that's deeply resentful of that, that's deeply threatened by that, and truly, truly believes that it's their patriotic duty to do something about this.
[01:19:10] January 6 was so public, it was so obvious. This is part of a far right, white supremacist, anti-federal government movement here in the United States. We know that some of the far-right militias, the Oath Keepers, the Proud Boys, and the Three Percenters actively encouraged members to join the military, to join law enforcement. If you continuously portray this as these are just crazy individuals, then you remain blind to what's actually the cancer that's growing slowly from within.
[01:19:47] Jordan Harbinger: To hear whether we're on the cusp of a civil war here in the United States, check out episode 718 of The Jordan Harbinger Show.
[01:19:56] Not a lot to love about social media here in this one.
[01:19:58] I think it's going to get so much worse with AI making bespoke outrage content for all of us every single day, based on things we've engaged with negatively in the past (or in any way, really, but especially negatively). The Internet is going to have this copy of us, this simulacrum built from all of our digital surplus. Every like we've ever made on social media is going to be used, weaponized essentially, against us at some point.
[01:20:24] I know that sounds hyperbolic, but I don't see that as far away, because these companies want clicks. This is how they make their money, and if they can enrage us, they know they'll get more of them. So that's a little bit terrifying, to me anyway. People have incentives to deliberately misinterpret what other people say and do online so they can get clout by reacting or being offended, right?
[01:20:43] So it doesn't even have to be us against the machine; it's us against other people, with all of us being egged on by the machine. And we end up with something called semantic drift. I thought this concept was interesting. Semantic drift spreads faster than our ability as a culture to catch up. So when people aren't caught up on the latest language use and they use language that is no longer acceptable to whatever group, they get called out. And sometimes these call-outs are done badly. When this happens, people are humiliated, and if it happens online, it's public, right? And we have all these groups mixing together in the so-called public square of social media.
[01:21:18] And when they're humiliated in public like that, they seek in-group familiarity. In other words, shaming people for misuse of language, even if it's done with good intentions, can often increase tribalism. An example of this would be policing Crazy Uncle Frank at Thanksgiving, who uses the term homos when he's talking about LGBT folks. Instead of being corrected and going, "Oh, we don't say that anymore," he goes, "Oh, the younger generation, all they do is police language. We can't say anything anymore. They're all soft. They're all libtards," or whatever it is that Uncle Frank has beef with. And now he's sharing that with other people who feel humiliated online. And so we end up with a good old tribe of crazy Uncle Franks talking about how you can't use the word homo anymore and how unfair that is. Those people join these groups and become more and more insular and more and more radicalized as a result.
[01:22:11] It's clear that we are running imperfect algorithms in our brains, and we see this in our cognitive biases. One that I thought was particularly interesting, which Tobias brought up in the book: just because something gets more coverage doesn't mean there's more suffering. The same disaster affects the same number of people regardless of how many articles or how much time in the news cycle it receives. It just means a story has found its audience. Remember that Titanic exploration submarine from a while ago that went missing, the one with a carbon fiber hull that had like five people in it? Much of the world was captivated by this.
[01:22:46] And people were saying, "What happened to the coverage of the migrant boat that capsized off the coast of Greece, or wherever it was? Why aren't we talking about that more?" And the truth is, I think a lot of people were able to relate to the submarine story in a way that was particularly horrifying. That story had found its audience and caught on like wildfire. It doesn't mean there's more suffering, right? Far more people died on that capsized boat than in that submarine, but one received more coverage. If you were just going by the coverage, man, you'd think half the population of Rhode Island was in that submarine.
[01:23:20] And we've always got to remember: good news doesn't sell. Good news is not actually news. Gradual improvements in something don't make the news either. Bad news is what gets the clicks. So we have to be careful how we evaluate the coverage. And I know it's tempting to just tap out and ignore the news entirely. I don't think we want to become that society either, but I'm not above blocking accounts that make you feel bad. If something makes you feel envious all the time, or makes you feel desperate, or triggers depressive behavior or self-shaming, block those accounts. You don't need that. If an account makes you think you need to buy the things some other person has, or that you're less than, just block it. There's no shame in that. Those accounts are designed to make you feel bad. They're designed to make you feel lack and want so that you buy something or buy into an idea. Just get rid of that from your neurological ecosystem.
[01:24:09] More tips and practicals from the book are also going to be on outragemachine.org. We'll link to that in the show notes as well. Don't forget transcripts are in the show notes. Advertisers, deals, discount codes, and ways to support the show are all at jordanharbinger.com/deals. Please consider supporting those who support the show.
[01:24:25] We've also got our newsletter, and every week the team and I dig into an older episode of the show. So if you're a fan of the show, want a recap of important highlights and takeaways, or just want to know what to listen to next, the newsletter is a great place for that. Jordanharbinger.com/news is where you can find it. We're going to be doing some giveaways on there as well. Don't forget about Six-Minute Networking, over at jordanharbinger.com/course. I'm at @JordanHarbinger on both Twitter and Instagram. You can also connect with me on LinkedIn.
[01:24:51] This show is created in association with PodcastOne. My team is Jen Harbinger, Jase Sanderson, Robert Fogarty, Milie Ocampo, Ian Baird, and Gabriel Mizrahi. Remember, we rise by lifting others. The fee for this show is you share it with friends when you find something useful or interesting. The greatest compliment you can give us is to share the show with those you care about. If you know somebody who's interested in social media, fake news, disinformation, definitely share this episode with them. In the meantime, I hope you apply what you hear on the show, so you can live what you learn, and we'll see you next time.
[01:25:23] This episode is sponsored in part by the Ten Percent Happier Podcast. Ten Percent Happier is a podcast hosted by my good friend Dan Harris, and it operates on one simple principle: happiness is a learnable skill, so why not become a pro at it? Dan Harris was a restless and skeptical journalist until a panic attack on live TV made him reassess his life. The profound transformation he underwent propelled him on a quest to absorb as much knowledge as possible about the human mind. Today, his mission is to help others find their own peace and happiness. Every week, he engages in comprehensive discussions with leading scientists, meditation gurus, and even the occasional celebrity, tackling subjects such as productivity, anxiety, enlightenment, psychedelics, and relationships. His guest list boasts names from Dr. Gabor Maté to Brené Brown to Mike D of the Beastie Boys. Consider tuning in to Ten Percent Happier as a workout for your mind. The Ten Percent Happier podcast is available wherever you listen to podcasts.