Renee DiResta (@noUpside) is the technical research manager at Stanford Internet Observatory. She studies the role that tech platforms and curatorial algorithms play in the proliferation of disinformation and conspiracy theories, terrorist activities, and state-sponsored information warfare.
What We Discuss with Renee DiResta:
- How the anti-vaxx movement, the Tea Party, and militia groups learned early on to market and cross-pollinate their messages across social media to receptive segments of the populace with shocking success.
- The challenges faced by the US government in trying to stem the tide of social media exploitation by fringe groups and terrorists for the purposes of recruitment, generating sympathy for their causes, and fueling general chaos.
- How entire networks of disinformation evolve to support and perpetuate interconnected webs of conspiracy theories, pseudoscientific bunk, fringe political views, and propaganda planted by hostile states and bad actors.
- Why foreign organizations like Russia’s Internet Research Agency and China’s 50 Cent Party have promoted tribalism in the United States to keep us fighting among ourselves, how it serves their interests at our expense, and why we’re so receptive to their influence.
- Why memes and hashtags are so effective as short-form vehicles to promote simple ideas powerfully across wide demographics.
- And much more…
Like this show? Please leave us a review here — even one sentence helps! Consider including your Twitter handle so we can thank you personally!
What does a typical day in October of 2020 look like for you? Maybe you grabbed the morning coffee and checked in on your heirloom tomato gardeners’ Facebook group to pick up some tips for growing the most flavor per square inch (I’m not a gardener — is that even a thing? It sounds like it might be a thing). While there, you noticed a curious assortment of recommended groups the algorithm is convinced you might enjoy. Among them, you’ve got the holistic beekeepers, the medicinal ginger picklers, the carob truthers, and the antisemitic forest rakers. Wait…what? Of course you’re going to check that one out. And then that leads you down an Internet rabbit hole of disturbing, yet strangely compelling information (or, just as likely, disinformation) that keeps you occupied on the weird part of YouTube until lunchtime. Your head is bubbling with a frenzy of semi-facts, pseudo-malarky, and straight-up fiction (we call those lies where I come from). Now you’re arguing with high school friends about the evils of vaccination, backed up with all the “research” you’ve done this morning. You’ve got gallant strangers chiming in to defend your stance, and an equal number of clueless “sheeple” calling you names for it. It sounds like you’re having a pretty busy week educating yourself and the masses, but then you realize it’s only Monday.
On this episode, we talk to Renee DiResta, the technical research manager at Stanford Internet Observatory. Her job is to understand the spread of malign narratives across social networks, and assist policymakers in devising responses to the problem. To do this, she studies the role that tech platforms and curatorial algorithms play in the proliferation of disinformation and conspiracy theories, terrorist activities, and state-sponsored information warfare. She’s been doxed by anti-vaxxers and recruited by the White House to fight the marketing efforts of ISIS, but she’s still fighting the good fight. Want to find out how you can ensure her work is not in vain? Listen, learn, and enjoy!
Please Scroll Down for Featured Resources and Transcript!
Please note that some of the links on this page (books, movies, music, etc.) lead to affiliate programs for which The Jordan Harbinger Show receives compensation. It’s just one of the ways we keep the lights on around here. Thank you for your support!
Sign up for Six-Minute Networking — our free networking and relationship development mini course — at jordanharbinger.com/course!
Microsoft Teams lets you bring everyone together in one space, collaborate, draw live, share, and build ideas with everyone on the same page, and makes sure more of your team is seen and heard, with up to 49 people on screen at once. Check out microsoft.com/teams for more info!
Miss our episode with Matthew Schrier, the photographer who was held captive by Al-Qaeda for seven months before escaping? Catch up with episode 217: Matthew Schrier | How to Survive in a Secret Syrian Terrorist Prison here!
THANKS, RENEE DIRESTA!
If you enjoyed this session with Renee DiResta, let her know by clicking on the link below and sending her a quick shout-out on Twitter:
Click here to thank Renee DiResta on Twitter!
Click here to let Jordan know about your number one takeaway from this episode!
And if you want us to answer your questions on one of our upcoming weekly Feedback Friday episodes, drop us a line at friday@jordanharbinger.com.
Resources from This Episode:
- Renee DiResta | Website
- Renee DiResta | Twitter
- Renee DiResta | Wired
- The Hardware Startup: Building Your Product, Business, and Brand by Renee DiResta, Brady Forrest, and Ryan Vinyard
- Internet Observatory | Stanford University
- Data for Democracy
- Renee DiResta: Data for Democracy | Frontline, PBS
- Anti Vaxxers: Understanding Opposition to Vaccines | Healthline
- Anti-Vaxx Mom Asks How to Protect Her Unvaccinated 3-Year-Old from the Measles Outbreak, Internet Delivers | Bored Panda
- Measles Outbreak: California, December 2014–February 2015 | CDC
- Vaccinate California
- The Tea Party Didn’t Get What It Wanted, but It Did Unleash the Politics of Anger | The New York Times
- What To Do If You’ve Been Doxed | Wired
- Whither the Limited-Purpose Public Figure? | Hofstra Law Review
- Five Things to Know Now About California’s New Vaccine Law | CalMatters
- The Real-Life Conversion of a Former Anti-Vaxxer | California Healthline
- Here to Stay and Growing: Combating ISIS Propaganda Networks | Brookings Institution
- How US Government Wants Silicon Valley Tech Leaders to Fight ISIS | The Christian Science Monitor
- The Internet’s Endless Appetite for Death Video | The New York Times
- She Warned of ‘Peer-to-Peer Misinformation.’ Congress Listened. | The New York Times
- Terrorist or Freedom Fighter? | The New Yorker
- Social Media in Strategic Communication (SMISC) | DARPA
- Warnings: Finding Cassandras to Stop Catastrophes by Richard A. Clarke and R.P. Eddy
- Richard Clarke | Defending Ourselves in the Age of Cyber Threats | The Jordan Harbinger Show 240
- Anatomy of an Online Misinformation Network | PLOS One
- What is Pizzagate? 10 Facts About the Conspiracy Theory | Esquire
- What Is QAnon, the Viral Pro-Trump Conspiracy Theory? | The New York Times
- How Russian Trolls Used Meme Warfare to Divide America | Wired
- How China’s 50 Cent Party Trolls by Distraction | The Atlantic
- How to Get Beyond Our Tribal Politics | The Wall Street Journal
- Jordan Peterson, the Obscure Canadian Psychologist Turned Right-Wing Celebrity, Explained | Vox
- Why Is the Rose a Symbol of Socialism/Social Democracy? | AskHistorians, Reddit
- Millionaire Tells Millennials: If You Want a House, Stop Buying Avocado Toast | The Guardian
- When the ‘Heritage’ In ‘Heritage Not Hate’ is More Skynyrd Than Stonewall Jackson | Code Switch, NPR
- Veterans Before Refugees. The Narrative Hurts Us All. | Happy Joe
- Richard Dawkins Told Us What He Thinks About Memes | Vice
- Start Here: Your Guide to Getting Into K-Pop | NPR
Transcript for Renee DiResta | Dismantling the Disinformation Machine (Episode 420)
Jordan Harbinger: [00:00:00] This podcast is brought to you by Microsoft Teams. When there are more ways to be together, there are more ways to be a team.
[00:00:07] Coming up on The Jordan Harbinger Show.
Renee DiResta: [00:00:09] Social platforms, they were not developed to be information libraries and repositories of human knowledge, where people go find answers to their financial and health and political questions on those platforms. They were to help you like find your friends, right? And find your knitting club, not your scientific and medical authority for how you're going to treat a disease.
Jordan Harbinger: [00:00:34] Welcome to the show. I'm Jordan Harbinger. On The Jordan Harbinger Show, we decode the stories, secrets, and skills of the world's most fascinating people. If you're new to the show, we have in-depth conversations with people at the top of their game, astronauts, entrepreneurs, spies, psychologists, even the occasional drug trafficker. Each show turns our guests' wisdom into practical advice that you can use to build a deeper understanding of how the world works and become a better critical thinker.
[00:00:58] Today, not a drug trafficker, but my friend Renee DiResta. She's a researcher who studies influence operations and propaganda in the context of pseudoscience, conspiracies, terrorist activity, and state-sponsored information warfare. Now, if you haven't heard about this, Russia is looking for the cracks and fissures. And then once, of course, they see something, they ignite it and pour gasoline on the fire. And I just mixed that — well, whatever. Putin loves a good mixed metaphor, from what I hear. You get what I mean. You may have also noticed that online arguments get nastier more quickly, because the trolls are escalating things.
[00:01:29] Now, of course, it's important to separate Russian interference from collusion and things like, "Well, Trump wouldn't have won without this." That's neither here nor there for this conversation today. The above really clouds us from looking at this objectively. Russian election interference happened. We know this. Intelligence agencies say that it happened. The government says that it happened. The platforms themselves have seen that it happened. Everyone acknowledges that it happened and is continuing to go on around the world. It's not the same thing as collusion between parties and Russia or whatever — for those of you who are just bristling at this, or just not sure of the difference.
[00:02:01] Now, today we'll discuss the extent of these influence campaigns and how they work, including why we see so many bots and real people alike, spewing hate online as well as how platforms like Facebook were used as pawns in the plot to create strife and division right here in the United States and Western countries in general. We'll also explore some practical ideas on how to protect ourselves from this including how we can tell if we are seeing an influence campaign or if it's real grassroots activism, and why Americans have to be especially vigilant as our 2020 election draws closer.
[00:02:33] If you're wondering how I manage to book all these great authors, thinkers, and celebrities every single week, it's because of my network. I'm teaching you how to build your network for free over at jordanharbinger.com/course. And by the way, everyone you hear on the show has either contributed to the course, is in the course, or both. So come join us — you'll be in smart company. Now, here's Renee DiResta.
[00:02:54] How did this get on your radar? I know this sort of started with you while trying to do the impossible, which is to find preschool for your kids.
Renee DiResta: [00:03:00] Yes. So I had my first son in December 2013, and in California, in San Francisco, you have to get on these waiting lists like a year ahead of when you want it. So December of 2014 was when I had to get him on this list. Anyway, I started Googling around looking for schools, but I also wanted to look at the vaccination rates, because that had come up. I used to live in New York, and it was kind of required, but in California, you could just opt out for personal beliefs. You could just say, "We're not going to vaccinate," and that's it — a philosophical exemption.
[00:03:27] So I didn't want to go to a school like that. And I downloaded a bunch of data and was really disturbed by how the rates had been trending down over time. And I wrote a blog post about it in late November, looking at how private schools were a million times worse, how all schools were an absolute disaster — just trying to say, California parents, we should be demanding better here. I called my local representative. I had never done that before, but I felt like —
Jordan Harbinger: [00:03:52] That's such a Karen move.
Renee DiResta: [00:03:53] I was so annoyed. I was like, "How come there are schools with 30 percent immunization rates?" and I was like, "What the hell is going on here? Can we do something about that?" and they were like, "Yeah, nothing to do, nothing to do about that, you know, good luck." And then the Disneyland measles outbreaks happened a month later. And I called back again and I was like, "Hey, how about now?" And they were like, "Actually, yeah. You know we have a Senator in Sacramento, who's going to be — kind of introducing a bill to deal with that situation." And I said, "You know I'd love to help. Maybe I could do some data analysis for you guys. I'd love to be useful in some way."
[00:04:27] And so they put me in touch with two other moms, and we started this group called Vaccinate California. And by starting a group, I mean, we made a Facebook page, just to be clear.
Jordan Harbinger: [00:04:35] Yeah.
Renee DiResta: [00:04:35] There were no legal filings. There was no sense that this was going to be a big deal. We didn't really know what we were getting into. So we just made a Facebook page. And I started running Facebook ads, because we needed people to call in and support the bill. And so I started running Facebook ads, and I got to learn and try to do ad targeting and lookalike audiences and a bunch of other kinds of more sophisticated ad-targeting tools. And in the course of doing that, we also started putting out content about the bill on Twitter. I would say I'm part of SF tech.
[00:05:03] I've been on Twitter for years. And I was kind of completely blown away by the activism on Twitter, particularly by how well organized the anti-vaccine movement already was. And the way in which fake accounts would come out of nowhere — they would say that they were from California; they'd been created yesterday. They were communicating. They were being given instructions in YouTube videos that were being put out every night: "These are the hashtags you're going after. Here are the memes you're going to use. Here's a folder with memes in them." And I just felt like that was an amazingly organized operation.
[00:05:32] On Facebook, what I started noticing was, if you searched for "vaccine" when running any of your ad targeting, what it started giving you was anti-vaccine suggestions. So if you typed in "vaccine," it would push you a bunch of anti-vaccine topics, and you could target anti-vaxxers really easily. But I was like, "Well, how do I find parents who want high immunization levels in their schools?" And I realized that there was this asymmetry, right? Like, most people vaccinate, but they don't stop to think about it. They definitely don't become activists about it. They vaccinate their kids, nothing happens. They move on with their lives, and they don't get on social media to talk about how nothing happened.
[00:06:06] So I was kind of struck by this asymmetry and by saying, like, "Okay, how do we rectify this? Because it seems like all of the algorithmic boosts are driving things in the other direction." And so I started mapping the networks. I did some network analysis with a data scientist named Gilad Lotan. We started looking at how the anti-vaccine parents on Twitter, who had been anti-vaccine for years and years and years, were beginning to reach out to the tea party, for example, to some of the militia movements, to some of the Second Amendment activists. All of which, again, were the kinds of communities that had been on Twitter for a really long time.
Jordan Harbinger: [00:06:40] I haven't heard about the tea party in, like —
Renee DiResta: [00:06:42] I know, the tea party in like 2015, like late 2015, early —
Jordan Harbinger: [00:06:47] Yeah. That was some Sarah Palin stuff right there.
Renee DiResta: [00:06:49] Early 2015. It really was. So this was before the 2016 campaign. This was all happening in January to June 2015 or so, so pretty early on. And I started writing about how social networks were enabling different persistent factions of people, different communities, to find each other and to, you know, kind of "marry hashtags," as the anti-vaccine activists called it. So if you wanted to push your anti-vaccine content into tea party communities, you would use the tea party hashtags alongside yours. They were using these basic marketing tactics for policy activism. And I thought that was really interesting, because it made me feel like everything was going to be a marketing campaign for an idea. Any policy that went out the door from now on was effectively going to use these tactics.
[00:07:31] And so the people who were best, most adept at deploying them were going to be the groups that got their message heard, because what mattered was share of voice. On social media, what matters is how much of your audience you can reach. Can your meme become the sticky thing that everybody says? Does it have virality potential, where it's going to be forwarded along? How are regular people going to see your content and engage with it? How are influencers going to pick up your content and engage with it? And so I started looking a lot at the dynamics of just this one niche bill in California that was highly politically polarizing. There was a ton of media content going on, but I got doxed. I got harassed. You know, I was not expecting that either.
Jordan Harbinger: [00:08:09] Sure, yeah, of course. That was a surprise for you at that time. And now it's like, "Oh, of course."
Renee DiResta: [00:08:13] At that time. In 2015, it was a surprise. Now, it's like obviously, right?
Jordan Harbinger: [00:08:17] Yeah.
Renee DiResta: [00:08:18] So also the cost of being a public advocate has really changed now too, you know. Here, I think I'm just like a mom.
Jordan Harbinger: [00:08:24] Right.
Renee DiResta: [00:08:24] And that's not it.
Jordan Harbinger: [00:08:26] Doxing, by the way, for people that don't know, is when they put all of your personal information online, essentially to say, "Call Renee DiResta at her place of work and at home, and bother her and her family, because she's doing something bad." I don't think everyone knows what doxing means, but it's worse for people who aren't already public figures, because, of course, someone can just find your office and call you — you work wherever you work. But it's bad when you're an activist and you have a bunch of crazy people calling your husband's workplace and telling him nasty things about you, which is kind of the point of doing that. Right?
Renee DiResta: [00:08:55] Right. And so I used to have a little Tumblr blog where I put pictures of my kids — well, I had one kid at the time — but you know, wedding pictures, like just your life. The way that normal people engage on the Internet. I locked all that shit down.
Jordan Harbinger: [00:09:07] Yeah.
Renee DiResta: [00:09:07] I took it all down. You know, Twitter's response to me when it happened — the effect of it was like, you're a limited-purpose public figure now. And they started making YouTube videos about me also. I'd never heard that term before. I didn't realize that —
Jordan Harbinger: [00:09:20] What does that mean?
Renee DiResta: [00:09:21] Limited-purpose public figure is a legal designation that says that you have chosen to inject yourself into a particular, narrowly tailored conversation — which for me was immunization policy in California — which meant that I was a public figure in the context of that topic that I had chosen to engage in. Now, legally, that would mean if I were to file a defamation suit, I would have to, you know, prove an intent to harm as opposed to just intent to harass. The protections are different for these different tiers of publicness, I guess.
Jordan Harbinger: [00:09:52] Yeah.
Renee DiResta: [00:09:52] And I thought that was bizarre too, because I thought, well, in the age of social media, anybody who, again, throws up a Facebook page, puts up a Twitter account, and weighs in on the conversation — if the Internet gets mad at them and turns them into a public figure, like, boom, there you go.
Jordan Harbinger: [00:10:06] Right, you don't have a choice. Like, before, you had to run for office. That's one of the sort of traditional public-figure occupations listed in the law. It's been a long time since law school, but I think the other one would be, like, if you are an actual celebrity.
Renee DiResta: [00:10:21] Right. Like regular people would recognize you.
Jordan Harbinger: [00:10:23] Right. And so you have all of the downsides of being a public figure, but none of the real upside of getting money thrown at you to wear Chanel perfume and, like, getting tables first in restaurants.
Renee DiResta: [00:10:33] Or bodyguards.
Jordan Harbinger: [00:10:34] Right, no, downsides only. Right.
Renee DiResta: [00:10:37] So I had this experience and I thought like, all right, so I'd learned a couple of things here. We did get the law passed — so, you know, success. It's a really hard slog. I was very, very proud of being able to get that done. We were the first state to do it in something like, I think, 30 or 40 years.
Jordan Harbinger: [00:10:50] And the law to be clear was what again?
Renee DiResta: [00:10:52] The law says you have to have a medical reason to opt your children out of vaccines. That was it.
Jordan Harbinger: [00:10:56] Gotcha.
Renee DiResta: [00:10:56] You had to have some sort of like medical justification as opposed to, "I don't want to."
Jordan Harbinger: [00:11:00] Right.
Renee DiResta: [00:11:01] You know, so I mean, reasonable people can disagree on how they feel about that law and where they think the line should be and what the state's obligation or responsibility should be. I feel like reasonable people shouldn't turn that disagreement into doxing, harassing, threatening.
Jordan Harbinger: [00:11:17] Right.
Renee DiResta: [00:11:18] That's where we got to get into the realm of what the Internet delivered as far as policy conversations versus the olden days where you fight it out through some op-eds in a newspaper. Right?
Jordan Harbinger: [00:11:27] Right.
Renee DiResta: [00:11:28] So I learned a lot about that. I really felt like I got this introduction into network activism. And I think it was the fact that we took a very quantitative approach to understanding how these groups were operating, how they were networking, how social media algorithms — how curation and amplification — were really amplifying this content. And I had been a venture capitalist, so I had done a lot of investing in tech companies, looking at tech business models, looking at implications of technology on society. And I really felt like what I was seeing was this shift, this thing that was going to change political campaigning, policy advocacy, and activism online in a very, very fundamental way.
[00:12:06] And then what wound up happening for me was the White House reached out, through Todd Park, a former chief technology officer. He asked me to meet up. And he said, "You know, some of the work that you're doing — we're trying to do some similar work, looking at how terrorist organizations are using social media for recruitment, advocacy, and propaganda. We think that the way that you've been writing about this and talking about this and seeing this develop is something that we'd like to have you kind of come down to DC and participate in some of our conversations about, in the context of this other group."
[00:12:37] So that was not really where I expected my career to go, but that's how it went.
Jordan Harbinger: [00:12:41] Yeah.
Renee DiResta: [00:12:42] I went to DC for about three weeks, worked with a group of people who had done a lot of this work. You know, I was not a counterterrorism expert, but this was ISIS, of course, which was going on at the time. And if you recall, back in 2015, this was the era of the beheading videos —
Jordan Harbinger: [00:12:55] Yeah.
Renee DiResta: [00:12:56] — and the ISIS fanboys, the recruitment process. Recruiting the women to go be Jihadi brides, looking at ways in which — again, if you followed one Jihadi on Twitter, Twitter would suggest like six more. We're looking at ways in which radicalism was happening and in a very real sense, not in a debatable. Is this radicalization or not? Like an actual international terrorist organization was using this platform to push out its message and recruit new adherence. There were what they called ISIS sort of fanboys. So there were the core accounts that would occasionally post the beheading videos and things. Those would come down quite quickly. Twitter would take those down, but then there were the people who would try to serve as like the amplifiers or the people who were like, "Hey, Twitter is censoring this content but you can go find it over here on this other platform," directing people to things like LiveLeak and other places. So they're really working to amplify the content, amplify the message.
[00:13:43] And the thing that we started talking about was, what happens when social platforms become kind of vast tools for propaganda? What happens when they become vast amplification tools? That was really a very different way than propaganda had been done in the past. It wasn't being carried out only through the media, it wasn't being carried out only through broadcast or print — it was a participatory process. There were ways in which you could produce content — memes or YouTube videos. You could go make your own content, and then you could disseminate your own content. And what was happening was these very small organizations were able to follow that process: put the content out, and then real people would come and serve as amplifiers. Leaving this question of whether, if you were to take down or delete the accounts of the people who were amplifying it, you're kind of stifling free expression.
[00:14:34] So this manufacturing process — one group producing the content, but then regular peer-to-peer dissemination — was something that we'd not really seen before. It was something that the Internet really enabled in a way that prior broadcast mechanisms and print just hadn't; you know, it wasn't really possible.
Jordan Harbinger: [00:14:54] And this took Washington and everyone by surprise, right? They were thinking like, "Oh, well, if we find a channel that has ISIS propaganda, we'll just close the channel. We'll just close the website. We'll do something to that." Now, when it's thousands of users or tens of thousands of users on social media, you have to invent software to track it. And then if you shut that down, there are all these free speech issues. Right? I assume that they then said — there's always the slippery slope fallacy, where it's like, "Well, if we shut down anti-vaxx, then what else are we going to shut down?" And I would imagine they used that with ISIS. Although what's the argument there — if we shut down beheading videos, then what's next?
Renee DiResta: [00:15:26] No, it was "one man's terrorist is another man's freedom fighter." And if the US government requests this be taken down, what if a more authoritarian government requests takedowns in the future? That argument did come up, actually. The slippery slope was very much a part of it. There were a couple of interesting questions, right? One was, first, it was not a surprise to them, actually. There had been a DARPA program called Social Media in Strategic Communication, SMISC. And because the DARPA program is open, anyone can actually go and read the archives. So anyone who's interested can go read the research that came out of SMISC.
Jordan Harbinger: [00:15:58] And DARPA is our defense research, kind of like invented the Internet and looks at technology in terms of how we can — it can either be weaponized against us or how we can weaponize it. Is that kind of a layman's overview?
Renee DiResta: [00:16:10] Yeah, that's a good description. I think the motto is to prevent and create strategic surprise, which I think is obvious —
Jordan Harbinger: [00:16:18] That sounds ominous.
Renee DiResta: [00:16:19] Yeah, you both prevent it and put it to use. It's the Defense Department — Defense Advanced Research Projects Agency. There we go.
Jordan Harbinger: [00:16:27] I don't see a motto anywhere, but I haven't looked that hard. Defense Advanced Research Projects Agency. That's what that is. Okay.
Renee DiResta: [00:16:34] So DARPA had this program from 2012 to 2015 called Social Media in Strategic Communication that actually asked this question: what happens if hostile governments, terrorist organizations, unsavory characters begin to use the Internet for these kinds of nefarious purposes, particularly propaganda and spreading false information? It was not a surprise. I think the question really became what could the government do about it, though. And there were certain limitations in place. So at the time, the State Department could counter-message. But because of US law that says that the government is not allowed to propagandize to its own citizens, the State Department messages that went out over Twitter, if they might be seen by an American, had to be declared, right — attributable to the State Department — which meant that the entity that was kind of tweeting back at ISIS sympathizers on Twitter was this US State Department group. So it's sort of like, well, it's not like the US State Department is really going to deter somebody who's on their way down this radicalization path to becoming an ISIS sympathizer.
Jordan Harbinger: [00:17:36] Right.
Renee DiResta: [00:17:36] So that's not going to be the most well-received counter-messaging —
Jordan Harbinger: [00:17:39] Right.
Renee DiResta: [00:17:40] — source. You're not going to de-radicalize somebody with a State Department tweet.
Jordan Harbinger: [00:17:42] It's like the opposite of influencer marketing. It's like —
Renee DiResta: [00:17:45] Right.
Jordan Harbinger: [00:17:46] "We want influencers to wear our cool shoes and show how useful they are." And then it's like, "Please buy our shoes." Like, "What? No, no—"
Renee DiResta: [00:17:53] Think again and turn away. The hashtag that they were using was like, "Think again, turn away."
Jordan Harbinger: [00:17:57] Ah, brutal.
Renee DiResta: [00:17:58] It was not the most well-executed —
Jordan Harbinger: [00:17:59] No.
Renee DiResta: [00:18:00] — early kind of counter operation, but they recognized that they had to do something. So there was a widespread acceptance of the fact that this was going to be sort of the new normal for information warfare by terrorists. And at the time, as we were having these conversations in October 2015, we also already knew that Russia was operating on social platforms — that also was not surprising. The extent was a surprise. The specifics were a surprise. But at the time, as we were thinking about what the US government's response to this kind of activity should be, what the tech platform response to this kind of activity should be, even in the very narrowly scoped realm of counterterrorism, the question really became — well, if regular domestic moms in California can do it, and if ISIS can do it, better believe that Russia and China can do it.
Jordan Harbinger: [00:18:50] Right.
Renee DiResta: [00:18:50] And so it turned into a little bit of, like, well, maybe the counter-messaging entity and the whole-of-government response to this stuff shouldn't be some random State Department Twitter account. Maybe we should be forward-looking and think about the fact that this is — I couldn't tell if I was like Chicken Little or Cassandra, you know.
Jordan Harbinger: [00:19:07] Right, right.
Renee DiResta: [00:19:07] It's like, something is changing here. I really feel like we should be giving this some pretty big guns in terms of getting a lot of the best brains in the country thinking about what's going to happen here, because it seems like there's going to be a disaster. And yeah, like I said, Adrian Chen's article about the Internet Research Agency had come out six months prior — the article called "The Agency" in the New York Times. So again, the fact that Russia was running fake Facebook pages and doing these things was not news. But the real question became, "What happens when it's all happening on a private platform?" The government doesn't really have that kind of jurisdiction. You know, what are the regulatory mechanisms? How do you compel a platform to take action if you choose to do so?
Jordan Harbinger: [00:19:47] So let me pause for a second, because I think a lot of people, they look at Facebook groups and there's an algorithm that's doing some of the work, right? It's not necessarily just information warfare actors from Russia and China.
Renee DiResta: [00:19:58] Right.
Jordan Harbinger: [00:19:58] It can be — if I join a group that's anti-vaxx and I've done that before to see what's going on in there, especially when I do shows about that kind of thing, it's kind of a quick jump to like QAnon California, right? And like Pizzagate, 9/11 Truthers, like those are the things that were in their sidebar at the time.
Renee DiResta: [00:20:17] Yup.
Jordan Harbinger: [00:20:18] It wasn't like biking in the Bay Area, right? There wasn't normal stuff. You search for COVID-19 or coronavirus, and it's not like a coronavirus information network. It's coronavirus truth, coronavirus warriors, coronavirus whatever — it's a misinformation rabbit hole where you almost can't even find the real stuff.
Renee DiResta: [00:20:36] Yeah.
Jordan Harbinger: [00:20:36] The algorithm's just like, "Oh, you believe dumb crap. Here's a lot more dumb crap for you to deal with."
Renee DiResta: [00:20:41] So there's a reason for that. When you build a social network, particularly when somebody is onboarding, you want to show them things that are relevant to them, which means that you build a recommendation system. And before it has data from the person, before it's instrumented the user and followed their clicks — so, what they engage with and what they do — it's a blank slate. And so what it does is one of two things. It shows you accounts that are geographically relevant. It keys off the information it has to suggest interests based on certain demographic characteristics, certain things that maybe you have selected.
[00:21:12] But the other thing that it does — when you engage with content, it can show you more content similar to that type. That's called content-based filtering. It says, "You like gardening; here are 50 more gardening pages." Then there's the other thing, which is called collaborative filtering, which is its aggregate understanding of who you are based on your clicks and your behavior and your demographics and all of the data that it amasses on you: if you engage with this type of content, people who are statistically similar to you who engage with that content also engage with this other content.
[00:21:43] So if you engage with gardening, maybe you're also a cook. Even if you've never searched for cooking, we can push cooking stuff at you, because 75 percent of people who garden also like to cook. An oversimplification, but that's basically how collaborative filtering works. It says that some percentage of people who liked this thing like this other thing.
[00:22:00] So what happens with conspiracy theories? There are some people who are into the anti-vaccine conspiracy, for example, because they have health concerns — they're concerned about toxins — or they're the sort of purity people: "My baby is born perfect. Why would I inject this thing into them?" But then there are other people who are involved in the anti-vaccine movement because they believe that there are vast government cover-ups and the New World Order and the Illuminati and Pharma. These are the sorts of people who are more on the conspiratorial side: "Someone, somewhere is harming people, and I'm not going to be a victim of that." Those people are more likely to be receptive to other anti-government conspiracies, or conspiracies alleging that evil, nefarious forces are at work. And that's where you start to see — even if you were originally going to the anti-vaccine group because you are a natural-health, like, juice person, it still says, well, of the 10,000 people in this group, 7,500 of them are also members of this Pizzagate group. Ergo, we're going to share with you the Pizzagate stuff too, even if you've never gone and seen it.
[00:23:01] So I also started getting — you know, I had joined a bunch of anti-vaccine groups and pages when we were doing the vaccine law. And so I had this account that was regularly — you know, I was getting recommendations for Pizzagate and for QAnon. And so that tie, that kind of conspiracy correlation matrix, became pretty clear really early on. And that was another thing where we were trying to flag it and say, "Hey, maybe this is one reason why some radicalization is happening." Ordinary people who have never gone looking for Pizzagate or QAnon — we're pushing it to them and saying, "Hey, you should see this. You should see this. You should see this." Maybe that's not the most ethical design for a recommendation system. Maybe that's not how we want the recommendation system to work.
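For readers who want the mechanics, here's a minimal sketch of the group co-membership logic Renee describes — "of the 10,000 people in this group, 7,500 of them are also members of this Pizzagate group, ergo recommend it." Everything in it (the user and group names, the toy data, the 30 percent threshold) is hypothetical and purely illustrative; real platforms combine this collaborative-filtering step with many other signals, like clicks, demographics, and learned embeddings:

```python
from collections import Counter

# Toy user -> group memberships. All names here are hypothetical, for illustration only.
memberships = {
    "user_a": {"gardening", "cooking"},
    "user_b": {"gardening", "cooking", "natural_health"},
    "user_c": {"gardening", "natural_health", "conspiracy_group"},
    "user_d": {"natural_health", "conspiracy_group"},
}

def recommend(user: str, threshold: float = 0.3) -> list[str]:
    """Suggest groups the user hasn't joined, based on what similar users joined."""
    mine = memberships[user]
    # "Similar" users: anyone who shares at least one group with this user.
    peers = [u for u, groups in memberships.items() if u != user and groups & mine]
    if not peers:
        return []
    # Count how often each not-yet-joined group appears among those peers.
    counts = Counter(g for p in peers for g in memberships[p] - mine)
    # Recommend any group joined by more than `threshold` of the peers.
    return [g for g, c in counts.items() if c / len(peers) > threshold]

print(recommend("user_a"))  # -> ['natural_health', 'conspiracy_group']
```

On this toy data, a user who only joined gardening and cooking groups gets nudged toward the conspiracy group simply because statistically similar users belong to it — the "even if you've never gone and seen it" jump described above.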
Jordan Harbinger: [00:23:44] Yeah. Remember when we thought cyberwarfare was going to be, like, attacks on power plants and infrastructure, and now it's like, your gardening group suddenly links you, three groups later, to talking about how essential oils can cure COVID-19, or drinking bleach can do it. And it's like, unfortunately, that algorithm has sort of mastered finding people who lack critical thinking skills and then pushing their boundaries a little bit, little by little. And even people who are otherwise sensible can have their boundaries pushed like this, especially if it's done over time and in a calculated way. And it's like this algorithm that's designed to serve you a bunch of ads and keep you on the platform for as long as possible just happens to also be really good at that — good at pushing people's belief systems.
Renee DiResta: [00:24:25] Yep.
Jordan Harbinger: [00:24:25] So it's not just the platforms. It's not just the algorithm. You mentioned earlier the Internet Research Agency. I don't think IRA has ever been a good acronym for anything — not a great track record for that one. What is the Internet Research Agency? It sounds so innocuous — sounds like an academic think tank.
Renee DiResta: [00:24:41] It's an entity in Russia attributed to an individual named Yevgeny Prigozhin. Prigozhin is an oligarch, a Russian businessman with close ties to Vladimir Putin, president of Russia. The Internet Research Agency began operations around 2014 — I think established late 2013, operational 2014. It originally operated largely like a comment farm, kind of leaving comments. And then Twitter became a bigger thing — tweets on content related to topics of interest to the Russian government. So it originally started locally: Ukraine, the annexation of Crimea, and MH17, the downing of the Malaysia Airlines jet. Its goal was really to kind of nudge the conversation in particular directions, to distract, to do the things that you would do — you know, China had actually set up an operation called the 50 Cent Party, sometimes called the 50 Cent Army, back in 2004. So again, there are certain entities in certain governments that believe in kind of controlling the narrative first domestically, first within your local sphere of influence.
[00:25:38] The Internet Research Agency rapidly expanded outward and by mid-2014, early 2015 was actively targeting Americans as well, American citizens.
Jordan Harbinger: [00:25:48] Yeah.
Renee DiResta: [00:25:48] So it didn't start as an entity to disrupt the presidential election of 2016. That, of course, is the thing that it's perhaps best known for, because of the investigation — Russian interference has been very much a topic of conversation — but it did start several years prior to that. And it was a multi-year operation. One thing that's kind of remarkable is how it took very old active-measures tactics, things that had been carried out in the Cold War, the Soviet era. And I think their real insight was that propaganda was going to shift with the shift in technology. And again, this was something DARPA saw too.
Jordan Harbinger: [00:26:24] Yeah.
Renee DiResta: [00:26:25] Russia executed on it and prioritized it — actively prioritized it, actively made it a strategy, actively made it something that it put to use for geopolitical influence operations. And so that was, again, the recognition that propaganda evolves to fit the information ecosystem or architecture of the day. The new information architecture was social networks. The new means of transmission was peer to peer. So rather than executing simply on old Cold War propaganda tactics — placing an article in a newspaper and kind of laundering a narrative up the chain of newspapers — they realized that you could make a meme, push the meme to people who would be receptive to it, and then have those people act as your disseminators. It became this process by which content was designed for the Internet, designed for people — an extremely modern social media marketing approach to propaganda was what this entity really took and ran with.
Jordan Harbinger: [00:27:18] And they're trying to create tribes, right? These aren't just, like, meme posters. I know some of the earliest pages they created were these Facebook pages for the religious community, for the right, and then they kind of switched over to black activism. And then it's like Texas pride, LGBT pride, Texas secessionists. And they're targeting these in-group communities. Why are they doing this? What's up with the tribes? It seems random to somebody like me looking from the outside, or someone that's new to this. Looking at LGBT Kansas pride — how is that possibly related to somebody trying to change an election or trying to change public opinion? It seems so niche. How are the 500 people who are LGBT in Kansas going to do anything? Like, what's the strategy here?
Renee DiResta: [00:28:04] So the Internet is really made to help you find people who are like you, right? So much of that is built into the design. It's prioritized — it's actually prioritized. This is why Facebook developed groups. You used to go to Facebook with your real friends, and you would see what your real friends were putting out there. But if your real friends either weren't that interesting or weren't that active on the platform, coming up with the structure of groups was a fantastic way for the platform to keep you there: we help you find new friends, we help you find new interests. And particularly in a kind of increasingly polarized US, a lot of that was people who would come to Facebook to kind of fight the culture wars. They find their political group and get entrenched and become really active.
[00:28:46] And you started to see this on Twitter as well. Again, these kinds of like persistent groups of people who were really aligned around particular shared interests. You had your Internet friends, all of a sudden.
Jordan Harbinger: [00:28:56] Right.
Renee DiResta: [00:28:56] And sometimes, for people who are more into fighting and trolling — do you remember the 2015, 2016 campaign? Real Americans were just on there with, say, their Pepe the Frog or their blue wave or whatever — like, you signal. Now, it's like the DSA rose, the neo —
Jordan Harbinger: [00:29:12] Yeah.
Renee DiResta: [00:29:12] — liberal, like, globe. You know, whoever you are, it's like, boom. It's right out there and your affiliation, you're just wearing it on your little like username line there.
Jordan Harbinger: [00:29:21] It's the MAGA hat of your social media account, right? There are even lobsters. Those I don't know — they're like a Jordan Peterson thing.
Renee DiResta: [00:29:29] Oh yeah.
Jordan Harbinger: [00:29:29] The rose is socialism. I don't understand why those are that way. I've asked about the rose and I got an explanation for that. But yeah, then there's the blue hat emoji. So anything that you can possibly imagine.
Renee DiResta: [00:29:41] The avocados for California, you know, kind of making fun of the idea that if you just eat less avocado toast, you can buy a house.
Jordan Harbinger: [00:29:47] Oh, yeah. There's something to that though actually. You can at least pay your mortgage with that.
[00:29:56] This podcast is brought to you by Microsoft Teams. Now there are more ways to be a team with Microsoft Teams: bring everyone together in a new virtual room, collaborate live, build ideas on the same page, and see more of your team on the screen at once. Learn more at microsoft.com/teams.
[00:30:15] So we're wearing this on our sleeve. We're fighting with each other. Like you can even just find people to mess with, by searching for that particular hashtag or that particular emoji. Right? You can just pick fights that way.
Renee DiResta: [00:30:26] Yeah. And so you think of the Internet as, like, a networked series of factions, right? Just this kind of war of all against all, and everybody's out there trolling. What the Internet Research Agency does — their big kind of innovation — is they realize that they can create Facebook pages and pretend to be members of these communities. And so 90 percent of the content that's put out is really about solidifying that in-group dynamic. Right? It's about saying, "I, as a Texas secessionist, believe this. I, as a Black Lives Matter activist, believe this." And so the point of the pages, what they're doing, is they're really instilling pride in that group. "We are descendants of Confederates. This is not about racism. This is about our pride and our Confederate heritage."
[00:31:10] So most of the narratives, even for the groups that many Americans would see as highly polarizing, were reinforcing that in-group pride. And then, when you do that, there's a kind of presentation of other groups as the other. Right? So if I, a real American, have this identity, and this other guy's identity feels kind of counter to mine, the question sort of underpinning a lot of it is, "Who is America for?" So, if you believe there's a finite set of resources: why are we giving money to refugees when we should be supporting our veterans? That was a theme that was hammered home constantly. Why should black Americans vote if this country has never done anything for them? And so much of it is how it's phrased. It's always phrased as, "I am a member of this community, speaking to people who are just like me."
[00:32:00] It doesn't read as media from on high or kind of like ivory tower, intellectuals, or media pundits talking to you. It reads like you're having a conversation with people who are just like you. So there's that element of you are receptive to the message because it's coming from somebody who is theoretically, just like you.
Jordan Harbinger: [00:32:18] Right. But unless you're, like, a Russian 20-something or 30-something that lives outside of Moscow, they're not really just like you.
Renee DiResta: [00:32:25] They're not really just like you.
Jordan Harbinger: [00:32:25] They're just pretending to be. Right. Why the focus on the black community specifically? Or was that just an example you picked?
Renee DiResta: [00:32:31] No, no, no. A huge percentage of the content was actually focused on the black community. So the Senate requested from the social media platforms — from Facebook, Twitter, and Alphabet (Google), which included YouTube — that they provide all of the data related to this operation. And then I led one of the teams that analyzed that data set. And there was another team that did the same analysis with the same data. And we were told not to communicate, because they wanted to make sure they could check that different teams found the same things, because this was such a political live wire at the time. This was 2018 when we were doing this work. It was before the Mueller report came out. And so what we got when we were given this dataset was about 400 gigs' worth of stuff — several hundred thousand memes, 10 and a half million tweets. So just this kind of corpus spanning multiple years of their activity.
[00:33:18] And then what we were tasked with was saying, what are the kinds of tactics, techniques, and procedures that this particular adversary used? What were they doing? What was the goal? What were the messages? How did it work? And you couldn't look at it and not see the extraordinary effort put into expanding racial tension.
Jordan Harbinger: [00:33:36] Now, when you say memes — I think a lot of people know what those are, but when I think meme, I think Kermit the Frog sipping tea. I think of somebody rickrolling me when I'm clicking on a link. Right? Those are what I'm thinking of. Are we talking about the same thing?
Renee DiResta: [00:33:50] Yeah. So meme, the kind of geeky academic definition, goes back to Richard Dawkins, right? The idea of a cultural gene, a unit of culture. And much the way your genes, kind of in aggregate, form you as a person, a meme is something that is intended to be spread. It's intended to propagate. It's not only the kind of square cat picture with the white words on it, which is, I think, what it has come to be thought of as.
Jordan Harbinger: [00:34:16] Okay.
Renee DiResta: [00:34:16] It's things that sort of signal participation in a particular community. If I were to say, "This is fine," probably maybe 50 percent of your audience would immediately see the dog in the room on fire, right, you know?
Jordan Harbinger: [00:34:27] Right.
Renee DiResta: [00:34:27] If I were to say, winter is coming. Maybe you get more people. Right?
Jordan Harbinger: [00:34:32] Yeah, this is your brain on drugs. Is that one? That's something I remember from the '80s.
Renee DiResta: [00:34:36] Right, right. Exactly.
Jordan Harbinger: [00:34:37] So basically anything I remember from the '80s or '90s is potentially a meme because everything else is gone. That's how effective these things are.
Renee DiResta: [00:34:45] Yeah. Well, they're intended to be sticky. They're intended — people will use them in a certain way. If you were to go and look at like the K-pop community on Twitter. I am not a K-pop fan. I'm not a NOC fan. I just don't pay much attention to it.
Jordan Harbinger: [00:34:57] Not yet. Yeah.
Renee DiResta: [00:34:58] You know, there are these hashtags; there's like a whole vocabulary, right? There's a little in-group language, ways in which people who are part of the community talk to each other. One of the things that's really unique about the Internet is the way searching for certain hashtags, certain phrases, certain words is kind of the gateway to finding that entire faction, that entire community.
Jordan Harbinger: [00:35:20] Hashtag BTS, right?
Renee DiResta: [00:35:21] Right, BTS.
Jordan Harbinger: [00:35:23] New track is fire.
Renee DiResta: [00:35:24] Don't go down that rabbit hole. Yeah. But what the Internet Research Agency did — if you look at their Instagram posts, every single Instagram post has like 40 hashtags down at the bottom, because they know that that's how people are going to find them. So if you're searching for, you know, in the most basic form, hashtag MAGA, right? One thing that the Trump campaign did very well was that their slogan could be reduced down to an acronym, and that acronym became kind of a meme. That became a thing that people would use to signal their support, by putting the hashtag in their Twitter bio, by putting the hashtag on Instagram. So you make your community discoverable in that way.
[00:36:01] And the other thing that's really interesting about memes: they don't require very much thought, usually, right? It's something that kind of immediately hits. It's got some emotional resonance. It's funny, it's pithy, it's tailored for kind of Internet-style communication. In the olden days of propaganda, a lot of times it was long-form narratives. You would read a persuasive article, or an article that made an insinuation, and you would feel a certain way maybe, but it required a lot more time. With memetic propaganda, it's just — you know, "I believe veterans before refugees; like and share if you agree." That's all you have to say. Right? And you've communicated information about yourself, information about what you believe, a political point of view. Like and share — that takes two seconds. So there's no real heavy lift. You're not asking someone to go read a thousand-word article. They click a button and it's moved onto their network as well. And so that's the kind of propagation that happens.
Jordan Harbinger: [00:36:52] So it's like a virtue signal plus propaganda in an easily snackable, sharable piece of content.
Renee DiResta: [00:36:59] Yeah. And to be part of the community, to be part of the activist faction, all you do is click the share button.
Jordan Harbinger: [00:37:06] Are these things being created by bots or real people or both? Because I'm always on the fence. Right? I do a lot of stuff about the Chinese Communist Party and like organ trafficking or whatever. And you see the Wu Mao, 50 Cent Army.
Renee DiResta: [00:37:16] Yeah.
Jordan Harbinger: [00:37:16] They'll post and some of it is clearly just like an automated thing because they've done zero looking into anything that I've done. Or I'll get DMs when I do stuff about Russia. Like I had Clint Watts on the show and he was like, "Watch out, you're going to get a ton of Russian bots and Russian hate mail." And I was like, "Eh, no problem." And my DMs were just alive with people who are like, "You should have been an abortion—" Like stuff like that, like horrible things. And I was like, "Oh, this is just automated. I bet this person tweets it to everybody who is on their like, hate list." But then other things look like real people.
Renee DiResta: [00:37:48] Yeah.
Jordan Harbinger: [00:37:48] Other things like someone will engage and some of them seem to just be really sort of dumb folks from wherever, but other times it's clearly a foreigner because they're like — what is it called? Like subject-verb agreement is off —
Renee DiResta: [00:38:00] Yeah.
Jordan Harbinger: [00:38:01] — and everything. Like the punctuation is off.
Renee DiResta: [00:38:02] The articles are wrong, yeah.
Jordan Harbinger: [00:38:03] Yeah, the articles are wrong. Pronouns are wrong, stuff like that.
Renee DiResta: [00:38:06] So your reasoning is interesting there — so I'm at Stanford Internet Observatory now, and I look at both China and Russia, and we've actually done quite a lot of work on China recently, because one of the really interesting things — you know, with the Internet Research Agency, I believe Russia is still kind of the most effective, most sophisticated at understanding how to target Americans. You asked earlier, why racism? That was a very common theme in the Cold War also, right? How can you say you're a free society if black people are treated so badly and don't have rights, those sorts of civil rights? They have a very deep bench to work with on understanding what messaging works with Americans.
[00:38:41] We've seen China come into the game, seen China expand from the Wu Mao, which focused on the domestic Chinese Internet, into executing those operations targeting people internationally. And that's a really interesting question. So what we've seen from China — first of all, propaganda is a core part of Chinese government public diplomacy, right? So there is a propaganda bureau. This is not a thing that is done surreptitiously or secretly; it's quite out there. So they have a large state media apparatus, quite well developed. That's attributable — it's sometimes called white propaganda, because the attribution is quite clear — as opposed to what Russia was doing, which is sometimes called black propaganda, because the attribution is nebulous. It's kind of opaque. It's hidden in the shadows. It's actively misattributed, as opposed to knowing where the provenance comes from.
[00:39:29] So what we've seen from China is, first of all, they've taken their broadcast apparatus, their television stations, their radio stations, their print news, and they've begun to establish presences for all of those publications on social platforms as well. So they have hundreds of millions of followers, not an exaggeration. I think CGTN, China Global Television Network, has over a hundred million Facebook fans on its page.
Jordan Harbinger: [00:39:52] Wow.
Renee DiResta: [00:39:52] Yeah. I mean, to put that in perspective, right? I think CNN has about 33 million, right? So this is something like 3x CNN. RT has something like between seven and 15 million, depending; they all have regionalized versions of their channels. So what you see is the overt apparatus ported to Facebook, even though Chinese citizens are banned from Facebook; they can't go on Facebook. The purpose is not to reach the domestic audience the way you have with some of the Chinese state media and the Wu Mao; it's to create ways of reaching audiences globally. And so they run ads. You'll see CGTN boosting their content. For example, with coronavirus, they began to run ads, kind of in earnest, pushing out the Chinese state position on coronavirus. Where had it originated? How was the country handling it? All of the glowing stories: China sent PPE to Italy, China built this hospital. These sorts of positive stories. So they're using the Facebook apparatus as yet one more channel in an overt propaganda strategy.
[00:40:51] Then the other thing they do is run bots, and that's where you get into, again, the actively misattributed: these look like ordinary people on the street talking, but they're controlled by the state. The attribution that's made is to the Chinese Communist Party. It's a little bit difficult to pin down where in the Chinese government this should be attributed, so Facebook and Twitter say simply China or the CCP. But what we see from those accounts is, as you're describing, very much the Wu Mao ported onto this other platform. It's not very well targeted. The personas are not very sophisticated. It's not like Russia, where they had personas so convincing that real people, real big influencers (Donald Trump Jr., [00:41:35], Jack Dorsey), were retweeting these accounts, because they really seemed to be a black woman activist, a Marlboro Man-style Trump supporter, right? Where Russia invested years in developing these personas and making them convincing, what you see from China is this kind of throw-spaghetti-at-the-wall approach.
Jordan Harbinger: [00:41:57] Right.
Renee DiResta: [00:41:57] The accounts don't even have plausible names. They rarely have a profile picture. They don't have a great bio. It's just crap.
Jordan Harbinger: [00:42:04] Yeah. It's like an anime or like a stock photo and it's cropped poorly.
Renee DiResta: [00:42:08] Right.
Jordan Harbinger: [00:42:08] Yeah. It's almost like even the creation of the photo was automated using basic AI and it just didn't work very well. It's like a screenshot from a photo of something else. And the name is like Huggy Bear Rose. And you're like, this is not a real name. It's a bunch of random words that sounded cute, put together. Somebody who didn't speak English fluently thought it sounded like a name.
Renee DiResta: [00:42:28] You're absolutely right. You see this — they're not very persuasive. But one thing that we're looking at right now is trying to understand the motivation for that. Like, why would you run an operation that's just such garbage? What is the point? Because when you go and try to take over a hashtag, or put out a hashtag, or do something like this, Twitter is watching now, right? There are integrity teams that are designed to find state-sponsored operations on the platform. This is the legacy, you know, the improvement post-2016. So the question is, why do you do it? And one of the things that we're thinking about at the Observatory is that not all propaganda is designed to persuade. Not all activity is designed to persuade. Sometimes it's designed to distract. Sometimes it's designed to just make it too hard to find the good stuff.
[00:43:13] So in the case of the Hong Kong protests, there was this one moment. I remember this. I was sitting in an airport, getting ready to go somewhere, and I was on Twitter. And this was right when the Hong Kong Police had shot a woman in the eye. I believe she was a medic or a nurse or something, and they shot her in the eye. And there were these extraordinary photos of the woman on the ground, you know, the tear gas around. Very, very compelling visuals. And Western Twitter was paying attention all of a sudden, right? It was really paying attention to this moment. My Twitter feed, which is not primarily China watchers, was all talking about this moment. And it reminded me of Iran, actually, when there was a woman, I think her name was Neda, who was killed during the election protests there and whose face became an icon. Similarly, with Hong Kong, this woman who was shot became an icon, and the protesters began coming out in solidarity wearing eye patches, covering their eyes for this woman who had become an icon.
[00:44:07] And when you have that moment where the protest has a human face, has an icon, that's where you often see the government come in and realize that this is now a thing it has to respond to. It has just gotten quite big. Now there's a face, there's a humanization. It's not an abstract protest; it's quite personalized. And so what we started to see almost immediately after that incident was all of these accounts coming out of the woodwork to talk about how the West had it wrong. Her own side had shot her. She was really a plant. This was a false flag. The Hong Kong Police hadn't done it. When that happens, you see this barrage of conflicting narratives. They don't even have to make sense. They don't have to be cohesive. They just have to be in the hashtag, so that when people are searching for information about this protester or about this moment, what they see is this content, which is designed to cast doubt on what actually happened by flooding the zone with alternative explanations.
Jordan Harbinger: [00:45:05] Right.
Renee DiResta: [00:45:05] We saw Russia do this when the Malaysia Airlines flight came down also. You know, six or seven different explanations for what had happened, all about how it wasn't them. When Jamal Khashoggi was killed, the Saudi bots turned on. Boom! Here are all the different reasons why Jamal Khashoggi was not murdered: he was just missing, he was still in the embassy, he had fled.
Jordan Harbinger: [00:45:25] Right. So this is a journalist who was murdered in the Saudi consulate or embassy. They cut him into little pieces and got rid of him. It was horrifying. So what they're doing is competing for attention, right? And they realized, I guess, that they don't need to get a critical mass of all Americans to do anything. They can just focus on these pockets. Diseases can take root, literally or figuratively. Swing states can be swung. And you focus on these little bubbles, just like in real life, that make a fringe perspective seem like a normal or prevailing one. So you have anti-vaxx, 9/11 truthers, birthers, Pizzagate, flat Earth. And in this case, like you say, you flood the zone, and people just go, "Well, we can never be sure what the real answer is," because these guys over here are saying that MH17, this Malaysia Airlines flight, crashed; they're saying it was shot down by Ukraine; they're saying it was shot down by Russia; these people are saying it was shot down by the US; these other people say it was a mechanical thing. We just don't know, even though, like, 99 percent of the evidence points to one thing. If enough people scream that it's not that, and that it's something else, it casts doubt on what the real explanation might be. And it doesn't even matter. So what's interesting is a lot of people, and I originally thought this too, assume they're just going to argue their one perspective: "Hey, this was shot down by a Russian militant in Ukraine." "No, it wasn't. It was shot down by this," or, "It was mechanical failure." But if they say, like, 10 different things, it's even more effective than persuasively arguing their one counterargument. If you just cast doubt, you don't even have to persuade people, right? It's easier to just create FUD: fear, uncertainty, and doubt.
Renee DiResta: [00:46:55] Yep. And so I think one of the things, when we see the Chinese activity, it may not be designed to be persuasive.
Jordan Harbinger: [00:47:02] Right.
Renee DiResta: [00:47:02] And they may not care if they lose those accounts, because it's really easy to spin up another cluster of accounts when you need them, or to go buy a cluster of accounts off the black market if you need to. And so there's a difference in strategy. You know, Russia ran this multi-year long game, and that takes a lot of time and investment in those personas, in that operation. And then of course they were discovered, and they lost about 3,800 Twitter accounts and a little over 200 Facebook and Instagram pages in aggregate. When you lose that audience, you have to decide whether you're going to reinvest in growing it all over again, over another multi-year period.
[00:47:35] One of the things that we're seeing is the idea that maybe, particularly for American politics, you have enough people who already have that point of view. You can just go in and amplify it.
Jordan Harbinger: [00:47:44] Yeah.
Renee DiResta: [00:47:44] Why create your own accounts? So one of the things that we're looking for in the 2020 election is not so much a replication of the tactics of 2016, this multi-year long-game propaganda operation or influence operation, but perhaps instead taking those highly polarized factions that already exist among real Americans and just amplifying the content they create. "What could possibly have happened in this moment, on this night? Well, here are 10 different real Americans with 10 different points of view about that. We're just going to amplify those different perspectives."
[00:48:19] We've been seeing the state media take that route, right? RT speaks very differently about —
Jordan Harbinger: [00:48:24] Russia Today? Or is it just RT now?
Renee DiResta: [00:48:26] Well, it was Russia Today, but now the proper name is RT.
Jordan Harbinger: [00:48:29] Okay.
Renee DiResta: [00:48:29] RT speaks very differently about the Floyd protests, for example, than its subsidiary Redfish does, right? Redfish really leans into the American left, amplifying content from left-leaning communities. RT takes the more law-and-order approach, articulating the conservative point of view. And so here, even in a completely attributed way, if you're looking at Russian state media narratives: on one side, they're talking about how America is burning; on the other side, they're talking about how cops are destroying communities. And these are both points of view held by real Americans. So they're just putting it out there, using their broadcast tools and their ability to reach people to put out these conflicting, opposite sides of the story, retweeting and finding prominent accounts that hold these particular points of view, amplifying them and trying to get those ideas more exposure.
Jordan Harbinger: [00:49:22] Have there been leaks from the IRA? Like, have we heard from people that have worked there who are like, "Yup, this is totally real"? Because other people are like, "Oh, it's not even a thing. That's just an excuse that was being told." Of course, that's just more FUD, right? Fear, uncertainty, and doubt. But have we heard from anybody that's worked there? Like, "No, my job was to cause crap on Twitter."
Renee DiResta: [00:49:40] Yes. Actually, the Russian press did some of the early high-quality exposés.
Jordan Harbinger: [00:49:46] Huh? I didn't see that coming.
Renee DiResta: [00:49:47] Yeah, I know. There's some — is it RBC or RBK? I'm trying to think of how they transliterate the name. There was actually a woman who went undercover. Her name is Lyudmila Savchuk, and I probably just butchered the pronunciation of that, but she went and worked as an IRA troll and then came out and told stories about it. And a few others have also told stories about their time there. These people who worked in the troll factory and then told their stories described something that sounds very much like a social media marketing agency, right? "We had quotas. We had stand-ups. During stand-up, this is the kind of guidance we were given for what posts we should put out." The Twitter people operated a little bit differently from the Instagram people; they had to understand the nuances of the platform and react to what was dominant and trending on the platform that day. They talk about sitting next to the person: "I'm running the Black Lives Matter page and they're running the Confederacy page," and how they would do things that would start fights and increase tension. So the troll next to you is tasked with stirring up the other community.
[00:50:49] One thing that I think is very different about the IRA that gets lost sometimes in the conversation about the memes though, is that the IRA actively engaged with real activists also. So one of the things that we talk about when we think about Cold War active measures is the idea of agents of influence, right? People who work for another government that are not telling you who they work for, but trying to influence you to take a particular action.
Jordan Harbinger: [00:51:13] Right. Like that show The Americans, right?
Renee DiResta: [00:51:15] Yes, The Americans.
Jordan Harbinger: [00:51:16] I love that show.
Renee DiResta: [00:51:16] I love that show.
Jordan Harbinger: [00:51:18] Yeah,
Renee DiResta: [00:51:18] I loved it.
Jordan Harbinger: [00:51:18] Of course. Yeah. That makes sense. And they had like the South African people who were like anti-apartheid or something like that, but they were just being used by the Russian government to stir stuff. They were just like college students and one of them ended up, I think — spoiler alert, if you haven't seen the whole series — they ended up killing him for some reason. Or he ends up like getting left behind in some operation. So it's like —
Renee DiResta: [00:51:38] Yeah.
Jordan Harbinger: [00:51:38] — they're using these Americans and people who mean well just as tools to further their own ends. It has nothing to do with the actual mission.
Renee DiResta: [00:51:45] Right. And so they find people who are ideologically aligned with what they're pretending to be. And social media makes this easy, right? You don't have to have a sophisticated, handsome spy with an alias, the way, you know, the character Philip Jennings is. You're an avatar on a Facebook profile, right? But at the same time, you're the admin of this page that I follow, and I'm a Black Lives Matter activist, and you look like you're the person running a Black Lives Matter page. Then you DM me and we start having a conversation. You say, "Hey, can I give you some resources for the protest? Do you guys need some money? We can't be there in person because we live too far away, but would some posters help?" And so you start to see this process by which there are extensive amounts of direct messaging happening. They're really communicating with the activists. They're working to create useful idiots, to use the unflattering term: people who are unwittingly doing what the operators want because of this perception of shared camaraderie.
[00:52:46] Yeah, they do that targeting a range of communities. They hire a Hillary Clinton impersonator; they pay for the impersonator to show up to a Trump rally, sitting on a flatbed truck like she's in jail, you know.
Jordan Harbinger: [00:52:55] Cute.
Renee DiResta: [00:52:55] They hire a self-defense teacher to teach the black community, ostensibly, how to defend themselves at protests, right? So again, these are things that really happened. They're actually paying the guy, and they're paying him via PayPal. They're communicating with him via telephone and text.
Jordan Harbinger: [00:53:09] And he doesn't know that this is all just an influence operation. He's just like, "Great, a gig that I can run."
Renee DiResta: [00:53:15] Yeah, a gig.
Jordan Harbinger: [00:53:16] Amazing.
Renee DiResta: [00:53:16] Most people don't think that the person who reaches out to them on Facebook is a Russian troll.
Jordan Harbinger: [00:53:22] Generally not.
Renee DiResta: [00:53:22] That's not where people's minds go, right?
Jordan Harbinger: [00:53:24] No. Ironically, most of the people that probably believe in a lot of these conspiracy theories would be the people that would be suspicious enough to believe that, except in this case, since the crazies agree with the trolls, they really won't know because they think they're all on the same side. I'm sure you heard about this. There's like a Texas secession movement and like a pro-Muslim protest. Can you take us through that?
Renee DiResta: [00:53:44] Yup.
Jordan Harbinger: [00:53:45] I don't remember the exact details, but this was like a classic ridiculous example of what happens.
Renee DiResta: [00:53:49] Yeah, so they had a page called United Muslims and another page called Heart of Texas. Heart of Texas was the Texas secessionist page; United Muslims was the pro-Muslim page, which they used in some very interesting ways. They ran, like, a Muslims for Hillary protest, where they wanted people to show up and be proud Muslims for Hillary, which of course was designed to antagonize the other side. So it's a very dirty way to use that group. But what they did in this particular case: there's an Islamic center in, I think, Houston, Texas, the Da'wah Islamic Center, if I recall the name. They had the Heart of Texas page come out to protest; "Islamization of Texas" was what the Facebook event was called. The United Muslims were called to rally on the same day to defend — I'm trying to remember the specific wording — it was to put out a pro-Muslim presence to counter this anti-Muslim rally. And so people sitting in St. Petersburg created two Facebook events and solicited the members of these groups, which I think both had over a hundred thousand members.
Jordan Harbinger: [00:54:55] Oh wow.
Renee DiResta: [00:54:56] And these are not small pages, no. So they had these different pages and they put out these events and then people RSVP'd and then they showed up in person. And so you had two different groups on opposite sides of the street at the same time. And, of course, the organizers aren't there. You know, they're in Russia.
Jordan Harbinger: [00:55:13] Yeah. They lived in Moscow or St. Petersburg, right? The organizers — I wonder, have they ever done anything where there's a protest and like seven people go, "Hey, where's the guy that set this whole thing up? No one's here." Or like, "Hey, the other side didn't show up."
Renee DiResta: [00:55:26] What they kept doing and what you see constantly over and over and over again in the memes is they're trying to recruit people to be their local on the ground representative.
Jordan Harbinger: [00:55:34] Ah, okay.
Renee DiResta: [00:55:34] So like, "Hey, will you photograph? Get involved in the cause. Come photograph our protests. Come be a reporter for our fake newspaper." So they're constantly trying to find real Americans so that they do have someone to send in person to these events. So that's of course, part of the operation. What winds up happening with this Islamic Center rally is the police have to come and keep the peace between the two sides.
[00:55:57] And then if you go on YouTube — I actually went to YouTube because I was curious whether there was first-person video footage. The Houston Chronicle covered this; major Texas press was out there reporting on those protests, and you can go back and actually read the coverage from the day, which I think now has a disclaimer up at the top linking to their more recent coverage about how this was really executed by Russia. But there's this YouTube footage: you actually can go and see a man on the street, a woman with a camera phone, recording this stuff.
Jordan Harbinger: [00:56:27] Wow.
Renee DiResta: [00:56:27] So it actually happened. And people uploaded the footage of it to YouTube, which then of course provides the trolls with more fodder, because now they have video.
Jordan Harbinger: [00:56:35] Right. And of course this has other real-world impacts. In the UK (I don't know if this has happened here), people are burning down 5G towers because they're like, "This causes coronavirus." So they're destroying and burning these things down. And then probably the most common, easy-to-point-to example of disinformation in the USA costing people's lives: a lot of the anti-mask stuff is normal Americans, but a lot of it is definitely foreign interference. I mean, you can just sort of tell; it has all the hallmarks of it. Have you looked at that at all? It seems like an easy one for them to get involved with.
Renee DiResta: [00:57:08] So one thing I'll say: we have a really hard time with attribution. With the Internet Research Agency, when the platforms provided the data to the Senate, the platforms did the attribution. I used to get, like, shit from people on the Internet: "Well, she said it was a Russian troll, but how does she know?" Now, the platforms did digital forensics and looked at things like metadata, logins, shared email addresses, cookies, certain types of behavior; they did the attribution and traced the network. And related to that attribution methodology, Twitter provided some of the metadata. You know, the fake Baltimore city news Twitter account was registered to a Russian Beeline phone, right? You had American pages that were being run with people logging in from Jakarta. And so there was not very much of an effort to conceal the operation, because nobody was looking. They paid for the ads in rubles. So this was not a—
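[Editor's note: a minimal sketch of the metadata-clustering idea behind the forensics described above. The account records and fields below are invented for illustration; real attribution work uses far richer signals (cookies, device data, payment details) that only the platforms hold.]

    # Toy illustration: group accounts by shared registration/login metadata.
    # Honest, unrelated accounts rarely share these; troll farms often do.
    from collections import defaultdict

    accounts = [  # hypothetical records, not any platform's real schema
        {"handle": "local_news_1", "phone_cc": "+7", "login_geo": "St. Petersburg"},
        {"handle": "patriot_voice", "phone_cc": "+7", "login_geo": "St. Petersburg"},
        {"handle": "gardening_fan", "phone_cc": "+1", "login_geo": "Austin"},
    ]

    clusters = defaultdict(list)
    for acct in accounts:
        key = (acct["phone_cc"], acct["login_geo"])
        clusters[key].append(acct["handle"])

    for key, handles in clusters.items():
        if len(handles) > 1:  # multiple accounts on one metadata fingerprint
            print(f"shared metadata {key}: {handles}")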
Jordan Harbinger: [00:58:03] Really?
Renee DiResta: [00:58:03] Yeah, yes.
Jordan Harbinger: [00:58:04] Oh, wow, wow. Were they like, "I'm not even going to try and cover this up"?
Renee DiResta: [00:58:08] No. It was just right out there. You know, Beeline is a Russian carrier. We were looking at all the device data and we were like, "Jesus Christ!"
Jordan Harbinger: [00:58:17] Yeah. Like, "Oh, my VPN is not working." "Don't even turn it on. No one's even looking. Don't even worry about it."
Renee DiResta: [00:58:22] Basically, but that's changed now. In any kind of cybersecurity or conflict situation, when you expose the tactic of the adversary, the adversary should evolve to no longer use that tactic, right? And so what we have now is an additional challenge: there are integrity teams looking, there are researchers looking, and regular people are cognizant of the fact that Russian bots or Chinese bots or Saudi bots, any nation's bots, exist on Twitter. And so what you have instead is that it's much more difficult to say this is being amplified by the Russians.
[00:58:57] And one of the reasons why we try to hold off from saying that is, one, you don't want to be legitimizing existing movements. And it's partly not to legitimize them, but also partly not to absolve them of their responsibility: it is often domestic actors that are pushing out the lies or misleading video footage or whatever else.
[00:59:13] So with the coronavirus, in similar work we've done at Stanford, we've looked at how different governments all around the world (Russia, China, Saudi Arabia, Iran, Venezuela, Brazil) have responded to coronavirus, both domestically and internationally: the ways they've used their state media apparatus, and the ones that have gone one step further and used covert Twitter bots, fake accounts, fake news, that darker side of the propaganda spectrum. And what we see is that there's messaging put out for domestic audiences and messaging put out for international audiences. The international content is often designed to — you know, it's a public diplomacy effort. We want to look good, so here we are talking about what we've done. But what really matters, particularly for the more authoritarian regimes that didn't handle it so well, is ensuring that their population believes that they handled it well —
Jordan Harbinger: [01:00:09] Right.
Renee DiResta: [01:00:10] — to preserve their ability to govern. And so what you see is conspiracy theories that the state puts out, in large part for the domestic audience, because they want the domestic audience to believe: this was a matter outside of our control; dark forces beyond our borders brought this virus here, and this is what we have done to control it. But really, the coronavirus was created by the Americans in a lab. Fort Detrick was where some of these stories went.
Jordan Harbinger: [01:00:40] Right.
Renee DiResta: [01:00:40] And then you have entities like Russia Today, which say, "The Iranians are saying that the Americans—" So it's this daisy chain of just asking questions. And it's been really fascinating to watch. You know, the thing with coronavirus is that the entire world is paying attention to it. Every government has had to handle the crisis in its own way. Pretty much the entire population of the planet has been searching for information about the same thing, as various countries have gone through phases of high death rates or lockdowns. So it's really been a remarkable opportunity to study how narratives move, and this interplay where the traditional fever swamps of the Internet get legitimized through state media and state influencers in this free-for-all.
[01:01:28] So in some ways it's been less about Russian trolls and more about this free-for-all in which every faction is participating simultaneously. What are the narratives that are spreading? What are the mechanisms by which they move through communities? And then, what is the impact that they have? Do we start to see people change their behavior in response to allegations of a drug working or not working, of masks working or not working? How does that become incorporated into people's identities, partisan identity in particular in the US? That's the kind of stuff we've been looking at.
Jordan Harbinger: [01:02:06] This podcast is brought to you by Microsoft Teams. Now there are more ways to be a team with Microsoft Teams. With together mode, you can bring everyone together in one space in the same virtual room. You can bring the power of true collaboration to your projects with whiteboard drawing, sharing, and building ideas in real time, all on the same page. And with a large gallery view, you can see more of your team all at once, with up to 49 people on screen at the same time. You can even raise your hand virtually so everyone can be seen and heard. When there are more ways to be together, there are more ways to be a team. Learn more about all the newest Teams features at microsoft.com/teams.
[01:02:45] Do we know if the United States is doing this as well to other countries? Like, are we doing this in — oh gosh, I don't even know — Central Asia, South America? It seems like we would have. But I don't know, is it mostly, "Hey, we're trying to form a democracy and we're promoting democracy," or is it kind of like, "You know what, screw it, we're doing the exact same thing"?
Renee DiResta: [01:03:06] We have not found an operation attributable to the United States. We were chatting earlier, and I said there are certain laws that prevent the United States from putting out particular types of propaganda or communications, even on social platforms, because of the risk of it being seen by American persons and thereby violating certain laws. Similarly, doing mass ingestion of social media data to do analysis and find influencers or messaging is also not legal; there are certain privacy laws that prevent the US government from doing that. So in some ways, the laws that were put in place for older communication ecosystems have limited — this is the sense that I've gotten — the US government's ability to do this.
[01:03:48] So even though everybody says, "You're just not looking for it; you're not talking about what the US is doing."
Jordan Harbinger: [01:03:53] Right.
Renee DiResta: [01:03:53] We don't have any activity attributed to the US, and that is not because people are afraid to find it or unwilling to look for it. That's just the current state right now. The platform integrity teams have been pretty clear that they will take down any content that they see as inauthentic, and that's where we are: domestic groups do come down periodically, and activist groups occasionally come down. There's a variety of reasons. Sometimes the platforms will pull things down for "coordinated inauthentic behavior," which is the term, and it usually means that there's a state actor involved. And then Twitter will make the data set public, Facebook will write a blog post, and outside researchers will do an analysis and publicize their findings. We do that sometimes; we're one of the teams that does those analyses. But other times things come down as spam: just coordinated distribution, with no allegation that any foreign actor or government is involved.
[01:04:51] Or just like, I have a blog and then I have 10 other blogs whose job it is to promote that one blog. And so that becomes more of like a spam-type takedown.
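[Editor's note: a toy illustration of how "ten blogs promoting one blog" coordination can surface in data: accounts that repeatedly share the same links within minutes of one another. The share log, window, and threshold below are all invented; real integrity teams use many more signals than this.]

    # Flag account pairs that co-share the same URLs in tight time windows.
    from collections import defaultdict
    from itertools import combinations

    shares = [  # (account, url, unix timestamp) -- hypothetical share log
        ("blogA", "http://example.com/post1", 1000),
        ("blogB", "http://example.com/post1", 1030),
        ("blogC", "http://example.com/post1", 1045),
        ("blogA", "http://example.com/post2", 5000),
        ("blogB", "http://example.com/post2", 5020),
    ]

    WINDOW = 120  # seconds; arbitrary tuning parameter for the example
    by_url = defaultdict(list)
    for acct, url, ts in shares:
        by_url[url].append((acct, ts))

    pair_counts = defaultdict(int)
    for url, events in by_url.items():
        for (a1, t1), (a2, t2) in combinations(events, 2):
            if abs(t1 - t2) <= WINDOW:
                pair_counts[tuple(sorted((a1, a2)))] += 1

    # Pairs that co-share many URLs within the window look coordinated.
    suspicious = {pair: n for pair, n in pair_counts.items() if n >= 2}
    print(suspicious)  # {('blogA', 'blogB'): 2}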
Jordan Harbinger: [01:04:58] Yeah. That makes sense. Coordinated inauthentic behavior has got to be tough to catch. That's a whole technological discussion. I wonder what other solutions are available to us for this? Especially in this, like I think Sam Harris calls it — I'm going to try and say it; I had it the other day — an epistemological free-for-all, where facts do not exist. And I've always wanted to use the word epistemological because it sounds, hell yeah, sophisticated. But what solutions are available to us if people kind of don't believe any facts now, or if that's the mindset they're trying to generate?
Renee DiResta: [01:05:31] I mean, for me, what I think is that we have to fundamentally rethink curation and recommendation online. And I think that is a bigger issue than state actors or not state actors; it's a question of information integrity. And then this is where people will yell on the Internet and say, "Who decides?" Well, someone's already deciding. The algorithm is already structured in a certain way. So I don't understand why we've decided that these algorithms, which are like 10 years old, are sacrosanct, and that we can't rethink them in any way.
[01:05:58] So, as you've started to see with anti-vaccine content in particular, and with QAnon now as well, the platforms are saying: we're going to remove it from the recommendation engine. It stays up on the platform, meaning if you want to go find it, you can go find it, but we're not going to return it in search results, and we're not going to promote it in the recommendation engine. And if somebody is looking for an answer on a topic, we're not going to serve up garbage in response to that inquiry.
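[Editor's note: a minimal sketch of the "demote, don't delete" policy described above: flagged topics drop out of the recommendation surface but stay reachable by direct lookup. The topic labels and catalog are hypothetical.]

    # Flagged topics are excluded from recommendations but not removed.
    FLAGGED_TOPICS = {"anti-vaccine", "qanon"}  # example labels

    catalog = [
        {"id": 1, "title": "Sourdough basics", "topic": "baking"},
        {"id": 2, "title": "Vaccines cause X", "topic": "anti-vaccine"},
        {"id": 3, "title": "Beekeeping 101", "topic": "hobbies"},
    ]

    def recommend(items):
        # Recommendation surface: flagged topics never appear here.
        return [it for it in items if it["topic"] not in FLAGGED_TOPICS]

    def fetch_by_id(items, item_id):
        # Direct lookup: the content itself is still up on the platform.
        return next((it for it in items if it["id"] == item_id), None)

    print([it["title"] for it in recommend(catalog)])  # no flagged items
    print(fetch_by_id(catalog, 2)["title"])            # still reachable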
[01:06:23] It's weird to me that in the span of five short years or so, we've gotten this idea that, you know, nobody can possibly decide how to set the rankings and weightings. There is a system in place that's only that old; I don't think it's unreasonable to think that we could change it. Google in particular recognized pretty early on that garbage returned in response to health and financial search queries had potentially very serious negative downstream impacts on people's lives. And so it came up with a framework called Your Money or Your Life. I think the framework was introduced back in 2013 and improved in 2015. Interestingly, it didn't apply to YouTube at the time.
[01:07:02] Because for a long time, the search function was seen as kind of different from the social function, in the sense that when you go to search, you have a question. If you have cancer, you want to know what's going to happen to you. You want reliable medical information, not juice-fast bullshit.
Jordan Harbinger: [01:07:17] Yeah.
Renee DiResta: [01:07:18] But if you're on YouTube, you maybe want to be entertained, right? You know, the social platforms were not developed to be information libraries and repositories of human knowledge, where people go to find answers to their financial and health and political questions. They were there to help you, like, find your friends, right?
Jordan Harbinger: [01:07:39] Right
Renee DiResta: [01:07:40] Find your knitting club.
Jordan Harbinger: [01:07:41] Yeah.
Renee DiResta: [01:07:43] Not your scientific and medical authority for how you're going to treat a disease or operate in a global pandemic, right? And as they've been put into that position, since this is what they have inadvertently evolved into, I think it's eminently reasonable to say we have to be rethinking what we're curating and what we're recommending, in the context of looking at the potential downstream harms.
[01:08:06] On the subject of the foreign actors, though: for the 2020 election, there's been a whole lot of work among outside researchers like our team at Stanford, governments, tech companies (the integrity teams and policy teams at tech companies), and civil society. Particularly for the African-American community, which has been heavily targeted, civil society has a really strong voice now, saying, "Hey, come on. These fake trolls pretending to be black people are really detrimental to our community. What are you doing about them?" All of the different stakeholders that pay attention to and study this topic are now working together in concert, saying, "Okay, the 2020 election is coming up." Nobody wants to be the fact-checking police, but how do we have a narrowly tailored, multi-stakeholder ability to say, "Here are voter suppression narratives; here is misinformation related to voting itself"? How do we try to ensure integrity in the election and in the outcome of the election, and make sure that people's degree of trust in the outcome remains high and the election remains legitimate? How do we all take the tools and capabilities we have and work together?
[01:09:21] So if we see something that looks like anomalous activity, we're not going to be the ones to make the determination about whether it's Russian, Chinese, or domestic. But we can give it to the social platform and say, "Hey, you guys should take a look at this hashtag, take a look at this page, take a look at this account." They're the ones who then have the capability to make that judgment behind the scenes. Or, similarly, someone like the FBI can say, "This site appears to have financial ties to Russian intelligence services" —
Jordan Harbinger: [01:09:50] Right, they're buying ads in rubles.
Renee DiResta: [01:09:52] Right. So this is where you start to see that we all have different capabilities. How do we ensure that we're sharing information in an effective way? Particularly as the 2020 election approaches.
Jordan Harbinger: [01:10:03] If we can sow doubt and be negative with these narratives, can we do positive things with them as well? I'm kind of thinking of explosives, right? They can be used for construction, but only a small part of what they do is construction. You can blast the rocks, and then you can build the tunnel, and that part takes a really long time. But when it comes to destruction, they're really fast; you don't worry about cleaning up the mess if you blow up, like, a shopping mall, right? You just worry about the destruction part. Can we also do positive things with these techniques and these narrative-building technologies? It seems like there should be a way for us to go, "Hey, remember when everybody started to believe that COVID-19 was caused by 5G? Why don't we use that to show people that — oh gosh, I don't know — good hygiene, or staying in school, is a good thing?" Is anybody thinking about that? Or is it just like, "Hey man, one problem at a time"?
Renee DiResta: [01:10:52] No, people are thinking about that. On the immunization front in particular, you are seeing a number of medical journals running convenings: immunization research teams working alongside marketing teams. Not because they're interested in the marketing piece per se, but again, this idea that everything is a marketing campaign for an idea now. So how do you ensure that you are communicating in a way that's suited to the modern environment? One of the things that's been very frustrating to me: as the COVID stuff began happening in January, we started seeing the anti-vaccine narratives.
[01:11:26] We were paying attention to coronavirus pretty early on, because we pay attention to Internet conversations globally, and this was a big topic of conversation in Southeast Asia even in January. And what we started to see among Americans was the anti-vaccine activists saying, "Oh my God, they're going to use this as a way to institute programs of mandatory vaccination." So that was the sort of thing where we're like, okay: if there is a vaccine, the government and the manufacturers and the scientists and the public health authorities have to do better than a press conference here and there and a PDF. Because the problem you're getting at is that there are low levels of trust in authority, and low levels of trust between communities. And that is not a social media problem; that is a societal problem. Social media can exacerbate it, but social media mostly reflects it.
[01:12:14] But ultimately the question becomes: how do Americans respond to their authority figures, and do they trust them? And the president that we have, interestingly, funny enough, was kind of a darling of the anti-vaccine movement when he was running in 2016. He has tweeted in the past tacit support for the idea that vaccines cause autism and a range of these other theories. So they were big supporters of the president then. Now, with Operation Warp Speed and this idea that the US government is looking for and hoping to find a coronavirus vaccine, they're not quite sure what to make of that, actually. But interestingly, the people who oppose the president, who do generally trust science, don't trust him. And so you have this interesting dynamic: would you take a vaccine that Donald Trump's FDA says is safe? That's one of the narratives that you see. Even people who are inclined to trust scientists and science in general, who believe in vaccination and vaccinate their children, are hesitant, because they don't trust this particular administration.
Jordan Harbinger: [01:13:15] Right.
Renee DiResta: [01:13:15] And this particular administration's commitment to science. So the thing that's interesting about social media is it really kind of tears down the veneer of infallibility. You see your leaders having —
Jordan Harbinger: [01:13:30] Right.
Renee DiResta: [01:13:30] I'm trying to think of a nice way to say it.
Jordan Harbinger: [01:13:33] There might not be so just go for it.
Renee DiResta: [01:13:36] But it's the idea that like, you know, you come away with the sense of like nobody's steering the ship, right?
Jordan Harbinger: [01:13:40] Right.
Renee DiResta: [01:13:40] Nobody's in charge. And the other thing is, you know, you see the media gets things wrong. There are a lot of stories early on with COVID-19 about, "Oh, it's just the bad flu, right? No big deal."
Jordan Harbinger: [01:13:50] Right.
Renee DiResta: [01:13:51] There are a lot of people who were very angry about that. Then there was the mask guidance issue. The CDC and the World Health Organization didn't update their guidance until people on social media, who were not even epidemiologists, had been screaming for weeks that we needed to be using masks. And then there was a sense of, well, the random people on Twitter seem to be right, and the authorities are lying to us or wrong.
Jordan Harbinger: [01:14:15] Yeah.
Renee DiResta: [01:14:15] So how can we trust the authorities when they come to us later? I think we have to rethink how authorities communicate in the era of social media. And that goes for the media as well, in the sense of saying, "Hey, look, we got this wrong." Or not even "we got it wrong," but, "With the information at the time, this seemed accurate. New information has since come out that renders it obsolete or incorrect, and here is where we are today." Right? Or communicating to people, "We are 25 percent sure of this now, and here's our best guess. You guys are all on here looking for information, so we're going to level with you. This is what we know; this is what we don't know. Here's how we should be thinking about this." Right? It requires a very different style of communication than these institutional authorities are used to. And I think that ultimately is going to have to change in order to restore trust —
Jordan Harbinger: [01:15:07] Yeah.
Renee DiResta: [01:15:07] — in authority, in media.
Jordan Harbinger: [01:15:09] Almost like Thinking in Bets, right? Have you ever read that book or heard of that book?
Renee DiResta: [01:15:12] Yeah.
Jordan Harbinger: [01:15:12] Yeah.
Renee DiResta: [01:15:13] Yeah. I was a trader for a while. Yeah.
Jordan Harbinger: [01:15:14] Yeah. So yeah, Thinking in Bets, right?
Renee DiResta: [01:15:16] Right.
Jordan Harbinger: [01:15:16] Like, we're not a hundred percent sure, and we can admit that. Because I get why an agency says, "You have to do it this way, because this is what we know right now." If they say, "Well, we're really not sure," a lot of people are going to go, "Well, I'm going to go with the guy who says that he's sure," even though he pulled the information completely out of the air and has no facts or research to back it up; it just sounds more persuasive. But the problem is, once you have to change your mind, then people go, "Well, how do I know this isn't one of your things where you're just going to flip-flop on me?" Whereas if he says, "Look, we're 60 percent sure it's this way, and if we do it this way, it's at least safer than doing it the other way until we know for sure," a lot of folks like us are going to go, "Okay, good enough. That's a plan." But I think there are just going to be a lot of people who are used to hearing certainty and will only settle for that. And that's a problem.
[01:16:00] Social media manipulation is happening right now. But what happens when the deepfakes come? They're going to be super easy to make.
Renee DiResta: [01:16:06] Yeah,
Jordan Harbinger: [01:16:07] Deepfakes, for people that don't know — I mean, I can only think of porn examples, which might be a little incriminating, but whatever — are when they put someone's face on the body of another person, and they might even fake the voice and make it look real. How do we counteract that? If I see a video of Renee DiResta being like, "Hey, all of this stuff I just said on The Jordan Harbinger Show and Joe Rogan about the IRA was just complete garbage. I made it up. I was being paid by Canadian intelligence," I'm not likely — well, I am now, after this, but normally I'm not likely to go, "Let me just double-check what I saw in that video." It has to be real, right?
Renee DiResta: [01:16:39] Yeah, I've been thinking a lot about that lately. You know, as a society, like we adapted to Photoshop.
Jordan Harbinger: [01:16:45] True.
Renee DiResta: [01:16:46] So I've been curious both about the impact of the technology and about the societal acceptance and internalization of the technology. Deepfakes are interesting because, for those who don't know, with Photoshop, there's a thing that you can go back to, right? There is original material. And actually, funny enough, Adobe has really been leading the vanguard of this desire to ensure that there is some sort of cryptographically verified certification that says: this is the image as it was when it came out of the camera, and here it still is, or here you can see the manipulations that have taken place on the image.
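[Editor's note: a minimal sketch of the provenance idea, in the spirit of (but not the actual format of) Adobe's content-authenticity work: sign the image bytes at capture, verify later that nothing has changed. Uses the third-party Python 'cryptography' package; the key and image are stand-ins.]

    from cryptography.hazmat.primitives.asymmetric.ed25519 import (
        Ed25519PrivateKey,
    )
    from cryptography.exceptions import InvalidSignature

    camera_key = Ed25519PrivateKey.generate()  # stand-in for a device key
    image_bytes = b"...raw sensor data..."     # stand-in for a real photo

    signature = camera_key.sign(image_bytes)   # made at capture time
    public_key = camera_key.public_key()       # published for verification

    try:
        public_key.verify(signature, image_bytes)  # untouched image passes
        print("image verified")
    except InvalidSignature:
        print("image was modified")

    try:
        public_key.verify(signature, image_bytes + b"edit")  # any edit fails
    except InvalidSignature:
        print("edited image rejected")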
[01:17:17] When you have something that's generated by AI, there is nothing to go back to. So the interesting question for audio or video is: do people believe it? Right now, we're still in the uncanny valley. It's not quite right; it's not quite there. Particularly with video, there are certain detections, ways that you can tell that something was generated by a computer. There are little things, like being able to see somebody's heartbeat pulsing: the color of their skin changes ever so slightly with each beat. This is something your iPhone can see: red to normal, red to normal, red to normal.
Jordan Harbinger: [01:17:48] Wow. I did not know that. That's incredible.
Renee DiResta: [01:17:50] Yeah. There's ways that you can tell if you're looking at a video of something that is alive. I am not an expert in this particular type of detection, but what happens now though, is we have the ability to generate text and still images and video and audio. And I think we focus a lot on the video because it's very, very sensational. As you mentioned, it also did kind of really take off in porn first.
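[Editor's note: a rough sketch of the pulse cue Renee describes. In video of a living person, skin color oscillates faintly with the heartbeat, so one detection approach averages a color channel over the face per frame and looks for a dominant frequency in the human heart-rate band. The "video" below is synthetic, and production detectors are far more sophisticated.]

    import numpy as np

    FPS = 30
    t = np.arange(FPS * 10) / FPS                 # 10 seconds of "video"
    pulse = 0.5 * np.sin(2 * np.pi * 1.2 * t)     # 1.2 Hz = ~72 bpm flicker
    # Per-frame mean of the green channel over a face region (simulated):
    green_means = 120 + pulse + np.random.normal(0, 0.2, t.size)

    signal = green_means - green_means.mean()
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=1 / FPS)

    band = (freqs > 0.7) & (freqs < 4.0)          # plausible heart rates
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    pulse_like = spectrum[band].max() > 3 * spectrum[band].mean()  # crude
    print(f"dominant frequency: {peak_hz:.2f} Hz (~{peak_hz * 60:.0f} bpm), "
          f"pulse-like: {pulse_like}")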
Jordan Harbinger: [01:18:12] Yeah, that's where most people get their news these days. It's right out of Pornhub. Like, "New tech? Oh, let me go look it up right now."
Renee DiResta: [01:18:17] Porn sites started taking it down over non-consent, actually: the idea that the people who were in it had not consented to be in the video.
Jordan Harbinger: [01:18:25] Imagine looking to porn sites for like guidelines on how we're going to develop policy.
Renee DiResta: [01:18:30] Yeah, on how to develop policy. So that was kind of an interesting thing. Facebook and Twitter have since come down on generative media. They recognize that it's different from something that's just selectively edited. Selectively edited content is allowed to stay up, oftentimes with an annotation; generated content will come down if it's seen as being manipulative. The question is what happens when you have generated media that's labeled as satire. Does it fall under artistic expression? So there are a lot of these policy gray areas. But the thing with video and audio is that usually they're used to create sensational moments, right? It's a hot-mic politician or a politician's sex tape — you know, of course, because everything goes back to that.
[01:19:11] So those are very sensational moments that then inspire people to go and try to authenticate the video, find out who created it, find out how it spread. So a lot of attention immediately goes into understanding that particular video. The thing that's been more interesting to me has actually been GPT-3, which is OpenAI's text-generation AI.
Jordan Harbinger: [01:19:30] So this is like an AI algorithm or program that writes like a human. Is that kind of what that is?
Renee DiResta: [01:19:35] Writes text. Yeah
Jordan Harbinger: [01:19:36] Okay.
Renee DiResta: [01:19:36] You give it a prompt. You can give it constraints around how creative it can be. The AI has been trained on content from the Internet, so it's read Wikipedia, right? It has a body of knowledge, which is very interesting. And I've been using it just to see what comes out of it. Depending on what prompt you give it, and whether you constrain it to stick to what it knows or allow it to be more creative, you will get back very different content. You can submit the same prompt over and over again, and it'll return very different types of content. You can have it write long-form essays. You can give it tweets and it'll return tweet-length things. I've been working on an essay, actually, and I was having a hard time with the closing.
Jordan Harbinger: [01:20:17] Yeah.
Renee DiResta: [01:20:17] Well, it's an article on AI, so let me give it to GPT-3 and see what the AI generates for my closing. And it did a couple of really interesting things. Once or twice it pulled in characters I had not mentioned. I was writing about AI, and all of a sudden it gave me a paragraph on Edward Snowden, and I thought —
Jordan Harbinger: [01:20:33] Wow.
Renee DiResta: [01:20:33] — okay. I wasn't expecting that. Once or twice, it returned links to regurgitations of academic papers that it had probably kind of read at some point and consumed.
Jordan Harbinger: [01:20:44] Right.
Renee DiResta: [01:20:44] And then, once, it suggested a title: I gave it a couple of paragraphs and it suggested a title for them. And then it gave me — you know when you're reading an article and it gives you a list of related articles at the bottom?
Jordan Harbinger: [01:20:58] Oh yeah.
Renee DiResta: [01:20:59] It generated a whole bunch of titles for things that I thought would be related to this content that I had given it.
Jordan Harbinger: [01:21:04] Wow
Renee DiResta: [01:21:05] Which I thought was sort of an interesting response to get back. But what the takeaway for me was like, it's not perfect. You know, it is sloppy. And at times it goes off the rails into these like freshmen philosophy essays.
Jordan Harbinger: [01:21:15] Yeah, I mean, that's, as far as I ever got with it. So it probably just looks like I wrote the essay but yeah — as Immanuel Kant once said — yeah.
Renee DiResta: [01:21:25] Or just like garbage, just like word soup where you write enough words on a page and you hope that something has come out of it, but really it hasn't.
Jordan Harbinger: [01:21:31] Right. Yeah, you hope your teacher's grading it like while watching Netflix.
Renee DiResta: [01:21:37] So it's not perfect, but it's remarkably good. And with some mild human curation, you could see how — if you wanted to run a Twitter bot, you could give it the prompt, run it a bunch of times, pull the stuff that comes out, evaluate it, check it off, push it into a queue, and then go, right? And it's original content. That's the thing that's so remarkable about this stuff: you can't trace it back to something somebody else said. It's not copypasta. It's not an n-gram tied to some community or something in the past. It's uniquely generated content. And so it really is interesting. In a way, it democratizes the ability to run a commenter army, because all of a sudden you have this machine that does the work of producing all the content that the 50 Cent Army has been paid to do. Right?
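[Editor's note: a sketch of the prompt-and-temperature workflow described above, written against the OpenAI completion API as it existed around the time of this episode (the library has changed substantially since). The model name reflects that era; the key and prompt are placeholders.]

    import openai

    openai.api_key = "YOUR_KEY_HERE"  # placeholder

    prompt = "Closing paragraph for an essay about AI-generated text:\n"

    # Temperature is the "how creative can it be" dial Renee mentions:
    # low values stay close to what the model knows, high values wander.
    for temperature in (0.2, 0.9):
        response = openai.Completion.create(
            engine="davinci",        # original GPT-3 model name at the time
            prompt=prompt,
            max_tokens=150,
            temperature=temperature,
        )
        print(f"--- temperature={temperature} ---")
        print(response.choices[0].text.strip())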
Jordan Harbinger: [01:22:29] Right.
Renee DiResta: [01:22:29] So you have the ability to create something virtually undetectable. It's cheap. It can be put into an environment where it's not sensational; people aren't going to go look into it. It's just read as, "This is the voice of the man on the street. This is the opinion of this commenter." And what I think a lot of these things ultimately mean is that we have to have a different relationship to our idea of identity online, right? When you can no longer trust that the content you're reading is being produced by a real person, when you can no longer distinguish, then the question becomes: one, do we care? Do we evaluate the comment based on its merits? But as we've talked about in this conversation, if you create the perception that a whole lot of people hold an opinion (and you don't even have to use copypasta now; you have an AI generating it), then you really do have the ability to influence through repetition, through persuasion, through this idea that everybody's talking about or thinking the same thing. So then the question becomes: how do we know that these are real people, real identities, tied to this content?
[01:23:29] And so I think the next thing we'll be thinking about is actually that question: how are we going to think about identity in an era when faces are generated? When I spoke with Joe Rogan, I mentioned this website, thispersondoesnotexist.com, which generates faces. The AI is generating the faces. Each face is different, and there's no permanent repository or library, so once the face goes by, even if I've screen-capped it, nobody's going to find it, because it's not a stock photo and it's not a real person. And as that technology improves, again, you can create personas with these computer-generated faces that are untraceable.
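[Editor's note: at the time of this conversation, thispersondoesnotexist.com served a freshly generated face on each request to its root URL; its behavior may have changed since. A minimal sketch of fetching one, using the third-party 'requests' package.]

    import requests

    resp = requests.get(
        "https://thispersondoesnotexist.com",
        headers={"User-Agent": "Mozilla/5.0"},  # some hosts reject bare clients
        timeout=10,
    )
    resp.raise_for_status()

    with open("generated_face.jpg", "wb") as f:
        f.write(resp.content)
    print(f"saved {len(resp.content)} bytes -- this face has no source photo")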
[01:24:05] And we've already seen this in foreign influence operations. I was talking about it with Rogan in March 2018 — sorry, I spoke with him about it in March 2019.
Jordan Harbinger: [01:24:13] Yeah. This year has been five years long. It's really confusing, it being 2020. It has felt like at least three to five years in my perception.
Renee DiResta: [01:24:23] So I was having this conversation about this technology in March of 2019. And in November 2019, we had the first takedown of a network of pages attributable to an entity affiliated with the Epoch Times, called The BL; I think it stands for The Beauty of Life or something. One of these, you know — fake is not the word — one of these kind of spammy Facebook pages —
Jordan Harbinger: [01:24:44] Okay.
Renee DiResta: [01:24:44] It created personas to share their articles, and they used fake accounts with AI-generated faces, just going into groups and sharing content affiliated with this publication. So we've already seen the ways in which people who want to manipulate will use all of these technologies as they become available. And so the question becomes: how do we think about, in advance, what kinds of frameworks we're going to want, to ensure that people can trust the information and content in front of them, and trust that the person saying it is real?
Jordan Harbinger: [01:25:17] Renee, thank you so much. This has been fascinating. It's enlightening. It's also a little disturbing because it almost seems — how optimistic are you that we're going to be able to solve this? Like what happens in other countries like Estonia that have been hit with Russian influence operations for decades? Like, are they doing okay or is it just nobody trusts anything they see ever?
Renee DiResta: [01:25:35] I think there are certain countries that have a higher degree of trust in government than we do right now. So, like, Sweden has been a target for operations, but people trust the government. When the government says, "This is how these operations look," people internalize that; they feel better educated, and the impact is less. That doesn't happen here. Unfortunately, it turns into, well, this political party said it's a thing, and this political party is just saying it because they hate that other political party. So it's become a partisan debacle here.
[01:26:03] People have adapted to emergent technologies before. Propaganda has been around for a very long time. I think the question becomes: are we sufficiently prioritizing making changes to improve our information environment, right? Recognizing that this is the new normal, how do we want to adapt to it, either by way of regulation or by way of changes, you know, making algorithms more cognizant of the downstream impacts they have? So I think that's where we are today. I wouldn't say "solve"; I don't think you solve disinformation. I think it's more like a chronic condition, right? You manage it, you adapt to it, you help the population be resilient to it. And that's where we have to be going.
Jordan Harbinger: [01:26:44] Renee DiResta, thank you so much.
Renee DiResta: [01:26:46] Thank you.
[01:26:49] I've got some thoughts on this episode, but before I get into that, here's a preview of my episode with Matthew Schrier. He was a freelance journalist kidnapped in Syria by Al-Qaeda and locked up for months at a time in basements all over Syria. Matthew suffered and learned a lot and eventually managed to escape his captors. This was a crazy conversation. Here's a snippet.
Matthew Schrier: [01:27:09] Boom! This silver Jeep Cherokee just cuts across from the oncoming lane and forces us to a stop. The doors popped open and they got out. The guy in the front seat, you know, he was cloaked head to toe in black. He had an AK in his hand. The dude in the back seat was just this pock-faced guy in a sweater with a chrome pistol in his hand. They jumped out, and I knew exactly what was going on. I was just, like, in shock.
[01:27:31] Dude in the black came over, opened the cab door, takes me out, leads me up to the Cherokee, puts me in the backseat. He gets in after me. I looked at him. He reaches up. He pulls the ski cap I was wearing because it's cold in Syria in December. This is New Year's Eve. He pulls it over my eyes and leans me forward and presses the barrel of the rifle to my head. And we took off a couple seconds later.
[01:27:51] I still didn't know who had me. So, you know, the way to figure out who had me was to ask for a cigarette, because pretty much everyone in the Free Syrian Army smokes, and anyone in a gang will smoke. And when they told me I couldn't smoke, that's when I knew I was really in deep trouble, with the Al-Nusra Front, which is Al-Qaeda.
[01:28:07] And they bring me down a hall to the boiler room. And that's where they torture people. There are kids everywhere. There's a guy hanging from a pipe by handcuffs.
[01:28:16] They sit me down with my knees bent up to my chin, and they force a car tire around your knees, and they take an iron rod and slide it over the tire but under your knees in the crook, and that locks it into place. And then they flip you over on your stomach, so you're cuffed and your feet are in the air and you can't move them. And they take this thick cable, and that's what they use. They start wailing on the bottom of your feet.
[01:28:43] Let me tell you something. It freaking hurts. And I got 115.
[01:28:51] That was the beginning of our punishment.
[01:28:57] Well, what, are you out of your mind? We're trying to escape from a terrorist prison here. We have more to worry about than getting our arm jammed between a rock and a hard place for 127 hours.
[01:29:04] He was like, "Well, I never saw that movie."
[01:29:06] And I was just like, Ahhhh!
Jordan Harbinger: [01:29:09] To hear how Matthew survived captivity and escaped being held hostage by Al-Qaeda in Syria, check out episode 217 of The Jordan Harbinger Show.
[01:29:20] There was so much we just ran out of time for. I mean, there are fake YouTube channels. There are fake podcasts. The Chinese Communist Party does this too; they call them white monkey jobs, which I think is interesting. There was a whole fake YouTube channel called Williams and Kalvin. It's all just to create fear, uncertainty, and doubt. The people in it were hired by Russian intelligence operations; they had no idea who they were working for, and they were just getting money supplied by foreign agents. Just crazy. I mean, it makes perfect sense. It's so easy to just go on websites like Fiverr or hire other YouTubers to create sponsored video messages and make them political or divisive.
[01:29:52] Now, what we can do, Renee was telling me, is look at the dates of social media accounts. Is the account new? Does it have 40,000 retweets in the last two weeks? That screams automation. Is it singularly focused on one topic while posing as a real person? You know, somebody who only talks about anti-vaxx 24/7 and seems to never sleep. A rough sketch of what these checks might look like in code follows.
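To make those heuristics concrete, here is a minimal, hypothetical sketch in Python. None of the field names (created_at, retweets_last_14_days, topics, active_hours) come from any real platform API; they're assumptions standing in for whatever account data you can actually pull.

```python
from datetime import datetime, timezone

def suspicion_signals(account: dict) -> list:
    """Flag the account-level red flags described above.

    The input dict, its fields, and the thresholds are illustrative
    assumptions, not a real platform API or a definitive detector.
    """
    signals = []

    # Is the account brand new?
    age_days = (datetime.now(timezone.utc) - account["created_at"]).days
    if age_days < 30:
        signals.append("account is less than a month old")

    # 40,000 retweets in two weeks screams automation.
    if account["retweets_last_14_days"] >= 40_000:
        signals.append("retweet volume suggests automation")

    # Singular topic focus while posing as a real person.
    if len(account["topics"]) == 1:
        signals.append("posts about exactly one topic, 24/7")

    # Never seems to sleep: active in nearly every hour of the day.
    if len(account["active_hours"]) >= 20:
        signals.append("no plausible sleep pattern")

    return signals

# Example usage with made-up data:
account = {
    "created_at": datetime(2020, 9, 15, tzinfo=timezone.utc),
    "retweets_last_14_days": 41_250,
    "topics": {"anti-vaxx"},
    "active_hours": set(range(24)),
}
print(suspicion_signals(account))
```

No single signal is conclusive; it's the combination of several at once that should raise your suspicion.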
[01:30:12] Also, the IRA understands the nuances of meme and Internet culture, and they're extremely adept at this kind of communication. For example, they use only people of color in memes on Black LGBT pages; they know better than to use white people in the photos. They know exactly what they're doing, and they understand the subtle cultural differences that will make something stick here in the United States. They just keep testing. Hundreds of people work there, and this is all they do.
[01:30:40] Also, and I thought this was interesting, the pages morph from one thing to another. They might start off as, say, black gun owners, and then suddenly it's a page for the hacker group Anonymous. They're often sleeper accounts. They'll go from The Simpsons to Bernie Sanders to Second Amendment Jesus. It's just crazy. Or they'll start off as something really innocuous, like The Simpsons, build a following of a couple hundred thousand, and then overnight they get very political and change the subject matter.
[01:31:11] The platforms know they are serving this BS. They just want us to stay on the platform longer so they can serve us ads, and the recommendation engine on things like YouTube automatically does this for that reason. Platforms need to admit this is a problem before we can get a handle on it and actually fix it.
[01:31:28] So, how do we tell if what we're seeing is an influence campaign or if it's quote-unquote real? Well, does the article that's being shared support what the headline says? Read things before you share. Think about things before you share them. Ask people who might know better before you share; a lot of people ask me stuff all the time about this, especially on Twitter. Also, does the post read like American English? You'll see a lot of bad English in some of these fake, info-warfare-type posts. Of course, that's not foolproof; you'll also see repurposed American content, and there are plenty of crazies here in this country as well. Is it automated? Is it all the same thing? Is it all memes? That doesn't guarantee it's a bot, but it's a pretty good hint. A companion sketch for these content-level checks is below.
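As a companion to the account-level sketch above, here is a hypothetical content-level checklist in the same spirit. Again, every field name is an assumption for illustration; several of these judgments (does the body support the headline, does the English read natively) ultimately require a human in the loop, so they're modeled here as judgments a reviewer records rather than automated classifiers.

```python
def content_red_flags(post: dict) -> list:
    """Flag content-level warning signs from the checklist above.

    `post` is an illustrative dict; the boolean fields represent
    human judgments, not outputs of any real classifier.
    """
    flags = []

    # Does the shared article actually support its headline?
    if not post["body_supports_headline"]:
        flags.append("headline is not supported by the article body")

    # Does the text read like native American English?
    if not post["reads_like_native_english"]:
        flags.append("awkward or non-native English")

    # Is the feed nothing but memes, all pushing the same message?
    if post["meme_fraction"] > 0.9 and post["distinct_messages"] <= 2:
        flags.append("all memes, all repeating one message")

    return flags

# Example usage with made-up judgments:
post = {
    "body_supports_headline": False,
    "reads_like_native_english": False,
    "meme_fraction": 0.95,
    "distinct_messages": 1,
}
print(content_red_flags(post))
```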
[01:32:08] And by the way, we are not the only country facing these sorts of attacks. Countries like Estonia and other nations in the Baltic region of Eastern Europe have been dealing with this for decades because of their proximity to Russia and their large domestic Russian-speaking populations. So this isn't a new problem, and it's not a problem that's going away anytime soon. The solution is education. We are the ones who have to know how to filter this information, and you know damn well you're not going to be able to train crazy Uncle Frank, so it's on you. Remember, everyone who believes all this crap also goes out and votes. And they reproduce, for that matter. So make sure you're doing the same, or at least voting and educating yourself if you don't want kids.
[01:32:46] Big thank you to Renee DiResta. Links to her stuff will be on the website in the show notes. Please, if you buy any books from the guests we have here, use the links on our website; it does help support the show. Worksheets for this episode are in the show notes. Transcripts are in the show notes. There's a video of this interview on our YouTube channel at jordanharbinger.com/youtube. I'm @JordanHarbinger on both Instagram and Twitter, or you can just hit me up on LinkedIn.
[01:33:09] I'm also teaching you how to connect with great people and manage relationships using systems and tiny habits, so that it's not a huge pain every single day. That's over at jordanharbinger.com/course. Dig the well before you get thirsty. If you're trying to create relationships once you need them, you're too late. Most of the guests on the show are in the course and help out with the course. Come join us; you'll be in smart company.
[01:33:30] This show is created in association with PodcastOne and my amazing team. That's Jen Harbinger, Jase Sanderson, Robert Fogarty, Ian Baird, Millie Ocampo, Josh Ballard, and Gabriel Mizrahi. Remember, we rise by lifting others. The fee for this show is that you share it with friends when you find something useful or interesting. If you know somebody who's interested in info warfare or Russian interference or just interested in how these platforms work and how people are persuaded, go ahead and share this episode with them. I do hope you find something great in every episode of the show. So please do share the show with those you care about. In the meantime, do your best to apply what you hear on the show, so you can live what you listen, and we'll see you next time.
[01:34:11] Now, there are more ways to be a team with Microsoft Teams. Bring everyone together in one space with a new virtual room. Collaborate live, drawing, sharing, and building ideas with everyone on the same page. And make sure more of your team is seen and heard with up to 49 people on screen at once. Learn more about all the newest Teams features at microsoft.com/teams.