Jaron Lanier is an early Internet pioneer, computer scientist, visual artist, musician, and author of Ten Arguments for Deleting Your Social Media Accounts Right Now.
What We Discuss with Jaron Lanier:
- The real cost of the “everything is free” mentality that accompanied the cultural proliferation of the Internet and social media.
- How social media manipulates human behavior to threaten free will.
- Why negative emotions are the lifeblood of social media.
- How social media contributes to the mass production of misinformation.
- Why feeding on social media content tailored to you makes it difficult to empathize with the perspective of others.
- And much more…
Like this show? Please leave us a review here — even one sentence helps! Consider including your Twitter handle so we can thank you personally!
On the surface, social media exists to keep human beings connected. But at its core, its business model relies on surveilling, analyzing, and manipulating your behavior in order to more effectively sell you things. It also deprives you of your economic dignity, hampers the democratic process, and even undermines your experience of humanity.
In this episode, Jaron Lanier, author of Ten Arguments for Deleting Your Social Media Accounts Right Now, the man who coined the term “virtual reality,” and one of the architects of the early and not-so-early Internet, dissects how social media companies curate and essentially control what we see, think, and feel. Listen, learn, and enjoy!
Please Scroll Down for Featured Resources and Transcript!
Sign up for Six-Minute Networking — our free networking and relationship development mini course — at jordanharbinger.com/course!
The One You Feed is a podcast by Eric Zimmer and Chris Forbes that hosts inspiring conversations about creating a life worth living. Check it out here!
More About This Show
While he experienced bouts of living off the grid throughout childhood, Ten Arguments for Deleting Your Social Media Accounts Right Now author Jaron Lanier is anything but a Luddite. He was an early Internet pioneer who coined the phrase “virtual reality,” and he was creating video games when the Commodore 64 was the most sophisticated and portable computer consumer dollars could buy. But his presence is notably absent from the ubiquitous social media outlets most of us take for granted these days, and it’s by design.
“I don’t think social media in some broad sense is necessarily bad,” says Jaron. “I don’t think it has to be bad forever. I think that there’s this business model that makes it bad…it’s been taken over by this advertising paradigm. And what that means is any time two people connect, it’s financed by some third person who wants to manipulate those people — because that’s the only way anyone makes money — the whole system becomes optimized for addiction, manipulation, sneakiness, and trickiness.
“And once it’s optimized for that, then it’s really easy for bad actors to create millions of fake people to create fake social perception to create…fake news, fake paranoias and irritabilities to get people distracted or shut down. It’s a very common strategy. The whole thing has turned into garbage.”
But Jaron also believes it doesn’t have to be this way, and cites software developer community GitHub as an example of social media being done right.
“It’s not about third parties manipulating you,” he says. “It’s about direct collaboration — contact between the people who are doing things — and it seems mostly really positive to me. It seems to be doing the good work of civilization and it seems to be improving the lives of people who are on it. There’s nothing compulsory about being on it — people don’t feel like they have no choice. But I think it’s good for them.”
Jaron also mentions the podcast as another Internet construct that contributes to “the good work of civilization,” but we might be biased in our agreement with him in this case.
Listen to this episode in its entirety to learn more about the real cost of the “everything is free” mentality the Internet fosters, the dangers of being immersed in your own social media bubble, how information curation by algorithm severs connections rather than builds them, why social media exposure may be counterintuitively shrinking your capacity for empathy rather than helping it thrive, why negative emotions are the lifeblood of social media, how social media contributes to the mass production of misinformation, why even good intentions can be twisted and exploited by an uncaring algorithm to generate oppositional emotional responses and behavioral patterns, the bright side Jaron sees in this entire mess, and much more.
THANKS, JARON LANIER!
And if you want us to answer your questions on one of our upcoming weekly Feedback Friday episodes, drop us a line at firstname.lastname@example.org.
Resources from This Episode:
- Ten Arguments for Deleting Your Social Media Accounts Right Now by Jaron Lanier
- Other Books by Jaron Lanier
- Instruments of Change by Jaron Lanier, Spotify
- Jaron Lanier’s Website
- How We Need to Remake the Internet by Jaron Lanier, TED 2018
- Jaron Lanier: “The Solution is to Double Down on Being Human” by Tim Adams, The Guardian
- Blood Meridian: Or the Evening Redness in the West by Cormac McCarthy
- The Legendary Slant Six, Dodge Blog
- “Surely You’re Joking, Mr. Feynman!”: Adventures of a Curious Character by Richard P. Feynman and Ralph Leighton
- A Great Story about How Jaron Lanier Met Timothy Leary by SatoriD, Steemit
- Moondust by Jaron Lanier, C64 Archive
- Head-Mounted Display, Ivan Sutherland, 1968
- Data Gloves and Eye Phones, Jaron Lanier, 1990
- Did Susanne Langer Invent Virtual Reality? by Tom Leddy, Aesthetics Today
- GitHub: The World’s Leading Software Development Platform
- All about Empathy: History of the Word and Concept Of Empathy, Center for Building a Culture of Empathy
- Facebook’s Targeted Ads Are More Complex than It Lets On by Louise Matsakis, Wired
- Google’s Ad Tracking Is as Creepy as Facebook’s. Here’s How to Disable It by Olivia Solon, The Guardian
- Facebook’s ’10 Year Challenge’ Is Just a Harmless Meme — Right? by Kate O’Neill, Wired
- TJHS 139: Kai-Fu Lee | What Every Human Being Should Know about AI Superpowers
- Ex-Facebook President Sean Parker: Site Made to Exploit Human ‘Vulnerability’ by Olivia Solon, The Guardian
- Is Social Media Addiction Worse Than Cigarettes? by Goran Wagstrom, Forbes
- UN Says Facebook Helped Fuel Rohingya Ethnic Cleansing by Nicola Smith, The Telegraph
- ISIS’s Use of Social Media Still Poses a Threat to Stability in the Middle East and Africa by Antonia Ward, Georgetown Security Studies Review
- Nothing about ‘Blood and Soil’ Is American by Edward Morrissey, The Week
- The New Blood Libel by David M. Perry, Pacific Standard
- Elon Musk Triples Down on Thailand Diver Pedophilia Claims by Glenn Fleishman, Fortune
- Trump Thanks Kanye West for Twitter Compliments, BBC News
- The Mystery of Marlon Brando by Rex Reed, Observer
- How to Disconnect From Social Media but Stay Connected to the World by Jaime Green, Lifehacker
- Sounds of a Glass Armonica by William Zeitler, Toronto Star
- Pin Pia Demonstration, ICH Thailand
Transcript for Jaron Lanier | Why You Should Unplug from Social Media for Good (Episode 156)
Jordan Harbinger: [00:00:00] Welcome to the show. I'm Jordan Harbinger. As always, I'm here with my producer, Jason DeFillippo. Using social media is like living in a behaviorist cage. You are constantly being watched, analyzed, and manipulated. Rather than any particular technology, the business model of the social media companies that are watching you, well, that is the underlying problem. This business model relies on selling your data to advertisers that want to change the way that you act and convince you to buy. It also encourages some serious A-hole behavior, deprives you of your economic dignity, hampers the democratic process, and even undermines your experience of humanity. What's worse? The filter bubble makes us all see things that confirm our own worldviews and surrounds us with people who think the same way we do. This is so dangerous. Today, Jaron Lanier, one of the architects of the early and not-so-early internet, dissects how these media companies curate and essentially control what we see, think, and feel.
[00:00:57] If you want to know how I managed to book guests like Jaron and manage my relationships with hundreds, even thousands of people: I use systems and I use tiny habits, and I'm teaching you these for free over in our Six-Minute Networking course over at JordanHarbinger.com/course. All right, here's Jaron Lanier. I'll tell you, when I was prepping the show, I saw that your childhood sounded pretty wild. My wife and I were like, “Wait a minute. Then what happened? Hold on. What?” It was kind of like a Lifetime movie in a way. You had this gypsy-ish upbringing in a way.
Jaron Lanier: [00:01:29] I guess. I mean, it wasn't intentionally so. What had happened was, my parents were both survivors of deadly anti-Semitism in Europe. My mom survived a concentration camp in Austria. She's from Vienna. My dad's family was mostly wiped out by pogroms in Ukraine. And they met as survivors in this very Bohemian, kind of cool kids’ world of the fifties in New York City. And when they had a kid, which would be me, they had this impulse to run. Because what happened is, my mother's family didn't leave Vienna soon enough. Many others did, but they didn't. They waited too long. And of course that haunts me these days. I have a daughter now and, you know, you wonder what's happening in the US, and you're trying to make these bets. But none of us really know for sure. Anyway, they were going through that, and I think they came up with this calculation.
[00:02:31] We need to get as far from civilization as possible. But we've got these American citizenships now, so it has to be in the US, and we have to be in a college town, for God's sakes. And so the most remote place they could find that had a decent university was in Southern New Mexico. And the place they landed was actually very close to where Cormac McCarthy lives, very close to the Blood Meridian territory. I recognize every little rock and branch in that novel. And I initially didn't have a gypsy childhood at all. Initially, my mom raised me like a kind of high-pressure European mom would. Had me take a bus across the border every day to Mexico, because in those days, Mexico was more advanced. It had schools that were a couple of years ahead of the Texas schools, and everybody who cared about their kids sent their kids over the border in little buses.
[00:03:28] Yeah. So Mexico was where the intellectuals and the artists were and where you went to get your kids educated. It would be like sending kids to Switzerland or something.
Jordan Harbinger: [00:03:38] Wow, that’s [indiscernible]. I did not know that about back then.
Jaron Lanier: [00:03:41] Oh yeah. And Mexico was this place that was not developed. It was still developing, but it had a sweetness about it. It was like Italy or something. It was this place people love to love, you know. But then the gypsy part starts when my mom dies in a car accident when I'm about nine. And then things really do get strange. It's true.
Jordan Harbinger: [00:04:06] Yeah. I'm reading about this and it's like you became a midwife. You delivered a baby for this criminal, even.
Jaron Lanier: [00:04:18] Oh, wait, wait! I didn't exaggerate, so you shouldn't either. I was a midwife’s assistant, which is different. Mom dies. It turns out my mother was the breadwinner, so we were suddenly super poor, and our house burned down, probably anti-Semitic arson actually. Yeah. This part of the country at that time was pretty rough. There was a lot of violence. We weren't at the bottom of the social rungs; that was reserved for what were called Chicanos, or Hispanics of Mexican ancestry. They were really put upon pretty badly. One of the kids in my elementary school was murdered by other kids in the school, and they got away with it. Oh yeah. No, I mean, America in that period, especially rural and remote America, was rather violent and awful and scary. It's one of the reasons why, these days, when some of the internet idealists say, “Oh, the Internet's the new Wild West, and the hackers are the new cowboys.”
[00:05:19] I'm thinking, “Oh God, that was horrible. The Wild West was terrible.” I was there like, we don't want that. But anyway, what happened was, we didn't have any money, so we moved to this piece of super cheap desert land with all our stuff under tarps and we lived in tents and gradually built this crazy house that my dad let me design so it's a very long and crazy story. But yes, indeed, it's true. When I needed to start making money, I did become an assistant midwife and it was for a service that helped indigent farm workers. So we'd run around into fields and help women who were giving birth in fields. I mean, it was just in the field. Yeah. Because they couldn't, they weren't documented so they couldn't go anywhere. Yeah.
Jordan Harbinger: [00:06:08] Oh wow! So you deliver this baby and I guess the guy who went to prison at the time, the father, and then when he gets out, he gives you this car and that's got bullet holes in it.
Jaron Lanier: [00:06:18] Well, yeah. I mean, what happened was, the mother had some sort of mental difficulty and was institutionalized shortly after the birth, the baby's undocumented, dad's in jail. He had been caught smuggling drugs across the Rio Grande, and the Rio Grande is more just sort of a muddy thing with a bit of water in it. You can drive across it, sort of, but the car got stuck, bullet holes, and the car was just sitting there. He gets out and he says, “Would you like a car?” And in those days, I must've been, I'd have to reconstruct it, but I was probably about 15 or 16 years old or something.
Jordan Harbinger: [00:06:53] Hell yeah, you want a car!
Jaron Lanier: [00:06:54] Boy, did I want a car! Because there was no other option in those days. Like, you walked, you hitchhiked, or you had a car. There was no other option because the conditions weren't good for cycling. I mean, that was it, you know, and so I got his car and he said, “Oh, bullet holes, we'll put bumper stickers over the bullet holes.” So that worked actually pretty well. It had rotted out in the river, so you could see the street going by under your feet, and you had to be really careful not to burn yourself on the exhaust because it was hot, and there were no back seats. The other thing I did for money, aside from being an assistant midwife, is I had a goat herd and I sold milk and cheese, and this paid for my undergraduate education.
[00:07:34] I was already in college. I'd started college early. So in these years, I had to pay for tuition and stuff. So I used to make goat milk and sell it. And this car, it was a Dodge with the Slant Six engine, and those were indestructible. And I drove it to Silicon Valley eventually, and it served me through my first years here. And I cried when I finally decided I had to give it up.
Jordan Harbinger: [00:07:56] I'm surprised it didn't just sort of disintegrate while parked.
Jaron Lanier: [00:07:59] You know, it almost did. I was thrown to the ground by cops a few times around Palo Alto when I finally made it to Silicon Valley, because they'd see me starting it with wires, you know, it didn't have a key, and they'd think I was stealing it. But I'd say, “Who would steal this? It's really mine.” But yeah, I missed it. I missed it a lot when I had to give it up.
Jordan Harbinger: [00:08:20] What brought you, this is all decidedly low tech, right? So what brings you to Silicon Valley at that point?
Jaron Lanier: [00:08:25] I'm a kid in New Mexico. By now, I've been to New York and back. That's a whole story. I'm probably about 17 and have my first serious girlfriend. She turns out to be visiting her mom, who was estranged from the family. Her dad is back in LA, and he happens to be the head of the Caltech physics department, which I didn't know. She goes back after the summer to Pasadena. I chased her, as you might expect someone to do. And I suddenly landed in Pasadena, and I'm like the weird boyfriend of the charming daughter of the head of the physics department, which is some sort of weird role in the community.
[00:09:06] And it turned out to be a really important moment for me because even though it was informal, it meant I was spending time with people like Richard Feynman and learning things. And I already had a math background, though I hadn't even been formally taught, and I had a crazy other story going on. So it was an amazing time. And eventually, we’re kids, so she met somebody else. We're still friends now and she can't remember this other guy's name.
Jordan Harbinger: [00:09:33] Ha! Did you win the long game?
Jaron Lanier: [00:09:34] Yeah, I mean, yeah. You know, I guess. So I had to do something, and I just ended up actually catching a ride on the back of her brother's motorcycle up to Santa Cruz. And then I lived in Santa Cruz as a busker. I play music too. So I played on the sidewalk, and I lived in this absolutely, preposterously unsafe and unhealthy and revolting compressed group household by the beach with all these surfer kids around.
[00:10:06] Oh my God. Everybody was saying, “You know, you can do math. You should go over the hill to Silicon Valley.” You know what, I was kind of a naive country kid. I didn't even think that way, but I finally went over one day in my jalopy, that same car, and discovered that my skills were valuable in this funny place. So yeah, that's how I ended up.
Jordan Harbinger: [00:10:29] And you named and founded ‘virtual reality’, which, when I was reading your story, I was thinking, “Okay, your non-virtual reality is kind of chaotic.” So maybe there was some allure to like, “Hey, there's this whole world I can construct where I'm not getting chased by anti-Semites and losing my girlfriend and, you know, in danger of having my foot scraped off by the street as I drive.”
Jaron Lanier: [00:10:51] Yeah, I guess there's something to that. If your life is strange enough, then maybe virtual reality can be where you find normalcy. There might be something to it. When we get to a few years later and stories about Timothy Leary, we can cover that angle if you want. But I started out doing music for video games because, like so many young people, I was into music and I was into tech, and I really wanted to do creative music and I really wanted to do tech. I did music for super early video games, 8-bit era ones. And then eventually, I started making my own games. I had one that was pretty successful, but it's very, very strange. It would still seem strange today, I think. And it was called Moondust.
Jordan Harbinger: [00:11:33] And this is like pre-Nintendo or what? What era?
Jaron Lanier: [00:11:37] This would have been, yeah, this was way pre-Nintendo. This was 8-bit times, so we already had things like the Commodore, and the Apple II was out, and of course the Atari, the first console. And that was kind of it. It was before there was such a thing as a PC. And then I had this one hit game called Moondust that actually generated a lot of royalties. And so some friends and I, we moved into this little collection of old sort of bungalows or shacks along a creek on a dirt road in Palo Alto, a kind of place that just doesn't exist there anymore. And we all lived there, and the dream, you know, was to build virtual reality. That was what we wanted to do.
Jordan Harbinger: [00:12:18] So people were thinking about virtual reality in the eighties?
Jaron Lanier: [00:12:22] Well, sure. I mean, when you get into virtual reality, one of the ways you can stay up all night and have conversations is to talk about exactly when it started and what should count and what's the pre-history and all that.
[00:12:34] But the first headset that tracked, so that if you move your head there's a compensated 3D virtual world that appears to be stationary outside you, is one threshold for when you can start talking about it. Ivan Sutherland, the inventor of computer graphics, proposed that in the mid-sixties and built one in ‘69. So that would be the first headset. I made the first commercial one. And, you know, if one wants to quibble over firsts, I could come up with other firsts. I made the first mass-produced one, or, you know, production-line one. I made the first color one, and the first one that was fully self-supported in every sense. And the first hand interactions with gloves, the first multi-person one. And we did a lot of the first applications, like surgical sim and designing interiors and vehicle prototyping and all kinds of stuff like that.
[00:13:23] And indeed, the term ‘virtual reality’ was meant initially to be a contrast to the original terms. Ivan Sutherland, who is still with us, is currently in Oregon working on an amazing idea for a different approach to making chips, where instead of a central clock, all the different parts of the chip coordinate in an emergent way. It's very cool and interesting philosophically. But anyway, Ivan's original term was ‘virtual world’, which he got from an art theorist named Susanne Langer from the forties and fifties, and I thought, if that's a virtual world, then if you do a multi-person one, we should call that virtual reality, because reality is a shared world. So that's where virtual reality came from. And also back then, we had mixed reality, which is for when you have a combining display, which we did some prototypes of as well but never sold commercially in those days. But let me assure you, you can find people who will want to talk about this all night and argue about the little minutiae of which term is this and who did that. I mean, oh my God, there's no end. Yeah.
Jordan Harbinger: [00:14:22] Well, one of the reasons that I wanted to talk to you is, of course, I'm curious about all the tech, but you've got this book, which my producer, Jason, who is a big fan of yours by the way, basically made me read: Ten Arguments for Deleting Your Social Media Accounts Right Now. And when I read it, it struck some fear into me, and I want to dive into this because even my parents are on social media at this point. So first of all, you have no social media accounts at all, correct?
Jaron Lanier: [00:14:47] That is correct. There are many fake versions of me. There are periodically fake Jaron Laniers on Twitter and whatnot. I think Mr. Putin maintains a whole basement of fake Jaron Laniers, so far as I can tell.
Jordan Harbinger: [00:15:02] Yeah. Maybe.
Jason DeFillippo: [00:15:04] You're listening to The Jordan Harbinger Show with our guest, Jaron Lanier. We'll be right back.
Jordan Harbinger: [00:15:08] This episode is sponsored in part by Athletic Greens. What is the most important meal of your day? Traditionalists swear by breakfast, while the rest of us argue about the merits of lunch over dinner or vice versa. And if you're already tough as nails, like recent guest General Stan McChrystal, maybe you only have one meal a day. You can call it whatever the hell you want, but I know whatever the most important meal of my day is, it's probably going to involve Athletic Greens, especially if I'm traveling on the road. Athletic Greens is a supplement sourced from 75 whole food ingredients. It fuels energy, immunity, and digestion, and helps manage stress -- and I've got plenty of that, so I need all the management I can get. It's also got essential vitamins and minerals, prebiotics, probiotics, enzymes, adaptogens, and you basically take one daily scoop. You throw 12 servings of vegetables and fruits in there with that scoop. That's not bad. The Four-Hour Body author Tim Ferriss, of course, calls it his all-in-one nutritional insurance. I like to refer to it as vegetable insurance, but, you know, whatever. Jason, tell them what they can get.
Jason DeFillippo: [00:16:03] If you want to experience the difference that adding Athletic Greens to your health regime can make, go to athleticgreens.com/jordan for 20 free travel packs, a $79 value with your first purchase. That's athleticgreens.com/Jordan. Don't miss this deal.
Jordan Harbinger: [00:16:18] This episode is also sponsored by DesignCrowd. Crowdsourcing: that's how busy people get stuff done in the 21st century, and thanks to DesignCrowd, you can focus on running your business while handing over the reins of your company's logo, web design, tee shirt, that whole process, to a pool of over 670,000 pro designers from all over the world. DesignCrowd crowdsources custom work based on your specs. It's really simple. What you do is you go to designcrowd.com/Jordan. You tell them what you want and post a little brief of the art you need. They invite designers from all over the world, 670,000, to respond within hours. You get a few designs over the course of a week and change. You might get 60, a hundred, maybe even more different pieces from designers around the world. You pick the design you like best and you approve payment to the designer. If you don't like any of them, DesignCrowd offers a money-back guarantee. Jason, tell them where to get it.
Jason DeFillippo: [00:17:08] Check out designcrowd.com/Jordan, that's D E S I G N C R O W D.com/jordan, for a special $100 VIP offer for our listeners, or simply enter the discount code JORDAN when posting a project on DesignCrowd. Don't forget we have a worksheet for today's episode so you can make sure you solidify your understanding of the key takeaways from Jaron Lanier. That link is in the show notes at JordanHarbinger.com/podcast. Thanks for listening and supporting the show. To learn more about our sponsors and get links to all the great discounts you just heard, visit JordanHarbinger.com/deals. If you'd like some tips on how to subscribe to the show, just go to JordanHarbinger.com/subscribe. And now back to our show with Jaron Lanier.
Jordan Harbinger: [00:17:46] Why is social media bad for us? I mean, look, nutshell this for me, and then we'll go into each area in more detail. I already know that it makes me feel like crap, so I get it on a visceral level, but people don't really believe me. They think I'm being a Luddite when I talk about this stuff.
Jaron Lanier: [00:17:59] Okay, look, the first thing I want to say is, I don't think social media in some broad sense, is necessarily bad. I don't think it has to be bad forever. I think that there's this business model that makes it bad and so people can connect together in things like social media very positively. And if you want me to give you an example of one that I think is positive now, I can actually come up with a few. GitHub for programmers is like that. It's kind of like social media. It's not about third parties manipulating you. It's about direct collaboration contact between the people who are doing things. And it seems mostly really positive to me, really productive. It seems to be doing the good work of civilization and it seems to be improving the lives of people who are on it. There's nothing compulsory about being on it.
[00:18:50] People don't feel like they have no choice, but I think it's good for them. So that's an example, and that's kind of specialized, but what's happened with kind of mainstream social media is that it's been taken over by this advertising paradigm. And what that means is anytime two people connect, it's financed by some third person who wants to manipulate those people, because that's the only way anybody makes money. The whole system becomes optimized for addiction, manipulation, sneakiness, trickiness. And once it's optimized for that, then it's really easy for bad actors to create millions of fake people, to create fake social perception, to create just fake perception -- fake news, fake paranoias and irritabilities to get people distracted or shut down -- a very common strategy. And so the whole thing has kind of turned into garbage. And if you look at examples where you don't have everything optimized for these third parties who believe they can manipulate you, it doesn't have to be that way. And I mentioned GitHub as one example. There are some others, but I've been liking that one lately because there you have a really positive online community. Like, why can't we do that for other aspects of life?
Jordan Harbinger: [00:20:03] Yeah. And you explain why that doesn't work later in the book. And one thing that really struck me was that people have different views of reality entirely, speaking of virtual reality, because we're actually seeing totally different things on social media. So it's not like I'm sitting in the same room with my mom and dad over the holidays watching Fox News or MSNBC, right? It's everyone you know online, every video YouTube feeds you, every personality you're presented with that you then follow. They're all curated to reinforce a specific view. And we're all kind of getting grouped into these different groups based on who we are and what we're doing online. In that way, we all kind of seem crazy to each other because of the algorithm.
Jaron Lanier: [00:20:45] Yeah. So the thing is, each person has a personalized feed, and this is very important, that's calculated to manipulate that person, that's calculated to change the behavior patterns of that person. And make no mistake, if that wasn't going on, Facebook would be bankrupt, because that's the only product they have to sell. That's what they do. When you put money into it, that's what you're buying. So if everybody's seeing this slightly different world that's optimized to manipulate them, then when they talk to each other, they don't have as much of a basis of shared experience as they need to have to fully empathize. And so when you talk to somebody who's been using social media, it's kind of weird. It's like you haven't been out; they've been away in their social media world, but everybody has been. And then it's almost like we all become slight strangers, as if every single day
[00:21:39] we've returned from some weird vacation that nobody else can relate to. And you start to have this strange way in which things don't feel real anymore, because things feel real through social perception. People evolved to pay attention to what the people around them are paying attention to; we work together as a group, like meerkats if you like, kind of looking out. And in the book, I describe an experiment my friends and I used to do when we were little kids, where you go out into a crowd on the street and just start looking at something and pointing, and everybody will be looking there, even if there's nothing. And that's kind of like persuasion on social media. And if there's no agreement on what all that stuff is, the real world becomes less real. It becomes less shared. And it doesn't have to be that way. Sharing online shouldn't decrease the reality of offline real-world stuff. But if you do it this way, it does.
Jordan Harbinger: [00:22:32] So, coming back to how this actually works, right? Because of the algorithmic customization, everyone's personal feed, among other things, looks different. And people who are listening to this who aren't really technically inclined might be like, “Wait, I'm seeing the same thing that my husband is seeing, or that my girlfriend is seeing, or that my kids are seeing on social media.” And that's not the case. The feed is different, the YouTube curation is different. So everything we're being fed has essentially been tailored just for us, especially on the big platforms like Google, Facebook, and Instagram.
Jaron Lanier: [00:23:03] Well, what happens in practice is, the story that the programmers tell themselves is that we're measuring data from people and using it to optimize the experience of people. But in practice, that becomes us manipulating people and guiding them, because there's no way to tell which is happening. Like, if you say, “Well, I'm going to have this algorithm do more of whatever seems to be predictive for the person,” that might be that you're taking in data and optimizing something so it's perfect for the person. Or it might be that you're changing the person to match your data. You have no way of telling which is more true. See what I mean? It's a bit of a subtle point. And so, what then ends up happening is they tend to sort of cluster people together, to corral them into groups. But the problem is, when you're corralled into a group of shared perception, it doesn't help that problem
[00:23:59] I was just talking about people living in different realities, because those aren't even necessarily people you know. So the algorithm might corral you into a shared perception about the latest music and a politician and the latest cheese and the latest nose-picking trends on YouTube or God-knows-what. I shouldn't make fun of the silly stuff -- okay, whatever people like, it's great -- but it might corral you into a group of similar people, and it does that using statistics on large populations to learn. But there's no guarantee that you know those people; there's no guarantee that you're interacting with them. So on the one hand, it corrals you into a type; on the other hand, it does it in such a way that it doesn't reduce that problem of you and other people not quite living in the same world. So you're sort of getting the worst of both approaches.
Jordan Harbinger: [00:24:46] And that not living in the same world erodes our empathy for one another, right? That's kind of what I've understood from the book. Since we can't understand each other so well, we kind of lose the ability to feel for each other too.
Jaron Lanier: [00:25:00] Yeah. So empathy is a really interesting idea. The term empathy was actually invented by psychologists in Germany a long time ago who were trying to imagine something like virtual reality. So the very idea of empathy is part of the history of virtual reality. It was originally part of this idea that if you could imagine yourself positioned as any part of the universe -- some of the original examples were, if you could imagine yourself being a leaf blowing in the wind, or a mountain where you could feel forests growing on your body, that sort of thing -- that extreme exercise in changing who you are would then help you be able to experience life in some other person's shoes, who in comparison isn't so different from you. And then you might become kinder and less likely to be, you know, racist or biased or dismissive.
[00:25:47] That was the original idea, and that was part of the early idealism of virtual reality when we got around to it in the '80s. But unfortunately, it's not working that way at all, because instead you're being optimized for the purposes of whoever's paying for the advertising, or whoever is manipulating the system with a bunch of fake people. The metaphor I sometimes use is that there can be therapeutic hypnotists, but if you're hypnotized by somebody who's working for another person, and you don't even know who that is, then there's incredible potential for abuse, and probably everything gets a little weird and crazy. And that's kind of what's happened to the internet.
Jordan Harbinger: [00:26:32] So this isn't just harmless e-commerce, because look, my counterargument here would be, "I like seeing relevant ads. I want ads for stuff I might want. What's wrong with that?"
Jaron Lanier: [00:26:41] Yeah. You know, the whole ad thing was really cute at first, because we wanted to pretend we were in this socialistic environment. That was very intensely desired by the kind of leftist early internet culture -- to pretend we're giving freely and receiving things for free. But at the same time, we worshiped people like Steve Jobs, the big entrepreneurs. So we wanted to have some way to do business but still feel like socialists, and the advertising idea seemed to solve that. And so Google was kind of forced into a corner. The internet culture would have accepted no other solution, because it's the only solution that satisfies both passions at once. And at first, it really was cute. You'd see relevant ads and say, "Oh, I didn't know there was that dentist in my neighborhood. That's cool." The problem is that in its crudest form it's fine, but over the years the computers get faster and faster.
[00:27:29] The internet gets higher bandwidth, the algorithms become more sophisticated, all the players in the system learn and get more clever. And so the incentive to manipulate that's inherent in it just gets optimized and optimized and optimized, and at a certain point, we get so good at screwing with each other -- if that's what the internet is optimized for -- that it's no longer advertising. It's no longer just, "Oh, I'm seeing relevant ads." It turns into this dark thing that no person can even be conscious of. It becomes this really creepy new world of massive behavior modification. And that is different from advertising. It's different from anything we've ever meant by advertising in the past.
Jordan Harbinger: [00:28:10] Let's talk a little bit about the difference between, let's say, plain advertising and these sort of covert ads, or covert behavior modification. Like, I run ads on the show. I don't put them in the middle of the conversation, because I don't need to; I can put them in a post-roll. But how are those ads different from the things you're talking about online that are insidious?
Jaron Lanier: [00:28:31] Well, there are a few important differences. One is that the ad is not creating your show. Whereas if you're looking at a news feed that's made of a lot of little blips -- a lot of little short pieces that are put together in order to enhance the effect of the ad -- then the ad is creating the experience. And in fact, in the book, I state that podcasts are one example of something happening on the internet that hasn't yet been corrupted. So I don't have any objection to advertising per se. I object to advertising that's targeted and that creates the content. That's the tail wagging the dog, and that's where you can draw a red line, when that starts to happen. Advertising per se I actually have a pretty positive assessment of. Even though I often find advertising annoying, overall it's helped humanity learn about modernity and keep up as technology has moved. I think it's helped people adjust en masse to the new possibilities of new products and services, and that's been good for us. So I'm not anti-advertising.
Jordan Harbinger: [00:29:36] My ads are probably the least annoying part of the show.
Jaron Lanier: [00:29:38] And I'm going to help you keep them that way.
Jordan Harbinger: [00:29:43] Yeah. By the way, before I forget -- I know we're going to dive into a bunch of stuff here, but speaking of behavior manipulation and AI bots sucking up data: you won't have seen this, but there's this -- I don't know what you'd even call it -- a hashtag kind of going around that's called the 10-year challenge. And what you do is you take...
Jaron Lanier: [00:29:59] Oh no, I know. I follow that.
Jordan Harbinger: [00:30:01] So you do follow that stuff? Okay. So you take an old photo from 10 years ago, you take a current photo, and you place them side by side. And I did that because I thought, "Oh, that's fun." And then somebody was like, "Don't do that. The social media companies probably invented this so they can improve their AI aging-detection algorithms." And I was like, "Damn, that might be true. Or it could just be a BS conspiracy theory, or it could actually be kind of both" -- you know, invented by a normal person and now being used for that.
Jaron Lanier: [00:29:43] So one thing I want to say is that all it takes is a few students who are on social media -- just listening to them and talking a little bit over coffee once a week, and you're totally up on all the memes and everything. Like, I feel like I'm as up to date as a hardcore user, and it takes me like three minutes a week. It's really not hard to keep up with things like that.
Jordan Harbinger: [00:30:47] That’s about all you should spend.
Jaron Lanier: [00:30:48] Yeah. Because it doesn't merit more than that. As to whether somebody at Facebook deliberately set this up, I have no idea. Might they find some use in it after the fact anyway, for this creepy purpose -- maybe. I mean, a lot of times people doing machine vision research will try to come up with some kind of social game to get people to, say, tag cats versus dogs, in order to improve algorithms for recognizing cats and dogs. Let me bring up one angle on this sort of thing that perhaps your listeners haven't thought about, which I think about a lot, which is the economics of it. Because you've probably heard this trope: "Oh hey, this time it's different. The new technology this time is going to throw you out of work. Even though in the past, every time there was new technology, it just created new jobs, AI is different, because AI really replaces a person at whatever the job is, and so then there is no new job. And there aren't going to be enough jobs running the AI, because those are very specialized, mathy -- you know, you need a degree from Caltech to do those kinds of jobs."
[00:31:46] Okay, so not true, in my view. What happens is, in fact, we need data from people to run AI, and we currently steal that data through social media and then tell the people that they're obsolete. Take language translation -- this is my favorite example, because I think it's clear. The companies that do automated language translation have to take tens of millions of new example phrase translations from bilingual people every single day, just to keep up with pop culture and memes and politics and news and all that stuff. And those bilingual people are seeing their careers just go away because of the automated translation systems, and they're being told, "Oh, you'll be obsolete" -- except we still need them, right? So with all this little data, like the 10-year challenge, when we give AI programs data, we're being stolen from, and then we're being told we're obsolete. I mean, it's a lie. And there's something terribly creepy there -- a sort of spiritual crime -- in telling people they're worthless when they're actually needed.
Jordan Harbinger: [00:32:51] I definitely agree and look, I don't want to read a novel translated by Google Translate from German to English. It's going to be a little rough.
Jaron Lanier: [00:32:59] Right. The Google algorithm probably doesn't want you to read a novel. It wants you to watch one video and then another and another that it recommends, until it turns into some really horrible, you know, creepy, paranoid, irritability-enhancing, stupid video from somewhere. That's what YouTube appears to want.
Jordan Harbinger: [00:33:16] Geez. Yeah. Some sort of Clockwork Orange scenario. All right, argument number one: social media can manipulate your behavior, and it puts your free will under threat. And what you said in the book is, "We're in a cage. We're being watched, manipulated, and analyzed while inside this cage." Now, a lot of people are going to go, "Okay, you've sort of explained why, and I get it. We're feeding the algorithm." Kai-Fu Lee came in here before and was talking about how China's going to win because they have more data -- because, of course, they have more people, and they're really up in their business, so to speak. But how are algorithms actually predicting behavior? How is an algorithm taking seemingly irrelevant data and turning it into something that could hurt us?
Jaron Lanier: [00:34:03] It's just a boatload of statistics laid on top of a structure that's sometimes called a neural net, where you have a whole bunch of little placeholders for intermediate results of those statistics that are related to one another. And it's not that hard to understand, but it's very rarely explained, and I don't think I can do it with audio alone. But let me give you an example of how it works. Let's imagine you have a thermostat, and the thermostat turns off and on as the temperature moves. And then you say, "Wait, I want the thermostat to behave differently if a person's in the room." So suddenly there's this other thing that's measuring whether a person's in the room or not, and it's related. You can think of that as being like two neurons. Then you add and add and add to that.
[00:34:47] And if you build systems like that, they can start to discriminate between more and more different situations and act accordingly. And eventually, you can get them to recognize whether they're looking at pictures of a cat or a dog. But it's really just an accumulation of that same principle -- compounding a whole lot of little things, and they build up in layers of little accumulators, mini perspective-keepers, you might say. And the thing about this is that they're sort of stupid. If you look at what any one of them knows, it has nothing to do with what the whole achieves collectively. So in a social case, there might be one of them that's looking at everybody who likes the flavor cherry, and whether that correlates with having blonde hair. And it might only be so in certain regions, and it might be just this weird thing -- it means nothing.
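[Editor's note: Lanier's thermostat-plus-occupancy example can be sketched in a few lines of Python. This is not from the book, just a toy illustration: each "neuron" is a weighted sum of inputs pushed through a threshold, and wiring two of them together already discriminates between more situations than either can alone. All weights and names here are invented.]

```python
# Toy sketch of Lanier's "two neurons" example. Each unit is just a
# weighted sum of inputs compared against a threshold; stacking units
# lets the system discriminate between more situations.

def unit(inputs, weights, threshold):
    """A single 'neuron': fire (1) if the weighted sum clears the threshold."""
    return 1 if sum(i * w for i, w in zip(inputs, weights)) >= threshold else 0

def heater_on(temperature_f, person_in_room):
    # Unit 1: is it cold? Fires when the temperature is below 65F
    # (negative weight flips the comparison direction).
    cold = unit([temperature_f], [-1.0], -65.0)
    # Unit 2: combine "cold" with "someone is home" -- heat only when both hold.
    return unit([cold, person_in_room], [1.0, 1.0], 2.0)

print(heater_on(60, person_in_room=1))  # cold and occupied -> 1
print(heater_on(60, person_in_room=0))  # cold but empty    -> 0
print(heater_on(72, person_in_room=1))  # warm and occupied -> 0
```

Real neural nets add many more units, layers, and learned (rather than hand-picked) weights, but the compounding principle is the same one Lanier describes.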
[00:35:38] But then if it turns out that all those blonde-haired, cherry-liking people responded to a certain ad in a certain way, then the accumulators will find that correlation. Then somehow, by magic, similar ads will go to other people who share those qualities. So you end up with this sort of quality bingo for humans -- this statistical way of classifying people according to random stuff. And there's no real science at the bottom of it. We don't know why these correlations exist, but statistics is real, and so you start to be able to manipulate people just through being able to run experiments on hundreds of millions or billions of people at once. You'll find these correlations that actually work on people. There are a few cases where social scientists get to work at Facebook and try to untangle what's really going on, but for the most part, it just happens in this way that's completely blind, and yet it works.
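[Editor's note: the blonde-hair/cherry-flavor example boils down to blind co-occurrence counting: tally how each combination of attributes responded to an ad, then target whichever combination responded best. No theory, just counts. A minimal sketch, with all data invented:]

```python
from collections import Counter
from itertools import combinations

# Invented user records: a set of attributes plus whether they clicked an ad.
users = [
    ({"blonde", "likes_cherry"}, True),
    ({"blonde", "likes_cherry"}, True),
    ({"blonde", "likes_lime"}, False),
    ({"brunette", "likes_cherry"}, False),
    ({"brunette", "likes_lime"}, False),
]

# Blindly tally click rates for every attribute pair.
clicks, totals = Counter(), Counter()
for attrs, clicked in users:
    for pair in combinations(sorted(attrs), 2):
        totals[pair] += 1
        if clicked:
            clicks[pair] += 1

# The "accumulators" then target whichever combination clicked the most.
rates = {pair: clicks[pair] / totals[pair] for pair in totals}
best = max(rates, key=rates.get)
print(best, rates[best])  # ('blonde', 'likes_cherry') 1.0
```

At real scale, this is run over millions of attributes and billions of users, which is why meaningless-looking correlations still end up steering what people see.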
Jason DeFillippo: [00:36:35] You're listening to The Jordan Harbinger Show with our guest, Jaron Lanier. We'll be right back after this.
Jordan Harbinger: [00:36:40] This episode is sponsored in part by SimpliSafe. No one should feel unsafe at home, period. Fear has no place in your house, and that's been SimpliSafe's mission from day one. You may have seen their commercial during the Super Bowl. If you didn't, you can find it online. SimpliSafe blankets your whole home with protection. Around-the-clock professional monitoring makes sure police are on the way when you need them. And the security sensors are really tiny -- they blend in with your house. You're not going to notice bulky things everywhere, because they don't exist. The Verge called SimpliSafe the best home security, and it's a Wirecutter top pick. More than three million SimpliSafe customers already know that it feels good to fear less. Protect your home today and get free shipping on any system order as a Jordan Harbinger listener. Just visit simplisafe.com/Jordan. That's simply -- S-I-M-P-L-I -- safe.com/Jordan.
[00:37:27] This episode is also sponsored by HotelTonight. Is the brutal polar vortex squeezing you in its merciless, icy grip this winter? Jason and I grew up in the Midwest, so we've been there. We get it. Our friends at HotelTonight offer this sage advice: consider taking a vacation in a place where your nose hair doesn't freeze on the walk between your front door and the snow-laden driveway you now have to shovel for the third time today. HotelTonight can hook you up by tonight for a quick thaw in a warm, sunny land, or in advance for a leisurely week on a remote beach, maybe, where a steady supply of cocktails is the only climate control you need. And if you think you don't have the budget for a vacation right now, HotelTonight's rates are so reasonable, you kind of can't afford not to -- unless your furnace somehow magically harnesses the power of wind chill, or you can burn stacks of icicles in the fireplace to stay toasty. Jason, tell them how to get it.
Jason DeFillippo: [00:38:15] Go to HotelTonight.com or download the app now and with the promo code JordanH you can get $25 off your first eligible booking. That's promo code JordanH. Thanks for listening and supporting the show. Your support of our advertisers is what keeps us on the air. To learn more and get links to all the great discounts you just heard, visit JordanHarbinger.com/deals. And don't forget the worksheet for today's episode. That link is in the show notes at JordanHarbinger.com/podcast. And if you're listening to the show in Overcast, please click that little star next to the show. We really appreciate it. And now for the conclusion of our interview with Jaron Lanier.
Jordan Harbinger: [00:38:49] Is this as simple as, "All right, there's a pocket of blonde-haired, cherry-loving people in Alabama. There's another pocket up in Vancouver, another one in Winnipeg, and then one in Sao Paulo -- as many blondes who love cherry down there -- and they don't like green. When they see green in an ad, very few click on it; they have an aversion to it. So then we surround a political candidate with green when we show that candidate to those people, and then they don't like that person anymore." Is that an oversimplification, or...
Jaron Lanier: [00:39:17] Not really. I mean, in a way, the ridiculousness that comes out in that example is good, because it is kind of ridiculous. It's just these little things, and through a multitude of blind tests, the algorithms just discover what works. A lot of it might have to do with color, it might have to do with timing, it might have to do with how many other messages you see before you get to the one you care about. It might be a whole world of things. I mean, nobody's ever cataloged all this. And the thing is, it's not transparent, and it's impossible to really know what's being done to you. I think the most important result that's come out of the research in this area, particularly the research published by the companies themselves, like Facebook's own papers, is that when Facebook proves it can make people sad, or when Facebook proves it can suppress a vote -- and these are things published as scientific papers -- the people involved couldn't tell that it was happening to them.
[00:40:15] Nobody ever detected it. And I think that's the main thing to get out of this: when you use this weird, indirect statistical technique, there's a certain creepiness to it, because you can't know about it. In the old days, when people used to get paranoid about advertising or subliminal messages, at least you knew you were looking at an ad. At least you could just, like, not look at the ad or something. But in these systems, you don't even know where the lines of attack are; you don't know where the lines are drawn. Like, right now, as we're talking, we know this is not an ad. This isn't calculated; this is actual reality. This is just me talking with you -- agree with me or not. But online, once you're dealing with these manipulation algorithms, particularly from Google and Facebook, you're in this world where you don't know what's being done to you.
Jordan Harbinger: [00:41:01] Yeah, it's a little scary. I mean, they sell our information. We're the product, not the client, which I think is a little scary. I had someone telling me the other day, "I don't understand why I can't get my Instagram back. It got blocked, and you know, their customer service is terrible." And I was like, "Oh, I think their customer service is great. You're just not one of their customers."
Jaron Lanier: [00:41:17] Yeah, that's exactly right. You're not. People think they're customers of these companies, but if you're not paying them, you're not a customer. You know what I mean? It's really simple.
Jordan Harbinger: [00:41:26] If you want them to pay attention to you, buy $2 million a year worth of advertising, and I'm sure that someone will call you back and figure out how to solve your problem.
Jaron Lanier: [00:41:34] Yeah. I don't know what that threshold is. I don't know if it's 2 million.
Jordan Harbinger: [00:41:37] I have no idea. It could be less. It probably depends on the medium. And I don't know if this is a whole different kind of question, but we're not being tracked by name, supposedly. But it doesn't really matter, right? Because we're being lumped into these little boxes. It kind of doesn't matter.
Jaron Lanier: [00:41:55] That's the really strange thing. And that's one of the reasons why I'm not sure that privacy is the right way to think about this. Because if you say, like in the new European laws such as GDPR, "Well, your particulars -- your name, your address -- we're going to cover that stuff," the thing is, this whole world of correlations just routes around that. If it can just figure out that you're the person with dreadlocks who has a blue phone case, and so on, whatever it is -- you know, lives in Berkeley -- in a sense, once you do enough of that stuff, that's actually better than having the specific name, and you can always derive the name anyway. I mean, that's been shown again and again and again. So privacy per se is probably not really even a useful way to slice this.
Jordan Harbinger: [00:42:41] Yeah, it kind of doesn't matter if someone knows your address and your name if they know your eye color, how often you get your hair cut, what time you wake up in the morning, the type of coffee and clothing that you buy, and the area where you live.
Jaron Lanier: [00:42:50] Yeah. Ultimately, identity is a collection of qualities even more deeply than it is a particular address or name. And if they have that, that's even better. So yeah, that's why the approach I've always taken to this is that we need to change the business model completely if we want to fix this. Otherwise, it'll just be switching slightly between flavors of hell until we finally realize we just cannot afford to do this anymore if we want to survive.
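[Editor's note: the point that correlations route around a scrubbed name can be shown with a toy sketch. Even with the name column removed, GDPR-style, a handful of coarse attributes -- the dreadlocks/blue-phone-case/Berkeley example above -- can still single out exactly one record. All data here is invented.]

```python
# Toy re-identification sketch: names are stripped, but a combination of
# coarse attributes still isolates a single record.

records = [  # name column already removed
    {"hair": "dreadlocks", "phone_case": "blue", "city": "Berkeley"},
    {"hair": "short", "phone_case": "blue", "city": "Berkeley"},
    {"hair": "dreadlocks", "phone_case": "red", "city": "Oakland"},
    {"hair": "short", "phone_case": "black", "city": "Berkeley"},
]

def matches(record, **attrs):
    """True if the record has every given attribute value."""
    return all(record.get(k) == v for k, v in attrs.items())

# No single attribute is identifying on its own, but the combination is unique.
hits = [r for r in records if matches(r, hair="dreadlocks",
                                      phone_case="blue", city="Berkeley")]
print(len(hits))  # 1 -- one person singled out, no name needed
```

This is the well-documented quasi-identifier problem: the more attribute columns a dataset has, the fewer records share any given combination, which is why removing names alone rarely anonymizes anything.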
Jordan Harbinger: [00:43:17] And of course, the simple answer is: stop using it. But the problem is, social media is designed to be addictive. And we know -- was it Sean Parker who said something like, "Yeah, we're actually messing with your dopamine response. Like, we're doing that on purpose"?
Jaron Lanier: [00:43:29] Yeah, Sean said that in the last few years. Now, here's the weird thing. I knew Sean back in the day, when he was the first president of Facebook, and I really don't think he was thinking that way at that time. I think they were still kind of idealistic and thought they were doing this great thing for the world. There was a lot of ego. There was a lot of mania. But I don't feel that they were trying to be evil, and I sometimes wonder if maybe he's misremembering -- thinking they had this whole evil plan -- in order to cast himself as this evil Bond villain, because that'd be really glamorous. I don't know.
Jordan Harbinger: [00:44:01] I mean, he seems like the type of guy who would be like, "You know, let me lean into this. I already got played by Justin Timberlake in that movie, so there's sort of nowhere else to go."
Jaron Lanier: [00:44:09] Yeah, like maybe being the bad guy isn't so bad. I don't know. I haven't talked to him in years; I don't know where he's coming from right now. But at any rate, at this time, he's saying they knew what they were doing and that it was deliberate.
Jordan Harbinger: [00:44:20] Can you explain how -- and you call these systems BUMMER in the book -- can you explain how it gets us addicted to likes, views, followers, and things like that?
Jaron Lanier: [00:44:30] It's "Behaviors of Users Made into an Empire for Rent" -- but I'm missing one M in there. I just need to come up with some word for it. The business model happened, as I said, because we wanted to have the feeling of socialism but also the romance of capitalism, and this ad paradigm was the way to get both things at once. And back when it was low-tech and just barely beginning, it was cute. But it's morphed into this horrible thing. I don't think the idea of connecting with people online is inherently bad -- otherwise, what would I be doing with my life? I've devoted myself to making that work. And as I say, I think there are examples -- I've mentioned a couple here, podcasts and GitHub, and I could mention many others -- where people connect over networks in a way that's positive and doesn't have this kind of weird darkness and manipulation and negativity, right? And even within Facebook and Twitter and so forth, there are very substantial numbers of people who have positive experiences. It's a statistical distribution; it's a bell curve. So I'm not denying the positivity that a lot of people experience. Unfortunately, we know from many measures that their experience is atypical and there's more negativity. But in theory, you could have a really positive thing similar to Facebook -- in fact, there were some early ones that were more positive. In theory, you could have something similar to YouTube that's much more positive, and in fact, I think we do. Netflix is not perfect, but it's better. It's a different sort of thing.
Jordan Harbinger: [00:46:06] Although Netflix is addictive -- just ask my wife. But I love Netflix, and the difference is I'm paying for it, so they have an incentive to keep me there by making great stuff. They don't need to trick me into giving up all these details, I suppose.
Jaron Lanier: [00:46:20] Well, there's a little bit of stupid, paranoid, irritability-inducing stuff on Netflix -- some weird conspiracy movies and stuff -- but it's not dominant. You don't automatically get funneled there the way you do if you follow the advice that YouTube's algorithms give you. And I think Netflix is instructive, because there used to be a time when nobody thought they'd ever pay for video, because we could get video for free on torrents, and everything was going to be volunteer from there on out. We would kill capitalism in media. That was, like, a dearly held, universal, passionate, mainstream belief in the internet world at a certain point. And Netflix proved, "Hey, you know what? We can pay for this stuff. If they give us something that's worth paying for, it's not that bad." And I think this idea of peak TV has happened since we started paying directly for TV instead of waiting for advertisers to support something we want to see. This direct connection to the audience -- it's not perfect; I don't love everything on Netflix or Amazon Prime or whatever, but it's better, right? And I keep on imagining: what would peak social media be like? What would peak search be like? And I imagine these things as taking the parts of those services that are currently positive, amplifying them, and getting rid of all the creepy crap.
Jordan Harbinger: [00:47:40] I think a lot of people do exhibit some addictive behavior on social media, though, to your point. In fact, I think you gave a couple of examples of social media kind of becoming the new cigarettes, where it's looked at a little bit as a vice, and I would agree with that to an extent.
Jaron Lanier: [00:47:57] Well, the comparison to cigarettes is interesting to me in a way that I want to spin positively, because we have some examples in our past of mass addictions with commercial connections that we nonetheless were able to address in a reasonable way. With cigarettes, for generations through the 20th century, the cigarette was the cool thing, and being anti-cigarette made you some kind of unsexy fuddy-duddy, and everybody smoked. Whether it was the businessman or the punk, everybody was smoking, because that was the cool, sexy thing. And then somehow, enough people snapped out of that addiction mold to be able to look at it and say, "Man, this has got to be the stupidest thing ever. Why should we condemn a bunch of kids to get lung cancer over this stupid thing?" And we didn't make it illegal. We're not throwing people in jail.
[00:48:48] like we ended up doing with marijuana. We just said, "Hey, you know, we'll keep you out of public places, we'll do a few things like that," and it made a huge difference. Another one is the movement against drunk driving. These were addiction systems tied to a commercial interest, and nonetheless enough people were able to address them. Gambling's another one -- and gambling addiction is technically much more similar to social media addiction than chemical addictions are. So in all of those cases, we recognized that an industry was leading us to ruination and we needed to find some way to steer. We don't have to go to extremes; we just need to steer. And in this case, I don't think we need to ban social media. I don't want to kill Facebook. I just want to reform it by changing its business model. And I think that could be positive for everybody -- for shareholders and for its users. Or at least we've got to try it. If you see something that's just getting worse and worse and nothing you try helps, why not try something different?
Jordan Harbinger: [00:49:52] Yeah, I can understand that. And then there's one of your other arguments, which is that social media contributes to this mass production of misinformation. That wasn't necessarily new for me, because of course we've seen journalism turn into clickbait articles, right? We've seen a lot of people write headlines that are just completely ridiculous, to the point where when you open the article, you're actually annoyed at the journalist and the outlet for having tricked you into reading something that is clearly complete malarkey and mislabeled. But it goes beyond that. Now there are fake people -- literally fake people -- that contribute to even otherwise smart, intelligent people making worse decisions. Can you take us through that process? Because I think people don't realize it. We think, "Clickbait, whatever, I ignore it." It's not that anymore. Now you're getting opinions from people who don't exist.
Jaron Lanier: [00:50:43] Right, right. So this is what I was talking about earlier with social perception. And so -- can I just say something? I write books. I do pretty well at it. I've had bestsellers; I've had bestsellers in a bunch of countries. I have a public life. And I have no social media. Your life doesn't end if you don't have social media. You can still be a public thinker or whatever; it works, you know? But the thing is, because I write books and I'm pretty visible, I get contacted by these people who are trying to sell me fake people all the time.
[00:51:13] And so what happens is, somebody will call me or write to me and say, "I have a business proposition. I have created hordes of fake people for so-and-so and so-and-so and so-and-so" -- all these famous people. And it doesn't cost that much, and you can buy tons of fake people on Twitter, you can buy tons of fake people on Facebook, and it takes the platforms a while to figure it out. They will try to get rid of them, but only gradually, and you can keep on adding more fakes. And what those fake people do is, they don't necessarily communicate directly with any real human. All they do is top up ratings and views and so forth to bias the algorithm to push a particular thing. That's what they're for.
[00:51:53] And then they create fake social perception, so that what you see seems to reflect other people around you being interested in something -- which, on a very deep level, speaks to you. It's like, "To be connected to my world, I will respond to what the people around me are responding to." But they're fake, you know? And so one of the reasons I want to see things monetized more, more like Netflix, is that you can make a million fake people on Facebook pretty cheaply. It's not that hard. They cobble together bits of real information from various real people and just recombine them to make fake people. But you can't make a million credit card accounts. You just can't do it. So there aren't millions of fake users of Netflix -- they just aren't there. There aren't millions of fake people on the Apple App Store.
[00:52:37] There aren't. It just can't be done. And so as soon as people have a bit of skin in the game, all of a sudden they get more real. And I know some people listening are going to say, "But what about the poor?" And yes, we must address that, but I have to point out that the current system is destroying the poor. Look at things like the Rohingya crisis, in which the most vulnerable people are being attacked and destroyed by this very system of manipulation. And this is happening all over the world; I can give many examples. We can't say that this system is good for people who are vulnerable or poor. It's horrifying for them.
Jordan Harbinger: [00:53:12] Can you tell us what's going on with it? Because I'm familiar with the Rohingya crisis, but I don't think everybody is. Can you tell us what that is and what role social media played in this?
Jaron Lanier: [00:53:21] One of the easiest things to do in social media is inject fake content through fake people that creates a sense of paranoia and irritability directed at some group. Okay. Now, why is this so easy? The reason is that the algorithms you're trying to influence with all your fake people have to pick up on some kind of a rise from those who are targeted, right? And the emotions that are the easiest to test for, the ones that arise the most quickly, are the startle emotions, the fight-or-flight emotions. And so fight or flight can translate into fear and rage, but in the more diffuse world of social media, it's paranoia and irritability, right? So these are the highest-value emotional targets that you can go for.
[00:54:09] You can put fake people into Facebook to make a population more racist, less likely to vote, more upset, more angry, but you can't do the reverse. You can't as easily make them kinder. You can't make them more ready to support minorities or vulnerable people. You can do it a little bit in bubbles, but not overall. And in fact, the people who do temporarily get a positive effect are ultimately feeding the evil here in a way they don't realize. As an example in the US, take the Black Lives Matter movement. Black Lives Matter starts as sort of a social media thing, right? A hashtag and so on, on the different platforms, and it feels good. It feels like it's getting somewhere. It's one that I felt very good about and supported. But the thing is, that's just data fuel going into the system.
[00:55:02] The algorithms don't care. They don't care at all. So the algorithms take any information that was uploaded by Black Lives Matter and they're feeding it all over the place to random people to see who they get an effect from because what they want is a rise that they can then use to further addiction and behavior mod, because that's all they can do.
Jordan Harbinger: [00:55:21] Like an emotional rise out of people.
Jaron Lanier: [00:55:23] An emotional or behavior pattern change. When I say emotion, we don't even know exactly what emotion is in the brain. What they're looking for is a measurable change in behavior pattern, which we can detect, and that's correlated with emotions. So what happens is, the horrible people, the people who hate, who are racist and so on, get more immediately, detectably moved by Black Lives Matter than the original people.
[00:55:52] So the negative people get detected, they get introduced to each other, they get reinforced. And so all of a sudden you have this resurgent KKK and neo-Nazi movement in the US that had been really dormant and isolated and fragmented before. Furthermore, bad actors can detect that that's happening through the system, and then they can start feeding it, because it's in their interest as well if somebody is trying to destabilize the society. So what starts out with Black Lives Matter as a positive thing, at least in my judgment, turns into a bigger negative result. And I think you see this whenever somebody tries to do positive social change with social media: it gets flipped around eventually. The most dramatic example was the initially effective use of social media platforms for the Arab Spring turning into the even more devastating use by groups like ISIS.
Jordan Harbinger: [00:56:44] Oh, interesting. Right. So the Arab Spring was getting people together in these countries, letting them communicate, letting them realize they're not this isolated, insular group, so they can all kind of coalesce into a revolution. And then on the flip side, you've got ISIS going, "Great. We can reach out to these isolated-feeling kids in the UK and the US who feel like they're getting picked on and turn them into holy warriors, if they'll just blow themselves up."
Jaron Lanier: [00:57:09] But the thing is, the system finds the people who are annoyed in the first place. It's like an irritability detector and enhancer. Getting to the Rohingya: this is a Muslim minority in Myanmar. Whenever you have something like that, and you have interests that want to make the classic fascist move of saying, "I'm going to get the population to support me by turning it on one part of itself and oppressing the minority," there's this thing that happens where on social media nothing means anything anymore. Everything just turns into memes and hashtags. Like, what does it mean to be a conservative in the Trump era? The actual ideas or policies are all over the place. They bear no relation to the traditional bundle of conservative ideas. Trade is now the opposite of what it was. Immigration is the opposite of what it was.
[00:57:59] Everything's different. Personal behavior standards are the opposite of what they once were; everything's different. But when everything turns into this sort of contest of memes and hashtags over who can be the most irritable, all that's left is some kind of very rotten, simplistic idea of identity, like this artificial idea we have of race, or maybe of blood and soil, you know. And so the traditional fascism that used to be thoughtless has found new ground in this new kind of thoughtlessness. People now have this new high-tech way of getting powerful and promoting their own personality cults and their own new centralized authority by promoting this weird xenophobic racism. And that's why we see this all over the world at once, in countries where it hadn't been present, in countries with nothing else in common at all. It's the only explanation for why both Sweden and Brazil would have this happening at the same time.
Jordan Harbinger: [00:58:56] This makes sense. Yeah. And of course, the Rohingya in Burma, or Myanmar, were living essentially on the border of Bangladesh. And now there's essentially fake news on Facebook, I believe, being pushed to the locals around there saying these people are causing trouble. They're trying to do -- I don't even remember what the accusation was.
Jaron Lanier: [00:59:18] You know, since I'm Jewish, I can tell you that there have been things like this for a long time. We call it the blood libel, where you come up with these crazy stories: oh, they're taking the blood of Christian kids to make their food, or something. This kind of thing has existed for a long time, but it just hadn't been technologically optimized. It hadn't been this thing that could happen so quickly, and there wasn't a direct commercial motivation behind it. It wasn't being run by the biggest US companies out of Silicon Valley, right here where we live in this beautiful Bay Area. This development is a spiritual disaster. It's a profound embarrassment, and there's a tragedy in the tech world.
Jordan Harbinger: [01:00:00] Yeah. So this Rohingya population has essentially been forced to flee because people are coming after them. That's right, it was the blood libel. It was something like, they're killing a bunch of local babies, or something.
Jaron Lanier: [01:00:11] Yeah, because to spread these things, the stupider and weirder, the better, you know. And it's not just the Rohingya. There are similar scenarios in rural parts of India, in parts of Africa, and of course the rise of this weird neo-Nazi, KKK phenomenon in the US, and in Europe, the rise of these neo-Nazi parties. And you can say, "Well, it's because of the immigration crisis there." But once again, it even happens in parts of Europe where there isn't really an immigration crisis. There might be a sense or fear of one. It correlates with the arrival of Facebook more than with other events.
Jordan Harbinger: [01:00:49] In a way, the reason this works so well is because negative emotions are essentially the lifeblood of social media, in a lot of ways.
Jaron Lanier: [01:00:56] Well, negative emotions are kind of like the high-octane ones. Like, if you go to the gas station, you know, you can get this more powerful gas. The negative emotions rise faster. They're easier to detect. In the overall scheme of human life, if you look at the study of what kinds of feedback influence behavior, there's a parity between what you can call negative and positive emotions, if you can accept that grouping, which requires a bit of a leap of faith, admittedly. So it's not so much that negativity drives humanity. It's just that in this particular context -- it's a little bit like high-speed trading, where you have these automated systems that are just trying to pick up quickly on how people respond -- negativity is more powerful.
Jordan Harbinger: [01:01:37] So making us feel bad contributes to our use of the platform because essentially we get triggered by something.
Jaron Lanier: [01:01:46] Well, this is a weird thing about behavioral addiction. I don't know if you've ever had a friend who had a gambling addiction.
Jordan Harbinger: [01:01:50] I'm sure that I have. I'm trying to think of one that actually told me about it, might be a different story, but yeah.
Jaron Lanier: [01:01:56] Yeah. It's not great. I mean, we've all had friends with addictions, and most of us, if we're honest, have had addictions ourselves. This is part of the human experience. People with gambling addictions are sure that they're different. They have the special system, they have luck, real luck. They're sure they're the exception. But the interesting thing to me is that what they're addicted to isn't that moment when they win, but this whole cycle in which they're usually losing. This is something I've noticed in people with heroin addictions, of whom I've known more than a few, being a musician through the seventies, eighties, and nineties. People who have a heroin problem are hooked on this whole experience where most of the time it's horrible, and then there are these moments of ecstasy. So you become hooked on the whole experience, and social media addicts become hooked on this whole cycle where most of the time they're getting punished and every once in a while they get this reward. And I believe this is why some of the most prominent social media addicts deliberately seem to say stupid things that will humiliate them online: because they want that punishment as part of the cycle.
Jordan Harbinger: [01:03:01] That's interesting. So all this controversy, them getting beat up in the media, people thinking they're an idiot. You feel like that's part of their addictive behavior?
Jaron Lanier: [01:03:09] Yeah. Well, think about Elon Musk calling a diver a pedophile out of the blue. What's he doing? He's got a terrible addiction problem. And it's to the point where people in his company, investors, have said, "Get off Twitter!" Why would he do that? He's doing it because he's addicted to that whole cycle, and he needs the punishment as part of it. I see the same pattern in our current president. I see it in Kanye. A lot of the people who degrade themselves in public, who would seem to be skillful, intelligent people who've built successful careers and yet do this ridiculous thing -- that's their addiction. And furthermore, hear this, there's something else interesting about this. In the past, the powerful male persona in the world had mystique. And mystique is this mystery where you don't let your vulnerability show.
Jordan Harbinger: [01:04:01] I don't have that. Yeah, at all.
Jaron Lanier: [01:04:03] Yeah, I don't particularly either. But the thing is, they traditionally did, you know. If you think about, oh, I don't know, Marlon Brando or Ronald Reagan, there's this persona, and it might be very nice in some cases, but there's always this sense that you don't really know what would tick them off. They're not announcing it. These new ones who are acting like crybabies -- what's going on? How does it work? Why do people like seeing somebody who's humiliating himself all the time, like Trump? I think the reason is that all the social media addicts out there see themselves in it. So they relate.
Jordan Harbinger: [01:04:38] Yeah, there might be something to that. And I think also, is there an element of -- and this is just sort of a harebrained thing on my end, but I've sort of picked this up over Christmas, and now that I think about it, I've known this my whole life -- a lot of advertising seems designed to make me feel, you know, FOMO: I need this. I feel bad that I don't have it. I'm positioned as less-than for not having been a part of this. And it's not just items that I need to buy anymore; now it's everything. It's experiences, whether or not it's even for sale. It's like part of the platform is to just make me feel less-than.
Jaron Lanier: [01:05:18] Well, like I say, I think traditional advertising was often annoying and often went over some sort of line for me, being too manipulative. But overall, I think it served the purpose of civilization and betterment, just because it helped modernity move along, and things actually have gotten better with modernity. So when you have these individualized feeds that are calculated to manipulate you, I really do think it's something entirely different. You're being made to feel bad just as part of the addiction process, and it's not even about getting that new car anymore. And in fact, there's a sort of arms race: if everybody's trying to manipulate you, everybody feels that they're blackmailed into paying an existential tax to Facebook, because otherwise the other people will manipulate you. There's a kind of parity that arises. And so I think it really takes off on its own and just becomes part of this other weird religion of feeling that the central server must be the new god or the new king that runs everything. And there's another part of it we haven't talked about, which is that a lot of the people who run these things think they're building the new AI that'll take over and replace humans. So there's that religion as well, but it takes on this very strange momentum of its own.
Jordan Harbinger: [01:06:33] Back to social media making all of us triggered and emotional: I feel this too. Whenever I'm online, I often have to check myself, because someone will say, "Hey, I didn't like this one minute thing," and if they had told me that in person, I'd be like, "Oh, thanks for the feedback," or, "Yeah, whatever." But if it's done on Twitter, or in the wrong way in an email, or on Facebook -- I will catch myself being a horrible person.
Jaron Lanier: [01:07:00] Yeah, that's accurate. And this is interesting; I haven't talked about this. The way that people turn into assholes online predates the advertising model I'm talking about. In fact, if somebody wanted to argue against me, they could say, "Hey, there was this thing even before this stuff," and they'd be correct. In my view, what happened is the advertising model merged with this other thing that was going on, where people were making themselves into assholes. And that weird asshole-making thing started early. We already knew it could happen back in the late seventies, certainly in the eighties, with the really early prototypes of social networking. And in fact, I decided to cut out of that world back then because I didn't like what it was doing to me. So there is something very powerful there, and it's been studied a lot. A lot of people have worked on exactly why it is, what this asshole-making thing is. I present some of my own theories in the book. I don't think we totally understand it, honestly. But it's definitely something that's intrinsically there. Even in something like straight email, it was always there.
Jordan Harbinger: [01:08:05] It's pack behavior at some level. I'm sure.
Jaron Lanier: [01:08:07] Yeah. That's my theory: it turns you from a lone wolf into a pack wolf. You change from being primarily a scientist to being primarily political. That's maybe somewhat cryptic, but that's my theory about it.
Jordan Harbinger: [01:08:21] In the book, you say: if triggering emotions is the highest prize and negative emotions are easier to trigger, how could social media not make you sad? If your consumption of content is tailored by near-limitless observations harvested about people like you, how could your universe not collapse into the partial depiction of reality that people like you also enjoy? And that's kind of a bummer, right? Because, wow, it's optimizing for making us feel bad so that we engage more, to further our addiction. It's like, "Ugh, oh my gosh, I need a shower."
Jaron Lanier: [01:08:56] Well, look, quitting is not that hard. People do it. There have been studies that followed people after they've quit. And by the way, quitting means really quitting. Deleting the app from your phone doesn't actually delete the surveillance, and it doesn't change the effect on you, because there are so many tendrils by which these manipulation machines affect the things you see. It might still affect what you see on the news site you like, for instance.
Jordan Harbinger: [01:09:21] Shoot. So we don't even think about that. You have to unplug and toss a grenade in there or something.
Jaron Lanier: [01:09:27] Yeah. You have to delete, and it's not that hard, and you can. I don't want to promote anybody's particular thing, but you can get privacy-oriented browsing extensions or whole browsers. You can turn off autoplay on YouTube, and you can use YouTube without a Google account at all, so that it doesn't know who you are. You can do these things, and suddenly the manipulation machine is at least subdued, and everyone who does that reports that their lives get better. They get better informed more quickly, they feel happier, they have better relationships. I just don't think this is even ambiguous. It really seems to help people, with the exception of those who really have a special need that's addressed by the technology. The example I use in the book is people with unusual medical conditions who have found each other through a particular platform,
[01:10:15] and by all means, if it's doing something special for you, don't change on my account. Use it if it's really working for you. But then there's the average person who thinks they're immune to all this stuff. I run into journalists all the time: "Oh, you're just so good at Twitter. It's just funny." And it's like, yeah, that's my gambling friends saying, "Oh, you just don't get it. I'm lucky." It's hard to cut through addiction. All we need is a tiny minority of people to break with this stuff in order to have a community that can talk about it, so that we can talk about it from outside of its own addiction system. That's what we need as a society.
Jordan Harbinger: [01:10:50] Yeah. We'll talk about how bad Twitter, Google, and Facebook are on WeChat -- so that Xi Jinping can take a look at it. But I love your rule of thumb about which platforms are bad for you. Would you take us through that? Because you'll probably state it better than I can.
Jaron Lanier: [01:11:09] Well, what makes a platform bad is that it's optimized for third parties who are paying out of a belief that they can change your behavior. The way to test if it's really bad is whether bad actors have made a practice of using it. So, was Putin there?
Jordan Harbinger: [01:11:26] Did Putin use it to influence some poor country's elections, or our country's elections, right? Still debating that.
Jaron Lanier: [01:11:34] All right, so if you use that criterion, the really bad ones are the various Facebook platforms, including Instagram, WhatsApp, Facebook Messenger, and normal Facebook; and Twitter -- and it hurts me to include Twitter, because I know the Twitter people and I like them, and to me, Twitter is a great tragedy. It's ruining the world and it's not even a solid business. It's awful. Not all of Google, but certainly the YouTube part of Google, and a lot of the search experience, although you can adjust the search experience through enough very careful blocking of cookies and not having an account and all that to make it cleaner. It's possible. Then there are others that are sort of in a gray area. Also, very, very sadly, a lot of the online forum world, the places where people connect on lesser-known forums, is very compromised. You see this especially anywhere gamers congregate, because they skew toward young men,
[01:12:29] and a lot of the bad actors really focus on them. So you have a lot of this stuff happening through Reddit, for instance. Those, unfortunately, are bad, and once again, it hurts me to say that, because I'm old enough to know when those were started. They were started with tremendous idealism and optimism, and it's a horrible thing to have to say they've become this bad. And there are some that are kind of teetering on the edge, like Snap: some things have happened on it, but not as bad. Since I have a connection to Microsoft, I don't want to toot the Microsoft horn, because I don't feel that can be credible. But I think whenever you have a social network where people have some kind of skin in the game, something at stake other than mind games, all of a sudden their better angels come out, and you see that on GitHub and maybe a little on LinkedIn.
Jordan Harbinger: [01:13:14] I was going to say LinkedIn because you go there and you kind of think this is sort of for my job, maybe I shouldn't jump down people's throats or like post something completely asinine here.
Jaron Lanier: [01:13:25] If you're going to be kind, compassionate, and responsible in any environment, part of the reason is enlightened self-interest: when you make a better world, that's also your better world. And if you have no skin in the game at all, if you're just this anonymous ghost -- from your point of view, anyway; from the manipulation machines' point of view, they know all about you and they're manipulating you -- if you're pretending nobody knows who you are, you don't have skin in the game, and you lose the opportunity for that enlightened self-interest. So when you do have something you care about, like your career on LinkedIn or your code on GitHub, and I can give you some other examples, it doesn't make things perfect. There will still be annoyances. People will still sometimes be jerks.
[01:14:10] People will not be perfect, but it doesn't become this dark pit of endless degradation. It's not that bad anymore.
Jordan Harbinger: [01:14:19] I've read that you're an optimist, but our conversation here might not signal that to everyone. Tell us why and how you see the bright side of all this or a solution in all of this.
Jaron Lanier: [01:14:30] Well, there are a few things. One is that if you look at history, you see our dear species making it through tight squeezes and difficult times. And so that leads to optimism that this is something we can do. What's different now is that we're facing a bunch of them at once. We're making ourselves insane with this stupid communications technology at the same time that we have to face climate change, and when you have combined challenges, maybe that increases the odds that this time we won't make it. And they're coming on fast and furious, and that's bad.
[01:15:02] But still, we have an incredible track record of survival through all kinds of things, many of them brought on by technological change. That gives me a sort of empirical baseline for optimism. Another thing is, I really like some of the young communities in tech. I'll give you an example from the blockchain world. I used to be so cynical about blockchain. If you'd interviewed me a couple of years ago, I would have said, "Oh, that's a bunch of get-rich-quick scammers. They want to make the largest possible carbon footprint for their security, so they're willing to destroy the earth just to feel more secure. And plus, it's fake security, because you're making this mathematically perfect security, and then at the edges of it, that's where people scam," which happened a lot with Bitcoin exchanges and such. So I just thought the whole thing was stupid.
[01:15:51] But now that it's gotten shaken out a bit by coins losing value, the people who are left are this really large, substantial, technically adept, and really optimistic and interesting new generation of techies who want to do good for the world. I think they're learning lessons from how earlier generations like ours screwed it up, and they give me optimism. I look at them and I think, "Yeah, there is a future here." Obviously, it's going to be future generations, it's going to be young people, who fix this. And I think I'm seeing signs of great intelligence and warmth and good intent there. I see engineers and managers at the big tech companies organizing, protesting, taking risks with their own careers because they want a better world. That wasn't true a few years ago. That's something new and, I think, incredibly valuable, incredibly heartening. So I'm actually kind of optimistic right now.
Jordan Harbinger: [01:16:48] Jaron, thank you so much. It's been really interesting. And by the way, I heard you have like 1500 rare instruments or maybe they're not all rare, but…
Jaron Lanier: [01:16:56] Well, all right. We were talking before about how everybody has addictions; I have this weird thing. My mom taught me music, and then she died when I was little. I somehow have this connection to her through playing music, but the form it took is that I always need to be learning a new instrument, because I was learning from her. That was really my connection. And so I've just ended up learning one instrument after another, and at this stage in my life, that means I really do have a lot of instruments, and a lot are from different periods in history and different parts of the world.
[01:17:30] And I've studied music in all parts of the world. I'm always learning a new one, and it's just this weird obsession. And yeah, we live in a forest of unusual instruments.
Jordan Harbinger: [01:17:39] Can you play each one or are there some where you're just like, “Hey, this is kind of out there.”
Jaron Lanier: [01:17:44] I can play the vast majority of them. There are some that I got up to speed on and then just lost completely, because it would have required too much ongoing work to stay there. But I seem to have an ability to at least play decently on instruments I haven't touched in a while, if I got to a certain point with them before. It doesn't mean I'm a virtuoso on everything. But, if I may say so, I'm actually pretty good on a lot of them.
Jordan Harbinger: [01:18:05] What's the strangest instrument, one where people would just say, "How is that an instrument?" You know, we've heard of glass shapes being played, and a friend of mine made a documentary about people playing roots in the jungle.
Jaron Lanier: [01:18:19] Well, the glass armonica, which was invented by one of our founders, Benjamin Franklin: these spinning, wonderful little glass bowls you can move your fingers on to create this ethereal sound.
Jordan Harbinger: [01:18:30] So that these bowls spin and you just put your fingers on the edge of the bowl?
Jaron Lanier: [01:18:34] Yeah. And you play it like a keyboard, very carefully. Benjamin Franklin invented it when he was doing what was, in those days, kind of foreign influence work, you know, manipulation and such on behalf of the Americans and the Revolution, in France. And he heard somebody playing a bunch of wine glasses set up the way you can play wine glasses, by running your finger along the edge. And he had this idea of mounting them so they could all spin at the same time and then playing it like a keyboard. And there's so much to say about this.
Jordan Harbinger: [01:19:09] And you have that at home?
Jaron Lanier: [01:19:09] Oh yeah, I can play that. That's not that rare, but yeah, sure. Everybody has to play the glass armonica, are you kidding? It has a very haunting, beautiful sound. It was unlike anything people had heard at that time. And there was this psychologist who was interested in hypnotism and subconscious effects and how people could enter different states of mind. His name was Mesmer, and he used it to put people into that state…
Jordan Harbinger: [01:19:33] A hypnotist named Mesmer? That can’t be a coincidence.
Jaron Lanier: [01:19:35] No, no. That's where "mesmerizing" comes from. And his technique to mesmerize was the glass armonica. And then Franklin met a young woman who was blind, who became a virtuoso. She toured Europe playing them, and that inspired Mozart and Beethoven to write for the instrument. But the problem is that the early glass was leaded, and so it made people crazy.
[01:19:55] So the glass armonica players got this reputation for being really nutso and wild.
Jordan Harbinger: [01:20:02] Wait. So the lead would seep through the glass into your fingers when you played it, and you'd go brain-dead?
Jaron Lanier: [01:20:09] So Franklin brought one back to Philadelphia over the ocean, and he played it to wake up his wife, and she thought she had died, because it was so unlike anything in her experience that she initially thought it must be something from the afterlife. And I'll tell you one other story about it. I have an early manual for players, and what it focuses on is which water from which wells throughout the United States is adequate, because the water composition is really important.
Jordan Harbinger: [01:20:32] Oh, there's water in the glass?
Jaron Lanier: [01:20:34] Oh yeah. You have to dip. You dip your fingers to keep the bowls wet, to make it easier to play.
Jordan Harbinger: [01:20:40] And they say, it just depends on the water that…
Jaron Lanier: [01:20:44] And it does, it does. You have to really think about that. So these days we can add stuff to the water, but in those days it was about which well.
Jordan Harbinger: [01:20:49] Wow, that is totally unusual.
Jaron Lanier: [01:20:52] Oh, that's not it. No, that's mainstream. You don't even know. Oh my God.
Jordan Harbinger: [01:20:57] What's the kookiest thing in the house?
Jaron Lanier: [01:21:02] Oh my God. I wish we were doing this up there so I could play it for you. The pin pia is pretty good. It's played by people who live in the hills between Thailand and Myanmar. Imagine, if you will, a sort of long stick with a crosspiece on it, and strings strung across it in such a way that the strings hook against each other. When you pull on a string, it creates this weird interaction between all these connected strings, which is similar to what anybody who's into synthesizers would know as a ring modulator. This thing is connected to a half coconut, or a carved-out piece of hardwood, that you hold against your heart, and then you start playing it. It's a courting instrument. And so what happens is that your chest cavity amplifies it.
[01:21:51] So this was a very cis-hetero kind of culture, as far as I know anyway. So then, as it was told to me, the woman can hear your heart in the literal sense when you play this thing. And there are these little bronze tips where all the strings connect, and it turns out that for hundreds of years, nobody has been able to make bronze tips that sound as good as the really old ones that survived. It's kind of like our problem with Stradivariuses, if you believe that's a real problem. And so when I was there, you compare the old tips to the new tips and there's something different. It's something about the brass or something about the casting. So I made a deal with some of them: I brought them and gave them a digital recording studio in exchange for brass tips.
[01:22:37] I have an old one. And I was so excited, and I flew back to New York, where I was living very, very close to the bottom tip of New York City, in a beautiful loft at that time. I'd gotten in very late the night before, and I got up in the morning, having forgotten, ready to put my pin pia together, and it was September 11th. My loft collapsed and a lot of stuff got broken, but not the pin pia, and that particular one, given its history. I've never gotten it to work as well as what I heard in the jungle, but that's a pretty amazing, wonderful, astonishing instrument. This is one of the nicest traditions that's obscure, I guess.
Jordan Harbinger: [01:23:20] That's all I got, man. Jen wants to come in and I know we're over time. I was just curious, because I was like, "1,500 instruments. You've got to have something crazy."
Jaron Lanier: [01:23:27] More than you'd believe.
Jordan Harbinger: [01:23:28] Yeah, I can only imagine. Jason, this was your recommendation. I think this was a really good show. Good recommendation. Thanks, man.
Jason DeFillippo: [01:23:37] No problem, man. I've been following Jaron for years. I mean, he's one of the founders of VR, so you know he's been around for a long, long time. But now that he's taking on social media and the effects that it has on people, on society, and on our mental well-being, I thought we just had to have him on, because it's in the zeitgeist right now. Everybody's talking about this because they keep screwing up. They keep screwing up bad. So I'm glad that we could share this with our audience at this time. This is fantastic.
Jordan Harbinger: [01:24:02] Yeah, he's really interesting. A really smart and interesting guy. And of course, he's got a bunch of books. We'll link to a few of them in the show notes. You know, he offered, next time, to let us come over to his house and see his instruments and mess with some of them, because he's got 1,500. I know we covered this a little on the show, 1,500-plus instruments, and I think once you get past like five, six, seven, I just run out of instruments that I know exist. So that's pretty neat. His house must look like something out of Hoarders, music edition, but there's got to be some cool stuff in there.
Jason DeFillippo: [01:24:37] Yeah. It sounds more like the warehouse at the end of Raiders of the Lost Ark if he's got 1,500, you know, that many of them.
Jordan Harbinger: [01:24:44] Yeah. Where they sort of slide it in there and it's like, “Yep, this is the…”
Jason DeFillippo: [01:24:49] Yup, here's the didgeridoo from, you know, 2,400 BC? Or something like that. I'll definitely be flying up if we get to do another show from his house. That sounds awesome. I love old instruments and rare stuff like that. That just sounds super fun.
Jordan Harbinger: [01:25:01] Yeah, definitely. I think it'll be like a museum trip. If you want to know how I managed to book all these great people and manage my relationships using systems and tiny habits, check out our Six-Minute Networking course, which is free over at JordanHarbinger.com/course. And look, I know you're going to do it later. Sure, you are. The problem with kicking the can down the road is that you cannot make up for lost time when it comes to relationships and networking. This is a mistake I see very often when people are talking about this stuff with me. Dig the well before you get thirsty. You can't leverage relationships once you need them; by then, it's too late. The drills take a few minutes per day. Quit crying. Go to JordanHarbinger.com/course and get after it.
[01:25:38] Speaking of building relationships, tell me your number one takeaway here from Jaron Lanier. I'm @JordanHarbinger on both Twitter and Instagram.
[01:25:46] This show is produced in association with PodcastOne, and this episode was co-produced by Jason "Unfollowed" DeFillippo and Jen Harbinger. The show notes and worksheets are by Robert Fogarty. I'm your host, Jordan Harbinger. The fee for the show is that you share it with friends when you find something useful, which should be in every episode. So please share the show with those you love and even those you don't. Lots more in the pipeline. A lot of great stuff coming up in the next few months. In the meantime, do your best to apply what you hear on the show so you can live what you listen, and we'll see you next time.
[01:26:17] A lot of people ask me which shows I recommend, which shows I listen to, and you know, it's funny, The One You Feed, with the two-headed wolf here, is one that I find myself recommending often. I've got Eric Zimmer, the host of The One You Feed, and he recently interviewed Tim Pychyl. I actually interviewed him a long time ago. Procrastination is a topic that I should look into, but maybe I'll do it tomorrow. Eric, tell me about this episode.
Eric Zimmer: [01:26:41] Yeah. Tim Pychyl is one of the world's leading researchers on procrastination. That's what he does. He's a professor at a university, and he knows more about it than perhaps anybody on earth. Most of us are pretty familiar with procrastination ourselves as a problem that we wrestle with pretty regularly. So in this episode, he gives us lots of great strategies for how we can stop procrastinating and really get on with life. One of the most important things he says is that we delay our lives by procrastinating, and when we start to really think about the cost of it, we recognize it's big. There are lots of great tips in this episode to overcome it.
Jordan Harbinger: [01:27:27] If you're looking for The One You Feed, search for The One You Feed in a podcast app, and of course we'll link to this episode in the show notes as well.