Do personalized ads enhance our online experience or intolerably invade our privacy? Andrew Gold joins us on this Skeptical Sunday to uncover the truth!
On This Week’s Skeptical Sunday:
- Personalized advertising relies on algorithms that collect and analyze user data to tailor ads to individual preferences.
- While personalized ads can be effective in engaging users and providing relevant content, they raise concerns about privacy invasion and potential manipulation.
- Voice-activated devices like Amazon’s Alexa and Apple’s Siri are designed to listen for wake words, but there have been instances of accidental recordings and privacy breaches.
- The use of personal data in advertising can lead to discrimination and perpetuate existing inequalities.
- Stakeholders including tech companies, advertisers, regulators, and users have roles to play in ensuring ethical and privacy-respecting practices in personalized advertising. Users can — and should — take steps to protect their privacy and advocate for stronger privacy protections.
- Connect with Jordan on Twitter, Instagram, and YouTube. If you have something you’d like us to tackle here on Skeptical Sunday, drop Jordan a line at jordan@jordanharbinger.com and let him know!
- Connect with Andrew on Twitter and Instagram, and check out On the Edge with Andrew Gold here or wherever you enjoy listening to fine podcasts!
Like this show? Please leave us a review here — even one sentence helps! Consider leaving your Twitter handle so we can thank you personally!
Please Scroll Down for Featured Resources and Transcript!
Please note that some of the links on this page (books, movies, music, etc.) lead to affiliate programs for which The Jordan Harbinger Show receives compensation. It’s just one of the ways we keep the lights on around here. Thank you for your support!
Sign up for Six-Minute Networking — our free networking and relationship development mini course — at jordanharbinger.com/course!
This Episode Is Sponsored By:
Miss the interview we did with sleep doctor Matthew Walker? Catch up with episode 126: Matthew Walker | Unlocking the Power of Sleep and Dreams here!
Resources from This Episode:
- The Story of Our Rooms | BBC News
- How Companies Learn Your Secrets | The New York Times
- Revealed: 50 Million Facebook Profiles Harvested for Cambridge Analytica in Major Data Breach | The Guardian
- What Is GDPR, the EU’s New Data Protection Law? | GDPR.eu
- 11 Ways VR and AR Stand to Impact Advertising, Marketing, and PR | Forbes
- Privacy in the Age of Psychological Targeting | Current Opinion in Psychology
- ‘Alexa, Are You Invading My Privacy?’ — The Dark Side of Our Voice Assistants | The Guardian
- Google Nest: Built-in Microphone Was Never Disclosed to Users | CNN Business
- Sex Toy Company Admits to Recording Users’ Remote Sex Sessions, Calls It a ‘Minor Bug’ | The Verge
- Americans Reject Tailored Advertising and Three Activities That Enable It | SSRN
- Potential for Discrimination in Online Targeted Advertising | Proceedings of Machine Learning Research
- Facebook’s Ad Delivery Could Be Inherently Discriminatory, Researchers Say | The Verge
- The Cambridge Analytica Scandal Changed the World — But It Didn’t Change Facebook | The Guardian
- All the New Privacy and Security Features in Apple iOS 16 | Wirecutter
- Speak Freely | Signal
862: Targeted Ads | Skeptical Sunday
[00:00:00] Jordan Harbinger: Special thanks to Airbnb for sponsoring this episode of The Jordan Harbinger Show. Maybe you've stayed at an Airbnb before and thought to yourself, "Yeah, this actually seems pretty doable. Maybe my place could be an Airbnb." It could be as simple as starting with a spare room or your whole place while you're away. Find out how much your place is worth at airbnb.com/host.
[00:00:18] Coming up next on The Jordan Harbinger Show.
[00:00:21] Andrew Gold: As algorithms and data collection methods become more advanced, personalized ads could become even more intrusive and pervasive. We're talking about the devices already doing this, like smartphones and wearables, simply creating an even larger pool of information about you. With enhanced AI, data mining and machine learning will also be even better at understanding our habits.
[00:00:47] Jordan Harbinger: Welcome to the show. I'm Jordan Harbinger, and this is Skeptical Sunday, a special edition of The Jordan Harbinger Show, where a rotating co-host and I break down a topic you may have never thought about, open things up, and debunk some common misconceptions. Topics such as why the Olympics are kind of a sham, why food expiration dates are nonsense, why tipping makes no sense and is maybe even a little bit racist, fast fashion, weddings, recycling, banned foods, toothpaste, chemtrails, and a whole lot more.
[00:01:14] Normally, on The Jordan Harbinger Show, we decode the stories, secrets, and skills of the world's most fascinating people. And turn their wisdom into practical advice that you can use to impact your own life and those around you. We have long-form interviews and conversations with a variety of incredible people from spies to CEOs, athletes, authors, thinkers, and performers.
[00:01:33] And if you're new to the show, or you want to tell your friends about the show, or both, our episode starter packs are a great place to begin. These are collections of some of our favorite episodes organized by topic. New listeners there can get a taste of everything we do here on this show. Topics like persuasion and influence, China, North Korea, scams, conspiracy theory debunks, crime, and cults, and more. Just visit jordanharbinger.com/start, or take a look in your Spotify app to get started.
[00:01:59] By the way, if you use the Stitcher app to listen to this show, they are getting rid of that app. August 29th, it will no longer be useful. So switch to a different app if you use the Stitcher app to listen to this podcast. If you're on Android, I suggest Podcast Addict. It might not be as pretty, but it works really well. If you're on iOS, Apple, you should use Overcast, in my humble opinion, or Apple Podcasts, but definitely no longer Stitcher. It will not update anymore in the next couple of months. So if you're using the Stitcher app, now's a good time to switch to a new podcast app. And if you have any problems with this, you're kind of a Boomer in terms of your tech, you don't know what to do, you can always email me, jordan@jordanharbinger.com. I will try to point you in the right direction, but the Stitcher app will no longer work for this show.
[00:02:42] Today, we're discussing personalized ads, a topic that has garnered a ton of attention and debate in recent years. Are they a convenient way to enhance our online experience by getting targeted advertising, or do they cross the line and invade our privacy?
[00:02:58] Little note here, just after this was recorded, of course, Google has announced that Chrome will no longer be using cookies in the way that we explain them in the show. Of course, advertisers are still going to be getting our info, tracking us, targeting ads, even if the form that tracking takes is something different than a cookie as we know it now. So just keep that in mind, and no need to email me and let me know, because I know we just recorded this and, of course, the week later, everything gets shaken up. But the rest of the episode, to our knowledge, is still up-to-date and accurate.
[00:03:25] To help us delve into this topic, I'm excited to welcome journalist and host of the On the Edge with Andrew Gold podcast. Well, yeah, Andrew Gold.
[00:03:32] So, Andrew, let's start with the basics. What are personalized ads? How do they work? I think we've all seen these and gotten creeped out by these.
[00:03:39] Andrew Gold: Yeah, I agree. Actually, I think we've all had those moments where we've been a little spooked by personalized ads. You know, when you were just talking to your friend about the kitchen being dirty, and there's suddenly an ad for vacuum cleaners on all your socials, all your devices, everywhere you look, there are vacuums.
[00:03:56] Jordan Harbinger: Right, that's a problem when you're a podcaster who has to research and talk about weird topics for your job. You know, you're like, uh, child trafficking, and it's like, ooh, what kind of targeting am I going to get from this? And your devices, they start to seem like something out of the Twilight Zone.
[00:04:10] Andrew Gold: Yeah, just from having said Twilight, you'll probably be inundated with ads for memorabilia featuring Kristen Stewart, or was it Kirsten? I don't even know. And Robert Pattinson, werewolves, vampires, and all those things.
[00:04:21] Jordan Harbinger: I'm on the market for Robert Pattinson full-size cardboard cutouts, so it couldn't be more welcome. But yeah, camping, roasting marshmallows, catching fireflies, possibly.
[00:04:30] Andrew Gold: Yeah, that's twilight in the traditional sense. Before the word was hijacked by a book and movie franchise. But anyway, what's causing these spooky ads that seem to know what you're thinking before even you do? Personalized ads are also known as targeted or behavioral ads. And they are advertisements that are tailored to individual users based on their online behavior, interests, and demographic information. Advertisers use complex algorithms and data collected from various sources such as browsing history, social media activity, and online purchases, to create a profile of each user. This profile is then used to serve ads that are more likely to be relevant and appealing to the individual.
[00:05:10] Jordan Harbinger: Okay, so on the surface, it's not that surprising. It's not that controversial. I mean, gee, they look at stuff I like and they show me more of the stuff that I like. Big deal. What are the actual concerns around personalized ads? Maybe I want them to show me new stuff that I can waste money on. I mean, what's the big deal?
[00:05:24] Andrew Gold: Well, look, we're living through a very strange moment in time when AI is becoming smarter than us. And that's now being spoken about a lot. But the other thing that rapidly advancing technology has brought on is our becoming accustomed to being listened to. Historically, humans didn't really have privacy. If you go back centuries, an entire family would sleep in the same room. Even in London at the turn of the 20th century, writer Jack London was venturing into the city's poverty-stricken East End, where several families would share one small room. So you can only imagine the sounds, smells, and sexuality that were shared, overheard, and eavesdropped on. Children would hear their own parents having sex, for example.
[00:06:05] Jordan Harbinger: Okay, there's a lot here, but is the guy's name really Jack London, and he lived in London?
[00:06:09] Andrew Gold: Yeah.
[00:06:10] Jordan Harbinger: Because that's weird. Or is it like a pen name?
[00:06:11] Andrew Gold: And I think he was American.
[00:06:13] Jordan Harbinger: Oh, that's even more ridiculous somehow. And is the East End of London nice now, or is it gross still?
[00:06:19] Andrew Gold: It took a long time. So he was there in the very early 1900s, and the writing he did was a little bit like George Orwell's. Orwell did Down and Out in Paris and London, a similar thing, I think 20 years later or so, living with the poor people. I suppose it would now be seen as sort of poverty tourism.
[00:06:33] Jordan Harbinger: Mm-hmm.
[00:06:34] Andrew Gold: But back then, they didn't have concepts like that, at least, it wasn't spoken about or written about. But decades later, the East End has gradually become very much gentrified and fashionable nowadays. So yeah, it's quite a lot nicer now.
[00:06:47] Jordan Harbinger: It's like Brooklyn.
[00:06:48] Andrew Gold: Yes, it is like Brooklyn, yeah.
[00:06:50] Jordan Harbinger: In the '80s, you can't go there, and now it's like, ooh, you live in Brooklyn. Fancy, what a yuppie.
[00:06:55] Andrew Gold: Exactly.
[00:06:55] Jordan Harbinger: Probably drinks pour over coffee.
[00:06:56] Andrew Gold: Mm-hmm.
[00:06:57] Jordan Harbinger: But my mind is just racing right now with how gross all of that would have been back then with all the people in one room. I mean, it's not just your parents that you hear having sex or see having sex; it's the other family in there too. I just, ugh, I mean, oh, someone's got the sh*ts? All right, everyone gets to enjoy that for a few days. And then, get their own version of it, kind of right after the fact. It's just really gross. Yikes.
[00:07:21] Andrew Gold: Yeah, well, privacy became the luxury of the wealthy, and then fast forward a century, and many of us have become used to living quite privately. And that is a huge privilege compared to the rest of recorded history. And now, all of a sudden, we're being told, "Ah, well, you are still private, but you're sort of being tracked and listened to, and you just don't know when, where, or by whom."
[00:07:44] Personalized ads have become ubiquitous across the Internet, and they have undoubtedly changed the way we interact with online content. Online companies often track our data through cookies. You'll have seen those annoying pop-ups asking you to accept all cookies. What a lot of us don't even realize is that we can actually click reject, and the computer won't blow up or anything; you can still reject them, and the website still works. The cookies can be useful because they can help you store your login details so you can sign in each time without typing your password in. Websites also track your data with pixels, for example. Pixels will tell you if someone has opened your email or visited your website. And there is also IP address tracking so they can see what you're getting up to.
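[For the technically curious: a tracking pixel is just a tiny image request whose URL carries identifying data back to a server when your email client or browser fetches it. A minimal Python sketch of the idea, using a made-up domain and hypothetical parameter names rather than any real ad network's API:

```python
from urllib.parse import urlencode, urlparse, parse_qs

def build_pixel_url(base, user_id, campaign):
    """Embed identifying data in the query string of a 1x1 image URL.

    The sender puts this URL in an <img> tag; fetching the image
    silently reports the parameters back to the server.
    """
    query = urlencode({"uid": user_id, "campaign": campaign})
    return f"{base}/pixel.gif?{query}"

def log_pixel_request(url):
    """What the server side recovers when the image is requested."""
    params = parse_qs(urlparse(url).query)
    return {key: values[0] for key, values in params.items()}

url = build_pixel_url("https://ads.example.com", "u-12345", "spring-sale")
print(log_pixel_request(url))  # {'uid': 'u-12345', 'campaign': 'spring-sale'}
```

The point of the sketch: the "image" itself is irrelevant; the request for it is the data channel, which is why opening an email can tell the sender who opened it and when.]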
[00:08:30] On the other hand, they can make the browsing experience more engaging by showing us ads, as you say, for products and services that genuinely interest us. And who doesn't enjoy that convenience? I've had a bad neck, and I keep seeing ads for videos and tools I can buy to help it. I'm being prompted to think again of ways to solve what's ailing me. But also, personalized ads have raised serious concerns about privacy, data security, and the potential for manipulation.
[00:08:57] Jordan Harbinger: Right, given the controversial nature of some of the guests that I research on the show, as I mentioned before, I mean, you just don't even want to know what kind of stuff pops up in my feed. Let's dive into some of these controversies, by the way. Can you give us some examples of instances where personalized ads have gone just a bit too far or even just been perceived as invasive?
[00:09:15] Andrew Gold: Absolutely. There was one infamous example, the case of Target, a large retail chain in the United States that you'll all be familiar with. And even I am, despite not being from anywhere near that country. In 2012, the New York Times reported that Target had used its data analysis capabilities to identify or, ahem, target pregnant customers based on their shopping habits. The company then sent these pregnant customers targeted, targeted ads for—
[00:09:43] Jordan Harbinger: Yeah, we get it.
[00:09:43] Andrew Gold: —baby products. I'm sorry.
[00:09:46] Jordan Harbinger: No, we get it. We get it. Wait, but how did they know that the customer was pregnant? So she didn't fill anything out to say, "Hey, I'm pregnant."
[00:09:53] Andrew Gold: No, this is the crazy bit. I mean, they managed to identify what products pregnant women typically buy more. So when an online user hit the threshold for unscented lotions, as well as supplements like magnesium and zinc, it alerted Target to the fact that they were likely pregnant. Because of your unique online ID and your online history, it also knew whether you'd be more likely to respond to emails or coupons in the post, what day you're most open to receiving them and most likely to buy new stuff.
[00:10:24] Jordan Harbinger: Mmm.
[00:10:24] Andrew Gold: Again, it's an invasion of privacy, and that's bad. But it's still a little bit abstract, right? Like, oh, our privacy is invaded, so what?
[00:10:32] Jordan Harbinger: Yeah, wah wah, I got coupons.
[00:10:33] Andrew Gold: Yeah.
[00:10:34] Jordan Harbinger: Give me a break.
[00:10:35] Andrew Gold: Exactly. So everyone's wondering now, so what are the real-world ramifications of that? Well, here's one example. A year after Target started this program, a man walked into Target in Minneapolis and angrily demanded to see the manager. "My daughter got this in the mail." I'm doing an American accent.
[00:10:50] Jordan Harbinger: Yeah, you're nailing it.
[00:10:52] Andrew Gold: "She's still in high school, and you're sending her coupons for baby clothes and cribs. Are you trying to encourage her to get pregnant?" Right?
[00:10:59] Jordan Harbinger: Ah, sweet summer child. Like, by the way, dude, so his daughter was pregnant, and because of her online habits, Target knew before he did. And I'm guessing with AI that Target's going to know even before you know that you're pregnant because you're like, "Ah, I don't like this lotion anymore. And I feel like I'm bloated. Maybe I need this kind of vitamin or something." And they're going to go, "Are you pregnant?" And you'll be like, "Nah." And then later, you'll be like, "Oh, actually, hang on a second. Maybe I'm pregnant."
[00:11:25] Andrew Gold: It's really scary, isn't it?
[00:11:26] Jordan Harbinger: It is. It's kind of interesting but spooky. Yeah.
[00:11:28] Andrew Gold: Yeah. Suffice to say, the manager of that particular Target store had no idea what the man was talking about.
[00:11:34] Jordan Harbinger: The manager's like, "Dude, I don't know. We mailed you coupons. Calm down."
[00:11:37] Andrew Gold: "Who's your daughter? I don't even know. I wasn't there." But he called the man a few days later, and the New York Times describes the man as being somewhat abashed. He said, "I had a talk with my daughter. It turns out there's been some activities in my house I haven't been completely aware of. She's due in August. I owe you an apology."
[00:11:53] Jordan Harbinger: Wow. I think he was obviously a bit shocked and surprised, maybe overwhelmed by the whole thing. But I'm pretty sure that he doesn't owe a shop that's been spying on his kids an apology. I will hand it to him, though. It is pretty classy to admit that you lost your temper at a random person who didn't even understand what corporate was doing and then call that person and apologize. That's a class act.
[00:12:14] Andrew Gold: Yeah, that is classy. I mean, look, the incident raised serious questions about the ethics of data collection and personalized advertising. Target got some bad press for this and lost some trust. But to get around it, they just started adding more products around those in the brochures aimed at pregnant women, so that the related stuff would seem random and coincidental. So they'd like stick a lawnmower next to an ad for diapers. And it works. Their sales rocketed.
[00:12:40] Jordan Harbinger: That's really interesting and crazy. And I guess that if it didn't help sales, there'd be no point in going to all that trouble and crunching the data. But it is a funny visual. I'm imagining this manager at Target just being like, "Sir, our marketing AI says that your daughter's been getting railed by her boyfriend while you're out golfing. And the AI is almost never wrong. So, like, we've also sent the boyfriend some coupons for latex condoms if it makes you feel any better. And thank you for shopping at Target." Oh man, it's amazing to see how valuable information and data are. And look, the cliche is right, knowledge is power. The more you know about your customers, the more you can exploit it, especially tracking down to the day where they open their mail or their email. So that's women and diapers and inadvertently telling a father his teenage daughter is pregnant. What about on the global scale? There's sort of macro stuff going on here with—
[00:13:31] Andrew Gold: Yeah.
[00:13:32] Jordan Harbinger: —the targeting of ads.
[00:13:33] Andrew Gold: On the global scale, we can see how personalized ads are shaping politics. Let's go back to the 2016 US presidential election, where personalized ads played a significant role. This is the one a lot of listeners will, of course, know about because it was a huge story at the time. Cambridge Analytica, a data analysis firm, used Facebook data to build detailed profiles of up to 87 million users without their consent. These profiles were then used to create highly targeted political ads aimed at influencing voter behavior. The scandal led to a global debate on the ethics of using personal data for political purposes.
[00:14:09] Now, again, why is this so bad? Well, it depends on your outlook. In some senses, it's just about companies getting an advantage and using the information they have, but it was a huge privacy violation because they took this information without consent via a sneaky personality quiz app. It didn't just gather information about the users of the app, but their Facebook friends and networks. It was a manipulation of public opinion as they created psychological profiles of the users based on their answers. It undermined the democratic process and is thought to have heavily influenced the 2016 presidential election as well as Brexit. That's Britain leaving Europe.
[00:14:43] Jordan Harbinger: Yikes.
[00:14:44] Andrew Gold: Yeah. By the way, I remember when I was in Germany, I met a German person who kept telling me they couldn't believe that Britain wanted to leave Europe. And I swear to God, this person literally thought that we were moving the island further away from Europe, and that's what Brexit was.
[00:14:57] Jordan Harbinger: Yikes.
[00:14:58] Andrew Gold: Yeah.
[00:14:59] Jordan Harbinger: That's, uh, yeah, well, not everybody's a genius.
[00:15:03] Andrew Gold: But yeah, we can't know exactly how Cambridge Analytica influenced these votes, but it did work directly with the Leave EU campaign. They also worked with Ted Cruz's Republican campaign and then Donald Trump's.
[00:15:15] Jordan Harbinger: So that's an example of how personalized ads have put presidents in place, moved countries out of political zones, almost moved the entire island of Great Britain somewhere further away from the coast. I don't know how that was going to work. But that is crazy. It's shocking because you think like, what if they'd worked for a different cause? Or what if they'd done something like this for the Ukraine war or to China? I mean, the possibilities are kind of endless, especially since that was, what, round one? I mean, they'd be way better at it now if they didn't get shut down/sued into oblivion. I assume somebody else is just doing it now. But how do we balance the convenience of personalized ads with the potential invasion of privacy? Because look, the capital is going to capitalize. They're going to keep doing this, well, as much as companies and governments allow them to do this.
[00:16:01] Andrew Gold: Well, that's exactly right. So there's no easy answer as to how to stop it. But one approach is for regulators to implement stronger privacy laws and guidelines. For example, the European Union's General Data Protection Regulation has significantly increased the level of control that individuals have over their personal data. It requires companies to be more transparent about their data collection practices and gives users the right to access, correct, or delete their data. However, regulations alone may not be enough. Users need to be educated about the risks associated with personalized ads and take proactive steps to protect their privacy.
[00:16:38] We live in such a different world to the one in which I went to high school. Even back then, most stuff that I was learning at school felt pretty irrelevant. You know, at 34 years old, I could still tell you now what an oxbow lake is. You know, we learned about that. I knew what—
[00:16:50] Jordan Harbinger: I don't know what that is. What is that?
[00:16:52] Andrew Gold: When a river meanders and like goes like a snake and eventually over time, the water, rather than like doing this big loop, can sometimes go over ground, just go straight over the hump, so to speak. And then, the water that was the bend gets cut off and becomes its own lake on the side.
[00:17:10] Jordan Harbinger: Oh, okay. So it's a little lake that's on the side of a river that used to just be part of the river but is now isolated. That's interesting. I've never heard of that, but it totally makes sense.
[00:17:17] Andrew Gold: Yeah.
[00:17:17] Jordan Harbinger: Okay. I remember Bunsen burners.
[00:17:19] Andrew Gold: Yeah, Bunsen burners as well. I just remember learning about the oxbow lake, and that was for me the epitome of something I don't really need to know, but I wish I knew more about taxes. Not that I'd have listened when I was younger. We live in a world of ChatGPT and AI bots and personalized ads, and kids need to learn about this stuff from a young age. So I would hope that they're being educated more about their privacy and personal data than we were. I hope they're not just learning about oxbow lakes and stuff like that. You can also make sure you are using privacy-enhancing tools, such as ad blockers, virtual private networks, which we know as VPNs, and privacy-oriented search engines, like DuckDuckGo. Additionally, tech companies can be encouraged to adopt more privacy-respecting practices, such as providing better privacy settings and incorporating end-to-end encryption in their services.
[00:18:05] Jordan Harbinger: Yeah, they're probably not going to do that. The old, again, cliché, absolute power corrupts absolutely, as they say. I think humans just can't help themselves, and when you got the tech to do it, it's hard to stay out of other people's business, especially when there's immense monetary rewards for spying. So let's get speculative then.
[00:18:25] What are some of the concerns and speculations about the future of personalized ads and privacy in the digital age? I mean, are we in trouble? GPT-4 is pretty damn good at writing ads, let's say for this show, that I just have to briefly rewrite in my own voice and then sort of ad-lib or improv. But man, if you can apply that weapon, which they will, to targeting, it's just going to be unstoppable. It already is.
[00:18:51] Andrew Gold: Yeah. Well, one concern is the increasing sophistication of ad-targeting technologies. As algorithms and data collection methods become more advanced, personalized ads could become even more intrusive and pervasive. We're talking about the devices already doing this, like smartphones and wearables, simply creating an even larger pool of information about you. With enhanced AI, data mining and machine learning will also be even better at understanding our habits. It's all a little Minority Report, you know, when he goes into the mall and it scans his eyes and everything in the mall is perfectly tailored to him. It's augmented reality, virtual reality, ads that can read your emotional state and take advantage of it.
[00:19:34] Jordan Harbinger: Here are some non-targeted ads for the fine products and services that support this show. We'll be right back.
[00:19:39] This episode is sponsored in part by Airbnb. So we used to travel a lot for podcast interviews and conferences, and we love staying in Airbnbs because we often meet interesting people. And the stays are just more unique and fun. One of our favorite places to stay at in LA is with a sweet older couple whose kids have moved out. They have a granny flat in their backyard. We used to stay there all the time. We were regulars, always booking their Airbnb when we flew down for interviews. And we loved it because they'd leave a basket of snacks, sometimes a bottle of wine, even a little note for us. And they would leave us freshly baked banana bread because they knew that I liked it. And they even became listeners of this podcast, which is how they knew about the banana bread. So after our house was built, we decided to become hosts ourselves, turning one of our spare bedrooms into an Airbnb. Maybe you've stayed in an Airbnb before and thought to yourself, "Hey, this seems pretty doable. Maybe my place could be an Airbnb." It could be as simple as starting with a spare room or your whole place while you're away. You could be sitting on an Airbnb and not even know it. Perhaps you've got a fantastic vacation planned for the balmy days of summer. As you're out there soaking up the sun and making memories, your house doesn't need to sit idle. Turn it into an Airbnb and let it be a vacation home for somebody else. And picture this: your little one isn't so little anymore. They're headed off to college this fall. The echo in their now empty bedroom might be a little too much to bear. So, whether you could use a little extra money to cover some bills or something a little more fun, your home might be worth more than you think. Find out how much at airbnb.com/host.
[00:21:03] Thank you so much for listening to and supporting the show. Your support of our sponsors does keep the lights on around here. All the deals, discount codes, special URLs are all in one place. jordanharbinger.com/deals is where you can find them. You can also search for the sponsors using the search box on the website as well. And feed your data into the AI chatbot so that you can probably get even more targeted ads. So please consider those who support the show.
[00:21:27] Now, back to Skeptical Sunday.
[00:21:31] Okay. So not just what we like, but something we like when we are more likely to buy it. I think that's what kind of scares me. Maybe it detects my mood is a bit low because I'm wearing an Oura Ring wearable, and my Wi-Fi-connected scale says I'm three pounds heavier this month and that I slept poorly. So now, they're trying to sell me coffee, Xanax, and Spanx, which sounds like a hell of a combination. I mean, again, I'm possibly on the market for at least one or two of those things.
[00:21:59] Andrew Gold: What is Spanx?
[00:22:00] Jordan Harbinger: Spanx? Those are, oh man, you don't have those over there. So Spanx are, I think they're only for women. They probably have a men's version. I don't know, but they're for women. They're like tights, but they allow you to change the shape of your body under the clothes that you're wearing. So they'll take things that maybe you don't like being so big and they'll shove them somewhere else where you don't mind being bigger. And I don't exactly know how they work, but I think it'll take like the skin on the back of your hamstrings and quads and like shove it up towards your butt. So you got a big old round booty, but your legs look nice and thin, and it just changes the shape of your body. So this is a multi-billion-dollar company that makes these body shaping tights, for lack of a better word.
[00:22:40] Andrew Gold: I've seen that a bit in the gym. It looks a bit unnatural and I did wonder like—
[00:22:44] Jordan Harbinger: Yeah.
[00:22:45] Andrew Gold: Is that just what some people look like? And I suppose it's just things are being moved around to look like that.
[00:22:49] Jordan Harbinger: Things are being shuffled in a way that probably isn't super comfortable but is better than maybe feeling like you look like Groot or whatever from whatever that movie is.
[00:22:59] Andrew Gold: I don't know what you're talking about.
[00:23:01] Jordan Harbinger: Despicable Me. Yeah.
[00:23:01] Andrew Gold: I don't know. Look—
[00:23:03] Jordan Harbinger: No. Looking like a, let's say, you don't want to look like a Pixar character.
[00:23:06] Andrew Gold: No.
[00:23:07] Jordan Harbinger: Okay. Which a lot of us do. You know, I'm including myself in this. I'm not trying to body shame anyone. And so it's like, hey, if I can put something on that makes this look bigger, and this other thing looks smaller—
[00:23:17] Andrew Gold: Mm-hmm.
[00:23:17] Jordan Harbinger: I want 10.
[00:23:19] Andrew Gold: I want some of those as well.
[00:23:20] Jordan Harbinger: Especially if it's that kind of day.
[00:23:22] Andrew Gold: Yeah.
[00:23:22] Jordan Harbinger: You're having a day.
[00:23:23] Andrew Gold: Yeah. I know that now everybody listening to this is going to be getting loads of ads for Spanx.
[00:23:28] Jordan Harbinger: Yeah. S-P-A-N-X.
[00:23:29] Andrew Gold: They should pay us.
[00:23:30] Jordan Harbinger: Hey, look, I'll take that too. I'll turn it right around and give it to them, depending on how I look that morning in the mirror.
[00:23:35] Andrew Gold: But look, if people want to look like that, then they want to look like that, and then it's good that they're getting ads for Spanx. So, look, this is the thing that's underlying this entire discussion about the perils of personalized ads, and we're shying away from it a little bit. We're focusing mostly on the invasion of privacy, and that makes sense because that's where the concern lies, and humans are concerned creatures.
[00:23:53] But in terms of convenience, there are many positives that can come from improvements to the current systems. For example, I make a lot of videos for my YouTube channel about the alleged abuses of Scientology and Tom Cruise. So what ads tend to pop up during my videos? Ads for Scientology and Tom Cruise. It's the last thing my subscribers want to see. Sometimes they take it up with me, and you know, they say like, "Why am I getting ads to join Scientology? I hate Scientology. That's why I like your channel." And I have to explain that I've got nothing to do with it. Now, a less crude system, as it advances, would be able to differentiate better between a user who hates and a user who loves a certain cult or celebrity or product. And it is getting there. The Internet is fast becoming a mall very specifically tailored to your tastes, even tastes, as you were saying before, that you may not have realized you have, or a state of being you didn't realize you were in, like being pregnant, for example.
[00:24:44] Now, Netflix is a great example of the benefits of this kind of personalization. Its algorithm uses user data such as viewing history, ratings, and preferences to recommend content tailored to each individual. This has proven to be highly effective in keeping users engaged and increasing the likelihood that they'll find content they enjoy. But all of this could lead to a further erosion of privacy and potentially enable new forms of manipulation and exploitation. Another concern is the potential for a privacy divide. As more privacy-conscious users adopt tools to protect their data, advertisers may shift their focus to those who are less informed or less able to protect themselves.
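[Editor's note: the "recommend based on what you've already watched" idea Andrew describes can be sketched in a few lines of Python. The titles, genres, and scoring rule below are invented for illustration; Netflix's real system is far more sophisticated than this genre-overlap toy.]

```python
def recommend(viewing_history, catalog, top_n=2):
    """Toy personalization sketch: score each unseen title by how many
    of its genres the user has already watched, and return the best few.
    All titles and genres are hypothetical examples."""
    liked_genres = set()
    for title in viewing_history:
        liked_genres.update(catalog[title])  # genres the user has history with

    seen = set(viewing_history)
    scored = [
        (sum(genre in liked_genres for genre in genres), title)
        for title, genres in catalog.items()
        if title not in seen  # never recommend what was already watched
    ]
    scored.sort(reverse=True)  # highest genre overlap first
    return [title for score, title in scored[:top_n] if score > 0]


catalog = {
    "True Crime Doc": ["documentary", "crime"],
    "Cop Thriller": ["crime", "thriller"],
    "Sitcom": ["comedy"],
    "Heist Doc": ["crime", "documentary"],
}
print(recommend(["True Crime Doc"], catalog))
```

Even a toy like this shows why the data matters: the more viewing history the system has, the sharper its guesses get, which is exactly the engagement/privacy trade-off discussed above.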
[00:25:21] Jordan Harbinger: Mmm.
[00:25:22] Andrew Gold: Think of how spam mail scams are often deliberately misspelled and badly written to weed out the people too aware to fall for the trick.
[00:25:30] Jordan Harbinger: So I've heard that they're like, "Oh, all those errors, so that they screen in stupid people." And I'm like, is that true? Or are they just dumbass people who also are, you know, scamming for a living and they just don't, they don't know.
[00:25:43] Andrew Gold: There's no way of actually knowing that. I've definitely heard people who've said they actually did that as a job and they did purposely misspell it. I know what you're saying. It feels like they were caught out and made fun of for spelling things so badly, and now they're sort of owning it like, "Oh, we were doing that on purpose just to like weed people out."
[00:25:57] Jordan Harbinger: Yeah. Totally, we're playing 4D chess. That's the thing. That's the thing you don't understand.
[00:26:02] Andrew Gold: Yeah. Well, apparently, this could happen with advertisers, you know, with personalized ads, I should say. It could result in an unequal distribution of privacy protections, with the most vulnerable users bearing the brunt of invasive advertising practices.
[00:26:15] Jordan Harbinger: Got it, okay. And by the way, I was thinking of the character Gru from Despicable Me. I think I said Groot from despicable, or invisible me, I don't know, whatever. It's Gru. He's basically like really big on top and really skinny on the bottom.
[00:26:26] Andrew Gold: Oh, yeah.
[00:26:26] Jordan Harbinger: So he looks like every 40-year-old-plus man in skinny jeans does. Alright, so that is scary. You'd have half the population constantly consuming kooky, crazy stuff, spending a ton of their money, getting scammed, getting defrauded, getting junk mail, and the rest of us are sort of left alone-ish, better able to keep hold of our money, and I can see that divide taking place. I mean, I think it kind of already does, but you know, that's why we do these episodes, right? Hopefully, a lot of people are going to be a little more privacy-aware at the end of it. And I think one of the things we're all pretty suspicious of and have long wanted to know for sure about is whether voice-activated devices like Amazon's Alexa, Apple's Siri, and Google Assistant are listening to us and stealing our data. Supposedly, they're in our phones, they're in our houses, some of our TVs have them. I mean, that stuff is, that's a little invasive. That's a little creepy.
[00:27:17] Andrew Gold: Yeah, it's a really interesting question, and I've also long wondered that because it definitely feels like they're listening. The first thing to say is that they are listening, because they're designed to pay attention to wake-up words. They employ something called passive listening, which means they're not actively recording but simply listening to see if you'll wake them up.
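[Editor's note: the passive-listening loop Andrew describes, where a device keeps only a short rolling buffer and starts sending audio to the cloud after it thinks it heard the wake word, can be sketched roughly as follows. The wake word, text chunks, and buffer size are illustrative assumptions, not how any real assistant is actually implemented.]

```python
from collections import deque

WAKE_WORD = "alexa"  # hypothetical wake word for the sketch


def passive_listener(transcribed_chunks, buffer_chunks=3):
    """Sketch of passive listening: before the wake word, audio only
    enters a short rolling buffer that is discarded as it rolls off;
    after the wake word is heard, chunks are 'sent to the cloud'."""
    local_buffer = deque(maxlen=buffer_chunks)  # discarded continuously
    sent_to_cloud = []
    awake = False
    for chunk in transcribed_chunks:
        if awake:
            sent_to_cloud.append(chunk)  # active recording after wake
        else:
            local_buffer.append(chunk)  # never leaves the device
            if WAKE_WORD in chunk.lower():
                awake = True  # wake word detected, start recording
    return sent_to_cloud


print(passive_listener(["the weather is nice", "hey alexa", "what time is it"]))
```

The misfires discussed later in the episode correspond to the `if WAKE_WORD in chunk` check returning a false positive: once `awake` flips to true, everything after it gets recorded, whether you meant to wake the device or not.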
[00:27:35] Jordan Harbinger: Ah, okay. So, does literally anyone listening to this right now trust that at all? You shouldn't. We're really starting to inhabit an Orwellian world of tech and privacy invasion. I just don't buy that, oh, until it gets the wake-up word, it's not listening. Like, okay. But is it not sending anonymized data to a server that then crunches it? It just seems like it has to be doing that. I don't know.
[00:27:59] Andrew Gold: Yeah, I think sometimes like more conspiratorial minds, we think of an evil person at the top of a company who's like doing all these like bad, naughty things to get all of our data. And that does happen, but sometimes it's just a case of like the technology's just gotten out of their control.
[00:28:14] Jordan Harbinger: Yeah.
[00:28:14] Andrew Gold: They've created something they can't control anymore. There are too many people involved, too many potential problems that can happen, and then privacy just leaks out that way. The Guardian reported on a former Amazon employee who actually had to record a lot of commands for the Echo, like he did the voice for it and stuff.
[00:28:30] Jordan Harbinger: Mm-hmm.
[00:28:31] Andrew Gold: One day, he came home and found his own Echo spitting out loads of mad old information that he had asked it over the months about recording TV shows and things he'd been looking up. And it wouldn't stop. He had to just throw it away. But he said, "I felt a bit foolish. Having worked at Amazon and having seen how they used people's data, I knew I couldn't trust them."
[00:28:50] Jordan Harbinger: Oh, man.
[00:28:51] Andrew Gold: Yeah.
[00:28:51] Jordan Harbinger: The guy who like worked on that team is like, "I'm throwing this in the trash."
[00:28:55] Andrew Gold: Yeah.
[00:28:55] Jordan Harbinger: That's creepy.
[00:28:56] Andrew Gold: Yeah. He comes home, finds it like shouting things that he's asked that like should or shouldn't have been recorded and it's still there—
[00:29:02] Jordan Harbinger: Yeah.
[00:29:02] Andrew Gold: —on the machine. And then another time, an Amazon customer in Germany was mistakenly sent 1,700 audio files from another customer's Echo, including enough detail to locate and identify him and his girlfriend. Then, there's a woman in Portland, Oregon, who discovered her Echo had sent recordings of private conversations to one of her husband's employees. Amazon responded that it must have mistakenly thought that it heard the wake word, misheard a request to send a message, misheard a name in its contacts list, and then misheard a confirmation to send the message, all during a conversation that she was having about hardwood floors.
[00:29:43] Jordan Harbinger: Oh my god, that's a lot of coincidences at once. Although I suppose that if millions of people have these devices, then even if the odds are one in a million, it happens all the time. And this reminds me, I had the Amazon Echo for a while, the one with the screen. I can't remember which one is which, but it found my old law school classmate who lives in the Philippines. And it would just say like, "Paolo's office." And I'm like, oh. And I tapped it, because I was like, how do I get rid of this? And what it showed me, somehow, was a live feed of his office.
[00:30:16] Andrew Gold: Shut up.
[00:30:17] Jordan Harbinger: I sent him a message. I was like, "Hey, long time no see. You got to fix this, because I can look at a live feed of your office at any time." And he's like, "Oh, that shouldn't happen." Because he actually worked for the government and he's like, "That's definitely not supposed to happen." And I'm thinking this is really bad. If you're a highly placed guy in a government and you've got this thing, I can just tap on it and see you while I'm in my house and you don't even know this thing is necessarily on. That's really, really bad. But what really scared me was like, this thing is in my kitchen, which is attached to my living room. I'm walking around here at night, I'm walking around here in various states of undress—
[00:30:57] Andrew Gold: Mm-hmm.
[00:30:57] Jordan Harbinger: —my wife the same, we got babies, you know, my parents are here, whatever. Who's looking at me in my house with this? If not him, who is it? So that scared the crap out of me, and we got rid of that crap like immediately after that, but probably not soon enough, because how long was some random show fan who found me on that thing watching me in my kitchen before that happened? I don't know.
[00:31:17] Andrew Gold: Well, I've stopped watching since, Jordan. Well, since you threw it away, actually.
[00:31:21] Jordan Harbinger: Yeah.
[00:31:21] Andrew Gold: I've got nothing to watch anymore.
[00:31:22] Jordan Harbinger: Right, no. Now you have to settle for Netflix.
[00:31:25] Andrew Gold: That's all I can watch now. No, I know, look, and those kinds of bugs, like, there could be somebody at the top somewhere who's just like an elite, crazy person who wants to find out all your secrets. And it could be like, you know, there were just bugs in the system, and for some reason, you end up seeing your friend's office, which is insane. So I think we need to err to an extent on the side of the non-conspiratorialist while also raising these issues as food for thought. There was the Google Nest device that people realized had a microphone in it. And that wasn't mentioned anywhere, and then Google were like, "Oh right, that, well that's not a secret." As if it's just like an afterthought or something, a hidden microphone in your device that wasn't advertised and nobody knew about.
[00:32:05] Jordan Harbinger: Wait, so it's like not on the box? It doesn't say it has a microphone.
[00:32:08] Andrew Gold: No.
[00:32:09] Jordan Harbinger: The plans don't have a mic, but it's like, "Oh, yeah, we added that later, and, but it's not a secret, we just didn't tell anyone."
[00:32:15] Andrew Gold: Exactly.
[00:32:16] Jordan Harbinger: That's weird.
[00:32:17] Andrew Gold: So bizarre. So that's when you start thinking, okay, well, I don't want to be like a conspiracy theorist here, you know, I really don't, but like, what, how does that get put in? How are these like hundreds of people at Google or wherever they're designing this? Like, who is the person that's going like, "Eh, just put a microphone in." "Are you going to put that in the plans?" "Eh, people won't mind."
[00:32:35] Jordan Harbinger: We don't have to reprint the plans. Yeah. We just have to redesign the entire device and manufacture them differently, but we're not going to reprint the instructions online.
[00:32:42] Andrew Gold: Nah. The guy who does the printing's already left the office for the day. Let's not bother him.
[00:32:48] Jordan Harbinger: Yeah, it's really hard to read the plans. Let's just redo the entire circuit board, find a supplier for these, manufacture a bunch of them, sell them.
[00:32:55] Andrew Gold: Yeah.
[00:32:56] Jordan Harbinger: But we don't want to have to bother with that pesky PDF editing.
[00:32:59] Andrew Gold: Just really weird.
[00:33:00] Jordan Harbinger: Very.
[00:33:00] Andrew Gold: What's the microphone for? So ultimately, we don't know. And as you say, even if something seems incredibly unlikely, the sheer number of people who own these devices means it might still happen even when it's not meant to happen. Bugs happen, and weird coincidences, where these devices think they've heard the wake word and wake up and record you. And here's the thing, like, even with these seeds of doubt in our minds, we've created a society where it's just too damn convenient to have these devices. And as I was alluding to at the start, we'd become more and more used to the idea of privacy beforehand. You know, compared to the sort of Victorian times, we had real privacy. And now I sit at my laptop researching this very topic, and as I do so, I sometimes read bits and think about what I'm going to say out loud. I'm totally alone in my apartment, and yet I can feel that a small part of me has sort of given up on the concept of anything I do or say being entirely private. So when I'm sort of saying things out loud in my head, I think something probably is recording me. I better be a bit careful, just in case.
[00:34:01] Jordan Harbinger: Mm-hmm.
[00:34:01] Andrew Gold: I don't know what. So in the words of Gizmodo writer Matt Novak: "We live in a techno-dystopia of our own making. If you still have an Alexa or any other voice assistant in your home, you were warned."
[00:34:13] Jordan Harbinger: Mmm.
[00:34:14] Andrew Gold: That probably feels even truer, at least on a psychological level, for people like you and me, Jordan, because we sit in rooms with big cameras, microphones, speakers, headphones, smartphones, all sorts of devices. I don't even know which device or whether all of them are listening to me. And to be honest, it probably is no different for most people listening to this as well. Anyone listening has either a smartphone or a laptop or computer, and they live and work in areas where these devices might be recording them. And I worry not only about that invasion of privacy from a demographic and marketing perspective, but what if my data fell into the wrong hands?
[00:34:54] Jordan Harbinger: All right, we need to take a break and hear some non-personalized ads from our sponsors, but it's personally important to me that you support them. We'll be right back.
[00:35:02] This episode is sponsored in part by Airbnb. We used to travel a lot for podcast interviews and conferences, and we love staying in Airbnbs. We often meet interesting people. The stays there are more unique, more fun. One of our favorite places to stay in LA: a sweet older couple, their kids moved out, and they've got an in-law unit in their backyard. We used to book that place every time we flew down for interviews, and it's great. They had parking. They had snacks. They would bake banana bread for me because they knew I liked it. They listened to this podcast, which is a great way to become one of my favorite people. So maybe you've stayed in an Airbnb before and thought to yourself, "Hey, this seems pretty doable. Maybe my place could be an Airbnb." We built one in our house with a separate entrance because we thought we would utilize the space. It could be as simple as starting with a spare room, or your whole place while you're away. You could be sitting in an Airbnb right now and not even know it. Maybe you live in a city with a music festival or an epic sporting tournament, and that noise isn't your cup of tea. Get out of town. Make a quick getaway, leave the chaos behind. Meanwhile, Airbnb your home. Earn a little extra cash while you're at it. Or maybe you're in the work-from-home club and now you're back in the office. The home office, well equipped, ready for use, doesn't have to sit there and gather dust. Turn it into an Airbnb. Earn a neat little sum on the side. So whether you could use a little extra money to cover some bills or something a little more fun, your home might be worth more than you think. Find out how much at airbnb.ca/host.
[00:36:19] Once again, thank you for listening to and supporting the show. All the deals and discount codes are over at jordanharbinger.com/deals. You can also search for any promo code using the search box on the website as well. Once again, consider supporting those who support the show.
[00:36:34] Now for the rest of Skeptical Sunday.
[00:36:38] Just think about what multinationals, state surveillance operatives, and hackers could do with your information. I mean, say Alexa wakes up accidentally and records a politician having an affair or whatever, because he got one of those from his kids for Christmas and he just plopped it in his office so they could ask it the weather or something.
[00:36:56] Andrew Gold: Yeah, it's funny you use like an affair as an example because it's not just Alexa who is recording you. There have been all sorts of cases of — wait for it — sex toys recording their owners.
[00:37:06] Jordan Harbinger: Oh God.
[00:37:07] Andrew Gold: So the Lovense remote sex toy, which allows long-distance couples to play with one another, was found to have been recording audio. So there are all these audio recordings.
[00:37:19] Jordan Harbinger: Oh my God.
[00:37:20] Andrew Gold: Yeah. The We-Vibe company was sued for allegedly collecting information this way. They had like a ton of collections of audios of like people using these toys.
[00:37:29] Jordan Harbinger: Oh God.
[00:37:30] Andrew Gold: People were also able to take control of their butt plug, which then—
[00:37:34] Jordan Harbinger: Okay.
[00:37:35] Andrew Gold: —which is absolutely insane. But like imagine that, like you're taking control of it from afar because people could hack into it really easily.
[00:37:41] Jordan Harbinger: Yes.
[00:37:42] Andrew Gold: Which, of course, raises questions of consent and the concept of rape by app or raped by sex toy. These are things we're going to have to write into law and things we're going to have to think about as—
[00:37:51] Jordan Harbinger: Oh my God.
[00:37:52] Andrew Gold: —time goes on. It's not just sex toys. Elon Musk's electric car company, Tesla, have been in the news for footage recorded by the cameras used for the self-driving functions, you know, the cameras that are supposed to be looking out for where the car is going and everything.
[00:38:08] Jordan Harbinger: Right.
[00:38:08] Andrew Gold: Staff were even sharing the funniest videos of customers with one another and often there was enough to identify the people, the location, and everything going on in and around that car.
[00:38:19] Jordan Harbinger: Yeah, that's scary because what if you find someone famous driving in their car?
[00:38:23] Andrew Gold: Mm-hmm.
[00:38:23] Jordan Harbinger: And you get them picking their nose or whatever, or, you know, I don't know, having a call with their doctor, I mean, geez. And if you're not safe from intrusion during sex or in your car, have we lost the battle and the will? I mean, you know it's bad when your butt plug is spying on you, okay? You can't even get privacy in your own rectum. Is nothing sacred?
[00:38:44] Andrew Gold: Quite possibly. Many of us have given up, but a study conducted by researchers at the University of Pennsylvania found that the more personalized an ad is, the more likely it is to be perceived as invasive. Participants in the study were shown ads that varied in personalization, and they consistently rated the highly personalized ads as more intrusive and less enjoyable.
[00:39:05] Jordan Harbinger: Mmm.
[00:39:05] Andrew Gold: Another study conducted by researchers at Northeastern University examined the potential for personalized ads to discriminate against certain demographics. For example, their ads for jobs as lumberjacks were mostly shown to men.
[00:39:20] Jordan Harbinger: Okay.
[00:39:20] Andrew Gold: And their ads looking for supermarket cashiers were mostly shown to women. The algorithm doesn't really care for political correctness or ideas around equality of opportunity. The algorithm just finds the best ways to make their ads work, but it raises concerns about the potential for personalized ads to perpetuate or exacerbate existing inequalities.
[00:39:40] Jordan Harbinger: Yeah, it sounds like the algorithm forgot to be woke. So given these complexities, what role do different stakeholders, so tech companies, advertisers, regulators, and users, us as users, what role do we have to play in ensuring that personalized ads are ethical, they're privacy respecting? I mean, is it even possible to do it that way?
[00:39:59] Andrew Gold: Yeah. Well, tech companies play a crucial role as they develop and maintain the platforms and tools that enable personalized advertising. They should prioritize privacy in the design of their products and services. So not like hiding a microphone in there without telling anybody.
[00:40:13] Jordan Harbinger: Mm-hmm.
[00:40:14] Andrew Gold: They can't continue to lose trust as with the sex toys and Alexa and all of these recording devices. We can't be living in a world where a company like Tesla are sharing clips of you in your most private moments between staff for a laugh because they were actually getting like promotions and things apparently based on who was sharing the funniest clips and things like that. It was insane.
[00:40:34] Jordan Harbinger: Whoa.
[00:40:34] Andrew Gold: Yeah.
[00:40:35] Jordan Harbinger: Yeah.
[00:40:35] Andrew Gold: So, you know, the tech companies need to improve confidence and provide users with clear information about the level of control they have and how exactly their data is collected and used. Let's face it, none of us want to read these great long contracts and agreements that Apple and the like keep throwing our way, you know?
[00:40:52] Jordan Harbinger: Well, even if we read the contract from Apple, what's the contract from Apple's supplier for that software that's on there or the mic that's in there? I mean, we don't even get access to those. Apple agrees to those for us and then passes whatever on, I mean, it's just not, we're never going to be fully informed. And imagine if you had to agree to 19 different contracts to use your phone—
[00:41:16] Andrew Gold: Mm-hmm.
[00:41:16] Jordan Harbinger: —or to use an app that you get, I mean, you'd just never do it. It's like when you install software and it's like, make sure you respect the end-user license agreement. You're like, whatever, next, next, next, next, next. Oh, I have to scroll to the bottom, drag the slider to the bottom. Next. Like, this is stupid. I'm not reading this. And look, I agree. If you can't trust your Wi-Fi-enabled vibrator, who can you trust? Am I right, ladies and guys with vaginas? I got to include that. Got to be woke, unlike the algorithm.
[00:41:42] Andrew Gold: Yeah, exactly, Jordan. Yeah, well, exactly right, if you can't trust them. And they need our trust. They want to sell more stuff. At the end of the day, a company is a company. It's not doing things to be nice or because they care; they're doing things because they want to sell more stuff, and if they have more trust from customers, then they sell more stuff. I mean, advertisers too, they've got a responsibility to act ethically in their use of personal data. That includes being transparent about their data collection practices. So you can't have like this quiz on Facebook, you know, the Cambridge Analytica scandal.
[00:42:10] Jordan Harbinger: Mm-hmm.
[00:42:10] Andrew Gold: They need to respect user consent and ensure that their ads do not discriminate against or exploit vulnerable groups. Regulators as well, I'm talking about governments here, need to be creating laws and guidelines that protect user privacy and prevent the abuse of personal data. The tough part is staying up to date with the latest technological advancements. I mean, that's the thing. It's going to be really, really hard. The technology sort of goes faster and then the policy always comes later. And then a lot of people get hurt in the middle. Now, finally, the users themselves have a role to play in protecting their own privacy and advocating for stronger protections. If we don't care, and we continue not to care, things could get to a point when we are too far along. Maybe we already are. But this involves being informed about the risks associated with personalized ads, using privacy tools, and making their voices heard through political and consumer actions.
[00:43:02] Jordan Harbinger: I guess what people are really going to be thinking, and myself included, is: okay, fine, but what can I really do about this or to help with this? I mean, how much influence does public opinion and consumer power have in shaping the practices of tech companies and advertisers when it comes to personalized ads? I mean, we're not really the customer when it's Facebook, right? We're the product. So what do I do? I'm going to stop using this? I mean, I guess if tens of millions of us do. Okay, but I don't really know. It seems like we don't have a whole lot of leverage.
[00:43:31] Andrew Gold: Mmm. Yeah, I'm quite a negative person and I tend to agree with you, but you know, I think I have to believe that we can have a real impact on the behavior of tech companies and advertisers. As we've seen with recent privacy scandals and controversies, public backlash can force companies to reevaluate their practices and make changes to better protect user privacy. For example, following the Cambridge Analytica scandal, Facebook actually pledged to make several changes to its platform, such as limiting third-party access to user data and providing more transparency and control over how personal information is used for advertising. However, some have criticized them for not actually enforcing all of those changes. Facebook founder Mark Zuckerberg wrote a 3,000-word article about the purported changes, which was described by Guardian writer Emily Bell, and I love this, as "the nightmarish college application essay of an accomplished sociopath."
[00:44:27] Jordan Harbinger: Wow. Tell us what you really think, Emily.
[00:44:30] Andrew Gold: Exactly. Similarly, Apple has introduced new privacy features in its operating systems that limit ad tracking and require apps to obtain user consent before collecting certain types of data. Consumers also have the power to vote with their wallets and choose products and services that prioritize privacy. The growing popularity of privacy tools (I mentioned DuckDuckGo as a search engine, for example, and there's also Signal and several other chat apps that encrypt their data), as well as the increasing adoption of ad blockers, demonstrates that many users are willing to take action to protect their privacy.
[00:45:05] Jordan Harbinger: If we give a damn about our privacy, it just seems like we got to do it ourselves. Because companies, they don't care about morality, they don't care about philosophy. And why should they if we don't? They care about what we care about because they're trying to reflect your interests in a way. So as we look into the future, how can we prepare for the potential challenges and opportunities that come with increasingly advanced personalized advertising technology, such as the ones you mentioned involving AI-driven algorithms and virtual or augmented reality? It seems like we got to wear our own suit of armor and not expect a cushy landing from big tech.
[00:45:41] Andrew Gold: Yeah. Yeah. Well, it's always tough to speculate about the future because you spend time imagining this and that, and then something comes out of left field that was totally unexpected. But in general, we need to engage in ongoing discussions and debates about the ethical implications of personalized advertising and the limits we should place on data collection and use.
[00:46:00] Jordan Harbinger: All right. As we wrap up our conversation here, do you have any advice for listeners who want to take control of their online privacy and navigate what looks to be a complex world of personalized advertising, you sort of mentioned some apps and things like that. If we can do a little roundup there—
[00:46:15] Andrew Gold: Yeah.
[00:46:15] Jordan Harbinger: That would probably help.
[00:46:17] Andrew Gold: Yeah, absolutely. Well, first and foremost, educate yourself about the risks associated with personalized ads and the ways in which your data is collected and used. That includes listening to this episode, sharing it with people—
[00:46:27] Jordan Harbinger: Mm-hmm.
[00:46:27] Andrew Gold: But also looking up stuff for yourself. There's so much out there about this. Knowledge is power, as you were saying before, and being informed will help you make better decisions about protecting your privacy. Second, take proactive steps to guard your privacy by using privacy-enhancing tools and adjusting your privacy settings on various platforms. This might include installing ad blockers, using a VPN, and choosing privacy-focused search engines and messaging apps. Finally, don't be afraid to advocate for stronger privacy protections, both as a consumer and as a citizen. Even just talking to your local politician or whatever, just saying, this is something that bothers me. By voicing your concerns and supporting companies and policies that prioritize privacy, you can help shape a future where the benefits of personalized ads don't come at the expense of our personal information and freedom.
[00:47:14] And finally, what are your thoughts on personalized ads, Jordan? I only ask because I know that ads are always seen as the bad guys, but you and I make our living from them. You read some out on the podcast.
[00:47:24] Jordan Harbinger: Mm-hmm.
[00:47:24] Andrew Gold: They're not personalized ads as such, although I imagine sponsors will be aware of general demographics. Are people getting personalized ads in this podcast, for example?
[00:47:32] Jordan Harbinger: So people won't get personalized ads, but they will get geo-targeted ads sometimes. Depending on the age of the episode, older stuff that has dynamic ad insertion, and like stuff maybe that I don't voice, people will say, "Wow, I got an advertisement for Vancouver something hospital and I'm there right now." And a lot of times, people are amazed, but other times people will actually get mad at me for an ad that they hear. And they don't realize that it's some ad for electric vehicles in the episode about cobalt. And I recorded that ad with my voice, so it did sound like it was part of the podcast, but it's just inserted by a computer. And it goes across tens of thousands of shows per day, right? It goes across the entire PodcastOne network; it just happens to be my voice, so people will go, "Man, this is bad taste, bad form, you know, how dare you." It's just a coincidence that the episode they downloaded right then happened to have the EV ad for Nissan in it, opposite a guy saying how bad EV trends are for cobalt mining, which kids are dying doing. So it's not me deliberately acting in poor taste. People just don't understand at all how this tech works. And so a coincidence like that can look really bad.
[00:48:40] But thanks, man, I really appreciate it. This wraps up this episode on personalized ads. And who knows, maybe after listening to it, you're going to start seeing more personalized ads about personalized ads.
[00:48:51] So, all right, to our listeners, don't forget to tune in next week for another thought-provoking episode of Skeptical Sunday on The Jordan Harbinger Show. Until then, stay curious and keep questioning. And hey, if you have a suggestion for a topic here on the show, go ahead and email me at jordan@jordanharbinger.com. I'd love to hear your thoughts and suggestions. We get a lot of topic suggestions for Skeptical Sunday, especially from fans of the show.
[00:49:13] A link to the show notes for the episode can be found at jordanharbinger.com. Transcripts are in the show notes. I'm at @JordanHarbinger on both Twitter and Instagram. You can also connect with me on LinkedIn, and you can find Andrew Gold on his podcast, On the Edge with Andrew Gold, anywhere you get your podcasts.
[00:49:31] Once again, a reminder that the Stitcher app will no longer work for any podcasts as of August 29th, 2023. So if you're using the Stitcher app, time to switch, if you're on Android, Podcast Addict is a good one, Castbox. And if you're on iOS, I suggest Overcast or Apple Podcasts. The Stitcher app is going away, folks.
[00:49:49] This show is created in association with PodcastOne. My team is Jen Harbinger, Jase Sanderson, Robert Fogarty, Ian Baird, Millie Ocampo, and Gabriel Mizrahi, and for this one, of course, Andrew Gold. Our advice and opinions are our own, and I'm a lawyer, but I'm not your lawyer. Do your own research before implementing anything you hear on the show. Remember, we rise by lifting others. Share the show with those you love, and if you found the episode useful, please share it with somebody else who needs to hear it. In the meantime, do your best to apply what you hear on this show, so you can live what you listen, and we'll see you next time.
[00:50:23] You're about to hear a preview of The Jordan Harbinger Show with a top sleep expert about why we dream, what happens when we sleep, and why chronic lack of sleep and driving while tired are more dangerous than driving under the influence of alcohol.
[00:50:36] Matthew Walker: Sleep is not an optional lifestyle luxury. Sleep is a non-negotiable biological necessity. Sleep is a life support system. It is Mother Nature's best effort yet at immortality. And the decimation of sleep throughout industrialized nations is now having a catastrophic impact on our health, our wellness, as well as the safety and the education of our children. It is a silent sleep loss epidemic. And I would contend that it is fast becoming the greatest public health challenge that we now face in the 21st century.
[00:51:08] The evidence is very clear that when we delay school start times, academic grades increase, behavioral problems decrease, truancy rates decrease, psychological and psychiatric issues decrease. But what we also found, which we didn't expect in those studies, is the life expectancy of students increased. So if our goal as educators truly is to educate and not risk lives in the process, then we are failing our children in the most spectacular manner with this incessant model of early school start times. And by the way, 7:30 a.m. for a teenager is the equivalent for an adult waking up at 4:30 or 3:30 in the morning.
[00:51:49] If you are trying to survive on, or regularly getting, five hours of sleep or less, you have a 65 percent increased risk of dying at any moment in time. When you wake up the next day, you have a revised mind-wide web of associations, a new associative network, a rebooted iOS that is capable of divining remarkable insights into previously impenetrable problems. And it is the reason that you have never been told to stay awake on a problem. Instead, you're told to sleep on a problem.
[00:52:23] Jordan Harbinger: For more on sleep, including why we dream and how we can increase the quality of our sleep, check out episode 126 with Dr. Matthew Walker on The Jordan Harbinger Show.