Debug Information
Processing Details
- VTT File: mss539_Rebecca_Lemov_2025_08_26.vtt
- Processing Time: September 11, 2025 at 03:25 PM
- Total Chunks: 2
- Transcript Length: 112,476 characters
- Caption Count: 998 captions
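For reference, below is a minimal sketch of how figures like the caption count, transcript length, and chunk count above might be computed from the VTT file. It uses only the Python standard library; the cue-parsing rules and the 60,000-character chunk limit are illustrative assumptions, not the actual pipeline.

```python
# Minimal sketch (stdlib only). Assumes each cue's text follows its "-->" timing line;
# the chunk size below is a hypothetical limit, not the real pipeline's setting.
from pathlib import Path


def parse_vtt(path: str) -> list[str]:
    """Return the text of each caption cue in a WebVTT file."""
    lines = Path(path).read_text(encoding="utf-8").splitlines()
    captions = []
    for i, line in enumerate(lines):
        if "-->" in line:  # timing line, e.g. "00:00:00.880 --> 00:00:08.720"
            cue_text = []
            for cue_line in lines[i + 1:]:
                if not cue_line.strip():  # blank line ends the cue
                    break
                cue_text.append(cue_line.strip())
            captions.append(" ".join(cue_text))
    return captions


def chunk_text(text: str, max_chars: int = 60_000) -> list[str]:
    """Split the transcript into fixed-size character chunks (hypothetical limit)."""
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]


if __name__ == "__main__":
    caps = parse_vtt("mss539_Rebecca_Lemov_2025_08_26.vtt")
    transcript = "\n".join(caps)
    chunks = chunk_text(transcript)
    print(f"Caption Count: {len(caps)} captions")
    print(f"Transcript Length: {len(transcript)} characters")
    print(f"Total Chunks: {len(chunks)}")
```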
Prompts Used
Prompt 1: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 1 of 2 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
[00:00:00.880 --> 00:00:08.720] A mochi moment from Tara, who writes, For years, all my doctor said was eat less and move more, which never worked.
[00:00:08.720 --> 00:00:10.080] But you know what does?
[00:00:10.080 --> 00:00:13.280] The simple eating tips from my nutritionist at Mochi.
[00:00:13.280 --> 00:00:19.920] And after losing over 30 pounds, I can say you're not just another GLP-1 source, you're a life source.
[00:00:19.920 --> 00:00:21.040] Thanks, Tara.
[00:00:21.040 --> 00:00:23.600] I'm Myra Ahmad, founder of Mochi Health.
[00:00:23.600 --> 00:00:27.360] To find your Mochi moment, visit joinmochi.com.
[00:00:27.360 --> 00:00:30.400] Tara is a mochi member compensated for her story.
[00:00:30.400 --> 00:00:32.320] Every business has an ambition.
[00:00:32.320 --> 00:00:41.760] PayPal Open is the platform designed to help you grow into yours with business loans so you can expand and access to hundreds of millions of PayPal customers worldwide.
[00:00:41.760 --> 00:00:50.240] And your customers can pay all the ways they want with PayPal, Venmo, Pay Later, and all major cards so you can focus on scaling up.
[00:00:50.240 --> 00:00:55.040] When it's time to get growing, there's one platform for all business: PayPal Open.
[00:00:55.040 --> 00:00:57.680] Grow today at PayPalOpen.com.
[00:00:57.680 --> 00:01:00.720] Loan subject to approval in available locations.
[00:01:03.920 --> 00:01:09.600] You're listening to The Michael Shermer Show.
[00:01:15.040 --> 00:01:18.080] All right, everybody, it's time for another episode of the Michael Shermer Show.
[00:01:18.080 --> 00:01:20.400] I'm your host, as always, Michael Shermer.
[00:01:20.400 --> 00:01:29.760] My guest today is Rebecca Lemov, who's a historian of science at Harvard University and has been a visiting scholar at the Max Planck Institute.
[00:01:30.560 --> 00:01:36.240] Her research explores data, technology, and the history of human and behavioral sciences.
[00:01:36.240 --> 00:01:42.240] She's the author of Database of Dreams: The Lost Quest to Catalog Humanity.
[00:01:42.240 --> 00:01:52.800] That is how scientists between 1942 and 1963 attempted to map the elusive and subjective parts of the human psyche via once-futuristic data storage techniques.
[00:01:52.800 --> 00:01:53.840] You can only imagine.
[00:01:53.840 --> 00:01:55.920] That must have been like magnetic tapes or something.
[00:01:56.320 --> 00:01:57.040] Exactly.
[00:01:57.920 --> 00:02:01.960] World as Laboratory: Experiments with Mice, Mazes, and Men.
[00:01:59.680 --> 00:02:07.240] I'll have to tell you about my experiences working for two years in a Skinnerian lab with pigeons and rats.
[00:02:08.840 --> 00:02:16.680] And co-author of How Reason Almost Lost Its Mind: The Strange Career of Cold War Rationality, all of which I'm interested in.
[00:02:16.680 --> 00:02:23.320] But our new book is The Instability of Truth: Brainwashing, Mind Control, and Hyper-Persuasion.
[00:02:23.320 --> 00:02:24.040] There it is.
[00:02:24.040 --> 00:02:25.640] It's also available on audio.
[00:02:25.640 --> 00:02:27.880] I've listened to it one and a half times, Rebecca.
[00:02:27.880 --> 00:02:29.000] It's a great book.
[00:02:29.560 --> 00:02:31.400] So glad you're enjoying it.
[00:02:31.960 --> 00:02:34.120] Did you listen both times?
[00:02:34.680 --> 00:02:37.480] Well, I had the book, so I've been kind of skimming around taking some notes.
[00:02:37.480 --> 00:02:44.840] But yeah, I mainly listen to it when I'm out riding my bike or walking with the dog or doing errands or driving and that sort of thing.
[00:02:44.840 --> 00:02:49.960] But I've read all these books on mind control, cults, and persuasion and all that stuff.
[00:02:49.960 --> 00:02:55.080] This is really the best kind of overall summary of the whole history of the phenomenon.
[00:02:55.240 --> 00:02:59.160] So, but before we get into all that, maybe just tell us your story.
[00:02:59.880 --> 00:03:04.040] What's your trajectory into this from your background?
[00:03:04.040 --> 00:03:08.600] Yeah, thank you for saying that, and thanks for having me on your show.
[00:03:09.080 --> 00:03:20.200] Yeah, so my trajectory is that I guess I went into graduate school, having a background in English literature in college.
[00:03:20.200 --> 00:03:23.800] And I thought I'd really like to do something more in the world.
[00:03:23.800 --> 00:03:29.000] And it seemed, you know, not that I don't love reading epic poetry.
[00:03:29.000 --> 00:03:34.360] Or if I went to graduate school, I thought, I really want to study something meaningful and meaningful to me.
[00:03:34.360 --> 00:03:38.360] And that really asks, you know, the biggest questions I could think of.
[00:03:38.360 --> 00:03:43.800] So I kind of made my way into anthropology, or I was living in California.
[00:03:43.800 --> 00:03:58.080] I decided to study anthropology, but then I ended up not quite doing fieldwork, but thinking that anthropology would equip me to study the scientists themselves who had asked questions about human freedom, how free are we really?
[00:03:58.080 --> 00:04:02.080] What are the things at work? How rational do we assume we are?
[00:04:02.080 --> 00:04:03.760] Is that actually the case?
[00:04:03.760 --> 00:04:09.120] And also, to what extent are we shaped by factors that we just take for granted?
[00:04:09.120 --> 00:04:15.600] As, oh, that's just the way it is, our culture, or even just things we take to be trivial.
[00:04:15.600 --> 00:04:19.040] But, you know, how much of what we take to be ourselves is really us?
[00:04:19.120 --> 00:04:25.280] So I ended up writing a history of behaviorism, which it sounds like you have direct experience in as well.
[00:04:25.280 --> 00:04:42.720] I was interested in these great ambitious plans during the Cold War, especially, but even in the earlier periods of behaviorists to create a kind of science of human control and maybe human engineering.
[00:04:42.720 --> 00:04:52.080] And then I, at the end of my dissertation, I wrote a chapter on brainwashing because brainwashing seemed like it posed these questions in an especially urgent way.
[00:04:52.080 --> 00:04:58.640] And I also thought, what could be more out of date and seemingly irrelevant than this thing called brainwashing or mind control?
[00:04:58.640 --> 00:05:04.960] Because at the time I was writing in the late 1990s, it really had fallen out of favor.
[00:05:04.960 --> 00:05:12.560] It seemed just like a relic of an earlier moment of hysteria, and it couldn't possibly have anything to do with our present or future.
[00:05:12.560 --> 00:05:15.280] So it seemed like the most out of date.
[00:05:15.280 --> 00:05:19.200] And I've always had a soft spot for ideas that have fallen out of favor.
[00:05:19.200 --> 00:05:31.480] So I ended up taking that kernel of brainwashing that was left over from the dissertation, teaching classes on it for the last couple of decades, and eventually writing about it in different ways.
[00:05:31.640 --> 00:05:44.760] And then I thought, a few years ago, that I should actually put it all together in a book if I could, but I wasn't sure anyone would be interested because, again, it didn't yet seem very urgent.
[00:05:45.720 --> 00:05:46.200] Nice.
[00:05:46.200 --> 00:05:46.600] Yes.
[00:05:46.600 --> 00:05:48.440] Well, you know, it's a fascinating history.
[00:05:48.440 --> 00:05:54.680] I've had a number of authors on the show about books about MKUltra and CIA mind control and all that stuff.
[00:05:54.680 --> 00:05:56.360] It's so interesting.
[00:05:56.680 --> 00:06:07.880] And, you know, there's this whole thing, like Tom O'Neill's Chaos, which was just made into a Netflix documentary series, I think: you know, was Manson part of the whole CIA MKUltra mind control program?
[00:06:07.880 --> 00:06:12.600] How did he get those young women to commit murder when he wasn't even there and so on and so forth?
[00:06:13.240 --> 00:06:23.400] There does seem to still be a mystery that's unsolved, namely, how do you get people to do things that on the surface they would not normally do?
[00:06:23.400 --> 00:06:24.120] Yeah, I agree.
[00:06:24.120 --> 00:06:26.360] There does seem to be a mystery.
[00:06:26.360 --> 00:06:40.760] And I think, I mean, I don't think maybe people growing up today necessarily know the story of Manson or the fact that he didn't actually commit the murders, but yet people acted as if they were extensions of his will.
[00:06:40.760 --> 00:06:52.840] But also, you know, there's evidence in how they spoke; they said, certainly he may be a prophet, and I'm following in his footsteps, but I may be a prophet too.
[00:06:52.840 --> 00:07:07.160] So his followers were strangely not simply robotic, but also seemed to be empowered in some way, or enjoying themselves, which is very strange to even contemplate.
[00:07:07.160 --> 00:07:18.160] So, every time I've taught the class on brainwashing at my university, students do seem to, at least they usually seem like they have just a question.
[00:07:18.880 --> 00:07:19.920] You know, what is this?
[00:07:19.920 --> 00:07:22.400] Or is this a how-to class?
[00:07:22.400 --> 00:07:26.560] Or what is even the question at the heart of this?
[00:07:27.200 --> 00:07:31.840] Yeah, I guess at some level they're probably asking themselves, would I fall for something like that?
[00:07:32.320 --> 00:07:38.800] You know, it's like when you show the Milgram shock experiment stuff, you know, they're thinking, I would never do this.
[00:07:39.120 --> 00:07:43.200] Yeah, very few people immediately identify that they would,
[00:07:43.200 --> 00:07:44.720] that they know they would.
[00:07:44.720 --> 00:07:50.320] And if they do, either they're being brave or, you know, I don't know.
[00:07:50.880 --> 00:07:51.920] It's just really rare.
[00:07:51.920 --> 00:07:55.920] I know one writer said that she knew she would have delivered shocks.
[00:07:55.920 --> 00:07:59.120] But yeah, it poses the same kind of question, the Milgram experiments.
[00:07:59.120 --> 00:07:59.920] What would I do?
[00:07:59.920 --> 00:08:08.880] Are there situations where, I mean, it's also the fascination with cult documentaries and stories of elaborate financial scams.
[00:08:08.880 --> 00:08:16.480] We always want to reassure ourselves that we wouldn't take part or we wouldn't fall for that or we wouldn't do that.
[00:08:17.120 --> 00:08:19.600] Have you ever fallen for anything like that?
[00:08:19.600 --> 00:08:23.360] Any personal experiences that made you interested in this topic?
[00:08:23.360 --> 00:08:37.440] Yeah, I mean, I have many. I guess the one I mentioned in the book is just the experience in graduate school of kind of falling into a circle.
[00:08:37.440 --> 00:08:44.000] So there's the intellectual level, and then there are deeply personal levels, and then there are seemingly trivial levels.
[00:08:44.000 --> 00:08:59.560] But on the intellectual level, during graduate school, I fell into a crowd of scholars, and my dissertation advisor was quite famous for his relationship with Michel Foucault.
[00:08:59.440 --> 00:09:01.960] And I learned a lot from learning from him.
[00:09:02.120 --> 00:09:13.640] But these kinds of post-structuralist theorists were in vogue at the time at UC Berkeley, and I was just in a circle where people were reading them as if they were fragments of sacred texts.
[00:09:13.640 --> 00:09:16.360] And they would be, you know, I was falling into this too.
[00:09:16.360 --> 00:09:20.680] I thought if I could just read these... they were very hard to understand.
[00:09:20.680 --> 00:09:28.840] So you would take them home and puzzle over what this Jacques Lacan could have meant here, and bring it back to the group, and people would disagree.
[00:09:28.840 --> 00:09:45.640] I remember a moment: I was meeting with my advisor, and a graduate student ran in, or actually a professor I was taking a class with, brandishing one of these books and waving it around like he was intoxicated.
[00:09:45.640 --> 00:09:47.240] And he said, it is not about truth.
[00:09:47.240 --> 00:09:48.280] It is about love.
[00:09:48.600 --> 00:09:51.560] And then she yelled back, no, it is about truth.
[00:09:51.560 --> 00:09:52.840] And he said, no, love.
[00:09:53.160 --> 00:09:55.880] And it was like a movie.
[00:09:55.880 --> 00:09:56.840] It was very funny.
[00:09:56.840 --> 00:10:01.960] And I don't suppose they saw it quite that way; you know, you get carried away.
[00:10:01.960 --> 00:10:10.120] And anyway, as it relates to myself, I wrote a paper for which I received a lot of praise.
[00:10:10.120 --> 00:10:15.080] It seemed at the very edge of being understandable, yet maybe profound.
[00:10:15.080 --> 00:10:16.920] And I eventually got it published.
[00:10:16.920 --> 00:10:23.880] But some friend of mine who was a journalist read it and he said, I don't even recognize your writing here.
[00:10:23.880 --> 00:10:25.560] I don't know, like, who is this?
[00:10:25.560 --> 00:10:27.400] Who is this person writing?
[00:10:27.400 --> 00:10:33.720] He said that it just seemed not like me and not really completely understandable.
[00:10:34.120 --> 00:10:41.480] He was kind of a stickler for not necessarily easy writing, but writing that you could really get to the bottom of.
[00:10:41.480 --> 00:10:53.440] So, anyway, that was a little bit of a wake-up call where I saw that, you know, at certain moments you can get carried away, you can become distant from maybe yourself.
[00:10:53.440 --> 00:11:20.560] And, you know, the more personal part that happened around the same time was just that I fell into a kind of coercively controlling relationship that had to do somewhat with drug addiction, because, I think I felt at the time that I had disappointed my family and I basically wanted to check out, and I thought, I can handle this.
[00:11:20.560 --> 00:11:25.440] This will just be a little bit, I'll just take a few steps in this direction.
[00:11:25.440 --> 00:11:35.600] And it turned into a couple of years of just a very painful experience where the deeper I got into it, the less I was able to get out.
[00:11:35.600 --> 00:11:37.600] And I lost perspective on it.
[00:11:37.600 --> 00:11:54.560] I really thought I no longer deserved, you know, to maybe even survive this addiction and this relationship where I constantly felt that I wasn't measuring up to the standards this person was constantly illuminating.
[00:11:54.560 --> 00:12:08.320] So that was, I guess, one thing that added to my research; just the fortune of getting out of that difficult situation gave me a lot of insight, you know, on the personal level.
[00:12:08.320 --> 00:12:15.360] It reminded me recently, my daughter took a class on the poetry, the literature of Dante.
[00:12:15.360 --> 00:12:24.960] And he, at the beginning of his great work, he talks about how in the middle of our life's journey, I found myself in a dark wood, the right road lost.
[00:12:24.960 --> 00:12:32.200] And I thought, wow, that is it. I mean, for Dante, apparently, the middle of our life's journey was your late 20s or early 30s.
[00:12:32.440 --> 00:12:39.480] And, you know, so I did find myself at that point lost, you know, the right road lost.
[00:12:39.480 --> 00:12:46.200] And to have that experience made me wonder, how can we, you know, how does that happen?
[00:12:47.800 --> 00:12:48.840] Really fascinating.
[00:12:48.840 --> 00:12:51.560] Was this a romantic relationship, I presume?
[00:12:51.560 --> 00:12:53.880] And how did you get out of that?
[00:12:54.200 --> 00:13:15.000] Well, I was just very fortunate, because, as is often the case in these kinds of, I don't know, let's just say bad relationships, you go in thinking it's fine, and then you lose touch with friends, you lose touch with family, and you lose the relationships through which you can test the sanity of it.
[00:13:15.000 --> 00:13:21.480] And after, so I was kind of lost in this for a time.
[00:13:21.480 --> 00:13:26.280] And I happened to have a coffee date with one of my few remaining friends.
[00:13:26.280 --> 00:13:31.720] This is kind of funny, but I mean, I feel so grateful to her.
[00:13:31.720 --> 00:13:46.040] My friend Nikki, anyway, I went out for coffee with her, and she just said, you know, this person acts like he's better looking, smarter, and much funnier than you, but he's none of those things.
[00:13:46.040 --> 00:13:52.920] And somehow it was like this wake-up call that the reality this person had painted wasn't the only reality.
[00:13:52.920 --> 00:14:02.520] And also my, you know, my capacities were diminished by the kind of, you know, the effects of drugs as well.
[00:14:02.520 --> 00:14:06.760] So just having somebody else's perspective was useful.
[00:14:06.760 --> 00:14:22.240] Yeah, reality check is super important, which is one reason why cults, one of the characteristics of cults, is they get you to leave your family or disconnect from your friends and colleagues or anybody that could give you a reality check who might say, you know, are you sure this is the right thing to do?
[00:14:22.560 --> 00:14:44.720] Sometimes even children: control over one's children is a very, shall I say, effective or malevolent aspect of abusive cults, in that you have to surrender your biological children, supposedly for their own good, but it creates a lot of control, guilt, and emotional attachment.
[00:14:44.720 --> 00:14:57.760] Yeah, the drug use, that's interesting because that's one of the things Manson did with the young women was, you know, they did a lot of LSD, as you wrote about in your book, and other drugs, and also a lot of sex.
[00:14:58.560 --> 00:15:06.880] You know, I don't think you need MK Ultra mind control technology to explain Manson's influence over these followers of his.
[00:15:07.280 --> 00:15:15.360] I think, you know, just good old human nature and how people react that way in response to those sorts of things explains a lot.
[00:15:15.360 --> 00:15:21.760] As you were talking, I was thinking of, you know, is postmodernism in that sense, a kind of a cult?
[00:15:21.760 --> 00:15:30.960] Not really a cult, but, you know, along the lines of what you're talking about, you fall into a worldview in which everybody you know is kind of steeped in the language.
[00:15:30.960 --> 00:15:36.000] And the language of postmodernism, which I've, you know, I try, I try to read those things.
[00:15:36.000 --> 00:15:45.440] And, you know, it's a little bit like in UFO circles, we have a thing called low information zone, or you could apply this to Bigfoot or whatever.
[00:15:45.680 --> 00:15:48.240] You know, the photos are always kind of grainy, blurry.
[00:15:48.240 --> 00:15:50.560] You can't quite make out what's going on.
[00:15:50.560 --> 00:15:52.080] And that's where people kick in.
[00:15:52.080 --> 00:15:56.240] Well, see, if you kind of use your imagination and squint, you can sort of see right there.
[00:15:56.240 --> 00:15:58.160] It's like, I don't really see that.
[00:15:58.160 --> 00:16:00.760] But that's where the imagination then kind of fits in.
[00:16:01.000 --> 00:16:13.240] Maybe, you know, this kind of intentional obscurantism of that kind of writing is meant to do that, to make it a low information zone, so that therefore you can fill in the blanks however you want.
[00:16:13.240 --> 00:16:20.840] Or you're sort of elaborately instructed in how one should be filling in those blanks or the direct, at least the direction one should be going.
[00:16:20.840 --> 00:16:30.120] But it also reminds me: you know, there is an obscurantism in many beautiful things like poetry.
[00:16:30.120 --> 00:16:37.800] And, you know, who would want to deny themselves the ability to read into or to, you know, a poem shouldn't tell you what it means.
[00:16:37.800 --> 00:16:41.000] It should allow the reader to find those things.
[00:16:41.000 --> 00:16:43.000] So, but I think it can be abused.
[00:16:43.000 --> 00:16:51.560] So, I guess I found, you know, now with several decades behind me, I find that I can appreciate some of these writings.
[00:16:52.200 --> 00:16:55.080] Some of them I like still, some of them I don't like.
[00:16:55.080 --> 00:16:59.880] It's sort of like poets I like, poets I don't like, but it's more in that framework.
[00:16:59.880 --> 00:17:24.280] And I can also see that at some moments they could definitely become instruments of a kind of, not nefarious, I wouldn't call it that, but you could say a kind of social circle where maybe it didn't lead to really interesting work necessarily, with a lot of mimicry and kind of hyper-policing of each other's behavior and thoughts a little bit.
[00:17:24.280 --> 00:17:25.480] That could happen.
[00:17:25.800 --> 00:17:26.520] Yeah.
[00:17:26.840 --> 00:17:34.520] From this relationship you were in, then let's extrapolate to talking about why people stay in abusive relationships at all.
[00:17:34.520 --> 00:17:46.800] Or, you know, we just had the Diddy trial where, in his defense, his attorneys were saying, well, look, here's the text where she's saying, oh, I can't wait to freak off with you this weekend.
[00:17:47.120 --> 00:17:48.880] So, I mean, why is she doing that?
[00:17:48.880 --> 00:17:55.600] Why is she even saying that if she's under his control and influence and threat and so on?
[00:17:55.600 --> 00:17:58.000] Yeah, I don't know, maybe speak to that a little bit.
[00:17:58.000 --> 00:18:12.720] Yeah, I suppose if I hadn't been through this relationship myself in the past... no, I think there was something there that worked very much against the prosecution; it didn't work at trial.
[00:18:12.960 --> 00:18:27.200] I mean, the fact that there was a lot of evidence or in the in the words of the women themselves, that they seem to be enthusiastic participants in their own abuse, and that that could then be portrayed as, well, this young woman just liked sex, good for her.
[00:18:27.200 --> 00:18:41.600] That's what the attorney, the defense attorney, Marc Agnifilo, who also defended Keith Raniere, argued; he was quite successful in this case because, you know, people say, well, they were grown women, this is just something they were choosing.
[00:18:41.600 --> 00:18:47.760] So it's tricky, when you're in a legal context, to make these things stick.
[00:18:47.760 --> 00:18:57.280] And I think the history really shows that questions of mind control, brainwashing, and coercive control are very difficult to litigate in court.
[00:18:57.280 --> 00:19:08.480] And that's why they often have to make use of RICO and racketeering charges and other types of financial improprieties, which often accompany sexual abuse.
[00:19:08.480 --> 00:19:12.480] But personally speaking, I could relate to the women.
[00:19:12.480 --> 00:19:26.720] I mean, you know, I remember for years constantly flattering my boyfriend, because that was the way the relationship was set up: he was great.
[00:19:26.720 --> 00:19:28.480] You know, he was just so great.
[00:19:28.480 --> 00:19:32.920] And maybe one day the world would recognize how great he was.
[00:19:33.240 --> 00:19:37.880] But I could provide a constant stream of like reinforcement for him.
[00:19:37.880 --> 00:19:42.120] Oh, you know, you haven't written your dissertation yet, but when you do, it's going to be awesome.
[00:19:42.120 --> 00:19:46.360] And he could at the same time provide a constant stream of denigration to me.
[00:19:46.360 --> 00:19:48.120] So that was kind of the bargain we had.
[00:19:48.120 --> 00:19:52.200] And looking back, and I could think for years about why that happened.
[00:19:52.200 --> 00:19:53.000] What was it?
[00:19:53.320 --> 00:20:00.760] You know, it's provided a lot of opportunity for insight, and it never went to the level of me having to prove this in court.
[00:20:00.760 --> 00:20:08.920] But, you know, because there was ultimately no crime, it was just a lot of suffering.
[00:20:08.920 --> 00:20:12.040] And I feel a lot of compassion for the women, you know.
[00:20:12.680 --> 00:20:13.640] Yeah, for sure.
[00:20:13.640 --> 00:20:22.520] Well, you know, there are a couple of other phenomena, loss aversion and the sunk cost fallacy, where, you know, you're this far down the relationship.
[00:20:22.680 --> 00:20:23.960] I hate to give up now.
[00:20:23.960 --> 00:20:25.400] Maybe it'll work out.
[00:20:25.400 --> 00:20:30.920] You know, it's hard to cut your losses and get out like in investments or whatever.
[00:20:30.920 --> 00:20:33.640] And it's hard in relationships in the same way.
[00:20:33.640 --> 00:20:34.040] Yeah.
[00:20:34.040 --> 00:20:41.880] I mean, then they're just, I think people who do end up leaving abusive cults do describe a kind of moment.
[00:20:41.880 --> 00:20:43.720] Sometimes they go on for years.
[00:20:43.720 --> 00:20:53.720] And then, you know, surprisingly, there's just a break, or they sometimes describe it as snapping, where the whole system of belief suddenly shatters.
[00:20:53.720 --> 00:21:02.520] And it can be through maybe an accident or a shocking encounter with an old friend, or seeing a family member.
[00:21:02.520 --> 00:21:05.240] Or actually, this is a great example.
[00:21:05.240 --> 00:21:10.600] I was being interviewed by a cult therapist who's worked for decades with cult members.
[00:21:10.600 --> 00:21:25.680] And she was saying she has a client who was in QAnon, and he had been just a mainstream Democrat for many years, and then had fallen into this QAnon circle and gotten really pulled in during COVID.
[00:21:25.680 --> 00:21:30.080] And he was then joining demonstrations.
[00:21:30.080 --> 00:21:39.920] He was in Times Square, and he was yelling in the street something like, Hillary Clinton needs to be arrested and shot.
[00:21:39.920 --> 00:21:47.440] And then he turned and saw himself in the window of a building, a mirrored building.
[00:21:47.440 --> 00:21:52.320] So he saw himself and he thought, who's that madman jumping up and down?
[00:21:53.200 --> 00:21:54.960] And he realized it was himself.
[00:21:54.960 --> 00:21:58.240] And that was the shocking moment of self-recognition.
[00:21:58.240 --> 00:22:01.120] But those kind of shocks can occur.
[00:22:01.120 --> 00:22:10.480] And many coercive groups, coercively controlling groups, have ammunition that they then provide that you should apply when shocked.
[00:22:10.480 --> 00:22:23.600] So they'll tell you to repeat certain words over and over again, or reassure yourself, or know that anyone who challenges you has to be evil, or various other techniques.
[00:22:23.920 --> 00:22:27.360] Don't have mirrors in your cult room.
[00:22:28.000 --> 00:22:28.640] Avoid mirrors.
[00:22:28.800 --> 00:22:29.600] Who is that madman?
[00:22:29.600 --> 00:22:30.800] Oh, it's me.
[00:22:31.520 --> 00:22:39.760] Yeah, well, the Keith Raniere NXIVM thing is interesting because it surely did not start off as a cult.
[00:22:39.760 --> 00:22:43.360] And in fact, I'm fond of saying no one in the history of the world has ever joined a cult.
[00:22:43.360 --> 00:22:46.160] You know, they join a group that they think is going to be really good.
[00:22:46.160 --> 00:22:47.360] We're going to help the poor.
[00:22:47.360 --> 00:22:54.880] I mean, you look at Jim Jones, you know, migrated from Indiana to San Francisco Bay Area in the 60s.
[00:22:54.880 --> 00:22:59.600] You know, one of the first to integrate a church instead of just being all white or all black.
[00:22:59.600 --> 00:23:00.440] It was both.
[00:23:00.440 --> 00:23:07.080] And, you know, there's pictures of him with Jerry Brown, Governor Jerry Brown, in his first round as governor of California.
[00:23:07.080 --> 00:23:10.440] That's very innovative and, you know, very progressive.
[00:23:10.440 --> 00:23:12.200] You know, we're going to man the soup kitchens.
[00:23:12.200 --> 00:23:13.240] We're going to help poor people.
[00:23:13.240 --> 00:23:14.120] This is all good.
[00:23:14.200 --> 00:23:17.160] You know, who would not want to join a helpful group like that?
[00:23:17.160 --> 00:23:19.480] You know, and then 20 years later, you're in Guyana.
[00:23:19.480 --> 00:23:19.880] Okay.
[00:23:20.520 --> 00:23:27.720] You know, so it's a kind of a gradual, slow, you know, 15 volts at a time kind of escalation game that happens there.
[00:23:27.720 --> 00:23:31.080] But this brings to mind a deeper question I have.
[00:23:31.560 --> 00:23:35.720] This relates to what Danny Kahneman called base rate neglect.
[00:23:35.720 --> 00:23:39.400] That is, you know, you say, here's this interesting phenomenon.
[00:23:39.400 --> 00:23:47.880] You know, here's, you know, the 15 women that joined NXIVM and had his initials branded in their crotch or something like that.
[00:23:47.880 --> 00:23:50.360] And so we focus on that, but what's the base rate?
[00:23:50.360 --> 00:23:58.280] How many women joined NXIVM as a kind of self-help, empowering women in business group?
[00:23:58.280 --> 00:24:06.760] You know, maybe it's hundreds or thousands or whatever, and they got 15 of them, or like that Netflix series, The Tinder Swindler.
[00:24:06.760 --> 00:24:13.320] You know, they had four women that they kind of focused on that fell for this guy and wired him tens of thousands of dollars and so on.
[00:24:13.320 --> 00:24:15.480] But how many women did he try it out on?
[00:24:15.480 --> 00:24:17.160] You know, they don't mention that.
[00:24:17.160 --> 00:24:18.120] What's the base rate?
[00:24:18.120 --> 00:24:21.880] I mean, is it four out of a thousand, four out of a hundred, four out of forty?
[00:24:22.440 --> 00:24:23.480] You know, I don't know.
[00:24:23.480 --> 00:24:25.560] And so how do we think about that?
[00:24:25.880 --> 00:24:30.760] Well, in the case of NXIVM, that is a really great question to ask.
[00:24:30.760 --> 00:24:52.800] And I think that's one reason why there is no absolute list of abusive cults, because many people skate along on the periphery of what is by almost all accounts a dangerous cult, but they're not harmed by it, because maybe they didn't get that involved, or they didn't go live in the compound, or they just took yoga classes once a week or something like that.
[00:24:52.800 --> 00:25:06.000] So it's hard to say, but I did have a very illuminating interview with Sarah Edmondson, who is one of those women, one of the 15; she was one of the first to be branded on her leg.
[00:25:06.000 --> 00:25:09.840] And actually, she's a fascinating person.
[00:25:09.840 --> 00:25:24.640] And she said herself she was happy they didn't go all in, because she and her husband never went to live in the compound in Albany, so they maintained for themselves some arena of autonomy.
[00:25:24.640 --> 00:25:27.760] She worries about what would have happened; things, you know, could have been worse.
[00:25:27.760 --> 00:25:43.280] But she did say at the moment she was receiving the brand on her body, they were laughing and they were saying, if somebody walked in right now, they might think we're in a cult, but haha, obviously we're not.
[00:25:43.280 --> 00:25:48.400] So it's so interesting what we're capable of under incredible duress.
[00:25:48.400 --> 00:25:50.880] And she said she's had years of therapy.
[00:25:50.880 --> 00:25:53.840] Cheap Caribbean vacations.
[00:25:53.840 --> 00:25:56.240] Get more sun for your dollar.
[00:25:56.560 --> 00:26:02.400] Vacation planning should feel like a breeze, not a deep dive into countless travel sites searching for the best deal.
[00:26:02.400 --> 00:26:08.480] With Cheap Caribbean's Budget Beach Finder, get every destination and every date, all in one search.
[00:26:08.480 --> 00:26:14.880] Go to cheapcaribbean.com and try the Budget Beach Finder and see how stress-free vacation planning should be.
[00:26:14.880 --> 00:26:17.880] Cheap Caribbean vacations.
[00:26:17.880 --> 00:26:20.480] Get more sun for your dollar.
[00:26:21.360 --> 00:26:22.520] To deal with that.
[00:26:22.240 --> 00:26:32.120] But, you know, just looking back at that moment, to hold up a mirror, in a sense, for herself, to try to understand how that could have happened.
[00:26:32.360 --> 00:26:43.960] So, yeah, the question of base rate neglect is also an important one, because in almost every group, not everyone is exposed to the most extreme practices.
[00:26:43.960 --> 00:26:48.280] And also, it happens step by step in a slow manner.
[00:26:48.280 --> 00:26:56.840] So certain people might gravitate to different, you know, echelons of the group.
[00:26:56.840 --> 00:26:58.920] And some people might be enthusiastic.
[00:26:58.920 --> 00:27:03.480] Or sometimes people draw back for a number of years and then they get more involved later.
[00:27:04.440 --> 00:27:18.680] Yeah, like how many people have taken the little personality test of Scientology or they're down on Sunset or Hollywood Boulevard and they have the little table set up there with the cans and you can do the little e-meter thing.
[00:27:18.680 --> 00:27:19.720] Big fun.
[00:27:20.120 --> 00:27:23.960] I mean, probably tens of thousands of people have done it and just walked away.
[00:27:25.240 --> 00:27:29.320] And how many actually take out a second mortgage on their home and give it to the church?
[00:27:29.640 --> 00:27:30.840] Not many.
[00:27:31.160 --> 00:27:34.760] So this, okay, here's the larger picture for me.
[00:27:35.240 --> 00:27:42.200] I've spent 30 plus years here at Skeptic Magazine debunking all manner of weird things that people believe.
[00:27:42.200 --> 00:27:45.480] So it just seems like people are incredibly irrational.
[00:27:45.800 --> 00:27:48.760] But I've been kind of rethinking it in the last couple of years.
[00:27:49.160 --> 00:28:01.000] Hugo Mercier's book, what's it called, Not Born Yesterday, argues that we're not that irrational.
[00:28:01.000 --> 00:28:05.560] Most of the time, most people maintain a fairly rational life.
[00:28:06.280 --> 00:28:11.880] They maintain a job, a family, a marriage, kids, gas in the tank, food in the fridge.
[00:28:11.880 --> 00:28:13.240] They go about their lives.
[00:28:13.240 --> 00:28:14.920] And maybe there's just one quirky thing.
[00:28:15.680 --> 00:28:38.000] They went to the January 6th thing in Washington, D.C., and then it's, I don't know what I was thinking, but otherwise they're normal. Or like the guy that went to the Comet Ping Pong pizzeria in Washington, D.C., Edgar Welch, you know, to break up the pedophile ring that QAnon told him was going on there.
[00:28:38.000 --> 00:28:40.400] And, you know, there's a kind of an underlying rationality.
[00:28:40.400 --> 00:28:50.880] If you really believe there was a pedophile ring and crime and no one was doing anything about it, you know, people do not infrequently take the law into their own hands when there's an injustice.
[00:28:50.880 --> 00:28:56.240] You know, so you could kind of see his reasoning; then he gets there and, like, oh, and then he shoots his gun into the roof.
[00:28:56.240 --> 00:28:57.120] He gets arrested.
[00:28:57.120 --> 00:29:00.880] He spends a couple of years in jail and comes out and goes, I don't know what I was thinking.
[00:29:00.880 --> 00:29:02.080] That was just crazy.
[00:29:02.240 --> 00:29:04.240] Otherwise, you know, perfectly normal.
[00:29:04.240 --> 00:29:16.960] Or there was that woman in Texas we found for this Netflix series I worked on on brainwashing where she went down the QAnon rabbit hole and she was otherwise married with kids, attractive, ran her own PR business.
[00:29:16.960 --> 00:29:20.080] I forget her name now, but she talked us through the whole thing.
[00:29:20.080 --> 00:29:21.440] And this is what I did.
[00:29:21.440 --> 00:29:27.360] You know, with the COVID shutdown, you know, I just spent all day online reading all the QAnon stuff.
[00:29:27.360 --> 00:29:32.400] And before you know it, and then she plays on her phone, the voicemail, her husband left her.
[00:29:32.400 --> 00:29:36.800] I'm taking the kids and leaving you if you don't dump this QAnon craziness.
[00:29:36.800 --> 00:29:39.680] And she said, you know, it came down to this choice.
[00:29:39.680 --> 00:29:43.440] This is the most important thing I will ever have an opportunity to do.
[00:29:43.440 --> 00:29:50.000] I can save the country myself or just be a normal mom, wife, and, you know, that's boring.
[00:29:50.000 --> 00:29:55.600] And then eventually she went back and had her moment of revelation there.
[00:29:55.600 --> 00:29:59.680] But still, it's like, wow, okay, this could happen to anybody.
[00:29:59.880 --> 00:30:28.600] Yeah, I mean, I think this is why there are several stories of researchers going to study something like the Flat Earth Society or QAnon and actually getting converted. You think that would never happen, but actually, you do come to sympathize with the rationality that's being displayed there, because often there is a kind of rigorous step-following.
[00:30:28.840 --> 00:30:30.200] There's a kind of logic to it.
[00:30:30.360 --> 00:30:38.520] I actually, for whatever reason, can often feel interested in the logic, the logical trains people follow to get to a certain point of view.
[00:30:38.520 --> 00:30:44.840] So I'm constantly feeling that just, it's almost like exercise.
[00:30:44.840 --> 00:30:58.280] But I think one of the points I'm trying to draw out from my research is that what we think of as rationality, or reason, is always undergirded by emotion.
[00:30:58.280 --> 00:31:05.560] There is an identification first, and then the rationality or the reasoning can kind of unfold.
[00:31:05.560 --> 00:31:22.440] And that is supported by even, you know, the Enlightenment scholars like David Hume, who talk about how our passions, they are, that our reason is only ever a slave to what we're convicted by, our passions, our emotions.
[00:31:23.080 --> 00:31:25.240] Yeah, that's one of my favorite quotes.
[00:31:26.360 --> 00:31:34.200] You know, what he's saying there is that reason and rationality are the best tools we have for achieving our goals.
[00:31:34.200 --> 00:31:39.400] But what goals you pick, you know, most people, we're not rational about that.
[00:31:39.400 --> 00:31:44.280] You know, I'm a conservative or I'm a liberal because, well, I don't know, these are my people.
[00:31:44.280 --> 00:31:46.880] I just kind of vibe with them.
[00:31:44.840 --> 00:31:51.200] But if you ask me, I'm going to go, well, okay, I'll tell you the six reasons why I'm a conservative.
[00:31:51.600 --> 00:31:55.360] I believe in freedom and religion and faith and family, you know, whatever.
[00:31:55.360 --> 00:31:58.080] I'll give you reasons after the fact.
[00:31:58.240 --> 00:31:59.600] Yeah, I appreciate that.
[00:31:59.600 --> 00:32:08.160] I mean, that reminds me also of one of my favorite passages from Max Weber, where, in Science as a Vocation, he says science can tell you the answer to almost everything.
[00:32:08.160 --> 00:32:10.960] You know, how to do things, what to eat.
[00:32:10.960 --> 00:32:20.800] I mean, just extrapolating, he says it can give you many answers with exactitude, but it can never answer the one question that is important to us.
[00:32:21.680 --> 00:32:25.600] What am I going to do, and how am I to live?
[00:32:25.600 --> 00:32:36.160] So, you know, it can give you the surrounding calculations, not just exactitude but actually rational reasons.
[00:32:36.160 --> 00:32:44.560] But it can't answer the questions that he said, I think, divide every human heart or something like that, that pull at our hearts.
[00:32:44.880 --> 00:32:49.280] Yeah, it would be like trying to make a list of why I love my wife.
[00:32:49.280 --> 00:32:52.160] You know, well, give me the six reasons or whatever.
[00:32:52.400 --> 00:32:55.520] The real answer should be: I just do.
[00:32:55.840 --> 00:32:56.480] That's it.
[00:32:56.480 --> 00:32:57.040] Full stop.
[00:32:57.040 --> 00:32:59.040] Because if I say, well, she's really good looking.
[00:32:59.040 --> 00:33:01.120] She's a, you know, a nine on a scale of 10.
[00:33:01.120 --> 00:33:02.240] Well, what if you met a 10?
[00:33:02.240 --> 00:33:03.040] Would you leave her?
[00:33:03.040 --> 00:33:06.240] Like, well, no, because that's not the reason in the first place.
[00:33:06.480 --> 00:33:06.960] Yeah.
[00:33:09.120 --> 00:33:13.680] So, I mean, those are kind of rational irrationalities.
[00:33:14.000 --> 00:33:18.640] You know, another example of this is: I don't want to know the outcome of the game because I recorded it.
[00:33:18.640 --> 00:33:26.080] So, you know, I choose to remain ignorant so I can watch it later. Or the sign on the Brinks truck that has the money:
[00:33:26.080 --> 00:33:29.880] You know, the driver does not know the combination to the safe, right?
[00:33:29.280 --> 00:33:33.960] So sometimes it's actually better for you to not know things, right?
[00:33:29.760 --> 00:33:38.200] So it's kind of a rational ignorance in a way.
[00:33:38.200 --> 00:33:39.960] But yeah.
[00:33:39.960 --> 00:33:43.720] Yeah, I think that's frequently the case with military operations, right?
[00:33:43.720 --> 00:34:05.000] Because, you know, to go back to the beginning of my book, where I talk about the POWs, what they can actually know should they be captured is often a consideration, because pretty much anyone under torture, if the conditions are applied, will eventually give up the information they have.
[00:34:05.000 --> 00:34:08.200] So it's better if they simply don't know.
[00:34:09.160 --> 00:34:18.040] I mean, this is the ultimate argument against torture or enhanced interrogation or whatever, because people, you're not getting accurate intelligence.
[00:34:18.200 --> 00:34:23.240] People are just going to say whatever they have to say to get you to stop, including confessing crimes, right?
[00:34:23.240 --> 00:34:29.640] I mean, back in the witch trials, yes, I am a witch or, you know, he's a witch, you know, whatever.
[00:34:29.800 --> 00:34:33.720] Just whatever it takes to stop turning the rack further.
[00:34:33.720 --> 00:34:34.440] Yeah.
[00:34:35.080 --> 00:34:35.400] Yeah.
[00:34:35.400 --> 00:34:37.400] It's a significant part of brainwashing.
[00:34:37.400 --> 00:34:50.200] I don't think it's exactly the same, but it does seem to be a huge part of it, especially the generating incidents, the episodes that caused brainwashing to come into the English language.
[00:34:50.360 --> 00:35:02.360] There's a significant amount of torture, just the question of what people will do and say when they find themselves in an impossible situation, a very painful situation, the types of confessions they'll attempt.
[00:35:02.360 --> 00:35:09.080] Like, something interesting: in your chapter that you sent me, there was a mention of Robert J.
[00:35:09.080 --> 00:35:12.040] Lifton and his book on the Nazi doctors.
[00:35:12.040 --> 00:35:17.920] And of course, he's famous too for being one of the first, or he wrote one of the most significant books about thought reform.
[00:35:18.080 --> 00:35:22.800] And some of the cases he describes are so eloquent, so interesting.
[00:35:23.120 --> 00:35:37.680] For example, this French doctor who's caught, you know, he had been living in China as a physician and he didn't want to leave, even though people said when the communists came in, it's probably wise for you to go back to France.
[00:35:37.680 --> 00:35:41.680] But he sent his family back, but he stayed himself.
[00:35:41.680 --> 00:35:54.320] And he, once he was arrested and subjected to intensive thought reform, he was kept in chains and not allowed to any autonomy, you know, personally.
[00:35:54.640 --> 00:35:58.800] But he, and then these long interrogations, he wasn't allowed to sleep.
[00:35:58.800 --> 00:36:02.000] He couldn't even eat.
[00:36:02.000 --> 00:36:08.960] Or when he had to relieve himself, like other prisoners had to unzip him, you know, his pants, or he had to eat off of the floor like a dog.
[00:36:08.960 --> 00:36:11.840] And, you know, all these, and he has just had these very painful chains.
[00:36:11.840 --> 00:36:17.120] So he's, he says, after a while, you just tell yourself, I have to get rid of the chains.
[00:36:17.120 --> 00:36:19.440] I have to get rid of these chains.
[00:36:19.760 --> 00:36:32.240] And so when they ask him to confess to being a spy, initially, you know, a week or so into it, his response is to say, I was a great spy.
[00:36:32.240 --> 00:36:36.480] I did all these things, you know; he makes these sorts of grandiose confessions.
[00:36:36.480 --> 00:36:37.920] And they say, no, we won't accept that.
[00:36:37.920 --> 00:36:39.280] We know that's not true.
[00:36:39.280 --> 00:36:41.360] We know you're guilty, but that's not acceptable.
[00:36:41.360 --> 00:36:42.480] That type of confession.
[00:36:42.480 --> 00:36:45.920] It has to be an empirically grounded confession.
[00:36:45.920 --> 00:37:18.200] And he realizes, it's almost like they collaborate over the next several weeks, or actually, it turns out, a couple of years, to create a seemingly realistic account of all of his actions as he was living in China: who he spoke to, who he met on June 2nd, 1951, that they had a conversation about the price of tea in China, or how shoes were in short supply, or petrol was difficult to find, things like that.
[00:37:18.200 --> 00:37:25.320] And he has to convince himself that he really was acting as a spy from a certain point of view.
[00:37:25.320 --> 00:37:29.560] It's not sufficient just to make up a confession to save himself the pain.
[00:37:29.560 --> 00:37:39.400] He actually has to come to believe it himself to the satisfaction of the authorities for him to be exonerated and ultimately released.
[00:37:40.680 --> 00:37:51.000] Yeah, with that and the other stories that you tell in the book, particularly Patty Hearst, I was thinking about Bob Trivers's theory of deception and self-deception.
[00:37:51.320 --> 00:37:59.960] You know, that we evolved a deception detection device, as it were, neural network or whatever, because we know people lie.
[00:37:59.960 --> 00:38:09.080] So it's good to have some cues to look for if you think somebody's going to cheat you out of something or lie to you because we're a social primate species.
[00:38:09.080 --> 00:38:11.080] We have to get along and so on.
[00:38:11.080 --> 00:38:15.640] But, you know, it's kind of a game theory, a game theoretic analysis.
[00:38:15.640 --> 00:38:22.520] So Bob's theory is that it's not enough to deceive people because they may detect it.
[00:38:22.520 --> 00:38:27.880] You won't give off the cues that they may detect if you believe the lie yourself.
[00:38:28.520 --> 00:38:33.320] And so you just, you know, to really convince other people, you just start really believing it.
[00:38:33.320 --> 00:38:38.600] So I think a lot of cult leaders are, you know, psychics who talk to the dead or astrologers or whatever.
[00:38:38.600 --> 00:38:41.720] I think a lot of them are just messing around, having fun, just trying it out.
[00:38:41.720 --> 00:38:47.760] Yeah, I'll try doing a psychic reading or I'll try some astrology readings just for fun.
[00:38:47.760 --> 00:38:49.600] And then they start to get good at it.
[00:38:49.600 --> 00:38:53.280] And then the more they actually believe, you know, maybe I do have these powers.
[00:38:53.280 --> 00:38:57.360] And then they get more positive feedback from the people they're interacting with.
[00:38:57.360 --> 00:39:01.440] And that makes them, you know, think even more, I can really do this.
[00:39:01.760 --> 00:39:02.160] Right?
[00:39:02.160 --> 00:39:07.920] So maybe Patty Hearst, you know, at some point, she's just thinking, I really have to convince these people.
[00:39:07.920 --> 00:39:13.840] So I'm just going to believe their ideology or whatever, something along those lines in the course of weeks or months.
[00:39:13.840 --> 00:39:27.920] Yeah, I think she said later, in an interview with Larry King after her release, by the time they were done with me, I really was a soldier in the Symbionese Liberation Army.
[00:39:27.920 --> 00:39:52.640] I mean, because she had decided at some point in her captivity and under these extenuating, you know, horrendous conditions, being held in a closet in the dark for 59 days, not being allowed to even see, you know, being blindfolded, not being allowed any autonomy, multiply raped, exposed to propaganda, all these things.
[00:39:52.640 --> 00:40:01.520] She said, once she decided she did want to live, she had to accommodate her thoughts to coincide with theirs.
[00:40:01.520 --> 00:40:05.680] And it wasn't enough because they kept saying, you're not brainwashed, are you?
[00:40:05.680 --> 00:40:08.000] They wanted to find those cues that she was lying.
[00:40:08.000 --> 00:40:12.160] And they're, you know, highly sensitive to that.
[00:40:12.640 --> 00:40:15.120] It's a major thing in the group's life as well.
[00:40:15.120 --> 00:40:18.720] So she had to, she had to actually convert herself.
[00:40:18.720 --> 00:40:24.400] And I think one thing I noticed that came up in the trial is that the experts, including Robert J.
[00:40:24.400 --> 00:40:31.880] Lifton, but especially Louis Jolyon West and others, wanted to insist that she had never been converted.
[00:40:29.840 --> 00:40:34.280] But I think she actually did have to convert herself.
[00:40:34.440 --> 00:40:42.040] It's just that conversion, we mistake it, in my opinion, for something permanent, whereas it has to be perpetually renewed.
[00:40:42.360 --> 00:40:45.720] And that's why Patty Hearst isn't today a Symbionese Liberation Army member.
[00:40:45.720 --> 00:40:48.760] I mean, she just went through a terrible experience.
[00:40:48.760 --> 00:40:49.240] Yeah.
[00:40:49.560 --> 00:40:53.080] Well, but I think the way we think about it is that it's got to be black and white.
[00:40:53.080 --> 00:40:55.320] You're either this or that, and that's that.
[00:40:55.320 --> 00:40:57.960] Well, but, you know, life is just not like that.
[00:40:57.960 --> 00:41:07.480] I mean, we have a set of relatively permanent personality characteristics, like openness to experience, introversion or extroversion, conscientiousness, and so on.
[00:41:07.480 --> 00:41:09.160] But there's a lot of variation there.
[00:41:09.160 --> 00:41:12.040] And it does change throughout the lifetime a little bit.
[00:41:12.040 --> 00:41:16.920] So people are a lot more flexible, I think, than we realize.
[00:41:16.920 --> 00:41:26.440] And again, you have to, you know... so back to your original point, where you start the book with the Korean prisoners of war. What did you call it?
[00:41:26.440 --> 00:41:27.480] The volleyball problem.
[00:41:27.480 --> 00:41:28.840] You know, look, they're playing volleyball.
[00:41:28.840 --> 00:41:29.720] They must be having fun.
[00:41:29.720 --> 00:41:30.840] They must really love this.
[00:41:30.840 --> 00:41:32.520] Well, what else are you going to do?
[00:41:32.520 --> 00:41:34.040] You know, you're captured.
[00:41:34.360 --> 00:41:36.360] You got to kind of go along.
[00:41:36.360 --> 00:41:40.920] And then maybe at some point you think, yeah, you know, this communism thing, this is a really good idea.
[00:41:40.920 --> 00:41:43.560] Maybe, you know, maybe this would be good for America or something like that.
[00:41:43.560 --> 00:41:47.560] And you end up saying some idiotic statement that you later regret.
[00:41:47.560 --> 00:42:02.280] I suppose, too, what I was struck by is, in the volleyball photos, you know, the photos that came out of the POWs playing volleyball in some sandpit, some of them looked a little chunky, like they'd gained weight.
[00:42:02.280 --> 00:42:06.760] And people commented in the U.S., you know, oh, they've put on some weight.
[00:42:06.760 --> 00:42:08.440] They can't possibly have starved.
[00:42:08.440 --> 00:42:10.440] They can't possibly have been maltreated.
[00:42:10.440 --> 00:42:12.840] They can't possibly have been tortured.
[00:42:12.840 --> 00:42:14.520] But all of those things had happened.
[00:42:14.520 --> 00:42:19.280] They just weren't immediately evident and it wasn't legible in these pictures.
[00:42:19.280 --> 00:42:36.240] And it's interesting how readily we do succumb to, I guess, visual evidence, but also, you know, these types of arguments; in a sense, it's an argument about what we can know of somebody.
[00:42:36.240 --> 00:42:55.600] And what I also thought striking, as I studied these materials over the years, is that the men, the POWs returning from Korea, the ones who did return, as opposed to the 21 who decided to go to China, the ones who came back on a ship, ended up writing a poem together, which is captured in a scholarly article.
[00:42:55.600 --> 00:42:59.680] And they said, they basically said, no one will ever understand.
[00:43:01.840 --> 00:43:06.240] They said, yes, we were badly wounded, badly treated, but what does it mean to you?
[00:43:06.240 --> 00:43:19.120] You'll never... in other words, people will never understand what we went through that extracted this seemingly strange, freakish behavior of, you know, us making false confessions or quoting communist propaganda.
[00:43:19.120 --> 00:43:20.800] And this turned out to be the case.
[00:43:20.800 --> 00:43:21.760] They weren't understood.
[00:43:21.760 --> 00:43:23.360] And the same is true of Patty Hearst.
[00:43:23.360 --> 00:43:28.800] She said, I despaired of anyone believing or understanding my experience.
[00:43:28.800 --> 00:43:31.040] And that turned out to be true to this day.
[00:43:31.040 --> 00:43:45.120] You know, I think many people still find it impossible to believe what happened to her, impossible to trust that she wasn't just somehow enjoying this wild ride.
[00:43:45.120 --> 00:43:49.120] There's this whole wild ride theory, that she actually was enjoying what was happening.
[00:43:49.120 --> 00:43:54.000] And it's kind of a volleyball problem attributed to Patty Hearst.
[00:43:54.640 --> 00:44:00.200] Yeah, the famous picture in the bank with her hat and the rifle and all that.
[00:44:00.520 --> 00:44:01.240] Yeah, it's hard.
[00:43:59.440 --> 00:44:03.240] You look at that and go, Well, come on.
[00:43:59.600 --> 00:44:03.560] Yeah.
[00:44:03.880 --> 00:44:12.360] And the prosecuting lawyer said that her movements on the videotape taken by the Hibernia Bank showed she was so agile.
[00:44:12.360 --> 00:44:13.720] She was moving so quickly.
[00:44:13.720 --> 00:44:15.240] She couldn't possibly have been brainwashed.
[00:44:15.240 --> 00:44:20.040] She couldn't have possibly been under their control because she was so alert.
[00:44:20.040 --> 00:44:22.840] But she, you know, she had to make herself so.
[00:44:24.440 --> 00:44:32.040] Yeah, again, the terms brainwashing and mind control are not accurate for the way the mind actually works.
[00:44:32.600 --> 00:44:39.400] It's an idea, which you document went on for half a century, about what we think is possible.
[00:44:39.400 --> 00:44:41.880] It's really not that possible.
[00:44:42.680 --> 00:44:44.840] It depends how you define it, I suppose.
[00:44:45.080 --> 00:44:54.040] I think it's not possible in the way people think, but it's useful, I guess, as history; I'm now in the field of history of science, and we do.
[00:44:54.440 --> 00:45:00.680] I mean, what's interesting is how the word has flourished and how many things get attached to it.
[00:45:00.680 --> 00:45:07.320] And personally, I think it's interesting to use as a window for self-reflection.
[00:45:07.320 --> 00:45:09.800] You know, what do we think it, what do we think it means?
[00:45:09.960 --> 00:45:17.160] I think it is possible for people to be deeply changed by their experiences, sometimes in radical ways.
[00:45:17.160 --> 00:45:18.680] I mean, I've certainly seen that.
[00:45:18.680 --> 00:45:27.000] And you can come up with maybe more neuroscientifically accurate accounts or psychodynamically accurate or things like that.
[00:45:27.000 --> 00:45:32.040] But I guess that's why I do I argue that it is real.
[00:45:33.320 --> 00:45:34.200] Yeah, it's real.
[00:45:34.200 --> 00:45:40.360] I mean, for sure, people go down a rabbit hole and end up somewhere that they never would have anticipated.
[00:45:40.360 --> 00:45:47.840] But it's not like the CIA can just take control of somebody and get them to assassinate the president or something like that.
[00:45:44.760 --> 00:45:51.520] I guess that's, but let me just think about this out loud here.
[00:45:51.760 --> 00:45:54.800] I mentioned working in a Skinnerian lab.
[00:45:55.840 --> 00:46:04.640] Remember when Skinner was experimenting with having pigeons in the nose cone of a missile, you know, pecking at keys to make it, you know, direct correctly?
[00:46:04.640 --> 00:46:06.880] Now we have computers that do this.
[00:46:07.120 --> 00:46:10.400] After 9-11, Richard Dawkins wrote a, I think it was in the.
[00:46:26.720 --> 00:46:27.920] All in one search.
[00:46:27.920 --> 00:46:34.320] Go to cheapcaribbean.com and try the budget beach finder and see how stress-free vacation planning should be.
[00:46:34.320 --> 00:46:37.280] Cheap Caribbean vacations.
[00:46:37.280 --> 00:46:39.920] Get more sun for your dollar.
[00:46:40.320 --> 00:46:52.480] The Spectator, a brilliant op-ed talking about Skinner's experiments and saying, well, you know, if only we could get, you know, people to do this kind of thing.
[00:46:52.480 --> 00:46:54.000] Well, there's no way we could get somebody.
[00:46:54.160 --> 00:46:57.840] Well, I wonder if we could get somebody to like fly a plane into a building.
[00:46:57.840 --> 00:47:00.560] Well, no, no, there's no way anybody would do that.
[00:47:00.560 --> 00:47:05.040] You know, and then, of course, his point is that religion does that.
[00:47:05.600 --> 00:47:08.480] Religion could get people to do these sorts of things.
[00:47:08.640 --> 00:47:13.680] So that brings to mind, you know, the difference between a cult and a new religious movement and religion.
[00:47:13.680 --> 00:47:18.640] You know, how do you think about that kind of the spectrum of different beliefs there?
[00:47:19.280 --> 00:47:21.920] I think it's a really rich territory.
[00:47:21.720 --> 00:47:23.040] I I'll come back to that.
[00:47:23.040 --> 00:47:29.440] Just one small comment, I can't resist, because I teach sections about B.F.
[00:47:29.440 --> 00:47:32.520] Skinner in my class, and I love to teach the Operation.
[00:47:33.160 --> 00:47:50.920] I don't know if he called it Operation Pigeon, but the part you're describing where he actually trained pigeons to guide missiles during World War II, around 1942, so that they could connect with a target.
[00:47:50.920 --> 00:47:54.200] And he was successful.
[00:47:55.000 --> 00:48:04.680] The military came in and toured his facility, and the pigeons, of course, would end up dying, but it would be, you know, would be a worthy sacrifice as he saw it.
[00:48:04.680 --> 00:48:10.760] So it was actually an interesting project because it actually worked.
[00:48:10.760 --> 00:48:14.760] And radar had not yet been developed, but it was concurrently being developed.
[00:48:14.760 --> 00:48:17.800] So it's just the fact that radar came in around that time.
[00:48:17.800 --> 00:48:29.720] But there are interesting accounts of when high brass military came to tour his facility and they were expecting this high-tech operation and they saw the pigeons and they just were, they just couldn't deal.
[00:48:29.720 --> 00:48:37.720] It wasn't the image that they wanted of a strong nation, you know, able to prevail using the latest technology.
[00:48:37.720 --> 00:48:45.080] It turned out to be just these homegrown birds that were actually quite I mean, the heroism of pigeons is a whole other story.
[00:48:45.080 --> 00:48:55.720] So anyway, I always just find that amazing, that they rejected it not because it didn't work, but because it didn't symbolically fit.
[00:48:55.720 --> 00:48:56.280] Interesting.
[00:48:56.280 --> 00:48:57.960] Then also radar then came along.
[00:48:57.960 --> 00:48:59.880] So all right.
[00:48:59.880 --> 00:49:00.840] I did not know that.
[00:49:00.840 --> 00:49:02.360] That's super interesting.
[00:49:02.360 --> 00:49:07.400] Yeah, there's a wonderful article about this, but about religion.
[00:49:08.040 --> 00:49:14.760] So, new religious movements is a really interesting area, and as well as the history of religion.
[00:49:15.360 --> 00:49:31.440] And I just spent two weeks in Amsterdam at the University of Amsterdam in a course that was kind of about the history of human consciousness, psychedelic research, and also the history of religion, because it was taught by religious studies scholars.
[00:49:31.440 --> 00:49:37.200] And it really gave me some insight in a way into how they look at this terrain.
[00:49:37.200 --> 00:50:00.080] I mean, I became interested, in researching my book, in these cult wars back in the 80s, in which religious studies scholars and religious historians took great issue with the cult research and with the sociologists and psychiatrists, who tended to focus on the really pathological groups that get a lot of attention.
[00:50:00.080 --> 00:50:05.360] And as you're saying, they were focusing on the worst outcomes of those groups.
[00:50:06.080 --> 00:50:15.520] And so these scholars were saying we need to look more broadly at the history of religion and see that the literal definition of cult is merely a small religious group.
[00:50:15.520 --> 00:50:17.680] I mean, Christianity was a cult.
[00:50:17.680 --> 00:50:20.800] It became this big, then it grew into a religion.
[00:50:20.800 --> 00:50:22.320] This is kind of how things develop.
[00:50:22.320 --> 00:50:28.080] And we've pathologized the very meaning of cult, and they thought that was very unfair too.
[00:50:28.080 --> 00:50:32.480] That's why they came up with the term new religious movements to distance it.
[00:50:32.480 --> 00:50:53.360] And so they started to also systematically take down the leading scholars of cult research, such as Margaret Singer, who had been, who had really developed these vibrant careers testifying in court about brainwashing and the dangers of cults.
[00:50:53.440 --> 00:50:56.240] So, anyway, I don't have an absolute answer.
[00:50:56.240 --> 00:51:00.040] I think it's really profound.
[00:50:58.960 --> 00:51:02.040] I mean, many profound questions are raised.
[00:51:02.200 --> 00:51:08.200] Like, you were bringing up the question of gurus and how they may come to believe their own hype.
[00:51:08.200 --> 00:51:13.800] Like, they start, they get into astrology, they get into whatever it is, whatever.
[00:51:13.800 --> 00:51:18.520] Maybe they're using meditation, chanting, group techniques.
[00:51:18.520 --> 00:51:21.160] They're cultivating bonds among their followers.
[00:51:21.160 --> 00:51:25.880] They're also cultivating their own magnetism and charisma.
[00:51:27.080 --> 00:51:28.600] I think they do have these.
[00:51:28.760 --> 00:51:43.080] I think that often, as part of meditation or part of these practices, they are having these, what do you call them, out-of-body or sort of altered states of consciousness that lead them to believe that they're somehow godlike.
[00:51:43.080 --> 00:51:55.960] And then that leads them to believe that they're somehow beyond good and evil, or that whatever they do is righteous.
[00:51:55.960 --> 00:52:00.200] And then the group seems to, their followers seem to reinforce that.
[00:52:00.200 --> 00:52:03.000] And that's how a group can kind of pinwheel.
[00:52:03.000 --> 00:52:21.640] So I think the territory is much less clear than we think, and it's much harder to easily demarcate whether over time or within a group itself, what is pathological or even, you know, we would have to say evil in something like the Jim Jones cult.
[00:52:21.960 --> 00:52:22.760] Yeah.
[00:52:22.760 --> 00:52:29.560] Yeah, it's very spectrum-y, and I think it depends very much on what actually happens in a group.
[00:52:29.800 --> 00:52:35.080] Most of the time, you know, nothing bad is particularly happening, but we focus on the negative things.
[00:52:35.320 --> 00:52:38.680] It's a little bit like the demarcation problem between science and pseudoscience.
[00:52:38.680 --> 00:52:43.240] Well, there's no single definition that's going to cover all specific cases.
[00:52:43.240 --> 00:52:49.040] You know, Scientology, for the most part, I think of it as pretty cult-y, you know, but I've met Scientologists.
[00:52:49.040 --> 00:52:58.640] I met Isaac Hayes, the great, you know, singer, songwriter, the voice of Chef on South Park and so on.
[00:52:58.640 --> 00:53:00.960] He was at my best friend's Thanksgiving dinner.
[00:53:00.960 --> 00:53:03.440] He's sitting right across from me for hours.
[00:53:03.440 --> 00:53:04.480] Super nice guy.
[00:53:04.800 --> 00:53:11.040] And he had just had his career rejuvenated, and he was famously a Scientologist.
[00:53:11.040 --> 00:53:16.400] So I, as kindly and respectfully as I could, you know, what is it you get out of Scientology?
[00:53:16.400 --> 00:53:22.240] And he said, well, Michael, I was at the pinnacle of, you know, Hollywood music, the whole thing.
[00:53:22.240 --> 00:53:23.280] I had everything.
[00:53:23.280 --> 00:53:26.880] And I just lost it all, you know, basically sex, drugs, rock and roll, the whole thing.
[00:53:26.880 --> 00:53:27.840] Gone.
[00:53:27.840 --> 00:53:32.560] And I was at the bottom of my life, you know, and Scientology saved me.
[00:53:33.120 --> 00:53:33.760] Okay.
[00:53:34.080 --> 00:53:36.320] I mean, what am I going to say to that?
[00:53:36.320 --> 00:53:36.720] Nothing.
[00:53:36.720 --> 00:53:37.680] That's good.
[00:53:37.680 --> 00:53:38.240] Right?
[00:53:38.560 --> 00:53:42.400] And more recently, you know, Ayaan Hirsi Ali became a Christian.
[00:53:42.400 --> 00:53:47.520] You know, she was famously one of the, you know, the new atheists, and now she's a Christian.
[00:53:47.520 --> 00:53:48.720] What happened?
[00:53:48.720 --> 00:53:51.360] Well, you know, there's a lot of personal stuff, she said.
[00:53:51.360 --> 00:54:01.920] She's written about this now and spoken publicly about it, that, you know, there were issues with alcoholism and drugs and suicidal ideation.
[00:54:02.240 --> 00:54:06.960] You know, she's lived under a death threat, you know, the threat of being murdered since the 90s.
[00:54:07.440 --> 00:54:10.640] She lives with 24-hour protection.
[00:54:10.640 --> 00:54:14.880] You know, I've done a number of public events with her, and I hardly ever see her.
[00:54:14.880 --> 00:54:16.320] She's in the green room.
[00:54:16.320 --> 00:54:17.680] There's guards at the door.
[00:54:17.680 --> 00:54:20.240] I can't, you know, can't go in and see her, right?
[00:54:20.240 --> 00:54:22.880] And, you know, she has to leave immediately afterwards.
[00:54:22.880 --> 00:54:24.800] She doesn't get to go to the after-party and have fun.
[00:54:24.800 --> 00:54:27.840] She's with an armed guard because somebody will try to kill her.
[00:54:27.840 --> 00:54:32.360] And then she, you know, got married, had kids, and now she's afraid they'll kill her kids.
[00:54:29.840 --> 00:54:37.320] And this led to this, you know, understandable depression, anxiety.
[00:54:38.040 --> 00:54:41.720] And, you know, then it was Christianity that saved her.
[00:54:42.040 --> 00:54:46.440] And when she tells me the story, it's like, okay, I don't want to talk you out of anything.
[00:54:46.440 --> 00:54:50.040] You know, I'm an atheist, but you know, life is hard.
[00:54:50.040 --> 00:54:54.840] I can't imagine what it's like to live under a death threat like that.
[00:54:54.840 --> 00:54:58.120] You know, it's in a way of saying, you know, religion is good when it does good.
[00:54:58.120 --> 00:54:59.320] It's bad when it does bad.
[00:54:59.320 --> 00:55:01.400] It's a lot of different things.
[00:55:01.400 --> 00:55:04.840] And, you know, new religious movements are on the spectrum.
[00:55:04.840 --> 00:55:06.520] They're just smaller, whatever.
[00:55:06.520 --> 00:55:09.320] It just depends on what actually happens.
[00:55:09.640 --> 00:55:12.760] Yeah, I use that rubric in meditation.
[00:55:12.760 --> 00:55:13.960] And I've heard it used.
[00:55:14.040 --> 00:55:15.880] I do a lot of meditating.
[00:55:15.880 --> 00:55:20.440] And I find that, I mean, you can have a range of experiences.
[00:55:20.440 --> 00:55:26.840] These don't necessarily determine the quality or whether it's right, whether it's right for the person.
[00:55:26.840 --> 00:55:32.120] It can actually be, you know, anything can be abused, I guess, is what I've discovered in life.
[00:55:32.120 --> 00:55:41.800] Even something that seems to be so positive and maybe society seems to be so optimistic about meditation or cold plunges or deep breathing or whatever it is.
[00:55:41.800 --> 00:55:44.200] But for me, meditation's been a profound thing.
[00:55:44.200 --> 00:55:51.640] But I think you do have to constantly check, you know, reality check.
[00:55:51.640 --> 00:55:53.160] How is this affecting my life?
[00:55:53.160 --> 00:55:55.160] Are my relationships better?
[00:55:55.160 --> 00:55:58.440] Am I more compassionate generally?
[00:55:58.440 --> 00:56:09.000] Is it shutting down, you know, certain capacities of mine, and is it actually helping me in my life?
[00:56:09.000 --> 00:56:20.000] And I think even that can sometimes be misleading, because initially people can have these amazing experiences, whatever, but you have to keep doing that check over time.
[00:56:20.640 --> 00:56:23.840] Yeah, I had Steve Hassan on the show.
[00:56:23.840 --> 00:56:28.800] Uh, he wrote a book, The Cult of Donald Trump, The Cult of Trump, or whatever it was called.
[00:56:28.800 --> 00:56:31.600] And, you know, I like Steve and his work.
[00:56:31.600 --> 00:56:32.480] I like him.
[00:56:32.480 --> 00:56:33.680] He's got a great story.
[00:56:33.680 --> 00:56:36.800] His bite model explains some things, I think.
[00:56:36.800 --> 00:56:39.840] But 80 million Americans are in a cult?
[00:56:39.840 --> 00:56:40.960] I mean, really?
[00:56:40.960 --> 00:56:42.080] I mean, come on.
[00:56:42.080 --> 00:56:45.440] You know, this is using the term, I think, too broadly.
[00:56:45.760 --> 00:56:46.480] I agree.
[00:56:46.480 --> 00:56:52.160] I mean, well, that's why if people want to pursue that question, I refer them to Steve's book.
[00:56:52.160 --> 00:56:55.360] But I personally think it's, what do you even say?
[00:56:55.360 --> 00:57:03.920] I've heard other people say, okay, 60 million, 80 million, no, everyone who voted for Donald Trump is somehow in a cult or brainwashed.
[00:57:03.920 --> 00:57:07.440] And then others will say, well, therefore, they all need to be deprogrammed.
[00:57:07.440 --> 00:57:10.480] I mean, this leads us down a hideous path.
[00:57:10.480 --> 00:57:23.920] And it doesn't, that's why I kind of really try to think about brainwashing as to the extent it's a useful mode of inquiry or window.
[00:57:23.920 --> 00:57:29.360] It's to look at oneself and to see that we're all connected.
[00:57:29.360 --> 00:57:37.200] These things don't, you know, they don't happen as like an isolated phenomenon that somehow we just lost half the population or something.
[00:57:37.200 --> 00:57:39.040] But we just, we have nothing to do with that.
[00:57:39.040 --> 00:57:42.080] It's a dynamic that our society is creating.
[00:57:42.080 --> 00:57:47.920] And this is actually part of it to imagine that it's just infected over there or something.
[00:57:47.920 --> 00:57:48.480] Yeah.
[00:57:48.800 --> 00:58:00.840] Let's talk about the Milgram experiments in the larger context of explaining how you get normal civilized Germans to become goose-stepping, you know, exterminating Nazis.
[00:57:59.120 --> 00:58:03.080] You know, I've been long fascinated by this.
[00:58:03.320 --> 00:58:18.840] Your discussion, I thought, was very insightful that the real target here is the people delivering the shocks, and the real response is how uncomfortable we all are watching this unfold and kind of giggling or laughing or kind of squirming.
[00:58:19.480 --> 00:58:22.120] You know, the shocker, the person doing the shocks is squirming.
[00:58:22.440 --> 00:58:24.760] It looks like they don't really want to do it.
[00:58:25.000 --> 00:58:27.480] And we're uncomfortable watching it.
[00:58:27.480 --> 00:58:29.560] What do you take from all that?
[00:58:30.120 --> 00:58:34.200] Yeah, I think these experiments are mysterious.
[00:58:34.200 --> 00:58:40.520] And that's why people are, they created a lot of discomfort in Stanley Milgram's life.
[00:58:40.520 --> 00:58:46.200] And I think they have something to do with the fact that he never got tenure and had to leave Yale.
[00:58:46.200 --> 00:58:48.760] Not that that's the worst thing that can happen to somebody.
[00:58:48.760 --> 00:58:52.360] And he did have a flourishing career at a city college, I think.
[00:58:52.360 --> 00:58:58.520] But he, yeah, but these experiments famously made people uncomfortable because it was hard to interpret them.
[00:58:58.520 --> 00:59:02.440] Also, I think that there are several, I mean, people were uncomfortable.
[00:59:02.440 --> 00:59:04.440] The people giving the shocks were uncomfortable.
[00:59:04.440 --> 00:59:09.560] And even if you mention that over and over again, it somehow disappears from the telling.
[00:59:09.560 --> 00:59:17.000] And when the final figures are given, oh, two-thirds, you know, administered lethal level shocks or thought they were.
[00:59:17.000 --> 00:59:31.240] Suddenly, those people, their experience is erased, the discomfort or even agony they were feeling is erased, and suddenly they're real live Nazis or the Eichmann thesis has been proved.
[00:59:31.720 --> 00:59:42.200] And the, you know, the deeper you go into how the actual experiments were run, you find that Milgram and his graduate students were taking bets on how far people would go too.
[00:59:42.200 --> 00:59:50.560] Like they kind of did treat it as a game sometimes that people would laugh while observing the experiments.
[00:59:50.880 --> 01:00:12.720] But also, I found over the years that students laugh in an uncomfortable way. And even in the one you conducted yourself with NBC Dateline, in the guise of a reality TV show, you see a lot of pained laughter from the people who did comply, who were compliant and who gave shocks.
[01:00:12.720 --> 01:00:14.080] They want to be reassured.
[01:00:14.080 --> 01:00:15.040] They are laughing too.
[01:00:15.040 --> 01:00:19.440] They want to know, oh, this is in the realm of humor.
[01:00:19.440 --> 01:00:33.760] And I think what it is, is that you've just been, it's hard for them to find a foothold from which to describe what just happened to them.
[01:00:33.760 --> 01:00:40.000] And they've been, in a sense, tortured, but it's hard to recognize.
[01:00:40.000 --> 01:00:41.760] It's hard for anyone to recognize.
[01:00:41.760 --> 01:00:50.160] And it only results mostly in the experimenter seeming to have the power of interpreting what's just happened.
[01:00:50.160 --> 01:01:01.360] Anyway, so I try to give a new lens to these experiments, which I do find so interesting just because you can keep looking at them and find more in them.
[01:01:01.360 --> 01:01:23.840] And there's actually a scholar who wrote me from the UK, I'm forgetting his name right now, but he's coming out with a book shortly, studying what the, you know, how Milgram himself described these priming conditions that were, you know, he's doing a close study of almost like an ethnographic study of what happened in the laboratory.
[01:01:24.160 --> 01:01:25.040] Oh, interesting.
[01:01:25.040 --> 01:01:28.640] He's treating it like the subjects were primed to do something.
[01:01:29.160 --> 01:01:33.960] Or was it the agentic explanation Milgram offered?
[01:01:29.760 --> 01:01:39.800] I think he's actually, yeah, I think he's actually talking about binding conditions or something like that.
[01:01:39.800 --> 01:01:41.080] I just started to read it.
[01:01:41.080 --> 01:01:42.600] It's fascinating.
[01:01:42.600 --> 01:01:47.000] But it is, he seemed to find it, I mean, I think there is something to it.
[01:01:47.720 --> 01:01:54.680] Milgram was not, to my knowledge, people didn't take that seriously, his agentic shift.
[01:01:54.680 --> 01:01:58.360] But I think there's also something going on there.
[01:01:58.360 --> 01:02:01.400] Yeah, I mentioned working in that Skinnerian lab for two years.
[01:02:01.400 --> 01:02:10.440] That was under Douglas Navarick, who got his PhD under Douglas Fantino at UC San Diego, who got his PhD under Herrnstein at Harvard.
[01:02:10.440 --> 01:02:13.000] You know, then there's the connection to Skinner.
[01:02:13.000 --> 01:02:17.560] All right, so Doug wrote several papers reinterpreting Milgram.
[01:02:17.560 --> 01:02:21.480] He treated it as an approach avoidance conflict.
[01:02:21.480 --> 01:02:32.440] And that, you know, you have reinforcement and then you have punishment and, you know, you kind of go back and forth, like the rat that gets rewarded for going onto the plate where the cheese is, but then gets a little shocked.
[01:02:32.440 --> 01:02:34.040] So he doesn't want to go, but he wants to go.
[01:02:34.040 --> 01:02:34.760] He doesn't want to go.
[01:02:34.760 --> 01:02:35.720] He wants to go.
[01:02:35.720 --> 01:02:36.920] That was his interpretation.
[01:02:36.920 --> 01:02:39.400] You know, you want to obey the authority.
[01:02:39.400 --> 01:02:44.920] You know, here's a famous scientist at Yale University, or in our case, you know, here's the NBC studios.
[01:02:44.920 --> 01:02:46.760] I want to be part of this reality show.
[01:02:46.760 --> 01:02:47.640] That would be cool.
[01:02:47.640 --> 01:02:49.960] But on the other hand, I have to do this uncomfortable thing.
[01:02:49.960 --> 01:02:50.680] So I want it.
[01:02:50.680 --> 01:02:51.560] I don't want it.
[01:02:52.200 --> 01:02:53.160] That kind of thing.
[01:02:53.160 --> 01:02:58.520] I think there's just many different ways to interpret it because it is somewhat mysterious.
[01:02:58.840 --> 01:03:05.560] And, you know, maybe there's just a variety of the way you know different people respond differently, so they have different motivations.
[01:03:05.560 --> 01:03:08.520] There's not a single theory that's going to explain it.
[01:03:08.520 --> 0
Prompt 2: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
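A minimal post-processing sketch (not part of the original pipeline; the validate_takeaways helper and its behavior are illustrative assumptions) of how the Prompt 2 constraints above, valid JSON, a key_takeaways list, at most three string entries, might be checked in Python before the reply is stored:

import json

def validate_takeaways(raw: str, max_items: int = 3) -> list:
    # json.loads rejects trailing commas and unbalanced braces outright
    data = json.loads(raw)
    takeaways = data["key_takeaways"]
    if not isinstance(takeaways, list) or len(takeaways) > max_items:
        raise ValueError("expected a list of at most %d takeaways" % max_items)
    if not all(isinstance(t, str) and t.strip() for t in takeaways):
        raise ValueError("each takeaway must be a non-empty string")
    return takeaways

# Example: validate_takeaways('{"key_takeaways": ["Takeaway 1"]}') returns ["Takeaway 1"]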
Prompt 3: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
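A minimal sketch (illustrative only; check_segment_timestamp is a hypothetical helper, and the audio duration would come from the source VTT file) of how the HH:MM:SS format rule and the never-greater-than-audio-length rule above might be enforced in Python:

import re

def check_segment_timestamp(ts: str, audio_seconds: float) -> int:
    # Validate an HH:MM:SS segment-start timestamp against the audio duration.
    m = re.fullmatch(r"(\d{2}):(\d{2}):(\d{2})", ts)
    if not m:
        raise ValueError("timestamp %r is not in HH:MM:SS format" % ts)
    hours, minutes, seconds = (int(g) for g in m.groups())
    total = hours * 3600 + minutes * 60 + seconds
    if total > audio_seconds:
        raise ValueError("timestamp %s exceeds the audio length" % ts)
    return total

# Example: check_segment_timestamp("01:15:30", audio_seconds=5212) returns 4530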
Prompt 4: Media Mentions
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
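A minimal sketch (an illustrative assumption, not pipeline code) of the timestamp handling described above: take a VTT cue range, keep the earlier of the two timestamps, and reduce it to HH:MM:SS:

def cue_start_to_hms(cue_range: str) -> str:
    # '01:13:42.520 --> 01:13:46.720' becomes '01:13:42'
    start = cue_range.split("-->")[0].strip()  # earlier of the two timestamps
    return start.split(".")[0]                 # drop the milliseconds

# Example: cue_start_to_hms("01:13:42.520 --> 01:13:46.720") returns "01:13:42"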
Prompt 5: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 2 of 2 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
ere.
[01:01:58.360 --> 01:02:01.400] Yeah, I mentioned working in that Skinnerian lab for two years.
[01:02:01.400 --> 01:02:10.440] That was under Douglas Navarick, who got his PhD under Douglas Fantino at UC San Diego, who got his PhD under Herrnstein at Harvard.
[01:02:10.440 --> 01:02:13.000] You know, then there's the connection to Skinner.
[01:02:13.000 --> 01:02:17.560] All right, so Doug wrote several papers reinterpreting Milgram.
[01:02:17.560 --> 01:02:21.480] He treated it as an approach avoidance conflict.
[01:02:21.480 --> 01:02:32.440] And that, you know, you have reinforcement and then you have punishment and, you know, you kind of go back and forth, like the rat that gets rewarded for going onto the plate where the cheese is, but then gets a little shocked.
[01:02:32.440 --> 01:02:34.040] So he doesn't want to go, but he wants to go.
[01:02:34.040 --> 01:02:34.760] He doesn't want to go.
[01:02:34.760 --> 01:02:35.720] He wants to go.
[01:02:35.720 --> 01:02:36.920] That was his interpretation.
[01:02:36.920 --> 01:02:39.400] You know, you want to obey the authority.
[01:02:39.400 --> 01:02:44.920] You know, here's a famous scientist at Yale University, or in our case, you know, here's the NBC studios.
[01:02:44.920 --> 01:02:46.760] I want to be part of this reality show.
[01:02:46.760 --> 01:02:47.640] That would be cool.
[01:02:47.640 --> 01:02:49.960] But on the other hand, I have to do this uncomfortable thing.
[01:02:49.960 --> 01:02:50.680] So I want it.
[01:02:50.680 --> 01:02:51.560] I don't want it.
[01:02:52.200 --> 01:02:53.160] That kind of thing.
[01:02:53.160 --> 01:02:58.520] I think there's just many different ways to interpret it because it is somewhat mysterious.
[01:02:58.840 --> 01:03:05.560] And, you know, maybe there's just a variety of the way you know different people respond differently, so they have different motivations.
[01:03:05.560 --> 01:03:08.520] There's not a single theory that's going to explain it.
[01:03:08.520 --> 01:03:09.240] That's true.
[01:03:09.240 --> 01:03:13.400] And it's also quite amazing how generative it has been of so many theories.
[01:03:13.400 --> 01:03:20.800] So many people have things to say about it, from this is the root of all evil, to this proves that everyone is completely situational.
[01:03:21.120 --> 01:03:23.760] There's no evil, no one can ever be evil.
[01:03:23.760 --> 01:03:29.200] But I think what you were saying about approach avoidance is kind of interesting.
[01:03:29.760 --> 01:03:45.600] It brought to mind something like the Catch-22 situation or Gregory Bateson, you know, his schismogenesis, just like situations that can actually drive someone insane because you're being punished for the very thing you're being incentivized to seek.
[01:03:46.480 --> 01:03:51.520] I mean, so that's when used in child rearing, it has terrible results.
[01:03:51.520 --> 01:03:55.840] So perhaps, you know, it sets up a very difficult situation.
[01:03:55.840 --> 01:04:00.800] And the whole story of how Milgram came to the experiment is also pretty interesting.
[01:04:00.800 --> 01:04:03.840] I don't think you wrote about Zimbardo in your book, did you?
[01:04:03.840 --> 01:04:04.320] No.
[01:04:04.960 --> 01:04:06.480] Because it's a similar situation.
[01:04:06.720 --> 01:04:12.080] You know, he and Milgram were colleagues and of the same age, essentially trying to explain the same thing.
[01:04:12.080 --> 01:04:13.120] They're both Jewish.
[01:04:13.120 --> 01:04:15.200] Like, how did this happen to our people?
[01:04:15.200 --> 01:04:18.560] You know, and we're scientists, let's explain it.
[01:04:18.560 --> 01:04:39.360] You know, toward the end of his life, he came under heavy criticism that he coached the guards, you know, the randomly chosen students who played guards, a little too much, and that they were therefore just doing what he told them to do, not, you know, not this is the role I'm adopting now.
[01:04:40.000 --> 01:04:49.600] I think that's an overly harsh criticism, but I'd like to get your thoughts on that, because in a way, he is studying how prisons operate.
[01:04:49.600 --> 01:04:54.960] And, you know, how would an undergraduate Stanford University student have any idea how a prison operates?
[01:04:54.960 --> 01:04:57.200] So you have to kind of give them some instructions.
[01:04:57.200 --> 01:04:58.640] This is what you're supposed to do.
[01:04:58.640 --> 01:05:01.160] And then they kind of take over, like the mirrored sunglasses.
[01:05:01.160 --> 01:05:12.680] He told me he got that idea from that film, Cool Hand Luke, where the prison guard wore those mirrored sunglasses, because it dehumanizes you, makes a disconnect between you and the victim, that kind of thing.
[01:05:12.680 --> 01:05:13.080] I don't know.
[01:05:13.640 --> 01:05:15.240] What are your thoughts on that?
[01:05:15.240 --> 01:05:17.240] Yeah, that's interesting.
[01:05:17.240 --> 01:05:19.960] I think it's another such a rich experiment.
[01:05:19.960 --> 01:05:24.200] That's why I love teaching it as well, just because there's so much there.
[01:05:24.200 --> 01:05:33.640] And the deeper you go, the more, because now the participants in the experiment, many of them have been interviewed and reflected on it over many decades.
[01:05:33.640 --> 01:05:50.840] So the guy who played John Wayne, who did come up with those mirrored sunglasses, I think he did say that he, but this was his own, in a sense, he showed an unusual amount of initiative in embodying and carrying out Zimbardo's instructions.
[01:05:51.160 --> 01:06:07.240] So one marvels, this guy went on to become something like an insurance adjuster, just, you know, some guy, later fairly successful in life, but he doesn't stand out as particularly anything, certainly not evil.
[01:06:08.440 --> 01:06:25.560] He seems in the film that he has a real taste for harming people, for really, and for really extracting obedience from these prisoners or even torturing them in a sense.
[01:06:25.560 --> 01:06:31.320] But, but he's just like, you know, he doesn't seem to also have much reflective capacity.
[01:06:31.320 --> 01:06:33.400] He just is a little bit proud of what he did.
[01:06:33.400 --> 01:06:35.960] So it's been so interesting to see what's come out of it.
[01:06:35.960 --> 01:06:53.120] Some other people, but I think maybe the backlash against Philip Zimbardo has to do with the fact that people on some level felt uneasily as if he didn't accept responsibility in some way for what he had caused to happen.
[01:06:53.440 --> 01:07:12.640] And maybe that's hard to even put into words, but there's almost an inchoate sense that he unleashed or set up this phenomena and then he didn't fully describe his own role and also perhaps his responsibility for stopping it earlier.
[01:07:12.880 --> 01:07:25.520] Because he, he, maybe it's even a discomfort with scientific experimentation with human beings at that level where these undergraduates, who's who's to know what, what the lasting effects might have been.
[01:07:25.520 --> 01:07:35.840] So maybe there's just, maybe there's the backlash is not, you know, as rash, it's not perfectly logical, but there's a sort of emotional sense that Zimbardo must have done something.
[01:07:35.840 --> 01:07:38.480] Is there like an uneasiness with it?
[01:07:38.480 --> 01:07:38.960] Yeah.
[01:07:39.520 --> 01:07:45.440] Well, yeah, that one guy that gave an interview a couple years ago, he was interviewed decades ago.
[01:07:45.440 --> 01:07:48.080] And he was saying very different things then than he is now.
[01:07:48.080 --> 01:07:49.840] So I thought, I don't know.
[01:07:50.160 --> 01:07:55.600] Oh, I just went along with it because, you know, Zimbardo told us he was like, that's not what you said 20 years ago.
[01:07:55.920 --> 01:07:56.480] Right.
[01:07:56.960 --> 01:07:59.840] Cheap Caribbean vacations.
[01:07:59.840 --> 01:08:02.240] Get more sun for your dollar.
[01:08:02.560 --> 01:08:08.400] Vacation planning should feel like a breeze, not a deep dive into countless travel sites searching for the best deal.
[01:08:08.400 --> 01:08:14.480] With Cheap Caribbean's Budget Beach Finder, get every destination and every date all in one search.
[01:08:14.480 --> 01:08:20.880] Go to cheapcaribbean.com and try the Budget Beach Finder and see how stress-free vacation planning should be.
[01:08:20.880 --> 01:08:26.560] Cheap Caribbean vacations, get more sun for your dollar.
[01:08:27.280 --> 01:08:28.080] So I don't know.
[01:08:28.080 --> 01:08:31.480] There's some post hoc rationalization going on there.
[01:08:32.040 --> 01:08:44.360] But to extrapolate it to a real-world situation, like Christopher Browning's book, Ordinary Men, also a great docudrama on Netflix, of the same title, Ordinary Men.
[01:08:44.360 --> 01:08:46.440] You know, how do you get these guys to do these things?
[01:08:46.440 --> 01:08:49.160] They're actually killing Jews up close.
[01:08:49.160 --> 01:08:49.720] Boom.
[01:08:49.720 --> 01:08:51.480] You know, and they're uncomfortable doing it.
[01:08:51.480 --> 01:08:53.160] They get shit-faced every night.
[01:08:53.160 --> 01:08:58.280] They have to move them to pits so that they're not quite so close and so on.
[01:08:58.280 --> 01:09:03.320] And you just wonder, is there some also self-deception there?
[01:09:03.320 --> 01:09:04.840] Like, I don't really want to do this.
[01:09:04.840 --> 01:09:10.600] But then you, well, they are Jews, and we all know they're subhuman, and they're the ones that started this war.
[01:09:10.600 --> 01:09:12.040] Remember, Hitler told us that?
[01:09:12.040 --> 01:09:19.800] All the propaganda throughout all the 30s, you know, the 1930s, there was a lot of stuff that happened that had nothing to do with exterminating Jews.
[01:09:19.800 --> 01:09:26.200] It was just a bunch of other stuff that kind of primed people to be able to do that at some point.
[01:09:26.200 --> 01:09:32.040] But then you don't want to let them off the hook, like, you know, Hannah Arendt's, you know, the banality of evil.
[01:09:32.040 --> 01:09:36.360] I expected Eichmann to look like a monster and he looked like an ordinary bureaucrat.
[01:09:36.360 --> 01:09:39.480] But I thought she let him off the hook a little too much.
[01:09:39.480 --> 01:09:46.600] You know, he really was a rabid anti-Semite and worked mightily to exterminate as many Jews as he could.
[01:09:46.600 --> 01:09:52.200] So, you know, there's somewhere in there, you know, there's a complex explanation, I think.
[01:09:52.200 --> 01:09:53.560] What are your thoughts?
[01:09:53.880 --> 01:09:55.480] Yeah, I agree.
[01:09:55.480 --> 01:10:05.160] I mean, one of the profound experiences of living in Berlin for a couple years when I was at the Max Planck Institute, well, one year for each sabbatical.
[01:10:05.160 --> 01:10:18.080] And there was a museum exhibit, I think, the second, probably around 2013 at the Central National Museum in Berlin.
[01:10:18.240 --> 01:10:27.040] And it was about, it was just the ordinary accoutrements of life under Nazi, under National Socialism.
[01:10:27.040 --> 01:10:42.240] So it was things like a cute little chess table that had, you know, the Nazi insignia swastikas or like potholders for the home with full, you know, anti-Semitic slogans.
[01:10:42.240 --> 01:10:52.320] It just showed just the sheer ordinariness and the way that day-to-day life was kind of woven in if you were, I guess, sufficiently adherent.
[01:10:52.560 --> 01:10:55.120] But you would just find these household items.
[01:10:55.520 --> 01:11:04.160] And that, you know, it wasn't the chilling, the usual chilling things like the photographs of emaciated people or gas chambers.
[01:11:04.160 --> 01:11:07.760] It was actually just like oven mitts with swastikas.
[01:11:07.760 --> 01:11:10.640] It's so interesting to see that.
[01:11:10.640 --> 01:11:16.800] And it's good, it's a good reminder or something to really contemplate on the one hand.
[01:11:16.800 --> 01:11:22.720] And then with the ordinariness of someone like, then can you extend that as Arendt did so successfully?
[01:11:22.720 --> 01:11:42.720] I mean, I think what was, you can criticize her work, I suppose, but I think, and you probably can, but I think what was successful about it is that she, in a sense, used Eichmann to make a broader point about his, you know, there's something about the way she describes him.
[01:11:45.680 --> 01:12:00.000] Maybe she underestimates his rabidness, but the way she describes how his conscience, you know, that passage in Eichmann in Jerusalem where she says his conscience worked for two hours and 53 minutes.
[01:12:00.520 --> 01:12:06.440] There was one time where he saved one Jewish gardener from being exterminated.
[01:12:06.440 --> 01:12:13.080] And then after that, he never did anything else because he knew the man and he took some action.
[01:12:13.080 --> 01:12:19.720] And so she said, basically, we saw his conscience operate for two hours and then it never was seen again.
[01:12:19.720 --> 01:12:28.520] But she's having like, she's because she's so talented, I guess, she's bringing forward this idea that has stuck with us.
[01:12:28.520 --> 01:12:35.400] And perhaps it doesn't fully describe Eichmann himself, but more her interest in him.
[01:12:35.400 --> 01:12:36.920] Like in that question.
[01:12:36.920 --> 01:12:37.560] Yeah.
[01:12:37.560 --> 01:12:38.440] Okay, fair enough.
[01:12:38.440 --> 01:12:39.880] That's that's a good point.
[01:12:39.880 --> 01:12:49.480] Yeah, you do wonder after the war, these people are all dead now, but how many of these Germans that did participate and then they just go about their lives?
[01:12:49.480 --> 01:12:50.760] Well, well, that was then.
[01:12:51.080 --> 01:12:51.880] Now is now.
[01:12:51.880 --> 01:12:52.440] It's different.
[01:12:52.680 --> 01:12:54.200] I wouldn't do that now.
[01:12:54.760 --> 01:12:55.560] Something like that.
[01:12:55.560 --> 01:13:05.480] Of course, you know, with war crimes as a thing and genocide as an actual term that was coined in 1945, now it's a thing.
[01:13:05.480 --> 01:13:11.960] So we got to put some people in the docket for this or else this could happen again and we don't want, you know, all right.
[01:13:11.960 --> 01:13:20.760] So, yeah, so you get, you know, you get these trials, like that guy Demjanjuk, who had worked for Ford Motor Company in Ohio for 40 years or whatever.
[01:13:20.760 --> 01:13:23.000] And then, okay, we're going to bring this guy down.
[01:13:23.000 --> 01:13:25.560] It's like, yeah, but that was a long time ago.
[01:13:25.560 --> 01:13:28.040] I mean, that's not the guy he is now.
[01:13:28.360 --> 01:13:31.160] But we have a hard time thinking of it like that.
[01:13:31.160 --> 01:13:45.520] Yeah, I think we do have trouble with history, with, you know, the length, I guess, just the human life, how we are changing, like to think that people are just one thing all the time for your whole life on some level.
[01:13:45.760 --> 01:13:57.760] Also, we have trouble with social bonds, you know, the extent to which we actually are generated by our social environments and our deeply, you know, embeddedness in those things.
[01:13:57.760 --> 01:14:10.560] All those questions are just hard to reckon with, especially when practical, practical, or even just maybe the sense that there needs to be a reckoning.
[01:14:10.560 --> 01:14:16.080] There needs to be a public sense that we are vigilant against monsters.
[01:14:16.080 --> 01:14:16.480] Yeah.
[01:14:16.800 --> 01:14:18.240] And that it can't, you know.
[01:14:18.240 --> 01:14:18.720] So.
[01:14:19.040 --> 01:14:26.000] Well, the name of that film, the Netflix series, is The Devil Next Door, I think it's called.
[01:14:26.320 --> 01:14:35.120] It's like, God, I was living next to this guy for all these years, and he was a Nazi guard at this concentration camp, and he herded Jews into the gas chambers.
[01:14:35.120 --> 01:14:36.000] Wow.
[01:14:36.560 --> 01:14:38.800] But he was a quiet man.
[01:14:40.000 --> 01:14:41.680] Well, that's who he is now.
[01:14:41.680 --> 01:14:44.480] He was somebody else in this other situation.
[01:14:44.480 --> 01:14:45.920] Yeah, I mean, this goes back and forth.
[01:14:45.920 --> 01:14:47.200] You know, that was Phil's whole thing.
[01:14:47.200 --> 01:14:54.560] You know, it's that Abu Ghraib soldier, the one who was in charge at Abu Ghraib.
[01:14:54.560 --> 01:14:55.760] I forget his name now.
[01:14:56.320 --> 01:15:02.560] That Phil was on his defense team saying, well, he's not like that when he was in America.
[01:15:02.560 --> 01:15:03.680] He was a Boy Scout.
[01:15:03.680 --> 01:15:07.280] He was the, you know, the high school standout hero and so on.
[01:15:07.280 --> 01:15:09.440] And so he had all these decorations and stuff.
[01:15:09.440 --> 01:15:12.000] And he's put in this bad situation.
[01:15:12.000 --> 01:15:13.440] But then, where is free will?
[01:15:13.440 --> 01:15:15.520] Where is your volition?
[01:15:15.840 --> 01:15:17.280] Yeah, where is responsibility?
[01:15:17.280 --> 01:15:33.080] It's like a lot of people who do end up in cults and maybe do terrible things, like get involved in or at least see and sanction and somehow stand by when you know the abuse of children is happening.
[01:15:33.400 --> 01:15:53.800] Maybe they wouldn't have entered that cult if they hadn't walked down a certain street on a certain day and waited at a certain bus stop and happened to have met these recruiters and happened to have, maybe they just, you know, all sorts of circumstances and happenstances may have resulted in their decision and it was an ill-informed decision.
[01:15:53.800 --> 01:15:55.880] Nonetheless, is there not a moment?
[01:15:55.880 --> 01:15:57.320] Is there not responsibility?
[01:15:57.320 --> 01:15:59.400] I think that's what's confusing.
[01:15:59.720 --> 01:16:00.760] It is confusing.
[01:16:00.760 --> 01:16:13.640] And when you get maybe the legal context actually is illuminating in that way that when it really push comes to shove, you know, how does responsibility how will that function?
[01:16:13.640 --> 01:16:19.080] Because ultimately, everyone has their reasons.
[01:16:19.080 --> 01:16:19.480] Yeah.
[01:16:19.800 --> 01:16:21.720] Well, that was the fascinating thing about the Manson trials.
[01:16:21.720 --> 01:16:32.360] Bugliosi had to get these women convicted for murder because they really did do it, but then also get Manson, who wasn't even there.
[01:16:32.360 --> 01:16:36.440] So he must have had supreme brain mind control over these girls.
[01:16:36.440 --> 01:16:38.600] But on the other hand, we don't want to let them off.
[01:16:38.600 --> 01:16:41.800] Like, oh, I was just a victim of Charlie's mind control.
[01:16:41.800 --> 01:16:47.640] You know, so he, you know, he concocted the story about the helter-skelter and the race war and all that stuff.
[01:16:47.640 --> 01:16:54.840] I think Tom O'Neill does debunk that; Helter Skelter probably was not the real motive of the Manson family.
[01:16:54.840 --> 01:16:59.880] I mean, but Bugliosi's goal was not to figure out what happened, the truth.
[01:16:59.880 --> 01:17:01.560] His goal is to win the case, right?
[01:17:01.560 --> 01:17:04.920] So, lawyers operate differently than scientists in that sense.
[01:17:05.320 --> 01:17:10.600] I actually am going to be talking to Tom O'Neill shortly about some of these questions.
[01:17:11.800 --> 01:17:39.200] And we've probably crossed paths a lot in the Louis Jolyon West archives, but I think one thing that maybe should be considered with the Manson family is the shift from not just intensive use of LSD, but then into amphetamines, amphetamines augmenting the LSD and creating a different state, I mean, as they progressed in their crimes.
[01:17:39.200 --> 01:17:43.520] So, and that actually I'm curious to hear.
[01:17:43.520 --> 01:17:53.360] This is where there was an amphetamine research project going on in Haight-Ashbury where Manson's family, I think, was a subject of them.
[01:17:53.360 --> 01:17:56.880] Or I'm trying to find out if that was the case.
[01:17:56.880 --> 01:17:59.680] But yeah, there's some interesting connections there.
[01:17:59.680 --> 01:18:03.840] I don't know.
[01:18:04.160 --> 01:18:04.800] Yeah, I don't.
[01:18:04.800 --> 01:18:06.720] I think they wanted to make a story.
[01:18:06.720 --> 01:18:15.600] I think the family wanted to make a story that would somehow create more of an aura around their acts, perhaps.
[01:18:15.600 --> 01:18:16.880] Oh, I love the book, Chaos.
[01:18:16.880 --> 01:18:18.800] And I watched, I read the whole thing.
[01:18:18.800 --> 01:18:20.560] I watched him on Rogan for three hours.
[01:18:20.560 --> 01:18:22.000] I watched you on Rogan for a couple hours.
[01:18:22.080 --> 01:18:22.640] That was great.
[01:18:22.640 --> 01:18:24.080] This is all good stuff.
[01:18:24.400 --> 01:18:28.160] But at the end of the book, he basically says, I really have no smoking gun.
[01:18:28.160 --> 01:18:31.920] I can't prove the connection between Manson and the CIA, MK Ultra and all.
[01:18:31.920 --> 01:18:34.720] It's like, like, damn, darn.
[01:18:35.440 --> 01:18:39.120] Well, it's a tribute to him that he kind of ends up that way.
[01:18:39.120 --> 01:18:40.160] Yeah, oh, absolutely.
[01:18:40.160 --> 01:18:40.640] Totally.
[01:18:40.960 --> 01:18:45.520] You know, high integrity and honesty on his part as a good scholar.
[01:18:46.240 --> 01:18:52.800] Part of the problem is that it's such a bigger-than-life target, Manson, in the same way of Lee Harvey Oswald.
[01:18:52.800 --> 01:19:04.760] I mean, you know, conspiracy theorists have picked through every single person Lee Harvey Oswald ever met, any group he had any connection to whatsoever, and you could somehow tie it back to the CIA.
[01:19:04.760 --> 01:19:15.080] You know, if you go back to Milgram's six degrees of separation, you can find links between anybody and anything, almost anywhere, if you look hard enough.
[01:19:16.040 --> 01:19:19.640] Well, we'll see what Tom O'Neill's next book comes up with.
[01:19:19.960 --> 01:19:21.640] Oh, is he still working on this?
[01:19:22.520 --> 01:19:23.640] I believe he is.
[01:19:23.960 --> 01:19:24.840] I'll find out more.
[01:19:24.840 --> 01:19:28.120] I'm looking forward to comparing notes.
[01:19:28.120 --> 01:19:30.120] Okay, last topic, because I know you got to run.
[01:19:30.840 --> 01:19:41.320] You know, I love that you brought up Norbert Elias's book, The Civilizing Process, in terms of how norms shift gradually over time.
[01:19:41.640 --> 01:19:58.520] Elias's database was these books of etiquette and how people had to be instructed, you know, how to blow your nose, how to eat politely at the dinner table, and use your fork for this and your knife for that.
[01:19:58.520 --> 01:20:03.320] And, you know, and when you're sleeping next to somebody, you know, don't do this or that.
[01:20:03.320 --> 01:20:04.200] I mean, it just goes on.
[01:20:04.360 --> 01:20:05.320] It's astonishing.
[01:20:05.320 --> 01:20:08.600] I mean, our ancestors were just disgusting.
[01:20:09.240 --> 01:20:12.360] I don't know if that's the conclusion we're supposed to draw.
[01:20:12.360 --> 01:20:13.480] Our ancestors.
[01:20:13.480 --> 01:20:15.480] But it is really, it is stunning.
[01:20:15.480 --> 01:20:20.280] Please, you know, refrain from blowing your nose in the tablecloth when we're eating.
[01:20:20.760 --> 01:20:24.040] And then looking at it, going, oh, look at these tables I have here.
[01:20:24.040 --> 01:20:24.440] Yeah.
[01:20:25.160 --> 01:20:34.440] But your larger point there is that these norms shift so slowly over a decadal or maybe even century-long timeframe, we don't even notice it.
[01:20:34.440 --> 01:20:37.880] And we just don't do the things that these people did in the Middle Ages.
[01:20:37.880 --> 01:20:40.040] And it never even occurs to us.
[01:20:40.040 --> 01:20:48.720] You don't have to implement self-control because, you know, I really want to blow my nose into the tablecloth at my neighbor's house, but I'm just not going to do it because I have self-control.
[01:20:48.720 --> 01:20:51.520] It never even enters my mind to do that.
[01:20:51.520 --> 01:21:03.840] Or we're not aware of that desire because at first the etiquette manuals addressed all people and then, or first the gentry, then it addressed a wider population, then it just addressed children.
[01:21:03.840 --> 01:21:21.120] And then after a while, it just seemed to be so inculcated that you didn't have to tell anyone not to, you know, relieve themselves in the corner of the room or to, you know, blow your nose into your hand while you're eating.
[01:21:21.120 --> 01:21:27.680] But what's interesting is that Elias published this in 1939, I believe.
[01:21:27.680 --> 01:21:40.880] So he's kind of in Germany as a Jewish sociologist watching the rise of, I mean, I think in a way he's asking these questions with the concepts of sociogenesis and psychogenesis, like what can be removed?
[01:21:40.880 --> 01:21:43.200] What is our, what is civilization?
[01:21:43.200 --> 01:21:56.800] And it also reminds me of Czesław Miłosz, who said, living in Warsaw during World War II, you know, I'm surrounded by people, basically these couples, families living their lives.
[01:21:56.800 --> 01:22:01.120] Everyone tries to remain normal for as long as they can.
[01:22:01.120 --> 01:22:03.200] Like you're putting away money in your 401k.
[01:22:03.200 --> 01:22:04.400] They don't have 401ks.
[01:22:04.400 --> 01:22:07.760] But essentially that idea, you keep doing the things you do.
[01:22:07.760 --> 01:22:08.960] You live the best you can.
[01:22:08.960 --> 01:22:13.520] And then he said he saw how quickly it decayed.
[01:22:13.520 --> 01:22:16.320] And soon people are copulating in the street.
[01:22:16.320 --> 01:22:19.040] They're dying, you know, next to me.
[01:22:20.000 --> 01:22:26.480] You go out for a walk, and all of a sudden, this veneer of reality shatters.
[01:22:26.480 --> 01:22:29.240] And I think this profoundly shaped him.
[01:22:29.240 --> 01:22:34.360] And probably that same thing was happening to Elias.
[01:22:29.680 --> 01:22:35.480] That's why I like his work.
[01:22:35.800 --> 01:22:37.880] Yeah, yeah, that's interesting.
[01:22:37.880 --> 01:22:42.360] I hadn't thought of it as a commentary on his own time, but that does make sense.
[01:22:42.360 --> 01:22:44.040] I was thinking of it in a larger context.
[01:22:44.040 --> 01:22:47.480] I had Jean Twenge on the show for her book, Generations.
[01:22:47.640 --> 01:22:50.280] Her book is about how generations change.
[01:22:50.280 --> 01:22:59.480] Like, I'm a baby boomer, and you know, the way I think about the world is very different from, say, the Gen Z and my kids and so on.
[01:22:59.480 --> 01:23:03.080] But you don't even notice it until somebody points it out, like Jean does with data.
[01:23:03.080 --> 01:23:06.280] Like, you know, here's the kind of songs that were popular in the 60s.
[01:23:06.280 --> 01:23:10.920] Here's what's popular now, or names, or just, you know, anything that's happening like that.
[01:23:10.920 --> 01:23:20.600] So one of her data points that stuck out to me was the age of first pregnancy for women in my generation, baby boomers, was 19.
[01:23:20.600 --> 01:23:22.520] And today it's 29.
[01:23:22.520 --> 01:23:23.880] It's 10 years later.
[01:23:23.880 --> 01:23:27.240] And instead of having three or four kids, you have one, maybe two.
[01:23:27.560 --> 01:23:28.120] Okay.
[01:23:28.120 --> 01:23:30.120] So I asked her, you know, how did this happen?
[01:23:30.120 --> 01:23:32.920] I mean, does anybody sit you down and go, okay, here's the deal.
[01:23:32.920 --> 01:23:34.920] You're not going to get pregnant till you're 29.
[01:23:34.920 --> 01:23:36.600] It's like, no, of course not.
[01:23:36.600 --> 01:23:40.360] It's just that nobody you know is doing this.
[01:23:40.360 --> 01:23:42.920] Everybody you know in high school is going to college.
[01:23:42.920 --> 01:23:45.640] And then everybody you know in college is going to go get a career.
[01:23:45.640 --> 01:23:48.440] And then they're not going to get married till they're 30s.
[01:23:48.440 --> 01:23:49.000] And so on.
[01:23:49.000 --> 01:23:50.520] This is just what everybody does.
[01:23:50.520 --> 01:23:52.520] You don't even think about it.
[01:23:53.160 --> 01:23:53.640] Yeah.
[01:23:53.880 --> 01:23:55.240] It's a powerful thing.
[01:23:55.240 --> 01:24:08.280] But we don't see, I guess that's why I really like Jean Twenge's work, taking generations as a kind of analytical frame, because it is profound.
[01:24:08.280 --> 01:24:10.440] All that we don't see is so profound.
[01:24:10.440 --> 01:24:18.720] And I think that's what I'm trying to do with brainwashing: is it just kind of a reminder of how much we take for granted?
[01:24:19.040 --> 01:24:28.720] Well, your last two chapters on social media and AI and all that stuff are profound because we're in the middle of it now and we really have no idea what the effects are going to be.
[01:24:28.720 --> 01:24:34.400] Apparently, there are some really negative effects, as Jean and Jonathan Haidt and others have shown.
[01:24:34.720 --> 01:24:36.000] But this too may pass.
[01:24:36.000 --> 01:24:45.840] I mean, maybe, you know, social media, it's only what, 2007 that the smartphone is introduced, and then Facebook and Instagram, all this stuff is pretty new compared to, you know, books or whatever.
[01:24:45.840 --> 01:24:48.560] Newspapers go back centuries.
[01:24:49.520 --> 01:24:50.320] Yeah.
[01:24:50.960 --> 01:24:52.080] Very true.
[01:24:52.080 --> 01:24:53.200] Impossible to know.
[01:24:53.200 --> 01:24:56.480] I mean, I'm always interested in this: like, what's the future of moral progress?
[01:24:56.480 --> 01:25:07.920] You know, what are we doing now that our descendants will look back like we look back at slaveholders or the way people treated women or Jews or blacks and are appalled?
[01:25:07.920 --> 01:25:10.000] What are we doing that they'll be appalled by us?
[01:25:10.000 --> 01:25:13.040] Well, if I knew, I wouldn't do it, right?
[01:25:13.360 --> 01:25:14.960] But I don't know.
[01:25:15.440 --> 01:25:17.440] Animal rights, maybe we shouldn't be eating meat.
[01:25:18.160 --> 01:25:19.440] That is a good guess.
[01:25:19.440 --> 01:25:20.560] I think it's a good guess.
[01:25:20.560 --> 01:25:34.400] I think I'm convinced by those who say when we look back, when historians look back, the immense cruelty of animal slaughter through factory farming, industrialized agriculture, will be the great, you know, unseen.
[01:25:34.560 --> 01:25:41.360] And there are even studies of just how these facilities are hidden from view and even from the workers themselves.
[01:25:41.360 --> 01:25:44.640] Anyway, I have a whole interest in that myself.
[01:25:44.640 --> 01:25:45.200] Oh, you do?
[01:25:45.200 --> 01:25:45.680] Okay, good.
[01:25:45.680 --> 01:25:46.000] Yeah.
[01:25:46.000 --> 01:25:46.400] All right.
[01:25:46.400 --> 01:25:49.200] Let's let you get on to your next project here.
[01:25:49.200 --> 01:25:50.560] Here it is: The Instability of Truth.
[01:25:50.640 --> 01:25:54.720] By the way, I love this cover because it's difficult to read, which is the point, right?
[01:25:54.720 --> 01:25:56.640] It's like, what am I reading here?
[01:25:56.640 --> 01:25:59.120] It's a little bit, exactly, unseen.
[01:25:59.360 --> 01:26:02.280] Brainwashing, mind control, and hyper-persuasion.
[01:26:02.280 --> 01:26:03.160] Yes, indeed.
[01:25:59.840 --> 01:26:04.600] Rebecca, thanks so much for your work.
[01:26:05.000 --> 01:26:06.280] What are you working on next?
[01:26:07.320 --> 01:26:12.520] Oh, my next year I'm working on a project on psychedelics and brainwashing.
[01:26:12.520 --> 01:26:20.200] So really thinking about the psychedelic renaissance, its positives and potential pitfalls.
[01:26:20.520 --> 01:26:21.080] Oh, my God.
[01:26:21.080 --> 01:26:21.480] That's great.
[01:26:21.480 --> 01:26:26.600] Yeah, because micro-dosing and psychedelic therapies are now kind of in vogue.
[01:26:26.600 --> 01:26:30.360] Questions about suggestibility and what might happen, you know.
[01:26:31.240 --> 01:26:32.760] I'm looking forward to it.
[01:26:32.760 --> 01:26:36.040] It's going to be in the form of a comic book.
[01:26:36.840 --> 01:26:37.400] Oh, well.
[01:26:37.800 --> 01:26:38.360] Oh, cool.
[01:26:38.360 --> 01:26:39.000] That's good.
[01:26:39.000 --> 01:26:41.080] Yeah, we didn't even get a chance to talk about hypnosis.
[01:26:41.080 --> 01:26:45.720] Another one of these super interesting, difficult-to-explain mysteries, but it's real.
[01:26:45.720 --> 01:26:46.280] Yeah.
[01:26:46.520 --> 01:26:47.160] Yeah, yeah.
[01:26:47.160 --> 01:26:48.600] It was great to talk to you, too.
[01:26:48.600 --> 01:26:48.840] All right.
[01:26:48.840 --> 01:26:49.480] Thanks, Rebecca.
[01:26:56.840 --> 01:26:59.160] Marketing is hard.
[01:26:59.160 --> 01:27:00.520] But I'll tell you a little secret.
[01:27:00.520 --> 01:27:01.320] It doesn't have to be.
[01:27:01.320 --> 01:27:02.520] Let me point something out.
[01:27:02.520 --> 01:27:04.920] You're listening to a podcast right now, and it's great.
[01:27:04.920 --> 01:27:05.720] You love the host.
[01:27:05.720 --> 01:27:07.000] You seek it out and download it.
[01:27:07.000 --> 01:27:10.840] You listen to it while driving, working out, cooking, even going to the bathroom.
[01:27:10.840 --> 01:27:13.640] Podcasts are a pretty close companion.
[01:27:13.640 --> 01:27:15.320] And this is a podcast ad.
[01:27:15.320 --> 01:27:16.760] Did I get your attention?
[01:27:16.760 --> 01:27:21.480] You can reach great listeners like yourself with podcast advertising from LibSyn Ads.
[01:27:21.480 --> 01:27:31.400] Choose from hundreds of top podcasts offering host endorsements or run a pre-produced ad like this one across thousands of shows to reach your target audience and their favorite podcasts with LibSynAds.
[01:27:31.400 --> 01:27:33.000] Go to libsynads.com.
[01:27:33.000 --> 01:27:37.160] That's L-I-B-S-Y-N ADS.com today.
Prompt 6: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 7: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
  "segments": [
    {
      "segment_title": "Topic Discussion",
      "timestamp": "01:15:30",
      "key_takeaway": "main point from this segment",
      "segment_summary": "brief description of what was discussed"
    }
  ]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
Prompt 8: Media Mentions
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
  "media_mentions": [
    {
      "title": "Exact Title as Mentioned",
      "category": "Book",
      "author_artist": "N/A",
      "context": "Brief context of why it was mentioned",
      "context_phrase": "The exact sentence or phrase where it was mentioned",
      "timestamp": "estimated time like 01:15:30"
    }
  ]
}
If no media is mentioned, return: {"media_mentions": []}
Full Transcript
[00:00:00.880 --> 00:00:08.720] A mochi moment from Tara, who writes, For years, all my doctor said was eat less and move more, which never worked.
[00:00:08.720 --> 00:00:10.080] But you know what does?
[00:00:10.080 --> 00:00:13.280] The simple eating tips from my nutritionist at Mochi.
[00:00:13.280 --> 00:00:19.920] And after losing over 30 pounds, I can say you're not just another GLP1 source, you're a life source.
[00:00:19.920 --> 00:00:21.040] Thanks, Tara.
[00:00:21.040 --> 00:00:23.600] I'm Myra Amet, founder of Mochi Health.
[00:00:23.600 --> 00:00:27.360] To find your Mochi moment, visit joinmochi.com.
[00:00:27.360 --> 00:00:30.400] Tara is a mochi member compensated for her story.
[00:00:30.400 --> 00:00:32.320] Every business has an ambition.
[00:00:32.320 --> 00:00:41.760] PayPal Open is the platform designed to help you grow into yours with business loans so you can expand and access to hundreds of millions of PayPal customers worldwide.
[00:00:41.760 --> 00:00:50.240] And your customers can pay all the ways they want with PayPal, Venmo, Pay Later, and all major cards so you can focus on scaling up.
[00:00:50.240 --> 00:00:55.040] When it's time to get growing, there's one platform for all business: PayPal Open.
[00:00:55.040 --> 00:00:57.680] Grow today at PayPalOpen.com.
[00:00:57.680 --> 00:01:00.720] Loan subject to approval in available locations.
[00:01:03.920 --> 00:01:09.600] You're listening to The Michael Shermer Show.
[00:01:15.040 --> 00:01:18.080] All right, everybody, it's time for another episode of the Michael Shermer Show.
[00:01:18.080 --> 00:01:20.400] I'm your host, as always, Michael Shermer.
[00:01:20.400 --> 00:01:29.760] My guest today is Rebecca Lamove, who's a historian of science at Harvard University and has been a visiting scholar at the Max Planck Institute.
[00:01:30.560 --> 00:01:36.240] Her research explores data, technology, and the history of human and behavioral sciences.
[00:01:36.240 --> 00:01:42.240] She's the author of Database of Dreams: The Lost Quest to Catalog Humanity.
[00:01:42.240 --> 00:01:52.800] That is how scientists between 1942 and 1963 attempted to map the elusive and subjective parts of the human psyche V via once-futuristic data storage techniques.
[00:01:52.800 --> 00:01:53.840] You can only imagine.
[00:01:53.840 --> 00:01:55.920] That must have been like magnetic tapes or something.
[00:01:56.320 --> 00:01:57.040] Exactly.
[00:01:57.920 --> 00:02:01.960] World is Laboratory: Experiments with Mice, Mazes, and Men.
[00:01:59.680 --> 00:02:07.240] I'll have to tell you about my experiences working for two years in a Skinnerian lab with pigeons and rats.
[00:02:08.840 --> 00:02:16.680] And co-author of How Reason Almost Lost Its Mind: The Strange Career of Cold War Rationality, all of which I'm interested in.
[00:02:16.680 --> 00:02:23.320] But our new book is The Instability of Truth: Brainwashing, Mind Control, and Hyper Persuasion.
[00:02:23.320 --> 00:02:24.040] There it is.
[00:02:24.040 --> 00:02:25.640] It's also available on audio.
[00:02:25.640 --> 00:02:27.880] I've listened to it one and a half times, Rebecca.
[00:02:27.880 --> 00:02:29.000] It's a great book.
[00:02:29.560 --> 00:02:31.400] So glad you're enjoying it.
[00:02:31.960 --> 00:02:34.120] Did you listen both times?
[00:02:34.680 --> 00:02:37.480] Well, I had the book, so I've been kind of skimming around taking some notes.
[00:02:37.480 --> 00:02:44.840] But yeah, I mainly listen to it when I'm out riding my bike or walking with the dog or doing errands or driving and that sort of thing.
[00:02:44.840 --> 00:02:49.960] But I've read all these books on mind control, cults, and persuasion and all that stuff.
[00:02:49.960 --> 00:02:55.080] This is really the best kind of overall summary of the whole history of the phenomenon.
[00:02:55.240 --> 00:02:59.160] So, but before we get into all that, maybe just tell us your story.
[00:02:59.880 --> 00:03:04.040] What's your trajectory into this from your background?
[00:03:04.040 --> 00:03:08.600] Yeah, thank you for saying that, and thanks for having me on your show.
[00:03:09.080 --> 00:03:20.200] Yeah, so my trajectory is that I guess I went into graduate school, having a background in English literature in college.
[00:03:20.200 --> 00:03:23.800] And I thought I'd really like to do something more in the world.
[00:03:23.800 --> 00:03:29.000] And it seemed, you know, not that I don't love reading epic poetry.
[00:03:29.000 --> 00:03:34.360] Or if I went to graduate school, I thought, I really want to study something meaningful and meaningful to me.
[00:03:34.360 --> 00:03:38.360] And that really asks, you know, the biggest questions I could think of.
[00:03:38.360 --> 00:03:43.800] So I kind of made my way into anthropology, or I was living in California.
[00:03:43.800 --> 00:03:58.080] I decided to study anthropology, but then I ended up not quite doing fieldwork, but thinking that anthropology would equip me to study the scientists themselves who had asked questions about human freedom, how free are we really?
[00:03:58.080 --> 00:04:02.080] What are the things... how rational do we assume we are?
[00:04:02.080 --> 00:04:03.760] Is that actually the case?
[00:04:03.760 --> 00:04:09.120] And also, to what extent are we shaped by factors that we just take for granted?
[00:04:09.120 --> 00:04:15.600] As, oh, that's just the way it is, our culture, or even just things we take to be trivial.
[00:04:15.600 --> 00:04:19.040] But, you know, how much of our, what we take to be ourselves is really us.
[00:04:19.120 --> 00:04:25.280] So I ended up writing a history of behaviorism, which it sounds like you have direct experience in as well.
[00:04:25.280 --> 00:04:42.720] I was interested in these great ambitious plans during the Cold War, especially, but even in the earlier periods of behaviorists to create a kind of science of human control and maybe human engineering.
[00:04:42.720 --> 00:04:52.080] And then I, at the end of my dissertation, I wrote a chapter on brainwashing because brainwashing seemed like it posed these questions in an especially urgent way.
[00:04:52.080 --> 00:04:58.640] And I also thought, what could be more out of date and seemingly irrelevant than this thing called brainwashing or mind control?
[00:04:58.640 --> 00:05:04.960] Because at the time I was writing in the late 1990s, it really had fallen out of favor.
[00:05:04.960 --> 00:05:12.560] It seemed just like a relic of an earlier moment of hysteria, and it couldn't possibly have anything to do with our present or future.
[00:05:12.560 --> 00:05:15.280] So it seemed like the most out of date.
[00:05:15.280 --> 00:05:19.200] And I've always had a soft spot for ideas that have fallen out of favor.
[00:05:19.200 --> 00:05:31.480] So I ended up taking that kernel of brainwashing that was left over from the dissertation and teaching classes on it for the last couple decades and eventually and writing about it in different ways.
[00:05:31.640 --> 00:05:44.760] And now I finally, then I thought a few years ago, I should actually put it all together in a book if I can, but I'm not sure anyone will be interested because it's again, it didn't yet seem very urgent.
[00:05:45.720 --> 00:05:46.200] Nice.
[00:05:46.200 --> 00:05:46.600] Yes.
[00:05:46.600 --> 00:05:48.440] Well, you know, it's a fascinating history.
[00:05:48.440 --> 00:05:54.680] I've had a number of authors on the show about books about MKUltra and CIA mind control and all that stuff.
[00:05:54.680 --> 00:05:56.360] It's so interesting.
[00:05:56.680 --> 00:06:07.880] And, you know, this whole, like, Tom O'Neill's Chaos, which was just made into a Netflix documentary series, I think, you know, was Manson part of the whole CIA, MKUltra mind control?
[00:06:07.880 --> 00:06:12.600] How did he get those young women to commit murder when he wasn't even there and so on and so forth?
[00:06:13.240 --> 00:06:23.400] There does seem to still be a mystery that's unsolved, namely, how do you get people to do things that on the surface they would not normally do?
[00:06:23.400 --> 00:06:24.120] Yeah, I agree.
[00:06:24.120 --> 00:06:26.360] There does seem to be a mystery.
[00:06:26.360 --> 00:06:40.760] And I think, I mean, I don't think maybe people growing up today necessarily know the story of Manson or the fact that he didn't actually commit the murders, but yet people acted as if they were extensions of his will.
[00:06:40.760 --> 00:06:52.840] But also, you know, the evidence is they felt, or they spoke of how they said, you know, certainly he may be a prophet, and I'm following in his footsteps, but I may be a prophet too.
[00:06:52.840 --> 00:07:07.160] So they were also, his followers were strangely not simply robotic, but also seemed to be empowered in some way, or enjoying themselves, which is very, it's very strange to even contemplate.
[00:07:07.160 --> 00:07:18.160] So, every time I've taught the class on brainwashing at my university, students do seem to, at least they usually seem like they have just a question.
[00:07:18.880 --> 00:07:19.920] You know, what is this?
[00:07:19.920 --> 00:07:22.400] Or is this a how-to class?
[00:07:22.400 --> 00:07:26.560] Or what is even the question at the heart of this?
[00:07:27.200 --> 00:07:31.840] Yeah, I guess at some level they're probably asking themselves, would I fall for something like that?
[00:07:32.320 --> 00:07:38.800] You know, it's like when you show the Milgram shock experiment stuff, you know, you know, they're thinking I would never do this.
[00:07:39.120 --> 00:07:43.200] Yeah, very few people immediately identify that they would.
[00:07:43.200 --> 00:07:44.720] They know they would.
[00:07:44.720 --> 00:07:50.320] And if they do, either they're being brave or, you know, I don't know.
[00:07:50.880 --> 00:07:51.920] It's just really rare.
[00:07:51.920 --> 00:07:55.920] I know one writer said that she knew she would have delivered shocks.
[00:07:55.920 --> 00:07:59.120] But yeah, it poses the same kind of question, the Milgram experiments.
[00:07:59.120 --> 00:07:59.920] What would I do?
[00:07:59.920 --> 00:08:08.880] Are there situations where, I mean, it's also the fascination with cult documentaries and stories of elaborate financial scams.
[00:08:08.880 --> 00:08:16.480] We always want to reassure ourselves that we wouldn't take part or we wouldn't fall for that or we wouldn't do that.
[00:08:17.120 --> 00:08:19.600] Have you ever fallen for anything like that?
[00:08:19.600 --> 00:08:23.360] Any personal experiences that made you interested in this topic?
[00:08:23.360 --> 00:08:37.440] Yeah, I mean, I have many from the, I guess what, you know, there's the one I mentioned in the book is just the experience in graduate school of kind of falling into a circle of it.
[00:08:37.440 --> 00:08:44.000] So there's seeming, there's sort of intellectual level and then there's deeply personal levels, and then there's seemingly trivial levels.
[00:08:44.000 --> 00:08:59.560] But on the intellectual level, during graduate school, I kind of became, I fell into a crowd of scholars, and my dissertation advisor was well known for his, for he was quite famous for his relationship with Michel Foucault.
[00:08:59.440 --> 00:09:01.960] And I learned a lot from learning from him.
[00:09:02.120 --> 00:09:13.640] But these kind of post-structuralist theorists that were in vogue at the time at UC Berkeley, I mean, I just was in a circle where people were reading them as if they were these fragments of sacred texts.
[00:09:13.640 --> 00:09:16.360] And they would be, you know, I was falling into this too.
[00:09:16.360 --> 00:09:20.680] I thought if I could just read these very, they were very hard to understand.
[00:09:20.680 --> 00:09:28.840] So you would take them home and you puzzle over what could this Jacques Lacan have meant here and bring it back to the group and people would disagree.
[00:09:28.840 --> 00:09:45.640] I remember a moment seeing a graduate, I was meeting with my advisor and a graduate student ran in, or just a professor I was taking a class with, and he ran in brandishing one of these books and waving it around like he had, like he was intoxicated.
[00:09:45.640 --> 00:09:47.240] And he said, it is not about truth.
[00:09:47.240 --> 00:09:48.280] It is about love.
[00:09:48.600 --> 00:09:51.560] And then she yelled back, no, it is about truth.
[00:09:51.560 --> 00:09:52.840] And he said, no, love.
[00:09:53.160 --> 00:09:55.880] And it was like a, it was like a movie.
[00:09:55.880 --> 00:09:56.840] It was very funny.
[00:09:56.840 --> 00:10:01.960] And I don't think, I don't suppose they saw it quite, you know, you get carried away.
[00:10:01.960 --> 00:10:10.120] And anyway, as it relates to myself, I wrote a paper for which I received a lot of praise.
[00:10:10.120 --> 00:10:15.080] It seemed at the very edge of being understandable, yet maybe profound.
[00:10:15.080 --> 00:10:16.920] And I eventually got it published.
[00:10:16.920 --> 00:10:23.880] But some friend of mine who was a journalist read it and he said, I don't even recognize your writing here.
[00:10:23.880 --> 00:10:25.560] I don't know, like, who is this?
[00:10:25.560 --> 00:10:27.400] Who is this person writing?
[00:10:27.400 --> 00:10:33.720] He said that it just seemed not like me and not really completely understandable.
[00:10:34.120 --> 00:10:41.480] He was kind of a stickler for not necessarily easy writing, but writing that you could really get to the bottom of.
[00:10:41.480 --> 00:10:53.440] So, anyway, that was a little bit of a wake-up call where I saw that, you know, at certain moments you can get carried away, you can become distant from maybe yourself.
[00:10:53.440 --> 00:11:20.560] And, you know, the more personal part that happened around the same time was just that I fell into a kind of coercively controlling relationship that had to do with drug addiction somewhat because of a fair, you know, I think I was, I think I felt at the time that I was, I disappointed my family and I basically wanted to check out and I thought I can handle this.
[00:11:20.560 --> 00:11:25.440] This will just be a little bit, I'll just take a few steps in this direction.
[00:11:25.440 --> 00:11:35.600] And it turned into a couple of years of just a very painful experience where the deeper I got into it, the less I was able to get out.
[00:11:35.600 --> 00:11:37.600] And I lost perspective on it.
[00:11:37.600 --> 00:11:54.560] I really thought I no longer deserved, you know, to maybe even survive this addiction and this relationship where I constantly felt that I wasn't measuring up to the standards this person was constantly illuminating.
[00:11:54.560 --> 00:12:08.320] So just having, I guess, one thing that added to my research and just the fortune of getting out of that difficult situation gave me a lot of insight into, you know, on the personal level.
[00:12:08.320 --> 00:12:15.360] It reminded me recently, my daughter took a class on the poetry, the literature of Dante.
[00:12:15.360 --> 00:12:24.960] And he, at the beginning of his great work, he talks about how in the middle of our life's journey, I found myself in a dark wood, the right road lost.
[00:12:24.960 --> 00:12:32.200] And I thought, wow, that is that, I mean, and for Dante, apparently, middle of our life's journey was your late 20s or early 30s.
[00:12:32.440 --> 00:12:39.480] And, you know, so I did find myself at this lost, you know, and the right road lost.
[00:12:39.480 --> 00:12:46.200] And to have that experience made me wonder, how can we, you know, how does that happen?
[00:12:47.800 --> 00:12:48.840] Really fascinating.
[00:12:48.840 --> 00:12:51.560] Was this a romantic relationship, I presume?
[00:12:51.560 --> 00:12:53.880] And how did you get out of that?
[00:12:54.200 --> 00:13:15.000] Well, I was just very fortunate that because, as is often the case in these kind of, I don't know, I was just saying bad relationships, you go in thinking it's fine, and then you lose touch with friends, you lose touch with family, and you lose relationships through which you can test the sanity of it.
[00:13:15.000 --> 00:13:21.480] And after, so I was kind of lost in this for a time.
[00:13:21.480 --> 00:13:26.280] And I happened to have a coffee date with one of my few remaining friends.
[00:13:26.280 --> 00:13:31.720] This is kind of funny, but I mean, I feel so grateful to her.
[00:13:31.720 --> 00:13:46.040] My friend Nikki, anyway, went out for coffee with her and she just said, you know, this person, he acts like he has, he acts like he's better looking, smarter, and much funnier than you, but he's none of those things.
[00:13:46.040 --> 00:13:52.920] And somehow it was like this wake-up call that this wasn't the only reality, was the reality this person had painted.
[00:13:52.920 --> 00:14:02.520] And also my, you know, my capacities were diminished by the kind of, you know, the effects of drugs as well.
[00:14:02.520 --> 00:14:06.760] So just having somebody else's perspective was useful.
[00:14:06.760 --> 00:14:22.240] Yeah, reality check is super important, which is one reason why cults, one of the characteristics of cults, is they get you to leave your family or disconnect from your friends and colleagues or anybody that could give you a reality check who might say, you know, are you sure this is the right thing to do?
[00:14:22.560 --> 00:14:44.720] Sometimes even children, you know, control over one's children is a very, shall I say, effective and malevolent aspect of abusive cults: you have to surrender your biological children, supposedly for their own good, but it creates a lot of control, guilt, and emotional attachment.
[00:14:44.720 --> 00:14:57.760] Yeah, the drug use, that's interesting because that's one of the things Manson did with the young women was, you know, they did a lot of LSD, as you wrote about in your book, and other drugs, and also a lot of sex.
[00:14:58.560 --> 00:15:06.880] You know, I don't think you need MK Ultra mind control technology to explain Manson's influence over these followers of his.
[00:15:07.280 --> 00:15:15.360] I think, you know, just good old human nature and how people react that way in response to those sorts of things explains a lot.
[00:15:15.360 --> 00:15:21.760] As you were talking, I was thinking of, you know, is postmodernism in that sense, a kind of a cult?
[00:15:21.760 --> 00:15:30.960] Not really a cult, but, you know, along the lines of what you're talking about, you fall into a worldview in which everybody you know is kind of steeped in the language.
[00:15:30.960 --> 00:15:36.000] And the language of postmodernism, which I've, you know, I try, I try to read those things.
[00:15:36.000 --> 00:15:45.440] And, you know, it's a little bit like in UFO circles, we have a thing called low information zone, or you could apply this to Bigfoot or whatever.
[00:15:45.680 --> 00:15:48.240] You know, the photos are always kind of grainy, blurry.
[00:15:48.240 --> 00:15:50.560] You can't quite make out what's going on.
[00:15:50.560 --> 00:15:52.080] And that's where people kick in.
[00:15:52.080 --> 00:15:56.240] Well, see, if you kind of use your imagination and squint, you can sort of see right there.
[00:15:56.240 --> 00:15:58.160] It's like, I don't really see that.
[00:15:58.160 --> 00:16:00.760] But that's where the imagination then kind of fits in.
[00:16:01.000 --> 00:16:13.240] Maybe, you know, this kind of intentional obscurantism of that kind of writing is meant to do that, to make it a low information zone, so that therefore you can fill in the blanks however you want.
[00:16:13.240 --> 00:16:20.840] Or you're sort of elaborately instructed in how one should be filling in those blanks or the direct, at least the direction one should be going.
[00:16:20.840 --> 00:16:30.120] But it also reminds me: you know, there is an obscurantism in many beautiful things like poetry.
[00:16:30.120 --> 00:16:37.800] And, you know, who would want to deny themselves the ability to read into or to, you know, a poem shouldn't tell you what it means.
[00:16:37.800 --> 00:16:41.000] It should allow the reader to find those things.
[00:16:41.000 --> 00:16:43.000] So, but I think it can be abused.
[00:16:43.000 --> 00:16:51.560] So, I guess I found, you know, now with several decades behind me, I find that I can appreciate some of these writings.
[00:16:52.200 --> 00:16:55.080] Some of them I like still, some of them I don't like.
[00:16:55.080 --> 00:16:59.880] It's sort of like poets I like, poets I don't like, but it's more in that framework.
[00:16:59.880 --> 00:17:24.280] And I can also see that at some moments they could definitely become instruments of a kind of, not, you know, not a nefarious thing, I wouldn't call it that, you could say a kind of social circle where maybe it didn't lead to really interesting work necessarily, a lot of mimicry and kind of hyper-policing of each other's behavior and thoughts a little bit.
[00:17:24.280 --> 00:17:25.480] That could happen.
[00:17:25.800 --> 00:17:26.520] Yeah.
[00:17:26.840 --> 00:17:34.520] From this relationship you were in, then let's extrapolate to talking about why people stay in abusive relationships at all.
[00:17:34.520 --> 00:17:46.800] Or, or you know, we just had the Diddy trial where in his defense, you know, his attorneys were saying, well, look, here's the text where she's saying, oh, I can't wait to freak off with you this weekend.
[00:17:47.120 --> 00:17:48.880] So, I mean, why is she doing that?
[00:17:48.880 --> 00:17:55.600] Why is she even saying that if she's under his control and influence and threat and so on?
[00:17:55.600 --> 00:17:58.000] Yeah, I don't know, maybe speak to that a little bit.
[00:17:58.000 --> 00:18:12.720] Yeah, I suppose if I hadn't been through this relationship myself in the past... no, I think there's an argument that worked very much against the prosecution; it didn't work in the trial.
[00:18:12.960 --> 00:18:27.200] I mean, the fact that there was a lot of evidence, or in the words of the women themselves, that they seemed to be enthusiastic participants in their own abuse, and that could then be portrayed as, well, this young woman just liked sex, good for her.
[00:18:27.200 --> 00:18:41.600] That's what the attorney, the defense attorney, Marc Agnifilo, who also defended Keith Raniere, um, he was quite successful in this case because, you know, people say, well, they were grown women, this is just something they were choosing.
[00:18:41.600 --> 00:18:47.760] And I do have, so it's tricky when you're in a legal context to make these things stick.
[00:18:47.760 --> 00:18:57.280] And I think the history really shows that questions of mind control, brainwashing, coercive control are very difficult to litigate in court.
[00:18:57.280 --> 00:19:08.480] And that's why they often have to make use of RICO and racketeering charges and other types of financial improprieties, which often accompany sexual abuse.
[00:19:08.480 --> 00:19:12.480] But personally speaking, I could relate to the women.
[00:19:12.480 --> 00:19:26.720] I mean, you know, I remember for years constantly flattering my boyfriend because it was this way that the relationship was set up that he was great.
[00:19:26.720 --> 00:19:28.480] You know, he was just so great.
[00:19:28.480 --> 00:19:32.920] And maybe one day the world would recognize how great he was.
[00:19:33.240 --> 00:19:37.880] But I could provide a constant stream of like reinforcement for him.
[00:19:37.880 --> 00:19:42.120] Oh, you know, you haven't written your dissertation yet, but when you do, it's going to be awesome.
[00:19:42.120 --> 00:19:46.360] And he could at the same time provide a constant stream of denigration to me.
[00:19:46.360 --> 00:19:48.120] So that was kind of the bargain we had.
[00:19:48.120 --> 00:19:52.200] And looking back, and I could think for years about why that happened.
[00:19:52.200 --> 00:19:53.000] What was it?
[00:19:53.320 --> 00:20:00.760] You know, it's provided a lot of opportunity for insight, and it never went to the level of me having to prove this in court.
[00:20:00.760 --> 00:20:08.920] But if you know, because there was no, ultimately, no crime, it was just a real, it was a lot of suffering.
[00:20:08.920 --> 00:20:12.040] And I feel a lot of compassion for the women, you know.
[00:20:12.680 --> 00:20:13.640] Yeah, for sure.
[00:20:13.640 --> 00:20:22.520] Well, you know, there's a couple of other phenomena: loss aversion and sunk cost fallacy, where, you know, you're, you're, you're, you're this far down the relationship.
[00:20:22.680 --> 00:20:23.960] I hate to give up now.
[00:20:23.960 --> 00:20:25.400] Maybe it'll work out.
[00:20:25.400 --> 00:20:30.920] You know, it's hard to cut your losses and get out like in investments or whatever.
[00:20:30.920 --> 00:20:33.640] And it's hard in relationships in the same way.
[00:20:33.640 --> 00:20:34.040] Yeah.
[00:20:34.040 --> 00:20:41.880] I mean, then they're just, I think people who do end up leaving abusive cults do describe a kind of moment.
[00:20:41.880 --> 00:20:43.720] Sometimes they go on for years.
[00:20:43.720 --> 00:20:53.720] And then, you know, surprisingly, there's just a break, or they sometimes describe it as snapping, where the whole system of belief suddenly shatters.
[00:20:53.720 --> 00:21:02.520] And it can be through maybe an accident or a shocking encounter with an old friend, or seeing a family member.
[00:21:02.520 --> 00:21:05.240] Or actually, this is a great example.
[00:21:05.240 --> 00:21:10.600] I was being interviewed by a cult therapist who's worked for decades with cult members.
[00:21:10.600 --> 00:21:25.680] And she was saying she has a client who was in QAnon, and he had been a kind of a just a mainstream Democrat for many years, and then had fallen into this QAnon circle and gotten really pulled in during COVID.
[00:21:25.680 --> 00:21:30.080] And he was then joining demonstrations.
[00:21:30.080 --> 00:21:39.920] He was in Times Square, and he was yelling in the street that something like Hillary Clinton needed to be arrested and shot.
[00:21:39.920 --> 00:21:47.440] And then he turned and saw himself in the window of a building, a mirrored building.
[00:21:47.440 --> 00:21:52.320] So he saw himself and he thought, who's that madman jumping up and down?
[00:21:53.200 --> 00:21:54.960] And he realized it was himself.
[00:21:54.960 --> 00:21:58.240] And that was the shocking moment of self-recognition.
[00:21:58.240 --> 00:22:01.120] But those kind of shocks can occur.
[00:22:01.120 --> 00:22:10.480] And many coercive groups, coercively controlling groups, have ammunition that they then provide that you should apply when shocked.
[00:22:10.480 --> 00:22:23.600] So they'll tell you to repeat certain words over and over again or reassure yourself or know that anyone who challenges you has to be evil or various or various techniques.
[00:22:23.920 --> 00:22:27.360] Don't have mirrors in your cult room.
[00:22:28.000 --> 00:22:28.640] Avoid mirrors.
[00:22:28.800 --> 00:22:29.600] Who is that madman?
[00:22:29.600 --> 00:22:30.800] Oh, it's me.
[00:22:31.520 --> 00:22:39.760] Yeah, well, the Keith Raniere NXIVM thing is interesting because it surely did not start off as a cult.
[00:22:39.760 --> 00:22:43.360] And in fact, I'm fond of saying no one in the history of the world has ever joined a cult.
[00:22:43.360 --> 00:22:46.160] You know, they join a group that they think is going to be really good.
[00:22:46.160 --> 00:22:47.360] We're going to help the poor.
[00:22:47.360 --> 00:22:54.880] I mean, you look at Jim Jones, you know, migrated from Indiana to San Francisco Bay Area in the 60s.
[00:22:54.880 --> 00:22:59.600] You know, one of the first to integrate a church instead of just being all white or all black.
[00:22:59.600 --> 00:23:00.440] It was both.
[00:23:00.440 --> 00:23:07.080] And, you know, there's pictures of him with Jerry Brown, Governor Jerry Brown, in his first round as governor of California.
[00:23:07.080 --> 00:23:10.440] That's very innovative and, you know, very progressive.
[00:23:10.440 --> 00:23:12.200] You know, we're going to man the soup kitchens.
[00:23:12.200 --> 00:23:13.240] We're going to help poor people.
[00:23:13.240 --> 00:23:14.120] This is all good.
[00:23:14.200 --> 00:23:17.160] You know, who would not want to join a helpful group like that?
[00:23:17.160 --> 00:23:19.480] You know, and then 20 years later, you're in Guyana.
[00:23:19.480 --> 00:23:19.880] Okay.
[00:23:20.520 --> 00:23:27.720] You know, so it's a kind of a gradual, slow, you know, 15 volts at a time kind of escalation game that happens there.
[00:23:27.720 --> 00:23:31.080] But this brings to mind a deeper question I have.
[00:23:31.560 --> 00:23:35.720] This relates to what Danny Kahneman called base rate neglect.
[00:23:35.720 --> 00:23:39.400] That is, you know, you say, here's this interesting phenomenon.
[00:23:39.400 --> 00:23:47.880] You know, here's, you know, the 15 women that joined NXIVM and they had his initials branded in their crotch or something like that.
[00:23:47.880 --> 00:23:50.360] And so we focus on that, but what's the base rate?
[00:23:50.360 --> 00:23:58.280] How many women joined NXIVM as a kind of self-help, empowering women in business group?
[00:23:58.280 --> 00:24:06.760] You know, maybe it's hundreds or thousands or whatever, and they got 15 of them, or like that Netflix series, The Tinder Swindler.
[00:24:06.760 --> 00:24:13.320] You know, they had four women that they kind of focused on that fell for this guy and wired him tens of thousands of dollars and so on.
[00:24:13.320 --> 00:24:15.480] But how many women did he try it out on?
[00:24:15.480 --> 00:24:17.160] You know, they don't mention that.
[00:24:17.160 --> 00:24:18.120] What's the base rate?
[00:24:18.120 --> 00:24:21.880] I mean, is it four out of a thousand, four out of a hundred, four out of forty?
[00:24:22.440 --> 00:24:23.480] You know, I don't know.
[00:24:23.480 --> 00:24:25.560] And so how do we think about that?
[00:24:25.880 --> 00:24:30.760] Well, in the case of NXIVM, that is a really great question to ask.
[00:24:30.760 --> 00:24:52.800] And I think that's one reason why there is no absolute list of abusive cults, because many people skate along on the periphery of what is by almost all accounts a dangerous cult, but they're not harmed by it, because maybe they didn't get that involved, or they didn't go live in the compound, or they just took yoga classes once a week or something like that.
[00:24:52.800 --> 00:25:06.000] So it's hard to say, but I did have a very illuminating interview with Sarah Edmondson, who is one of the women, one of the 15; she was one of the first to be branded on her leg.
[00:25:06.000 --> 00:25:09.840] And actually, she's a fascinating person.
[00:25:09.840 --> 00:25:24.640] And she said herself she was happy they didn't go all in, because they never went to live in Albany in the compound, she and her husband, so they maintained for themselves some arena of autonomy.
[00:25:24.640 --> 00:25:27.760] She worries about what would have happened; things, you know, could have been worse.
[00:25:27.760 --> 00:25:43.280] But she did say at the moment she was receiving the brand on her body, they were laughing and they were saying, if somebody walked in right now, they might think we're in a cult, but haha, obviously we're not.
[00:25:43.280 --> 00:25:48.400] So it's so interesting how capable we are in incredible duress.
[00:25:48.400 --> 00:25:50.880] And she said she's had years of therapy.
[00:25:50.880 --> 00:25:53.840] Cheap Caribbean vacations.
[00:25:53.840 --> 00:25:56.240] Get more sun for your dollar.
[00:25:56.560 --> 00:26:02.400] Vacation planning should feel like a breeze, not a deep dive into countless travel sites searching for the best deal.
[00:26:02.400 --> 00:26:08.480] With Cheap Caribbean's Budget Beach Finder, get every destination and every date, all in one search.
[00:26:08.480 --> 00:26:14.880] Go to cheapcaribbean.com and try the Budget Beach Finder and see how stress-free vacation planning should be.
[00:26:14.880 --> 00:26:17.880] Cheap Caribbean vacations.
[00:26:17.880 --> 00:26:20.480] Get more sun for your dollar.
[00:26:21.360 --> 00:26:22.520] To deal with that.
[00:26:22.240 --> 00:26:32.120] But, but you know, just looking back at that moment, or to hold up a mirror, in a sense, to, you know, for herself to try to understand how that could have happened.
[00:26:32.360 --> 00:26:43.960] So, yeah, but the question of base rate neglect is also an important one because it's almost every group, not everyone is exposed to the most extreme.
[00:26:43.960 --> 00:26:48.280] And also, it happens step by step in a slow manner.
[00:26:48.280 --> 00:26:56.840] So certain people might gravitate to different, you know, echelons of the group.
[00:26:56.840 --> 00:26:58.920] And some people might be enthusiastic.
[00:26:58.920 --> 00:27:03.480] Or sometimes people draw back for a number of years and then they get more involved later.
[00:27:04.440 --> 00:27:18.680] Yeah, like how many people have taken the little personality test of Scientology or they're down on Sunset or Hollywood Boulevard and they have the little table set up there with the cans and you can do the little e-meter thing.
[00:27:18.680 --> 00:27:19.720] Big fun.
[00:27:20.120 --> 00:27:23.960] I mean, probably tens of thousands of people have done it and just walked away.
[00:27:25.240 --> 00:27:29.320] And how many actually take out a second mortgage on their home and give it to the church?
[00:27:29.640 --> 00:27:30.840] Not many.
[00:27:31.160 --> 00:27:34.760] So this, okay, here's the larger picture for me.
[00:27:35.240 --> 00:27:42.200] I've spent 30 plus years here at Skeptic Magazine debunking all manner of weird things that people believe.
[00:27:42.200 --> 00:27:45.480] So it just seems like people are incredibly irrational.
[00:27:45.800 --> 00:27:48.760] But I've been kind of rethinking it in the last couple of years.
[00:27:49.160 --> 00:28:01.000] Hugo Mercier's book, what's it called, Not Born Yesterday, argues that we're not that irrational.
[00:28:01.000 --> 00:28:05.560] Most of the time, most people maintain a fairly rational life.
[00:28:06.280 --> 00:28:11.880] They maintain a job, a family, a marriage, kids, gas in the tank, food in the fridge.
[00:28:11.880 --> 00:28:13.240] They go about their lives.
[00:28:13.240 --> 00:28:14.920] And maybe there's just one quirky thing.
[00:28:15.680 --> 00:28:38.000] They went to the January 6th thing in Washington, D.C., and they did the, I don't know what I was thinking, but otherwise, you know, or like the guy that went to the Comet Ping Pong pizzeria in Washington, D.C., Edgar Welch, you know, to break up the pedophile ring that, you know, QAnon told him was, you know, going on there.
[00:28:38.000 --> 00:28:40.400] And, you know, there's a kind of an underlying rationality.
[00:28:40.400 --> 00:28:50.880] If you really believe there was a pedophile ring and crime and no one was doing anything about it, you know, people do not infrequently take the law into their own hands when there's an injustice.
[00:28:50.880 --> 00:28:56.240] You know, so you could kind of see his, then he gets there and like, oh, and then he shoots his gun into the roof.
[00:28:56.240 --> 00:28:57.120] He gets arrested.
[00:28:57.120 --> 00:29:00.880] He spends a couple of years in jail and comes out and goes, I don't know what I was thinking.
[00:29:00.880 --> 00:29:02.080] That was just crazy.
[00:29:02.240 --> 00:29:04.240] Otherwise, you know, perfectly normal.
[00:29:04.240 --> 00:29:16.960] Or there was that woman in Texas we found for this Netflix series I worked on on brainwashing where she went down the QAnon rabbit hole and she was otherwise married with kids, attractive, ran her own PR business.
[00:29:16.960 --> 00:29:20.080] I forget her name now, but she talked us through the whole thing.
[00:29:20.080 --> 00:29:21.440] And this is what I did.
[00:29:21.440 --> 00:29:27.360] You know, with the COVID shutdown, you know, I just spent all day online reading all the QAnon stuff.
[00:29:27.360 --> 00:29:32.400] And before you know it, and then she plays on her phone, the voicemail, her husband left her.
[00:29:32.400 --> 00:29:36.800] I'm taking the kids and leaving you if you don't dump this QAnon craziness.
[00:29:36.800 --> 00:29:39.680] And she said, you know, it came down to this choice.
[00:29:39.680 --> 00:29:43.440] This is the most important thing I will ever have an opportunity to do.
[00:29:43.440 --> 00:29:50.000] I can save the country myself or just be a normal mom, wife, and, you know, that's boring.
[00:29:50.000 --> 00:29:55.600] And then eventually she went back and had her moment of revelation there.
[00:29:55.600 --> 00:29:59.680] But still, it's like, wow, okay, this could happen to anybody.
[00:29:59.880 --> 00:30:28.600] Yeah, I mean, it's, I think that this is why there are several stories of researchers going to study something like the Flat Earth Society or QAnon and actually getting converted because it's not, you think that would never happen, but actually, you do come to sympathize with the rationality that's being displayed there because often there is a kind of rigorous step following.
[00:30:28.840 --> 00:30:30.200] There's a kind of logic to it.
[00:30:30.360 --> 00:30:38.520] I actually, for whatever reason, I can often feel like interested in the logic, logical trains people follow to get to a certain point of view.
[00:30:38.520 --> 00:30:44.840] So I'm constantly feeling that just, it's almost like exercise.
[00:30:44.840 --> 00:30:58.280] But I think one of the points I'm trying to draw out from my research is that what we think of as rationality is, or reason, is often undergirded, is always undergirded by emotion.
[00:30:58.280 --> 00:31:05.560] There is an identification first, and then the rationality or the reasoning can kind of unfold.
[00:31:05.560 --> 00:31:22.440] And that is supported by even, you know, the Enlightenment scholars like David Hume, who talk about how our passions, they are, that our reason is only ever a slave to what we're convicted by, our passions, our emotions.
[00:31:23.080 --> 00:31:25.240] Yeah, that's one of my favorite quotes.
[00:31:26.360 --> 00:31:34.200] You know, what he's saying there is that how that reason and rationality is the best tool we have for achieving our goals.
[00:31:34.200 --> 00:31:39.400] But what goals you pick, you know, most people, we're not rational about that.
[00:31:39.400 --> 00:31:44.280] You know, I'm a conservative or I'm a liberal because, well, I don't know, these are my people.
[00:31:44.280 --> 00:31:46.880] This is kind of what I vibe with them.
[00:31:44.840 --> 00:31:51.200] But if you ask me, I'm going to go, well, okay, I'll tell you the six reasons why I'm a conservative.
[00:31:51.600 --> 00:31:55.360] I believe in freedom and religion and faith and family, you know, whatever.
[00:31:55.360 --> 00:31:58.080] I'll give you reasons after the fact.
[00:31:58.240 --> 00:31:59.600] Yeah, I appreciate that.
[00:31:59.600 --> 00:32:08.160] I mean, that reminds me also of one of my favorite passages from Max Weber, where he says, in Science as a Vocation, he says, science can tell you the answer to almost everything.
[00:32:08.160 --> 00:32:10.960] How, you know, how to, you know, what to eat.
[00:32:10.960 --> 00:32:20.800] I mean, just extrapolating, he says, it can give you in exactitude many answers, but it can never answer the one question that is important to us.
[00:32:21.680 --> 00:32:25.600] Who, you know, what am I going to do and how am I to live?
[00:32:25.600 --> 00:32:36.160] So it, you know, you can, it can give you the surrounding calculations and not just exactitude, but actually rational reasons.
[00:32:36.160 --> 00:32:44.560] But it can't answer the questions that he said, I think, divide every human heart or something like that, that pull at our hearts.
[00:32:44.880 --> 00:32:49.280] Yeah, it should be like trying to make a list of why I love my wife.
[00:32:49.280 --> 00:32:52.160] You know, well, give me the six reasons or whatever.
[00:32:52.400 --> 00:32:55.520] The real answer should be, there's, I just do.
[00:32:55.840 --> 00:32:56.480] That's it.
[00:32:56.480 --> 00:32:57.040] Full stop.
[00:32:57.040 --> 00:32:59.040] Because if I say, well, she's really good looking.
[00:32:59.040 --> 00:33:01.120] She's a, you know, a nine on a scale of 10.
[00:33:01.120 --> 00:33:02.240] Well, what if you met a 10?
[00:33:02.240 --> 00:33:03.040] Would you leave her?
[00:33:03.040 --> 00:33:06.240] Like, well, no, because that's not the reason in the first place.
[00:33:06.480 --> 00:33:06.960] Yeah.
[00:33:09.120 --> 00:33:13.680] So, I mean, that's kind of a rational irrationalities.
[00:33:14.000 --> 00:33:18.640] You know, other examples of this is, you know, I don't want to know the outcome of the game because I recorded it.
[00:33:18.640 --> 00:33:26.080] So, you know, I choose to remain ignorant so I can, you know, later watch it or the, or the sign on the Brinks truck that has the money.
[00:33:26.080 --> 00:33:29.880] You know, the driver does not know the combination to the safe, right?
[00:33:29.280 --> 00:33:33.960] So the lack of knowledge is actually better for you to not know things, right?
[00:33:29.760 --> 00:33:38.200] So it's kind of a rational ignorance in a way.
[00:33:38.200 --> 00:33:39.960] But yeah.
[00:33:39.960 --> 00:33:43.720] Yeah, I think that's frequently the case with military operations, right?
[00:33:43.720 --> 00:34:05.000] Because, you know, to go back to what's at the beginning of my book, you talk about the POWs, I mean, what they can actually know should they be captured is often a consideration, because anyone under torture will eventually, I mean, pretty much anyone, if the conditions are applied, will give up the information they have.
[00:34:05.000 --> 00:34:08.200] So it's better if they simply don't know.
[00:34:09.160 --> 00:34:18.040] I mean, this is the ultimate argument against torture or enhanced interrogation or whatever, because people, you're not getting accurate intelligence.
[00:34:18.200 --> 00:34:23.240] People are just going to say whatever they have to say to get you to stop, including confessing crimes, right?
[00:34:23.240 --> 00:34:29.640] I mean, back in the witch trials, yes, I am a witch or, you know, he's a witch, you know, whatever.
[00:34:29.800 --> 00:34:33.720] Just whatever it takes to stop turning the rack further.
[00:34:33.720 --> 00:34:34.440] Yeah.
[00:34:35.080 --> 00:34:35.400] Yeah.
[00:34:35.400 --> 00:34:37.400] It's a significant part of brainwashing.
[00:34:37.400 --> 00:34:50.200] I don't think it's exactly the same, but it does seem to be a huge part of, especially, the generating, you know, incidents, the episodes that caused brainwashing to come into the English language.
[00:34:50.360 --> 00:35:02.360] There's a significant amount of torture, just the question of what people will do and say when they find themselves in an impossible situation, a very painful situation, the types of confessions they'll attempt.
[00:35:02.360 --> 00:35:09.080] Like some interesting, in your chapter, you sent me, there was a mention of Robert J.
[00:35:09.080 --> 00:35:12.040] Lifton and his book on the Nazi doctors.
[00:35:12.040 --> 00:35:17.920] And of course, he's famous too for being one of the first, or he wrote one of the most significant books about thought reform.
[00:35:18.080 --> 00:35:22.800] And some of the cases he describes are so eloquent, so interesting.
[00:35:23.120 --> 00:35:37.680] For example, this French doctor who's caught, you know, he had been living in China as a physician and he didn't want to leave, even though people said when the communists came in, it's probably wise for you to go back to France.
[00:35:37.680 --> 00:35:41.680] But he sent his family back, but he stayed himself.
[00:35:41.680 --> 00:35:54.320] And he, once he was arrested and subjected to intensive thought reform, he was kept in chains and not allowed to any autonomy, you know, personally.
[00:35:54.640 --> 00:35:58.800] But he, and then these long interrogations, he wasn't allowed to sleep.
[00:35:58.800 --> 00:36:02.000] He couldn't even eat.
[00:36:02.000 --> 00:36:08.960] Or when he had to relieve himself, like other prisoners had to unzip him, you know, his pants, or he had to eat off of the floor like a dog.
[00:36:08.960 --> 00:36:11.840] And, you know, all these, and he has just had these very painful chains.
[00:36:11.840 --> 00:36:17.120] So he's, he says, after a while, you just tell yourself, I have to get rid of the chains.
[00:36:17.120 --> 00:36:19.440] I have to get rid of these chains.
[00:36:19.760 --> 00:36:32.240] And so when they ask him to confess to being a spy, initially his response is to say, you know, once he, you know, into a couple, a week or so into it, he says, I was a great spy.
[00:36:32.240 --> 00:36:36.480] I did all these, you know, he does these sort of grandiose confessions.
[00:36:36.480 --> 00:36:37.920] And they say, no, we won't accept that.
[00:36:37.920 --> 00:36:39.280] We know that's not true.
[00:36:39.280 --> 00:36:41.360] We know you're guilty, but that's not acceptable.
[00:36:41.360 --> 00:36:42.480] That type of confession.
[00:36:42.480 --> 00:36:45.920] It has to be an empirically grounded confession.
[00:36:45.920 --> 00:37:18.200] And he realizes that it's almost like they cooperate, they collaborate over the next several weeks, or actually, it turns out, a couple of years, to create a seemingly realistic account of all of his actions as he was living in China: who he spoke to, who he met on June 2nd, 1951, and they had a conversation about the price of tea in China, or how shoes were in short supply, or petrol was difficult to find, or things like that.
[00:37:18.200 --> 00:37:25.320] And he has to convince himself that he really was acting as a spy from a certain point of view.
[00:37:25.320 --> 00:37:29.560] It's not sufficient just to make up a confession to save himself the pain.
[00:37:29.560 --> 00:37:39.400] He actually has to come to believe it himself to the satisfaction of the authorities for him to be exonerated and ultimately released.
[00:37:40.680 --> 00:37:51.000] Yeah, that and the other stories that you tell in the book, particularly like Patty Hearst, I was thinking about Bob Trivers' theory of deception and self-deception.
[00:37:51.320 --> 00:37:59.960] You know, that we evolved a deception detection device, as it were, neural network or whatever, because we know people lie.
[00:37:59.960 --> 00:38:09.080] So it's good to have some cues to look for if you think somebody's going to cheat you out of something or lie to you because we're a social primate species.
[00:38:09.080 --> 00:38:11.080] We have to get along and so on.
[00:38:11.080 --> 00:38:15.640] But, you know, it's kind of a game theory, a game theoretic analysis.
[00:38:15.640 --> 00:38:22.520] So Bob's theory is that it's not enough to deceive people because they may detect it.
[00:38:22.520 --> 00:38:27.880] You won't give off the cues that they may detect if you believe the lie yourself.
[00:38:28.520 --> 00:38:33.320] And so you just, you know, to really convince other people, you just start really believing it.
[00:38:33.320 --> 00:38:38.600] So I think a lot of cult leaders are, you know, psychics who talk to the dead or astrologers or whatever.
[00:38:38.600 --> 00:38:41.720] I think a lot of them are just messing around, having fun, just trying it out.
[00:38:41.720 --> 00:38:47.760] Yeah, I'll try doing a psychic reading or I'll try some astrology readings just for fun.
[00:38:47.760 --> 00:38:49.600] And then they start to get good at it.
[00:38:49.600 --> 00:38:53.280] And then the more they actually believe, you know, maybe I do have these powers.
[00:38:53.280 --> 00:38:57.360] And then they get more positive feedback from the people they're interacting with.
[00:38:57.360 --> 00:39:01.440] And that makes them, you know, think even more, I can really do this.
[00:39:01.760 --> 00:39:02.160] Right?
[00:39:02.160 --> 00:39:07.920] So maybe Patty Hearst, you know, at some point, she's just thinking, I really have to convince these people.
[00:39:07.920 --> 00:39:13.840] So I'm just going to believe their ideology or whatever, something along those lines in the course of weeks or months.
[00:39:13.840 --> 00:39:27.920] Yeah, I think she said later in an interview after her release, she said to Larry King, by the time they were done with me, I really was a soldier in the Symbionese Liberation Army.
[00:39:27.920 --> 00:39:52.640] I mean, because she had decided at some point in her captivity and under these extenuating, you know, horrendous conditions, being held in a closet in the dark for 59 days, not being allowed to even see, you know, being blindfolded, not being allowed any autonomy, multiply raped, exposed to propaganda, all these things.
[00:39:52.640 --> 00:40:01.520] She said, once she decided she did want to live, she had to accommodate her thoughts to coincide with theirs.
[00:40:01.520 --> 00:40:05.680] And it wasn't enough because they kept saying, you're not brainwashed, are you?
[00:40:05.680 --> 00:40:08.000] They wanted to find those cues that she was lying.
[00:40:08.000 --> 00:40:12.160] And they're, you know, highly sensitive to that.
[00:40:12.640 --> 00:40:15.120] It's a major thing in the group's life as well.
[00:40:15.120 --> 00:40:18.720] So she had to, she had to actually convert herself.
[00:40:18.720 --> 00:40:24.400] And I think that's one thing that came up in the trial I noticed is that the experts, including Robert J.
[00:40:24.400 --> 00:40:31.880] Lifton, but especially Louis Jolyon West and others, they wanted to insist that she had never been converted.
[00:40:29.840 --> 00:40:34.280] But I think she actually did have to convert herself.
[00:40:34.440 --> 00:40:42.040] It's just that conversion, we mistake it, in my opinion, for something permanent, whereas it has to be perpetually renewed.
[00:40:42.360 --> 00:40:45.720] And that's why Patty Hearst isn't today a Symbionese Liberation Army member.
[00:40:45.720 --> 00:40:48.760] I mean, she just went through a terrible experience.
[00:40:48.760 --> 00:40:49.240] Yeah.
[00:40:49.560 --> 00:40:53.080] Well, but I think the way we think about it is that it's got to be black and white.
[00:40:53.080 --> 00:40:55.320] You're either this or that, and that's that.
[00:40:55.320 --> 00:40:57.960] Well, but, you know, life is just not like that.
[00:40:57.960 --> 00:41:07.480] I mean, we have a set of relatively permanent personality characteristics, like in the openness to experience or introversion, extroversion, conscientiousness, and so on.
[00:41:07.480 --> 00:41:09.160] But there's a lot of variation there.
[00:41:09.160 --> 00:41:12.040] And it does change throughout the lifetime a little bit.
[00:41:12.040 --> 00:41:16.920] So people are a lot more flexible, I think, than we realize.
[00:41:16.920 --> 00:41:26.440] And again, you have to, you know, so back to your original where you start the book with the Korean prisoners of war and what did you call it?
[00:41:26.440 --> 00:41:27.480] The volleyball problem.
[00:41:27.480 --> 00:41:28.840] You know, look, they're playing volleyball.
[00:41:28.840 --> 00:41:29.720] They must be having fun.
[00:41:29.720 --> 00:41:30.840] They must really love this.
[00:41:30.840 --> 00:41:32.520] Well, what else are you going to do?
[00:41:32.520 --> 00:41:34.040] You know, you're captured.
[00:41:34.360 --> 00:41:36.360] You got to kind of go along.
[00:41:36.360 --> 00:41:40.920] And then maybe at some point you think, yeah, you know, this communism thing, this is a really good idea.
[00:41:40.920 --> 00:41:43.560] Maybe, you know, maybe this would be good for America or something like that.
[00:41:43.560 --> 00:41:47.560] And you end up saying some idiotic statement that you later regret.
[00:41:47.560 --> 00:42:02.280] I suppose, too, also what I was struck by is in the volleyball, you know, the photos that came out of the POWs playing volleyball in some sandpit, and they looked a little, some of them looked a little chunky, like they'd gained weight.
[00:42:02.280 --> 00:42:06.760] And people commented in the U.S., you know, oh, they've put on some weight.
[00:42:06.760 --> 00:42:08.440] They can't possibly have starved.
[00:42:08.440 --> 00:42:10.440] They can't possibly have been maltreated.
[00:42:10.440 --> 00:42:12.840] They can't possibly have been tortured.
[00:42:12.840 --> 00:42:14.520] But all of those things had happened.
[00:42:14.520 --> 00:42:19.280] They just weren't immediately evident and it wasn't legible in these pictures.
[00:42:19.280 --> 00:42:36.240] And it's interesting how readily we do succumb to, I guess, visual evidence, but also, you know, these types of arguments, in a sense, it's an argument that what we can know, what we can know of somebody.
[00:42:36.240 --> 00:42:55.600] And I thought also striking as I studied these materials over years is that the men, the POWs returning from Korea, the ones who did return, as opposed to the 21 who decided to go to China, the ones who came back on a ship, they ended up writing a poem together, which is captured in a scholarly article.
[00:42:55.600 --> 00:42:59.680] And they said, they basically said, no one will ever understand.
[00:43:01.840 --> 00:43:06.240] They said, yes, we were badly wounded, badly treated, but what does it mean to you?
[00:43:06.240 --> 00:43:19.120] In other words, people will never understand what we went through that extracted this seemingly strange, freakish behavior of, you know, us making false confessions or quoting communist propaganda.
[00:43:19.120 --> 00:43:20.800] And this turned out to be the case.
[00:43:20.800 --> 00:43:21.760] They weren't understood.
[00:43:21.760 --> 00:43:23.360] And the same is true of Patty Hearst.
[00:43:23.360 --> 00:43:28.800] She said, I despaired of anyone believing or understanding my experience.
[00:43:28.800 --> 00:43:31.040] And that turned out to be true to this day.
[00:43:31.040 --> 00:43:45.120] You know, I think many people still find it impossible to believe what happened to her, and impossible to trust that she wasn't just somehow along for it.
[00:43:45.120 --> 00:43:49.120] There's this whole wild ride theory that she actually was enjoying what was happening.
[00:43:49.120 --> 00:43:54.000] And it's kind of a volleyball problem attributed to Patty Hearst.
[00:43:54.640 --> 00:44:00.200] Yeah, the famous picture in the bank with her hat and the rifle and all that.
[00:44:00.520 --> 00:44:01.240] Yeah, it's hard.
[00:43:59.440 --> 00:44:03.240] You look at that and go, Well, come on.
[00:43:59.600 --> 00:44:03.560] Yeah.
[00:44:03.880 --> 00:44:12.360] And the lawyer, the prosecuting lawyer, said her movements on the videotape taken by the Hibernia Bank, that she looked so agile.
[00:44:12.360 --> 00:44:13.720] She was moving so quickly.
[00:44:13.720 --> 00:44:15.240] She couldn't possibly have been brainwashed.
[00:44:15.240 --> 00:44:20.040] She couldn't have possibly been under their control because she was so she was so alert.
[00:44:20.040 --> 00:44:22.840] But she, you know, she had to make herself so.
[00:44:24.440 --> 00:44:32.040] Yeah, I just again, the term brainwashing, mind control, it's not accurate for the way the mind actually works.
[00:44:32.600 --> 00:44:39.400] It's an idea that you document went on for half a century that we think this is what is possible.
[00:44:39.400 --> 00:44:41.880] It's really not that possible.
[00:44:42.680 --> 00:44:44.840] It depends how you define it, I suppose.
[00:44:45.080 --> 00:44:54.040] I think it's not possible in the way people think, but it's useful; I guess, as someone now in the field of history of science, that's how I look at it.
[00:44:54.440 --> 00:45:00.680] I mean, what's interesting is how the word has flourished and how many things get attached to it.
[00:45:00.680 --> 00:45:07.320] And personally, I think it's interesting to use as a window for self-reflection.
[00:45:07.320 --> 00:45:09.800] You know, what do we think it, what do we think it means?
[00:45:09.960 --> 00:45:17.160] I think it is possible for people to be deeply changed by their experiences, sometimes in radical ways.
[00:45:17.160 --> 00:45:18.680] I mean, I've certainly seen that.
[00:45:18.680 --> 00:45:27.000] And you can come up with maybe more neuroscientifically accurate accounts or psychodynamically accurate or things like that.
[00:45:27.000 --> 00:45:32.040] But I guess that's why I do I argue that it is real.
[00:45:33.320 --> 00:45:34.200] Yeah, it's real.
[00:45:34.200 --> 00:45:40.360] I mean, for sure, people go down a rabbit hole and end up somewhere that they never would have anticipated.
[00:45:40.360 --> 00:45:47.840] But it's not like the CIA can just take control of somebody and get them to assassinate the president or something like that.
[00:45:44.760 --> 00:45:51.520] I guess that's, but let me just think about this out loud here.
[00:45:51.760 --> 00:45:54.800] I mentioned working in a Skinnerian lab.
[00:45:55.840 --> 00:46:04.640] Remember when Skinner was experimenting with having pigeons in the nose cone of a missile, you know, pecking at keys to make it, you know, direct correctly?
[00:46:04.640 --> 00:46:06.880] Now we have computers that do this.
[00:46:07.120 --> 00:46:10.400] After 9-11, Richard Dawkins wrote, I think it was in The
[00:46:40.320 --> 00:46:52.480] Spectator, a brilliant op-ed talking about Skinner's experiments, saying, well, you know, if only we could get people to do this kind of thing.
[00:46:52.480 --> 00:46:54.000] Well, there's no way we could get somebody.
[00:46:54.160 --> 00:46:57.840] Well, I wonder if we could get somebody to like fly a plane into a building.
[00:46:57.840 --> 00:47:00.560] Well, no, no, there's no way anybody would do that.
[00:47:00.560 --> 00:47:05.040] You know, and then, of course, his point is that religion does that.
[00:47:05.600 --> 00:47:08.480] Religion could get people to do these sorts of things.
[00:47:08.640 --> 00:47:13.680] So that brings to mind, you know, the difference between a cult and a new religious movement and religion.
[00:47:13.680 --> 00:47:18.640] You know, how do you think about that kind of the spectrum of different beliefs there?
[00:47:19.280 --> 00:47:21.920] I think it's a really rich territory.
[00:47:21.720 --> 00:47:23.040] I'll come back to that.
[00:47:23.040 --> 00:47:29.440] Just one small comment first, because I can't resist; I teach sections about B.F.
[00:47:29.440 --> 00:47:32.520] Skinner in my class, and I love to teach that project.
[00:47:33.160 --> 00:47:50.920] I don't know if he called it Project Pigeon, but it's the part you're describing, where he actually trained pigeons to guide missiles during World War II, around 1942, so that they could connect with a target.
[00:47:50.920 --> 00:47:54.200] And he was successful.
[00:47:55.000 --> 00:48:04.680] The military came in and toured his facility, and the pigeons, of course, would end up dying, but it would be, you know, would be a worthy sacrifice as he saw it.
[00:48:04.680 --> 00:48:10.760] So it was actually an interesting project because it actually worked.
[00:48:10.760 --> 00:48:14.760] And radar had not yet been developed, but it was concurrently being developed.
[00:48:14.760 --> 00:48:17.800] So it's just the fact that radar came in around that time.
[00:48:17.800 --> 00:48:29.720] But there are interesting accounts of when high brass military came to tour his facility and they were expecting this high-tech operation and they saw the pigeons and they just were, they just couldn't deal.
[00:48:29.720 --> 00:48:37.720] It wasn't the image that they wanted of a strong nation, you know, able to prevail using the latest technology.
[00:48:37.720 --> 00:48:45.080] It turned out to be just these homegrown birds, which were actually quite capable; I mean, the heroism of pigeons is a whole other story.
[00:48:45.080 --> 00:48:55.720] So anyway, I always just find it amazing that they rejected it not because it didn't work, but because it didn't symbolically fit.
[00:48:55.720 --> 00:48:56.280] Interesting.
[00:48:56.280 --> 00:48:57.960] Then also radar then came along.
[00:48:57.960 --> 00:48:59.880] So all right.
[00:48:59.880 --> 00:49:00.840] I did not know that.
[00:49:00.840 --> 00:49:02.360] That's super interesting.
[00:49:02.360 --> 00:49:07.400] Yeah, there's a wonderful article about this, but about religion.
[00:49:08.040 --> 00:49:14.760] So, new religious movements is a really interesting area, and as well as the history of religion.
[00:49:15.360 --> 00:49:31.440] And I just spent two weeks in Amsterdam at the University of Amsterdam in a course that was kind of about the history of human consciousness, psychedelic research, and also the history of religion, because it was taught by religious studies scholars.
[00:49:31.440 --> 00:49:37.200] And it really gave me some insight in a way into how they look at this terrain.
[00:49:37.200 --> 00:50:00.080] I mean, researching my book, I became interested in these cult wars back in the 80s, in which religious studies scholars and religious historians took great issue with the cult research and with the sociologists and psychiatrists, who tended to focus on the really pathological groups that get a lot of attention.
[00:50:00.080 --> 00:50:05.360] And as you're saying, they were focusing on the worst outcomes of those groups.
[00:50:06.080 --> 00:50:15.520] And so these scholars were saying we need to look more broadly at the history of religion and see that the literal definition of cult is merely a small religious group.
[00:50:15.520 --> 00:50:17.680] I mean, Christianity was a cult.
[00:50:17.680 --> 00:50:20.800] It became this big, then it grew into a religion.
[00:50:20.800 --> 00:50:22.320] This is kind of how things develop.
[00:50:22.320 --> 00:50:28.080] And we've pathologized the very meaning of cult, and they thought that was very unfair too.
[00:50:28.080 --> 00:50:32.480] That's why they came up with the term new religious movements to distance it.
[00:50:32.480 --> 00:50:53.360] And so they started to also systematically take down the leading scholars of cult research, such as Margaret Singer, who had been, who had really developed these vibrant careers testifying in court about brainwashing and the dangers of cults.
[00:50:53.440 --> 00:50:56.240] So, anyway, I don't have an absolute answer.
[00:50:56.240 --> 00:51:00.040] I think it's really profound.
[00:50:58.960 --> 00:51:02.040] I mean, many profound questions are raised.
[00:51:02.200 --> 00:51:08.200] Like, you were bringing up the question of gurus and how they may come to believe their own hype.
[00:51:08.200 --> 00:51:13.800] Like, they start, they get into astrology, they get into whatever it is, whatever.
[00:51:13.800 --> 00:51:18.520] Maybe they're using meditation, chanting, group techniques.
[00:51:18.520 --> 00:51:21.160] They're cultivating bonds among their followers.
[00:51:21.160 --> 00:51:25.880] They're also cultivating their own magnetism and charisma.
[00:51:27.080 --> 00:51:28.600] I think they do have these.
[00:51:28.760 --> 00:51:43.080] I think that often, as part of meditation or part of these practices, they are having these, what do you call them, out-of-body or sort of altered states of consciousness that lead them to believe that they're somehow godlike.
[00:51:43.080 --> 00:51:55.960] And then that leads them to believe that they're somehow beyond good and evil, or that whatever they do is righteous.
[00:51:55.960 --> 00:52:00.200] And then the group seems to, their followers seem to reinforce that.
[00:52:00.200 --> 00:52:03.000] And that's how a group can kind of pinwheel.
[00:52:03.000 --> 00:52:21.640] So I think the territory is much less clear than we think, and it's much harder to easily demarcate whether over time or within a group itself, what is pathological or even, you know, we would have to say evil in something like the Jim Jones cult.
[00:52:21.960 --> 00:52:22.760] Yeah.
[00:52:22.760 --> 00:52:29.560] Yeah, it's very spectrum-y, and I think it depends very much on what actually happens in a group.
[00:52:29.800 --> 00:52:35.080] Most of the time, you know, nothing bad is particularly happening, but we focus on the negative things.
[00:52:35.320 --> 00:52:38.680] It's a little bit like the demarcation problem between science and pseudoscience.
[00:52:38.680 --> 00:52:43.240] Well, there's no single definition that's going to cover all specific cases.
[00:52:43.240 --> 00:52:49.040] You know, Scientology, for the most part, I think of it as pretty cult-y, you know, but I've met Scientologists.
[00:52:49.040 --> 00:52:58.640] I met Isaac Hayes, the great, you know, singer, songwriter, the voice of Chef on South Park and so on.
[00:52:58.640 --> 00:53:00.960] He was at my best friend's Thanksgiving dinner.
[00:53:00.960 --> 00:53:03.440] He's sitting right across from me for hours.
[00:53:03.440 --> 00:53:04.480] Super nice guy.
[00:53:04.800 --> 00:53:11.040] And he had just had his career rejuvenated, and he was famously a Scientologist.
[00:53:11.040 --> 00:53:16.400] So I, as kindly and respectfully as I could, you know, what is it you get out of Scientology?
[00:53:16.400 --> 00:53:22.240] And he said, well, Michael, I was at the pinnacle of, you know, Hollywood music, the whole thing.
[00:53:22.240 --> 00:53:23.280] I had everything.
[00:53:23.280 --> 00:53:26.880] And I just lost it all, you know, basically sex, drugs, rock and roll, the whole thing.
[00:53:26.880 --> 00:53:27.840] Gone.
[00:53:27.840 --> 00:53:32.560] And I was at the bottom of my life, you know, and Scientology saved me.
[00:53:33.120 --> 00:53:33.760] Okay.
[00:53:34.080 --> 00:53:36.320] I mean, what am I going to say to that?
[00:53:36.320 --> 00:53:36.720] Nothing.
[00:53:36.720 --> 00:53:37.680] That's good.
[00:53:37.680 --> 00:53:38.240] Right?
[00:53:38.560 --> 00:53:42.400] And more recently, you know, Ayaan Hirsi Ali became a Christian.
[00:53:42.400 --> 00:53:47.520] You know, she was famously one of the, you know, the new atheists, and now she's a Christian.
[00:53:47.520 --> 00:53:48.720] What happened?
[00:53:48.720 --> 00:53:51.360] Well, you know, there's a lot of personal stuff, she said.
[00:53:51.360 --> 00:54:01.920] She's written about this now and spoken publicly about it, that, you know, there were issues with alcoholism and drugs and suicidal ideation.
[00:54:02.240 --> 00:54:06.960] You know, she's lived under a death threat, you know, the threat of being murdered since the 90s.
[00:54:07.440 --> 00:54:10.640] She lives with 24-hour protection.
[00:54:10.640 --> 00:54:14.880] You know, I've done a number of public events with her, and I hardly ever see her.
[00:54:14.880 --> 00:54:16.320] She's in the green room.
[00:54:16.320 --> 00:54:17.680] There's guards at the door.
[00:54:17.680 --> 00:54:20.240] I can't, you know, can't go in and see her, right?
[00:54:20.240 --> 00:54:22.880] And, you know, she has to leave immediately afterwards.
[00:54:22.880 --> 00:54:24.800] She doesn't get to go to the after-party and have fun.
[00:54:24.800 --> 00:54:27.840] She's with an armed guard because somebody will try to kill her.
[00:54:27.840 --> 00:54:32.360] And then she, you know, got married, had kids, and now she's afraid they'll kill her kids.
[00:54:29.840 --> 00:54:37.320] And this led to this, you know, understandable depression, anxiety.
[00:54:38.040 --> 00:54:41.720] And, you know, then it was Christianity that saved her.
[00:54:42.040 --> 00:54:46.440] And when she tells me the story, it's like, okay, I don't want to talk you out of anything.
[00:54:46.440 --> 00:54:50.040] You know, I'm an atheist, but you know, life is hard.
[00:54:50.040 --> 00:54:54.840] I can't imagine what it's like to live under a death threat like that.
[00:54:54.840 --> 00:54:58.120] You know, it's in a way of saying, you know, religion is good when it does good.
[00:54:58.120 --> 00:54:59.320] It's bad when it does bad.
[00:54:59.320 --> 00:55:01.400] It's a lot of different things.
[00:55:01.400 --> 00:55:04.840] And, you know, new religious movements are on the spectrum.
[00:55:04.840 --> 00:55:06.520] They're just smaller, whatever.
[00:55:06.520 --> 00:55:09.320] It just depends on what actually happens.
[00:55:09.640 --> 00:55:12.760] Yeah, I use that rubric in meditation.
[00:55:12.760 --> 00:55:13.960] And I've heard it used.
[00:55:14.040 --> 00:55:15.880] I do a lot of meditating.
[00:55:15.880 --> 00:55:20.440] And I find that, I mean, you can have a range of experiences.
[00:55:20.440 --> 00:55:26.840] These don't necessarily determine the quality or whether it's right, whether it's right for the person.
[00:55:26.840 --> 00:55:32.120] It can actually be, you know, anything can be abused, I guess, is what I've discovered in life.
[00:55:32.120 --> 00:55:41.800] Even something that seems to be so positive and maybe society seems to be so optimistic about meditation or cold plunges or deep breathing or whatever it is.
[00:55:41.800 --> 00:55:44.200] But for me, meditation's been a profound thing.
[00:55:44.200 --> 00:55:51.640] But I think you do have to constantly check, you know, reality check.
[00:55:51.640 --> 00:55:53.160] How is this affecting my life?
[00:55:53.160 --> 00:55:55.160] Are my relationships better?
[00:55:55.160 --> 00:55:58.440] Am I more compassionate generally?
[00:55:58.440 --> 00:56:09.000] Is it shutting down certain of my capacities? And is it actually helping me in my life?
[00:56:09.000 --> 00:56:20.000] And I think even that can sometimes be misleading, because initially people can have these amazing experiences, but you have to keep doing that check over time.
[00:56:20.640 --> 00:56:23.840] Yeah, I had Steve Hassan on the show.
[00:56:23.840 --> 00:56:28.800] He wrote a book, The Cult of Trump, or whatever it was called.
[00:56:28.800 --> 00:56:31.600] And, you know, I like Steve and his work.
[00:56:31.600 --> 00:56:32.480] I like him.
[00:56:32.480 --> 00:56:33.680] He's got a great story.
[00:56:33.680 --> 00:56:36.800] His BITE model explains some things, I think.
[00:56:36.800 --> 00:56:39.840] But 80 million Americans are in a cult?
[00:56:39.840 --> 00:56:40.960] I mean, really?
[00:56:40.960 --> 00:56:42.080] I mean, come on.
[00:56:42.080 --> 00:56:45.440] You know, this is using the term, I think, too broadly.
[00:56:45.760 --> 00:56:46.480] I agree.
[00:56:46.480 --> 00:56:52.160] I mean, well, that's why if people want to pursue that question, I refer them to Steve's book.
[00:56:52.160 --> 00:56:55.360] But I personally think it's, what do you even say?
[00:56:55.360 --> 00:57:03.920] I've heard other people say, okay, 60 million, 80 million, no, everyone who voted for Donald Trump is somehow in a cult or brainwashed.
[00:57:03.920 --> 00:57:07.440] And then others will say, well, therefore, they all need to be deprogrammed.
[00:57:07.440 --> 00:57:10.480] I mean, this leads us down a hideous path.
[00:57:10.480 --> 00:57:23.920] And that's why I really try to think about brainwashing in terms of the extent to which it's a useful mode of inquiry or window.
[00:57:23.920 --> 00:57:29.360] It's to look at oneself and to see that we're all connected.
[00:57:29.360 --> 00:57:37.200] These things don't, you know, they don't happen as like an isolated phenomenon that somehow we just lost half the population or something.
[00:57:37.200 --> 00:57:39.040] But we just, we have nothing to do with that.
[00:57:39.040 --> 00:57:42.080] It's a dynamic that our society is creating.
[00:57:42.080 --> 00:57:47.920] And this is actually part of it to imagine that it's just infected over there or something.
[00:57:47.920 --> 00:57:48.480] Yeah.
[00:57:48.800 --> 00:58:00.840] Let's talk about the Milgram experiments in the larger context of explaining how you get normal civilized Germans to become goose-stepping, you know, exterminating Nazis.
[00:57:59.120 --> 00:58:03.080] You know, I've been long fascinated by this.
[00:58:03.320 --> 00:58:18.840] Your discussion, I thought, was very insightful that the real target here is the people delivering the shocks, and the real response is how uncomfortable we all are watching this unfold and kind of giggling or laughing or kind of squirming.
[00:58:19.480 --> 00:58:22.120] You know, the shocker, the person doing the shocks is squirming.
[00:58:22.440 --> 00:58:24.760] It looks like they don't really want to do it.
[00:58:25.000 --> 00:58:27.480] And we're uncomfortable watching it.
[00:58:27.480 --> 00:58:29.560] What do you take from all that?
[00:58:30.120 --> 00:58:34.200] Yeah, I think these experiments are mysterious.
[00:58:34.200 --> 00:58:40.520] And that's why people are, they created a lot of discomfort in Stanley Milgram's life.
[00:58:40.520 --> 00:58:46.200] And I think they have something to do with the fact that he never got tenure and had to leave Yale.
[00:58:46.200 --> 00:58:48.760] Not that that's the worst thing that can happen to somebody.
[00:58:48.760 --> 00:58:52.360] And he did have a flourishing career at a city college, I think.
[00:58:52.360 --> 00:58:58.520] But he, yeah, but these experiments famously made people uncomfortable because it was hard to interpret them.
[00:58:58.520 --> 00:59:02.440] Also, I think that there are several, I mean, people were uncomfortable.
[00:59:02.440 --> 00:59:04.440] The people giving the shocks were uncomfortable.
[00:59:04.440 --> 00:59:09.560] And even if you mention that over and over again, it somehow disappears from the telling.
[00:59:09.560 --> 00:59:17.000] And when the final figures are given, oh, two-thirds, you know, administered lethal level shocks or thought they were.
[00:59:17.000 --> 00:59:31.240] Suddenly, those people's experience is erased, the discomfort or even agony they were feeling is erased, and suddenly they're real live Nazis, or the Eichmann thesis has been proved.
[00:59:31.720 --> 00:59:42.200] And the, you know, the deeper you go into how the actual experiments were run, you find that Milgram and his graduate students were taking bets on how far people would go too.
[00:59:42.200 --> 00:59:50.560] Like they kind of did treat it as a game sometimes that people would laugh while observing the experiments.
[00:59:50.880 --> 01:00:12.720] But also, I found over the years that students laugh in an uncomfortable way, and even in the one you conducted yourself with NBC Dateline in the guise of a reality TV show, you see a lot of pained laughter from the people who did comply, who were compliant and who gave shocks.
[01:00:12.720 --> 01:00:14.080] They want to be reassured.
[01:00:14.080 --> 01:00:15.040] They are laughing too.
[01:00:15.040 --> 01:00:19.440] They want to know, oh, this is in the realm of humor.
[01:00:19.440 --> 01:00:33.760] And I think what it is, is that it's hard for them to find a foothold from which to describe what just happened to them.
[01:00:33.760 --> 01:00:40.000] And they've been, in a sense, tortured, but it's hard to recognize.
[01:00:40.000 --> 01:00:41.760] It's hard for anyone to recognize.
[01:00:41.760 --> 01:00:50.160] And it only results mostly in the experimenter seeming to have the power of interpreting what's just happened.
[01:00:50.160 --> 01:01:01.360] Anyway, so I try to give a new lens to these experiments, which I do find so interesting just because you can keep looking at them and find more in them.
[01:01:01.360 --> 01:01:23.840] And there's actually a scholar who wrote me from the UK, I'm forgetting his name right now, but he's coming out with a book shortly, studying how Milgram himself described these priming conditions; he's doing a close study, almost like an ethnographic study, of what happened in the laboratory.
[01:01:24.160 --> 01:01:25.040] Oh, interesting.
[01:01:25.040 --> 01:01:28.640] He's treating it like the subjects were primed to do something.
[01:01:29.160 --> 01:01:33.960] Or is it the agentic explanation Milgram offered?
[01:01:29.760 --> 01:01:39.800] I think he's actually, yeah, I think he's actually talking about binding conditions or something like that.
[01:01:39.800 --> 01:01:41.080] I just started to read it.
[01:01:41.080 --> 01:01:42.600] It's fascinating.
[01:01:42.600 --> 01:01:47.000] But I think there is something to it.
[01:01:47.720 --> 01:01:54.680] To my knowledge, people didn't take Milgram's agentic shift that seriously.
[01:01:54.680 --> 01:01:58.360] But I think there's also something going on there.
[01:01:58.360 --> 01:02:01.400] Yeah, I mentioned working in that Skinnerian lab for two years.
[01:02:01.400 --> 01:02:10.440] That was under Douglas Navarick, who got his PhD under Fantino at UC San Diego, who in turn got his PhD under Herrnstein at Harvard.
[01:02:10.440 --> 01:02:13.000] You know, then there's the connection to Skinner.
[01:02:13.000 --> 01:02:17.560] All right, so Doug wrote several papers reinterpreting Milgram.
[01:02:17.560 --> 01:02:21.480] He treated it as an approach avoidance conflict.
[01:02:21.480 --> 01:02:32.440] And that, you know, you have reinforcement and then you have punishment and, you know, you kind of go back and forth, like the rat that gets rewarded for going onto the plate where the cheese is, but then gets a little shocked.
[01:02:32.440 --> 01:02:34.040] So he doesn't want to go, but he wants to go.
[01:02:34.040 --> 01:02:34.760] He doesn't want to go.
[01:02:34.760 --> 01:02:35.720] He wants to go.
[01:02:35.720 --> 01:02:36.920] That was his interpretation.
[01:02:36.920 --> 01:02:39.400] You know, you want to obey the authority.
[01:02:39.400 --> 01:02:44.920] You know, here's a famous scientist at Yale University, or in our case, you know, here's the NBC studios.
[01:02:44.920 --> 01:02:46.760] I want to be part of this reality show.
[01:02:46.760 --> 01:02:47.640] That would be cool.
[01:02:47.640 --> 01:02:49.960] But on the other hand, I have to do this uncomfortable thing.
[01:02:49.960 --> 01:02:50.680] So I want it.
[01:02:50.680 --> 01:02:51.560] I don't want it.
[01:02:52.200 --> 01:02:53.160] That kind of thing.
[01:02:53.160 --> 01:02:58.520] I think there's just many different ways to interpret it because it is somewhat mysterious.
[01:02:58.840 --> 01:03:05.560] And, you know, maybe there's just a variety of the way you know different people respond differently, so they have different motivations.
[01:03:05.560 --> 01:03:08.520] There's not a single theory that's going to explain it.
[01:03:08.520 --> 01:03:09.240] That's true.
[01:03:09.240 --> 01:03:13.400] And it's also quite amazing how generative it has been of so many theories.
[01:03:13.400 --> 01:03:20.800] So many people have things to say about it, from this is the root of all evil, to this proves that everyone is completely situational.
[01:03:21.120 --> 01:03:23.760] There's no evil, no one can ever be evil.
[01:03:23.760 --> 01:03:29.200] But I think what you were saying about approach avoidance is kind of interesting.
[01:03:29.760 --> 01:03:45.600] It brought to mind something like the Catch-22 situation or Gregory Bateson, you know, his schismogenesis, just like situations that can actually drive someone insane because you're being punished for the very thing you're being incentivized to seek.
[01:03:46.480 --> 01:03:51.520] I mean, when that's used in child rearing, it has terrible results.
[01:03:51.520 --> 01:03:55.840] So perhaps, you know, it sets up a very difficult situation.
[01:03:55.840 --> 01:04:00.800] And the whole story of how Milgram came to the experiment is also pretty interesting.
[01:04:00.800 --> 01:04:03.840] I don't think you wrote about Zimbardo in your book, did you?
[01:04:03.840 --> 01:04:04.320] No.
[01:04:04.960 --> 01:04:06.480] Because it's a similar situation.
[01:04:06.720 --> 01:04:12.080] You know, he and Milgram were colleagues and of the same age, essentially trying to explain the same thing.
[01:04:12.080 --> 01:04:13.120] They're both Jewish.
[01:04:13.120 --> 01:04:15.200] Like, how did this happen to our people?
[01:04:15.200 --> 01:04:18.560] You know, and we're scientists, let's explain it.
[01:04:18.560 --> 01:04:39.360] You know, toward the end of his life, he came under heavy criticism that he coached the randomly chosen students who played guards a little too much, and that they were therefore just doing what he told them to do, not adopting the role on their own.
[01:04:40.000 --> 01:04:49.600] I think that's an overly harsh criticism, but I'd like to get your thoughts on that, because in a way, he is studying how prisons operate.
[01:04:49.600 --> 01:04:54.960] And, you know, how would an undergraduate Stanford University student have any idea how a prison operates?
[01:04:54.960 --> 01:04:57.200] So you have to kind of give them some instructions.
[01:04:57.200 --> 01:04:58.640] This is what you're supposed to do.
[01:04:58.640 --> 01:05:01.160] And then they kind of take over, like the mirrored sunglasses.
[01:05:01.160 --> 01:05:12.680] He told me he got that idea from that film, Cool Hand Luke, where the prison guard wore those mirrored sunglasses, because it dehumanizes you, makes a disconnect between you and the victim, that kind of thing.
[01:05:12.680 --> 01:05:13.080] I don't know.
[01:05:13.640 --> 01:05:15.240] What are your thoughts on that?
[01:05:15.240 --> 01:05:17.240] Yeah, that's interesting.
[01:05:17.240 --> 01:05:19.960] I think it's another such a rich experiment.
[01:05:19.960 --> 01:05:24.200] That's why I love teaching it as well, just because there's so much there.
[01:05:24.200 --> 01:05:33.640] And the deeper you go, the more, because now the participants in the experiment, many of them have been interviewed and reflected on it over many decades.
[01:05:33.640 --> 01:05:50.840] So the guy who played John Wayne, who did come up with those mirrored sunglasses, in a sense showed an unusual amount of initiative in embodying and carrying out Zimbardo's instructions.
[01:05:51.160 --> 01:06:07.240] So one marvels: this guy went on to become something like an insurance adjuster, just, you know, some guy, later fairly successful in life, but he doesn't stand out as particularly anything, certainly not evil.
[01:06:08.440 --> 01:06:25.560] In the film, he seems to have a real taste for harming people, for really extracting obedience from these prisoners or even torturing them, in a sense.
[01:06:25.560 --> 01:06:31.320] But he doesn't seem to have much reflective capacity either.
[01:06:31.320 --> 01:06:33.400] He just is a little bit proud of what he did.
[01:06:33.400 --> 01:06:35.960] So it's been so interesting to see what's come out of it.
[01:06:35.960 --> 01:06:53.120] Some other people have said other things, but I think maybe the backlash against Philip Zimbardo has to do with the fact that people on some level felt uneasy, as if he didn't accept responsibility in some way for what he had caused to happen.
[01:06:53.440 --> 01:07:12.640] And maybe that's hard to even put into words, but there's almost an inchoate sense that he unleashed or set up this phenomena and then he didn't fully describe his own role and also perhaps his responsibility for stopping it earlier.
[01:07:12.880 --> 01:07:25.520] Maybe it's even a discomfort with scientific experimentation on human beings at that level, where with these undergraduates, who's to know what the lasting effects might have been.
[01:07:25.520 --> 01:07:35.840] So maybe the backlash is not perfectly logical, but there's a sort of emotional sense that Zimbardo must have done something.
[01:07:35.840 --> 01:07:38.480] Is there like an uneasiness with it?
[01:07:38.480 --> 01:07:38.960] Yeah.
[01:07:39.520 --> 01:07:45.440] Well, yeah, that one guy that gave an interview a couple years ago, he was interviewed decades ago.
[01:07:45.440 --> 01:07:48.080] And he was saying very different things then than he is now.
[01:07:48.080 --> 01:07:49.840] So I thought, I don't know.
[01:07:50.160 --> 01:07:55.600] Now it's, oh, I just went along with it because, you know, Zimbardo told us to, and it's like, that's not what you said 20 years ago.
[01:07:55.920 --> 01:07:56.480] Right.
[01:08:27.280 --> 01:08:28.080] So I don't know.
[01:08:28.080 --> 01:08:31.480] There's some post hoc rationalization going on there.
[01:08:32.040 --> 01:08:44.360] But to extrapolate it to a real-world situation, like Christopher Browning's book, Ordinary Men, also a great docudrama on Netflix, of the same title, Ordinary Men.
[01:08:44.360 --> 01:08:46.440] You know, how do you get these guys to do these things?
[01:08:46.440 --> 01:08:49.160] They're actually killing Jews up close.
[01:08:49.160 --> 01:08:49.720] Boom.
[01:08:49.720 --> 01:08:51.480] You know, and they're uncomfortable doing it.
[01:08:51.480 --> 01:08:53.160] They get shit-faced every night.
[01:08:53.160 --> 01:08:58.280] They have to move them to pits so that they're not quite so close and so on.
[01:08:58.280 --> 01:09:03.320] And you just wonder, is there some also self-deception there?
[01:09:03.320 --> 01:09:04.840] Like, I don't really want to do this.
[01:09:04.840 --> 01:09:10.600] But then you, well, they are Jews, and we all know they're subhuman, and they're the ones that started this war.
[01:09:10.600 --> 01:09:12.040] Remember, Hitler told us that?
[01:09:12.040 --> 01:09:19.800] All the propaganda throughout all the 30s, you know, the 1930s, there was a lot of stuff that happened that had nothing to do with exterminating Jews.
[01:09:19.800 --> 01:09:26.200] It was just a bunch of other stuff that kind of primed people to be able to do that at some point.
[01:09:26.200 --> 01:09:32.040] But then you don't want to let them off the hook, like, you know, Hannah Arendt's, you know, banality of evil.
[01:09:32.040 --> 01:09:36.360] I expected Eichmann to look like a monster and he looked like an ordinary bureaucrat.
[01:09:36.360 --> 01:09:39.480] But I thought she let him off the hook a little too much.
[01:09:39.480 --> 01:09:46.600] You know, he really was a rabid anti-Semite and worked mightily to exterminate as many Jews as he could.
[01:09:46.600 --> 01:09:52.200] So, you know, there's somewhere in there, you know, there's a complex explanation, I think.
[01:09:52.200 --> 01:09:53.560] What are your thoughts?
[01:09:53.880 --> 01:09:55.480] Yeah, I agree.
[01:09:55.480 --> 01:10:05.160] I mean, one of the profound experiences of living in Berlin for a couple years when I was at the Max Planck Institute, well, one year for each sabbatical.
[01:10:05.160 --> 01:10:18.080] And there was a museum exhibit, I think, the second, probably around 2013 at the Central National Museum in Berlin.
[01:10:18.240 --> 01:10:27.040] And it was just about the ordinary accoutrements of life under National Socialism.
[01:10:27.040 --> 01:10:42.240] So it was things like a cute little chess table that had, you know, the Nazi insignia swastikas or like potholders for the home with full, you know, anti-Semitic slogans.
[01:10:42.240 --> 01:10:52.320] It just showed just the sheer ordinariness and the way that day-to-day life was kind of woven in if you were, I guess, sufficiently adherent.
[01:10:52.560 --> 01:10:55.120] But you would just find these household items.
[01:10:55.520 --> 01:11:04.160] And that, you know, it wasn't the chilling, the usual chilling things like the photographs of emaciated people or gas chambers.
[01:11:04.160 --> 01:11:07.760] It was actually just like oven mitts with swastikas.
[01:11:07.760 --> 01:11:10.640] It's so interesting to see that.
[01:11:10.640 --> 01:11:16.800] And it's good, it's a good reminder or something to really contemplate on the one hand.
[01:11:16.800 --> 01:11:22.720] And then, with the ordinariness of someone like Eichmann, can you extend that as Arendt did so successfully?
[01:11:22.720 --> 01:11:42.720] I mean, you can criticize her work, I suppose, and you probably can, but I think what was successful about it is that she, in a sense, used Eichmann to make a broader point; there's something about the way she describes him.
[01:11:45.680 --> 01:12:00.000] Maybe she underestimates his rabidness, but the way she describes how his conscience, you know, that passage in Eichmann in Jerusalem where she says his conscience worked for two hours and 53 minutes.
[01:12:00.520 --> 01:12:06.440] There was one time where he saved one Jewish gardener from being exterminated.
[01:12:06.440 --> 01:12:13.080] And then after that, he never did anything else because he knew the man and he took some action.
[01:12:13.080 --> 01:12:19.720] And so she said, basically, we saw his conscience operate for two hours and then it never was seen again.
[01:12:19.720 --> 01:12:28.520] But she's having like, she's because she's so talented, I guess, she's bringing forward this idea that has stuck with us.
[01:12:28.520 --> 01:12:35.400] And perhaps it doesn't fully describe Eichmann himself, but more her interest in him.
[01:12:35.400 --> 01:12:36.920] Like in that question.
[01:12:36.920 --> 01:12:37.560] Yeah.
[01:12:37.560 --> 01:12:38.440] Okay, fair enough.
[01:12:38.440 --> 01:12:39.880] That's that's a good point.
[01:12:39.880 --> 01:12:49.480] Yeah, you do wonder after the war, these people are all dead now, but how many of these Germans that did participate and then they just go about their lives?
[01:12:49.480 --> 01:12:50.760] Well, well, that was then.
[01:12:51.080 --> 01:12:51.880] Now is now.
[01:12:51.880 --> 01:12:52.440] It's different.
[01:12:52.680 --> 01:12:54.200] I wouldn't do that now.
[01:12:54.760 --> 01:12:55.560] Something like that.
[01:12:55.560 --> 01:13:05.480] Of course, you know, with war crimes as a thing and genocide as an actual term that was coined in 1944, now it's a thing.
[01:13:05.480 --> 01:13:11.960] So we got to put some people in the docket for this or else this could happen again and we don't want, you know, all right.
[01:13:11.960 --> 01:13:20.760] So, yeah, you get these trials like that guy, John Demjanjuk, who had worked for Ford Motor Company in Ohio for 40 years or whatever.
[01:13:20.760 --> 01:13:23.000] And then, okay, we're going to bring this guy down.
[01:13:23.000 --> 01:13:25.560] It's like, yeah, but that was a long time ago.
[01:13:25.560 --> 01:13:28.040] I mean, that's not the guy he is now.
[01:13:28.360 --> 01:13:31.160] But we have a hard time thinking of it like that.
[01:13:31.160 --> 01:13:45.520] Yeah, I think we do have trouble with history, with, you know, the length of a human life, how we are changing; we like to think that people are just one thing all the time, for their whole life, on some level.
[01:13:45.760 --> 01:13:57.760] Also, we have trouble with social bonds, you know, the extent to which we actually are generated by our social environments and our deep embeddedness in those things.
[01:13:57.760 --> 01:14:10.560] All those questions are just hard to reckon with, especially when there are practical stakes, or even just maybe the sense that there needs to be a reckoning.
[01:14:10.560 --> 01:14:16.080] There needs to be a public sense that we are vigilant against monsters.
[01:14:16.080 --> 01:14:16.480] Yeah.
[01:14:16.800 --> 01:14:18.240] And that it can't, you know.
[01:14:18.240 --> 01:14:18.720] So.
[01:14:19.040 --> 01:14:26.000] Well, the name of that film is the series, Netflix, The Devil Next Door, I think it's called.
[01:14:26.320 --> 01:14:35.120] It's like, God, I was living next to this guy for all these years, and he was a Nazi guard at this concentration camp, and he herded Jews into the gas chambers.
[01:14:35.120 --> 01:14:36.000] Wow.
[01:14:36.560 --> 01:14:38.800] But he was a quiet man.
[01:14:40.000 --> 01:14:41.680] Well, that's who he is now.
[01:14:41.680 --> 01:14:44.480] He was somebody else in this other situation.
[01:14:44.480 --> 01:14:45.920] Yeah, I mean, this goes back and forth.
[01:14:45.920 --> 01:14:47.200] You know, that was Phil's whole thing.
[01:14:47.200 --> 01:14:54.560] You know, it's at the Abu Ghraib Abu Ghraib soldier who was in charge of Abu Ghraib.
[01:14:54.560 --> 01:14:55.760] I forget his name now.
[01:14:56.320 --> 01:15:02.560] That Phil was on his defense team saying, well, he's not like that when he was in America.
[01:15:02.560 --> 01:15:03.680] He was a Boy Scout.
[01:15:03.680 --> 01:15:07.280] He was the, you know, the high school standout hero and so on.
[01:15:07.280 --> 01:15:09.440] And so he had all these decorations and stuff.
[01:15:09.440 --> 01:15:12.000] And he's put in this bad situation.
[01:15:12.000 --> 01:15:13.440] But then, where is free will?
[01:15:13.440 --> 01:15:15.520] Where is your volition?
[01:15:15.840 --> 01:15:17.280] Yeah, where is responsibility?
[01:15:17.280 --> 01:15:33.080] It's like a lot of people who do end up in cults and maybe do terrible things, like get involved in or at least see and sanction and somehow stand by when you know the abuse of children is happening.
[01:15:33.400 --> 01:15:53.800] Maybe they wouldn't have entered that cult if they hadn't walked down a certain street on a certain day and waited at a certain bus stop and happened to have met these recruiters and happened to have, maybe they just, you know, all sorts of circumstances and happenstances may have resulted in their decision and it was an ill-informed decision.
[01:15:53.800 --> 01:15:55.880] Nonetheless, is there not a moment?
[01:15:55.880 --> 01:15:57.320] Is there not responsibility?
[01:15:57.320 --> 01:15:59.400] I think that's what's confusing.
[01:15:59.720 --> 01:16:00.760] It is confusing.
[01:16:00.760 --> 01:16:13.640] And maybe the legal context actually is illuminating in that way: when push really comes to shove, you know, how does responsibility function, how will that work?
[01:16:13.640 --> 01:16:19.080] Because ultimately, everyone has their reasons.
[01:16:19.080 --> 01:16:19.480] Yeah.
[01:16:19.800 --> 01:16:21.720] Well, that was the fascinating thing about the Manson trials.
[01:16:21.720 --> 01:16:32.360] Bugliosi had to get these women convicted for murder because they really did do it, but then also get Manson, who wasn't even there.
[01:16:32.360 --> 01:16:36.440] So he must have had supreme brain mind control over these girls.
[01:16:36.440 --> 01:16:38.600] But on the other hand, we don't want to let them off.
[01:16:38.600 --> 01:16:41.800] Like, oh, I was just a victim of Charlie's mind control.
[01:16:41.800 --> 01:16:47.640] You know, so he, you know, he concocted the story about the helter-skelter and the race war and all that stuff.
[01:16:47.640 --> 01:16:54.840] I think Tom O'Neill does debunk that; Helter Skelter probably was not the real motive of the Manson family.
[01:16:54.840 --> 01:16:59.880] I mean, but Bugliosi's goal was not to figure out what happened, the truth.
[01:16:59.880 --> 01:17:01.560] His goal is to win the case, right?
[01:17:01.560 --> 01:17:04.920] So, lawyers operate differently than scientists in that sense.
[01:17:05.320 --> 01:17:10.600] I actually am going to be talking to Tom O'Neill shortly about some of these questions.
[01:17:11.800 --> 01:17:39.200] And we've probably crossed paths a lot in the Louis Jolyon West archives, but I think one thing maybe that should be considered with the Manson family is the shift from intensive use of LSD into amphetamines, augmenting the LSD and creating something different as they progressed in their crimes.
[01:17:39.200 --> 01:17:43.520] So, and that actually I'm curious to hear.
[01:17:43.520 --> 01:17:53.360] There was an amphetamine research project going on in Haight-Ashbury where members of the Manson family, I think, were subjects.
[01:17:53.360 --> 01:17:56.880] Or I'm trying to find out if that was the case.
[01:17:56.880 --> 01:17:59.680] But yeah, there's some interesting connections there.
[01:17:59.680 --> 01:18:03.840] I don't know.
[01:18:04.160 --> 01:18:04.800] Yeah, I don't.
[01:18:04.800 --> 01:18:06.720] I think they wanted to make a story.
[01:18:06.720 --> 01:18:15.600] I think the family wanted to make a story that would somehow create more of an aura around their acts, perhaps.
[01:18:15.600 --> 01:18:16.880] Oh, I love the book, Chaos.
[01:18:16.880 --> 01:18:18.800] And I watched, I read the whole thing.
[01:18:18.800 --> 01:18:20.560] I watched him on Rogan for three hours.
[01:18:20.560 --> 01:18:22.000] I watched you on Rogan for a couple hours.
[01:18:22.080 --> 01:18:22.640] That was great.
[01:18:22.640 --> 01:18:24.080] This is all good stuff.
[01:18:24.400 --> 01:18:28.160] But at the end of the book, he basically says, I really have no smoking gun.
[01:18:28.160 --> 01:18:31.920] I can't prove the connection between Manson and the CIA, MK Ultra and all.
[01:18:31.920 --> 01:18:34.720] It's like, like, damn, darn.
[01:18:35.440 --> 01:18:39.120] Well, it's a tribute to him that he kind of ends up that way.
[01:18:39.120 --> 01:18:40.160] Yeah, oh, absolutely.
[01:18:40.160 --> 01:18:40.640] Totally.
[01:18:40.960 --> 01:18:45.520] You know, high integrity and honesty on his part as a good scholar.
[01:18:46.240 --> 01:18:52.800] Part of the problem is that Manson is such a larger-than-life target, in the same way as Lee Harvey Oswald.
[01:18:52.800 --> 01:19:04.760] I mean, you know, conspiracy theorists have picked through every single person Lee Harvey Oswald ever met, any group he had any connection to whatsoever, and you could somehow tie it back to the CIA.
[01:19:04.760 --> 01:19:15.080] You know, going back to Milgram's six degrees of separation, you can find links between anybody and anything, almost anywhere, if you look hard enough.
[01:19:16.040 --> 01:19:19.640] Well, we'll see what Tom O'Neill's next book comes up with.
[01:19:19.960 --> 01:19:21.640] Oh, is he still working on this?
[01:19:22.520 --> 01:19:23.640] I believe he is.
[01:19:23.960 --> 01:19:24.840] I'll find out more.
[01:19:24.840 --> 01:19:28.120] I'm looking forward to comparing notes.
[01:19:28.120 --> 01:19:30.120] Okay, last topic, because I know you got to run.
[01:19:30.840 --> 01:19:41.320] You know, I love that you brought up Norbert Elliot's book, The Civilizing Process, in terms of how norms shift gradually over time.
[01:19:41.640 --> 01:19:58.520] Elliot's database were these books of etiquette and how people had to be instructed, you know, how to blow your nose, how to eat politely at the dinner table and use your fork for this and use your knife for that.
[01:19:58.520 --> 01:20:03.320] And, you know, and when you're sleeping next to somebody, you know, don't do this or that.
[01:20:03.320 --> 01:20:04.200] I mean, it just goes on.
[01:20:04.360 --> 01:20:05.320] It's astonishing.
[01:20:05.320 --> 01:20:08.600] I mean, our ancestors were just disgusting.
[01:20:09.240 --> 01:20:12.360] I don't know if that's the conclusion we're supposed to draw.
[01:20:12.360 --> 01:20:13.480] Our ancestors.
[01:20:13.480 --> 01:20:15.480] But it is really, it is stunning.
[01:20:15.480 --> 01:20:20.280] Please, you know, refrain from blowing your nose in the tablecloth when we're eating.
[01:20:20.760 --> 01:20:24.040] And then looking at it, going, oh, look at these tables I have here.
[01:20:24.040 --> 01:20:24.440] Yeah.
[01:20:25.160 --> 01:20:34.440] But your larger point there is that these norms shift so slowly over a decadal or maybe even century-long timeframe, we don't even notice it.
[01:20:34.440 --> 01:20:37.880] And we just don't do the things that these people did in the Middle Ages.
[01:20:37.880 --> 01:20:40.040] And it never even occurs to us.
[01:20:40.040 --> 01:20:48.720] You don't have to implement self-control because, you know, I really want to blow my nose into the tablecloth at my neighbor's house, but I'm just not going to do it because I have self-control.
[01:20:48.720 --> 01:20:51.520] It never even enters my mind to do that.
[01:20:51.520 --> 01:21:03.840] Or we're not aware of that desire, because at first the etiquette manuals addressed the gentry, then a wider population, then just children.
[01:21:03.840 --> 01:21:21.120] And then after a while, it just seemed to be so inculcated that you didn't have to tell anyone not to, you know, relieve themselves in the corner of the room or to, you know, blow your nose into your hand while you're eating.
[01:21:21.120 --> 01:21:27.680] But what's interesting is that Elias published this in 1939, I believe.
[01:21:27.680 --> 01:21:40.880] So he's kind of in Germany as a Jewish sociologist watching the rise of, I mean, I think in a way he's asking these questions with the concepts of sociogenesis and psychogenesis, like what can be removed?
[01:21:40.880 --> 01:21:43.200] What is our, what is civilization?
[01:21:43.200 --> 01:21:56.800] And it also reminds me of Czeslaw Milosz, who, living in Warsaw during World War II, said, you know, I'm surrounded by people, basically these couples, families living their lives.
[01:21:56.800 --> 01:22:01.120] Everyone tries to remain normal for as long as they can.
[01:22:01.120 --> 01:22:03.200] Like you're putting away money in your 401k.
[01:22:03.200 --> 01:22:04.400] They don't have 401ks.
[01:22:04.400 --> 01:22:07.760] But essentially that idea, you keep doing the things you do.
[01:22:07.760 --> 01:22:08.960] You live the best you can.
[01:22:08.960 --> 01:22:13.520] And then he said he saw how quickly it decayed.
[01:22:13.520 --> 01:22:16.320] And soon people are copulating in the street.
[01:22:16.320 --> 01:22:19.040] They're dying, you know, next to me.
[01:22:20.000 --> 01:22:26.480] You go out for a walk, and all of a sudden, this veneer of reality shatters.
[01:22:26.480 --> 01:22:29.240] And I think this profoundly shaped him.
[01:22:29.240 --> 01:22:34.360] And probably that same thing was happening to Elias.
[01:22:29.680 --> 01:22:35.480] That's why I like his work.
[01:22:35.800 --> 01:22:37.880] Yeah, yeah, that's interesting.
[01:22:37.880 --> 01:22:42.360] I hadn't thought of it as a commentary on his own time, but that does make sense.
[01:22:42.360 --> 01:22:44.040] I was thinking of it in a larger context.
[01:22:44.040 --> 01:22:47.480] I had Jean Twenge on the show for her book, Generations.
[01:22:47.640 --> 01:22:50.280] Her book is about how generations change.
[01:22:50.280 --> 01:22:59.480] Like, I'm a baby boomer, and you know, the way I think about the world is very different from, say, the Gen Z and my kids and so on.
[01:22:59.480 --> 01:23:03.080] But you don't even notice it until somebody points it out, like Gene does with data.
[01:23:03.080 --> 01:23:06.280] Like, you know, here's the kind of songs that were popular in the 60s.
[01:23:06.280 --> 01:23:10.920] Here's what's popular now, or names, or just, you know, anything that's happening like that.
[01:23:10.920 --> 01:23:20.600] So one of her data points that stuck out to me was the age of first pregnancy for women in my generation, baby boomers, was 19.
[01:23:20.600 --> 01:23:22.520] And today it's 29.
[01:23:22.520 --> 01:23:23.880] It's 10 years later.
[01:23:23.880 --> 01:23:27.240] And instead of having three or four kids, you have one, maybe two.
[01:23:27.560 --> 01:23:28.120] Okay.
[01:23:28.120 --> 01:23:30.120] So I asked her, you know, how did this happen?
[01:23:30.120 --> 01:23:32.920] I mean, does anybody sit you down and go, okay, here's the deal.
[01:23:32.920 --> 01:23:34.920] You're not going to get pregnant till you're 29.
[01:23:34.920 --> 01:23:36.600] It's like, no, of course not.
[01:23:36.600 --> 01:23:40.360] It's just that nobody you know is doing this.
[01:23:40.360 --> 01:23:42.920] Everybody you know in high school is going to college.
[01:23:42.920 --> 01:23:45.640] And then everybody you know in college is going to go get a career.
[01:23:45.640 --> 01:23:48.440] And then they're not going to get married till they're 30s.
[01:23:48.440 --> 01:23:49.000] And so on.
[01:23:49.000 --> 01:23:50.520] This is just what everybody does.
[01:23:50.520 --> 01:23:52.520] You don't even think about it.
[01:23:53.160 --> 01:23:53.640] Yeah.
[01:23:53.880 --> 01:23:55.240] It's a powerful thing.
[01:23:55.240 --> 01:24:08.280] But we don't see it. I guess that's why I really like Jean Twenge's work, taking generations as a kind of analytical frame, because it is profound.
[01:24:08.280 --> 01:24:10.440] All that we don't see is so profound.
[01:24:10.440 --> 01:24:18.720] And I think that's what I'm trying to do with brainwashing: it's just kind of a reminder of how much we take for granted.
[01:24:19.040 --> 01:24:28.720] Well, your last two chapters on social media and AI and all that stuff are profound because we're in the middle of it now and we really have no idea what the effects are going to be.
[01:24:28.720 --> 01:24:34.400] Apparently, there are some really negative effects, as Jean Twenge and Jonathan Haidt and others have shown.
[01:24:34.720 --> 01:24:36.000] But this too may pass.
[01:24:36.000 --> 01:24:45.840] I mean, maybe, you know, social media, it's only what, 2007 that the smartphone is introduced, and then Facebook and Instagram, all this stuff is pretty new compared to, you know, books or whatever.
[01:24:45.840 --> 01:24:48.560] Newspapers go back centuries.
[01:24:49.520 --> 01:24:50.320] Yeah.
[01:24:50.960 --> 01:24:52.080] Very true.
[01:24:52.080 --> 01:24:53.200] Impossible to know.
[01:24:53.200 --> 01:24:56.480] I mean, I'm always interested in this: like, what's the future of moral progress?
[01:24:56.480 --> 01:25:07.920] You know, what are we doing now that our descendants will look back like we look back at slaveholders or the way people treated women or Jews or blacks and are appalled?
[01:25:07.920 --> 01:25:10.000] What are we doing that they'll be appalled by us?
[01:25:10.000 --> 01:25:13.040] Well, if I knew, I wouldn't do it, right?
[01:25:13.360 --> 01:25:14.960] But I don't know.
[01:25:15.440 --> 01:25:17.440] Animal rights, maybe we shouldn't be eating meat.
[01:25:18.160 --> 01:25:19.440] That is a good guess.
[01:25:19.440 --> 01:25:20.560] I think it's a good guess.
[01:25:20.560 --> 01:25:34.400] I think I'm convinced by those who say that when historians look back, the immense cruelty of animal slaughter through factory farming and industrialized agriculture will be the great, you know, unseen.
[01:25:34.560 --> 01:25:41.360] And there are even studies of just how these facilities are hidden from view and even from the workers themselves.
[01:25:41.360 --> 01:25:44.640] Anyway, I have a whole interest in that myself.
[01:25:44.640 --> 01:25:45.200] Oh, you do?
[01:25:45.200 --> 01:25:45.680] Okay, good.
[01:25:45.680 --> 01:25:46.000] Yeah.
[01:25:46.000 --> 01:25:46.400] All right.
[01:25:46.400 --> 01:25:49.200] Let's let you get on to your next project here.
[01:25:49.200 --> 01:25:50.560] Here it is: The Instability of Truth.
[01:25:50.640 --> 01:25:54.720] By the way, I love this cover because it's difficult to read, which is the point, right?
[01:25:54.720 --> 01:25:56.640] It's like, what am I reading here?
[01:25:56.640 --> 01:25:59.120] It's a little bit, exactly, unseen.
[01:25:59.360 --> 01:26:02.280] Brainwashing, mind control, and hyper-persuasion.
[01:26:02.280 --> 01:26:03.160] Yes, indeed.
[01:25:59.840 --> 01:26:04.600] Rebecca, thanks so much for your work.
[01:26:05.000 --> 01:26:06.280] What are you working on next?
[01:26:07.320 --> 01:26:12.520] Oh, my next year I'm working on a project on psychedelics and brainwashing.
[01:26:12.520 --> 01:26:20.200] So really thinking about the psychedelic renaissance, its positives and potential pitfalls.
[01:26:20.520 --> 01:26:21.080] Oh, my God.
[01:26:21.080 --> 01:26:21.480] That's great.
[01:26:21.480 --> 01:26:26.600] Yeah, because micro-dosing and psychedelic therapies are now kind of in vogue.
[01:26:26.600 --> 01:26:30.360] Questions about suggestibility and what might happen, you know.
[01:26:31.240 --> 01:26:32.760] I'm looking forward to it.
[01:26:32.760 --> 01:26:36.040] It's going to be in the form of a comic book.
[01:26:36.840 --> 01:26:37.400] Oh, well.
[01:26:37.800 --> 01:26:38.360] Oh, cool.
[01:26:38.360 --> 01:26:39.000] That's good.
[01:26:39.000 --> 01:26:41.080] Yeah, we didn't even get a chance to talk about hypnosis.
[01:26:41.080 --> 01:26:45.720] Another one of these super interesting, difficult-to-explain mysteries, but it's real.
[01:26:45.720 --> 01:26:46.280] Yeah.
[01:26:46.520 --> 01:26:47.160] Yeah, yeah.
[01:26:47.160 --> 01:26:48.600] It was great to talk to you, too.
[01:26:48.600 --> 01:26:48.840] All right.
[01:26:48.840 --> 01:26:49.480] Thanks, Rebecca.
[01:26:56.840 --> 01:26:59.160] Marketing is hard.
[01:26:59.160 --> 01:27:00.520] But I'll tell you a little secret.
[01:27:00.520 --> 01:27:01.320] It doesn't have to be.
[01:27:01.320 --> 01:27:02.520] Let me point something out.
[01:27:02.520 --> 01:27:04.920] You're listening to a podcast right now, and it's great.
[01:27:04.920 --> 01:27:05.720] You love the host.
[01:27:05.720 --> 01:27:07.000] You seek it out and download it.
[01:27:07.000 --> 01:27:10.840] You listen to it while driving, working out, cooking, even going to the bathroom.
[01:27:10.840 --> 01:27:13.640] Podcasts are a pretty close companion.
[01:27:13.640 --> 01:27:15.320] And this is a podcast ad.
[01:27:15.320 --> 01:27:16.760] Did I get your attention?
[01:27:16.760 --> 01:27:21.480] You can reach great listeners like yourself with podcast advertising from LibSyn Ads.
[01:27:21.480 --> 01:27:31.400] Choose from hundreds of top podcasts offering host endorsements or run a pre-produced ad like this one across thousands of shows to reach your target audience and their favorite podcasts with LibSynAds.
[01:27:31.400 --> 01:27:33.000] Go to libsynads.com.
[01:27:33.000 --> 01:27:37.160] That's L-I-B-S-Y-N ADS.com today.