Debug Information
Processing Details
- VTT File: mss531_Steven_Sloman_2025_07_15.vtt
- Processing Time: September 11, 2025 at 03:26 PM
- Total Chunks: 2
- Transcript Length: 106,439 characters
- Caption Count: 1,045 captions
Prompts Used
Prompt 1: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 1 of 2 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
[00:00:00.960 --> 00:00:03.520] You know what can really slow down your business?
[00:00:03.520 --> 00:00:07.920] Anything from sophisticated phishing attacks to plain old spotty internet.
[00:00:07.920 --> 00:00:17.600] Get Business Secure Internet and protect your network with built-in cybersecurity starting at just $60 a month with no contracts and Optimum Business's 60-day money-back guarantee.
[00:00:17.600 --> 00:00:20.640] Plus, the more services you bundle, the more you save.
[00:00:20.640 --> 00:00:22.880] Don't let your network slow down your business.
[00:00:22.880 --> 00:00:29.280] Get Optimum Business Secure Internet starting at just $60 a month at optimum.com/business.
[00:00:30.240 --> 00:00:32.320] Every business has an ambition.
[00:00:32.320 --> 00:00:41.760] PayPal Open is the platform designed to help you grow into yours with business loans so you can expand and access to hundreds of millions of PayPal customers worldwide.
[00:00:41.760 --> 00:00:50.160] And your customers can pay all the ways they want with PayPal, Venmo, Pay Later, and all major cards so you can focus on scaling up.
[00:00:50.160 --> 00:00:55.040] When it's time to get growing, there's one platform for all business: PayPal Open.
[00:00:55.040 --> 00:00:57.600] Grow today at paypalopen.com.
[00:00:57.600 --> 00:01:00.640] Loan subject to approval in available locations.
[00:01:03.840 --> 00:01:09.840] You're listening to The Michael Shermer Show.
[00:01:15.600 --> 00:01:17.120] All right, here we go.
[00:01:17.120 --> 00:01:17.760] Hi, everybody.
[00:01:17.760 --> 00:01:22.800] It's Michael Shermer, and it's time for another episode of, you know what, the Michael Shermer Show.
[00:01:22.800 --> 00:01:23.680] Here we go.
[00:01:23.680 --> 00:01:31.520] I'm going to open or introduce today's episode by reading you a thought experiment, a vignette, as it were.
[00:01:31.520 --> 00:01:34.080] Here is vignette number one.
[00:01:34.080 --> 00:01:36.720] You have $50 available.
[00:01:36.720 --> 00:01:45.920] A neighbor who just lost their job comes by and says they need $50 to buy a meal plan for their child, otherwise their child will not have lunch at school.
[00:01:45.920 --> 00:01:47.600] Can you give them the money?
[00:01:47.600 --> 00:01:49.360] You ask two friends for their advice.
[00:01:49.360 --> 00:01:55.760] Frankie thinks about it for quite a while, then advises you to go ahead and give the money to your neighbor.
[00:01:55.760 --> 00:02:00.000] Charlie hardly thinks at all, advising you to immediately give the neighbor the money.
[00:02:00.680 --> 00:02:05.480] On the basis of their advice, who would you say is more morally praiseworthy?
[00:02:05.480 --> 00:02:11.560] Frankie, the first one, who thought about it for quite a while, or Charlie, who just made a snap decision?
[00:02:11.560 --> 00:02:13.480] Okay, here's vignette number two.
[00:02:13.480 --> 00:02:15.320] Very similar, but not quite.
[00:02:16.120 --> 00:02:18.040] You have only $50 available.
[00:02:18.040 --> 00:02:25.240] A neighbor who just lost their job comes by and says they need $50 to buy a meal plan for their child, otherwise, the child will not have lunch at school.
[00:02:25.240 --> 00:02:26.840] Same scenario as the first one.
[00:02:26.840 --> 00:02:27.880] Can you give them the money?
[00:02:27.880 --> 00:02:29.240] But there's a problem.
[00:02:29.240 --> 00:02:37.320] You just got a text from an old schoolmate asking to borrow $50 to buy the medication they need to cure a serious illness.
[00:02:37.320 --> 00:02:39.400] You ask two friends for advice.
[00:02:39.400 --> 00:02:44.200] Frankie thinks about it for quite a while and then advises you to give your neighbor the money.
[00:02:44.200 --> 00:02:48.120] Charlie hardly thinks at all, advising you to immediately give the neighbor the money.
[00:02:48.120 --> 00:02:51.560] On the basis of their advice, who would you say is more morally praiseworthy?
[00:02:51.560 --> 00:02:54.440] Frankie, the first one, or Charlie?
[00:02:54.760 --> 00:03:01.720] Well, my guest today writes about this in his new book, The Cost of Conviction, which I'll give a proper introduction in just a moment.
[00:03:01.880 --> 00:03:09.320] And he says, most people think Charlie is more praiseworthy in vignette one, but Frankie is more praiseworthy in vignette two.
[00:03:09.320 --> 00:03:15.000] To find out why, we will discuss with him after I give him a proper introduction here.
[00:03:15.000 --> 00:03:18.920] He is Steven Sloman, who has taught at Brown University since 1992.
[00:03:18.920 --> 00:03:31.400] He's a fellow of the Cognitive Science Society, the Society of Experimental Psychologists, the American Psychological Society, the Eastern Psychological Association, and the Psychonomic Society.
[00:03:31.720 --> 00:03:39.240] He's the author of Causal Models and the co-author of The Knowledge Illusion with Phil Fernbach.
[00:03:39.240 --> 00:03:49.440] He's been editor-in-chief of the journal Cognition, chair of the Brown University faculty, and the creator of Brown University's concentration in behavioral decision sciences.
[00:03:49.760 --> 00:03:55.200] Here's the new book, The Cost of Conviction: How Our Deepest Values Lead Us Astray.
[00:03:55.200 --> 00:03:56.400] All right, Steve, nice to see you.
[00:03:56.400 --> 00:03:57.040] How are you doing?
[00:03:57.040 --> 00:04:02.480] And explain to us what in the world are we supposed to interpret from that thought experiment?
[00:04:02.800 --> 00:04:03.520] Well, great.
[00:04:03.520 --> 00:04:05.040] Thanks so much for having me.
[00:04:05.040 --> 00:04:06.960] And it's great to meet you.
[00:04:06.960 --> 00:04:10.720] And I got to say that this is the first time I've ever seen the book.
[00:04:10.720 --> 00:04:11.440] Oh, really?
[00:04:11.440 --> 00:04:11.840] Yeah.
[00:04:12.000 --> 00:04:14.320] Well, I got it, I don't know, maybe a week ago or so.
[00:04:14.320 --> 00:04:15.120] And yeah, it's great.
[00:04:15.120 --> 00:04:15.440] It's fun.
[00:04:15.440 --> 00:04:15.920] I read the whole book.
[00:04:16.000 --> 00:04:18.160] Well, they sent it to you before they sent it to me.
[00:04:18.160 --> 00:04:19.760] Okay, I read it this time.
[00:04:19.760 --> 00:04:24.400] I didn't get the audio file, so I actually went old school and read it cover to cover.
[00:04:24.400 --> 00:04:25.600] Well, why you didn't know that?
[00:04:25.600 --> 00:04:27.760] Even though I haven't seen the book, I have read it.
[00:04:27.760 --> 00:04:29.280] Okay, I hope so.
[00:04:31.680 --> 00:04:36.800] So I would like to say that I did not make up the examples you just read.
[00:04:36.800 --> 00:04:45.040] I mean, I made up the specific examples, but the idea comes from work by Phil Tetlock and his colleagues at the University of Pennsylvania.
[00:04:45.040 --> 00:04:48.320] So I want to give credit where credit is due.
[00:04:48.320 --> 00:04:48.960] Indeed.
[00:04:50.560 --> 00:04:52.880] So, well, what did you think?
[00:04:52.880 --> 00:04:55.840] Did you have an opinion about who?
[00:04:55.920 --> 00:05:02.080] Well, I thought the interpretation you made there was, you know, is the way I thought about it as well.
[00:05:02.400 --> 00:05:07.120] That is to say, a moral person should have no trouble choosing the sacred value in the first one.
[00:05:07.120 --> 00:05:10.080] Obviously, you should give the money to the neighbor.
[00:05:10.080 --> 00:05:15.600] But in the second vignette, you really should give that some thought because you have conflicting interests there.
[00:05:15.600 --> 00:05:20.320] $50 for the drug to help the friend, or $50 to the neighbor's kid.
[00:05:20.720 --> 00:05:21.920] What's the right one?
[00:05:21.920 --> 00:05:23.040] How do you calculate that?
[00:05:23.040 --> 00:05:25.360] That should not be a snap decision.
[00:05:25.360 --> 00:05:25.920] Right.
[00:05:26.240 --> 00:05:31.000] So the first one is an example of a taboo trade-off, right?
[00:05:31.000 --> 00:05:43.480] It's a trade-off you're not supposed to make because when you have a sacred value, like help friends in need, certainly if they're children, then you're supposed to follow that all the time.
[00:05:43.480 --> 00:05:56.040] And, you know, deciding whether or not to follow it versus, you know, having a good time, spending 50 bucks is supposed to be an easy choice because there's this absolute rule to follow.
[00:05:56.040 --> 00:06:03.800] Whereas in the second case, there's a competition between two different sacred values, right?
[00:06:03.800 --> 00:06:08.680] The helping a child in need versus helping a friend in need.
[00:06:09.560 --> 00:06:17.000] And that requires some thought, that requires some deliberation in order to negotiate that.
[00:06:17.000 --> 00:06:29.400] So some trade-offs, if you believe in sacred values, some trade-offs are permitted, namely trade-offs between sacred values, but some trade-offs are not.
[00:06:29.720 --> 00:06:35.560] Trade-offs between sacred values and material goods, essentially.
[00:06:35.560 --> 00:06:49.160] So I love the example, and I'm very happy that you chose that to read because it's very direct evidence, I think, that there's something psychologically real about sacred values, right?
[00:06:49.160 --> 00:06:50.920] That's really the point.
[00:06:50.920 --> 00:06:59.640] So the theme of the book is that when we make decisions, there are two strategies we use.
[00:06:59.960 --> 00:07:06.360] One is the one we think we use almost all the time, namely a consequentialist strategy, right?
[00:07:06.360 --> 00:07:12.680] We think about the outcomes of the various options and choose the one that leads to the best outcomes.
[00:07:12.680 --> 00:07:24.880] But in fact, for purposes of simplification, what we often do is trade, is make decisions based on the actions and our values about those actions.
[00:07:24.880 --> 00:07:27.840] And that's what I'm referring to by sacred values.
[00:07:27.840 --> 00:07:28.320] Yeah.
[00:07:28.320 --> 00:07:34.960] Can you extrapolate from that to ethical theories, utilitarianism versus deontological ethics?
[00:07:34.960 --> 00:07:35.600] Yeah.
[00:07:35.920 --> 00:07:54.640] So, look, the distinction certainly is a child of, is inherited from, the distinction between deontology and utilitarian ethics, which is like the prototypical example of a consequentialist ethical theory.
[00:07:56.240 --> 00:08:08.720] But the book is psychology, and it seems to me that people don't use the full sophistication of those kinds of ethical theories.
[00:08:09.040 --> 00:08:16.800] We're doing something simpler, certainly something different, which is why I don't use that language.
[00:08:17.440 --> 00:08:20.880] Do you think it's worth spending some time explaining what we mean by this?
[00:08:21.520 --> 00:08:24.720] Only because, yeah, there are clumsy words.
[00:08:25.520 --> 00:08:30.560] And I was just, I was trying to explain your book to one of my cycling buddies on the ride this morning.
[00:08:30.960 --> 00:08:35.520] Utilitarianism and deontological, I can see his eyes glazing over, like, oh, okay.
[00:08:37.120 --> 00:08:38.320] So your words, not mine.
[00:08:38.560 --> 00:08:40.160] I know, I know, I know, I know.
[00:08:40.480 --> 00:08:58.400] But, you know, if I just summarize the book, you know, given current events, you know, we have so much political divisiveness because both sides, the left and the right, are making decisions as if it's just pure sacred values on the line and the other side just doesn't see that, you know, our sacred values are better than theirs.
[00:08:58.400 --> 00:09:01.160] Whereas, in fact, most politics is just consequentialism.
[00:09:01.320 --> 00:09:09.720] I mean, we're just making compromises and trying to do the calculations about what's going to be the best for the most citizens of the country or whatever.
[00:09:09.720 --> 00:09:20.120] And that, you know, it used to be that this kind of consequentialism is how politicians, you know, wrangled with these issues and made some compromise, but they don't do that as much anymore.
[00:09:20.120 --> 00:09:22.120] That seemed to make more sense.
[00:09:22.120 --> 00:09:22.680] Yeah.
[00:09:23.000 --> 00:09:35.720] No, well, that is certainly the sub-theme of the book: that we use sacred values more than we should, and that that leads to many of the world's problems, like polarization and war.
[00:09:35.720 --> 00:09:37.240] Absolutely.
[00:09:38.520 --> 00:09:47.640] So, one of the distinctions that I appeal to in the book is also a very common distinction between normative theory and descriptive theory, right?
[00:09:47.640 --> 00:09:52.760] So, theories about what we should do versus theories about what we do do.
[00:09:52.760 --> 00:10:00.120] And you described consequentialism in descriptive terms, that that is what we used to do.
[00:10:00.120 --> 00:10:02.280] And I'm not sure that that's right.
[00:10:02.600 --> 00:10:14.920] You know, so at a normative level, I completely agree that the world would be better off if we focused more on consequentialist theory.
[00:10:15.160 --> 00:10:24.360] I don't think we should only focus on consequentialist theory because, you know, we only respect people who have some sacred values.
[00:10:24.680 --> 00:10:29.640] I wouldn't have someone as a friend if they didn't have some sacred values.
[00:10:31.000 --> 00:10:35.960] But descriptively speaking, I'm actually not so sure.
[00:10:36.760 --> 00:10:45.000] We're clearly at an extreme point today in thinking in terms of, and making decisions by, sacred values rather than consequences.
[00:10:45.760 --> 00:10:58.080] But there have been other extreme points through history, and I actually think that we've been biased, unfortunately, in favor of sacred values for a long time, if not forever.
[00:10:58.720 --> 00:11:00.320] Yeah, I think that's probably right.
[00:11:00.560 --> 00:11:12.880] I guess what I was getting at is, I mean, if you have an issue like what's the best upper tax bracket percentage, it used to be 90%, then 70%, 37% now federal in the United States.
[00:11:12.880 --> 00:11:13.760] What's the right number?
[00:11:13.760 --> 00:11:15.040] Well, there's no right number.
[00:11:15.040 --> 00:11:20.960] You know, we just argue and fight about it and vote, and whoever wins the election gets to try to nudge it one way or the other.
[00:11:21.280 --> 00:11:23.680] Or immigration, you know, how many people should we let in?
[00:11:23.680 --> 00:11:24.880] Well, what's the goals of the country?
[00:11:24.880 --> 00:11:25.440] And so on.
[00:11:25.440 --> 00:11:30.080] And you end up wrangling over these things, which often does come down to, well, what is it we're trying to do?
[00:11:30.080 --> 00:11:31.200] You know, fill jobs.
[00:11:31.200 --> 00:11:33.120] Do we need the tech sector filled?
[00:11:33.120 --> 00:11:37.600] Do we need the, you know, the farming community in California?
[00:11:37.600 --> 00:11:39.920] We need more of those people.
[00:11:40.160 --> 00:11:40.960] What is it we're doing?
[00:11:40.960 --> 00:11:41.680] And so on.
[00:11:42.560 --> 00:11:44.960] Those are mostly consequential decisions.
[00:11:44.960 --> 00:11:53.040] But then when you hear it on Fox News or maybe MSNBC on the other side, it's just, you know, these are evil people or these are good people.
[00:11:53.040 --> 00:11:56.640] And, you know, then the consequentialism kind of goes out the door.
[00:11:56.640 --> 00:11:57.120] Yeah.
[00:11:57.120 --> 00:11:58.960] No, that's exactly right.
[00:11:58.960 --> 00:12:05.040] I mean, what we should be talking about are these nitty-gritty, very difficult details, right?
[00:12:05.040 --> 00:12:10.000] Like, as you said, in most of these domains, we don't have a normative theory.
[00:12:10.000 --> 00:12:12.480] We have to sort of make it up as we go along.
[00:12:12.480 --> 00:12:19.680] So we have to argue about it and we have different interests and, you know, different intuitions.
[00:12:20.000 --> 00:12:32.440] But instead of dealing with that, what we often do, and certainly what our current administration only does, is appeal to sacred values and not really think about consequences at all.
[00:12:29.840 --> 00:12:32.760] Yeah.
[00:12:33.560 --> 00:12:39.160] You discussed the trolley problem, which is a longtime favorite of thought experiments here, and I've talked about it a lot.
[00:12:39.160 --> 00:12:42.920] Let me queue it up for you and then give you my thoughts and then you can run with it from there.
[00:12:42.920 --> 00:12:46.760] So, you know, the trolley's hurling down the track about to kill the five workers.
[00:12:46.760 --> 00:12:47.960] You're at the switch.
[00:12:48.120 --> 00:12:51.160] And if you throw the switch, it goes off the other track and kills one worker.
[00:12:51.160 --> 00:12:52.360] Do you throw the switch?
[00:12:52.360 --> 00:12:55.720] You know, millions of people that have taken this online, you know, go, yeah.
[00:12:55.720 --> 00:12:58.360] Most people go, oh, yeah, of course, that's the right thing to do.
[00:12:58.600 --> 00:13:09.400] Then the footbridge alternative to it is you're standing there on a bridge over the track, and you yourself are too small to stop the trolley by jumping down in front of it.
[00:13:09.640 --> 00:13:19.320] But the person standing next to you, this is often called the fat man, but I've since learned that this is now politically incorrect and it's just a big, husky guy with a backpack.
[00:13:20.280 --> 00:13:24.840] I call it the big man problem now in my effort to be politically appropriate.
[00:13:24.840 --> 00:13:25.480] You call it the what?
[00:13:25.480 --> 00:13:26.520] The pig man?
[00:13:26.520 --> 00:13:27.000] Big man.
[00:13:27.160 --> 00:13:28.200] Oh, big man, the big man.
[00:13:28.200 --> 00:13:28.520] Yes.
[00:13:28.520 --> 00:13:29.080] Okay.
[00:13:29.080 --> 00:13:30.360] He's not necessarily fat.
[00:13:30.360 --> 00:13:31.800] What's important is that he weighs a lot.
[00:13:31.960 --> 00:13:32.760] Yeah, he weighs a little bit.
[00:13:32.760 --> 00:13:33.080] That's right.
[00:13:33.080 --> 00:13:33.640] Yeah, okay.
[00:13:33.640 --> 00:13:34.760] So the big man.
[00:13:35.000 --> 00:13:39.160] You know, do you hip check him off the bridge and he stops the trolley and you save the five workers?
[00:13:39.160 --> 00:13:42.520] And, you know, most people go, oh, no, that just doesn't feel right.
[00:13:43.160 --> 00:13:46.120] You know, and so why is there that difference?
[00:13:43.160 --> 00:13:56.440] Well, you know, Joshua Greene's brain scans on this, you know, one activates the prefrontal cortex, I think it was, the rational calculation one, and the other one, you know, the sacred value.
[00:13:57.400 --> 00:14:04.200] Different parts of the brain, more, I guess, the amygdala and the limbic system dealing with deep emotions, something like that.
[00:14:04.520 --> 00:14:11.400] But in fact, normatively speaking, why wouldn't you want those choices made?
[00:14:11.400 --> 00:14:17.360] So the alternative is, you know, the doctor has five dying patients, and then there's a perfectly healthy person in the waiting room.
[00:14:14.600 --> 00:14:21.680] You could sacrifice that person, take the five organs, and then save the five patients.
[00:14:21.840 --> 00:14:25.440] And almost everybody goes, oh, no, that would, no, no, no, that would be wrong.
[00:14:25.440 --> 00:14:29.280] But in fact, what's the reason why it's wrong?
[00:14:29.280 --> 00:14:33.760] But my answer is, well, you don't want to live in a world in which you could be walking around at any given moment.
[00:14:33.760 --> 00:14:36.800] Somebody could just nab you and sacrifice you.
[00:14:36.800 --> 00:14:38.960] And so, but we don't do the calculation.
[00:14:38.960 --> 00:14:48.640] We just say in the Constitution, it says all people have the right to life, liberty, and pursuit of happiness and bodily autonomy and the right to your body.
[00:14:48.640 --> 00:14:50.880] So we can't enslave you and so on.
[00:14:50.880 --> 00:14:52.240] That's a sacred value.
[00:14:52.240 --> 00:15:00.400] But in fact, it's a kind of utilitarian calculus or consequential calculus in the sense that it would not be a good world.
[00:15:00.400 --> 00:15:05.920] It would be a society broken down in distrust if that was available.
[00:15:05.920 --> 00:15:06.400] Yeah.
[00:15:06.720 --> 00:15:10.640] Well, look, you could make that argument about all sacred values, right?
[00:15:10.640 --> 00:15:14.880] They might all have consequentialist underpinnings.
[00:15:15.200 --> 00:15:19.120] They may have evolved for good consequentialist reasons.
[00:15:19.120 --> 00:15:27.760] And there are some very nice papers that make that argument about various things that we've attributed to sacred values.
[00:15:28.000 --> 00:15:31.120] And I don't disagree with those analyses.
[00:15:31.760 --> 00:15:33.360] Two points about that.
[00:15:33.360 --> 00:15:36.400] One is those analyses are.
[00:15:36.400 --> 00:15:39.760] So my book is about how we make decisions, right?
[00:15:39.760 --> 00:15:41.360] So here we are, a human being.
[00:15:41.360 --> 00:15:42.880] We're faced with the decision.
[00:15:42.880 --> 00:15:45.040] What strategy should we use?
[00:15:45.360 --> 00:15:50.800] And I claim, like many people, that there are two strategies available to us.
[00:15:50.800 --> 00:15:56.560] And by the way, people confuse them all the time, but nevertheless, there are two strategies available to us.
[00:15:56.560 --> 00:16:02.840] We can either base it on an analysis of action, which is what sacred values are, right?
[00:16:02.840 --> 00:16:04.360] They're in the space of actions.
[00:16:04.360 --> 00:16:07.560] Is this action appropriate or is it not appropriate?
[00:16:07.560 --> 00:16:10.440] Or we can think about the space of outcomes.
[00:16:10.440 --> 00:16:13.800] What will be the consequence of doing this versus doing that?
[00:16:14.120 --> 00:16:24.200] And even if it's the case that there's a consequentialist precursor to my sacred value, that doesn't mean I'm not deciding by sacred values, right?
[00:16:24.520 --> 00:16:27.080] I mean, so that's point one.
[00:16:27.080 --> 00:16:38.040] Point two is you can actually turn that argument on its head and point out that every consequentialist analysis is based on a sacred value, right?
[00:16:38.040 --> 00:16:48.360] Namely, the sacred value that we should maximize utility, say, or that we should do whatever we're doing to maximize the outcomes that we're appealing to.
[00:16:48.360 --> 00:17:04.120] So there is a cycle of reductions you can do, but I actually think that that's sort of independent of the question: how do people make decisions and how should people make decisions?
[00:17:04.440 --> 00:17:05.080] Yeah.
[00:17:05.640 --> 00:17:10.440] Another thought experiment: Jonathan Haidt's famous one, Mark and Julie are on vacation, brother and sister.
[00:17:10.440 --> 00:17:13.960] They decide it would be fun and interesting to have sex.
[00:17:14.280 --> 00:17:15.560] And so they do.
[00:17:15.560 --> 00:17:17.240] They use two forms of birth control.
[00:17:17.240 --> 00:17:18.360] They don't tell anybody about it.
[00:17:18.360 --> 00:17:19.480] It makes them closer.
[00:17:19.480 --> 00:17:20.120] They enjoyed it.
[00:17:20.120 --> 00:17:21.560] They decide never to do it again.
[00:17:21.560 --> 00:17:22.680] Was that wrong?
[00:17:22.680 --> 00:17:25.320] As you know, almost everybody goes, yeah, no.
[00:17:25.640 --> 00:17:26.680] But why is it wrong?
[00:17:26.680 --> 00:17:27.720] Well, she might get pregnant.
[00:17:27.720 --> 00:17:29.560] No, no, they use two forms of birth control.
[00:17:29.560 --> 00:17:32.840] Well, but people will find out and they'll disown them.
[00:17:32.840 --> 00:17:34.600] No, no, they kept it a secret.
[00:17:34.600 --> 00:17:36.520] It'll come between them and destroy the relationship.
[00:17:36.520 --> 00:17:37.640] No, it brought them closer.
[00:17:37.640 --> 00:17:38.280] I don't know.
[00:17:38.760 --> 00:17:40.360] It's just gross.
[00:17:40.680 --> 00:17:42.440] Is that an example?
[00:17:42.760 --> 00:17:43.480] Right.
[00:17:43.800 --> 00:17:44.440] Yeah.
[00:17:44.440 --> 00:17:50.000] So Haidt calls that an example of moral dumbfounding, right?
[00:17:50.000 --> 00:18:02.560] And his point is that we make decisions, at least decisions of that type, without having a reason, that the reason comes after the decision rather than before the decision.
[00:18:02.560 --> 00:18:12.640] And I actually think there's a fair amount of converging evidence that for single decisions, that that's often the case.
[00:18:12.640 --> 00:18:15.280] You know, sometimes we deliberate carefully.
[00:18:15.680 --> 00:18:21.280] If we're buying a house, then we tend to deliberate before we make the decision.
[00:18:21.280 --> 00:18:31.360] But certainly in scenario type examples, and lots of decisions, lots of consumer decisions, for instance, that we make.
[00:18:31.840 --> 00:18:47.280] We make the decision and then we justify it either because our mother needs to know why or our spouse needs to know why our kid needs to know why, or we ourselves, right, want a reason to justify the decision to ourselves.
[00:18:47.920 --> 00:18:54.640] And so I think Haidt's right that often the decision comes before the justification.
[00:18:54.640 --> 00:18:57.600] We make it on some other basis, right?
[00:18:57.600 --> 00:19:05.760] Like a sacred value that we might not have conscious access to or some emotion, some affect.
[00:19:05.760 --> 00:19:09.040] Our body basically tells us what to do.
[00:19:10.000 --> 00:19:19.040] I will say that you can run into some of the same conceptual problems that you brought up for the trolley problem, since you seem to like conceptual problems.
[00:19:19.680 --> 00:19:25.600] That is, you might ask, well, why do I make the decision?
[00:19:25.600 --> 00:19:30.000] You know, why do I follow the incest taboo in the example that you just gave?
[00:19:31.320 --> 00:19:42.360] And presumably, there was some reasoning that occurred beforehand that led me to be reactive to the incest taboo.
[00:19:43.240 --> 00:19:52.200] Or, you know, it's something I discussed or something that I evolved with by virtue of bad consequences that happened in the past.
[00:19:52.520 --> 00:19:56.040] And my response would be exactly the same if you say that.
[00:19:56.040 --> 00:20:05.160] That, yeah, that's true, that there is reasoning that's part of this cycle by which society makes decisions over time.
[00:20:05.160 --> 00:20:17.000] But nevertheless, when we as individuals or as small groups are faced with particular decisions, I think the moral dumbfounding phenomenon holds.
[00:20:17.320 --> 00:20:19.160] And sorry, go ahead.
[00:20:19.160 --> 00:20:22.440] Well, I was just going to say I had Kurt Gray on the podcast a few months ago.
[00:20:22.440 --> 00:20:24.440] His new book is Outrage.
[00:20:24.440 --> 00:20:31.320] And he's critical of moral foundations theory and Haidt and those thought experiments.
[00:20:31.640 --> 00:20:33.480] So I don't know if you've read it, but give me your thoughts.
[00:20:33.720 --> 00:20:34.440] No, I haven't actually.
[00:20:34.840 --> 00:20:35.160] I should.
[00:20:35.160 --> 00:20:36.040] I'm going to write it down.
[00:20:36.040 --> 00:20:36.920] Well, it's interesting.
[00:20:36.920 --> 00:20:41.400] No, because he kind of goes along the lines of what I was just saying.
[00:20:41.400 --> 00:20:45.560] At the bottom of sacred values is a kind of consequentialism.
[00:20:45.560 --> 00:20:48.600] That is, there's a reason why we have the incest taboo.
[00:20:48.600 --> 00:20:57.400] Because in our ancestry, in the environment of our evolutionary ancestry, the EEA, as the evolutionary psychologists call it, it would have been harmful.
[00:20:57.560 --> 00:20:59.000] It's adaptation, I think.
[00:20:59.000 --> 00:21:00.200] I think A stands for adaptation.
[00:21:00.520 --> 00:21:00.840] That's right.
[00:21:00.840 --> 00:21:01.560] Yes, yes, correct.
[00:21:01.560 --> 00:21:01.800] Correct.
[00:21:01.800 --> 00:21:02.280] Yes, yes.
[00:21:02.600 --> 00:21:07.240] It would have been consequentially negative for siblings to have sex.
[00:21:07.560 --> 00:21:11.640] And of course, no one's doing genetic testing 100,000 years ago.
[00:21:11.640 --> 00:21:20.800] So the proxy is anyone you grew up with and spent a lot of time with, you're then going to be disgusted, as adults, by the thought of having intimacy with them.
[00:21:21.040 --> 00:21:23.600] So it is at heart, at the bottom.
[00:21:23.600 --> 00:21:26.320] Maybe they're just talking about different levels of causality.
[00:21:26.320 --> 00:21:26.800] I don't know.
[00:21:27.120 --> 00:21:28.240] What are your thoughts on that?
[00:21:28.480 --> 00:21:32.720] Yeah, look, it's the same story as I've already told.
[00:21:32.960 --> 00:21:36.240] I think that, you know, I don't disagree with him.
[00:21:36.480 --> 00:21:53.040] I think that there probably is that kind of consequentialist dynamic that led to the evolution of these values that Haidt calls moral foundations and that I'm calling sacred values.
[00:21:53.920 --> 00:21:56.320] But that's not really the issue.
[00:21:56.320 --> 00:22:01.600] The issue is how do we make decisions at a particular point in time?
[00:22:03.360 --> 00:22:14.000] Look, you know, you can trace, it's sort of a turtles all the way down kind of issue, right?
[00:22:14.000 --> 00:22:19.680] That is, there's some reason that we do the thing that we're doing.
[00:22:19.680 --> 00:22:24.480] And it may be that there's a different reason that we have that habit.
[00:22:24.480 --> 00:22:28.160] My book is about decision-making in general.
[00:22:28.160 --> 00:22:35.920] And the through line in the book is that the world is incredibly complex, right?
[00:22:35.920 --> 00:22:41.120] This actually comes directly from my previous book with Phil Fernbach, The Knowledge Illusion.
[00:22:41.120 --> 00:22:43.840] The world is incredibly complex.
[00:22:43.840 --> 00:22:47.760] And so we have to simplify it in order to make decisions.
[00:22:47.760 --> 00:22:50.240] And we simplify it in many ways.
[00:22:50.240 --> 00:22:54.000] So, you know, we use a variety of heuristics.
[00:22:54.000 --> 00:22:57.600] Kahneman and Tversky made those heuristics very famous, right?
[00:22:58.480 --> 00:23:08.920] They talked about heuristics for probability judgment, and they talked about frames that we use to understand decisions that we're making.
[00:23:09.240 --> 00:23:19.480] Other people have talked about how we take the complexity of problems and simplify them either by ignoring things or by outsourcing things, or right?
[00:23:19.480 --> 00:23:24.920] Like making a decision is a process of trying to simplify.
[00:23:25.240 --> 00:23:27.640] And that's what sacred values are, right?
[00:23:27.640 --> 00:23:30.440] They're a way of simplifying.
[00:23:30.440 --> 00:23:41.800] So they ignore a bunch of stuff and they let us just use this very simple absolutist rule that means we hardly have to think at all.
[00:23:41.800 --> 00:23:49.960] In fact, often we can respond directly from emotion, which was kind of Josh Greene's point about the trolley problem.
[00:23:51.480 --> 00:24:24.120] But I do think that the message to take home is that the consequences of decisions, even if they have long-term effects, don't necessarily guide what we're doing in the moment.
[00:24:24.440 --> 00:24:31.160] I was thinking about emotions there in the context of guiding decision-making and what people find attractive.
[00:24:31.160 --> 00:24:44.800] So my example of this is from the evolutionary psychologists, the research showing that people like symmetrical faces, clear complexion, men with broad shoulders and a narrow waist, women with a kind of hourglass.
[00:25:14.000 --> 00:25:18.720] That's a figure with a 0.67 waist-to-hip ratio.
[00:25:19.040 --> 00:25:19.760] Is that right?
[00:25:19.760 --> 00:25:20.160] Yeah.
[00:25:20.160 --> 00:25:23.840] So, but no one walks around with calipers, you know, making measurements, right?
[00:25:25.040 --> 00:25:28.000] In other words, natural selection has done all the calculations for us.
[00:25:28.000 --> 00:25:30.160] These are proxies for health and so on.
[00:25:30.160 --> 00:25:33.120] And you just look at something, you go, well, I like what I look at.
[00:25:33.120 --> 00:25:35.280] That feels good, you know, whatever.
[00:25:35.280 --> 00:25:41.520] So, the emotions are proxies for these ancient decisions that were already made.
[00:25:41.520 --> 00:25:43.840] Yeah, no, that's a perfect example.
[00:25:43.840 --> 00:25:45.040] That's a great example.
[00:25:45.040 --> 00:25:45.600] Yeah.
[00:25:45.600 --> 00:26:14.080] And the fact that, you know, ants are capable of making sophisticated decisions in the sense that the outcomes of their decisions can be, you know, incredibly sophisticated ant colonies, can be these, you know, architectural triumphs, suggests that it doesn't take a lot of sophistication to make a set of decisions that lead to a sophisticated outcome.
[00:26:14.080 --> 00:26:23.280] You can do things in a very simple way, as long as you've had a sufficiently long evolutionary history to give you the tools to do that.
[00:26:23.280 --> 00:26:28.240] So, the two biggest obstacles you write about in the book are the knowledge illusion and the control illusion.
[00:26:28.240 --> 00:26:33.080] Maybe give some examples of that and how that clouds our reasoning.
[00:26:29.840 --> 00:26:36.920] The knowledge illusion and the illusion of control.
[00:26:37.240 --> 00:26:42.600] So, the knowledge illusion is the fact that we think we know more than we do, right?
[00:26:42.920 --> 00:26:46.680] And there's a bunch of evidence for that.
[00:26:46.920 --> 00:26:51.800] You know, one of my favorite examples is you ask people to draw a bicycle.
[00:26:51.800 --> 00:26:53.880] And you say, Can you draw a bicycle?
[00:26:53.880 --> 00:26:56.280] And most people think, well, of course I can draw a bicycle.
[00:26:56.920 --> 00:26:57.880] I own a bicycle.
[00:26:57.880 --> 00:27:00.200] I've seen bicycles fixed.
[00:27:00.200 --> 00:27:07.880] And then when you sit down and actually try to draw it, with some exceptions, people make all sorts of errors.
[00:27:08.840 --> 00:27:16.280] They put the chain, you know, between the two wheels or up above at the top of the frame.
[00:27:16.280 --> 00:27:20.600] They do just crazy things that would not allow the bike to work.
[00:27:20.840 --> 00:27:23.640] And there are a million examples of this type.
[00:27:23.640 --> 00:27:28.600] So that suggests that we don't understand as well as we think we do.
[00:27:28.600 --> 00:27:35.880] And what we argue in the book, The Knowledge Illusion, is that the reason for that is because we rely on other people for our thinking.
[00:27:35.880 --> 00:27:45.080] We live in a community of knowledge, and we don't have to know how to draw a bicycle because, you know, there are bike mechanics who can do it for us.
[00:27:45.560 --> 00:27:47.960] And in fact, there are bicycles themselves.
[00:27:47.960 --> 00:27:52.120] Like when our bike is broken, we don't actually have to fix it from scratch.
[00:27:52.120 --> 00:27:54.760] We can rather look at the bike and see what's broken.
[00:27:54.760 --> 00:28:03.640] And the physical thing provides a frame that makes it easy often to see what the problem is.
[00:28:03.640 --> 00:28:09.400] So we use the physical world as well as other people in order to help us think.
[00:28:09.720 --> 00:28:22.000] So that's part of the argument that the world is actually very complex and we only know a small part of it and therefore we have to simplify, right?
[00:28:24.240 --> 00:28:34.480] In fact, the origin of the current book, The Cost of Conviction, is that the previous book discussed this phenomenon.
[00:28:34.480 --> 00:28:50.880] The phenomenon, which is called the illusion of explanatory depth, is that you ask people how a simple thing works, like a ballpoint pen, or how well they understand how it works, and they'll give you a rating, you know, indicating that they think they understand pretty well.
[00:28:50.880 --> 00:28:52.960] And then you say, okay, how's it work?
[00:28:52.960 --> 00:28:54.240] Explain it to me.
[00:28:54.560 --> 00:29:00.720] And they try and they fail and they discover they don't understand how it works at all.
[00:29:00.720 --> 00:29:12.960] And so they rate their own understanding lower after trying to explain, implying that the attempt to explain punctured their illusion of understanding.
[00:29:12.960 --> 00:29:15.760] And that's what most people say, right?
[00:29:16.400 --> 00:29:20.080] And we did the same thing with policies, with political policies.
[00:29:20.080 --> 00:29:25.760] We asked people how well they understand policies, and then we asked them to explain how the policies work.
[00:29:25.760 --> 00:29:30.320] And we got exactly the same result that people's sense of understanding went down.
[00:29:30.880 --> 00:29:34.960] But the other thing that happened is we changed their attitudes, right?
[00:29:34.960 --> 00:29:37.840] They became less confident in their position.
[00:29:37.840 --> 00:29:47.920] So we not only punctured their sense of understanding, we punctured their confidence in their belief for or against the policy.
[00:29:47.920 --> 00:29:51.040] We depolarized the group in that sense.
[00:29:51.360 --> 00:30:04.920] And it always struck me that what was going on in those cases, at least a good chunk of the time, is that people came into the situation thinking about policies from a sacred values perspective.
[00:30:05.560 --> 00:30:15.240] Like if you ask, you know, Republicans what they think of tariffs, well, you know, tariffs are part of their identity now, right?
[00:30:16.520 --> 00:30:20.920] And so they're not thinking, well, this will be the consequence.
[00:30:20.920 --> 00:30:25.720] Other countries will respond and prices will go up.
[00:30:26.440 --> 00:30:28.040] I mean, that's hard to think about.
[00:30:28.040 --> 00:30:29.080] I'm not an economist.
[00:30:29.160 --> 00:30:30.760] I don't know how to think about that.
[00:30:30.760 --> 00:30:39.400] But what I can do is say that my group, you know, thinks that tariffs are bad, or my group thinks that tariffs are good.
[00:30:39.400 --> 00:30:41.480] And therefore, I'm all in on tariffs.
[00:30:41.480 --> 00:30:44.040] They become a sacred value.
[00:30:44.040 --> 00:30:51.800] And then when we ask people to explain, what's happening is we're forcing them to think about the issue consequentially.
[00:30:51.800 --> 00:30:59.880] And that both reduces their sense of understanding, but also makes them realize there's another way to think about this.
[00:30:59.880 --> 00:31:02.920] And I actually don't have the goods anymore.
[00:31:03.800 --> 00:31:04.360] Yeah.
[00:31:04.360 --> 00:31:14.760] Oh, just another example of the knowledge illusion was Andrew Shtulman's research at Occidental College on what people know about scientific concepts.
[00:31:14.760 --> 00:31:19.560] So one experiment I liked that he did was asking his students, you know, do you accept the theory of evolution?
[00:31:19.560 --> 00:31:20.920] Oh, yeah, of course.
[00:31:20.920 --> 00:31:23.000] Explain how it works.
[00:31:23.000 --> 00:31:27.000] Oh, well, the giraffe stretches its neck and then the babies have longer necks.
[00:31:27.000 --> 00:31:28.600] They give like a Lamarckian explanation.
[00:31:29.080 --> 00:31:30.840] No, that's not it at all.
[00:31:31.160 --> 00:31:36.600] In other words, having more knowledge doesn't, you know, nudge you in the direction of believing it.
[00:31:36.600 --> 00:31:39.400] It's that, you know, the scientists, they usually get it right.
[00:31:39.400 --> 00:31:43.240] So if they say evolutionary theory is true, it probably is.
[00:31:43.240 --> 00:31:44.680] I'll just go along with it.
[00:31:44.800 --> 00:31:52.400] And so his interpretation is it's kind of a signal, a public signal, like, yeah, I guess I trust science for the most part.
[00:31:52.720 --> 00:31:53.920] Oh, that's interesting.
[00:31:53.920 --> 00:31:54.320] Yeah.
[00:31:55.120 --> 00:32:02.560] We have some interesting data on that subject that I collected with Phil Fernbach and his student, Nick Light.
[00:32:02.560 --> 00:32:05.200] Nick Light was actually the leader of the project.
[00:32:05.680 --> 00:32:27.760] But what we found was that for a bunch of issues on which there's a scientific consensus, like whether evolution took place, or whether there was a big bang that started the universe, or whether climate change is anthropogenic, right, there are a variety of issues on which the vast majority of scientists agree.
[00:32:29.840 --> 00:32:42.240] It turns out that the people who know the most, we gave people little general knowledge tests, and the people who know the most actually tend to be the ones who agree most with the scientific consensus.
[00:32:42.240 --> 00:32:44.080] So that's good, right?
[00:32:44.240 --> 00:32:46.560] That should make you feel a little more secure.
[00:32:46.880 --> 00:32:55.280] But the people who think they know the most oppose the scientific consensus the most, right?
[00:32:55.280 --> 00:33:04.400] So there's this divorce, this dissociation between whether people, what people actually know and how much they think they know.
[00:33:04.400 --> 00:33:12.160] And they relate in opposite ways to how much they agree with, you know, the scientific process and science.
[00:33:12.160 --> 00:33:25.840] Yeah, I think the whole climate change thing got hijacked when Al Gore's film became so hugely popular, Academy Award winner, and then he won, I don't know, an Emmy Award and I don't know, and he won a bunch of awards for this.
[00:33:25.840 --> 00:33:40.840] And then the whole issue became a democratic cause, you know, and so conservatives' brains auto-correct when they hear global warming, they hear anti-capitalism and anti-business and anti-America because it's Al Gore's cause.
[00:33:41.720 --> 00:33:42.920] Yeah, that's interesting.
[00:33:42.920 --> 00:33:52.920] You know, I heard a different analysis today that it all started when the Koch brothers got into business and did what was necessary to protect themselves.
[00:33:53.240 --> 00:33:54.920] That could also be part of it, yeah.
[00:33:54.920 --> 00:33:55.640] Yeah, exactly.
[00:33:55.640 --> 00:34:11.800] They're not mutually exclusive, but it's all evidence that when you have a belief that becomes part of your community's dynamic and discourse, then it becomes sacralized, right?
[00:34:11.800 --> 00:34:18.520] And once it becomes sacralized, then the opposition has to take the opposite view.
[00:34:18.520 --> 00:34:23.000] And that's the dynamic that leads to extreme polarization.
[00:34:23.000 --> 00:34:23.480] Yeah.
[00:34:24.360 --> 00:34:28.680] So to be a good consequentialist, you have to understand causality.
[00:34:28.680 --> 00:34:33.560] So you have a nice couple chapters in the middle of your book on what is causality?
[00:34:33.560 --> 00:34:34.680] How do we determine it?
[00:34:34.680 --> 00:34:40.280] How do people actually determine causality versus how scientists determine causality?
[00:34:40.280 --> 00:34:41.640] Because they're not the same thing.
[00:34:41.640 --> 00:34:43.720] So talk a little bit about that.
[00:34:44.680 --> 00:34:51.960] Well, you know, philosophers have talked about what the nature of causality is for a long time.
[00:34:51.960 --> 00:35:01.480] And I think these days there's some consensus that to say that A causes B is to imply a counterfactual, right?
[00:35:01.480 --> 00:35:08.760] So to say that A causes B implies that A occurred and B occurred.
[00:35:08.760 --> 00:35:14.760] And the counterfactual is if A had not occurred, then B would not have occurred either.
[00:35:16.240 --> 00:35:26.720] And the little twist on that that has become sort of de rigueur these days is what's called the interventional theory, right?
[00:35:26.720 --> 00:35:35.760] Judea Pearl is a computer scientist, statistician who's most famous for this, although he's probably not the origin of it.
[00:35:35.760 --> 00:35:38.400] There's a long history behind it.
[00:35:38.960 --> 00:35:54.880] But his suggestion is that to say that A causes B is to say that if I tweaked A, if I, as an agent external to the system, came in and intervened on A, then B would change, right?
[00:35:54.880 --> 00:36:06.960] And it's that intervention that would be the counterfactual change that describes this other possible world that would leave B in a different state.
[00:36:07.600 --> 00:36:31.280] And there is some evidence, and I think about Tania Lombrozo's work when I say this, that when we're thinking about physical causality, we think differently about it than when we think about intentional causality, like the causality between humans, right?
[00:36:31.280 --> 00:36:41.040] So if I say, yes, you're right, Michael, and you respond by feeling good, then I have caused you to feel good.
[00:36:41.040 --> 00:36:43.600] But that's not a physical effect.
[00:36:43.600 --> 00:36:46.240] That's a psychological effect.
[00:36:46.560 --> 00:36:57.040] And what the evidence seems to suggest is that this sort of counterfactual analysis describes psychological effects pretty well.
[00:36:57.040 --> 00:37:13.880] But when we're thinking about physical causality, then we tend to think more in terms of forces, in terms of processes by which A exerts some kind of change to B.
[00:37:14.200 --> 00:37:25.320] So it's sometimes referred to as a conserved quantity theory, because there's some quantity that A passes to B over space and time that changes B.
[00:37:25.640 --> 00:37:30.760] So it's really a different kind of mental model that we have in the two cases.
[00:37:31.080 --> 00:37:34.200] Is that what you were asking me about, or did I miss the boat completely?
[00:37:34.200 --> 00:37:35.640] No, no, no, that's a good start.
[00:37:35.960 --> 00:37:41.000] Thinking of Hume's definition of causality as constant conjunction.
[00:37:41.000 --> 00:37:43.400] A happens, B happens, A happens, B happens.
[00:37:43.480 --> 00:37:51.880] The human mind is designed to assume it, especially after three steps: you know, you hear that, and you go, huh, I wonder what that is.
[00:37:52.200 --> 00:37:54.200] I wonder if somebody's there.
[00:37:54.520 --> 00:37:56.600] Somebody's definitely there, right?
[00:37:56.600 --> 00:38:07.480] So, but then as he points out, you know, the rooster could think, well, you know, every time I crow early in the morning, the sun rises, therefore, I am the cause of the sunrise.
[00:38:07.480 --> 00:38:10.840] So you just silence the rooster and then the sun rises anyway.
[00:38:10.840 --> 00:38:20.440] So the counterfactual there, Pearl's intervention, is you change the A and see if B happens anyway.
[00:38:20.440 --> 00:38:28.040] So, you know, people smoke or they don't smoke, and you can't run these double-blind experiments, but you can run natural experiments.
[00:38:28.040 --> 00:38:31.640] You see what's already been done, and then after the fact, you make comparisons.
[00:38:31.640 --> 00:38:34.360] And so we do determine causality that way.
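A minimal sketch of that interventional test, using the rooster example just mentioned; the toy structural model below (dawn driving both the crow and the sunrise) is an illustrative assumption, not anything from the book or the episode:

```python
# Toy structural causal model for the rooster example: dawn drives both the
# crow and the sunrise, so the two always co-occur even though the crow
# causes nothing. (Illustrative assumption only, not from the book.)
def morning(do_crow=None):
    dawn = True                                  # dawn arrives every morning
    crow = dawn if do_crow is None else do_crow  # do(crow=...) overrides the mechanism
    sunrise = dawn                               # sunrise depends only on dawn
    return crow, sunrise

# Observation: constant conjunction, crow and sunrise always appear together.
print(morning())                 # (True, True)

# Intervention: silence the rooster with do(crow=False); the sun rises anyway,
# so crowing fails Pearl's interventional test for causing the sunrise.
print(morning(do_crow=False))    # (False, True)
```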
[00:38:34.360 --> 00:38:40.200] You know, and I like Pearl's, you know, multiple levels of causality as a useful tool.
[00:38:40.200 --> 00:38:47.440] Like my example of this was: I read this book on the assassination of Admiral Yamamoto.
[00:38:47.440 --> 00:38:52.000] He was the Japanese architect of the Pearl Harbor sneak attack.
[00:38:52.000 --> 00:38:58.400] And he was, you know, a massive hero in Japan and basically running everything.
[00:38:58.720 --> 00:39:04.480] And so our intelligence services, I think it was 1943 or 44.
[00:39:04.480 --> 00:39:11.200] No, I think it was 44, found out where his plane was going from this place to that place in the Philippines somewhere.
[00:39:11.200 --> 00:39:17.200] And so we sent, I don't know, like a dozen different P-51 fighter jets and they found him.
[00:39:17.200 --> 00:39:24.800] And there he is, one plane with two protecting planes, and they shot him down and found him in the jungle in his seat, still dead, right?
[00:39:24.800 --> 00:39:32.400] So then there arose a big controversy about who gets the credit for shooting him down because it wasn't clear which of the half dozen P-51s it was.
[00:39:32.400 --> 00:39:34.160] And they all said, oh, it was mine.
[00:39:34.160 --> 00:39:35.360] I'm the one that got the bullets.
[00:39:35.840 --> 00:39:37.440] And they all said that.
[00:39:37.440 --> 00:39:45.600] Anyway, but the point of this is that, but for the intelligence services, they wouldn't have had anywhere near where to go, right?
[00:39:45.600 --> 00:39:49.200] So really, it's the intelligence services that broke the Japanese code.
[00:39:49.200 --> 00:39:51.120] Those are the guys that caused his death.
[00:39:51.120 --> 00:39:54.000] But there are people above them that have to order the assassination.
[00:39:54.320 --> 00:39:57.200] Okay, now we're going to try to do this mission and take him out.
[00:39:57.200 --> 00:40:07.120] And there's somebody above that, the general or the admiral, and maybe even President Roosevelt at the top is the one who is the real cause of Admiral Yamamoto's death.
[00:40:07.120 --> 00:40:08.960] So different levels.
[00:40:09.280 --> 00:40:12.320] You're getting into the whole.
[00:40:12.320 --> 00:40:16.240] So cause is a really interesting English word, right?
[00:40:16.240 --> 00:40:18.480] That has all sorts of meanings.
[00:40:18.480 --> 00:40:24.480] So when I was talking about it, I was thinking more like, how do we think about mechanisms?
[00:40:24.960 --> 00:40:28.320] But what you're talking about is causal attribution.
[00:40:28.320 --> 00:40:28.640] Oh, right.
[00:40:28.800 --> 00:40:31.480] Like the thing that we have to do in a court of law.
[00:40:29.840 --> 00:40:35.720] We have to decide who to blame, who's guilty, who, what's the cause.
[00:40:36.040 --> 00:40:41.480] Or if you're troubleshooting a problem, then you have to engage in causal attribution.
[00:40:41.720 --> 00:40:43.720] But cause is really interesting.
[00:40:43.720 --> 00:40:52.600] Like, think about the difference between Mary killed Joe versus Mary caused Joe to die.
[00:40:54.040 --> 00:40:57.000] They're slightly different meanings, right?
[00:40:57.320 --> 00:41:01.720] And it's in the first one that you really want to throw Mary in jail.
[00:41:02.360 --> 00:41:02.760] Right?
[00:41:02.760 --> 00:41:06.200] The second one, it's sort of Mary's a little removed.
[00:41:06.840 --> 00:41:13.000] So that's another, you know, yet another property of the word cause.
[00:41:13.320 --> 00:41:15.480] It's a sophisticated word.
[00:41:15.480 --> 00:41:16.280] Yeah, indeed.
[00:41:16.280 --> 00:41:19.080] I guess in the law they call it the but for question.
[00:41:19.080 --> 00:41:23.320] You know, but for John with the gun, Mary would be alive or whatever.
[00:41:24.280 --> 00:41:34.040] I saw an interesting example of this that I've brought up a couple of times, because I watched that new OJ doc series on Netflix, multi-part, most of the stuff we already know.
[00:41:34.040 --> 00:41:42.360] But they did have an interview with his agent, his ex-agent, who said, well, after the trial, years after the trial, they're just hanging out and having beers.
[00:41:42.360 --> 00:41:48.840] And, you know, he kind of, you know, got up the gumption to ask him, so OJ, come on, did you do it?
[00:41:49.480 --> 00:41:58.920] And OJ, you know, like had a few drinks and paused and said, you know, if Nicole hadn't come to the door with a knife, she'd still be alive.
[00:41:59.240 --> 00:42:01.640] And this guy's just like, holy shit.
[00:42:01.960 --> 00:42:04.840] You know, because, you know, we know he was a stalker.
[00:42:04.840 --> 00:42:08.200] We know he went over there to spy on her and watch her and so on.
[00:42:08.520 --> 00:42:12.440] And it's possible he didn't go over there intending to kill her.
[00:42:12.440 --> 00:42:14.040] He was just another stalker.
[00:42:14.040 --> 00:42:16.160] And there she was with Ron Goldman.
[00:42:14.840 --> 00:42:21.200] And maybe she, you know, she had already been calling 911 multiple times on him.
[00:42:21.280 --> 00:42:24.240] And you could hear him screaming in the background like he was going to kill her.
[00:42:24.240 --> 00:42:27.200] And she had been beaten and bruised by him with those pictures.
[00:42:27.200 --> 00:42:28.640] You see her bruised up face.
[00:42:28.640 --> 00:42:32.080] So maybe she grabbed a knife just to protect herself in case.
[00:42:32.080 --> 00:42:33.600] And then she opens the door.
[00:42:33.600 --> 00:42:34.720] He opens the door.
[00:42:34.720 --> 00:42:37.680] She's got the knife and all hell breaks loose and then she's dead.
[00:42:37.680 --> 00:42:41.600] And then Ron Goldman comes in to fight for her and boom, he's dead.
[00:42:41.600 --> 00:42:43.280] You know, it's something like that.
[00:42:43.360 --> 00:42:50.800] You know, how do you think of OJ, as all these friends were saying, I had no idea he had this in him to do?
[00:42:50.800 --> 00:42:53.760] He might not have known he had that in him to do, right?
[00:42:53.760 --> 00:42:55.120] I mean, it just snaps.
[00:42:56.240 --> 00:43:01.680] It's like the problem of predicting who's the next school shooter is going to be.
[00:43:01.680 --> 00:43:02.880] Impossible.
[00:43:02.880 --> 00:43:05.120] They may not even know, right?
[00:43:05.120 --> 00:43:07.440] Until that morning, somebody commits suicide.
[00:43:07.440 --> 00:43:08.560] They don't even know they're going to do it.
[00:43:08.720 --> 00:43:10.720] They wake up one morning and go, you know, that's it.
[00:43:10.720 --> 00:43:12.080] I'm done today.
[00:43:12.080 --> 00:43:12.560] Yeah.
[00:43:12.960 --> 00:43:13.440] Yeah.
[00:43:13.760 --> 00:43:31.280] Yeah, no, I mean, what I hear you saying is agreeing with my claim that the world is incredibly complex and every event is different and there's a whole different set of variables that are relevant in every individual case and we just can't have access to all of it.
[00:43:31.280 --> 00:43:33.120] So we have to simplify.
[00:43:33.120 --> 00:43:39.040] Well, this is your argument against consequentialism is that you can't possibly do all the calculations.
[00:43:39.040 --> 00:43:40.240] Nobody can.
[00:43:40.640 --> 00:43:43.120] And so therefore, we have to do shortcuts.
[00:43:43.120 --> 00:43:44.240] So what are the shortcuts?
[00:43:44.240 --> 00:43:46.000] Well, these are the sacred values.
[00:43:46.000 --> 00:43:46.240] All right.
[00:43:46.640 --> 00:43:48.000] Well, that's one kind of shortcut.
[00:43:48.160 --> 00:43:48.880] Okay, yeah, go ahead.
[00:43:49.280 --> 00:43:49.920] Yeah, go ahead.
[00:43:49.920 --> 00:43:50.720] Give us a few others.
[00:43:50.720 --> 00:43:51.360] Yeah.
[00:43:52.000 --> 00:43:56.960] Well, so there are the heuristics we use to make judgments, right?
[00:43:56.920 --> 00:44:02.360] There's the famous representativeness heuristic and availability heuristic and anchoring and adjustment.
[00:44:02.680 --> 00:44:09.640] Those are all ways of simplifying probability judgments or confidence judgments, right?
[00:44:10.520 --> 00:44:15.240] And it's worth noting that they all are useful.
[00:44:15.240 --> 00:44:28.680] So I don't know how familiar you or your listeners are with these ideas, but representativeness is basically the idea that we think something is probable if it's similar to a model of that thing, right?
[00:44:28.680 --> 00:44:32.760] The famous example is Linda the bank teller.
[00:44:32.760 --> 00:44:41.480] So I said, there's this woman, she went to Berkeley and she majored in political philosophy and she was active in social movements.
[00:44:41.480 --> 00:44:47.720] Do you think it's more likely that she's a bank teller now or a feminist bank teller?
[00:44:47.720 --> 00:44:51.400] And most people say, well, a feminist bank teller, right?
[00:44:51.400 --> 00:44:52.840] Because she seems like a feminist.
[00:44:52.840 --> 00:44:55.080] She has all the properties of the feminist.
[00:44:55.080 --> 00:45:03.400] Well, she can't be more likely to be a feminist bank teller than a bank teller, because if she's a feminist bank teller, she's still a bank teller, right?
[00:45:03.720 --> 00:45:05.960] So that's called the conjunction fallacy.
[00:45:05.960 --> 00:45:17.240] And it's supposed to be evidence that we use representativeness, namely, she is representative of feminists, and so we judge the probability high that she's a feminist.
[00:45:17.240 --> 00:45:27.000] In the same sense that I judge the probability high that, you know, what you're holding there is a pen because it looks like a pen and sounds like a pen and quacks like a pen.
[00:45:27.000 --> 00:45:28.360] So it must be a pen, right?
[00:45:29.560 --> 00:45:39.080] So this is a very useful heuristic that gets us the right answer most of the time, but it does cause certain errors, systematic errors.
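To see the systematic error concretely, here is a minimal sketch in Python of the conjunction rule behind the Linda problem; the probabilities are assumed purely for illustration and do not come from the transcript or from any study.

    # Conjunction rule: P(A and B) can never exceed P(A).
    # All numbers below are assumed for illustration only.
    p_bank_teller = 0.05            # assumed P(Linda is a bank teller)
    p_feminist_given_teller = 0.80  # assumed P(feminist | bank teller), given her profile

    p_feminist_bank_teller = p_bank_teller * p_feminist_given_teller

    print(f"P(bank teller)          = {p_bank_teller:.3f}")
    print(f"P(feminist bank teller) = {p_feminist_bank_teller:.3f}")
    # Judging the conjunction as more likely than the single event violates this:
    assert p_feminist_bank_teller <= p_bank_teller

Whatever numbers you plug in, the conjunction comes out no larger than the single event, which is why judging "feminist bank teller" as more likely is a fallacy rather than a difference of opinion.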
[00:45:39.080 --> 00:45:41.400] And it's a simplifying heuristic.
[00:45:41.400 --> 00:45:43.000] And, you know, there's a host of them.
[00:45:43.000 --> 00:45:48.960] You go online and look for heuristics of judgment, you'll find a couple of dozen.
[00:45:50.960 --> 00:45:53.360] So that's one way we simplify.
[00:45:53.600 --> 00:45:54.960] Another way we simplify.
[00:45:55.280 --> 00:46:11.680] Hang on, before you go there, let's talk about some of Tetlock's forbidden taboo heuristics, you know, that we generalize by blacks are more likely to be like this, or Jews are more likely to be like that, or women are more likely to be like this.
[00:46:11.680 --> 00:46:16.080] I mean, you could back them up with some statistics.
[00:46:16.080 --> 00:46:31.760] You know, the black crime rate is higher than the white crime rate, or, I've actually heard this one, women score higher in neuroticism on the Big Five personality dimensions, therefore they have lower control over their emotional extremes.
[00:46:31.760 --> 00:46:39.520] Therefore, maybe they're not as likely to be successful as computer programmers or engineers or CEOs of Fortune 500 companies.
[00:46:39.840 --> 00:46:41.280] These are arguments I've heard made.
[00:46:41.280 --> 00:46:44.480] Of course, Tetlock says these are for the most part forbidden.
[00:46:44.480 --> 00:46:47.440] I mean, you really should not be asking.
[00:46:47.440 --> 00:46:51.120] And maybe that is a, I don't know if that's a sacred value or a consequence.
[00:46:51.120 --> 00:47:01.680] Maybe it's better we don't ask those questions because of the history of how those kinds of questions are answered, which is not good for the treatment of these groups.
[00:47:01.680 --> 00:47:02.560] Right.
[00:47:02.880 --> 00:47:08.240] Well, so I think what you're talking about are stereotypes.
[00:47:08.240 --> 00:47:08.480] Yes.
[00:47:08.800 --> 00:47:09.840] Yes, yeah, prejudices.
[00:47:10.160 --> 00:47:15.920] And you know, yes, we think in terms of stereotypes, there's no question about it.
[00:47:18.160 --> 00:47:20.000] Is that good or is that bad?
[00:47:20.000 --> 00:47:26.160] Well, so you could, you know, that's another simplifying heuristic, right?
[00:47:26.480 --> 00:47:34.520] So it gets us the right answer most of the time, I would argue, but it also leads to certain kinds of systematic errors.
[00:47:29.600 --> 00:47:36.760] So we have to be super careful about it.
[00:47:37.080 --> 00:47:49.640] I mean, if you say, here's the example I like to use in my class: I say, it turns out that most snowshoe thieves are Canadian.
[00:47:50.760 --> 00:47:53.240] I'm allowed to say this because I'm Canadian.
[00:47:53.240 --> 00:47:54.200] All right.
[00:47:56.120 --> 00:48:02.840] And like, so let's, so is it true that Canadians have the property of being snowshoe thieves?
[00:48:02.840 --> 00:48:04.760] No, that's not true.
[00:48:04.760 --> 00:48:06.600] What I said was the opposite, right?
[00:48:06.600 --> 00:48:16.840] So, but is it wrong to say that the prototypical snowshoe thief or our stereotype of snowshoe thieves shouldn't involve them being Canadian?
[00:48:16.840 --> 00:48:19.400] Well, if it's a fact, it's a fact, right?
[00:48:19.400 --> 00:48:22.840] Like, it doesn't mean it's always the case.
[00:48:22.840 --> 00:48:28.280] It doesn't mean it has to be the case, but it may be true, right, about snowshoe thieves.
[00:48:28.280 --> 00:48:29.160] I don't know.
[00:48:29.160 --> 00:48:34.040] Or, you know, most Canadians own warm winter jackets.
[00:48:34.040 --> 00:48:36.120] Well, yeah, that's true.
[00:48:36.440 --> 00:48:40.600] Like, and we use these stereotypes.
[00:48:40.600 --> 00:48:43.240] You know what can really slow down your business?
[00:48:43.240 --> 00:48:47.560] Anything from sophisticated phishing attacks to plain old spotty internet.
[00:48:47.560 --> 00:48:57.240] Get business secure internet and protect your network with built-in cybersecurity starting at just $60 a month with no contracts and Optimum Business's 60-day money-back guarantee.
[00:48:57.240 --> 00:49:00.280] Plus, the more services you bundle, the more you save.
[00:49:00.280 --> 00:49:02.600] Don't let your network slow down your business.
[00:49:02.600 --> 00:49:16.160] Get Optimum Business Secure Internet starting at just $60 a month at optimum.com/slash business. Constantly, constantly, like every five minutes, just to negotiate the world, right?
[00:49:16.960 --> 00:49:22.960] Because they capture statistical invariants in the world.
[00:49:22.960 --> 00:49:25.280] And in that sense, they're accurate.
[00:49:25.280 --> 00:49:26.640] Are they always right?
[00:49:26.640 --> 00:49:28.240] Well, no, of course not.
[00:49:28.240 --> 00:49:33.600] Like, you know, my brother, it turns out, lives in Canada and only wears a black leather jacket.
[00:49:33.600 --> 00:49:35.360] That's all he ever wears, right?
[00:49:35.360 --> 00:49:37.840] He doesn't own a winter coat.
[00:49:38.640 --> 00:49:40.560] But, you know, he's the exception.
[00:49:40.560 --> 00:49:44.480] And the stereotype captures most of the information.
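The snowshoe example turns on the direction of the conditional probability: "most snowshoe thieves are Canadian" is a claim about P(Canadian given thief), which says almost nothing about P(thief given Canadian) once base rates are taken into account. A minimal sketch in Python, with numbers that are assumed purely for illustration:

    # Assumed, made-up numbers to illustrate why "most snowshoe thieves are Canadian"
    # does not imply "Canadians are snowshoe thieves".
    canadians = 40_000_000    # assumed population of Canada
    thieves = 200             # assumed number of snowshoe thieves anywhere
    canadian_thieves = 180    # assumed: 90% of snowshoe thieves are Canadian

    p_canadian_given_thief = canadian_thieves / thieves     # 0.90
    p_thief_given_canadian = canadian_thieves / canadians   # 0.0000045

    print(f"P(Canadian | snowshoe thief) = {p_canadian_given_thief:.2f}")
    print(f"P(snowshoe thief | Canadian) = {p_thief_given_canadian:.7f}")

The stereotype tracks the first number; the systematic error comes from treating it as if it were the second.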
[00:49:44.800 --> 00:49:49.520] So, and actually, it's interesting reading Walter Lippmann on this, right?
[00:49:49.520 --> 00:49:54.080] He wrote in 1922 and he used the word stereotype.
[00:49:54.080 --> 00:50:00.000] And he was talking about World War I propaganda, which is full of stereotypes, right?
[00:50:00.000 --> 00:50:02.080] Like the worst kind of stereotype.
[00:50:02.080 --> 00:50:07.280] So stereotypes can be deployed in nefarious ways, for sure.
[00:50:07.600 --> 00:50:15.200] You can build stereotypes that are systematically incorrect in order to degrade a people, right?
[00:50:15.200 --> 00:50:24.960] And there's all sorts of history of various countries doing that in order to scapegoat people.
[00:50:24.960 --> 00:50:30.000] Look, I think Trump is doing that now with Venezuelans, right?
[00:50:33.680 --> 00:50:36.320] So it's a way of simplifying.
[00:50:36.320 --> 00:50:39.440] It's a necessary way of simplifying.
[00:50:39.760 --> 00:50:46.800] And it's funny, you know, like when we talk about dog stereotypes, which I do all the time because I take my dogs to the dog park.
[00:50:46.800 --> 00:50:48.640] And what do you talk about at the dog park?
[00:50:50.800 --> 00:50:58.000] And so, you know, like, yes, German shepherds are like this, and labs are like that, and poodles are like this.
[00:50:58.000 --> 00:51:05.880] And like, it's just not true in every single case, but the stereotypes capture a lot, and they allow conversation.
[00:51:06.760 --> 00:51:13.160] So, we have to appeal to these invariants in order to simplify the world enough to try to make sense of it.
[00:51:13.160 --> 00:51:14.600] Yeah, I understand, of course.
[00:51:14.600 --> 00:51:24.040] But Tetlock's point about the forbidden taboo base rates, for example, what is the crime rate, the black-white crime rate differences in intersection, whatever.
[00:51:24.040 --> 00:51:30.920] Even asking the question is taboo because it suggests something that's deep that bothers people.
[00:51:30.920 --> 00:51:34.760] I think this comes from Tetlock, but I got it from Pinker's book, Rationality.
[00:51:34.760 --> 00:51:36.520] So, I'm not quite sure about that.
[00:51:36.520 --> 00:51:43.720] But you're at a dinner party with all your couple friends, and you float this one by them.
[00:51:43.720 --> 00:51:50.360] So, of course, none of us would ever cheat on our spouses, but if you had to cheat with somebody here at the table, who would it be?
[00:51:51.320 --> 00:51:54.360] You cannot answer the question, just keep your mouth shut.
[00:51:54.360 --> 00:51:56.280] There is no correct answer, right?
[00:51:57.400 --> 00:52:00.600] Even if you say, I would never do this, but it would be Mary for sure.
[00:52:00.600 --> 00:52:05.880] No, you can't say that, you know, or worse, you know, back to the other example.
[00:52:05.880 --> 00:52:13.240] You know, of course, none of us here at the table are prejudiced, but you know, if you had to pick one group out that the stereotypes fit the statistics, who would it be?
[00:52:13.240 --> 00:52:17.240] You can't answer, well, of course, I'm not prejudiced, but it's the blacks or the Jews or whatever.
[00:52:17.240 --> 00:52:19.480] You can't. You keep your mouth shut.
[00:52:19.480 --> 00:52:23.640] And Tetlock's point is: maybe that's actually okay.
[00:52:23.640 --> 00:52:27.960] In a civil society, maybe some things are taboo enough we shouldn't do it.
[00:52:27.960 --> 00:52:35.960] So, another example of this is a debate in psychology circles of the scientific study of IQ differences between races and their causes.
[00:52:35.960 --> 00:52:37.400] Oh, boy.
[00:52:37.400 --> 00:52:42.120] I mean, I just, when I was in graduate school, this was already a hot topic in psychology.
[00:52:42.120 --> 00:52:43.960] And I thought, I'm never touching this one.
[00:52:43.960 --> 00:52:44.520] Oh, my God.
[00:52:44.520 --> 00:52:46.720] And it's only gotten worse.
[00:52:46.720 --> 00:52:47.200] Yeah.
[00:52:47.520 --> 00:52:48.080] Yeah.
[00:52:44.840 --> 00:52:49.360] No, for sure.
[00:52:50.320 --> 00:52:57.360] You know, and part of the problem with those things is that you're making judgments about ill-defined categories, right?
[00:52:57.360 --> 00:53:01.840] Like, what do you even mean by an African-American?
[00:53:01.840 --> 00:53:05.600] Like, what percentage African genes do they have to have?
[00:53:05.600 --> 00:53:07.760] And, you know, none of them are 100%.
[00:53:07.920 --> 00:53:09.840] Very few of them are 100%.
[00:53:09.840 --> 00:53:12.400] And look, my parents were born in South Africa.
[00:53:12.400 --> 00:53:13.600] Does that make me a Southern?
[00:53:14.560 --> 00:53:15.760] Oh, okay.
[00:53:16.960 --> 00:53:21.360] So, just defining the categories itself is tough.
[00:53:21.360 --> 00:53:25.440] But here's an interesting example of the kind of thing you're talking about.
[00:53:25.440 --> 00:53:29.120] Courts of law have to ignore base rates, right?
[00:53:29.120 --> 00:53:29.520] Yes.
[00:53:29.520 --> 00:53:38.560] So if someone is accused of being a snowshoe thief, you're not allowed to say, well, this person is likely to be guilty because they're Canadian.
[00:53:39.680 --> 00:53:46.720] Even if Canadians are the sole snowshoe thieves around, you're not allowed to appeal to base rates.
[00:53:46.720 --> 00:53:53.600] And so that's, I think, another example of a forbidden taboo subject.
[00:53:54.240 --> 00:54:01.120] All right, let me look at some specific examples: abortion, euthanasia, death penalty, capital punishment, so on.
[00:54:01.120 --> 00:54:02.320] I'll read from your book.
[00:54:02.320 --> 00:54:11.520] Cameron Todd Willingham. Willingham's house burned down in Texas in December 1991, killing his three young daughters.
[00:54:11.520 --> 00:54:15.600] A police investigation determined that the fire had been intentionally set.
[00:54:15.600 --> 00:54:19.680] Willingham himself quickly became a suspect and eventually was accused.
[00:54:19.680 --> 00:54:22.800] He was tried and convicted of arson and murder in 1992.
[00:54:22.800 --> 00:54:27.920] He insisted that the fire was accidental and maintained his innocence throughout the trial and afterwards.
[00:54:27.920 --> 00:54:34.840] He even turned down a deal of a life in prison term in exchange for a guilty plea that would have taken execution off the table.
[00:54:34.840 --> 00:54:56.200] Partly on the strength of claims made by medical experts that Willingham's tattoo of a skull and serpent fit the profile of a sociopath, oh my God, crazy, along with a statement by a psychologist that his poster of the heavy metal band Iron Maiden and another of a fallen angel from the rock band Led Zeppelin, oh my God, I love Led Zeppelin.
[00:54:56.200 --> 00:54:57.320] What does that make me?
[00:54:57.960 --> 00:55:01.720] was an indicator of cult-type activities.
[00:55:01.720 --> 00:55:06.040] Willingham was sentenced to death and executed by lethal injection in 2004.
[00:55:06.040 --> 00:55:07.480] There is good reason to believe that Willingham
[00:55:07.560 --> 00:55:09.000] was in fact innocent.
[00:55:09.000 --> 00:55:14.680] The evidence against him was effectively rebutted in 2004 by fire investigator Gerald Hurst.
[00:55:14.840 --> 00:55:21.800] In 2009, a report by the Texas Forensic Science Commission found that the arson investigation in Willingham's case was deeply flawed.
[00:55:21.880 --> 00:55:25.160] By the way, we did do the thing in Skeptic about fire investigations.
[00:55:25.160 --> 00:55:30.360] And, you know, if the paint is curled this way or if the glass is melted this way, or you know, it's this or that.
[00:55:30.360 --> 00:55:31.800] It's mostly bullshit.
[00:55:31.800 --> 00:55:35.480] You know, there's really good research on this under controlled conditions.
[00:55:35.560 --> 00:55:41.560] You know, when you have blind conditions, they're not very good at predicting which ones were arson and which ones were accidents.
[00:55:41.560 --> 00:55:43.960] There's a lot of after-the-fact reasoning in that.
[00:55:43.960 --> 00:55:45.320] But that's set aside.
[00:55:45.560 --> 00:55:48.760] You know, how do people think about the death penalty now?
[00:55:48.760 --> 00:55:50.360] You hear consequential arguments.
[00:55:50.360 --> 00:55:54.040] It costs more to keep somebody on death row than if they were in prison for life.
[00:55:54.040 --> 00:55:56.360] In the long run, you'd save money and so on.
[00:55:56.360 --> 00:55:58.280] But others make sacred value arguments.
[00:55:58.280 --> 00:55:59.800] We don't want to give the state.
[00:55:59.800 --> 00:56:01.320] This is a libertarian argument.
[00:56:01.320 --> 00:56:03.720] I don't want the state to have the power over life and death.
[00:56:03.720 --> 00:56:06.520] I don't care what the guy did.
[00:56:06.760 --> 00:56:08.760] So you hear those kind of arguments.
[00:56:08.760 --> 00:56:09.800] Yeah, yeah.
[00:56:10.680 --> 00:56:11.080] Yeah.
[00:56:11.080 --> 00:56:20.720] No, I think that capital punishment is definitely the kind of argument that elicits sacred values on all sides.
[00:56:21.040 --> 00:56:26.640] You know, I have spent my life actually making the financial argument, right?
[00:56:27.440 --> 00:56:31.840] Someone kills 30 people, I don't care about them anymore.
[00:56:31.840 --> 00:56:44.960] But if it's going to cost more to put them to death than to just keep them in prison for life, then why should the state be spending its money on this person that isn't worth anything?
[00:56:45.200 --> 00:56:48.880] But that argument has never gone down very well with anybody.
[00:56:49.360 --> 00:56:52.400] People want to think about this in terms of sacred values.
[00:56:52.400 --> 00:56:53.440] Absolutely.
[00:56:53.440 --> 00:57:00.080] I mean, the point of the story you just read, and by the way, I was attracted to it by the Led Zeppelin reference.
[00:57:01.280 --> 00:57:02.160] That was funny.
[00:57:02.960 --> 00:57:05.760] That was too much for me.
[00:57:06.640 --> 00:57:19.200] But it's that he himself, the accused, refused to admit guilt and allowed the state to put him to death.
[00:57:19.200 --> 00:57:31.280] Like, that's how strongly he felt he held this sacred value that he did not want to admit guilt for something that he clearly didn't think he had done.
[00:57:32.800 --> 00:57:42.560] And so the point is the power of sacred values, that they're actually, you know, they have the power of life and death.
[00:57:42.560 --> 00:57:44.080] Well, I guess, you know, murder.
[00:57:44.080 --> 00:57:45.280] I mean, is murder wrong?
[00:57:45.280 --> 00:57:49.600] I did a debate with Dennis Prager once on that very subject.
[00:57:49.600 --> 00:57:51.720] But if you look it up, what's the, what is the word?
[00:57:51.720 --> 00:57:52.880] How do you define the word murder?
[00:57:52.880 --> 00:57:54.400] The wrongful killing of somebody.
[00:57:54.400 --> 00:57:58.800] Well, then you're asking, is wrongfully killing somebody wrong?
[00:57:58.800 --> 00:58:01.400] Yeah, it's right there in the definition, right?
[00:58:01.400 --> 00:58:03.720] But of course, if you say, is killing somebody wrong?
[00:57:59.760 --> 00:58:04.840] Well, it's like, well, it depends.
[00:57:59.920 --> 00:58:05.960] Self-defense.
[00:58:06.760 --> 00:58:10.600] You know, it's a just war and we're going to war to fight these evil people.
[00:58:10.600 --> 00:58:12.280] Or, you know, capital punishment.
[00:58:12.280 --> 00:58:14.600] The state, it's right there in the state constitution.
[00:58:14.600 --> 00:58:15.720] We have to execute them.
[00:58:15.720 --> 00:58:17.000] The courts ordered it.
[00:58:17.000 --> 00:58:18.200] End of story.
[00:58:18.440 --> 00:58:18.680] Right?
[00:58:18.680 --> 00:58:21.960] So, I mean, it depends on the context in that case.
[00:58:21.960 --> 00:58:22.520] Yeah.
[00:58:22.520 --> 00:58:27.240] But remember, I mean, we do all sorts of things that we know are wrong, right?
[00:58:27.240 --> 00:58:34.840] We drink too much or we speed on the highway or, you know, I mean, all kinds of things all the time.
[00:58:35.160 --> 00:58:40.840] And yet, very few of us are willing to commit murder.
[00:58:41.160 --> 00:58:52.360] So there's a sacred value there that distinguishes that act from other acts which also have that circularity in their definition, right?
[00:58:52.360 --> 00:58:56.520] Of being wrong by virtue of being defined as being wrong.
[00:58:56.520 --> 00:58:59.400] I mean, you know, what does it mean to go over the speed limit?
[00:58:59.400 --> 00:59:02.120] It means you're going faster than you're supposed to go.
[00:59:02.520 --> 00:59:05.320] But there's not a sacred value attached to that.
[00:59:05.320 --> 00:59:05.960] Yeah.
[00:59:06.280 --> 00:59:09.000] I mean, I think about, you know, moral bad luck.
[00:59:09.000 --> 00:59:16.280] If, you know, I mean, my wife and I go out to dinner, I have maybe two drinks, maybe three glasses of wine if we're there for a couple hours.
[00:59:16.280 --> 00:59:18.200] And then I get in the car and drive home.
[00:59:18.200 --> 00:59:19.480] You know, nothing's ever happened.
[00:59:19.480 --> 00:59:20.440] But what if it did?
[00:59:20.440 --> 00:59:25.160] You know, what if I got in a wreck and I got a DUI and somebody was hurt or killed?
[00:59:25.160 --> 00:59:31.720] You know, but I mean, I've got a thousand, I don't know, a thousand, whatever, lots, dozens, whatever, in my 70 years.
[00:59:32.120 --> 00:59:35.720] But, you know, the other guy, he just, you know, got bad luck.
[00:59:36.040 --> 00:59:40.200] And there's not a lot of justice, cosmic justice behind all that.
[00:59:40.520 --> 00:59:41.160] Yeah.
[00:59:41.480 --> 00:59:41.880] Yeah.
[00:59:41.880 --> 00:59:47.520] No, those cases, I mean, I don't deal with them in the book, but they're interesting cases.
[00:59:47.760 --> 00:59:56.400] So you have two people, they're drunk the same amount, they've eaten the same amount, they're going just as far as each other.
[00:59:56.400 --> 00:59:59.440] One, you know, so they're both slightly drunk.
[00:59:59.440 --> 01:00:04.640] One of them swerves and hits a tree, and the other swerves and hits a child.
[01:00:04.960 --> 01:00:12.320] And the one who hits a tree, you know, has to buy a new car, and the one who hits the child goes to jail for the rest of their lives.
[01:00:13.760 --> 01:00:22.320] And the world sort of is identical, except for what happened to be in their way when they swerved, right?
[01:00:22.320 --> 01:00:25.600] Which is not something they had any control over.
[01:00:25.920 --> 01:00:31.360] And I think it's an interesting question whether that's justified or not.
[01:00:31.360 --> 01:00:31.840] Yeah.
[01:00:32.480 --> 01:00:36.240] I think often these things come down to conflicting rights.
[01:00:36.240 --> 01:00:39.120] So they're, I guess, in a way, conflicting sacred values.
[01:00:39.120 --> 01:00:47.040] But on the other hand, you could have conflicting consequential calculations, like in your thought experiment in vignette number two.
[01:00:47.280 --> 01:00:53.120] You know, is it $50 for the kids' lunch program or $50 for the friends' medicine?
[01:00:54.080 --> 01:00:55.520] Is there a right answer?
[01:00:55.840 --> 01:00:57.360] Probably not, really.
[01:00:57.360 --> 01:00:58.080] No.
[01:00:58.400 --> 01:00:58.880] No.
[01:00:59.200 --> 01:00:59.600] Yeah.
[01:00:59.600 --> 01:01:02.240] I mean, that's why you have to deliberate.
[01:01:02.480 --> 01:01:10.240] If there were a clear right answer, then presumably the person would get no extra credit for spending the time thinking.
[01:01:10.240 --> 01:01:10.720] Yeah.
[01:01:11.280 --> 01:01:16.320] So on the abortion issue, I think I was thinking about that also reading that section in your book.
[01:01:16.640 --> 01:01:18.400] It really, it's kind of a conflicting rights.
[01:01:18.720 --> 01:01:24.240] I'm pro-choice, but I've tried to listen to the pro-life arguments and it's like, okay, those are pretty good arguments.
[01:01:24.240 --> 01:01:30.680] You know, their sacred value is, you know, the right to the fetus has a right to life.
[01:01:30.680 --> 01:01:32.920] Okay, yeah, I'll grant that.
[01:01:29.120 --> 01:01:35.400] But the woman has a right to life too, the mother.
[01:01:35.720 --> 01:01:39.480] You know, so we're going to value one or the other, and then you get, you know, get the calculations.
[01:01:39.880 --> 01:01:46.040] What's the likelihood she'll die in childbirth or whatever, much lower now than before, and so on and so forth.
[01:01:46.280 --> 01:01:51.160] Or what could the child have become, you know, that had Beethoven's mother aborted him?
[01:01:51.160 --> 01:01:54.280] You know, you've heard all these argu
Prompt 2: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 3: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
  "segments": [
    {
      "segment_title": "Topic Discussion",
      "timestamp": "01:15:30",
      "key_takeaway": "main point from this segment",
      "segment_summary": "brief description of what was discussed"
    }
  ]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
  "media_mentions": [
    {
      "title": "Exact Title as Mentioned",
      "category": "Book",
      "author_artist": "N/A",
      "context": "Brief context of why it was mentioned",
      "context_phrase": "The exact sentence or phrase where it was mentioned",
      "timestamp": "estimated time like 01:15:30"
    }
  ]
}
If no media is mentioned, return: {"media_mentions": []}
Prompt 5: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 2 of 2 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
ten these things come down to conflicting rights.
[01:00:36.240 --> 01:00:39.120] So they're, I guess, in a way, conflicting sacred values.
[01:00:39.120 --> 01:00:47.040] But on the other hand, you could have conflicting consequential calculations, like in your thought experiment in vignette number two.
[01:00:47.280 --> 01:00:53.120] You know, is it $50 for the kids' lunch program or $50 for the friends' medicine?
[01:00:54.080 --> 01:00:55.520] Is there a right answer?
[01:00:55.840 --> 01:00:57.360] Probably not, really.
[01:00:57.360 --> 01:00:58.080] No.
[01:00:58.400 --> 01:00:58.880] No.
[01:00:59.200 --> 01:00:59.600] Yeah.
[01:00:59.600 --> 01:01:02.240] I mean, that's why you have to deliberate.
[01:01:02.480 --> 01:01:10.240] If there were a clear right answer, then presumably the person would get no extra credit for spending the time thinking.
[01:01:10.240 --> 01:01:10.720] Yeah.
[01:01:11.280 --> 01:01:16.320] So on the abortion issue, I think I was thinking about that also reading that section in your book.
[01:01:16.640 --> 01:01:18.400] It really, it's kind of a conflicting rights.
[01:01:18.720 --> 01:01:24.240] I'm pro-choice, but I've tried to listen to the pro-life arguments and it's like, okay, those are pretty good arguments.
[01:01:24.240 --> 01:01:30.680] You know, their sacred value is, you know, the right to the fetus has a right to life.
[01:01:30.680 --> 01:01:32.920] Okay, yeah, I'll grant that.
[01:01:29.120 --> 01:01:35.400] But the woman has a right to life too, the mother.
[01:01:35.720 --> 01:01:39.480] You know, so we're going to value one or the other, and then you get, you know, get the calculations.
[01:01:39.880 --> 01:01:46.040] What's the likelihood she'll die in childbirth or whatever, much lower now than before, and so on and so forth.
[01:01:46.280 --> 01:01:51.160] Or what could the child have become, you know, that had Beethoven's mother aborted him?
[01:01:51.160 --> 01:01:54.280] You know, you've heard all these arguments.
[01:01:54.440 --> 01:02:00.520] How do you think about the abortion issue in terms of your consequentialism versus sacred values?
[01:02:00.520 --> 01:02:02.520] No, exactly the same way you do.
[01:02:02.760 --> 01:02:06.680] I mean, people tend to approach it from a sacred values perspective.
[01:02:07.000 --> 01:02:08.680] That's clear.
[01:02:08.680 --> 01:02:14.520] But in fact, it's an act that has all kinds of consequences.
[01:02:14.520 --> 01:02:20.920] And I think you have to weigh those consequences if you want to be wise when making a decision about abortion.
[01:02:20.920 --> 01:02:36.920] I mean, I'm also pro-choice, but I also think we should do what we can to minimize abortion, not only because of the fetus, and whether the fetus has rights or not is not something that I have any clear perspective on.
[01:02:37.240 --> 01:02:46.680] But there's no question that having an abortion can ruin someone's life, but it can also save someone's life.
[01:02:47.400 --> 01:02:49.800] There are all sorts of emotional consequences.
[01:02:49.800 --> 01:02:52.280] There are physical consequences, right?
[01:02:52.280 --> 01:02:56.280] There's the possibility of something terrible going wrong.
[01:02:56.600 --> 01:03:04.600] And then, yeah, I mean, the issue you brought up about who the child would have become is a whole other domain of uncertainty.
[01:03:04.840 --> 01:03:08.920] So there's a ton of uncertainty that really has to be considered.
[01:03:08.920 --> 01:03:24.160] So abortion is not a pleasant thing, and it's not something you want to happen, but I definitely think it's something that we would do better by thinking about consequentially rather than in terms of sacred values.
[01:03:24.160 --> 01:03:39.280] And I think all those medical professionals in red states that are now denying medical care to women because there's the possibility that they might end up killing the fetus in the process.
[01:03:39.280 --> 01:03:40.880] I think that's just pathetic.
[01:03:40.880 --> 01:03:43.040] Yeah, gone way too far.
[01:03:43.360 --> 01:03:47.040] Trans issue also, I've been thinking about that as a conflicting rights issues.
[01:03:47.040 --> 01:03:55.680] You don't want to support trans people if a man feels like a woman or vice versa and they want to compete in that other division in sports or whatever.
[01:03:55.680 --> 01:03:58.320] Yeah, I don't want to, I'm not opposed to that.
[01:03:58.320 --> 01:04:06.320] But if women are in the locker room going, I don't want a guy in here or a bathroom or whatever, it's like, well, yeah, I want to support women's rights also.
[01:04:06.320 --> 01:04:10.000] Well, you can't have, sometimes you can't have both, right?
[01:04:10.000 --> 01:04:14.160] I mean, I've looked for solutions like in sports, you could have an open division.
[01:04:14.400 --> 01:04:15.600] Frisbee Golf did this.
[01:04:16.160 --> 01:04:20.400] We have the men's division, women's division, and the open division where anybody would compete, right?
[01:04:20.400 --> 01:04:22.000] You know, or a gender-neutral bathroom.
[01:04:22.000 --> 01:04:24.000] It just says right on there, gender-neutral bathroom.
[01:04:24.080 --> 01:04:26.160] Okay, everybody knows, and that's fine.
[01:04:26.160 --> 01:04:28.880] You know, and you could build other locker rooms.
[01:04:28.880 --> 01:04:29.760] I don't know.
[01:04:30.000 --> 01:04:32.080] Sometimes the solutions get expensive, maybe.
[01:04:32.720 --> 01:04:36.720] But there, you know, I mean, life is mostly just trade-offs, complex trade-offs.
[01:04:36.720 --> 01:04:40.160] I mean, this was always Thomas Sowell's point that there are no solutions.
[01:04:40.160 --> 01:04:41.360] They're just trade-offs.
[01:04:41.360 --> 01:04:42.880] And most of life is trade-offs.
[01:04:42.880 --> 01:04:44.320] You can't have everything.
[01:04:45.600 --> 01:04:46.240] Yeah.
[01:04:47.440 --> 01:04:47.920] Yeah.
[01:04:47.920 --> 01:04:54.480] No, it sounds like in some sense in my book, I'm repeating the conclusions of Thomas Sowell.
[01:04:54.480 --> 01:04:55.760] In some ways, yeah.
[01:04:55.760 --> 01:04:56.560] I think so.
[01:04:57.120 --> 01:05:00.840] Yeah, no, look, lots of people have made the point, right?
[01:04:59.840 --> 01:05:03.320] I'm not going to pretend to be the first to say anything.
[01:05:04.440 --> 01:05:08.840] I'm bringing things together in a way that others haven't, I think.
[01:05:08.840 --> 01:05:12.920] But I think that that's exactly right.
[01:05:13.720 --> 01:05:27.080] On the other hand, so what's right is that when cases are difficult, you just have to take the case on its merits and you're just going to have to devote a lot of time to thinking really hard.
[01:05:27.080 --> 01:05:33.000] But, you know, you do have to appreciate that sometimes you don't have the resources to do that, right?
[01:05:33.000 --> 01:05:35.640] So it's like if you're at war, right?
[01:05:35.640 --> 01:05:43.000] And if you have to decide whether to kill the enemy, you just don't have time to deliberate about it.
[01:05:43.000 --> 01:05:45.480] You either do it or you don't, right?
[01:05:47.080 --> 01:05:53.560] You know, or which way to swerve, right?
[01:05:54.120 --> 01:06:03.320] If you're driving and suddenly something appears and you have to decide whether to risk your own life or to risk the life of the car beside you.
[01:06:03.320 --> 01:06:07.480] I mean, it's not something you have time to think about consequentially.
[01:06:07.480 --> 01:06:11.560] Moreover, we're making decisions constantly.
[01:06:12.120 --> 01:06:19.480] And we have to make decisions that other people understand often, if we're, say, working in an organization, right?
[01:06:20.120 --> 01:06:25.560] And decisions that other people will agree with, otherwise, we'll immediately become pariahs.
[01:06:25.880 --> 01:06:44.120] So while I agree with you that the best decision is one that, you know, maximizes benefits, minimizes costs, I do think we have no choice given the realities of life but to make use of sacred values a lot.
[01:06:45.200 --> 01:06:53.600] You talk about artificial intelligence in your book in the context of how complicated consequential calculations can be.
[01:06:53.600 --> 01:06:56.160] Maybe AI can do it for us.
[01:06:56.560 --> 01:07:06.160] Elon's Full Self-Driving mode, where engineers will design into the car, you know, the trolley problem choice, and it'll make the right choice.
[01:07:06.160 --> 01:07:07.840] Something like that.
[01:07:09.120 --> 01:07:10.560] Yeah, do you believe that?
[01:07:10.560 --> 01:07:12.480] No, I don't.
[01:07:12.800 --> 01:07:13.760] I do not.
[01:07:14.160 --> 01:07:15.840] Never mind Elon Musk, right?
[01:07:15.840 --> 01:07:19.920] Even if somebody respectable did it, I don't think it would be possible.
[01:07:19.920 --> 01:07:29.360] Yeah, actually, the thought experiment that I thought was most clever in the book is the one where I suggest that there's a decision aid, right?
[01:07:29.360 --> 01:07:47.120] Which makes decisions that are just as good as ours, because nowadays AI, you can imagine, can have at least as much knowledge as we do and at least as much computing power, and could make decisions that are at least as good as the ones that we make.
[01:07:47.120 --> 01:07:51.520] And so why not, you know, allow it to make all your decisions?
[01:07:51.840 --> 01:07:56.880] But then there are cases where that would just not be acceptable.
[01:07:56.880 --> 01:08:06.960] Like, for instance, if you decide to dump somebody or somebody decides to dump you and they say, well, you know, my AI told me to do it.
[01:08:07.760 --> 01:08:09.840] You're not going to forgive them.
[01:08:09.840 --> 01:08:25.200] Or if the Supreme Court was making a decision and they just said, oh, you know, each one of them allowed their little AI support or helpers make the call.
[01:08:25.200 --> 01:08:27.600] And so, you know, America was changed.
[01:08:27.600 --> 01:08:29.040] The immigration law was changed.
[01:08:29.120 --> 01:08:34.840] The president became king because the AIs decided that that would be the right thing to do.
[01:08:35.480 --> 01:08:38.040] I don't think anybody would find that acceptable.
[01:08:38.040 --> 01:08:39.560] No, that doesn't feel right.
[01:08:39.560 --> 01:08:49.000] But on the other hand, we're sort of saying evolution designed in us the capacity to reason to a certain extent that we make rational choices, more or less.
[01:08:49.240 --> 01:08:50.840] I did want to ask you about that.
[01:08:51.000 --> 01:08:55.320] Big debate I have here on the show all the time is to what extent humans are rational.
[01:08:55.320 --> 01:09:06.760] We assume at Skeptic that we can reason people to the right answer using critical thinking, making them aware of all the confirmation bias and hindsight bias, motivated reasoning, and all that stuff.
[01:09:06.760 --> 01:09:11.000] The research on this is pretty mixed, as you know, because you write about that.
[01:09:11.080 --> 01:09:11.400] Right?
[01:09:11.400 --> 01:09:24.920] I mean, you can teach some of it, and if you teach them about other people's, or you teach them about the motivated reasoning, all the biases, they get better at seeing it in other people, but not themselves.
[01:09:25.400 --> 01:09:26.600] So you have a lot of people who are.
[01:09:26.680 --> 01:09:32.280] But you can teach people to behave in an unbiased way in specific contexts.
[01:09:32.280 --> 01:09:32.920] Yes, yes.
[01:09:32.920 --> 01:09:33.240] Right?
[01:09:33.240 --> 01:09:37.560] Like, like... You know what can really slow down your business?
[01:09:37.560 --> 01:09:41.880] Anything from sophisticated phishing attacks to plain old spotty internet.
[01:09:41.880 --> 01:09:51.560] Get business secure internet and protect your network with built-in cybersecurity starting at just $60 a month with no contracts and Optimum Business's 60-day money-back guarantee.
[01:09:51.560 --> 01:09:54.600] Plus, the more services you bundle, the more you save.
[01:09:54.600 --> 01:09:56.920] Don't let your network slow down your business.
[01:09:56.920 --> 01:10:03.000] Get Optimum Business Secure Internet starting at just $60 a month at optimum.com/slash business.
[01:10:03.960 --> 01:10:16.480] Doctors tend to, even though people neglect base rates in general, doctors these days are pretty good at observing base rates when patients come into their office looking for a diagnosis.
[01:10:16.800 --> 01:10:22.640] But that doesn't mean when the doctor goes home that they don't neglect base rates.
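The base-rate point about diagnosis has a standard worked form. Here is a minimal Bayes' rule sketch in Python; the prevalence and test characteristics are assumed for illustration and are not clinical data.

    # Assumed numbers for illustration only (not clinical data).
    prevalence = 0.001       # assumed P(disease), the base rate
    sensitivity = 0.99       # assumed P(positive test | disease)
    false_positive = 0.05    # assumed P(positive test | no disease)

    # Bayes' rule: P(disease | positive test)
    p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)
    p_disease_given_positive = sensitivity * prevalence / p_positive

    print(f"P(disease | positive test) = {p_disease_given_positive:.3f}")  # roughly 0.019

Even with a sensitive test, the positive patient is still far more likely to be healthy than sick, which is what a doctor who respects the base rate sees and a doctor who neglects it misses.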
[01:10:22.640 --> 01:10:38.640] But I mean, so like if we teach you general critical thinking skills, and this is why you should be skeptical of spoon benders or astrology or cult leaders, will the reader then take that into the future and 10 years later they encounter something that I can't even think of.
[01:10:38.640 --> 01:10:41.200] And do they apply those skills or not?
[01:10:41.680 --> 01:10:42.400] No, no.
[01:10:42.400 --> 01:10:43.680] The answer is clear.
[01:10:43.680 --> 01:10:45.120] No, they don't.
[01:10:46.480 --> 01:11:00.400] I don't think there's ever been a study of critical reasoning skills which makes anything other than a tiny difference in a very contextually specific way to the quality of people's decisions.
[01:11:01.360 --> 01:11:02.480] That's the sad truth.
[01:11:02.480 --> 01:11:05.120] It's like I actually find the story.
[01:11:05.360 --> 01:11:12.320] It shocks me that there's still a debate about this because it strikes me that the evidence is very clear.
[01:11:12.320 --> 01:11:24.640] Like, first of all, so what I consider an irrational decision is one in which you have a normative theory which tells you you should be doing X and instead you do Y, right?
[01:11:24.640 --> 01:11:33.200] So there's a divergence between a theory that tells you what you should do and what you actually do, then that's irrational.
[01:11:33.520 --> 01:11:38.160] And first of all, we don't often have normative theories.
[01:11:38.160 --> 01:11:49.760] Like you have given me plenty of examples over the course of the last hour and a half in which there just is no normative theory of like, should you push the big man off the bridge?
[01:11:49.760 --> 01:11:55.040] Well, I, you know, that depends on your theory, on your ethical theory.
[01:11:55.000 --> 01:12:08.040] Um, so so you simply can't attribute irrationality in those cases because you're, you have no basis if you don't know what you should do.
[01:12:08.040 --> 01:12:22.280] But in those cases where you do know what you should do, then it turns out, yeah, people do the right thing a good chunk of the time, but we use all these simplifying strategies and that leads to systematic bias.
[01:12:22.280 --> 01:12:27.240] And it strikes me that that has been demonstrated time and time again.
[01:12:27.240 --> 01:12:45.640] So, you know, the conclusion I draw is some chunk of the time and sometimes when it really matters, like when we're making decisions about war or about tariffs or about immigration or at any big political issue, yeah, we have systematic biases.
[01:12:45.640 --> 01:12:49.400] We are irrational and we really should be taking that into account.
[01:12:49.400 --> 01:12:50.040] Yeah.
[01:12:50.040 --> 01:13:00.760] It's like that stretch of time when the research on Alzheimer's said that, as a preventative measure, you should do things like Sudoku puzzles, chess, and crossword puzzles and so on to keep your brain sharp.
[01:13:00.760 --> 01:13:06.600] And it turns out it doesn't do anything for Alzheimer's, but it does make you a better Sudoku puzzle solver.
[01:13:08.520 --> 01:13:09.000] Really?
[01:13:09.000 --> 01:13:11.000] It doesn't do anything for Alzheimer's.
[01:13:11.640 --> 01:13:12.360] Oh, no.
[01:13:12.360 --> 01:13:12.920] Oh, boy.
[01:13:12.920 --> 01:13:13.480] Yeah, sorry.
[01:13:13.480 --> 01:13:15.160] I'm going to stop playing Sudoku.
[01:13:15.160 --> 01:13:16.440] But you'll get better at Sudoku.
[01:13:16.920 --> 01:13:18.120] That's worth something.
[01:13:18.120 --> 01:13:28.840] But what I'm getting at is a slightly deeper question with like Hugo Mercier's book, Not Born Yesterday, kind of the follow-up to his The Enigma of Reason.
[01:13:29.400 --> 01:13:36.760] That people are not as irrational as people like me think we are because we're always pointing out all the crazy cults people join and so on.
[01:13:36.760 --> 01:13:39.880] Like one of his examples is most people don't join cults.
[01:13:39.880 --> 01:13:44.280] We can reel off the examples of Scientology or Jonestown or Heaven's Gate.
[01:13:44.280 --> 01:13:47.920] There's a handful and you can point to the people and what were they thinking.
[01:13:48.240 --> 01:13:51.280] But, you know, how many millions of people have encountered Scientology?
[01:13:51.280 --> 01:13:58.320] They took the personality test or they see him on the sidewalk on Hollywood Boulevard there and have a good laugh about it and then walk on.
[01:13:58.320 --> 01:14:01.280] They don't take a second mortgage out on their home and give it to the church.
[01:14:01.280 --> 01:14:02.400] Most people don't follow.
[01:14:02.560 --> 01:14:07.120] This is his argument that like political persuasive ads are largely ineffective.
[01:14:07.120 --> 01:14:10.400] Most people, you know, are not persuaded by those things.
[01:14:10.400 --> 01:14:11.040] Yeah.
[01:14:11.360 --> 01:14:15.760] Look, I mean, this is a glass-half-full versus glass-half-empty debate.
[01:14:15.760 --> 01:14:22.400] So, you know, Hugo wants to see the world as with a glass that's half full and good for him.
[01:14:22.400 --> 01:14:24.320] And I hope he's happier for it.
[01:14:24.320 --> 01:14:31.520] But the fact is, you just got to turn on the news and see what the consequences of human irrationality are.
[01:14:31.840 --> 01:14:33.920] And, you know, they're horrific.
[01:14:33.920 --> 01:14:34.480] Yeah.
[01:14:34.880 --> 01:14:42.560] And, you know, when someone you know gets picked up and thrown in jail in El Salvador, you'll feel that way too.
[01:14:42.560 --> 01:14:43.040] Right.
[01:14:43.040 --> 01:14:43.520] Right.
[01:14:43.520 --> 01:14:44.320] Right.
[01:14:44.640 --> 01:14:45.040] All right.
[01:14:45.040 --> 01:14:46.160] Last line of thinking here.
[01:14:46.160 --> 01:14:50.640] I wanted to ask you about free will, determinism, compatibilism in the context of making choices.
[01:14:50.640 --> 01:14:53.920] I'll read my favorite passage from William James.
[01:14:53.920 --> 01:14:58.080] Romeo wants Juliet as the filings want the magnet.
[01:14:58.080 --> 01:15:03.120] And if no obstacles intervene, he moves toward her by as straight a line as they.
[01:15:03.120 --> 01:15:12.720] But Romeo and Juliet, if a wall be built between them, do not remain idiotically pressing their faces against its opposite sides, like the magnet and the filings with the card.
[01:15:12.720 --> 01:15:21.120] Romeo soon finds a circuitous path by scaling the wall or otherwise of touching Juliet's lips directly.
[01:15:21.120 --> 01:15:23.680] With the filings, the path is fixed.
[01:15:23.680 --> 01:15:26.560] Whether it reaches the end depends on accidents.
[01:15:26.560 --> 01:15:29.280] With the lover, it is the end which is fixed.
[01:15:29.280 --> 01:15:32.280] The path may be modified indefinitely.
[01:15:29.920 --> 01:15:38.520] I like that as an argument for free will, or some form of volition or degrees of freedom, compared to the iron filings.
[01:15:39.160 --> 01:15:46.600] So notice we've already discussed this when we discussed psychological versus physical causality.
[01:15:46.600 --> 01:15:50.040] This was exactly the distinction that I was pointing to.
[01:15:51.400 --> 01:15:52.040] Right?
[01:15:52.040 --> 01:15:57.400] So psychological causality depends on this counterfactual, right?
[01:15:57.800 --> 01:16:03.400] Would this have happened if the action had not been performed?
[01:16:03.400 --> 01:16:08.680] Whereas physical causality is just a matter of forces being enacted.
[01:16:09.560 --> 01:16:25.560] So the counterfactual view allows for Romeo to say, I would have kissed Juliet, I would have found a way to kiss her, right?
[01:16:27.240 --> 01:16:27.960] Right.
[01:16:27.960 --> 01:16:28.520] Yeah.
[01:16:28.840 --> 01:16:35.400] Precisely because it doesn't specify the process by which you got to her.
[01:16:36.680 --> 01:16:38.840] So are you a compatibilist?
[01:16:39.800 --> 01:16:40.760] Oh, I don't know.
[01:16:40.760 --> 01:16:41.480] You don't know?
[01:16:41.480 --> 01:16:42.360] You don't have to comment.
[01:16:42.760 --> 01:16:47.000] I mean, it's such a dead issue, but it's a live issue.
[01:16:47.000 --> 01:16:51.080] I can't tell you how many books I get every year on free will and determinism here.
[01:16:52.680 --> 01:16:58.040] That tells me something that it's not a soluble problem in any kind of scientific sense, right?
[01:16:58.040 --> 01:17:05.720] People just stake out their arguments and it's become something of a sacred value, like I have volition, or no, you don't, you're determined.
[01:17:05.720 --> 01:17:06.320] That's right.
[01:17:06.120 --> 01:17:06.800] Yeah.
[01:17:06.680 --> 01:17:06.920] Yeah.
[01:17:06.920 --> 01:17:07.160] Yeah.
[01:17:07.800 --> 01:17:22.480] No, look, I mean, studying Pearl's view of causality is the most informative; it's provided the most information I've managed to find about that issue, right?
[01:17:22.480 --> 01:17:30.400] Because basically what he says is if you want to make good inferences, you have to assume free will.
[01:17:30.400 --> 01:17:30.960] Yeah.
[01:17:30.960 --> 01:17:34.240] Because that's what intervention is, right?
[01:17:34.640 --> 01:17:56.880] So if you understand free will to mean that there's an action that's taking place from outside the system that isn't caused by anything, but determines the value of something inside the system, then that allows inferences that are different than if you're just observing the system directly.
[01:17:57.360 --> 01:17:59.280] You know, I didn't say that very well.
[01:17:59.680 --> 01:18:01.280] No, no, that makes sense.
[01:18:01.520 --> 01:18:03.200] It's some time to really paint the picture.
[01:18:03.600 --> 01:18:06.320] Judea's book is, I think it's called The Book of Why.
[01:18:06.720 --> 01:18:07.520] Was that what it was?
[01:18:07.840 --> 01:18:10.240] Yeah, so that's the later popular book.
[01:18:10.640 --> 01:18:16.160] These ideas came out of a book called Causality, actually, that he published in 2000.
[01:18:16.560 --> 01:18:30.080] But the basic idea is that to make correct probabilistic inferences, you have to assume something like free will, which doesn't mean that free will actually exists.
[01:18:30.400 --> 01:18:34.880] It just means you have to assume it to make correct inferences.
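Pearl's contrast between observing a system and intervening on it can be sketched in a few lines of simulation. The toy model below is an assumption made for illustration, not anything from Pearl's or Sloman's books: an unobserved mood drives both a decision and an outcome, and the decision also affects the outcome. Conditioning on the decision you happen to observe then gives a different answer than setting the decision from outside the system, which is what the do-operator, and the free-will-like assumption, amounts to.

    import random

    # Toy structural causal model (assumed for illustration):
    #   mood -> decision, mood -> outcome, decision -> outcome
    def simulate(n=100_000, intervene=None, seed=0):
        rng = random.Random(seed)
        results = []
        for _ in range(n):
            mood = rng.random() < 0.5                   # unobserved common cause
            if intervene is None:
                decision = mood or (rng.random() < 0.1) # observed: mostly driven by mood
            else:
                decision = intervene                    # do(decision): set from outside the system
            p_good = 0.2 + 0.3 * mood + 0.2 * decision
            results.append((decision, rng.random() < p_good))
        return results

    # Observational: P(good outcome | we see decision = True) mixes in the effect of mood.
    obs = [good for dec, good in simulate() if dec]
    print(f"P(good | see decision=True) = {sum(obs) / len(obs):.2f}")   # roughly 0.67

    # Interventional: P(good outcome | do(decision = True)) isolates the decision's own effect.
    do = [good for _, good in simulate(intervene=True)]
    print(f"P(good | do(decision=True)) = {sum(do) / len(do):.2f}")     # roughly 0.55

The gap between the two numbers is the extra inferential leverage of treating the action as coming from outside the system, which is the sense in which Pearl's framework asks you to assume something like free will.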
[01:18:34.880 --> 01:18:41.280] Well, the point I always make is that I know all the determinists and have read their books, but none of them actually act like it.
[01:18:42.240 --> 01:18:47.200] They don't walk around their lives leading their lives as if they're determined because nobody does.
[01:18:47.520 --> 01:18:48.320] Interesting.
[01:18:48.360 --> 01:18:50.960] Yeah, it's really kind of funny that way.
[01:18:50.960 --> 01:18:51.440] All right.
[01:18:51.440 --> 01:18:57.120] Well, so we're at the end here: The Cost of Conviction, How Our Deepest Values Lead Us Astray.
[01:18:57.120 --> 01:18:57.600] There it is.
[01:18:57.600 --> 01:19:03.000] You make the case slightly in favor of consequentialism over sacred values.
[01:19:03.960 --> 01:19:06.840] You want to put a final exclamation point on that?
[01:19:07.800 --> 01:19:10.680] Yeah, but descriptively, right?
[01:19:10.680 --> 01:19:11.400] Descriptively, yeah.
[01:19:12.680 --> 01:19:16.120] We rely more on sacred values than we should.
[01:19:16.440 --> 01:19:21.160] I'm not making normative claims about what the right way to make a decision is.
[01:19:21.160 --> 01:19:21.640] Right.
[01:19:22.760 --> 01:19:30.200] By the way, since you talk about Kahneman so much, were you surprised to hear the story that he euthanized himself?
[01:19:30.280 --> 01:19:30.920] Went to Switzerland?
[01:19:31.160 --> 01:19:32.120] I was shocked.
[01:19:32.120 --> 01:19:33.160] Unbelievable.
[01:19:33.480 --> 01:19:33.960] Yeah.
[01:19:34.600 --> 01:19:35.160] Yeah.
[01:19:36.120 --> 01:19:36.760] Yeah.
[01:19:36.760 --> 01:19:47.560] I had just, you know, a couple of months beforehand had a long conversation with the woman he had been living with until that point, and she didn't say a word about it.
[01:19:47.560 --> 01:19:50.600] It was clearly a big secret.
[01:19:50.600 --> 01:19:51.240] Yeah.
[01:19:52.200 --> 01:19:58.600] But yeah, it worries me that he did it for its symbolic value.
[01:19:59.960 --> 01:20:00.520] Right.
[01:20:00.520 --> 01:20:04.120] Because he's the great decision maker in America.
[01:20:04.120 --> 01:20:07.800] And he's sort of teaching people how to make good decisions.
[01:20:08.120 --> 01:20:10.040] But hopefully that's not why.
[01:20:10.360 --> 01:20:16.920] And I, you know, I don't know enough of the details of his state of health to know what he had coming.
[01:20:16.920 --> 01:20:20.120] Well, yeah, apparently he wasn't terminal or anything like that.
[01:20:20.120 --> 01:20:22.200] But, you know, I'm 70.
[01:20:22.200 --> 01:20:27.160] If I was 91 and I could tell things were starting to go, and I'd had a long, good life.
[01:20:27.160 --> 01:20:34.600] It's like, well, you know, what's the point of dragging it out for another 10 years, in which I can't even reason at all?
[01:20:34.600 --> 01:20:35.560] I can't even think.
[01:20:35.960 --> 01:20:37.640] Something like that.
[01:20:37.640 --> 01:20:41.160] That could be a calculation, you know, rational calculation.
[01:20:41.160 --> 01:20:42.040] Absolutely.
[01:20:42.040 --> 01:20:44.800] But I just don't know because I'm not in that position.
[01:20:43.320 --> 01:20:47.200] I don't want to be in that position.
[01:20:44.280 --> 01:20:49.280] All right, Steven, great book.
[01:20:49.440 --> 01:20:50.480] What are you working on next?
[01:20:50.480 --> 01:20:52.800] What are your research programs these days?
[01:20:53.120 --> 01:20:54.640] Hey, that book hasn't even come out.
[01:20:54.800 --> 01:20:55.360] Oh, that's true.
[01:20:55.360 --> 01:20:55.760] That's right.
[01:20:55.920 --> 01:20:57.040] I shouldn't ask you that question.
[01:20:57.040 --> 01:20:58.320] I shouldn't ask you that question.
[01:20:58.320 --> 01:20:58.720] Okay.
[01:20:58.720 --> 01:20:59.520] You're not doing anything.
[01:20:59.520 --> 01:21:00.080] Oh, that's right.
[01:21:00.080 --> 01:21:00.880] May 20th.
[01:21:00.880 --> 01:21:06.400] Well, we will release this on the pub date of the book, and then you can worry about the future.
[01:21:06.720 --> 01:21:07.600] Thanks so much.
[01:21:07.600 --> 01:21:07.920] All right.
[01:21:08.080 --> 01:21:09.520] Been a real pleasure talking to you.
[01:21:09.520 --> 01:21:10.560] Likewise.
[01:21:17.280 --> 01:21:21.120] I actually chose FDU because of the small classroom sizes.
[01:21:21.120 --> 01:21:22.880] That's something that was extremely important to me.
[01:21:22.880 --> 01:21:25.600] I wanted to have a connection and a relationship with my professors.
[01:21:25.600 --> 01:21:27.760] I didn't want to just be a number in a classroom.
[01:21:27.760 --> 01:21:31.520] I seized the moment at FDU and found my purpose.
[01:21:32.480 --> 01:21:37.920] Groons just launched a limited edition Grooney Smith apple flavor, and it's only available through October.
[01:21:37.920 --> 01:21:42.640] Same full body benefits you love, but now it tastes like sweet tart green apple candy.
[01:21:42.640 --> 01:21:46.720] Like walking through an orchard in a cable-knit sweater, warm cider in hand.
[01:21:46.720 --> 01:21:52.080] Each packable pouch delivers six grams of prebiotic fiber and a powerful daily dose of vitamins and minerals.
[01:21:52.080 --> 01:21:54.640] Grab your limited edition Grooney Smith Apple Groons.
[01:21:54.640 --> 01:21:56.560] Stock up because they will sell out.
[01:21:56.560 --> 01:21:58.560] Use code apple at groons.co.
[01:21:58.560 --> 01:22:02.320] That's g-r-u-n-s.co to get up to 52% off.
Prompt 6: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 7: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
  "segments": [
    {
      "segment_title": "Topic Discussion",
      "timestamp": "01:15:30",
      "key_takeaway": "main point from this segment",
      "segment_summary": "brief description of what was discussed"
    }
  ]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
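A minimal sketch of the timestamp sanity check described above, assuming Python; the 01:22:02 duration is taken from the final caption of the transcript, and the helper names and the sample segment (copied from the format template) are illustrative only:

def to_seconds(hhmmss):
    # Convert an "HH:MM:SS" string to a number of seconds.
    h, m, s = (int(x) for x in hhmmss.split(":"))
    return h * 3600 + m * 60 + s

AUDIO_LENGTH = to_seconds("01:22:02")  # end of the last caption in this transcript

def check_segment(segment):
    # A segment's START timestamp must never exceed the total audio length.
    if to_seconds(segment["timestamp"]) > AUDIO_LENGTH:
        raise ValueError("timestamp " + segment["timestamp"] + " exceeds the audio length")

check_segment({"segment_title": "Topic Discussion", "timestamp": "01:15:30",
               "key_takeaway": "main point from this segment",
               "segment_summary": "brief description of what was discussed"})
print("01:15:30 is within the audio length")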
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
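For the media-mention timestamps, the rule above says to keep the earlier end of a caption range such as 01:13:42.520 --> 01:13:46.720; a minimal sketch of that step, assuming Python (the function name is illustrative):

def mention_timestamp(caption_range):
    # Keep the earlier of the two timestamps and truncate it to HH:MM:SS.
    start, _sep, _end = caption_range.partition(" --> ")
    return start.split(".")[0]

print(mention_timestamp("01:13:42.520 --> 01:13:46.720"))  # prints 01:13:42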
Full Transcript
[00:00:00.960 --> 00:00:03.520] You know what can really slow down your business?
[00:00:03.520 --> 00:00:07.920] Anything from sophisticated phishing attacks to plain old spotty internet.
[00:00:07.920 --> 00:00:17.600] Get Business Secure Internet and protect your network with built-in cybersecurity starting at just $60 a month with no contracts and Optimum Business's 60-day money-back guarantee.
[00:00:17.600 --> 00:00:20.640] Plus, the more services you bundle, the more you save.
[00:00:20.640 --> 00:00:22.880] Don't let your network slow down your business.
[00:00:22.880 --> 00:00:29.280] Get Optimum Business Secure Internet starting at just $60 a month at optimum.com/slash business.
[00:00:30.240 --> 00:00:32.320] Every business has an ambition.
[00:00:32.320 --> 00:00:41.760] PayPal Open is the platform designed to help you grow into yours with business loans so you can expand and access to hundreds of millions of PayPal customers worldwide.
[00:00:41.760 --> 00:00:50.160] And your customers can pay all the ways they want with PayPal, Venmo, Pay Later, and all major cards so you can focus on scaling up.
[00:00:50.160 --> 00:00:55.040] When it's time to get growing, there's one platform for all business: PayPal Open.
[00:00:55.040 --> 00:00:57.600] Grow today at paypalopen.com.
[00:00:57.600 --> 00:01:00.640] Loan subject to approval in available locations.
[00:01:03.840 --> 00:01:09.840] You're listening to The Michael Shermer Show.
[00:01:15.600 --> 00:01:17.120] All right, here we go.
[00:01:17.120 --> 00:01:17.760] Hi, everybody.
[00:01:17.760 --> 00:01:22.800] It's Michael Shermer, and it's time for another episode of, you know what, the Michael Shermer Show.
[00:01:22.800 --> 00:01:23.680] Here we go.
[00:01:23.680 --> 00:01:31.520] I'm going to open or introduce today's episode by reading you a thought experiment, a vignette, as it were.
[00:01:31.520 --> 00:01:34.080] Here is vignette number one.
[00:01:34.080 --> 00:01:36.720] You have $50 available.
[00:01:36.720 --> 00:01:45.920] A neighbor who just lost their job comes by and says they need $50 to buy a meal plan for their child, otherwise their child will not have lunch at school.
[00:01:45.920 --> 00:01:47.600] Can you give them the money?
[00:01:47.600 --> 00:01:49.360] You ask two friends for their advice.
[00:01:49.360 --> 00:01:55.760] Frankie thinks about it for quite a while, then advises you to go ahead and give the money to your neighbor.
[00:01:55.760 --> 00:02:00.000] Charlie hardly thinks at all, advising you to immediately give the neighbor the money.
[00:02:00.680 --> 00:02:05.480] On the basis of their advice, who would you say is more morally praiseworthy?
[00:02:05.480 --> 00:02:11.560] Frankie, the first one, who thought about it for quite a while, or Charlie, who just made a snap decision?
[00:02:11.560 --> 00:02:13.480] Okay, here's vignette number two.
[00:02:13.480 --> 00:02:15.320] Very similar, but not quite.
[00:02:16.120 --> 00:02:18.040] You have only $50 available.
[00:02:18.040 --> 00:02:25.240] A neighbor who just lost their job comes by and says they need $50 to buy a meal plan for their child, otherwise, the child will not have lunch at school.
[00:02:25.240 --> 00:02:26.840] Same scenario as the first one.
[00:02:26.840 --> 00:02:27.880] Can you give them the money?
[00:02:27.880 --> 00:02:29.240] But there's a problem.
[00:02:29.240 --> 00:02:37.320] You just got a text from an old schoolmate asking to borrow $50 to buy the medication they need to cure a serious illness.
[00:02:37.320 --> 00:02:39.400] You ask two friends for advice.
[00:02:39.400 --> 00:02:44.200] Frankie thinks about it for quite a while and then advises you to give your neighbor the money.
[00:02:44.200 --> 00:02:48.120] Charlie hardly thinks at all, advising you to immediately give the neighbor the money.
[00:02:48.120 --> 00:02:51.560] On the basis of their advice, who would you say is more morally praiseworthy?
[00:02:51.560 --> 00:02:54.440] Frankie, the first one, or Charlie?
[00:02:54.760 --> 00:03:01.720] Well, my guest today writes about this in his new book, The Cost of Conviction, which I'll give a proper introduction in just a moment.
[00:03:01.880 --> 00:03:09.320] And he says, most people think Charlie is more praiseworthy in vignette one, but Frankie is more praiseworthy in vignette two.
[00:03:09.320 --> 00:03:15.000] To find out why, we will discuss with him after I give him a proper introduction here.
[00:03:15.000 --> 00:03:18.920] He is Steve Sloman, who has taught at Brown University since 1992.
[00:03:18.920 --> 00:03:31.400] He's a fellow of the Cognitive Science Society, the Society of Experimental Psychologists, the American Psychological Society, and the Eastern Psychological Association, and the Psychonomic Society.
[00:03:31.720 --> 00:03:39.240] He's the author of Causal Models and the co-author of The Knowledge Illusion with Phil Fernbach.
[00:03:39.240 --> 00:03:49.440] He's been editor-in-chief of the journal Cognition, chair of the Brown University faculty, and the creator of Brown University's concentration in behavioral decision sciences.
[00:03:49.760 --> 00:03:55.200] Here's the new book, The Cost of Conviction: How Our Deepest Values Lead Us Astray.
[00:03:55.200 --> 00:03:56.400] All right, Steve, nice to see you.
[00:03:56.400 --> 00:03:57.040] How are you doing?
[00:03:57.040 --> 00:04:02.480] And explain to us what in the world are we supposed to interpret from that thought experiment?
[00:04:02.800 --> 00:04:03.520] Well, great.
[00:04:03.520 --> 00:04:05.040] Thanks so much for having me.
[00:04:05.040 --> 00:04:06.960] And it's great to meet you.
[00:04:06.960 --> 00:04:10.720] And I got to say that this is the first time I've ever seen the book.
[00:04:10.720 --> 00:04:11.440] Oh, really?
[00:04:11.440 --> 00:04:11.840] Yeah.
[00:04:12.000 --> 00:04:14.320] Well, I got it, I don't know, maybe a week ago or so.
[00:04:14.320 --> 00:04:15.120] And yeah, it's great.
[00:04:15.120 --> 00:04:15.440] It's fun.
[00:04:15.440 --> 00:04:15.920] I read the whole book.
[00:04:16.000 --> 00:04:18.160] Well, they sent it to you before they sent it to me.
[00:04:18.160 --> 00:04:19.760] Okay, I read it this time.
[00:04:19.760 --> 00:04:24.400] I didn't get the audio file, so I actually went old school and read it cover to cover.
[00:04:24.400 --> 00:04:25.600] Well, I want you to know that.
[00:04:25.600 --> 00:04:27.760] Even though I haven't seen the book, I have read it.
[00:04:27.760 --> 00:04:29.280] Okay, I hope so.
[00:04:31.680 --> 00:04:36.800] So I would like to say that I did not make up the examples you just read.
[00:04:36.800 --> 00:04:45.040] I mean, I made up the specific examples, but the idea comes from work by Phil Tetlock and his colleagues at the University of Pennsylvania.
[00:04:45.040 --> 00:04:48.320] So I want to give credit where credit is due.
[00:04:48.320 --> 00:04:48.960] Indeed.
[00:04:50.560 --> 00:04:52.880] So, well, what did you think?
[00:04:52.880 --> 00:04:55.840] Did you have an opinion about who?
[00:04:55.920 --> 00:05:02.080] Well, I thought the interpretation you made there was, you know, is the way I thought about it as well.
[00:05:02.400 --> 00:05:07.120] That is to say, a moral person should have no trouble choosing the sacred value in the first one.
[00:05:07.120 --> 00:05:10.080] Obviously, you should give the money to the neighbor.
[00:05:10.080 --> 00:05:15.600] But in the second vignette, you really should give that some thought because you have conflicting interests there.
[00:05:15.600 --> 00:05:20.320] $50 for the drug to help the friend, or $50 to the neighbor's kid.
[00:05:20.720 --> 00:05:21.920] What's the right one?
[00:05:21.920 --> 00:05:23.040] How do you calculate that?
[00:05:23.040 --> 00:05:25.360] That should not be a snap decision.
[00:05:25.360 --> 00:05:25.920] Right.
[00:05:26.240 --> 00:05:31.000] So the first one is an example of a taboo trade-off, right?
[00:05:31.000 --> 00:05:43.480] It's a trade-off you're not supposed to make because when you have a sacred value, like help friends in need, certainly if they're children, then you're supposed to follow that all the time.
[00:05:43.480 --> 00:05:56.040] And, you know, deciding whether or not to follow it versus, you know, having a good time, spending 50 bucks is supposed to be an easy choice because there's this absolute rule to follow.
[00:05:56.040 --> 00:06:03.800] Whereas in the second case, there's a competition between two different sacred values, right?
[00:06:03.800 --> 00:06:08.680] The helping a child in need versus helping a friend in need.
[00:06:09.560 --> 00:06:17.000] And that requires some thought, that requires some deliberation in order to negotiate that.
[00:06:17.000 --> 00:06:29.400] So some trade-offs, if you believe in sacred values, some trade-offs are permitted, namely trade-offs between sacred values, but some trade-offs are not.
[00:06:29.720 --> 00:06:35.560] Trade-offs between sacred values and material goods, essentially.
[00:06:35.560 --> 00:06:49.160] So I love the example, and I'm very happy that you chose that to read because it's very direct evidence, I think, that there's something psychologically real about sacred values, right?
[00:06:49.160 --> 00:06:50.920] That's really the point.
[00:06:50.920 --> 00:06:59.640] So the theme of the book is that when we make decisions, there are two strategies we use.
[00:06:59.960 --> 00:07:06.360] One is the one we think we use almost all the time, namely a consequentialist strategy, right?
[00:07:06.360 --> 00:07:12.680] We think about the outcomes of the various options and choose the one that leads to the best outcomes.
[00:07:12.680 --> 00:07:24.880] But in fact, for purposes of simplification, what we often do is make decisions based on the actions and our values about those actions.
[00:07:24.880 --> 00:07:27.840] And that's what I'm referring to by sacred values.
[00:07:27.840 --> 00:07:28.320] Yeah.
[00:07:28.320 --> 00:07:34.960] Can you extrapolate from that to ethical theories, utilitarianism versus deontological ethics?
[00:07:34.960 --> 00:07:35.600] Yeah.
[00:07:35.920 --> 00:07:54.640] So, look, the distinction certainly is a child of, is inherited from, the distinction between deontology and utilitarian ethics, which is like the prototypical example of a consequentialist ethical theory.
[00:07:56.240 --> 00:08:08.720] But the book is psychology, and it seems to me that people don't use the full sophistication of those kinds of ethical theories.
[00:08:09.040 --> 00:08:16.800] We're doing something simpler, certainly something different, which is why I don't use that language.
[00:08:17.440 --> 00:08:20.880] Do you think it's worth spending some time explaining what we mean by this?
[00:08:21.520 --> 00:08:24.720] Only because, yeah, there are clumsy words.
[00:08:25.520 --> 00:08:30.560] And I was just, I was trying to explain your book to one of my cycling buddies on the ride this morning.
[00:08:30.960 --> 00:08:35.520] Utilitarianism and deontological, I can see his eyes glazing over, like, oh, okay.
[00:08:37.120 --> 00:08:38.320] So your words, not mine.
[00:08:38.560 --> 00:08:40.160] I know, I know, I know, I know.
[00:08:40.480 --> 00:08:58.400] But, you know, if I just summarize the book, you know, given current events, you know, we have so much political divisiveness because both sides, the left and the right, are making decisions as if it's just pure sacred values on the line and the other side just doesn't see that, you know, our sacred values are better than theirs.
[00:08:58.400 --> 00:09:01.160] Whereas, in fact, most politics is just consequentialism.
[00:09:01.320 --> 00:09:09.720] I mean, we're just making compromises and trying to do the calculations about what's going to be the best for the most citizens of the country or whatever.
[00:09:09.720 --> 00:09:20.120] And that, you know, it used to be that this kind of consequentialism is how politicians, you know, wrangled with these issues and made some compromise, but they don't do that as much anymore.
[00:09:20.120 --> 00:09:22.120] That seemed to make more sense.
[00:09:22.120 --> 00:09:22.680] Yeah.
[00:09:23.000 --> 00:09:35.720] No, well, that is certainly the sub-theme of the book: that we use sacred values more than we should, and that that leads to many of the world's problems, like polarization and war.
[00:09:35.720 --> 00:09:37.240] Absolutely.
[00:09:38.520 --> 00:09:47.640] So, one of the distinctions that I appeal to in the book is also a very common distinction between normative theory and descriptive theory, right?
[00:09:47.640 --> 00:09:52.760] So, theories about what we should do versus theories about what we do do.
[00:09:52.760 --> 00:10:00.120] And you described consequentialism in descriptive terms, that that is what we used to do.
[00:10:00.120 --> 00:10:02.280] And I'm not sure that that's right.
[00:10:02.600 --> 00:10:14.920] You know, so at a normative level, I completely agree that the world would be better off if we focused more on consequentialist theory.
[00:10:15.160 --> 00:10:24.360] I don't think we should only focus on consequentialist theory because, you know, we only respect people who have some sacred values.
[00:10:24.680 --> 00:10:29.640] I wouldn't have someone as a friend if they didn't have some sacred values.
[00:10:31.000 --> 00:10:35.960] But descriptively speaking, I'm actually not so sure.
[00:10:36.760 --> 00:10:45.000] We're clearly at an extreme point today in thinking in terms of, and making decisions by, sacred values rather than consequences.
[00:10:45.760 --> 00:10:58.080] But there have been other extreme points through history, and I actually think that we've been biased, unfortunately, in favor of sacred values for a long time, if not forever.
[00:10:58.720 --> 00:11:00.320] Yeah, I think that's probably right.
[00:11:00.560 --> 00:11:12.880] I guess what I was getting at is, I mean, if you have an issue like what's the best upper tax bracket percentage, it used to be 90%, then 70%, 37% now federal in the United States.
[00:11:12.880 --> 00:11:13.760] What's the right number?
[00:11:13.760 --> 00:11:15.040] Well, there's no right number.
[00:11:15.040 --> 00:11:20.960] You know, we just argue and fight about it and vote, and whoever wins the election gets to try to nudge it one way or the other.
[00:11:21.280 --> 00:11:23.680] Or immigration, you know, how many people should we let in?
[00:11:23.680 --> 00:11:24.880] Well, what's the goals of the country?
[00:11:24.880 --> 00:11:25.440] And so on.
[00:11:25.440 --> 00:11:30.080] And you end up wrangling over these things, which often does come down to, well, what is it we're trying to do?
[00:11:30.080 --> 00:11:31.200] You know, fill jobs.
[00:11:31.200 --> 00:11:33.120] Do we need the tech sector filled?
[00:11:33.120 --> 00:11:37.600] Do we need the, you know, the farming community in California?
[00:11:37.600 --> 00:11:39.920] We need more of those people.
[00:11:40.160 --> 00:11:40.960] What is it we're doing?
[00:11:40.960 --> 00:11:41.680] And so on.
[00:11:42.560 --> 00:11:44.960] Those are mostly consequential decisions.
[00:11:44.960 --> 00:11:53.040] But then when you hear it on Fox News or maybe MSNBC on the other side, it's just, you know, these are evil people or these are good people.
[00:11:53.040 --> 00:11:56.640] And, you know, then the consequentialism kind of goes out the door.
[00:11:56.640 --> 00:11:57.120] Yeah.
[00:11:57.120 --> 00:11:58.960] No, that's exactly right.
[00:11:58.960 --> 00:12:05.040] I mean, what we should be talking about are these nitty, gritty, very difficult details, right?
[00:12:05.040 --> 00:12:10.000] Like, as you said, in most of these domains, we don't have a normative theory.
[00:12:10.000 --> 00:12:12.480] We have to sort of make it up as we go along.
[00:12:12.480 --> 00:12:19.680] So we have to argue about it and we have different interests and, you know, different intuitions.
[00:12:20.000 --> 00:12:32.440] But instead of dealing with that, what we often do, and certainly what our current administration only does, is appeal to sacred values and not really think about consequences at all.
[00:12:29.840 --> 00:12:32.760] Yeah.
[00:12:33.560 --> 00:12:39.160] You discussed the trolley problem, which is a longtime favorite of thought experiments here, and I've talked about it a lot.
[00:12:39.160 --> 00:12:42.920] Let me queue it up for you and then give you my thoughts and then you can run with it from there.
[00:12:42.920 --> 00:12:46.760] So, you know, the trolley's hurtling down the track, about to kill the five workers.
[00:12:46.760 --> 00:12:47.960] You're at the switch.
[00:12:48.120 --> 00:12:51.160] And if you throw the switch, it goes off the other track and kills one worker.
[00:12:51.160 --> 00:12:52.360] Do you throw the switch?
[00:12:52.360 --> 00:12:55.720] You know, millions of people that have taken this online, you know, go, yeah.
[00:12:55.720 --> 00:12:58.360] Most people go, oh, yeah, of course, that's the right thing to do.
[00:12:58.600 --> 00:13:09.400] Then the footbridge alternative to it is you're standing there on a bridge over the track, and you yourself are too small to stop the trolley by jumping down in front of it.
[00:13:09.640 --> 00:13:19.320] But the person standing next to you, this is often called the fat man, but I've since learned that this is now politically incorrect and it's just a big, husky guy with a backpack.
[00:13:20.280 --> 00:13:24.840] I call it the big man problem now in my effort to be politically appropriate.
[00:13:24.840 --> 00:13:25.480] You call it the what?
[00:13:25.480 --> 00:13:26.520] The pig man?
[00:13:26.520 --> 00:13:27.000] Big man.
[00:13:27.160 --> 00:13:28.200] Oh, big man, the big man.
[00:13:28.200 --> 00:13:28.520] Yes.
[00:13:28.520 --> 00:13:29.080] Okay.
[00:13:29.080 --> 00:13:30.360] He's not necessarily fat.
[00:13:30.360 --> 00:13:31.800] What's important is that he weighs a lot.
[00:13:31.960 --> 00:13:32.760] Yeah, he weighs a little bit.
[00:13:32.760 --> 00:13:33.080] That's right.
[00:13:33.080 --> 00:13:33.640] Yeah, okay.
[00:13:33.640 --> 00:13:34.760] So the big man.
[00:13:35.000 --> 00:13:39.160] You know, do you hip check him off the bridge and he stops the trolley and you save the five workers?
[00:13:39.160 --> 00:13:42.520] And, you know, most people go, oh, no, that just doesn't feel right.
[00:13:43.160 --> 00:13:46.120] You know, and so why is there that difference?
[00:13:46.120 --> 00:13:56.440] Well, you know, Joshua Greene's brain scans on this, you know, one activates the prefrontal cortex, I think it was, the rational calculation one, and the other one, you know, the sacred value.
[00:13:57.400 --> 00:14:04.200] Different parts of the brain, more, I guess, the amygdala and the limbic system dealing with deep emotions, something like that.
[00:14:04.520 --> 00:14:11.400] But in fact, normatively speaking, why wouldn't you want those choices made?
[00:14:11.400 --> 00:14:17.360] So the alternative is, you know, the doctor has five dying patients, and then there's a perfectly healthy person in the waiting room.
[00:14:14.600 --> 00:14:21.680] You could sacrifice that person, take the five organs, and then save the five patients.
[00:14:21.840 --> 00:14:25.440] And almost everybody goes, oh, no, that would, no, no, no, that would be wrong.
[00:14:25.440 --> 00:14:29.280] But in fact, what's the reason why it's wrong?
[00:14:29.280 --> 00:14:33.760] But my answer is, well, you don't want to live in a world in which you could be walking around at any given moment.
[00:14:33.760 --> 00:14:36.800] Somebody could just nab you and sacrifice you.
[00:14:36.800 --> 00:14:38.960] And so, but we don't do the calculation.
[00:14:38.960 --> 00:14:48.640] We just say in the Constitution, it says all people have the right to life, liberty, and pursuit of happiness and bodily autonomy and the right to your body.
[00:14:48.640 --> 00:14:50.880] So we can't enslave you and so on.
[00:14:50.880 --> 00:14:52.240] That's a sacred value.
[00:14:52.240 --> 00:15:00.400] But in fact, it's a kind of utilitarian calculus or consequential calculus in the sense that it would not be a good world.
[00:15:00.400 --> 00:15:05.920] It would be a society broken down in distrust if that was available.
[00:15:05.920 --> 00:15:06.400] Yeah.
[00:15:06.720 --> 00:15:10.640] Well, look, you could make that argument about all sacred values, right?
[00:15:10.640 --> 00:15:14.880] They might all have consequentialist underpinnings.
[00:15:15.200 --> 00:15:19.120] They may have evolved for good consequentialist reasons.
[00:15:19.120 --> 00:15:27.760] And there are some very nice papers that make that argument about various things that we've attributed to sacred values.
[00:15:28.000 --> 00:15:31.120] And I don't disagree with those analyses.
[00:15:31.760 --> 00:15:33.360] Two points about that.
[00:15:33.360 --> 00:15:36.400] One is those analyses are.
[00:15:36.400 --> 00:15:39.760] So my book is about how we make decisions, right?
[00:15:39.760 --> 00:15:41.360] So here we are, a human being.
[00:15:41.360 --> 00:15:42.880] We're faced with the decision.
[00:15:42.880 --> 00:15:45.040] What strategy should we use?
[00:15:45.360 --> 00:15:50.800] And I claim, like many people, that there are two strategies available to us.
[00:15:50.800 --> 00:15:56.560] And by the way, people confuse them all the time, but nevertheless, there are two strategies available to us.
[00:15:56.560 --> 00:16:02.840] We can either base it on an analysis of action, which is what sacred values are, right?
[00:16:02.840 --> 00:16:04.360] They're in the space of actions.
[00:16:04.360 --> 00:16:07.560] Is this action appropriate or is it not appropriate?
[00:16:07.560 --> 00:16:10.440] Or we can think about the space of outcomes.
[00:16:10.440 --> 00:16:13.800] What will be the consequence of doing this versus doing that?
[00:16:14.120 --> 00:16:24.200] And even if it's the case that there's a consequentialist precursor to my sacred value, that doesn't mean I'm not deciding by sacred values, right?
[00:16:24.520 --> 00:16:27.080] I mean, so that's point one.
[00:16:27.080 --> 00:16:38.040] Point two is you can actually turn that argument on its head and point out that every consequentialist analysis is based on a sacred value, right?
[00:16:38.040 --> 00:16:48.360] Namely, the sacred value that we should maximize utility, say, or that we should do whatever we're doing to maximize the outcomes that we're appealing to.
[00:16:48.360 --> 00:17:04.120] So there is a cycle of reductions you can do, but I actually think that that's sort of independent of the question: how do people make decisions and how should people make decisions?
[00:17:04.440 --> 00:17:05.080] Yeah.
[00:17:05.640 --> 00:17:10.440] Another thought experiment: Jonathan Haidt's famous one. Mark and Julie are on vacation, brother and sister.
[00:17:10.440 --> 00:17:13.960] They decide it would be fun and interesting to have sex.
[00:17:14.280 --> 00:17:15.560] And so they do.
[00:17:15.560 --> 00:17:17.240] They use two forms of birth control.
[00:17:17.240 --> 00:17:18.360] They don't tell anybody about it.
[00:17:18.360 --> 00:17:19.480] It makes them closer.
[00:17:19.480 --> 00:17:20.120] They enjoyed it.
[00:17:20.120 --> 00:17:21.560] They decide never to do it again.
[00:17:21.560 --> 00:17:22.680] Was that wrong?
[00:17:22.680 --> 00:17:25.320] As you know, almost everybody goes, yeah, no.
[00:17:25.640 --> 00:17:26.680] But why is it wrong?
[00:17:26.680 --> 00:17:27.720] Well, she might get pregnant.
[00:17:27.720 --> 00:17:29.560] No, no, they use two forms of birth control.
[00:17:29.560 --> 00:17:32.840] Well, but people will find out and they'll disown them.
[00:17:32.840 --> 00:17:34.600] No, no, they kept it a secret.
[00:17:34.600 --> 00:17:36.520] It'll come between them and destroy the relationship.
[00:17:36.520 --> 00:17:37.640] No, it brought them closer.
[00:17:37.640 --> 00:17:38.280] I don't know.
[00:17:38.760 --> 00:17:40.360] It's just gross.
[00:17:40.680 --> 00:17:42.440] Is that an example?
[00:17:42.760 --> 00:17:43.480] Right.
[00:17:43.800 --> 00:17:44.440] Yeah.
[00:17:44.440 --> 00:17:50.000] So Haidt calls that an example of moral dumbfounding, right?
[00:17:50.000 --> 00:18:02.560] And his point is that we make decisions, at least decisions of that type, without having a reason, that the reason comes after the decision rather than before the decision.
[00:18:02.560 --> 00:18:12.640] And I actually think there's a fair amount of converging evidence that for single decisions, that that's often the case.
[00:18:12.640 --> 00:18:15.280] You know, sometimes we deliberate carefully.
[00:18:15.680 --> 00:18:21.280] If we're buying a house, then we tend to deliberate before we make the decision.
[00:18:21.280 --> 00:18:31.360] But certainly in scenario type examples, and lots of decisions, lots of consumer decisions, for instance, that we make.
[00:18:31.840 --> 00:18:47.280] We make the decision and then we justify it either because our mother needs to know why or our spouse needs to know why our kid needs to know why, or we ourselves, right, want a reason to justify the decision to ourselves.
[00:18:47.920 --> 00:18:54.640] And so I think Haidt's right that often the decision comes before the justification.
[00:18:54.640 --> 00:18:57.600] We make it on some other basis, right?
[00:18:57.600 --> 00:19:05.760] Like a sacred value that we might not have conscious access to or some emotion, some affect.
[00:19:05.760 --> 00:19:09.040] Our body basically tells us what to do.
[00:19:10.000 --> 00:19:19.040] I will say that you can run into some of the same conceptual problems that you brought up for the trolley problem, since you seem to like conceptual problems.
[00:19:19.680 --> 00:19:25.600] That is, you might ask, well, why do I make the decision?
[00:19:25.600 --> 00:19:30.000] You know, why do I follow the incest taboo in the example that you just gave?
[00:19:31.320 --> 00:19:42.360] And presumably, there was some reasoning that occurred beforehand that led me to be reactive to the incest taboo.
[00:19:43.240 --> 00:19:52.200] Or, you know, it's something I discussed or something that I evolved with by virtue of bad consequences that happened in the past.
[00:19:52.520 --> 00:19:56.040] And my response would be exactly the same if you say that.
[00:19:56.040 --> 00:20:05.160] That, yeah, that's true, that there is reasoning that's part of this cycle by which society makes decisions over time.
[00:20:05.160 --> 00:20:17.000] But nevertheless, when we as individuals or as small groups are faced with particular decisions, I think the moral dumbfounding phenomenon holds.
[00:20:17.320 --> 00:20:19.160] And sorry, go ahead.
[00:20:19.160 --> 00:20:22.440] Well, I was just going to say I had Kurt Gray on the podcast a few months ago.
[00:20:22.440 --> 00:20:24.440] His new book is Outrage.
[00:20:24.440 --> 00:20:31.320] And he's critical of moral foundations theory and Haidt and those thought experiments.
[00:20:31.640 --> 00:20:33.480] So I don't know if you've read it, but give me your thoughts.
[00:20:33.720 --> 00:20:34.440] No, I haven't actually.
[00:20:34.840 --> 00:20:35.160] I should.
[00:20:35.160 --> 00:20:36.040] I'm going to write it down.
[00:20:36.040 --> 00:20:36.920] Well, it's interesting.
[00:20:36.920 --> 00:20:41.400] No, because he kind of goes along the lines of what I was just saying.
[00:20:41.400 --> 00:20:45.560] At the bottom of sacred values is a kind of consequentialism.
[00:20:45.560 --> 00:20:48.600] That is, there's a reason why we have the incest taboo.
[00:20:48.600 --> 00:20:57.400] Because in our ancestry, in the environment of our evolutionary ancestry, the EEA, as the evolutionary psychologists call it, it would have been harmful.
[00:20:57.560 --> 00:20:59.000] It's adaptation, I think.
[00:20:59.000 --> 00:21:00.200] I think A stands for adaptation.
[00:21:00.520 --> 00:21:00.840] That's right.
[00:21:00.840 --> 00:21:01.560] Yes, yes, correct.
[00:21:01.560 --> 00:21:01.800] Correct.
[00:21:01.800 --> 00:21:02.280] Yes, yes.
[00:21:02.600 --> 00:21:07.240] It would have been consequentially negative for siblings to have sex.
[00:21:07.560 --> 00:21:11.640] And of course, no one's doing genetic testing 100,000 years ago.
[00:21:11.640 --> 00:21:20.800] So the proxy is anyone you grew up with and spent a lot of time with; as adults, you're going to be disgusted by the thought of having intimacy with them.
[00:21:21.040 --> 00:21:23.600] So it is at heart, at the bottom.
[00:21:23.600 --> 00:21:26.320] Maybe they're just talking about different levels of causality.
[00:21:26.320 --> 00:21:26.800] I don't know.
[00:21:27.120 --> 00:21:28.240] What are your thoughts on that?
[00:21:28.480 --> 00:21:32.720] Yeah, look, it's the same story as I've already told.
[00:21:32.960 --> 00:21:36.240] I think that, you know, I don't disagree with him.
[00:21:36.480 --> 00:21:53.040] I think that there probably is that kind of consequentialist dynamic that led to the evolution of these values that Haidt calls moral foundations and that I'm calling sacred values.
[00:21:53.920 --> 00:21:56.320] But that's not really the issue.
[00:21:56.320 --> 00:22:01.600] The issue is how do we make decisions at a particular point in time?
[00:22:03.360 --> 00:22:14.000] Look, you know, you can trace, it's sort of a turtles all the way down kind of issue, right?
[00:22:14.000 --> 00:22:19.680] That is, there's some reason that we do the thing that we're doing.
[00:22:19.680 --> 00:22:24.480] And it may be that there's a different reason that we have that habit.
[00:22:24.480 --> 00:22:28.160] My book is about decision-making in general.
[00:22:28.160 --> 00:22:35.920] And the through line in the book is that the world is incredibly complex, right?
[00:22:35.920 --> 00:22:41.120] This actually comes directly from my previous book with Phil Fernbach, The Knowledge Illusion.
[00:22:41.120 --> 00:22:43.840] The world is incredibly complex.
[00:22:43.840 --> 00:22:47.760] And so we have to simplify it in order to make decisions.
[00:22:47.760 --> 00:22:50.240] And we simplify it in many ways.
[00:22:50.240 --> 00:22:54.000] So, you know, we use a variety of heuristics.
[00:22:54.000 --> 00:22:57.600] Kahneman and Tversky made those heuristics very famous, right?
[00:22:58.480 --> 00:23:08.920] They talked about heuristics for probability judgment, and they talked about frames that we use to understand decisions that we're making.
[00:23:09.240 --> 00:23:19.480] Other people have talked about how we take the complexity of problems and simplify them either by ignoring things or by outsourcing things, or right?
[00:23:19.480 --> 00:23:24.920] Like making a decision is a process of trying to simplify.
[00:23:25.240 --> 00:23:27.640] And that's what sacred values are, right?
[00:23:27.640 --> 00:23:30.440] They're a way of simplifying.
[00:23:30.440 --> 00:23:41.800] So they ignore a bunch of stuff and they let us just use this very simple absolutist rule that means we hardly have to think at all.
[00:23:41.800 --> 00:23:49.960] In fact, often we can respond directly from emotion, which was kind of Josh Greene's point about the trolley problem.
[00:23:51.480 --> 00:24:24.120] But I do think that the message to take home is that the consequences of decisions, even if they have long-term effects, don't necessarily guide what we're doing in the moment.
[00:24:24.440 --> 00:24:31.160] I was thinking about emotions there in the context of guiding decision-making and what people find attractive.
[00:24:31.160 --> 00:24:44.800] So my example of this is from the evolutionary psychologists, the research showing that people like symmetrical faces, clear complexions, men with broad shoulders and a narrow waist, women with a kind of hourglass.
[00:24:52.000 --> 00:25:01.680] Get business secure internet and protect your network with built-in cybersecurity starting at just $60 a month with no contracts and Optimum Business's 60-day money-back guarantee.
[00:25:01.680 --> 00:25:04.720] Plus, the more services you bundle, the more you save.
[00:25:04.720 --> 00:25:07.040] Don't let your network slow down your business.
[00:25:07.040 --> 00:25:13.040] Get Optimum Business Secure Internet starting at just $60 a month at optimum.com/slash business.
[00:25:14.000 --> 00:25:18.720] That's a figure with a 0.67 waist-to-hip ratio.
[00:25:19.040 --> 00:25:19.760] Is that right?
[00:25:19.760 --> 00:25:20.160] Yeah.
[00:25:20.160 --> 00:25:23.840] So, but no one walks around with calipers, you know, making measurements, right?
[00:25:25.040 --> 00:25:28.000] In other words, natural selection has done all the calculations for us.
[00:25:28.000 --> 00:25:30.160] These are proxies for health and so on.
[00:25:30.160 --> 00:25:33.120] And you just look at something, you go, well, I like what I look at.
[00:25:33.120 --> 00:25:35.280] That feels good, you know, whatever.
[00:25:35.280 --> 00:25:41.520] So, the emotions are proxies for these ancient decisions that were already made.
[00:25:41.520 --> 00:25:43.840] Yeah, no, that's a perfect example.
[00:25:43.840 --> 00:25:45.040] That's a great example.
[00:25:45.040 --> 00:25:45.600] Yeah.
[00:25:45.600 --> 00:26:14.080] And the fact that, you know, ants are capable of making sophisticated decisions in the sense that the outcomes of their decisions can be, you know, incredibly sophisticated ant colonies, can be these, you know, architectural triumphs, suggests that it doesn't take a lot of sophistication to make a set of decisions that lead to a sophisticated outcome.
[00:26:14.080 --> 00:26:23.280] You can do things in a very simple way, as long as you've had a sufficiently long evolutionary history to give you the tools to do that.
[00:26:23.280 --> 00:26:28.240] So, the two biggest obstacles you write about in the book are the knowledge illusion and the control illusion.
[00:26:28.240 --> 00:26:33.080] Maybe give some examples of that and how that clouds our reasoning.
[00:26:29.840 --> 00:26:36.920] The knowledge illusion and the illusion of control.
[00:26:37.240 --> 00:26:42.600] So, the knowledge illusion is the fact that we think we know more than we do, right?
[00:26:42.920 --> 00:26:46.680] And there's a bunch of evidence for that.
[00:26:46.920 --> 00:26:51.800] You know, one of my favorite examples is you ask people to draw a bicycle.
[00:26:51.800 --> 00:26:53.880] And you say, Can you draw a bicycle?
[00:26:53.880 --> 00:26:56.280] And most people think, well, of course I can draw a bicycle.
[00:26:56.920 --> 00:26:57.880] I own a bicycle.
[00:26:57.880 --> 00:27:00.200] I've seen and fixed bicycles.
[00:27:00.200 --> 00:27:07.880] And then when you sit down and actually try to draw it, with some exceptions, people make all sorts of errors.
[00:27:08.840 --> 00:27:16.280] They put the chain, you know, between the two wheels or up above at the top of the frame.
[00:27:16.280 --> 00:27:20.600] They just crazy things that would not allow the bike to work.
[00:27:20.840 --> 00:27:23.640] And there are a million examples of this type.
[00:27:23.640 --> 00:27:28.600] So that suggests that we don't understand as well as we think we do.
[00:27:28.600 --> 00:27:35.880] And what we argue in the book, The Knowledge Illusion, is that the reason for that is because we rely on other people for our thinking.
[00:27:35.880 --> 00:27:45.080] We live in a community of knowledge, and we don't have to know how to draw a bicycle because, you know, there are bike mechanics who can do it for us.
[00:27:45.560 --> 00:27:47.960] And in fact, there are bicycles themselves.
[00:27:47.960 --> 00:27:52.120] Like when our bike is broken, we don't actually have to fix it from scratch.
[00:27:52.120 --> 00:27:54.760] We can rather look at the bike and see what's broken.
[00:27:54.760 --> 00:28:03.640] And the physical thing provides a frame that makes it easy often to see what the problem is.
[00:28:03.640 --> 00:28:09.400] So we use the physical world as well as other people in order to help us think.
[00:28:09.720 --> 00:28:22.000] So that's part of the argument that the world is actually very complex and we only know a small part of it and therefore we have to simplify, right?
[00:28:24.240 --> 00:28:34.480] In fact, the origin of the current book, The Cost of Conviction, is that the previous book discussed this phenomenon.
[00:28:34.480 --> 00:28:50.880] The phenomenon is called the illusion of explanatory depth: you ask people how a simple thing works, like a ballpoint pen, or how well they understand how it works, and they'll give you a rating, you know, indicating that they think they understand pretty well.
[00:28:50.880 --> 00:28:52.960] And then you say, okay, how's it work?
[00:28:52.960 --> 00:28:54.240] Explain it to me.
[00:28:54.560 --> 00:29:00.720] And they try and they fail and they discover they don't understand how it works at all.
[00:29:00.720 --> 00:29:12.960] And so they rate their own understanding lower after trying to explain, implying that the attempt to explain punctured their illusion of understanding.
[00:29:12.960 --> 00:29:15.760] And that's what most people say, right?
[00:29:16.400 --> 00:29:20.080] And we did the same thing with policies, with political policies.
[00:29:20.080 --> 00:29:25.760] We asked people how well they understand policies, and then we asked them to explain how the policies work.
[00:29:25.760 --> 00:29:30.320] And we got exactly the same result that people's sense of understanding went down.
[00:29:30.880 --> 00:29:34.960] But the other thing that happened is we changed their attitudes, right?
[00:29:34.960 --> 00:29:37.840] They became less confident in their position.
[00:29:37.840 --> 00:29:47.920] So we not only punctured their sense of understanding, we punctured their confidence in their belief for or against the policy.
[00:29:47.920 --> 00:29:51.040] We depolarized the group in that sense.
[00:29:51.360 --> 00:30:04.920] And it always struck me that what was going on in those cases, at least a good chunk of the time, is that people came into the situation thinking about policies from a sacred values perspective.
[00:30:05.560 --> 00:30:15.240] Like if you ask, you know, Republicans what they think of tariffs, well, you know, tariffs are part of their identity now, right?
[00:30:16.520 --> 00:30:20.920] And so they're not thinking, well, this will be the consequence.
[00:30:20.920 --> 00:30:25.720] Other countries will respond and prices will go.
[00:30:26.440 --> 00:30:28.040] I mean, that's hard to think about.
[00:30:28.040 --> 00:30:29.080] I'm not an economist.
[00:30:29.160 --> 00:30:30.760] I don't know how to think about that.
[00:30:30.760 --> 00:30:39.400] But what I can do is say that my group, you know, thinks that tariffs are bad, or my group thinks that tariffs are good.
[00:30:39.400 --> 00:30:41.480] And therefore, I'm all in on tariffs.
[00:30:41.480 --> 00:30:44.040] They become a sacred value.
[00:30:44.040 --> 00:30:51.800] And then when we ask people to explain, what's happening is we're forcing them to think about the issue consequentially.
[00:30:51.800 --> 00:30:59.880] And that both reduces their sense of understanding, but also makes them realize there's another way to think about this.
[00:30:59.880 --> 00:31:02.920] And I actually don't have the goods anymore.
[00:31:03.800 --> 00:31:04.360] Yeah.
[00:31:04.360 --> 00:31:14.760] Oh, just another example of the knowledge illusion was Andrew Shtulman's research at Occidental College on what people know about scientific concepts.
[00:31:14.760 --> 00:31:19.560] So one experiment I liked that he did was asking his students, you know, do you accept the theory of evolution?
[00:31:19.560 --> 00:31:20.920] Oh, yeah, of course.
[00:31:20.920 --> 00:31:23.000] Explain how it works.
[00:31:23.000 --> 00:31:27.000] Oh, well, the giraffe stretches its neck and then the babies have longer necks.
[00:31:27.000 --> 00:31:28.600] They give like a Lamarckian explanation.
[00:31:29.080 --> 00:31:30.840] No, that's not it at all.
[00:31:31.160 --> 00:31:36.600] In other words, having more knowledge doesn't, you know, nudge you in the direction of believing it.
[00:31:36.600 --> 00:31:39.400] It's that, you know, the scientists, they usually get it right.
[00:31:39.400 --> 00:31:43.240] So if they say evolutionary theory is true, it probably is.
[00:31:43.240 --> 00:31:44.680] I'll just go along with it.
[00:31:44.800 --> 00:31:52.400] And so his interpretation is it's kind of a signal, a public signal, like, yeah, I guess I trust science for the most part.
[00:31:52.720 --> 00:31:53.920] Oh, that's interesting.
[00:31:53.920 --> 00:31:54.320] Yeah.
[00:31:55.120 --> 00:32:02.560] We have some interesting data on that subject that I collected with Phil Fernbach and his student, Nick Light.
[00:32:02.560 --> 00:32:05.200] Nick Light was actually the leader of the project.
[00:32:05.680 --> 00:32:27.760] But what we found was that for a bunch of issues on which there's a scientific consensus, like whether evolution took place or whether there was a big bang, right, that started the universe or right, there are a variety of issues on which the vast majority of scientists agree, whether climate change is anthropogenic.
[00:32:29.840 --> 00:32:42.240] It turns out that the people who know the most, we gave people little general knowledge tests, and the people who know the most actually tend to be the ones who agree most with the scientific consensus.
[00:32:42.240 --> 00:32:44.080] So that's good, right?
[00:32:44.240 --> 00:32:46.560] That should make you feel a little more secure.
[00:32:46.880 --> 00:32:55.280] But the people who think they know the most oppose the scientific consensus the most, right?
[00:32:55.280 --> 00:33:04.400] So there's this divorce, this dissociation between whether people, what people actually know and how much they think they know.
[00:33:04.400 --> 00:33:12.160] And they relate in opposite ways to how much they agree with, you know, the scientific process and science.
[00:33:12.160 --> 00:33:25.840] Yeah, I think the whole climate change thing got hijacked when Al Gore's film became so hugely popular, Academy Award winner, and then he won, I don't know, an Emmy Award and I don't know, and he won a bunch of awards for this.
[00:33:25.840 --> 00:33:40.840] And then the whole issue became a democratic cause, you know, and so conservatives' brains auto-correct when they hear global warming, they hear anti-capitalism and anti-business and anti-America because it's Al Gore's cause.
[00:33:41.720 --> 00:33:42.920] Yeah, that's interesting.
[00:33:42.920 --> 00:33:52.920] You know, I heard a different analysis today that it all started when the Koch brothers got into business and did what was necessary to protect themselves.
[00:33:53.240 --> 00:33:54.920] That could also be part of it, yeah.
[00:33:54.920 --> 00:33:55.640] Yeah, exactly.
[00:33:55.640 --> 00:34:11.800] They're not mutually exclusive, but it's all evidence that when you have a belief that becomes part of your community's dynamic and discourse, then it becomes sacralized, right?
[00:34:11.800 --> 00:34:18.520] And once it becomes sacralized, then the opposition has to take the opposite view.
[00:34:18.520 --> 00:34:23.000] And that's the dynamic that leads to extreme polarization.
[00:34:23.000 --> 00:34:23.480] Yeah.
[00:34:24.360 --> 00:34:28.680] So to be a good consequentialist, you have to understand causality.
[00:34:28.680 --> 00:34:33.560] So you have a nice couple chapters in the middle of your book on what is causality?
[00:34:33.560 --> 00:34:34.680] How do we determine it?
[00:34:34.680 --> 00:34:40.280] How do people actually determine causality versus how scientists determine causality?
[00:34:40.280 --> 00:34:41.640] Because they're not the same thing.
[00:34:41.640 --> 00:34:43.720] So talk a little bit about that.
[00:34:44.680 --> 00:34:51.960] Well, you know, philosophers have talked about what the nature of causality is for a long time.
[00:34:51.960 --> 00:35:01.480] And I think these days there's some consensus that to say that A causes B is to imply a counterfactual, right?
[00:35:01.480 --> 00:35:08.760] The counterfactual is if that, so to say that A causes B implies that A occurred and B occurred.
[00:35:08.760 --> 00:35:14.760] And the counterfactual is if A had not occurred, then B would not have occurred either.
[00:35:16.240 --> 00:35:26.720] And the little twist on that that has become sort of de rigueur these days is what's called the interventional theory, right?
[00:35:26.720 --> 00:35:35.760] Judea Pearl is a computer scientist, statistician who's most famous for this, although he's probably not the origin of it.
[00:35:35.760 --> 00:35:38.400] There's a long history behind it.
[00:35:38.960 --> 00:35:54.880] But his suggestion is that to say that A causes B is to say that if I tweaked A, if I, as an agent external to the system, came in and intervened on A, then B would change, right?
[00:35:54.880 --> 00:36:06.960] And it's that intervention that would be the counterfactual change that describes this other possible world that would leave B in a different state.
[00:36:07.600 --> 00:36:31.280] And there is some evidence, and I think about Tanya Lombroso's work when I say this, that when we're thinking about physical causality, we think differently about it than when we think about intentional causality, like the causality between humans, right?
[00:36:31.280 --> 00:36:41.040] So if I say, yes, you're right, Michael, and you respond by feeling good, then I have caused you to feel good.
[00:36:41.040 --> 00:36:43.600] But that's not a physical effect.
[00:36:43.600 --> 00:36:46.240] That's a psychological effect.
[00:36:46.560 --> 00:36:57.040] And what the evidence seems to suggest is that this sort of counterfactual analysis describes psychological effects pretty well.
[00:36:57.040 --> 00:37:13.880] But when we're thinking about physical causality, then we tend to think more in terms of forces, in terms of processes by which A exerts some kind of change to B.
[00:37:14.200 --> 00:37:25.320] So it's sometimes referred to as a conserved quantity theory, because there's some quantity that A passes to B over space and time that changes B.
[00:37:25.640 --> 00:37:30.760] So it's really a different kind of mental model that we have in the two cases.
[00:37:31.080 --> 00:37:34.200] Is that what you were asking me about, or did I miss the boat completely?
[00:37:34.200 --> 00:37:35.640] No, no, no, that's a good start.
[00:37:35.960 --> 00:37:41.000] Thinking of Hume's definition of causality as constant conjunction.
[00:37:41.000 --> 00:37:43.400] A happens, B happens, A happens, B happens.
[00:37:43.480 --> 00:37:51.880] Human mind is designed to assume, especially after three, three steps, you know, you hear that, and you go, huh, I'm wondering what that is.
[00:37:52.200 --> 00:37:54.200] I wonder if somebody's there.
[00:37:54.520 --> 00:37:56.600] Somebody's definitely there, right?
[00:37:56.600 --> 00:38:07.480] So, but then as he points out, you know, the rooster could think, well, you know, every time I crow early in the morning, the sun rises, therefore, I am the cause of the sunrise.
[00:38:07.480 --> 00:38:10.840] So you just silence the rooster and then the sun rises anyway.
[00:38:10.840 --> 00:38:20.440] So the counterfactual there, the intervention, Pearl's intervention, is you change the A and see if B happens anyway.
[00:38:20.440 --> 00:38:28.040] So, you know, people smoke or they don't smoke, and you can't run these double-blind experiments, but you can run natural experiments.
[00:38:28.040 --> 00:38:31.640] You see what's already been done, and then after the fact, you make comparisons.
[00:38:31.640 --> 00:38:34.360] And so we do determine causality that way.
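As a minimal sketch of the intervention idea just described, assuming Python; the rooster-and-sunrise toy model, with dawn as the common cause, is only an illustration of the example above, not a model from the book:

def sample(do_crow=None):
    # Toy generative story: dawn drives both the rooster's crow and the sunrise.
    dawn = True
    crow = dawn if do_crow is None else do_crow  # do(crow) overrides the usual mechanism
    sunrise = dawn  # the sunrise depends on dawn, not on the crow
    return crow, sunrise

# Observation: crow and sunrise always co-occur (constant conjunction).
print(sample())               # (True, True)
# Intervention: silence the rooster; the sun rises anyway, so crowing is not a cause.
print(sample(do_crow=False))  # (False, True)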
[00:38:34.360 --> 00:38:40.200] You know, and I like Pearl's, you know, multiple levels of causality as a useful tool.
[00:38:40.200 --> 00:38:47.440] Like my example of this was: I read this book on the assassination of Admiral Yamamoto.
[00:38:47.440 --> 00:38:52.000] He was the Japanese architect of the Pearl Harbor sneak attack.
[00:38:52.000 --> 00:38:58.400] And he was, you know, a massive hero in Japan and basically running everything.
[00:38:58.720 --> 00:39:04.480] And so our intelligence services, I think it was 1943 or 44.
[00:39:04.480 --> 00:39:11.200] No, I think it was 44, found out where his plane was going from this place to that place in the Philippines somewhere.
[00:39:11.200 --> 00:39:17.200] And so we sent, I don't know, like a dozen different P-51 fighter jets and they found him.
[00:39:17.200 --> 00:39:24.800] And there he is, one plane with two protecting planes, and they shot him down and found him in the jungle in his seat, still dead, right?
[00:39:24.800 --> 00:39:32.400] So then there arose a big controversy about who gets the credit for shooting him down because it wasn't clear which of the half dozen P-51.
[00:39:32.400 --> 00:39:34.160] And they all said, oh, it was mine.
[00:39:34.160 --> 00:39:35.360] I'm the one that got the bullets.
[00:39:35.840 --> 00:39:37.440] And they all said that.
[00:39:37.440 --> 00:39:45.600] Anyway, but the point of this is that, but for the intelligence services, they wouldn't have had anywhere near where to go, right?
[00:39:45.600 --> 00:39:49.200] So really, it's the intelligence services that broke the Japanese code.
[00:39:49.200 --> 00:39:51.120] Those are the guys that caused his death.
[00:39:51.120 --> 00:39:54.000] But there are people above them that have to order the assassination.
[00:39:54.320 --> 00:39:57.200] Okay, now we're going to try to do this mission and take him out.
[00:39:57.200 --> 00:40:07.120] And there's somebody above that, the general or the admiral, and maybe even President Roosevelt at the top is the one who is the real cause of Admiral Yamamoto's death.
[00:40:07.120 --> 00:40:08.960] So different levels.
[00:40:09.280 --> 00:40:12.320] You're getting into the whole.
[00:40:12.320 --> 00:40:16.240] So cause is a really interesting English word, right?
[00:40:16.240 --> 00:40:18.480] That has all sorts of meanings.
[00:40:18.480 --> 00:40:24.480] So when I was talking about it, I was thinking more like, how do we think about mechanisms?
[00:40:24.960 --> 00:40:28.320] But what you're talking about is causal attribution.
[00:40:28.320 --> 00:40:28.640] Oh, right.
[00:40:28.800 --> 00:40:31.480] Like the thing that we have to do in a court of law.
[00:40:29.840 --> 00:40:35.720] We have to decide who to blame, who's guilty, who, what's the cause.
[00:40:36.040 --> 00:40:41.480] Or if you're troubleshooting a problem, then you have to engage in causal attribution.
[00:40:41.720 --> 00:40:43.720] But cause is really interesting.
[00:40:43.720 --> 00:40:52.600] Like, think about the difference between Mary killed Joe versus Mary caused Joe to die.
[00:40:54.040 --> 00:40:57.000] They're slightly different meanings, right?
[00:40:57.320 --> 00:41:01.720] And it's in the first one that you really want to throw Mary in jail.
[00:41:02.360 --> 00:41:02.760] Right?
[00:41:02.760 --> 00:41:06.200] The second one, it's sort of Mary's a little removed.
[00:41:06.840 --> 00:41:13.000] So that's another, you know, yet another property of the word cause.
[00:41:13.320 --> 00:41:15.480] It's a sophisticated word.
[00:41:15.480 --> 00:41:16.280] Yeah, indeed.
[00:41:16.280 --> 00:41:19.080] I guess in the law they call it the but for question.
[00:41:19.080 --> 00:41:23.320] You know, but for John with the gun, Mary would be alive or whatever.
[00:41:24.280 --> 00:41:34.040] I saw an interesting example of this I brought up a couple times because I watched that OJ, new OJ Doc series on Netflix, multi-part, most of the stuff we already know.
[00:41:34.040 --> 00:41:42.360] But they did have an interview with his agent, his ex-agent, who said, well, after the trial, years after the trial, they're just hanging out and having beers.
[00:41:42.360 --> 00:41:48.840] And, you know, he kind of, you know, got up the gumption to ask him, so OJ, come on, did you do it?
[00:41:49.480 --> 00:41:58.920] And OJ's, you know, like had a few drinks and paused and said, you know, if Nicole hadn't come to the door with a knife, he'd still be alive.
[00:41:59.240 --> 00:42:01.640] And this guy's just like, holy shit.
[00:42:01.960 --> 00:42:04.840] You know, because, you know, we know he was a stalker.
[00:42:04.840 --> 00:42:08.200] We know he went over there to spy on her and watch her and so on.
[00:42:08.520 --> 00:42:12.440] And it's possible he didn't go over there intending to kill her.
[00:42:12.440 --> 00:42:14.040] He was just another stalker.
[00:42:14.040 --> 00:42:16.160] And there she was with Ron Goldman.
[00:42:14.840 --> 00:42:21.200] And maybe she, you know, she had already been calling 911 multiple times on him.
[00:42:21.280 --> 00:42:24.240] And you could hear him screaming in the background like he was going to kill her.
[00:42:24.240 --> 00:42:27.200] And she had been beaten and bruised by him with those pictures.
[00:42:27.200 --> 00:42:28.640] You see her bruised up face.
[00:42:28.640 --> 00:42:32.080] So maybe she grabbed a knife just to protect herself in case.
[00:42:32.080 --> 00:42:33.600] And then she opens the door.
[00:42:33.600 --> 00:42:34.720] He opens the door.
[00:42:34.720 --> 00:42:37.680] She's got the knife and all hell breaks loose and then she's dead.
[00:42:37.680 --> 00:42:41.600] And then Ron Goldman comes in to fight for her and boom, he's dead.
[00:42:41.600 --> 00:42:43.280] You know, it's something like that.
[00:42:43.360 --> 00:42:50.800] You know, how do you think of OJ? All these friends were saying, I had no idea he had this in him to do.
[00:42:50.800 --> 00:42:53.760] He might not have known he had that in him to do, right?
[00:42:53.760 --> 00:42:55.120] I mean, it just snaps.
[00:42:56.240 --> 00:43:01.680] It's like the problem of predicting who the next school shooter is going to be.
[00:43:01.680 --> 00:43:02.880] Impossible.
[00:43:02.880 --> 00:43:05.120] They may not even know, right?
[00:43:05.120 --> 00:43:07.440] Until that morning, somebody commits suicide.
[00:43:07.440 --> 00:43:08.560] They don't even know they're going to do it.
[00:43:08.720 --> 00:43:10.720] They wake up one morning and go, you know, that's it.
[00:43:10.720 --> 00:43:12.080] I'm done today.
[00:43:12.080 --> 00:43:12.560] Yeah.
[00:43:12.960 --> 00:43:13.440] Yeah.
[00:43:13.760 --> 00:43:31.280] Yeah, no, I mean, what I hear you saying is agreeing with my claim that the world is incredibly complex and every event is different and there's a whole different set of variables that are relevant in every individual case and we just can't have access to all of it.
[00:43:31.280 --> 00:43:33.120] So we have to simplify.
[00:43:33.120 --> 00:43:39.040] Well, this is your argument against consequentialism is that you can't possibly do all the calculations.
[00:43:39.040 --> 00:43:40.240] Nobody can.
[00:43:40.640 --> 00:43:43.120] And so therefore, we have to do shortcuts.
[00:43:43.120 --> 00:43:44.240] So what are the shortcuts?
[00:43:44.240 --> 00:43:46.000] Well, these are the sacred values.
[00:43:46.000 --> 00:43:46.240] All right.
[00:43:46.640 --> 00:43:48.000] Well, that's one kind of shortcut.
[00:43:48.160 --> 00:43:48.880] Okay, yeah, go ahead.
[00:43:49.280 --> 00:43:49.920] Yeah, go ahead.
[00:43:49.920 --> 00:43:50.720] Give us a few others.
[00:43:50.720 --> 00:43:51.360] Yeah.
[00:43:52.000 --> 00:43:56.960] Well, so there are the heuristics we use to make judgments, right?
[00:43:56.920 --> 00:44:02.360] There are famous representativeness heuristic and availability heuristic and anchoring and adjustment.
[00:44:02.680 --> 00:44:09.640] Those are all ways of simplifying in that probability judgment or confidence judgment, right?
[00:44:10.520 --> 00:44:15.240] And it's worth noting that they all are useful.
[00:44:15.240 --> 00:44:28.680] So I don't know how familiar you or your listeners are with these ideas, but representativeness is basically the idea that we think something is probable if it's similar to a model of that thing, right?
[00:44:28.680 --> 00:44:32.760] The famous example is Linda the bank teller.
[00:44:32.760 --> 00:44:41.480] So I said, there's this woman, she went to Berkeley and she majored in political philosophy and she was active in social movements.
[00:44:41.480 --> 00:44:47.720] Do you think it's more likely that she's a bank teller now or a feminist bank teller?
[00:44:47.720 --> 00:44:51.400] And most people say, well, a feminist bank teller, right?
[00:44:51.400 --> 00:44:52.840] Because she seems like a feminist.
[00:44:52.840 --> 00:44:55.080] She has all the properties of the feminist.
[00:44:55.080 --> 00:45:03.400] Well, she can't be more likely to be a feminist bank teller than a bank teller, because if she's a feminist bank teller, she's still a bank teller, right?
[00:45:03.720 --> 00:45:05.960] So that's called the conjunction fallacy.
[00:45:05.960 --> 00:45:17.240] And it's supposed to be evidence that we use representativeness, namely, she is representative of feminists, and so we judge the probability high that she's a feminist.
[00:45:17.240 --> 00:45:27.000] In the same sense that I judge the probability high that, you know, what you're holding there is a pen because it looks like a pen and sounds like a pen and quacks like a pen.
[00:45:27.000 --> 00:45:28.360] So it must be a pen, right?
[00:45:29.560 --> 00:45:39.080] So this is a very useful heuristic that gets us the right answer most of the time, but it does cause certain errors, systematic errors.
[00:45:39.080 --> 00:45:41.400] And it's a simplifying heuristic.
[00:45:41.400 --> 00:45:43.000] And, you know, there's a host of them.
[00:45:43.000 --> 00:45:48.960] You go online and look for heuristics of judgment, you'll find a couple of dozen.
[00:45:50.960 --> 00:45:53.360] So that's one way we simplify.
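A minimal numerical sketch of the conjunction point in the Linda example above (the counts are invented purely for illustration; nothing here comes from the episode or the book):

```python
# Toy illustration of the conjunction rule behind the Linda problem.
# All counts below are made up for illustration only.

population = 10_000
bank_tellers = 100            # people who are bank tellers
feminist_bank_tellers = 80    # bank tellers who are also feminists (a subset)

p_bank_teller = bank_tellers / population
p_feminist_bank_teller = feminist_bank_tellers / population

print(f"P(bank teller)          = {p_bank_teller:.3f}")
print(f"P(feminist bank teller) = {p_feminist_bank_teller:.3f}")

# Every feminist bank teller is still a bank teller, so the conjunction
# can never be more probable than the single category on its own.
assert p_feminist_bank_teller <= p_bank_teller
```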
[00:45:53.600 --> 00:45:54.960] Another way we simplify.
[00:45:55.280 --> 00:46:11.680] Hang on, before you go there, let's talk about some of Tetlock's forbidden taboo heuristics, you know, where we generalize: blacks are more likely to be like this, or Jews are more likely to be like that, or women are more likely to be like this.
[00:46:11.680 --> 00:46:16.080] I mean, you could back them up with some statistics.
[00:46:16.080 --> 00:46:31.760] You know, the black crime rate is higher than the white crime rate, or, I've actually heard this one: women score higher in neuroticism on the Big Five personality dimensions, therefore they have lower control over their emotional extremes.
[00:46:31.760 --> 00:46:39.520] Therefore, maybe they're not as likely to be successful as computer programmers or engineers or CEOs of Fortune 500 companies.
[00:46:39.840 --> 00:46:41.280] These are arguments I've heard made.
[00:46:41.280 --> 00:46:44.480] Of course, Tetlock says these are for the most part forbidden.
[00:46:44.480 --> 00:46:47.440] I mean, you really should not be asking.
[00:46:47.440 --> 00:46:51.120] And maybe that is a, I don't know if that's a sacred value or a consequence.
[00:46:51.120 --> 00:47:01.680] Maybe it's better we don't ask those questions because of the history of how those kinds of questions are answered, which is not good for the treatment of these groups.
[00:47:01.680 --> 00:47:02.560] Right.
[00:47:02.880 --> 00:47:08.240] Well, so I think what you're talking about are stereotypes.
[00:47:08.240 --> 00:47:08.480] Yes.
[00:47:08.800 --> 00:47:09.840] Yes, yeah, prejudices.
[00:47:10.160 --> 00:47:15.920] And you know, yes, we think in terms of stereotypes, there's no question about it.
[00:47:18.160 --> 00:47:20.000] Is that good or is that bad?
[00:47:20.000 --> 00:47:26.160] Well, so you could, you know, that's another simplifying heuristic, right?
[00:47:26.480 --> 00:47:34.520] So it gets us the right answer most of the time, I would argue, but it also leads to certain kinds of systematic errors.
[00:47:29.600 --> 00:47:36.760] So we have to be super careful about it.
[00:47:37.080 --> 00:47:49.640] I mean, if you say, here's the example I like to use in my class: I say, it turns out that most snowshoe thieves are Canadian.
[00:47:50.760 --> 00:47:53.240] I'm allowed to say this because I'm Canadian.
[00:47:53.240 --> 00:47:54.200] All right.
[00:47:56.120 --> 00:48:02.840] And like, so let's, so is it true that Canadians have the property of being snowshoe thieves?
[00:48:02.840 --> 00:48:04.760] No, that's not true.
[00:48:04.760 --> 00:48:06.600] What I said was the opposite, right?
[00:48:06.600 --> 00:48:16.840] So, but is it wrong to say that the prototypical snowshoe thief or our stereotype of snowshoe thieves shouldn't involve them being Canadian?
[00:48:16.840 --> 00:48:19.400] Well, if it's a fact, it's a fact, right?
[00:48:19.400 --> 00:48:22.840] Like, it doesn't mean it's always the case.
[00:48:22.840 --> 00:48:28.280] It doesn't mean it has to be the case, but it may be true, right, about snowshoe thieves.
[00:48:28.280 --> 00:48:29.160] I don't know.
[00:48:29.160 --> 00:48:34.040] Or, you know, most Canadians own warm winter jackets.
[00:48:34.040 --> 00:48:36.120] Well, yeah, that's true.
[00:48:40.600 --> 00:48:43.240] Like, and we use these stereotypes
[00:49:02.600 --> 00:49:16.160] constantly, constantly, like every five minutes, just to negotiate the world, right?
[00:49:16.960 --> 00:49:22.960] Because they capture statistical invariants in the world.
[00:49:22.960 --> 00:49:25.280] And in that sense, they're accurate.
[00:49:25.280 --> 00:49:26.640] Are they always right?
[00:49:26.640 --> 00:49:28.240] Well, no, of course not.
[00:49:28.240 --> 00:49:33.600] Like, you know, my brother, it turns out, lives in Canada and only wears a black leather jacket.
[00:49:33.600 --> 00:49:35.360] That's all he ever wears, right?
[00:49:35.360 --> 00:49:37.840] He doesn't own a winter coat.
[00:49:38.640 --> 00:49:40.560] But, you know, he's the exception.
[00:49:40.560 --> 00:49:44.480] And the stereotype captures most of the information.
[00:49:44.800 --> 00:49:49.520] So, and actually, it's interesting reading Walter Lippmann on this, right?
[00:49:49.520 --> 00:49:54.080] He wrote in 1922 and he used the word stereotype.
[00:49:54.080 --> 00:50:00.000] And he was talking about World War I propaganda, which is full of stereotypes, right?
[00:50:00.000 --> 00:50:02.080] Like the worst kind of stereotype.
[00:50:02.080 --> 00:50:07.280] So stereotypes can be deployed in nefarious ways, for sure.
[00:50:07.600 --> 00:50:15.200] You can build stereotypes that are systematically incorrect in order to degrade a people, right?
[00:50:15.200 --> 00:50:24.960] And there's all sorts of history of various countries doing that in order to scapegoat people.
[00:50:24.960 --> 00:50:30.000] Look, I think Trump is doing that now with Venezuelans, right?
[00:50:33.680 --> 00:50:36.320] So it's a way of simplifying.
[00:50:36.320 --> 00:50:39.440] It's a necessary way of simplifying.
[00:50:39.760 --> 00:50:46.800] And it's funny, you know, like when we talk about dog stereotypes, which I do all the time because I take my dogs to the dog park.
[00:50:46.800 --> 00:50:48.640] And what do you talk about at the dog park?
[00:50:50.800 --> 00:50:58.000] And so, you know, like, yes, German shepherds are like this, and labs are like that, and poodles are like this.
[00:50:58.000 --> 00:51:05.880] And like, it's just not true in every single case, but the stereotypes capture a lot, and they allow conversation.
[00:51:06.760 --> 00:51:13.160] So, we have to appeal to these invariants in order to simplify the world enough to try to make sense of it.
[00:51:13.160 --> 00:51:14.600] Yeah, I understand, of course.
[00:51:14.600 --> 00:51:24.040] But Tetlock's point about the forbidden taboo base rates, for example: what is the crime rate, what are the black-white crime rate differences, whatever.
[00:51:24.040 --> 00:51:30.920] Even asking the question is taboo because it suggests something that's deep that bothers people.
[00:51:30.920 --> 00:51:34.760] I think this comes from Tetlock, but I got it from Pinker's book, Rationality.
[00:51:34.760 --> 00:51:36.520] So, I'm not quite sure about that.
[00:51:36.520 --> 00:51:43.720] But you're at a dinner party with all your couple friends, and so you float this one by.
[00:51:43.720 --> 00:51:50.360] So, of course, none of us would ever cheat on our spouses, but if you had to cheat with somebody here at the table, who would it be?
[00:51:51.320 --> 00:51:54.360] You cannot answer the question, just keep your mouth shut.
[00:51:54.360 --> 00:51:56.280] There is no correct answer, right?
[00:51:57.400 --> 00:52:00.600] Even if you say, I would never do this, but it would be Mary for sure.
[00:52:00.600 --> 00:52:05.880] No, you can't say that, you know, or worse, you know, back to the other example.
[00:52:05.880 --> 00:52:13.240] You know, of course, none of us here at the table are prejudiced, but you know, if you had to pick out one group that the stereotypes fit the statistics for, who would it be?
[00:52:13.240 --> 00:52:17.240] You can't answer, well, of course, I'm not prejudiced, but the Blacks or the Jews or whatever.
[00:52:17.240 --> 00:52:19.480] You just have to keep your mouth shut.
[00:52:19.480 --> 00:52:23.640] And Tetlock's point is: maybe that's actually okay.
[00:52:23.640 --> 00:52:27.960] In a civil society, maybe some things are taboo enough we shouldn't do it.
[00:52:27.960 --> 00:52:35.960] So, another example of this is a debate in psychology circles of the scientific study of IQ differences between races and their causes.
[00:52:35.960 --> 00:52:37.400] Oh, boy.
[00:52:37.400 --> 00:52:42.120] I mean, I just, when I was in graduate school, this was already a hot topic in psychology.
[00:52:42.120 --> 00:52:43.960] And I thought, I'm never touching this one.
[00:52:43.960 --> 00:52:44.520] Oh, my God.
[00:52:44.520 --> 00:52:46.720] And it's only gotten worse.
[00:52:46.720 --> 00:52:47.200] Yeah.
[00:52:47.520 --> 00:52:48.080] Yeah.
[00:52:44.840 --> 00:52:49.360] No, for sure.
[00:52:50.320 --> 00:52:57.360] You know, and part of the problem with those things is that you're making judgments about ill-defined categories, right?
[00:52:57.360 --> 00:53:01.840] Like, what do you even mean by an African-American?
[00:53:01.840 --> 00:53:05.600] Like, what percentage African genes do they have to have?
[00:53:05.600 --> 00:53:07.760] And, you know, none of them are 100%.
[00:53:07.920 --> 00:53:09.840] Very few of them are 100%.
[00:53:09.840 --> 00:53:12.400] And look, my parents were born in South Africa.
[00:53:12.400 --> 00:53:13.600] Does that make me a Southerner?
[00:53:14.560 --> 00:53:15.760] Oh, okay.
[00:53:16.960 --> 00:53:21.360] So, just defining the categories itself is tough.
[00:53:21.360 --> 00:53:25.440] But here's an interesting example of the kind of thing you're talking about.
[00:53:25.440 --> 00:53:29.120] Courts of law have to ignore base rates, right?
[00:53:29.120 --> 00:53:29.520] Yes.
[00:53:29.520 --> 00:53:38.560] So if someone is accused of being a snowshoe thief, you're not allowed to say, well, this person is likely to be guilty because they're Canadian.
[00:53:39.680 --> 00:53:46.720] Even if Canadians are the sole snowshoe thieves around, you're not allowed to appeal to base rates.
[00:53:46.720 --> 00:53:53.600] And so that's, I think, another example of a forbidden taboo subject.
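As a purely illustrative sketch of why the base rate matters so much here (all of the numbers are invented for illustration and come neither from the episode nor from Tetlock), Bayes' rule shows that "most snowshoe thieves are Canadian" says almost nothing about whether a given Canadian is a snowshoe thief:

```python
# Invented numbers: how a tiny base rate flips the inference.
p_thief = 1e-5                    # snowshoe theft is assumed to be extremely rare
p_canadian = 0.5                  # assumed share of the population that is Canadian
p_canadian_given_thief = 0.95     # "most snowshoe thieves are Canadian"

# Bayes' rule: P(thief | Canadian) = P(Canadian | thief) * P(thief) / P(Canadian)
p_thief_given_canadian = p_canadian_given_thief * p_thief / p_canadian

print(f"P(thief | Canadian) = {p_thief_given_canadian:.7f}")  # ~0.0000190
# Almost every Canadian is innocent, even though almost every thief is Canadian.
```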
[00:53:54.240 --> 00:54:01.120] All right, let me look at some specific examples: abortion, euthanasia, death penalty, capital punishment, so on.
[00:54:01.120 --> 00:54:02.320] I'll read from your book.
[00:54:02.320 --> 00:54:11.520] Cameron Todd Willingham's house burned down in Texas in December 1991, killing his three young daughters.
[00:54:11.520 --> 00:54:15.600] A police investigation determined that the fire had been intentionally set.
[00:54:15.600 --> 00:54:19.680] Willingham himself quickly became a suspect and eventually was accused.
[00:54:19.680 --> 00:54:22.800] He was tried and convicted of arson and murder in 1992.
[00:54:22.800 --> 00:54:27.920] He insisted that the fire was accidental and maintained his innocence throughout the trial and afterwards.
[00:54:27.920 --> 00:54:34.840] He even turned down a deal of a life in prison term in exchange for a guilty plea that would have taken execution off the table.
[00:54:34.840 --> 00:54:56.200] Partly on the strength of claims made by medical experts that Willingham's tattoo of a skull and serpent fit the profile of a sociopath, oh my God, crazy, along with a statement by a psychologist that his poster of the heavy metal band Iron Maiden and another of a fallen angel from the rock band Led Zeppelin, oh my God, I love Led Zeppelin.
[00:54:56.200 --> 00:54:57.320] What does that make me?
[00:54:57.960 --> 00:55:01.720] was an indicator of cult-type activities.
[00:55:01.720 --> 00:55:06.040] Willingham was sentenced to death and executed by lethal injection in 2004.
[00:55:06.040 --> 00:55:07.480] There is good reason to believe that Willingham
[00:55:07.560 --> 00:55:09.000] was in fact innocent.
[00:55:09.000 --> 00:55:14.680] The evidence against him was effectively rebutted in 2004 by fire investigator Gerald Hurst.
[00:55:14.840 --> 00:55:21.800] In 2009, a report by the Texas Forensic Science Commission found that the arson investigation in Willingham's case was deeply flawed.
[00:55:21.880 --> 00:55:25.160] By the way, we did do the thing in Skeptic about fire investigations.
[00:55:25.160 --> 00:55:30.360] And, you know, if the paint is curled this way or if the glass is melted this way, or you know, it's this or that.
[00:55:30.360 --> 00:55:31.800] It's mostly bullshit.
[00:55:31.800 --> 00:55:35.480] You know, under controlled conditions,
[00:55:35.560 --> 00:55:41.560] when you have blind conditions, they're not very good at predicting which ones were arson and which ones were accidents.
[00:55:41.560 --> 00:55:43.960] There's a lot of after-the-fact reasoning in that.
[00:55:43.960 --> 00:55:45.320] But that's set aside.
[00:55:45.560 --> 00:55:48.760] You know, how do people think about the death penalty now?
[00:55:48.760 --> 00:55:50.360] You hear consequential arguments.
[00:55:50.360 --> 00:55:54.040] It costs more to keep somebody on death row than to keep them in prison for life.
[00:55:54.040 --> 00:55:56.360] In the long run, you'd save money and so on.
[00:55:56.360 --> 00:55:58.280] But others make sacred value arguments.
[00:55:58.280 --> 00:55:59.800] We don't want to give the state.
[00:55:59.800 --> 00:56:01.320] This is a libertarian argument.
[00:56:01.320 --> 00:56:03.720] I don't want the state to have the power over life and death.
[00:56:03.720 --> 00:56:06.520] I don't care what the guy did.
[00:56:06.760 --> 00:56:08.760] So you hear those kind of arguments.
[00:56:08.760 --> 00:56:09.800] Yeah, yeah.
[00:56:10.680 --> 00:56:11.080] Yeah.
[00:56:11.080 --> 00:56:20.720] No, I think that capital punishment is definitely the kind of argument that elicits sacred values on all sides.
[00:56:21.040 --> 00:56:26.640] You know, I have spent my life actually making the financial argument, right?
[00:56:27.440 --> 00:56:31.840] Someone kills 30 people, I don't care about them anymore.
[00:56:31.840 --> 00:56:44.960] But if it's going to cost more to put them to death than to just keep them in prison for life, then why should the state be spending its money on this person that isn't worth anything?
[00:56:45.200 --> 00:56:48.880] But that argument has never gone down very well with anybody.
[00:56:49.360 --> 00:56:52.400] People want to think about this in terms of sacred values.
[00:56:52.400 --> 00:56:53.440] Absolutely.
[00:56:53.440 --> 00:57:00.080] I mean, the point of the story you just read, and by the way, I was attracted to it by the Led Zeppelin reference.
[00:57:01.280 --> 00:57:02.160] That was funny.
[00:57:02.960 --> 00:57:05.760] That was too much for me.
[00:57:06.640 --> 00:57:19.200] But it's that he himself, the accused, refused to admit guilt and allowed the state to put him to death.
[00:57:19.200 --> 00:57:31.280] Like, that's how strongly he felt he held this sacred value that he did not want to admit guilt for something that he clearly didn't think he had done.
[00:57:32.800 --> 00:57:42.560] And so the point is the power of sacred values, that they're actually, you know, they have the power of life and death.
[00:57:42.560 --> 00:57:44.080] Well, I guess, you know, murder.
[00:57:44.080 --> 00:57:45.280] I mean, is murder wrong?
[00:57:45.280 --> 00:57:49.600] I did a debate with Dennis Prager once on that very subject.
[00:57:49.600 --> 00:57:51.720] But if you look it up, what's the, what is the word?
[00:57:51.720 --> 00:57:52.880] How do you define the word murder?
[00:57:52.880 --> 00:57:54.400] The wrongful killing of somebody.
[00:57:54.400 --> 00:57:58.800] Well, then you're asking, is wrongfully killing somebody wrong?
[00:57:58.800 --> 00:58:01.400] Yeah, it's right there in the definition, right?
[00:58:01.400 --> 00:58:03.720] But of course, if you say, is killing somebody wrong?
[00:57:59.760 --> 00:58:04.840] Well, it's like, well, it depends.
[00:57:59.920 --> 00:58:05.960] Self-defense.
[00:58:06.760 --> 00:58:10.600] You know, it's a just war and we're going to war to fight these evil people.
[00:58:10.600 --> 00:58:12.280] Or, you know, capital punishment.
[00:58:12.280 --> 00:58:14.600] The state, it's right there in the state constitution.
[00:58:14.600 --> 00:58:15.720] We have to execute them.
[00:58:15.720 --> 00:58:17.000] The courts ordered it.
[00:58:17.000 --> 00:58:18.200] End of story.
[00:58:18.440 --> 00:58:18.680] Right?
[00:58:18.680 --> 00:58:21.960] So, I mean, it depends on the context in that case.
[00:58:21.960 --> 00:58:22.520] Yeah.
[00:58:22.520 --> 00:58:27.240] But remember, I mean, we do all sorts of things that we know are wrong, right?
[00:58:27.240 --> 00:58:34.840] We drink too much or we speed on the highway or, you know, I mean, all kinds of things all the time.
[00:58:35.160 --> 00:58:40.840] And yet, very few of us are willing to commit murder.
[00:58:41.160 --> 00:58:52.360] So there's a sacred value there that distinguishes that act from other acts which also have that circularity in their definition, right?
[00:58:52.360 --> 00:58:56.520] Of being wrong by virtue of being defined as being wrong.
[00:58:56.520 --> 00:58:59.400] I mean, you know, what does it mean to go over the speed limit?
[00:58:59.400 --> 00:59:02.120] It means you're going faster than you're supposed to go.
[00:59:02.520 --> 00:59:05.320] But there's not a sacred value attached to that.
[00:59:05.320 --> 00:59:05.960] Yeah.
[00:59:06.280 --> 00:59:09.000] I mean, I think about, you know, moral bad luck.
[00:59:09.000 --> 00:59:16.280] If, you know, I mean, my wife and I go out to dinner, I have maybe two drinks, maybe three glasses of wine if we're there for a couple hours.
[00:59:16.280 --> 00:59:18.200] And then I get in the car and drive home.
[00:59:18.200 --> 00:59:19.480] You know, nothing's ever happened.
[00:59:19.480 --> 00:59:20.440] But what if it did?
[00:59:20.440 --> 00:59:25.160] You know, what if I got in a wreck and I got a DUI and somebody was hurt or killed?
[00:59:25.160 --> 00:59:31.720] You know, but I mean, I've done that, I don't know, a thousand times, whatever, lots, dozens of times, in my 70 years.
[00:59:32.120 --> 00:59:35.720] But, you know, the other guy, he just, you know, got bad luck.
[00:59:36.040 --> 00:59:40.200] And there's not a lot of justice, cosmic justice behind all that.
[00:59:40.520 --> 00:59:41.160] Yeah.
[00:59:41.480 --> 00:59:41.880] Yeah.
[00:59:41.880 --> 00:59:47.520] No, those cases, I mean, I don't deal with them in the book, but they're interesting cases.
[00:59:47.760 --> 00:59:56.400] So you have two people, they're drunk the same amount, they've eaten the same amount, they're going just as far as each other.
[00:59:56.400 --> 00:59:59.440] You know, so they're both slightly drunk.
[00:59:59.440 --> 01:00:04.640] One of them swerves and hits a tree, and the other swerves and hits a child.
[01:00:04.960 --> 01:00:12.320] And the one who hits a tree, you know, has to buy a new car, and the one who hits the child goes to jail for the rest of their lives.
[01:00:13.760 --> 01:00:22.320] And the world sort of is identical, except for what happened to be in their way when they swerved, right?
[01:00:22.320 --> 01:00:25.600] Which is not something they had any control over.
[01:00:25.920 --> 01:00:31.360] And I think it's an interesting question whether that's justified or not.
[01:00:31.360 --> 01:00:31.840] Yeah.
[01:00:32.480 --> 01:00:36.240] I think often these things come down to conflicting rights.
[01:00:36.240 --> 01:00:39.120] So they're, I guess, in a way, conflicting sacred values.
[01:00:39.120 --> 01:00:47.040] But on the other hand, you could have conflicting consequential calculations, like in your thought experiment in vignette number two.
[01:00:47.280 --> 01:00:53.120] You know, is it $50 for the kid's lunch program or $50 for the friend's medicine?
[01:00:54.080 --> 01:00:55.520] Is there a right answer?
[01:00:55.840 --> 01:00:57.360] Probably not, really.
[01:00:57.360 --> 01:00:58.080] No.
[01:00:58.400 --> 01:00:58.880] No.
[01:00:59.200 --> 01:00:59.600] Yeah.
[01:00:59.600 --> 01:01:02.240] I mean, that's why you have to deliberate.
[01:01:02.480 --> 01:01:10.240] If there were a clear right answer, then presumably the person would get no extra credit for spending the time thinking.
[01:01:10.240 --> 01:01:10.720] Yeah.
[01:01:11.280 --> 01:01:16.320] So on the abortion issue, I think I was thinking about that also reading that section in your book.
[01:01:16.640 --> 01:01:18.400] It really, it's kind of a conflicting rights.
[01:01:18.720 --> 01:01:24.240] I'm pro-choice, but I've tried to listen to the pro-life arguments and it's like, okay, those are pretty good arguments.
[01:01:24.240 --> 01:01:30.680] You know, their sacred value is, you know, that the fetus has a right to life.
[01:01:30.680 --> 01:01:32.920] Okay, yeah, I'll grant that.
[01:01:29.120 --> 01:01:35.400] But the woman has a right to life too, the mother.
[01:01:35.720 --> 01:01:39.480] You know, so we're going to value one or the other, and then you get, you know, get the calculations.
[01:01:39.880 --> 01:01:46.040] What's the likelihood she'll die in childbirth or whatever, much lower now than before, and so on and so forth.
[01:01:46.280 --> 01:01:51.160] Or what could the child have become, you know, what if Beethoven's mother had aborted him?
[01:01:51.160 --> 01:01:54.280] You know, you've heard all these arguments.
[01:01:54.440 --> 01:02:00.520] How do you think about the abortion issue in terms of your consequentialism versus sacred values?
[01:02:00.520 --> 01:02:02.520] No, exactly the same way you do.
[01:02:02.760 --> 01:02:06.680] I mean, people tend to approach it from a sacred values perspective.
[01:02:07.000 --> 01:02:08.680] That's clear.
[01:02:08.680 --> 01:02:14.520] But in fact, it's an act that has all kinds of consequences.
[01:02:14.520 --> 01:02:20.920] And I think you have to weigh those consequences if you want to be wise when making a decision about abortion.
[01:02:20.920 --> 01:02:36.920] I mean, I'm also pro-choice, but I also think we should do what we can to minimize abortion, not only because of the fetus, and whether the fetus has rights or not is not something that I have any clear perspective on.
[01:02:37.240 --> 01:02:46.680] But there's no question that having an abortion can ruin someone's life, but it can also save someone's life.
[01:02:47.400 --> 01:02:49.800] There are all sorts of emotional consequences.
[01:02:49.800 --> 01:02:52.280] There are physical consequences, right?
[01:02:52.280 --> 01:02:56.280] There's the possibility of something terrible going wrong.
[01:02:56.600 --> 01:03:04.600] And then, yeah, I mean, the issue you brought up about who the child would have become is a whole other domain of uncertainty.
[01:03:04.840 --> 01:03:08.920] So there's a ton of uncertainty that really has to be considered.
[01:03:08.920 --> 01:03:24.160] So abortion is not a pleasant thing, and it's not something you want to happen, but I definitely think it's something that we would do better by thinking about consequentially rather than in terms of sacred values.
[01:03:24.160 --> 01:03:39.280] And I think all those medical professionals in red states that are now denying medical care to women because there's the possibility that they might end up killing their fetus in the process.
[01:03:39.280 --> 01:03:40.880] I think that's just pathetic.
[01:03:40.880 --> 01:03:43.040] Yeah, gone way too far.
[01:03:43.360 --> 01:03:47.040] The trans issue also, I've been thinking about that as a conflicting rights issue.
[01:03:47.040 --> 01:03:55.680] You want to support trans people, if a man feels like a woman or vice versa and they want to compete in that other division in sports or whatever.
[01:03:55.680 --> 01:03:58.320] Yeah, I don't want to, I'm not opposed to that.
[01:03:58.320 --> 01:04:06.320] But if women are in the locker room going, I don't want a guy in here or a bathroom or whatever, it's like, well, yeah, I want to support women's rights also.
[01:04:06.320 --> 01:04:10.000] Well, you can't have, sometimes you can't have both, right?
[01:04:10.000 --> 01:04:14.160] I mean, I've looked for solutions like in sports, you could have an open division.
[01:04:14.400 --> 01:04:15.600] Frisbee Golf did this.
[01:04:16.160 --> 01:04:20.400] We have the men's division, women's division, and the open division where anybody would compete, right?
[01:04:20.400 --> 01:04:22.000] You know, or a gender-neutral bathroom.
[01:04:22.000 --> 01:04:24.000] It just says right on there, gender-neutral bathroom.
[01:04:24.080 --> 01:04:26.160] Okay, everybody knows, and that's fine.
[01:04:26.160 --> 01:04:28.880] You know, and you could build other locker rooms.
[01:04:28.880 --> 01:04:29.760] I don't know.
[01:04:30.000 --> 01:04:32.080] Sometimes the solutions get expensive, maybe.
[01:04:32.720 --> 01:04:36.720] But there, you know, I mean, life is mostly just trade-offs, complex trade-offs.
[01:04:36.720 --> 01:04:40.160] I mean, this was always Thomas Sowell's point that there are no solutions.
[01:04:40.160 --> 01:04:41.360] They're just trade-offs.
[01:04:41.360 --> 01:04:42.880] And most of life is trade-offs.
[01:04:42.880 --> 01:04:44.320] You can't have everything.
[01:04:45.600 --> 01:04:46.240] Yeah.
[01:04:47.440 --> 01:04:47.920] Yeah.
[01:04:47.920 --> 01:04:54.480] No, it sounds like in some sense in my book, I'm repeating the conclusions of Thomas Sowell.
[01:04:54.480 --> 01:04:55.760] In some ways, yeah.
[01:04:55.760 --> 01:04:56.560] I think so.
[01:04:57.120 --> 01:05:00.840] Yeah, no, look, lots of people have made the point, right?
[01:04:59.840 --> 01:05:03.320] I'm not going to pretend to be the first to say anything.
[01:05:04.440 --> 01:05:08.840] I'm bringing things together in a way that others haven't, I think.
[01:05:08.840 --> 01:05:12.920] But I think that that's exactly right.
[01:05:13.720 --> 01:05:27.080] On the other hand, so what's right is that when cases are difficult, you just have to take the case on its merits and you're just going to have to devote a lot of time to thinking really hard.
[01:05:27.080 --> 01:05:33.000] But, you know, you do have to appreciate that sometimes you don't have the resources to do that, right?
[01:05:33.000 --> 01:05:35.640] So it's like if you're at war, right?
[01:05:35.640 --> 01:05:43.000] And if you have to decide whether to kill the enemy, you just don't have time to deliberate about it.
[01:05:43.000 --> 01:05:45.480] You either do it or you don't, right?
[01:05:47.080 --> 01:05:53.560] You know, or which way to swerve, right?
[01:05:54.120 --> 01:06:03.320] If you're driving and suddenly something appears and you have to decide whether to risk your own life or to risk the life of the car beside you.
[01:06:03.320 --> 01:06:07.480] I mean, it's not something you have time to think about consequentially.
[01:06:07.480 --> 01:06:11.560] Moreover, we're making decisions constantly.
[01:06:12.120 --> 01:06:19.480] And we have to make decisions that other people understand often, if we're, say, working in an organization, right?
[01:06:20.120 --> 01:06:25.560] And decisions that other people will agree with, otherwise, we'll immediately become pariahs.
[01:06:25.880 --> 01:06:44.120] So while I agree with you that the best decision is one that, you know, maximizes benefits, minimizes costs, I do think we have no choice given the realities of life but to make use of sacred values a lot.
[01:06:45.200 --> 01:06:53.600] You talk about artificial intelligence in your book in the context of how complicated consequential calculations can be.
[01:06:53.600 --> 01:06:56.160] Maybe AI can do it for us.
[01:06:56.560 --> 01:07:06.160] Elon's Full Self-Driving mode: engineers will design into the car, you know, the trolley problem choice, and it will make the right choice.
[01:07:06.160 --> 01:07:07.840] Something like that.
[01:07:09.120 --> 01:07:10.560] Yeah, do you believe that?
[01:07:10.560 --> 01:07:12.480] No, I don't.
[01:07:12.800 --> 01:07:13.760] I do not.
[01:07:14.160 --> 01:07:15.840] Never mind Elon Musk, right?
[01:07:15.840 --> 01:07:19.920] Even if somebody respectable did it, I don't think it would be possible.
[01:07:19.920 --> 01:07:29.360] Yeah, actually, the thought experiment that I thought was most clever in the book is the one where I suggest that there's a decision aid, right?
[01:07:29.360 --> 01:07:47.120] Which makes decisions that are just as good as ours, because nowadays, you can imagine, AI can have at least as much knowledge as we do and at least as much computing power and could make decisions that are at least as good as the ones that we make.
[01:07:47.120 --> 01:07:51.520] And so why not, you know, allow it to make all your decisions?
[01:07:51.840 --> 01:07:56.880] But then there are cases where that would just not be acceptable.
[01:07:56.880 --> 01:08:06.960] Like, for instance, if you decide to dump somebody or somebody decides to dump you and they say, well, you know, my AI told me to do it.
[01:08:07.760 --> 01:08:09.840] You're not going to forgive them.
[01:08:09.840 --> 01:08:25.200] Or if the Supreme Court was making a decision and they just said, oh, you know, each one of them allowed their little AI supports or helpers to make the call.
[01:08:25.200 --> 01:08:27.600] And so, you know, America was changed.
[01:08:27.600 --> 01:08:29.040] The immigration law was changed.
[01:08:29.120 --> 01:08:34.840] The president became king because the AIs decided that that would be the right thing to do.
[01:08:35.480 --> 01:08:38.040] I don't think anybody would find that acceptable.
[01:08:38.040 --> 01:08:39.560] No, that doesn't feel right.
[01:08:39.560 --> 01:08:49.000] But on the other hand, we're sort of saying evolution designed in us the capacity to reason to a certain extent that we make rational choices, more or less.
[01:08:49.240 --> 01:08:50.840] I did want to ask you about that.
[01:08:51.000 --> 01:08:55.320] Big debate I have here on the show all the time is to what extent humans are rational.
[01:08:55.320 --> 01:09:06.760] We assume at Skeptic that we can reason people to the right answer using critical thinking, making them aware of all the confirmation bias and hindsight bias, motivated reasoning, and all that stuff.
[01:09:06.760 --> 01:09:11.000] The research on this is pretty mixed, as you know, because you write about that.
[01:09:11.080 --> 01:09:11.400] Right?
[01:09:11.400 --> 01:09:24.920] I mean, you can teach some of it, and if you teach them about other people's, or you teach them about the motivated reasoning, all the biases, they get better at seeing it in other people, but not themselves.
[01:09:25.400 --> 01:09:26.600] So you have a lot of people who are.
[01:09:26.680 --> 01:09:32.280] But you can teach people to behave in an unbiased way in specific contexts.
[01:09:32.280 --> 01:09:32.920] Yes, yes.
[01:09:32.920 --> 01:09:33.240] Right?
[01:09:33.240 --> 01:09:37.560] Like, like,
[01:10:03.960 --> 01:10:16.480] Doctors tend to, even though people neglect base rates in general, doctors these days are pretty good at observing base rates when patients come into their office looking for a diagnosis.
[01:10:16.800 --> 01:10:22.640] But that doesn't mean when the doctor goes home that they don't neglect base rates.
[01:10:22.640 --> 01:10:38.640] But I mean, so like if we teach general critical thinking skills, and this is why you should be skeptical of spoon benders or astrology or cult leaders, will the reader then take that into the future, when 10 years later they encounter something that I can't even think of?
[01:10:38.640 --> 01:10:41.200] Do they apply those skills or not?
[01:10:41.680 --> 01:10:42.400] No, no.
[01:10:42.400 --> 01:10:43.680] The answer is clear.
[01:10:43.680 --> 01:10:45.120] No, they don't.
[01:10:46.480 --> 01:11:00.400] I don't think there's ever been a study of critical reasoning skills showing anything other than a tiny difference, in a very contextually specific way, to the quality of people's decisions.
[01:11:01.360 --> 01:11:02.480] That's the sad truth.
[01:11:02.480 --> 01:11:05.120] I actually find this astonishing.
[01:11:05.360 --> 01:11:12.320] It shocks me that there's still a debate about this because it strikes me that the evidence is very clear.
[01:11:12.320 --> 01:11:24.640] Like, first of all, so what I consider an irrational decision is one in which you have a normative theory which tells you you should be doing X and instead you do Y, right?
[01:11:24.640 --> 01:11:33.200] So there's a divergence between a theory that tells you what you should do and what you actually do, then that's irrational.
[01:11:33.520 --> 01:11:38.160] And first of all, we don't often have normative theories.
[01:11:38.160 --> 01:11:49.760] Like you have given me plenty of examples over the course of the last hour and a half in which there just is no normative theory of like, should you push the big man off the bridge?
[01:11:49.760 --> 01:11:55.040] Well, I, you know, that depends on your theory, on your ethical theory.
[01:11:55.000 --> 01:12:08.040] So you simply can't attribute irrationality in those cases, because you have no basis if you don't know what you should do.
[01:12:08.040 --> 01:12:22.280] But in those cases where you do know what you should do, then it turns out, yeah, people do the right thing a good chunk of the time, but we use all these simplifying strategies and that leads to systematic bias.
[01:12:22.280 --> 01:12:27.240] And it strikes me that that has been demonstrated time and time again.
[01:12:27.240 --> 01:12:45.640] So, you know, the conclusion I draw is some chunk of the time and sometimes when it really matters, like when we're making decisions about war or about tariffs or about immigration or at any big political issue, yeah, we have systematic biases.
[01:12:45.640 --> 01:12:49.400] We are irrational and we really should be taking that into account.
[01:12:49.400 --> 01:12:50.040] Yeah.
[01:12:50.040 --> 01:13:00.760] It's like that stretch of research on Alzheimer's claiming that, as a preventative measure, you should do things like Sudoku puzzles, chess, and crossword puzzles and so on to keep your brain sharp.
[01:13:00.760 --> 01:13:06.600] And it turns out it doesn't do anything for Alzheimer's, but it does make you a better Sudoku puzzle solver.
[01:13:08.520 --> 01:13:09.000] Really?
[01:13:09.000 --> 01:13:11.000] It doesn't do anything for Alzheimer's.
[01:13:11.640 --> 01:13:12.360] Oh, no.
[01:13:12.360 --> 01:13:12.920] Oh, boy.
[01:13:12.920 --> 01:13:13.480] Yeah, sorry.
[01:13:13.480 --> 01:13:15.160] I'm going to stop playing Sudoku.
[01:13:15.160 --> 01:13:16.440] But you'll get better at Sudoku.
[01:13:16.920 --> 01:13:18.120] That's worth something.
[01:13:18.120 --> 01:13:28.840] But what I'm getting at is a slightly deeper question, with, like, Hugo Mercier's book Not Born Yesterday, kind of the follow-up to his The Enigma of Reason.
[01:13:29.400 --> 01:13:36.760] His argument is that people are not as irrational as people like me think we are, because we're always pointing out all the crazy cults people join and so on.
[01:13:36.760 --> 01:13:39.880] Like one of his examples is most people don't join cults.
[01:13:39.880 --> 01:13:44.280] We can reel off the examples of Scientology or Jonestown or Heaven's Gate.
[01:13:44.280 --> 01:13:47.920] There's a handful and you can point to the people and what were they thinking.
[01:13:48.240 --> 01:13:51.280] But, you know, how many millions of people have encountered Scientology?
[01:13:51.280 --> 01:13:58.320] They took the personality test, or they see them on the sidewalk on Hollywood Boulevard there, have a good laugh about it, and then walk on.
[01:13:58.320 --> 01:14:01.280] They don't take a second mortgage out on their home and give it to the church.
[01:14:01.280 --> 01:14:02.400] Most people don't follow.
[01:14:02.560 --> 01:14:07.120] This is his argument that like political persuasive ads are largely ineffective.
[01:14:07.120 --> 01:14:10.400] Most people, you know, are not persuaded by those things.
[01:14:10.400 --> 01:14:11.040] Yeah.
[01:14:11.360 --> 01:14:15.760] Look, I mean, this is a glass-half-full versus glass-half-empty debate.
[01:14:15.760 --> 01:14:22.400] So, you know, Hugo wants to see the world with a glass that's half full, and good for him.
[01:14:22.400 --> 01:14:24.320] And I hope he's happier for it.
[01:14:24.320 --> 01:14:31.520] But the fact is, you just got to turn on the news and see what the consequences of human irrationality are.
[01:14:31.840 --> 01:14:33.920] And, you know, they're horrific.
[01:14:33.920 --> 01:14:34.480] Yeah.
[01:14:34.880 --> 01:14:42.560] And, you know, when someone you know gets picked up and thrown in jail in El Salvador, you'll feel that way too.
[01:14:42.560 --> 01:14:43.040] Right.
[01:14:43.040 --> 01:14:43.520] Right.
[01:14:43.520 --> 01:14:44.320] Right.
[01:14:44.640 --> 01:14:45.040] All right.
[01:14:45.040 --> 01:14:46.160] Last line of thinking here.
[01:14:46.160 --> 01:14:50.640] I wanted to ask you about free will, determinism, compatibilism in the context of making choices.
[01:14:50.640 --> 01:14:53.920] I'll read my favorite passage from William James.
[01:14:53.920 --> 01:14:58.080] Romeo wants Juliet as the filings want the magnet.
[01:14:58.080 --> 01:15:03.120] And if no obstacles intervene, he moves toward her by as straight a line as they.
[01:15:03.120 --> 01:15:12.720] But Romeo and Juliet, if a wall be built between them, do not remain idiotically pressing their faces against its opposite sides, like the magnet and the filings with the card.
[01:15:12.720 --> 01:15:21.120] Romeo soon finds a circuitous way, by scaling the wall or otherwise, of touching Juliet's lips directly.
[01:15:21.120 --> 01:15:23.680] With the filings, the path is fixed.
[01:15:23.680 --> 01:15:26.560] Whether it reaches the end depends on accidents.
[01:15:26.560 --> 01:15:29.280] With the lover, it is the end which is fixed.
[01:15:29.280 --> 01:15:32.280] The path may be modified indefinitely.
[01:15:29.920 --> 01:15:38.520] I like that as an argument for free will, or some form of volition, or degrees of freedom, compared to the iron filings.
[01:15:39.160 --> 01:15:46.600] So notice we've already discussed this when we discussed psychological versus physical causality.
[01:15:46.600 --> 01:15:50.040] This was exactly the distinction that I was pointing to.
[01:15:51.400 --> 01:15:52.040] Right?
[01:15:52.040 --> 01:15:57.400] So psychological causality depends on this counterfactual, right?
[01:15:57.800 --> 01:16:03.400] Would this have happened if the action had not been performed?
[01:16:03.400 --> 01:16:08.680] Whereas physical causality is just a matter of forces being enacted.
[01:16:09.560 --> 01:16:25.560] So the counterfactual view allows for Romeo to say, I would have kissed Juliet, I would have found a way to kiss her, right?
[01:16:27.240 --> 01:16:27.960] Right.
[01:16:27.960 --> 01:16:28.520] Yeah.
[01:16:28.840 --> 01:16:35.400] Precisely because it doesn't specify the process by which you got to her.
[01:16:36.680 --> 01:16:38.840] So are you a compatibilist?
[01:16:39.800 --> 01:16:40.760] Oh, I don't know.
[01:16:40.760 --> 01:16:41.480] You don't know?
[01:16:41.480 --> 01:16:42.360] You don't have to comment.
[01:16:42.760 --> 01:16:47.000] I mean, it's such a dead issue, but it's a live issue.
[01:16:47.000 --> 01:16:51.080] I can't tell you how many books I get every year on free will and determinism here.
[01:16:58.040 --> 01:17:05.720] That tells me something: that it's not a soluble problem in any kind of scientific sense, right?
[01:16:58.040 --> 01:17:05.720] People just stake out their arguments and it's become something of a sacred value, like I have volition, or no, you don't, you're determined.
[01:17:05.720 --> 01:17:06.320] That's right.
[01:17:06.120 --> 01:17:06.800] Yeah.
[01:17:06.680 --> 01:17:06.920] Yeah.
[01:17:06.920 --> 01:17:07.160] Yeah.
[01:17:07.800 --> 01:17:22.480] No, look, I mean, studying Pearl's view of causality has been the most informative, has provided the most information I've managed to find about that issue, right?
[01:17:22.480 --> 01:17:30.400] Because basically what he says is if you want to make good inferences, you have to assume free will.
[01:17:30.400 --> 01:17:30.960] Yeah.
[01:17:30.960 --> 01:17:34.240] Because that's what intervention is, right?
[01:17:34.640 --> 01:17:56.880] So if you understand free will to mean that there's an action that's taking place from outside the system that isn't caused by anything, but determines the value of something inside the system, then that allows inferences that are different than if you're just observing the system directly.
[01:17:57.360 --> 01:17:59.280] You know, I didn't say that very well.
[01:17:59.680 --> 01:18:01.280] No, no, that makes sense.
[01:18:01.520 --> 01:18:03.200] It takes some time to really paint the picture.
[01:18:03.600 --> 01:18:06.320] Judea's book is, I think it's called The Book of Why.
[01:18:06.720 --> 01:18:07.520] Was that what it was?
[01:18:07.840 --> 01:18:10.240] Yeah, so that's the later popular book.
[01:18:10.640 --> 01:18:16.160] These ideas came out of a book called Causality, actually, that he published in 2000.
[01:18:16.560 --> 01:18:30.080] But the basic idea is that to make correct probabilistic inferences, you have to assume something like free will, which doesn't mean that free will actually exists.
[01:18:30.400 --> 01:18:34.880] It just means you have to assume it to make correct inferences.
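A rough simulation sketch of the observation-versus-intervention distinction being described here (the rain/sprinkler model and all of its numbers are invented for illustration; they are not taken from Pearl's books or from the episode):

```python
import random

random.seed(0)

def p_rain_given_sprinkler_on(do_sprinkler=None, n=200_000):
    """Toy causal model: rain influences whether the sprinkler runs.
    Passing do_sprinkler=True forces the sprinkler on from outside the
    system, the 'uncaused' intervention that Pearl's do-operator describes."""
    rain_count = on_count = 0
    for _ in range(n):
        rain = random.random() < 0.3
        if do_sprinkler is None:
            sprinkler = random.random() < (0.1 if rain else 0.6)  # caused by rain
        else:
            sprinkler = do_sprinkler                              # causal link to rain severed
        if sprinkler:
            on_count += 1
            rain_count += rain
    return rain_count / on_count

print("P(rain | observe sprinkler on) ~", round(p_rain_given_sprinkler_on(), 3))      # ~0.07
print("P(rain | do(sprinkler := on))  ~", round(p_rain_given_sprinkler_on(True), 3))  # ~0.30
# Observing the sprinkler on is evidence against rain; intervening to turn it on
# tells you nothing about rain. Different assumptions license different inferences.
```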
[01:18:34.880 --> 01:18:41.280] Well, the point I always make is that I know all the determinists and have read their books, but none of them actually act like it.
[01:18:42.240 --> 01:18:47.200] They don't walk around their lives leading their lives as if they're determined because nobody does.
[01:18:47.520 --> 01:18:48.320] Interesting.
[01:18:48.360 --> 01:18:50.960] Yeah, it's really kind of funny that way.
[01:18:50.960 --> 01:18:51.440] All right.
[01:18:51.440 --> 01:18:57.120] Well, so you're in the end here, the cost of conviction, how our deepest values lead us astray.
[01:18:57.120 --> 01:18:57.600] There it is.
[01:18:57.600 --> 01:19:03.000] You make the case slightly in favor of consequentialism over sacred values.
[01:19:03.960 --> 01:19:06.840] You want to put a final exclamation point on that?
[01:19:07.800 --> 01:19:10.680] Yeah, but descriptively, right?
[01:19:10.680 --> 01:19:11.400] Descriptively, yeah.
[01:19:12.680 --> 01:19:16.120] We rely more on sacred values than we should.
[01:19:16.440 --> 01:19:21.160] I'm not making normative claims about what the right way to make a decision is.
[01:19:21.160 --> 01:19:21.640] Right.
[01:19:22.760 --> 01:19:30.200] By the way, since you talk about Kahneman so much, were you surprised to hear the story that he euthanized himself?
[01:19:30.280 --> 01:19:30.920] Went to Switzerland?
[01:19:31.160 --> 01:19:32.120] I was shocked.
[01:19:32.120 --> 01:19:33.160] Unbelievable.
[01:19:33.480 --> 01:19:33.960] Yeah.
[01:19:34.600 --> 01:19:35.160] Yeah.
[01:19:36.120 --> 01:19:36.760] Yeah.
[01:19:36.760 --> 01:19:47.560] I had just, you know, a couple of months beforehand had a long conversation with the woman he had been living with until that point, and she didn't say a word about it.
[01:19:47.560 --> 01:19:50.600] It was clearly a big secret.
[01:19:50.600 --> 01:19:51.240] Yeah.
[01:19:52.200 --> 01:19:58.600] But yeah, it worries me that he did it for its symbolic value.
[01:19:59.960 --> 01:20:00.520] Right.
[01:20:00.520 --> 01:20:04.120] Because he's the great decision maker in America.
[01:20:04.120 --> 01:20:07.800] And he's sort of teaching people how to make good decisions.
[01:20:08.120 --> 01:20:10.040] But hopefully that's not why.
[01:20:10.360 --> 01:20:16.920] And I, you know, I don't know enough of the details of his state of health to know what he had coming.
[01:20:16.920 --> 01:20:20.120] Well, yeah, apparently he wasn't terminal or anything like that.
[01:20:20.120 --> 01:20:22.200] But, you know, I'm 70.
[01:20:22.200 --> 01:20:27.160] If I was 91 and I could tell things were starting to go, and I'd had a long, good life.
[01:20:27.160 --> 01:20:34.600] It's like, well, you know, what's the point of dragging it out for another 10 years, in which I can't even reason at all?
[01:20:34.600 --> 01:20:35.560] I can't even think.
[01:20:35.960 --> 01:20:37.640] Something like that.
[01:20:37.640 --> 01:20:41.160] That could be a calculation, you know, rational calculation.
[01:20:41.160 --> 01:20:42.040] Absolutely.
[01:20:42.040 --> 01:20:44.800] But I just don't know because I'm not in that position.
[01:20:43.320 --> 01:20:47.200] I don't want to be in that position.
[01:20:44.280 --> 01:20:49.280] All right, Stephen, great book.
[01:20:49.440 --> 01:20:50.480] What are you working on next?
[01:20:50.480 --> 01:20:52.800] What are your research programs these days?
[01:20:53.120 --> 01:20:54.640] Hey, that book hasn't even come out.
[01:20:54.800 --> 01:20:55.360] Oh, that's true.
[01:20:55.360 --> 01:20:55.760] That's right.
[01:20:55.920 --> 01:20:57.040] I shouldn't ask you that question.
[01:20:57.040 --> 01:20:58.320] I shouldn't ask you that question.
[01:20:58.320 --> 01:20:58.720] Okay.
[01:20:58.720 --> 01:20:59.520] You're not doing anything.
[01:20:59.520 --> 01:21:00.080] Oh, that's right.
[01:21:00.080 --> 01:21:00.880] May 20th.
[01:21:00.880 --> 01:21:06.400] Well, we will release this on the pub date of the book, and then you can worry about the future.
[01:21:06.720 --> 01:21:07.600] Thanks so much.
[01:21:07.600 --> 01:21:07.920] All right.
[01:21:08.080 --> 01:21:09.520] Been a real pleasure talking to you.
[01:21:09.520 --> 01:21:10.560] Likewise.