Debug Information
Processing Details
- VTT File: mss528_Michael_Lynch_2025_07_01.vtt
- Processing Time: September 11, 2025 at 03:30 PM
- Total Chunks: 1
- Transcript Length: 74,479 characters
- Caption Count: 762 captions
Prompts Used
Prompt 1: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 1 of 1 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
[00:00:00.560 --> 00:00:04.400] Let's talk about a special end-of-life care called Hospice.
[00:00:04.400 --> 00:00:11.680] For 125 years, Calvary Hospital and Dindhome Hospice has been providing New York with world-renowned hospice care.
[00:00:11.680 --> 00:00:20.240] Calvary's special doctors, nurses, and counselors work as a team, relieving the patient's physical pain and the emotional stress of the family.
[00:00:20.240 --> 00:00:24.480] Visit us 24-7 at CalvaryHospital.org.
[00:00:24.480 --> 00:00:30.000] Calvary, for 125 years, where life continues.
[00:00:30.640 --> 00:00:36.880] This Labor Day at Value City Furniture, get up to 20% off your new living room and more throughout the store.
[00:00:36.880 --> 00:00:38.720] Think you deserve even more?
[00:00:38.720 --> 00:00:39.360] Same.
[00:00:39.360 --> 00:00:43.280] Like combining those savings with an extra 10% off door busters.
[00:00:43.280 --> 00:00:46.400] Plus, no interest financing for up to 40 months.
[00:00:46.400 --> 00:00:49.600] See if you pre-qualify without touching your credit score.
[00:00:49.600 --> 00:00:53.920] Yep, get more at Value City Furniture while paying less.
[00:00:53.920 --> 00:00:56.880] More style, more quality, more value.
[00:00:56.880 --> 00:00:59.440] We are Value City Furniture.
[00:01:03.920 --> 00:01:09.760] You're listening to The Michael Shermer Show.
[00:01:17.120 --> 00:01:20.800] All right, everybody, it's time for another episode of the Michael Shermer Show.
[00:01:20.800 --> 00:01:23.680] Brought to you as always by the Skeptic Society and Skeptic Magazine.
[00:01:23.680 --> 00:01:28.800] Now, my guest today has a book out about truth in politics.
[00:01:28.800 --> 00:01:30.240] I'm very interested in this topic.
[00:01:30.240 --> 00:01:36.080] As many of you know, my next book is called Truth: What It Is, How to Find It, Why It Still Matters.
[00:01:36.080 --> 00:01:39.200] The publisher added still in there because I forgot that.
[00:01:39.200 --> 00:01:40.080] Yeah, that's right.
[00:01:40.080 --> 00:01:42.320] It always mattered, but it still matters.
[00:01:42.560 --> 00:01:48.400] I don't have a chapter on political truths because I realized the book was already too long.
[00:01:48.400 --> 00:01:50.640] But also, that's not really my wheelhouse.
[00:01:50.640 --> 00:01:51.440] And guess what?
[00:01:51.440 --> 00:01:53.280] My guest today wrote a book about that.
[00:01:53.280 --> 00:01:56.560] It's called On Truth in Politics.
[00:01:56.560 --> 00:01:58.000] How about that for good timing?
[00:01:58.000 --> 00:01:59.840] Why democracy demands it?
[00:02:00.120 --> 00:02:02.440] He is Michael Patrick Lynch.
[00:02:02.440 --> 00:02:10.360] He's a provost professor of the humanities and board of trustees, distinguished professor of philosophy at the University of Connecticut.
[00:02:10.360 --> 00:02:10.760] Dr.
[00:02:10.760 --> 00:02:24.920] Lynch is the author or editor of 10 books, including Know-It-All Society, The Internet of Us, Truth as One and Many, and the New York Times Sunday Book Review Editor's Pick, True to Life.
[00:02:24.920 --> 00:02:26.600] Can you tell he's into truth?
[00:02:26.920 --> 00:02:34.280] Lynch has held grants from the National Endowment for the Humanities, the Andrew Mellon Foundation, and the John Templeton Foundation, among others.
[00:02:34.280 --> 00:02:43.880] He's spoken at TED, yay TED, the Nantucket Project, Chautauqua, and South by Southwest, usually with an SXX SW.
[00:02:43.880 --> 00:02:47.400] In 2019, he was awarded the George Orwell Award.
[00:02:47.400 --> 00:02:48.600] How cool is that?
[00:02:48.600 --> 00:02:53.960] Which recognizes writers who have made outstanding contributions to the critical analysis of public discourse.
[00:02:53.960 --> 00:02:59.320] And again, the new book is on truth and politics: why democracy demands it.
[00:02:59.320 --> 00:03:00.360] Michael, nice to see you.
[00:03:00.360 --> 00:03:07.960] I should warn our listeners: you've just recently had surgery on your throat, so your voice is a little off at the moment, but that's okay.
[00:03:07.960 --> 00:03:09.400] We'll work around that.
[00:03:10.360 --> 00:03:12.760] Well, thank you very much, Michael.
[00:03:12.760 --> 00:03:15.320] I'm pleased to be here talking to you.
[00:03:15.320 --> 00:03:19.960] Well, you know, the problem, I guess, let's start off with what you mean by truth.
[00:03:20.360 --> 00:03:22.200] Let me just give a fun example here.
[00:03:22.520 --> 00:03:29.640] I was just toying with the idea: like, what's the right percentage of upper income tax bracket?
[00:03:29.640 --> 00:03:30.680] Well, how should I know?
[00:03:30.680 --> 00:03:32.520] So I asked Grock.
[00:03:32.840 --> 00:03:38.440] And so here's Grok's answer: There is no single right percentage for the upper income tax bracket.
[00:03:38.440 --> 00:03:41.800] It depends on what you're trying to achieve and who you ask.
[00:03:41.800 --> 00:03:50.880] Economists, policymakers, and regular people have been arguing about this forever, and the answer hinges on trade-offs between revenue, fairness, and economic growth.
[00:03:51.200 --> 00:03:54.160] Historically, top marginal tax rates in the U.S.
[00:03:54.160 --> 00:03:55.280] have swung wildly.
[00:03:55.280 --> 00:04:02.800] In the 1950s, they hit 91% for the highest earners under Eisenhower, though loopholes meant few actually paid that.
[00:04:02.800 --> 00:04:13.760] Revenue maxed out at around 7% to 9% of GDP back then, similar to today's levels, despite the rates now topping out at 37% federally, plus state taxes.
[00:04:13.760 --> 00:04:16.560] Like in California, it's 10% here now.
[00:04:16.560 --> 00:04:18.720] Practically, it's a mess of priorities.
[00:04:18.720 --> 00:04:20.320] You want more social programs?
[00:04:20.320 --> 00:04:24.160] Crank it up to 50 to 60%, like some European countries.
[00:04:24.160 --> 00:04:28.160] France is at 45%, Denmark effectively higher with add-ons.
[00:04:28.160 --> 00:04:33.280] Worried about drain-brain or stifling entrepreneurs, keep it closer to 35% to 40%.
[00:04:33.280 --> 00:04:39.600] To me, that tells me I'm using the word like a scientist would, like, you know, is the theory of evolution true?
[00:04:39.600 --> 00:04:40.480] Well, yeah.
[00:04:40.480 --> 00:04:41.920] And here's all the reasons why.
[00:04:41.920 --> 00:04:47.440] I don't think of political truths in the same way, but how should I be thinking about it?
[00:04:48.080 --> 00:04:52.560] I think that most people would agree with you about that.
[00:04:52.880 --> 00:05:02.480] As Anna Arendt famously said, truth and politics are in constant war with one another.
[00:05:02.480 --> 00:05:12.640] And we don't typically think of statements in politics as really being accurate or inaccurate.
[00:05:12.640 --> 00:05:17.280] We think of them being persuasive or not persuasive.
[00:05:17.600 --> 00:05:37.960] But actually, I think democratic practice presupposes, demands, as I say, that we treat at least some judgments we make in politics as being true or false.
[00:05:38.600 --> 00:05:46.200] After all, it's a key part of democracy that we want to speak truth to power.
[00:05:46.520 --> 00:05:50.280] We want to give reasons for ideas.
[00:05:50.280 --> 00:05:56.040] We don't just want to force people to do this or that in a democracy.
[00:05:56.680 --> 00:06:00.280] We want to back up our policies with evidence.
[00:06:00.280 --> 00:06:03.560] These are all things that, Michael, you believe.
[00:06:03.880 --> 00:06:18.360] Well, none of that makes any sense to do that in politics, to have those principles, if you don't think that at least some judgments in politics can be true.
[00:06:18.680 --> 00:06:20.600] And what do I mean by that?
[00:06:20.920 --> 00:06:33.160] Well, the simplest answer is that true judgments are those that fit reality, that accord with reality.
[00:06:33.480 --> 00:06:37.560] The complicated answer is what that means.
[00:06:38.040 --> 00:06:42.440] Every philosopher, as you know, pretty much agrees.
[00:06:42.440 --> 00:06:52.760] Not every philosopher, but most philosophers throughout history have agreed that truth has something to do with reality.
[00:06:52.760 --> 00:06:58.440] True judgments are not just the ones we want to want to be true.
[00:06:58.760 --> 00:07:01.800] You know, wishing doesn't make it so.
[00:07:02.120 --> 00:07:04.360] But, and we all can agree on that.
[00:07:04.360 --> 00:07:06.360] That's not rocket science.
[00:07:06.680 --> 00:07:21.600] But the hard part is to say what it means for something like mathematics to accord reality with reality or politics to accord with reality.
[00:07:22.880 --> 00:07:27.920] In the one case, mathematics, it's very abstract.
[00:07:28.240 --> 00:07:31.120] Are there really numbers out there?
[00:07:31.440 --> 00:07:34.320] Some people say yes, some people say no.
[00:07:34.640 --> 00:07:39.760] But because you might think, how can numbers be physical objects?
[00:07:39.760 --> 00:07:41.120] Math is about numbers.
[00:07:41.760 --> 00:07:44.720] Politics is abstract too.
[00:07:45.360 --> 00:07:53.600] But nobody really seriously says that there aren't truths about mathematics.
[00:07:53.600 --> 00:07:59.840] Funnily enough, we talk ourselves into saying, oh, but politics is different.
[00:07:59.840 --> 00:08:01.840] It's too abstract.
[00:08:01.840 --> 00:08:03.200] Well, guess what?
[00:08:03.200 --> 00:08:05.360] Math is abstract.
[00:08:07.280 --> 00:08:18.080] So I think it's simple to say there are some judgments in politics like climate change soaks.
[00:08:18.080 --> 00:08:20.720] That's a political judgment that people make.
[00:08:21.360 --> 00:08:23.520] I think it's false.
[00:08:23.840 --> 00:08:26.800] But some people think it's true.
[00:08:26.800 --> 00:08:31.760] I, what does it mean for me to say that I think it's false?
[00:08:31.760 --> 00:08:36.720] It means I don't think climate change soaks.
[00:08:37.360 --> 00:08:41.120] So on the simple level, it's pretty straightforward.
[00:08:41.440 --> 00:08:45.440] Yeah, let's use that last example: climate change is a hoax.
[00:08:45.440 --> 00:08:50.880] Okay, that we can debunk pretty clearly, including also the question: is it, you know, real?
[00:08:50.880 --> 00:08:51.920] Is it human caused?
[00:08:51.920 --> 00:08:52.880] First, is it happening?
[00:08:52.880 --> 00:08:54.400] Second, what's happening?
[00:08:54.400 --> 00:08:55.440] Is it human-cause?
[00:08:55.440 --> 00:08:59.640] Third, you know, how much is it going to change over time?
[00:08:59.040 --> 00:09:01.880] And then what's your time horizon?
[00:08:59.280 --> 00:09:05.720] And then, you know, fourth, what will the consequences of the change be?
[00:08:59.520 --> 00:09:08.440] And then five, what should we do about it?
[00:09:08.440 --> 00:09:14.600] Now, it seems to me I'm shifting there from empirical objective truths to political truths, all right?
[00:09:14.600 --> 00:09:25.160] I mean, if we're going to spend, you know, a trillion dollars in the next century to reduce carbon emissions because of these effects, you know, politically or economically, we might have a legitimate debate.
[00:09:25.160 --> 00:09:33.720] Like, well, maybe we'd be better off spending the money on mosquito nets or, you know, vaccines for poor people in Africa or we'd save more lives.
[00:09:33.720 --> 00:09:36.440] And, you know, those are trade-off type questions.
[00:09:36.440 --> 00:09:40.520] You know, so there, I think, that's an example of what you're talking about.
[00:09:40.840 --> 00:09:41.640] Yeah.
[00:09:41.960 --> 00:09:58.840] So what I say in the book is that there's two senses of political judgment that people have in their head when they're talking about whether something's a political judgment or not.
[00:09:59.160 --> 00:10:08.040] In the narrow sense, in the technical sense, a political judgment is about what society ought to do.
[00:10:08.040 --> 00:10:13.240] So, should we enact climate change legislation or not?
[00:10:13.880 --> 00:10:17.720] That's what we might call a pure political judgment.
[00:10:17.720 --> 00:10:21.080] It's about what society ought to do.
[00:10:21.400 --> 00:10:35.720] But as you and I both know, and as you just illustrated, in real political life, that is, in real politics, everything gets politicized.
[00:10:36.040 --> 00:10:46.080] So, as we both know, and that's the reason I make this example, climate change science is heavily politicized.
[00:10:44.840 --> 00:10:48.160] Now, I don't like that.
[00:10:48.800 --> 00:10:52.880] My friends who are climate scientists don't like that.
[00:10:52.880 --> 00:10:55.040] I don't think you like that.
[00:10:55.040 --> 00:10:57.200] But it is what it is.
[00:10:57.840 --> 00:11:17.600] Some judgments in our life about the weather nowadays, about coffee, about inflation, about bridges, pretty much anything you name, become political in the wide sense.
[00:11:17.680 --> 00:11:21.920] By that I mean they come to have political meaning.
[00:11:21.920 --> 00:11:27.760] People put political associations next to those sorts of claims.
[00:11:28.160 --> 00:11:33.680] Strictly speaking, what makes those claims true is reality.
[00:11:34.000 --> 00:11:36.160] They're not.
[00:11:36.160 --> 00:11:37.440] They're content.
[00:11:37.440 --> 00:11:42.000] What they're about is, let's say, climate change science.
[00:11:43.600 --> 00:11:49.280] But that doesn't stop people from treating them as political.
[00:11:49.600 --> 00:11:57.360] And that's what complicates my job in this book and your job in thinking about truth in general.
[00:11:57.360 --> 00:11:58.000] Yeah.
[00:11:58.720 --> 00:12:05.680] Yeah, again, I think the climate change thing got politicized around the time of Al Gore's film, An Inconvenient Truth.
[00:12:05.680 --> 00:12:08.960] This isn't my observation, somebody else that I think made it.
[00:12:08.960 --> 00:12:10.160] I think it's right.
[00:12:10.560 --> 00:12:29.720] Therefore, because of the popularity of his film, and he was the vice president of the Democratic Party, you know, it got politicized and bundled such that, you know, when conservatives hear climate change or anthropogenic global warming, their brain auto-corrects to anti-capitalism, anti-business, you know, anti-America, whatever.
[00:12:29.720 --> 00:12:35.400] You know, and we're no longer talking about CO2 levels and shrinking glaciers or anything like that.
[00:12:35.400 --> 00:12:36.840] That's all irrelevant.
[00:12:36.840 --> 00:12:41.160] Well, yeah, the question is: is all in that solution number five?
[00:12:41.160 --> 00:12:43.720] You know, so the solution is step number five.
[00:12:43.720 --> 00:12:45.000] What should we do about it?
[00:12:45.000 --> 00:12:53.160] You know, and then they morph into, well, then it can't be real because if it's real and bad, then we probably should do something, you know, something like that.
[00:12:53.160 --> 00:12:54.840] So I think that's an example.
[00:12:55.880 --> 00:13:06.440] And, you know, since you're a philosopher, you know, the whole is-ought problem, we can describe the way things are, or is, the way something is, and what we ought to do about it.
[00:13:06.440 --> 00:13:12.280] But what I want to suggest is that actually we do cross that boundary quite often.
[00:13:12.600 --> 00:13:23.160] I mean, it's we can have a legitimate debate what's the proper upper income tax percentage because we want we it depends on the goals you want to accomplish.
[00:13:23.160 --> 00:13:38.120] Okay, but once you've decided we want a social safety net of this amount so that nobody falls through the cracks, universal health care, Medicare, Social Security, and so on, and so forth, and that ends up being at like 20% GDP or something like that.
[00:13:38.360 --> 00:13:41.160] Okay, now that we've set that goal, okay, how are we going to raise the money?
[00:13:41.160 --> 00:13:43.640] Then we can have a, you know, more of an empirical.
[00:13:43.640 --> 00:13:46.040] But how did we get to that first question?
[00:13:46.360 --> 00:13:54.360] Why is it we should take care of those who take care of themselves, the homeless or mentally ill or handicapped, whatever it is?
[00:13:54.360 --> 00:13:55.800] That's a harder one, right?
[00:13:55.800 --> 00:13:58.360] In terms of what's the right thing to do.
[00:13:59.000 --> 00:13:59.880] Yeah.
[00:13:59.880 --> 00:14:04.280] Well, what's just put your finger on very adroitly.
[00:14:04.280 --> 00:14:09.560] Is that when it comes to questions of pure politics?
[00:14:09.880 --> 00:14:13.960] We're talking about what philosophers call normative questions.
[00:14:14.640 --> 00:14:15.000] Questions.
[00:14:15.680 --> 00:14:18.080] What should society do?
[00:14:18.400 --> 00:14:21.200] What sorts of goals should we have?
[00:14:22.480 --> 00:14:36.960] So, when we get to questions about what sort of goals, political goals we ought to have, should we, for example, have free and fair elections?
[00:14:37.280 --> 00:14:39.760] That's actually a political question.
[00:14:40.080 --> 00:14:44.240] Should we have due process under the law?
[00:14:44.880 --> 00:14:46.880] That's a political question.
[00:14:47.200 --> 00:14:52.160] It's one that's, I think, now openly debated.
[00:14:52.480 --> 00:15:00.720] Are all citizens and are all people in the US guaranteed due process?
[00:15:01.360 --> 00:15:03.360] That's being debated.
[00:15:04.640 --> 00:15:08.080] How do we arrive at our decisions about those things?
[00:15:09.360 --> 00:15:17.280] Well, in part, it depends on what sort of prior commitments you have.
[00:15:18.240 --> 00:15:23.040] If, like me, you have democratic prior commitments.
[00:15:23.040 --> 00:15:24.160] Commitment.
[00:15:24.160 --> 00:15:31.760] Commitments, for example, to treating people as free and equal.
[00:15:31.760 --> 00:15:39.520] To giving reasons to them for your views because you treat them with respect.
[00:15:39.840 --> 00:15:57.760] If that's the sort of decision-making process you're interested in, then gathering evidence for your views, treating them as being able to be right or wrong, true or false, goes with the daredevil.
[00:15:58.080 --> 00:16:00.000] But there's other types of politics out there.
[00:16:01.800 --> 00:16:11.320] There's politics that tell us that the right way to solve these problems is to ask a set of experts.
[00:16:11.320 --> 00:16:13.720] The right way to solve these problems.
[00:16:13.720 --> 00:16:15.720] Just cite another example.
[00:16:15.720 --> 00:16:17.720] Is to ask the great leader.
[00:16:17.720 --> 00:16:19.480] North Korea.
[00:16:20.120 --> 00:16:24.680] The leader decides what the calls are.
[00:16:25.320 --> 00:16:28.280] And in autocracies, that's often the case.
[00:16:30.040 --> 00:16:34.760] So, push your question back.
[00:16:35.080 --> 00:16:53.480] We get to the point that what explains why it's true that we should be engaging in democratic politics rather than authoritarian politics.
[00:16:55.320 --> 00:17:00.520] That's a question I think all of us should be asking right now.
[00:17:00.520 --> 00:17:01.240] Yeah.
[00:17:01.800 --> 00:17:03.880] Michael, we're getting a ping.
[00:17:03.880 --> 00:17:06.840] Is that your email alerting you to emails?
[00:17:06.840 --> 00:17:08.040] Maybe turn that off.
[00:17:08.760 --> 00:17:09.480] Yeah, just shut off.
[00:17:10.200 --> 00:17:11.080] I had the same problem too.
[00:17:11.080 --> 00:17:12.440] Mine was pinging too.
[00:17:12.440 --> 00:17:22.360] Yeah, so I'm going to come back to that in a minute, but I'm going to give a couple more examples of this problem of determining truth in politics.
[00:17:22.360 --> 00:17:30.040] And so this is a debate I had in the pages of Skeptic with Lee McIntyre, the philosopher who wrote a book on post-truth.
[00:17:30.360 --> 00:17:37.800] He used the following exchange as an example of what he thinks is post-truth, but I think is maybe is a more of a political truth.
[00:17:37.800 --> 00:17:38.840] And you tell me what you think.
[00:17:38.840 --> 00:17:40.600] I'm not sure what the right answer is.
[00:17:40.600 --> 00:17:51.040] This is a 2016 exchange between CNN's Allison Camarota and Republican Speaker of the House, former Speaker of the House, Newt Gingrich, on crime rates.
[00:17:51.360 --> 00:17:54.640] I'm not sure when in 2016 this was at, you know, mid-year or so.
[00:17:54.640 --> 00:17:58.720] Anyway, Camerotta says from CNN, violent crime is down.
[00:17:58.720 --> 00:18:00.560] The economy is ticking up.
[00:18:00.560 --> 00:18:04.000] And Newt says, it's not down in the biggest cities.
[00:18:04.000 --> 00:18:06.880] And she says, violent crime, murder rate is down.
[00:18:06.880 --> 00:18:07.920] It's down.
[00:18:08.240 --> 00:18:13.760] And then Newt says, how come it's up in Chicago and up in Baltimore and up in Washington, D.C.?
[00:18:13.760 --> 00:18:18.960] Camera says, those are pockets where certainly we're not tackling murder.
[00:18:18.960 --> 00:18:22.320] And Gindrich says, your national capital, your third biggest city.
[00:18:22.320 --> 00:18:25.600] And Cameron says, but violent crime across the country is down.
[00:18:25.600 --> 00:18:31.920] And he says, the average American, I will bet you this morning, does not think crime is down, does not think they're safer.
[00:18:31.920 --> 00:18:33.360] And she says, but it is.
[00:18:33.360 --> 00:18:34.800] We're safer and it's down.
[00:18:34.800 --> 00:18:37.840] And Geindridge says, no, that's just your view.
[00:18:37.840 --> 00:18:39.600] And Camera says, it's a fact.
[00:18:39.600 --> 00:18:41.840] These are national FBI facts.
[00:18:41.840 --> 00:18:44.800] And Geindrus says, but what I said is also a fact.
[00:18:44.800 --> 00:18:52.320] The current view is that liberals have a whole set of statistics that theoretically may be right, but it's not where human beings are.
[00:18:52.320 --> 00:18:53.600] People are frightened.
[00:18:53.600 --> 00:18:57.200] And then she says, but what you're saying is, but hold on, Mr.
[00:18:57.200 --> 00:19:00.400] Speaker, because you're saying liberals use these numbers.
[00:19:00.400 --> 00:19:04.560] They use this sort of magic math, but these are FBI statistics.
[00:19:04.640 --> 00:19:06.720] FBI is not a liberal organization.
[00:19:06.720 --> 00:19:08.800] It's a crime-fighting organization.
[00:19:08.800 --> 00:19:12.080] And Geindrich says, no, but what I said is equally true.
[00:19:12.080 --> 00:19:14.160] People feel more threatened.
[00:19:14.160 --> 00:19:18.400] And she says, feel yes, but the facts don't support it.
[00:19:18.400 --> 00:19:24.400] And he says, as a political candidate, I'll go with how people feel and let you go with the theoreticians.
[00:19:25.040 --> 00:19:29.360] So, I mean, is he just making the case that, look, lady, I got to win elections.
[00:19:29.360 --> 00:19:32.680] And, you know, when the people are frightened, they're going to vote for our side.
[00:19:32.680 --> 00:19:34.440] And I didn't actually lie.
[00:19:29.760 --> 00:19:38.120] I mean, the crime rate did tick up in Chicago and Baltimore and so on.
[00:19:38.120 --> 00:19:42.760] I know the whole average is down, but the perception is what I'm after, not the facts.
[00:19:42.760 --> 00:19:44.360] Anyway, your thoughts?
[00:19:45.320 --> 00:20:00.760] I think that conversation is a beautiful example of why politics as a practice makes it very difficult to get any truth in politics.
[00:20:01.720 --> 00:20:11.480] The point of my book is to really try to grapple with a kind of paradox that this conversation illustrates.
[00:20:11.480 --> 00:20:14.520] I mean, the one you just read, not ours.
[00:20:16.280 --> 00:20:20.200] And that paradox is democratic politics.
[00:20:20.200 --> 00:20:33.160] Politics that favors deliberation between free and equal people, not democratic politics in terms of the party, but a certain type of practice of democracy.
[00:20:33.160 --> 00:20:37.320] In other words, is a kind of politics.
[00:20:37.640 --> 00:20:46.360] But every kind of politics is an engine that generates political meanings.
[00:20:46.360 --> 00:20:51.080] It generates political fog.
[00:20:51.720 --> 00:21:03.480] So just to practice politics means that you're paying attention to things like the associations people make with words like climate change.
[00:21:03.800 --> 00:21:14.200] You're a bad politician in terms of getting elected if you're not sensitive to how people interpret the words you use.
[00:21:14.200 --> 00:21:20.880] And you're a bad politician if you're not sensitive to people's emotions.
[00:21:20.880 --> 00:21:31.280] Because those emotions drive the associations they make with words, things, symbols, everything.
[00:21:32.880 --> 00:21:53.520] So, to practice politics paradoxically is to engage in an activity that, by its very nature, makes it difficult to get to a rational discussion.
[00:21:53.520 --> 00:22:02.000] And the paradox of democratic politics, the practice of democracy, is it demands.
[00:22:02.000 --> 00:22:08.640] Democracy requires us to use reasons and evidence.
[00:22:08.960 --> 00:22:09.680] Why?
[00:22:10.000 --> 00:22:19.440] Because we can't respect other people as free and equal if we don't give them reasons for the policies we're enacting.
[00:22:20.080 --> 00:22:30.240] You can't make people do things, pay taxes, you know, or any other thing, obey the laws, without giving them some reasons.
[00:22:30.560 --> 00:22:38.400] If, that is, you're treating them as free and equal citizens, like we do in democracy.
[00:22:39.840 --> 00:22:44.400] So, democratic politics is politics.
[00:22:44.400 --> 00:22:49.040] Politics gets in the way of reason and truth.
[00:22:49.040 --> 00:22:50.480] How do you combine them?
[00:22:50.720 --> 00:22:51.920] That's the trick.
[00:22:51.920 --> 00:22:52.720] Yeah.
[00:22:52.720 --> 00:22:53.360] Yeah.
[00:22:54.880 --> 00:22:56.160] Okay, another example.
[00:22:56.160 --> 00:23:08.440] So, in science, we aim to arrive at the truth through a peer review system, you know, a blind peer review.
[00:23:08.840 --> 00:23:17.400] You go to conferences, you have colleagues and peers you bounce your ideas off of, and you have expertise and degrees and so on.
[00:23:17.400 --> 00:23:20.760] And not that outsiders can't make contributions, they do.
[00:23:21.000 --> 00:23:25.800] But, you know, there's kind of a system set up over the last several centuries.
[00:23:25.800 --> 00:23:30.600] In the legal system, it's an adversarial system.
[00:23:30.600 --> 00:23:37.560] We have two lawyers, and the peer review is the jury or the judge that hears both cases.
[00:23:38.200 --> 00:23:40.680] And each lawyer is not trying to get to the truth.
[00:23:40.680 --> 00:23:42.360] They're just trying to win.
[00:23:42.360 --> 00:23:47.880] And hopefully, the truth will kind of fall there in the middle, and the jury or judge will make the right decision.
[00:23:47.880 --> 00:23:48.840] They don't always.
[00:23:48.840 --> 00:23:57.720] In journalism, we have fact-checking and editors and editors on top of editors, and an editor-in-chief, and then the owner of the newspaper with the final say or whatever.
[00:23:58.200 --> 00:24:08.440] In politics, I guess, is that the idea with a democracy, the public is the peer review system, and the elections are like the experiments.
[00:24:08.600 --> 00:24:10.120] Something like that?
[00:24:12.360 --> 00:24:14.200] That's one view.
[00:24:14.200 --> 00:24:21.160] I don't think of truth and democracy as being decided by vote.
[00:24:21.960 --> 00:24:22.520] Interesting.
[00:24:22.520 --> 00:24:23.240] Some people do.
[00:24:23.240 --> 00:24:23.880] Yeah.
[00:24:23.880 --> 00:24:28.200] Some people think that's what democratic truth means.
[00:24:28.840 --> 00:24:37.320] The majority determines not just what we ought to do, but what's what's true about what we ought to do.
[00:24:37.320 --> 00:24:37.720] Yeah.
[00:24:39.000 --> 00:24:43.720] I think that's conflating two things.
[00:24:43.720 --> 00:24:53.600] In a democracy, political authority is determined, roughly speaking, or generalizing, simplifying.
[00:24:53.600 --> 00:25:02.800] Roughly speaking, by Democrat votes, political authority is the authority for the president to enact.
[00:25:06.960 --> 00:25:15.440] Political authority and what we might call epistemic authority, authority of knowledge, are different things.
[00:25:15.760 --> 00:25:17.760] Plato thought they went together.
[00:25:18.160 --> 00:25:22.880] Plato thought, what gives you political authority is that you know things.
[00:25:23.200 --> 00:25:29.200] So just because you know something means you should be the person that ought to decide.
[00:25:29.200 --> 00:25:32.560] That's tempting, but I think that's a fallacy.
[00:25:33.600 --> 00:25:49.280] Because just because I'm a scientific expert doesn't mean that I have the ability to order you to do what I want you to do.
[00:25:51.840 --> 00:26:07.760] The authority that someone gets, let's say, as a ship captain, to order the crew to do what they do, is given by an outside principle.
[00:26:07.760 --> 00:26:13.040] A principle about on ships, captains are in charge.
[00:26:13.360 --> 00:26:19.520] Or, you know, the president gets to make decisions about the military.
[00:26:19.520 --> 00:26:21.520] He's the commander-in-chief.
[00:26:22.160 --> 00:26:28.400] Well, that principle is what's giving the commander-in-chief political authority.
[00:26:29.360 --> 00:26:34.760] But so we shouldn't confuse those two types of authority.
[00:26:34.760 --> 00:26:42.040] Because, look, I don't have to, I mean, this is pretty obvious, but sometimes it's important to state the obvious.
[00:26:42.040 --> 00:26:42.680] Yeah.
[00:26:43.320 --> 00:26:49.240] Just because people vote on something duh doesn't make us true.
[00:26:49.240 --> 00:27:02.360] People could vote and have voted on the issue of slavery, for example, and could vote for saying slavery is a good thing.
[00:27:02.520 --> 00:27:04.440] Let's have slavery.
[00:27:04.760 --> 00:27:10.440] Well, just because society votes on that doesn't make that true.
[00:27:11.080 --> 00:27:11.560] Okay?
[00:27:12.440 --> 00:27:12.920] Yeah.
[00:27:12.920 --> 00:27:13.640] Why?
[00:27:13.640 --> 00:27:18.760] Because it's not supported by coherent reasons.
[00:27:19.720 --> 00:27:33.400] It's not, it may be partly based on mistaken views about the human beings, about some beings, human beings being less intelligent than others.
[00:27:33.400 --> 00:27:36.760] Those are just factually incorrect.
[00:27:37.080 --> 00:27:41.560] But it also could be based on bad moral reasoning.
[00:27:42.520 --> 00:27:46.680] So just because people vote on something doesn't make us true.
[00:27:47.000 --> 00:27:57.880] And sadly, just because something's true doesn't mean people are going to vote on that basis.
[00:27:57.880 --> 00:27:58.360] Yeah.
[00:27:59.000 --> 00:28:02.920] As we know with things like climate change and vaccines.
[00:28:02.920 --> 00:28:06.120] So let's not equate the two.
[00:28:06.120 --> 00:28:11.640] Truth and democracy isn't about arriving at things at vote.
[00:28:11.640 --> 00:28:15.000] Political authority we arrive at by voting.
[00:28:15.040 --> 00:28:18.640] Is that what you mean by the phrase speak truth to power?
[00:28:18.640 --> 00:28:20.400] Or what does that mean?
[00:28:21.680 --> 00:28:29.440] I think when we speak truth to power, when we think of that as a value, I mean, that's an old civil rights phrase.
[00:28:30.080 --> 00:28:56.640] And what it meant then is what I mean by it now, which is in democracy, we think it's important for people to have the freedom to criticize those that are in power, but also to be able to speak those truths that are often obscured by, let's say, the majority.
[00:28:56.640 --> 00:29:14.160] So the minority, we think in a democracy, needs to be able to have, as it were, the room, the space to articulate truths about their experience that the majority may be overlooking.
[00:29:14.480 --> 00:29:19.360] So when we talk about truth to power, we're talking about both those things.
[00:29:19.680 --> 00:29:28.080] And in both cases, that value that we have, democracy, people have it on the right and the left.
[00:29:28.080 --> 00:29:34.160] It's not a progressive thing or a conservative thing.
[00:29:34.480 --> 00:29:36.080] But it is a democratic thing.
[00:29:37.040 --> 00:29:40.480] And it's a combination of those two ideas.
[00:29:40.480 --> 00:29:41.040] Yeah.
[00:29:42.160 --> 00:29:43.840] Your slavery example is good.
[00:29:43.840 --> 00:29:45.520] You know, Lincoln made this point.
[00:29:45.520 --> 00:29:59.280] You know, if you decide you're going to set up a system in which more or less, in which intelligence or smarts determines who gets to be a slave, then be careful because the first person you meet that's smarter than you can enslave you, right?
[00:29:59.400 --> 00:30:03.880] Or skin color or any other criteria like that.
[00:30:03.880 --> 00:30:22.120] In a way, I guess it's an argument between utilitarian ethics, consequentialism, and deontological rights, let's say, ethics in which it doesn't matter how many people think or what arguments are made why it would be better for society that slavery was reinstituted.
[00:30:22.120 --> 00:30:23.240] It doesn't matter.
[00:30:23.240 --> 00:30:24.280] It's just wrong.
[00:30:24.280 --> 00:30:25.720] Well, why is it wrong?
[00:30:25.720 --> 00:30:29.880] You know, so then we have to drill down into rights and where do they come from?
[00:30:29.880 --> 00:30:33.640] Is it just, like Bentham said, you know, nonsense on stilts?
[00:30:33.640 --> 00:30:35.960] Natural rights is nonsense on stilts.
[00:30:35.960 --> 00:30:47.000] Or is there something even deeper about it that the rights themselves have evolved because of the consequences for society?
[00:30:47.000 --> 00:30:49.000] This is a utilitarian argument.
[00:30:49.000 --> 00:30:57.320] Society is really better if everybody is treated equally, if everyone's equal under the law, if everyone has the same rights and so on.
[00:30:57.320 --> 00:31:05.880] So it's not like we're just plucking it out of thin air, like, oh, I don't know, just today randomly, let's just say everybody has equal rights and maybe we'll change our mind tomorrow.
[00:31:06.920 --> 00:31:09.640] You know, that it's really better for society.
[00:31:09.640 --> 00:31:24.920] And that can you make the argument that over centuries we have evolved in a certain direction that everyone is a member of the human race but born with natural rights.
[00:31:24.920 --> 00:31:29.240] We just call it natural rights, call it whatever you want, but here are the consequences of it.
[00:31:29.240 --> 00:31:31.400] We're going to have these laws and so on.
[00:31:31.400 --> 00:31:38.760] And so we have the Bill of Rights and amendments that can't be overturned willy-nilly, you know, takes two-thirds of the states and so on and so forth.
[00:31:39.080 --> 00:31:44.680] Those are all in place because of those kind of consequential type utilitarian arguments that somebody could make.
[00:31:44.800 --> 00:31:51.440] You know, I think Ford was built on the belief that the world doesn't get to decide what you're capable of.
[00:31:51.440 --> 00:31:52.480] You do.
[00:31:52.480 --> 00:31:56.000] So ask yourself: can you or can't you?
[00:31:56.000 --> 00:32:00.720] Can you load up a Ford F-150 and build your dream with sweat and steel?
[00:32:00.720 --> 00:32:04.480] Can you chase thrills and conquer curves in a Mustang?
[00:32:04.480 --> 00:32:08.960] Can you take a Bronco to where the map ends and adventure begins?
[00:32:08.960 --> 00:32:12.960] Whether you think you can or think you can't, you're right.
[00:32:12.960 --> 00:32:15.600] Ready, set, Ford.
[00:32:15.600 --> 00:32:17.280] Society's better if women don't have the vote.
[00:32:17.280 --> 00:32:18.080] Let's take it away.
[00:32:18.080 --> 00:32:19.920] No, that's not going to happen.
[00:32:21.200 --> 00:32:21.840] Yeah.
[00:32:22.160 --> 00:32:30.560] So, obviously, we're not going to solve the age-old debate between consequentialism or utilitarianism.
[00:32:30.640 --> 00:32:31.040] Oh, let's go.
[00:32:31.040 --> 00:32:31.520] Let's solve it.
[00:32:32.000 --> 00:32:33.280] Let's go for it.
[00:32:35.360 --> 00:32:41.360] What I think is important about what you're saying, there's a lot of important things.
[00:32:41.680 --> 00:32:51.760] And I'm sympathetic, at least on Monday, Wednesdays, and Fridays, with your sort of consequentialist justification of human rights.
[00:32:54.320 --> 00:33:03.520] But putting that aside for a minute, what I think you're getting at is a point that I'm freaking to make in the book.
[00:33:03.840 --> 00:33:15.680] Which is when it comes to these political principles, like slavery is undemocratic, or human rights are politically important.
[00:33:16.000 --> 00:33:18.640] Those are principles.
[00:33:18.640 --> 00:33:21.280] Just like human rights are principles.
[00:33:21.280 --> 00:33:28.560] What makes those principles, those political principles, true, if they are true?
[00:33:29.200 --> 00:33:53.240] Well, the founder of pragmatism, who had his own sympathies with utilitarianism, Charles Sanders Burrs, said, What matters when it comes to truth is whether the judgment in question, the proposition in question, let's say slavery is wrong.
[00:33:53.560 --> 00:34:02.040] Is that a proposition that will survive the fires of experience?
[00:34:02.360 --> 00:34:21.480] Will at last will stand up to the evidence and to reason and to scrutiny, to the consequences, as you were putting it, over time, into the future.
[00:34:21.800 --> 00:34:30.120] It's hard to know, of course, which of the propositions we make in politics are going to be able to do that.
[00:34:30.120 --> 00:34:33.320] I mean, some we can know, like the slavery one.
[00:34:33.320 --> 00:34:35.400] It seems pretty easy.
[00:34:35.400 --> 00:34:43.480] But a lot of the claims we make, we're hoping they're going to stand up to scrutiny.
[00:34:43.800 --> 00:34:44.760] But we don't know.
[00:34:44.760 --> 00:34:45.960] That's what makes it hard.
[00:34:45.960 --> 00:34:48.040] That's what makes politics hard.
[00:34:48.280 --> 00:34:53.240] That's what makes finding truth and morality hard.
[00:34:53.560 --> 00:35:03.000] But just because it's hard doesn't mean there aren't some propositions that are going to stand up to the consequences and some that are not.
[00:35:03.000 --> 00:35:03.400] Yeah.
[00:35:05.080 --> 00:35:11.400] When we come to truth and politics, what we mean is what Purse meant.
[00:35:11.720 --> 00:35:18.320] We want our ideas to stand up over scrutiny, over time.
[00:35:23.680 --> 00:35:24.880] Ages.
[00:35:24.880 --> 00:35:29.280] We want them to last and be rational to the end.
[00:35:29.920 --> 00:35:39.760] Would a test of that truth be something like a Rawlsian thought experiment in which you don't know which group you're going to be in when you pass this law?
[00:35:39.760 --> 00:35:46.240] Therefore, you pass it to make it sure that everybody's treated equally, regardless of which group they happen to be born in.
[00:35:46.240 --> 00:35:51.200] That might be kind of a thought experiment to run as a form of reason.
[00:35:52.160 --> 00:35:53.680] That's a great point, Michael.
[00:35:53.680 --> 00:35:56.560] I'm really glad you brought that up.
[00:35:56.560 --> 00:35:57.840] I think so.
[00:35:57.840 --> 00:36:15.840] Now, of course, that thought experiment is hotly debated, to put it mildly, because some people are going to say, well, reasonably enough, how can we make decisions if we don't know who we are or who we're going to be?
[00:36:16.480 --> 00:36:44.800] But in a previous book, book all in praise of reason, I took that thought experiment and said, look, one way of trying to decide what the sort of right methods of evidence we can use in politics and everything else to decide on the questions at hand.
[00:36:45.440 --> 00:36:54.720] How do we decide debates over whether science is the right way to make decisions about vaccines or not?
[00:36:55.680 --> 00:37:01.400] Well, ask yourself the Rawsing question that you've just asked.
[00:37:01.720 --> 00:37:06.680] If you didn't know who you were going to be, sorry about that.
[00:37:06.680 --> 00:37:07.400] That's all right.
[00:37:07.400 --> 00:37:22.840] Then, if you didn't know who you were going to be, or whether, for example, you were going to be in charge, in power, or just an elementary school kid.
[00:37:23.480 --> 00:37:32.840] What principles would you want people to be using to decide on what sorts of things should be done?
[00:37:33.160 --> 00:37:41.800] Would you like scientific principles that at least are, in principle, open to everyone?
[00:37:41.800 --> 00:37:49.080] They use sense perception and reasoning, things we all share in common.
[00:37:49.400 --> 00:37:52.280] Or do you want the priests to decide?
[00:37:52.600 --> 00:38:01.800] The only people to decide, where you might not know whether you're the only person in the future society.
[00:38:02.440 --> 00:38:22.360] So I think what you're right about is in politics, we, of course, are going to use tests like that to decide again on which principles we're going to use to assess the kinds of things we think we ought to do.
[00:38:25.160 --> 00:38:33.320] Let me use this analogy from Steven Pinker with moral values analogous to platonic truths.
[00:38:33.320 --> 00:38:46.560] On this analogy, we're born with a rudimentary concept of number, but as soon as we build on it with formal mathematical reasoning, the nature of mathematical reality forces us to discover some truths, but not others.
[00:38:46.560 --> 00:38:54.800] No one who understands the concept of two, the concept of four, and the concept of addition can come to any conclusion but that two plus two equals four.
[00:38:54.800 --> 00:39:07.680] Perhaps we're born with a rudimentary moral sense, and as soon as we build on it with moral reasoning, the nature of moral reality forces us to some conclusions, but not others, like, you know, would you rather live in North Korea or South Korea?
[00:39:07.680 --> 00:39:09.360] Do you want to be a slave or not?
[00:39:09.360 --> 00:39:10.480] And so on.
[00:39:10.480 --> 00:39:24.800] And so Pinker calls this the principle of interchangeable perspectives, goes by other names: Kant's categorical imperative, Rawls' veil of ignorance, the social contract of Hobbes, Rousseau, and Locke, Spinoza's viewpoint from eternity, and so on.
[00:39:24.800 --> 00:39:40.480] If I appeal to you to do anything that affects me, to get off my foot or tell me the time or not run me over with your car, then I can't do it in a way that privileges my interest over yours, say retaining my right to run you over with my car if I want you to take me seriously.
[00:39:40.480 --> 00:39:45.520] Unless I'm Galactic Overlord, I have to state my case in a way that would force me to treat you in kind.
[00:39:45.840 --> 00:39:57.360] I can't act as if my interests are special just because I'm me and you're not, any more than I can persuade you that the spot I'm standing on is a special place in the universe just because I happen to be standing on it.
[00:39:58.000 --> 00:39:58.400] Right?
[00:39:58.400 --> 00:40:10.000] So Pinker's point, you know, is that there's been moral progress because we've all kind of discovered this as a principle, but that can only happen in democracy where people can actually state their voices.
[00:40:10.000 --> 00:40:11.520] Exactly.
[00:40:11.840 --> 00:40:17.360] And that's much of what I've been trying to say over the last couple of years.
[00:40:18.320 --> 00:40:20.080] Am I writing on this?
[00:40:20.080 --> 00:40:21.600] Which is that?
[00:40:22.240 --> 00:40:36.360] Look, as I say in the book, right now, we're facing a sort of double crisis, a double crisis of faith around the world, not just in the United States.
[00:40:36.680 --> 00:40:40.520] A lot of people are losing faith in democracy.
[00:40:40.840 --> 00:40:43.080] It's not a coincidence.
[00:40:43.080 --> 00:40:49.960] They're also losing faith in concepts like truth, rationality, evidence.
[00:40:50.280 --> 00:40:52.760] Why is that not a coincidence?
[00:40:53.720 --> 00:41:06.360] Because democracy is the kind of politics that demands that we use evidence and truth without thinking about what we ought to do.
[00:41:06.360 --> 00:41:18.920] We're thinking about morality and morality insofar as we practice it based on the principle that you were just talking about.
[00:41:19.560 --> 00:41:20.840] Call it what you will.
[00:41:21.160 --> 00:41:28.920] Do you go so far as to say we're living in a post-truth world or a world of alternative facts or whatever?
[00:41:28.920 --> 00:41:30.840] Look, lots of people say those things.
[00:41:31.560 --> 00:41:37.640] I avoid saying those things because, well, I can make sense of them.
[00:41:37.640 --> 00:41:40.040] I mean, I know what people are getting at.
[00:41:40.360 --> 00:41:42.120] I mean, everybody does.
[00:41:42.440 --> 00:41:44.680] It's not particularly precise.
[00:41:44.680 --> 00:41:50.040] Because, look, you never really boast truth.
[00:41:50.360 --> 00:41:52.360] Truth is what it is.
[00:41:53.000 --> 00:41:59.160] It's propositions either stand up to rational scrutiny or they don't.
[00:41:59.480 --> 00:42:10.560] And while there might be cases where we'll agree that we'll never know what's going to be true.
[00:42:10.560 --> 00:42:17.360] Or a proposition is too vague to determine its truth.
[00:42:18.640 --> 00:42:26.240] To say that we're both truth is to sort of use words in an abstracting way.
[00:42:26.880 --> 00:42:33.360] That being said, it's clear that we're in a crisis of faith about truth.
[00:42:34.000 --> 00:42:48.800] Lots of people seem to think that truth is found by looking on when people say it on TikTok.
[00:42:49.440 --> 00:42:55.280] By truth is correlated with how many likes you get online.
[00:42:56.880 --> 00:43:06.720] That's a sort of crisis of faith in what I mean when I talk about truth, which is paying a sting to reasons.
[00:43:06.720 --> 00:43:26.880] Now look, the people that often fly truth's banner, who wave it the loudest, the people who call like our president, the social media platform they own truth.
[00:43:26.880 --> 00:43:33.920] Those people, just like the people that wave democracy's banner, the loudest.
[00:43:36.160 --> 00:43:45.840] Those people both wave a banner, use a word, but actually don't follow the idea.
[00:43:46.160 --> 00:43:48.960] So we've got to be careful.
[00:43:48.960 --> 00:43:50.960] Lots of people talk about truth.
[00:43:50.960 --> 00:43:54.120] Lots of people talk about democracy.
[00:43:54.120 --> 00:43:55.840] And don't believe in either one.
[00:43:55.840 --> 00:44:02.520] You mean countries that countries that put the word democratic in the title of their country means they're not actually democratic?
[00:44:03.480 --> 00:44:04.840] Yeah, shockingly enough.
[00:44:05.720 --> 00:44:06.360] Yeah.
[00:44:06.680 --> 00:44:07.160] Yeah.
[00:44:07.480 --> 00:44:13.000] Talk a little bit about your thought experiment at the beginning of the book, which I really enjoyed about the Twitbookeans.
[00:44:13.320 --> 00:44:13.960] Yeah.
[00:44:15.160 --> 00:44:17.000] And the broader principle.
[00:44:17.000 --> 00:44:20.840] I mean, you know, this kind of masses and bucks populi.
[00:44:23.720 --> 00:44:38.360] So imagine, for the listeners, imagine a society that followed one rule when it comes to the political discussions.
[00:44:38.680 --> 00:44:43.880] That rule is this: call it the rule of conformity.
[00:44:43.880 --> 00:44:59.720] Only say or boast those things online or off that are liked by your friends, thumbs up by your friends, and disliked by your enemies.
[00:45:00.360 --> 00:45:03.320] That's the only rule they follow.
[00:45:03.960 --> 00:45:07.960] Now, call these people the Twitbookians.
[00:45:08.280 --> 00:45:18.920] Now, here's the thing about the Twitbookians: here's the thing: the Twitbookians don't know this about themselves.
[00:45:18.920 --> 00:45:28.360] They think that when they say things about politics, they're using facts and evidence.
[00:45:28.360 --> 00:45:40.280] And they insist that they're appealing to reasons, even though they're never actually motivated by facts, reasons, or evidence.
[00:45:40.600 --> 00:45:50.560] They're actually motivated, although they don't know this, by what their friends like, what their enemies dislike.
[00:45:50.880 --> 00:45:54.960] They're only motivated by conformity.
[00:45:55.600 --> 00:46:11.520] Now, the point of opening up the book with this experiment is to get people to think about this question: Are we like Twitbookians?
[00:46:12.480 --> 00:46:19.040] Some philosophers, some political scientists think we are.
[00:46:19.360 --> 00:46:27.600] Some people think, in fact, that human nature makes us Twitbookians.
[00:46:27.600 --> 00:46:32.320] If that's true, democratic politics is doomed.
[00:46:33.280 --> 00:46:36.560] Because think about the Twitbookian society.
[00:46:36.560 --> 00:46:39.760] Is it really going to be democratic?
[00:46:40.400 --> 00:46:44.800] Are they even going to be having rational disagreements?
[00:46:45.120 --> 00:46:45.760] No.
[00:46:46.400 --> 00:46:54.800] Twitbookians with different political views are really disagreeing with one another.
[00:46:54.800 --> 00:47:01.280] They're just bleeding at each other like sheep across a political divide.
[00:47:01.920 --> 00:47:11.040] So the question I want people to think-think hard about is: is our society Twitbookian?
[00:47:11.360 --> 00:47:15.680] And if it is, how do we get it to change?
[00:47:16.960 --> 00:47:25.520] Right, so here are you speaking to the myth of the rational voter that people are not utilitarian maximizers and so on.
[00:47:25.520 --> 00:47:28.160] They're just voting their team.
[00:47:28.160 --> 00:47:45.640] Yes, to some extent, as I said, and as you just noted, in political science, there's an open debate about whether voting itself is a rational enterprise.
[00:47:47.400 --> 00:47:58.360] I want to say, in the book, I say there's a lot to be said for that side of the argument.
[00:47:58.360 --> 00:48:00.040] But it's not inevitable.
[00:48:00.360 --> 00:48:02.440] It's not inevitable.
[00:48:02.760 --> 00:48:03.960] Or not.
[00:48:04.600 --> 00:48:09.640] The argument that's inevitable is an argument about human nature.
[00:48:09.640 --> 00:48:19.880] It's the argument that humans by nature are just incapable of being rational about things they care about.
[00:48:19.880 --> 00:48:21.880] David Hume thought that roughly.
[00:48:22.840 --> 00:48:26.920] Reason is the slave of the passions, he said.
[00:48:27.240 --> 00:48:33.240] Okay, what I want to say about Hume is like, yeah, most of the time, that's often right.
[00:48:33.880 --> 00:48:35.240] But you know what?
[00:48:36.520 --> 00:48:47.400] Over the centuries, we've found out methods for counteracting our tendency to be irrational.
[00:48:48.040 --> 00:48:50.120] And you know what those methods are?
[00:48:50.440 --> 00:48:53.880] You already mentioned them at the top of the hour.
[00:48:54.520 --> 00:49:00.760] They are methods like using evidence in the law.
[00:49:02.040 --> 00:49:12.200] Requiring journalists fact-check their sources to have more than one source.
[00:49:12.200 --> 00:49:18.880] Requiring scientists to publish in peer-reviewed journals.
[00:49:20.400 --> 00:49:22.720] All those rules.
[00:49:23.040 --> 00:49:26.000] Everything that I just said is a rule.
[00:49:26.320 --> 00:49:40.240] Those rules were set up for the sole purpose of correcting for what Kant called the crooked timber of humanity.
[00:49:40.240 --> 00:49:54.880] That is our bent nature, our tendencies, Michael, to just go with the flow, to favor our friends, to ignore evidence that contradicts our opinions.
[00:49:56.880 --> 00:50:11.040] Society, democratic society, requires that we play not just by the political rules of fair play, but by the epistemic rules of fair play.
[00:50:11.040 --> 00:50:13.920] The rules of evidence.
[00:50:13.920 --> 00:50:20.080] And what the Deutbuchians have forgotten is to play by those rules.
[00:50:20.400 --> 00:50:21.280] Right.
[00:50:21.600 --> 00:50:24.080] Yeah, so that's deeply buried in our human nature.
[00:50:24.080 --> 00:50:25.840] It's hard to get around it.
[00:50:25.840 --> 00:50:26.960] Possible.
[00:50:27.520 --> 00:50:28.320] Possible.
[00:50:28.640 --> 00:50:35.920] You know, this, you know, starting off with the idea of if there is a God, he's omniscient and omnipotent.
[00:50:36.480 --> 00:50:37.920] I don't know if there is or not.
[00:50:37.920 --> 00:50:40.960] I'm an atheist, but I know for sure I'm not God.
[00:50:40.960 --> 00:50:42.960] And I also know that you're not God.
[00:50:42.960 --> 00:50:44.320] Okay, so we can start there.
[00:50:44.320 --> 00:50:50.160] Or if you like the secular version, there is an objective reality, only I don't know what it is, and you don't either.
[00:50:51.440 --> 00:50:54.640] And so we have to start with there, you know, just the principle of fallibilism.
[00:50:54.640 --> 00:50:56.960] Okay, so we all might be wrong.
[00:50:56.960 --> 00:50:58.320] Okay, so then how do you know?
[00:50:58.320 --> 00:51:04.360] Well, then we have to set up rules of evidence, and then we're off and rolling into setting up these different systems.
[00:51:04.760 --> 00:51:09.240] And so to overcome that, all right, so let's let me go back to Hume.
[00:51:09.240 --> 00:51:13.640] You know, reason is the slave, is and always will be the slave of the passions.
[00:51:13.640 --> 00:51:25.640] Of course, he didn't mean it's, you know, it's it's time to go party like it's 1999, but that reason can tell you how to get to your goals, but what are the goals in the first place?
[00:51:25.640 --> 00:51:27.400] How did you get those goals?
[00:51:27.400 --> 00:51:31.080] Right there, he's separating is and not, I think, right?
[00:51:31.080 --> 00:51:32.520] Am I interpreting that right?
[00:51:32.520 --> 00:51:33.880] No, he's right.
[00:51:33.880 --> 00:51:34.920] You're right.
[00:51:34.920 --> 00:51:47.880] I mean, Michael, I think that's correct for Hume and lots of people after him, particularly researchers in political psychology.
[00:51:48.200 --> 00:51:57.080] They want to say, reason, as you just said, can help you determine how to get where you're going.
[00:51:57.400 --> 00:52:04.600] But your direction, your goal, is set by the heart, by the passions.
[00:52:05.880 --> 00:52:17.160] And if you think that, then you'll think that, well, as Hume did, you can really only debate rationally about means, means to ends.
[00:52:17.240 --> 00:52:22.120] Can't debate about the ends, the principles themselves.
[00:52:22.120 --> 00:52:26.120] Because that's for the heart to decide.
[00:52:26.440 --> 00:52:30.760] And people have different feelings, different sentiments.
[00:52:31.080 --> 00:52:33.080] And that's not rational.
[00:52:34.360 --> 00:52:35.960] I disagree with that.
[00:52:36.600 --> 00:52:48.640] I think, as we were already talking about, you can reason about ends because you can put them in the hopper of evidence.
[00:52:49.600 --> 00:52:51.680] Go back to Peirce.
[00:52:52.000 --> 00:52:56.800] What's true is what he called concordant.
[00:52:56.800 --> 00:53:01.920] He said true ideas are coordinate, concordant ideas.
[00:53:02.240 --> 00:53:11.440] And what he meant was: even with principles, we can see which stand up to scrutiny.
[00:53:17.440 --> 00:53:21.360] Go back to my original analogy.
[00:53:21.360 --> 00:53:27.440] Mathematics is a place for truth and falsity, if anything is.
[00:53:29.680 --> 00:53:35.360] But what makes claims about numbers?
[00:53:35.680 --> 00:53:38.160] Well, numbers.
[00:53:38.800 --> 00:53:39.920] What are those?
[00:53:40.240 --> 00:53:41.360] Good question.
[00:53:42.640 --> 00:53:47.760] You and I aren't going to determine, thank God, the answer to that question right now.
[00:53:48.080 --> 00:54:13.520] But whatever the truth is of those, whatever makes mathematical judgments true, even if we don't know the answer to that, we can agree that mathematical judgments can be true because we have ways of determining which stand up to scrutiny.
[00:54:13.520 --> 00:54:19.440] It's called mathematical proof.
[00:54:19.440 --> 00:54:24.000] In politics, we don't have mathematical proofs, sadly.
[00:54:25.280 --> 00:54:29.120] But we do have appeals to reasons.
[00:54:30.360 --> 00:54:47.240] Now, sometimes the best we can do in politics is just say, look, this political picture hangs together, coheres with the evidence.
[00:54:47.240 --> 00:54:48.360] Best.
[00:54:49.640 --> 00:54:53.960] And this political picture doesn't really fit together.
[00:54:53.960 --> 00:54:56.520] It's filled with contradictions.
[00:54:56.520 --> 00:54:59.960] And it contradicts the empirical evidence we have.
[00:54:59.960 --> 00:55:04.120] So you're combining a correspondence theory of truth and a coherence theory of truth.
[00:55:05.080 --> 00:55:16.760] I think in politics, well, my official view is truth comes in more than one form.
[00:55:17.080 --> 00:55:20.760] All truth is a matter of fitting with reality.
[00:55:20.760 --> 00:55:23.640] But what that means, difference.
[00:55:24.280 --> 00:55:39.640] When we're talking about trees and battleships and mountains, empirical reality, I think the correspondence theory, in one form or another, is the best theory.
[00:55:39.640 --> 00:55:40.920] Myself.
[00:55:41.560 --> 00:55:51.240] When we're talking about morality, then I think something a little bit more like the coherence theory makes sense.
[00:55:52.520 --> 00:55:55.560] And that's what I was just saying about politics.
[00:55:55.560 --> 00:56:07.640] When it comes to what I earlier called judgments about politics in the narrow or pure sense, judgments about what we ought to do as a society.
[00:56:09.240 --> 00:56:22.080] Often the best we could do is just see which of these political stories makes sense, which holds together, and which doesn't contradict the empirical facts.
[00:56:23.680 --> 00:56:29.440] If we could do that, even we must be getting closer to the truth.
[00:56:29.760 --> 00:56:34.240] As you said, look, let's be realistic.
[00:56:38.720 --> 00:56:41.600] It's not easy to know what's true.
[00:56:42.080 --> 00:56:46.400] Skeptic like yourself is going to say, we'll never know what's true.
[00:56:48.000 --> 00:56:51.840] You know, I'm one of these people that thinks sort of in between.
[00:56:51.840 --> 00:56:56.240] I think some things I know are true, or a push short are true.
[00:56:56.880 --> 00:57:02.480] Things I'm willing to bat my life on, for example, which is pretty high standard.
[00:57:03.120 --> 00:57:10.240] And then there are things that I have no idea are true.
[00:57:10.240 --> 00:57:13.040] And I don't think anybody knows are true.
[00:57:13.040 --> 00:57:18.400] Well, this is why I like Bayesian reasoning so much because it allows you to put a probability on it.
[00:57:18.960 --> 00:57:28.480] And then I like Cromwell's rule: never assign a one or zero to anything, just in case, because I beseech you in the bowels of Christ, you might be mistaken.
[00:57:28.960 --> 00:57:30.560] All right, so, okay, yeah.
[00:57:31.120 --> 00:57:46.880] But then, wait, I was going to say, oh, yeah, so I want to also add pragmatism to the study of what, of history and psychology, of what people actually do and have done in the past, and then map that onto what you're talking about.
[00:57:46.880 --> 00:57:51.040] So, the low-hanging fruit, would you rather live in North Korea or South Korea?
[00:57:51.120 --> 00:57:52.240] Everybody knows the answer.
[00:57:52.240 --> 00:57:53.120] Why?
[00:57:53.120 --> 00:57:56.480] You know, well, look at historically, people vote with their feet.
[00:57:56.720 --> 00:57:59.360] They don't want to be enslaved and so on.
[00:57:59.360 --> 00:58:04.840] And you can then put reason behind that, as Lincoln said: as I would not be a slave, I would not be a slave owner.
[00:58:04.840 --> 00:58:10.680] Yeah, that's a no-brainer in hindsight, but that's not what people thought at the time.
[00:58:10.680 --> 00:58:11.160] Right?
[00:58:11.160 --> 00:58:19.960] So, you know, the you know, reason has to be tempered because, you know, if you and I lived 200 years ago, we might be making rational arguments for why slavery is great.
[00:58:20.280 --> 00:58:20.920] Yeah.
[00:58:21.240 --> 00:58:26.920] But that just shows you that some ideas don't stand up.
[00:58:26.920 --> 00:58:27.640] Right, right.
[00:58:27.640 --> 00:58:27.880] Okay.
[00:58:28.200 --> 00:58:28.840] That's the point.
[00:58:28.840 --> 00:58:29.080] Right.
[00:58:29.080 --> 00:58:30.840] So over time, we're getting closer.
[00:58:31.080 --> 00:58:36.600] I'm just trying to make over time new information comes in.
[00:58:36.600 --> 00:58:36.920] Yeah.
[00:58:36.920 --> 00:58:37.400] Yeah.
[00:58:37.400 --> 00:58:42.520] Now, do we really need new information when it comes to slavery?
[00:58:42.520 --> 00:58:43.160] No.
[00:58:43.800 --> 00:58:57.640] But for all sorts of other more complicated questions, let's say about the tax code, like you talked about at the beginning, or complicated questions of a very prudential kind in politics.
[00:58:57.640 --> 00:59:02.440] A lot of politics is much more ditty-gritty than what we're talking about here.
[00:59:02.440 --> 00:59:08.200] You know, my wife served on the local town council.
[00:59:08.520 --> 00:59:11.880] We live in a small New England village.
[00:59:12.200 --> 00:59:21.720] Political questions they had to decide were like, should we build a bike lane or should we build a hockey path?
[00:59:21.720 --> 00:59:22.200] Right.
[00:59:22.520 --> 00:59:23.080] Yeah.
[00:59:23.080 --> 00:59:27.960] Now, you know, that's a political question for the village.
[00:59:28.280 --> 00:59:31.000] People got heated about it.
[00:59:33.160 --> 00:59:37.240] But it's not a grand glorious question.
[00:59:37.560 --> 00:59:50.960] But the right answer to that question is going to be the one that sort of stands up over time, given the goals of the village, the money they have, the taxes, and so forth.
[00:59:51.280 --> 00:59:53.040] It's complicated.
[00:59:53.360 --> 01:00:04.320] And the best you can do, often, in real life, is to, as you said, play the probabilities and hope that you get it right.
[01:00:04.640 --> 01:00:08.000] What you shouldn't do, and a lot of people do.
[01:00:08.960 --> 01:00:10.640] And I think you know this.
[01:00:10.960 --> 01:00:14.960] A lot of people engage in black and white thinking.
[01:00:14.960 --> 01:00:23.440] They think, oh, because it's not clear what the answer is, that means there's no answer.
[01:00:23.440 --> 01:00:26.800] That's the mistake that people make.
[01:00:27.120 --> 01:00:37.600] That is a mistake that political psychologists make when they say politics has nothing to do with truth.
[01:00:37.600 --> 01:00:39.040] Of course it does.
[01:00:40.240 --> 01:00:46.000] Just it's a realm where truth is hard to come by.
[01:00:46.640 --> 01:00:48.160] But guess what?
[01:00:48.800 --> 01:00:50.960] Truth is hard to come by.
[01:00:50.960 --> 01:00:53.440] And higher mathematics too.
[01:00:53.440 --> 01:00:55.200] Number theory.
[01:00:55.520 --> 01:00:57.600] It's not easy.
[01:00:58.240 --> 01:01:12.160] And just because some truths and mathematics are easy, like two and two or four, doesn't mean that what the mathematicians do is easy.
[01:01:12.480 --> 01:01:16.640] Some political truths, as you say, are easy.
[01:01:16.640 --> 01:01:18.880] Nobody wants to be a slave.
[01:01:19.840 --> 01:01:21.320] Real politics is hard.
[01:01:21.200 --> 01:01:21.480] Yeah.
[01:01:22.480 --> 01:01:26.240] But grow up and get used to it, people.
[01:01:26.560 --> 01:01:28.480] Don't give up on truth.
[01:01:28.480 --> 01:01:32.200] Don't give up on justice just because it's hard.
[01:01:29.520 --> 01:01:34.040] Nice.
[01:01:34.040 --> 01:01:35.080] Yeah, I like that.
[01:01:29.680 --> 01:01:36.440] I like that a lot.
[01:01:37.800 --> 01:01:45.240] Yeah, but in terms of these overall trends, like back to where we started, pretty much all countries have a social safety net now.
[01:01:45.240 --> 01:01:55.880] And all the Western democracies spend, you know, as I said, roughly, I don't know, 18% to 25% of their GDP on these social services.
[01:01:55.880 --> 01:01:59.880] So, and, you know, the conservatives want it to be lower, liberals want it to be higher.
[01:01:59.880 --> 01:02:00.440] Okay.
[01:02:00.440 --> 01:02:05.720] But no one's saying, you know, we should just go back to the way it was in the 18th century when there was nothing, right?
[01:02:06.040 --> 01:02:15.240] So would you say that pragmatically, your pragmatism, you know, historically, that's kind of a direction towards something that really is true.
[01:02:15.240 --> 01:02:18.280] This is the way societies really are better when they're that way.
[01:02:18.600 --> 01:02:21.320] You can argue about the, you know, the percentage.
[01:02:21.640 --> 01:02:22.360] Yes.
[01:02:22.680 --> 01:02:41.400] I think that there are lots of different examples of these sorts of claims where you can see some coalescing, some agreement, some overlapping consensus, as Rawl sort of said.
[01:02:42.360 --> 01:02:47.560] Now, the problem is, as I just said, in the details.
[01:02:47.560 --> 01:02:51.640] I mean, what is the percentage we should be spending?
[01:02:51.640 --> 01:02:52.920] That's harder.
[01:02:52.920 --> 01:02:53.480] Yeah.
[01:02:55.320 --> 01:03:03.800] But it is actually important for this conversation that you point out we can make progress.
[01:03:03.800 --> 01:03:07.480] And we can look back and see we've made progress.
[01:03:09.080 --> 01:03:11.000] Look.
[01:03:11.320 --> 01:03:29.120] One thing we need to remember right now, especially, is that political progress and rational progress requires what I earlier called these sort of epistemic rules, rules of evidence.
[01:03:29.600 --> 01:03:44.240] More importantly, it requires institutions that act by those rules, that educate people in those rules, that act as sort of the infrastructure of knowledge for a society.
[01:03:45.600 --> 01:03:48.000] Epistemic infrastructure.
[01:03:48.320 --> 01:03:50.800] Universities are part of that.
[01:03:51.120 --> 01:03:53.840] Primary schools are part of that.
[01:03:54.160 --> 01:03:56.960] The legal system is part of that.
[01:03:57.280 --> 01:04:01.440] The National Science Foundation is part of that.
[01:04:01.440 --> 01:04:02.400] And so on.
[01:04:02.400 --> 01:04:05.600] Journalism is part of that.
[01:04:05.920 --> 01:04:14.800] It provides the infrastructure that helps us, just like a bridge helps us get where we want to go physically.
[01:04:15.120 --> 01:04:28.560] Epistemic infrastructure provides a way for us to get to what's rational, to get to consensus using reason and evidence.
[01:04:29.840 --> 01:04:38.880] So if we want to have progress of the sort you were talking about, we better protect that infrastructure.
[01:04:38.880 --> 01:04:39.680] Yeah.
[01:04:40.640 --> 01:04:41.360] Yeah, I agree.
[01:04:41.360 --> 01:04:43.040] And that's what your book is all about.
[01:04:43.040 --> 01:04:46.160] And you've argued for so well for your career.
[01:04:46.160 --> 01:04:56.400] You know, one of my favorite quotes in this space comes from John Stuart Mill: a party of order and stability and a party of progress and change are both necessary for a civil society.
[01:04:56.400 --> 01:05:04.440] Are we past that window of flexibility with, I don't know, the mega right and the crazy far-woke progressive left?
[01:05:04.600 --> 01:05:10.200] And I don't know what craziness is going on these days, and not just here, but in other countries.
[01:05:12.440 --> 01:05:15.240] Well, I don't know.
[01:05:16.520 --> 01:05:33.880] But I do know that right now, in my country, in the U.S., our country, we're witnessing what I would call an unprecedented attack on our epistemic infrastructure.
[01:05:33.880 --> 01:05:37.720] On the infrastructure of science and the law.
[01:05:39.640 --> 01:06:00.360] I do think that once that happens, once that infrastructure is damaged to the point of no repair, the democracy will fade or even blink out.
[01:06:02.120 --> 01:06:03.000] Wow.
[01:06:03.640 --> 01:06:06.920] So I think we are at a state of crisis.
[01:06:06.920 --> 01:06:07.560] Yeah.
[01:06:08.200 --> 01:06:11.400] More than when I even wrote the book.
[01:06:13.000 --> 01:06:17.960] And I think we can see that with the defunding of science.
[01:06:18.280 --> 01:06:19.160] And I'm not.
[01:06:19.480 --> 01:06:21.080] I'm not a scientist.
[01:06:21.080 --> 01:06:23.080] I'm a philosopher.
[01:06:23.720 --> 01:06:29.560] But I'm a philosopher that respects science and history and the law.
[01:06:31.160 --> 01:06:34.760] So, are we past democracy?
[01:06:34.760 --> 01:06:36.440] No, we're not.
[01:06:36.400 --> 01:06:38.280] Of course, there's soap.
[01:06:38.920 --> 01:06:41.880] But there are warning signs.
[01:06:42.200 --> 01:06:49.120] And, you know how when it comes to real infrastructure, like bridges to roads.
[01:06:50.080 --> 01:06:54.160] If the bridge gets dangerous, we put up a warning sign.
[01:06:54.880 --> 01:06:56.240] Bridges out.
[01:06:56.640 --> 01:06:57.840] Be careful.
[01:06:58.160 --> 01:06:59.600] Right, hopefully.
[01:07:00.560 --> 01:07:05.440] That is, if our infrastructure is working, we put up that.
[01:07:07.040 --> 01:07:15.200] Well, the warning signs are blanking right now about our epistemic infrastructure.
[01:07:15.520 --> 01:07:32.080] The warning signs about whether going forward we're going to have rational progress as opposed to just blind change in our society.
[01:07:32.400 --> 01:07:35.040] Those signs are blanky red.
[01:07:35.040 --> 01:07:35.520] Yeah.
[01:07:36.080 --> 01:07:41.520] I think, if I can riff for just a moment, that those assaults come from both the right and the left.
[01:07:41.520 --> 01:07:54.800] I'm worried about the academy and that kind of that's where the whole post-truth thing comes from in the first place, post-modernism, and that all cultures are equal and all people's truths are equal and so on.
[01:07:55.200 --> 01:08:05.920] And then the whole crisis during the pandemic, you know, about masks and social distancing and school closures, all the, you know, there was a lot of misinformation.
[01:08:05.920 --> 01:08:09.520] I would have preferred our policymakers be more Bayesian.
[01:08:09.520 --> 01:08:11.920] Like, we think you should wear masks or social distance.
[01:08:11.920 --> 01:08:13.200] We're not sure, really.
[01:08:13.440 --> 01:08:14.800] But this is what we think this week.
[01:08:14.800 --> 01:08:16.000] This could all change next week.
[01:08:16.000 --> 01:08:18.960] Instead, they made these, you know, dramatic pronouncements.
[01:08:18.960 --> 01:08:20.560] Now we know this is true.
[01:08:20.560 --> 01:08:21.680] But they didn't know that.
[01:08:21.680 --> 01:08:23.280] And we now know they didn't know that.
[01:08:23.280 --> 01:08:31.320] So the public trust in the CDC and people like Anthony Fauci and Francis Collins, who were always trustworthy, I always had great respect for them.
[01:08:31.640 --> 01:08:33.240] You know, that's plummeted.
[01:08:33.240 --> 01:08:38.040] You know, so a lot of this backlash from the right is because of what happened on the left.
[01:08:38.120 --> 01:08:42.680] So it goes both ways, is all I'm trying to say and be politically neutral here, I guess.
[01:08:43.240 --> 01:08:47.080] Because I'm worried too, in both directions about that.
[01:08:47.080 --> 01:08:49.080] I guess it depends on the time horizon.
[01:08:49.080 --> 01:08:52.920] I tend to be an optimist, and I think this too will pass.
[01:08:52.920 --> 01:09:00.520] You know, I don't know, the trans movement, the MAGA movement, the this, that, these culture wars that I'm, you know, I waste way too much time on.
[01:09:00.520 --> 01:09:02.120] I'm one of your twit bookieans.
[01:09:02.120 --> 01:09:04.360] I don't do Facebook, but I do Twitter.
[01:09:04.680 --> 01:09:09.960] And last weekend, I was off social media for three days.
[01:09:09.960 --> 01:09:12.920] And it was just, I realized I don't really miss any of that.
[01:09:12.920 --> 01:09:15.000] It's just, you know, maybe we don't need all that.
[01:09:15.000 --> 01:09:16.040] But that could all change.
[01:09:16.040 --> 01:09:18.600] You know, I mean, the smartphone is only 2007.
[01:09:18.600 --> 01:09:20.840] Facebook, 2007, 2008.
[01:09:20.840 --> 01:09:25.320] Instagram is what, 2012, 13, something like that this is not that long ago, right?
[01:09:25.320 --> 01:09:28.600] This, maybe in 10, 20 years, none of this technology.
[01:09:28.600 --> 01:09:31.800] It'll be something completely different that you and I can't think of.
[01:09:31.800 --> 01:09:37.800] And then I'm hoping we're worried about much ado about nothing, but you know, we should pay attention.
[01:09:41.000 --> 01:09:41.880] Yes, sir.
[01:09:42.200 --> 01:09:43.800] I hope you're right.
[01:09:44.440 --> 01:09:45.400] Yes.
[01:09:45.720 --> 01:09:46.440] All right, Michael.
[01:09:46.440 --> 01:09:51.640] I know you got a hard out coming out in five minutes, and I want to save your voice in any case for your next interviews for your book.
[01:09:51.720 --> 01:09:54.440] So I'll just thank you and again, plug the book.
[01:09:54.600 --> 01:09:58.840] Really a great read on truth and politics, why democracy demands it.
[01:09:58.840 --> 01:09:59.560] Indeed, it does.
[01:09:59.560 --> 01:10:01.080] I listened to it on audio.
[01:10:01.080 --> 01:10:02.280] Nice read.
[01:10:02.280 --> 01:10:04.280] And read the book as well.
[01:10:04.280 --> 01:10:07.960] So, congratulations, and thanks for your work.
[01:10:08.280 --> 01:10:10.280] Thanks for having me, Michael.
Prompt 2: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 3: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
Full Transcript
[00:00:00.560 --> 00:00:04.400] Let's talk about a special end-of-life care called Hospice.
[00:00:04.400 --> 00:00:11.680] For 125 years, Calvary Hospital and Dindhome Hospice has been providing New York with world-renowned hospice care.
[00:00:11.680 --> 00:00:20.240] Calvary's special doctors, nurses, and counselors work as a team, relieving the patient's physical pain and the emotional stress of the family.
[00:00:20.240 --> 00:00:24.480] Visit us 24-7 at CalvaryHospital.org.
[00:00:24.480 --> 00:00:30.000] Calvary, for 125 years, where life continues.
[00:00:30.640 --> 00:00:36.880] This Labor Day at Value City Furniture, get up to 20% off your new living room and more throughout the store.
[00:00:36.880 --> 00:00:38.720] Think you deserve even more?
[00:00:38.720 --> 00:00:39.360] Same.
[00:00:39.360 --> 00:00:43.280] Like combining those savings with an extra 10% off door busters.
[00:00:43.280 --> 00:00:46.400] Plus, no interest financing for up to 40 months.
[00:00:46.400 --> 00:00:49.600] See if you pre-qualify without touching your credit score.
[00:00:49.600 --> 00:00:53.920] Yep, get more at Value City Furniture while paying less.
[00:00:53.920 --> 00:00:56.880] More style, more quality, more value.
[00:00:56.880 --> 00:00:59.440] We are Value City Furniture.
[00:01:03.920 --> 00:01:09.760] You're listening to The Michael Shermer Show.
[00:01:17.120 --> 00:01:20.800] All right, everybody, it's time for another episode of the Michael Shermer Show.
[00:01:20.800 --> 00:01:23.680] Brought to you as always by the Skeptic Society and Skeptic Magazine.
[00:01:23.680 --> 00:01:28.800] Now, my guest today has a book out about truth in politics.
[00:01:28.800 --> 00:01:30.240] I'm very interested in this topic.
[00:01:30.240 --> 00:01:36.080] As many of you know, my next book is called Truth: What It Is, How to Find It, Why It Still Matters.
[00:01:36.080 --> 00:01:39.200] The publisher added still in there because I forgot that.
[00:01:39.200 --> 00:01:40.080] Yeah, that's right.
[00:01:40.080 --> 00:01:42.320] It always mattered, but it still matters.
[00:01:42.560 --> 00:01:48.400] I don't have a chapter on political truths because I realized the book was already too long.
[00:01:48.400 --> 00:01:50.640] But also, that's not really my wheelhouse.
[00:01:50.640 --> 00:01:51.440] And guess what?
[00:01:51.440 --> 00:01:53.280] My guest today wrote a book about that.
[00:01:53.280 --> 00:01:56.560] It's called On Truth in Politics.
[00:01:56.560 --> 00:01:58.000] How about that for good timing?
[00:01:58.000 --> 00:01:59.840] Why democracy demands it?
[00:02:00.120 --> 00:02:02.440] He is Michael Patrick Lynch.
[00:02:02.440 --> 00:02:10.360] He's a provost professor of the humanities and board of trustees, distinguished professor of philosophy at the University of Connecticut.
[00:02:10.360 --> 00:02:10.760] Dr.
[00:02:10.760 --> 00:02:24.920] Lynch is the author or editor of 10 books, including Know-It-All Society, The Internet of Us, Truth as One and Many, and the New York Times Sunday Book Review Editor's Pick, True to Life.
[00:02:24.920 --> 00:02:26.600] Can you tell he's into truth?
[00:02:26.920 --> 00:02:34.280] Lynch has held grants from the National Endowment for the Humanities, the Andrew Mellon Foundation, and the John Templeton Foundation, among others.
[00:02:34.280 --> 00:02:43.880] He's spoken at TED, yay TED, the Nantucket Project, Chautauqua, and South by Southwest, usually with an SXX SW.
[00:02:43.880 --> 00:02:47.400] In 2019, he was awarded the George Orwell Award.
[00:02:47.400 --> 00:02:48.600] How cool is that?
[00:02:48.600 --> 00:02:53.960] Which recognizes writers who have made outstanding contributions to the critical analysis of public discourse.
[00:02:53.960 --> 00:02:59.320] And again, the new book is on truth and politics: why democracy demands it.
[00:02:59.320 --> 00:03:00.360] Michael, nice to see you.
[00:03:00.360 --> 00:03:07.960] I should warn our listeners: you've just recently had surgery on your throat, so your voice is a little off at the moment, but that's okay.
[00:03:07.960 --> 00:03:09.400] We'll work around that.
[00:03:10.360 --> 00:03:12.760] Well, thank you very much, Michael.
[00:03:12.760 --> 00:03:15.320] I'm pleased to be here talking to you.
[00:03:15.320 --> 00:03:19.960] Well, you know, the problem, I guess, let's start off with what you mean by truth.
[00:03:20.360 --> 00:03:22.200] Let me just give a fun example here.
[00:03:22.520 --> 00:03:29.640] I was just toying with the idea: like, what's the right percentage of upper income tax bracket?
[00:03:29.640 --> 00:03:30.680] Well, how should I know?
[00:03:30.680 --> 00:03:32.520] So I asked Grock.
[00:03:32.840 --> 00:03:38.440] And so here's Grok's answer: There is no single right percentage for the upper income tax bracket.
[00:03:38.440 --> 00:03:41.800] It depends on what you're trying to achieve and who you ask.
[00:03:41.800 --> 00:03:50.880] Economists, policymakers, and regular people have been arguing about this forever, and the answer hinges on trade-offs between revenue, fairness, and economic growth.
[00:03:51.200 --> 00:03:54.160] Historically, top marginal tax rates in the U.S.
[00:03:54.160 --> 00:03:55.280] have swung wildly.
[00:03:55.280 --> 00:04:02.800] In the 1950s, they hit 91% for the highest earners under Eisenhower, though loopholes meant few actually paid that.
[00:04:02.800 --> 00:04:13.760] Revenue maxed out at around 7% to 9% of GDP back then, similar to today's levels, despite the rates now topping out at 37% federally, plus state taxes.
[00:04:13.760 --> 00:04:16.560] Like in California, it's 10% here now.
[00:04:16.560 --> 00:04:18.720] Practically, it's a mess of priorities.
[00:04:18.720 --> 00:04:20.320] You want more social programs?
[00:04:20.320 --> 00:04:24.160] Crank it up to 50 to 60%, like some European countries.
[00:04:24.160 --> 00:04:28.160] France is at 45%, Denmark effectively higher with add-ons.
[00:04:28.160 --> 00:04:33.280] Worried about drain-brain or stifling entrepreneurs, keep it closer to 35% to 40%.
[00:04:33.280 --> 00:04:39.600] To me, that tells me I'm using the word like a scientist would, like, you know, is the theory of evolution true?
[00:04:39.600 --> 00:04:40.480] Well, yeah.
[00:04:40.480 --> 00:04:41.920] And here's all the reasons why.
[00:04:41.920 --> 00:04:47.440] I don't think of political truths in the same way, but how should I be thinking about it?
[00:04:48.080 --> 00:04:52.560] I think that most people would agree with you about that.
[00:04:52.880 --> 00:05:02.480] As Anna Arendt famously said, truth and politics are in constant war with one another.
[00:05:02.480 --> 00:05:12.640] And we don't typically think of statements in politics as really being accurate or inaccurate.
[00:05:12.640 --> 00:05:17.280] We think of them being persuasive or not persuasive.
[00:05:17.600 --> 00:05:37.960] But actually, I think democratic practice presupposes, demands, as I say, that we treat at least some judgments we make in politics as being true or false.
[00:05:38.600 --> 00:05:46.200] After all, it's a key part of democracy that we want to speak truth to power.
[00:05:46.520 --> 00:05:50.280] We want to give reasons for ideas.
[00:05:50.280 --> 00:05:56.040] We don't just want to force people to do this or that in a democracy.
[00:05:56.680 --> 00:06:00.280] We want to back up our policies with evidence.
[00:06:00.280 --> 00:06:03.560] These are all things that, Michael, you believe.
[00:06:03.880 --> 00:06:18.360] Well, none of that makes any sense to do that in politics, to have those principles, if you don't think that at least some judgments in politics can be true.
[00:06:18.680 --> 00:06:20.600] And what do I mean by that?
[00:06:20.920 --> 00:06:33.160] Well, the simplest answer is that true judgments are those that fit reality, that accord with reality.
[00:06:33.480 --> 00:06:37.560] The complicated answer is what that means.
[00:06:38.040 --> 00:06:42.440] Every philosopher, as you know, pretty much agrees.
[00:06:42.440 --> 00:06:52.760] Not every philosopher, but most philosophers throughout history have agreed that truth has something to do with reality.
[00:06:52.760 --> 00:06:58.440] True judgments are not just the ones we want to want to be true.
[00:06:58.760 --> 00:07:01.800] You know, wishing doesn't make it so.
[00:07:02.120 --> 00:07:04.360] But, and we all can agree on that.
[00:07:04.360 --> 00:07:06.360] That's not rocket science.
[00:07:06.680 --> 00:07:21.600] But the hard part is to say what it means for something like mathematics to accord reality with reality or politics to accord with reality.
[00:07:22.880 --> 00:07:27.920] In the one case, mathematics, it's very abstract.
[00:07:28.240 --> 00:07:31.120] Are there really numbers out there?
[00:07:31.440 --> 00:07:34.320] Some people say yes, some people say no.
[00:07:34.640 --> 00:07:39.760] But because you might think, how can numbers be physical objects?
[00:07:39.760 --> 00:07:41.120] Math is about numbers.
[00:07:41.760 --> 00:07:44.720] Politics is abstract too.
[00:07:45.360 --> 00:07:53.600] But nobody really seriously says that there aren't truths about mathematics.
[00:07:53.600 --> 00:07:59.840] Funnily enough, we talk ourselves into saying, oh, but politics is different.
[00:07:59.840 --> 00:08:01.840] It's too abstract.
[00:08:01.840 --> 00:08:03.200] Well, guess what?
[00:08:03.200 --> 00:08:05.360] Math is abstract.
[00:08:07.280 --> 00:08:18.080] So I think it's simple to say there are some judgments in politics like climate change soaks.
[00:08:18.080 --> 00:08:20.720] That's a political judgment that people make.
[00:08:21.360 --> 00:08:23.520] I think it's false.
[00:08:23.840 --> 00:08:26.800] But some people think it's true.
[00:08:26.800 --> 00:08:31.760] I, what does it mean for me to say that I think it's false?
[00:08:31.760 --> 00:08:36.720] It means I don't think climate change soaks.
[00:08:37.360 --> 00:08:41.120] So on the simple level, it's pretty straightforward.
[00:08:41.440 --> 00:08:45.440] Yeah, let's use that last example: climate change is a hoax.
[00:08:45.440 --> 00:08:50.880] Okay, that we can debunk pretty clearly, including also the question: is it, you know, real?
[00:08:50.880 --> 00:08:51.920] Is it human caused?
[00:08:51.920 --> 00:08:52.880] First, is it happening?
[00:08:52.880 --> 00:08:54.400] Second, what's happening?
[00:08:54.400 --> 00:08:55.440] Is it human-cause?
[00:08:55.440 --> 00:08:59.640] Third, you know, how much is it going to change over time?
[00:08:59.040 --> 00:09:01.880] And then what's your time horizon?
[00:08:59.280 --> 00:09:05.720] And then, you know, fourth, what will the consequences of the change be?
[00:08:59.520 --> 00:09:08.440] And then five, what should we do about it?
[00:09:08.440 --> 00:09:14.600] Now, it seems to me I'm shifting there from empirical objective truths to political truths, all right?
[00:09:14.600 --> 00:09:25.160] I mean, if we're going to spend, you know, a trillion dollars in the next century to reduce carbon emissions because of these effects, you know, politically or economically, we might have a legitimate debate.
[00:09:25.160 --> 00:09:33.720] Like, well, maybe we'd be better off spending the money on mosquito nets or, you know, vaccines for poor people in Africa or we'd save more lives.
[00:09:33.720 --> 00:09:36.440] And, you know, those are trade-off type questions.
[00:09:36.440 --> 00:09:40.520] You know, so there, I think, that's an example of what you're talking about.
[00:09:40.840 --> 00:09:41.640] Yeah.
[00:09:41.960 --> 00:09:58.840] So what I say in the book is that there's two senses of political judgment that people have in their head when they're talking about whether something's a political judgment or not.
[00:09:59.160 --> 00:10:08.040] In the narrow sense, in the technical sense, a political judgment is about what society ought to do.
[00:10:08.040 --> 00:10:13.240] So, should we enact climate change legislation or not?
[00:10:13.880 --> 00:10:17.720] That's what we might call a pure political judgment.
[00:10:17.720 --> 00:10:21.080] It's about what society ought to do.
[00:10:21.400 --> 00:10:35.720] But as you and I both know, and as you just illustrated, in real political life, that is, in real politics, everything gets politicized.
[00:10:36.040 --> 00:10:46.080] So, as we both know, and that's the reason I make this example, climate change science is heavily politicized.
[00:10:44.840 --> 00:10:48.160] Now, I don't like that.
[00:10:48.800 --> 00:10:52.880] My friends who are climate scientists don't like that.
[00:10:52.880 --> 00:10:55.040] I don't think you like that.
[00:10:55.040 --> 00:10:57.200] But it is what it is.
[00:10:57.840 --> 00:11:17.600] Some judgments in our life about the weather nowadays, about coffee, about inflation, about bridges, pretty much anything you name, become political in the wide sense.
[00:11:17.680 --> 00:11:21.920] By that I mean they come to have political meaning.
[00:11:21.920 --> 00:11:27.760] People put political associations next to those sorts of claims.
[00:11:28.160 --> 00:11:33.680] Strictly speaking, what makes those claims true is reality.
[00:11:34.000 --> 00:11:36.160] They're not.
[00:11:36.160 --> 00:11:37.440] They're content.
[00:11:37.440 --> 00:11:42.000] What they're about is, let's say, climate change science.
[00:11:43.600 --> 00:11:49.280] But that doesn't stop people from treating them as political.
[00:11:49.600 --> 00:11:57.360] And that's what complicates my job in this book and your job in thinking about truth in general.
[00:11:57.360 --> 00:11:58.000] Yeah.
[00:11:58.720 --> 00:12:05.680] Yeah, again, I think the climate change thing got politicized around the time of Al Gore's film, An Inconvenient Truth.
[00:12:05.680 --> 00:12:08.960] This isn't my observation, somebody else that I think made it.
[00:12:08.960 --> 00:12:10.160] I think it's right.
[00:12:10.560 --> 00:12:29.720] Therefore, because of the popularity of his film, and he was the vice president of the Democratic Party, you know, it got politicized and bundled such that, you know, when conservatives hear climate change or anthropogenic global warming, their brain auto-corrects to anti-capitalism, anti-business, you know, anti-America, whatever.
[00:12:29.720 --> 00:12:35.400] You know, and we're no longer talking about CO2 levels and shrinking glaciers or anything like that.
[00:12:35.400 --> 00:12:36.840] That's all irrelevant.
[00:12:36.840 --> 00:12:41.160] Well, yeah, the question is: is all in that solution number five?
[00:12:41.160 --> 00:12:43.720] You know, so the solution is step number five.
[00:12:43.720 --> 00:12:45.000] What should we do about it?
[00:12:45.000 --> 00:12:53.160] You know, and then they morph into, well, then it can't be real because if it's real and bad, then we probably should do something, you know, something like that.
[00:12:53.160 --> 00:12:54.840] So I think that's an example.
[00:12:55.880 --> 00:13:06.440] And, you know, since you're a philosopher, you know, the whole is-ought problem, we can describe the way things are, or is, the way something is, and what we ought to do about it.
[00:13:06.440 --> 00:13:12.280] But what I want to suggest is that actually we do cross that boundary quite often.
[00:13:12.600 --> 00:13:23.160] I mean, it's we can have a legitimate debate what's the proper upper income tax percentage because we want we it depends on the goals you want to accomplish.
[00:13:23.160 --> 00:13:38.120] Okay, but once you've decided we want a social safety net of this amount so that nobody falls through the cracks, universal health care, Medicare, Social Security, and so on, and so forth, and that ends up being at like 20% GDP or something like that.
[00:13:38.360 --> 00:13:41.160] Okay, now that we've set that goal, okay, how are we going to raise the money?
[00:13:41.160 --> 00:13:43.640] Then we can have a, you know, more of an empirical.
[00:13:43.640 --> 00:13:46.040] But how did we get to that first question?
[00:13:46.360 --> 00:13:54.360] Why is it we should take care of those who take care of themselves, the homeless or mentally ill or handicapped, whatever it is?
[00:13:54.360 --> 00:13:55.800] That's a harder one, right?
[00:13:55.800 --> 00:13:58.360] In terms of what's the right thing to do.
[00:13:59.000 --> 00:13:59.880] Yeah.
[00:13:59.880 --> 00:14:04.280] Well, what's just put your finger on very adroitly.
[00:14:04.280 --> 00:14:09.560] Is that when it comes to questions of pure politics?
[00:14:09.880 --> 00:14:13.960] We're talking about what philosophers call normative questions.
[00:14:14.640 --> 00:14:15.000] Questions.
[00:14:15.680 --> 00:14:18.080] What should society do?
[00:14:18.400 --> 00:14:21.200] What sorts of goals should we have?
[00:14:22.480 --> 00:14:36.960] So, when we get to questions about what sort of goals, political goals we ought to have, should we, for example, have free and fair elections?
[00:14:37.280 --> 00:14:39.760] That's actually a political question.
[00:14:40.080 --> 00:14:44.240] Should we have due process under the law?
[00:14:44.880 --> 00:14:46.880] That's a political question.
[00:14:47.200 --> 00:14:52.160] It's one that's, I think, now openly debated.
[00:14:52.480 --> 00:15:00.720] Are all citizens and are all people in the US guaranteed due process?
[00:15:01.360 --> 00:15:03.360] That's being debated.
[00:15:04.640 --> 00:15:08.080] How do we arrive at our decisions about those things?
[00:15:09.360 --> 00:15:17.280] Well, in part, it depends on what sort of prior commitments you have.
[00:15:18.240 --> 00:15:23.040] If, like me, you have democratic prior commitments.
[00:15:23.040 --> 00:15:24.160] Commitment.
[00:15:24.160 --> 00:15:31.760] Commitments, for example, to treating people as free and equal.
[00:15:31.760 --> 00:15:39.520] To giving reasons to them for your views because you treat them with respect.
[00:15:39.840 --> 00:15:57.760] If that's the sort of decision-making process you're interested in, then gathering evidence for your views, treating them as being able to be right or wrong, true or false, goes with the daredevil.
[00:15:58.080 --> 00:16:00.000] But there's other types of politics out there.
[00:16:01.800 --> 00:16:11.320] There's politics that tell us that the right way to solve these problems is to ask a set of experts.
[00:16:11.320 --> 00:16:13.720] The right way to solve these problems.
[00:16:13.720 --> 00:16:15.720] Just cite another example.
[00:16:15.720 --> 00:16:17.720] Is to ask the great leader.
[00:16:17.720 --> 00:16:19.480] North Korea.
[00:16:20.120 --> 00:16:24.680] The leader decides what the calls are.
[00:16:25.320 --> 00:16:28.280] And in autocracies, that's often the case.
[00:16:30.040 --> 00:16:34.760] So, push your question back.
[00:16:35.080 --> 00:16:53.480] We get to the point that what explains why it's true that we should be engaging in democratic politics rather than authoritarian politics.
[00:16:55.320 --> 00:17:00.520] That's a question I think all of us should be asking right now.
[00:17:00.520 --> 00:17:01.240] Yeah.
[00:17:01.800 --> 00:17:03.880] Michael, we're getting a ping.
[00:17:03.880 --> 00:17:06.840] Is that your email alerting you to emails?
[00:17:06.840 --> 00:17:08.040] Maybe turn that off.
[00:17:08.760 --> 00:17:09.480] Yeah, just shut off.
[00:17:10.200 --> 00:17:11.080] I had the same problem too.
[00:17:11.080 --> 00:17:12.440] Mine was pinging too.
[00:17:12.440 --> 00:17:22.360] Yeah, so I'm going to come back to that in a minute, but I'm going to give a couple more examples of this problem of determining truth in politics.
[00:17:22.360 --> 00:17:30.040] And so this is a debate I had in the pages of Skeptic with Lee McIntyre, the philosopher who wrote a book on post-truth.
[00:17:30.360 --> 00:17:37.800] He used the following exchange as an example of what he thinks is post-truth, but I think is maybe is a more of a political truth.
[00:17:37.800 --> 00:17:38.840] And you tell me what you think.
[00:17:38.840 --> 00:17:40.600] I'm not sure what the right answer is.
[00:17:40.600 --> 00:17:51.040] This is a 2016 exchange between CNN's Allison Camarota and Republican Speaker of the House, former Speaker of the House, Newt Gingrich, on crime rates.
[00:17:51.360 --> 00:17:54.640] I'm not sure when in 2016 this was at, you know, mid-year or so.
[00:17:54.640 --> 00:17:58.720] Anyway, Camerotta says from CNN, violent crime is down.
[00:17:58.720 --> 00:18:00.560] The economy is ticking up.
[00:18:00.560 --> 00:18:04.000] And Newt says, it's not down in the biggest cities.
[00:18:04.000 --> 00:18:06.880] And she says, violent crime, murder rate is down.
[00:18:06.880 --> 00:18:07.920] It's down.
[00:18:08.240 --> 00:18:13.760] And then Newt says, how come it's up in Chicago and up in Baltimore and up in Washington, D.C.?
[00:18:13.760 --> 00:18:18.960] Camera says, those are pockets where certainly we're not tackling murder.
[00:18:18.960 --> 00:18:22.320] And Gindrich says, your national capital, your third biggest city.
[00:18:22.320 --> 00:18:25.600] And Cameron says, but violent crime across the country is down.
[00:18:25.600 --> 00:18:31.920] And he says, the average American, I will bet you this morning, does not think crime is down, does not think they're safer.
[00:18:31.920 --> 00:18:33.360] And she says, but it is.
[00:18:33.360 --> 00:18:34.800] We're safer and it's down.
[00:18:34.800 --> 00:18:37.840] And Geindridge says, no, that's just your view.
[00:18:37.840 --> 00:18:39.600] And Camera says, it's a fact.
[00:18:39.600 --> 00:18:41.840] These are national FBI facts.
[00:18:41.840 --> 00:18:44.800] And Geindrus says, but what I said is also a fact.
[00:18:44.800 --> 00:18:52.320] The current view is that liberals have a whole set of statistics that theoretically may be right, but it's not where human beings are.
[00:18:52.320 --> 00:18:53.600] People are frightened.
[00:18:53.600 --> 00:18:57.200] And then she says, but what you're saying is, but hold on, Mr.
[00:18:57.200 --> 00:19:00.400] Speaker, because you're saying liberals use these numbers.
[00:19:00.400 --> 00:19:04.560] They use this sort of magic math, but these are FBI statistics.
[00:19:04.640 --> 00:19:06.720] FBI is not a liberal organization.
[00:19:06.720 --> 00:19:08.800] It's a crime-fighting organization.
[00:19:08.800 --> 00:19:12.080] And Geindrich says, no, but what I said is equally true.
[00:19:12.080 --> 00:19:14.160] People feel more threatened.
[00:19:14.160 --> 00:19:18.400] And she says, feel yes, but the facts don't support it.
[00:19:18.400 --> 00:19:24.400] And he says, as a political candidate, I'll go with how people feel and let you go with the theoreticians.
[00:19:25.040 --> 00:19:29.360] So, I mean, is he just making the case that, look, lady, I got to win elections.
[00:19:29.360 --> 00:19:32.680] And, you know, when the people are frightened, they're going to vote for our side.
[00:19:32.680 --> 00:19:34.440] And I didn't actually lie.
[00:19:29.760 --> 00:19:38.120] I mean, the crime rate did tick up in Chicago and Baltimore and so on.
[00:19:38.120 --> 00:19:42.760] I know the whole average is down, but the perception is what I'm after, not the facts.
[00:19:42.760 --> 00:19:44.360] Anyway, your thoughts?
[00:19:45.320 --> 00:20:00.760] I think that conversation is a beautiful example of why politics as a practice makes it very difficult to get any truth in politics.
[00:20:01.720 --> 00:20:11.480] The point of my book is to really try to grapple with a kind of paradox that this conversation illustrates.
[00:20:11.480 --> 00:20:14.520] I mean, the one you just read, not ours.
[00:20:16.280 --> 00:20:20.200] And that paradox is democratic politics.
[00:20:20.200 --> 00:20:33.160] Politics that favors deliberation between free and equal people, not democratic politics in terms of the party, but a certain type of practice of democracy.
[00:20:33.160 --> 00:20:37.320] In other words, is a kind of politics.
[00:20:37.640 --> 00:20:46.360] But every kind of politics is an engine that generates political meanings.
[00:20:46.360 --> 00:20:51.080] It generates political fog.
[00:20:51.720 --> 00:21:03.480] So just to practice politics means that you're paying attention to things like the associations people make with words like climate change.
[00:21:03.800 --> 00:21:14.200] You're a bad politician in terms of getting elected if you're not sensitive to how people interpret the words you use.
[00:21:14.200 --> 00:21:20.880] And you're a bad politician if you're not sensitive to people's emotions.
[00:21:20.880 --> 00:21:31.280] Because those emotions drive the associations they make with words, things, symbols, everything.
[00:21:32.880 --> 00:21:53.520] So, to practice politics paradoxically is to engage in an activity that, by its very nature, makes it difficult to get to a rational discussion.
[00:21:53.520 --> 00:22:02.000] And the paradox of democratic politics, the practice of democracy, is it demands.
[00:22:02.000 --> 00:22:08.640] Democracy requires us to use reasons and evidence.
[00:22:08.960 --> 00:22:09.680] Why?
[00:22:10.000 --> 00:22:19.440] Because we can't respect other people as free and equal if we don't give them reasons for the policies we're enacting.
[00:22:20.080 --> 00:22:30.240] You can't make people do things, pay taxes, you know, or any other thing, obey the laws, without giving them some reasons.
[00:22:30.560 --> 00:22:38.400] If, that is, you're treating them as free and equal citizens, like we do in democracy.
[00:22:39.840 --> 00:22:44.400] So, democratic politics is politics.
[00:22:44.400 --> 00:22:49.040] Politics gets in the way of reason and truth.
[00:22:49.040 --> 00:22:50.480] How do you combine them?
[00:22:50.720 --> 00:22:51.920] That's the trick.
[00:22:51.920 --> 00:22:52.720] Yeah.
[00:22:52.720 --> 00:22:53.360] Yeah.
[00:22:54.880 --> 00:22:56.160] Okay, another example.
[00:22:56.160 --> 00:23:08.440] So, in science, we aim to arrive at the truth through a peer review system, you know, a blind peer review.
[00:23:08.840 --> 00:23:17.400] You go to conferences, you have colleagues and peers you bounce your ideas off of, and you have expertise and degrees and so on.
[00:23:17.400 --> 00:23:20.760] And not that outsiders can't make contributions, they do.
[00:23:21.000 --> 00:23:25.800] But, you know, there's kind of a system set up over the last several centuries.
[00:23:25.800 --> 00:23:30.600] In the legal system, it's an adversarial system.
[00:23:30.600 --> 00:23:37.560] We have two lawyers, and the peer review is the jury or the judge that hears both cases.
[00:23:38.200 --> 00:23:40.680] And each lawyer is not trying to get to the truth.
[00:23:40.680 --> 00:23:42.360] They're just trying to win.
[00:23:42.360 --> 00:23:47.880] And hopefully, the truth will kind of fall there in the middle, and the jury or judge will make the right decision.
[00:23:47.880 --> 00:23:48.840] They don't always.
[00:23:48.840 --> 00:23:57.720] In journalism, we have fact-checking and editors and editors on top of editors, and an editor-in-chief, and then the owner of the newspaper with the final say or whatever.
[00:23:58.200 --> 00:24:08.440] In politics, I guess, is that the idea with a democracy, the public is the peer review system, and the elections are like the experiments.
[00:24:08.600 --> 00:24:10.120] Something like that?
[00:24:12.360 --> 00:24:14.200] That's one view.
[00:24:14.200 --> 00:24:21.160] I don't think of truth and democracy as being decided by vote.
[00:24:21.960 --> 00:24:22.520] Interesting.
[00:24:22.520 --> 00:24:23.240] Some people do.
[00:24:23.240 --> 00:24:23.880] Yeah.
[00:24:23.880 --> 00:24:28.200] Some people think that's what democratic truth means.
[00:24:28.840 --> 00:24:37.320] The majority determines not just what we ought to do, but what's what's true about what we ought to do.
[00:24:37.320 --> 00:24:37.720] Yeah.
[00:24:39.000 --> 00:24:43.720] I think that's conflating two things.
[00:24:43.720 --> 00:24:53.600] In a democracy, political authority is determined, roughly speaking, or generalizing, simplifying.
[00:24:53.600 --> 00:25:02.800] Roughly speaking, by Democrat votes, political authority is the authority for the president to enact.
[00:25:06.960 --> 00:25:15.440] Political authority and what we might call epistemic authority, authority of knowledge, are different things.
[00:25:15.760 --> 00:25:17.760] Plato thought they went together.
[00:25:18.160 --> 00:25:22.880] Plato thought, what gives you political authority is that you know things.
[00:25:23.200 --> 00:25:29.200] So just because you know something means you should be the person that ought to decide.
[00:25:29.200 --> 00:25:32.560] That's tempting, but I think that's a fallacy.
[00:25:33.600 --> 00:25:49.280] Because just because I'm a scientific expert doesn't mean that I have the ability to order you to do what I want you to do.
[00:25:51.840 --> 00:26:07.760] The authority that someone gets, let's say, as a ship captain, to order the crew to do what they do, is given by an outside principle.
[00:26:07.760 --> 00:26:13.040] A principle about on ships, captains are in charge.
[00:26:13.360 --> 00:26:19.520] Or, you know, the president gets to make decisions about the military.
[00:26:19.520 --> 00:26:21.520] He's the commander-in-chief.
[00:26:22.160 --> 00:26:28.400] Well, that principle is what's giving the commander-in-chief political authority.
[00:26:29.360 --> 00:26:34.760] But so we shouldn't confuse those two types of authority.
[00:26:34.760 --> 00:26:42.040] Because, look, I don't have to, I mean, this is pretty obvious, but sometimes it's important to state the obvious.
[00:26:42.040 --> 00:26:42.680] Yeah.
[00:26:43.320 --> 00:26:49.240] Just because people vote on something duh doesn't make us true.
[00:26:49.240 --> 00:27:02.360] People could vote and have voted on the issue of slavery, for example, and could vote for saying slavery is a good thing.
[00:27:02.520 --> 00:27:04.440] Let's have slavery.
[00:27:04.760 --> 00:27:10.440] Well, just because society votes on that doesn't make that true.
[00:27:11.080 --> 00:27:11.560] Okay?
[00:27:12.440 --> 00:27:12.920] Yeah.
[00:27:12.920 --> 00:27:13.640] Why?
[00:27:13.640 --> 00:27:18.760] Because it's not supported by coherent reasons.
[00:27:19.720 --> 00:27:33.400] It's not, it may be partly based on mistaken views about the human beings, about some beings, human beings being less intelligent than others.
[00:27:33.400 --> 00:27:36.760] Those are just factually incorrect.
[00:27:37.080 --> 00:27:41.560] But it also could be based on bad moral reasoning.
[00:27:42.520 --> 00:27:46.680] So just because people vote on something doesn't make us true.
[00:27:47.000 --> 00:27:57.880] And sadly, just because something's true doesn't mean people are going to vote on that basis.
[00:27:57.880 --> 00:27:58.360] Yeah.
[00:27:59.000 --> 00:28:02.920] As we know with things like climate change and vaccines.
[00:28:02.920 --> 00:28:06.120] So let's not equate the two.
[00:28:06.120 --> 00:28:11.640] Truth and democracy isn't about arriving at things at vote.
[00:28:11.640 --> 00:28:15.000] Political authority we arrive at by voting.
[00:28:15.040 --> 00:28:18.640] Is that what you mean by the phrase speak truth to power?
[00:28:18.640 --> 00:28:20.400] Or what does that mean?
[00:28:21.680 --> 00:28:29.440] I think when we speak truth to power, when we think of that as a value, I mean, that's an old civil rights phrase.
[00:28:30.080 --> 00:28:56.640] And what it meant then is what I mean by it now, which is in democracy, we think it's important for people to have the freedom to criticize those that are in power, but also to be able to speak those truths that are often obscured by, let's say, the majority.
[00:28:56.640 --> 00:29:14.160] So the minority, we think in a democracy, needs to be able to have, as it were, the room, the space to articulate truths about their experience that the majority may be overlooking.
[00:29:14.480 --> 00:29:19.360] So when we talk about truth to power, we're talking about both those things.
[00:29:19.680 --> 00:29:28.080] And in both cases, that value that we have, democracy, people have it on the right and the left.
[00:29:28.080 --> 00:29:34.160] It's not a progressive thing or a conservative thing.
[00:29:34.480 --> 00:29:36.080] But it is a democratic thing.
[00:29:37.040 --> 00:29:40.480] And it's a combination of those two ideas.
[00:29:40.480 --> 00:29:41.040] Yeah.
[00:29:42.160 --> 00:29:43.840] Your slavery example is good.
[00:29:43.840 --> 00:29:45.520] You know, Lincoln made this point.
[00:29:45.520 --> 00:29:59.280] You know, if you decide you're going to set up a system in which more or less, in which intelligence or smarts determines who gets to be a slave, then be careful because the first person you meet that's smarter than you can enslave you, right?
[00:29:59.400 --> 00:30:03.880] Or skin color or any other criteria like that.
[00:30:03.880 --> 00:30:22.120] In a way, I guess it's an argument between utilitarian ethics, consequentialism, and deontological rights, let's say, ethics in which it doesn't matter how many people think or what arguments are made why it would be better for society that slavery was reinstituted.
[00:30:22.120 --> 00:30:23.240] It doesn't matter.
[00:30:23.240 --> 00:30:24.280] It's just wrong.
[00:30:24.280 --> 00:30:25.720] Well, why is it wrong?
[00:30:25.720 --> 00:30:29.880] You know, so then we have to drill down into rights and where do they come from?
[00:30:29.880 --> 00:30:33.640] Is it just, like Bentham said, you know, nonsense on stilts?
[00:30:33.640 --> 00:30:35.960] Natural rights is nonsense on stilts.
[00:30:35.960 --> 00:30:47.000] Or is there something even deeper about it that the rights themselves have evolved because of the consequences for society?
[00:30:47.000 --> 00:30:49.000] This is a utilitarian argument.
[00:30:49.000 --> 00:30:57.320] Society is really better if everybody is treated equally, if everyone's equal under the law, if everyone has the same rights and so on.
[00:30:57.320 --> 00:31:05.880] So it's not like we're just plucking it out of thin air, like, oh, I don't know, just today randomly, let's just say everybody has equal rights and maybe we'll change our mind tomorrow.
[00:31:06.920 --> 00:31:09.640] You know, that it's really better for society.
[00:31:09.640 --> 00:31:24.920] And that can you make the argument that over centuries we have evolved in a certain direction that everyone is a member of the human race but born with natural rights.
[00:31:24.920 --> 00:31:29.240] We just call it natural rights, call it whatever you want, but here are the consequences of it.
[00:31:29.240 --> 00:31:31.400] We're going to have these laws and so on.
[00:31:31.400 --> 00:31:38.760] And so we have the Bill of Rights and amendments that can't be overturned willy-nilly, you know, takes two-thirds of the states and so on and so forth.
[00:31:39.080 --> 00:31:44.680] Those are all in place because of those kind of consequential type utilitarian arguments that somebody could make.
[00:31:44.800 --> 00:31:51.440] You know, I think Ford was built on the belief that the world doesn't get to decide what you're capable of.
[00:31:51.440 --> 00:31:52.480] You do.
[00:31:52.480 --> 00:31:56.000] So ask yourself: can you or can't you?
[00:31:56.000 --> 00:32:00.720] Can you load up a Ford F-150 and build your dream with sweat and steel?
[00:32:00.720 --> 00:32:04.480] Can you chase thrills and conquer curves in a Mustang?
[00:32:04.480 --> 00:32:08.960] Can you take a Bronco to where the map ends and adventure begins?
[00:32:08.960 --> 00:32:12.960] Whether you think you can or think you can't, you're right.
[00:32:12.960 --> 00:32:15.600] Ready, set, Ford.
[00:32:15.600 --> 00:32:17.280] Society's better if women don't have the vote.
[00:32:17.280 --> 00:32:18.080] Let's take it away.
[00:32:18.080 --> 00:32:19.920] No, that's not going to happen.
[00:32:21.200 --> 00:32:21.840] Yeah.
[00:32:22.160 --> 00:32:30.560] So, obviously, we're not going to solve the age-old debate between consequentialism or utilitarianism.
[00:32:30.640 --> 00:32:31.040] Oh, let's go.
[00:32:31.040 --> 00:32:31.520] Let's solve it.
[00:32:32.000 --> 00:32:33.280] Let's go for it.
[00:32:35.360 --> 00:32:41.360] What I think is important about what you're saying, there's a lot of important things.
[00:32:41.680 --> 00:32:51.760] And I'm sympathetic, at least on Monday, Wednesdays, and Fridays, with your sort of consequentialist justification of human rights.
[00:32:54.320 --> 00:33:03.520] But putting that aside for a minute, what I think you're getting at is a point that I'm freaking to make in the book.
[00:33:03.840 --> 00:33:15.680] Which is when it comes to these political principles, like slavery is undemocratic, or human rights are politically important.
[00:33:16.000 --> 00:33:18.640] Those are principles.
[00:33:18.640 --> 00:33:21.280] Just like human rights are principles.
[00:33:21.280 --> 00:33:28.560] What makes those principles, those political principles, true, if they are true?
[00:33:29.200 --> 00:33:53.240] Well, the founder of pragmatism, who had his own sympathies with utilitarianism, Charles Sanders Burrs, said, What matters when it comes to truth is whether the judgment in question, the proposition in question, let's say slavery is wrong.
[00:33:53.560 --> 00:34:02.040] Is that a proposition that will survive the fires of experience?
[00:34:02.360 --> 00:34:21.480] Will at last will stand up to the evidence and to reason and to scrutiny, to the consequences, as you were putting it, over time, into the future.
[00:34:21.800 --> 00:34:30.120] It's hard to know, of course, which of the propositions we make in politics are going to be able to do that.
[00:34:30.120 --> 00:34:33.320] I mean, some we can know, like the slavery one.
[00:34:33.320 --> 00:34:35.400] It seems pretty easy.
[00:34:35.400 --> 00:34:43.480] But a lot of the claims we make, we're hoping they're going to stand up to scrutiny.
[00:34:43.800 --> 00:34:44.760] But we don't know.
[00:34:44.760 --> 00:34:45.960] That's what makes it hard.
[00:34:45.960 --> 00:34:48.040] That's what makes politics hard.
[00:34:48.280 --> 00:34:53.240] That's what makes finding truth and morality hard.
[00:34:53.560 --> 00:35:03.000] But just because it's hard doesn't mean there aren't some propositions that are going to stand up to the consequences and some that are not.
[00:35:03.000 --> 00:35:03.400] Yeah.
[00:35:05.080 --> 00:35:11.400] When we come to truth and politics, what we mean is what Purse meant.
[00:35:11.720 --> 00:35:18.320] We want our ideas to stand up over scrutiny, over time.
[00:35:23.680 --> 00:35:24.880] Ages.
[00:35:24.880 --> 00:35:29.280] We want them to last and be rational to the end.
[00:35:29.920 --> 00:35:39.760] Would a test of that truth be something like a Rawlsian thought experiment in which you don't know which group you're going to be in when you pass this law?
[00:35:39.760 --> 00:35:46.240] Therefore, you pass it to make it sure that everybody's treated equally, regardless of which group they happen to be born in.
[00:35:46.240 --> 00:35:51.200] That might be kind of a thought experiment to run as a form of reason.
[00:35:52.160 --> 00:35:53.680] That's a great point, Michael.
[00:35:53.680 --> 00:35:56.560] I'm really glad you brought that up.
[00:35:56.560 --> 00:35:57.840] I think so.
[00:35:57.840 --> 00:36:15.840] Now, of course, that thought experiment is hotly debated, to put it mildly, because some people are going to say, well, reasonably enough, how can we make decisions if we don't know who we are or who we're going to be?
[00:36:16.480 --> 00:36:44.800] But in a previous book, book all in praise of reason, I took that thought experiment and said, look, one way of trying to decide what the sort of right methods of evidence we can use in politics and everything else to decide on the questions at hand.
[00:36:45.440 --> 00:36:54.720] How do we decide debates over whether science is the right way to make decisions about vaccines or not?
[00:36:55.680 --> 00:37:01.400] Well, ask yourself the Rawsing question that you've just asked.
[00:37:01.720 --> 00:37:06.680] If you didn't know who you were going to be, sorry about that.
[00:37:06.680 --> 00:37:07.400] That's all right.
[00:37:07.400 --> 00:37:22.840] Then, if you didn't know who you were going to be, or whether, for example, you were going to be in charge, in power, or just an elementary school kid.
[00:37:23.480 --> 00:37:32.840] What principles would you want people to be using to decide on what sorts of things should be done?
[00:37:33.160 --> 00:37:41.800] Would you like scientific principles that at least are, in principle, open to everyone?
[00:37:41.800 --> 00:37:49.080] They use sense perception and reasoning, things we all share in common.
[00:37:49.400 --> 00:37:52.280] Or do you want the priests to decide?
[00:37:52.600 --> 00:38:01.800] The only people to decide, where you might not know whether you're the only person in the future society.
[00:38:02.440 --> 00:38:22.360] So I think what you're right about is in politics, we, of course, are going to use tests like that to decide again on which principles we're going to use to assess the kinds of things we think we ought to do.
[00:38:25.160 --> 00:38:33.320] Let me use this analogy from Steven Pinker with moral values analogous to platonic truths.
[00:38:33.320 --> 00:38:46.560] On this analogy, we're born with a rudimentary concept of number, but as soon as we build on it with formal mathematical reasoning, the nature of mathematical reality forces us to discover some truths, but not others.
[00:38:46.560 --> 00:38:54.800] No one who understands the concept of two, the concept of four, and the concept of addition can come to any conclusion but that two plus two equals four.
[00:38:54.800 --> 00:39:07.680] Perhaps we're born with a rudimentary moral sense, and as soon as we build on it with moral reasoning, the nature of moral reality forces us to some conclusions, but not others, like, you know, would you rather live in North Korea or South Korea?
[00:39:07.680 --> 00:39:09.360] Do you want to be a slave or not?
[00:39:09.360 --> 00:39:10.480] And so on.
[00:39:10.480 --> 00:39:24.800] And so Pinker calls this the principle of interchangeable perspectives, goes by other names: Kant's categorical imperative, Rawls' veil of ignorance, the social contract of Hobbes, Rousseau, and Locke, Spinoza's viewpoint from eternity, and so on.
[00:39:24.800 --> 00:39:40.480] If I appeal to you to do anything that affects me, to get off my foot or tell me the time or not run me over with your car, then I can't do it in a way that privileges my interest over yours, say retaining my right to run you over with my car if I want you to take me seriously.
[00:39:40.480 --> 00:39:45.520] Unless I'm Galactic Overlord, I have to state my case in a way that would force me to treat you in kind.
[00:39:45.840 --> 00:39:57.360] I can't act as if my interests are special just because I'm me and you're not, any more than I can persuade you that the spot I'm standing on is a special place in the universe just because I happen to be standing on it.
[00:39:58.000 --> 00:39:58.400] Right?
[00:39:58.400 --> 00:40:10.000] So Pinker's point, you know, is that there's been moral progress because we've all kind of discovered this as a principle, but that can only happen in democracy where people can actually state their voices.
[00:40:10.000 --> 00:40:11.520] Exactly.
[00:40:11.840 --> 00:40:17.360] And that's much of what I've been trying to say over the last couple of years.
[00:40:18.320 --> 00:40:20.080] Am I writing on this?
[00:40:20.080 --> 00:40:21.600] Which is that?
[00:40:22.240 --> 00:40:36.360] Look, as I say in the book, right now, we're facing a sort of double crisis, a double crisis of faith around the world, not just in the United States.
[00:40:36.680 --> 00:40:40.520] A lot of people are losing faith in democracy.
[00:40:40.840 --> 00:40:43.080] It's not a coincidence.
[00:40:43.080 --> 00:40:49.960] They're also losing faith in concepts like truth, rationality, evidence.
[00:40:50.280 --> 00:40:52.760] Why is that not a coincidence?
[00:40:53.720 --> 00:41:06.360] Because democracy is the kind of politics that demands that we use evidence and truth without thinking about what we ought to do.
[00:41:06.360 --> 00:41:18.920] We're thinking about morality and morality insofar as we practice it based on the principle that you were just talking about.
[00:41:19.560 --> 00:41:20.840] Call it what you will.
[00:41:21.160 --> 00:41:28.920] Do you go so far as to say we're living in a post-truth world or a world of alternative facts or whatever?
[00:41:28.920 --> 00:41:30.840] Look, lots of people say those things.
[00:41:31.560 --> 00:41:37.640] I avoid saying those things because, well, I can make sense of them.
[00:41:37.640 --> 00:41:40.040] I mean, I know what people are getting at.
[00:41:40.360 --> 00:41:42.120] I mean, everybody does.
[00:41:42.440 --> 00:41:44.680] It's not particularly precise.
[00:41:44.680 --> 00:41:50.040] Because, look, you never really boast truth.
[00:41:50.360 --> 00:41:52.360] Truth is what it is.
[00:41:53.000 --> 00:41:59.160] It's propositions either stand up to rational scrutiny or they don't.
[00:41:59.480 --> 00:42:10.560] And while there might be cases where we'll agree that we'll never know what's going to be true.
[00:42:10.560 --> 00:42:17.360] Or a proposition is too vague to determine its truth.
[00:42:18.640 --> 00:42:26.240] To say that we're both truth is to sort of use words in an abstracting way.
[00:42:26.880 --> 00:42:33.360] That being said, it's clear that we're in a crisis of faith about truth.
[00:42:34.000 --> 00:42:48.800] Lots of people seem to think that truth is found by looking on when people say it on TikTok.
[00:42:49.440 --> 00:42:55.280] By truth is correlated with how many likes you get online.
[00:42:56.880 --> 00:43:06.720] That's a sort of crisis of faith in what I mean when I talk about truth, which is paying a sting to reasons.
[00:43:06.720 --> 00:43:26.880] Now look, the people that often fly truth's banner, who wave it the loudest, the people who call like our president, the social media platform they own truth.
[00:43:26.880 --> 00:43:33.920] Those people, just like the people that wave democracy's banner, the loudest.
[00:43:36.160 --> 00:43:45.840] Those people both wave a banner, use a word, but actually don't follow the idea.
[00:43:46.160 --> 00:43:48.960] So we've got to be careful.
[00:43:48.960 --> 00:43:50.960] Lots of people talk about truth.
[00:43:50.960 --> 00:43:54.120] Lots of people talk about democracy.
[00:43:54.120 --> 00:43:55.840] And don't believe in either one.
[00:43:55.840 --> 00:44:02.520] You mean countries that countries that put the word democratic in the title of their country means they're not actually democratic?
[00:44:03.480 --> 00:44:04.840] Yeah, shockingly enough.
[00:44:05.720 --> 00:44:06.360] Yeah.
[00:44:06.680 --> 00:44:07.160] Yeah.
[00:44:07.480 --> 00:44:13.000] Talk a little bit about your thought experiment at the beginning of the book, which I really enjoyed about the Twitbookeans.
[00:44:13.320 --> 00:44:13.960] Yeah.
[00:44:15.160 --> 00:44:17.000] And the broader principle.
[00:44:17.000 --> 00:44:20.840] I mean, you know, this kind of masses and bucks populi.
[00:44:23.720 --> 00:44:38.360] So imagine, for the listeners, imagine a society that followed one rule when it comes to the political discussions.
[00:44:38.680 --> 00:44:43.880] That rule is this: call it the rule of conformity.
[00:44:43.880 --> 00:44:59.720] Only say or boast those things online or off that are liked by your friends, thumbs up by your friends, and disliked by your enemies.
[00:45:00.360 --> 00:45:03.320] That's the only rule they follow.
[00:45:03.960 --> 00:45:07.960] Now, call these people the Twitbookians.
[00:45:08.280 --> 00:45:18.920] Now, here's the thing about the Twitbookians: here's the thing: the Twitbookians don't know this about themselves.
[00:45:18.920 --> 00:45:28.360] They think that when they say things about politics, they're using facts and evidence.
[00:45:28.360 --> 00:45:40.280] And they insist that they're appealing to reasons, even though they're never actually motivated by facts, reasons, or evidence.
[00:45:40.600 --> 00:45:50.560] They're actually motivated, although they don't know this, by what their friends like, what their enemies dislike.
[00:45:50.880 --> 00:45:54.960] They're only motivated by conformity.
[00:45:55.600 --> 00:46:11.520] Now, the point of opening up the book with this experiment is to get people to think about this question: Are we like Twitbookians?
[00:46:12.480 --> 00:46:19.040] Some philosophers, some political scientists think we are.
[00:46:19.360 --> 00:46:27.600] Some people think, in fact, that human nature makes us Twitbookians.
[00:46:27.600 --> 00:46:32.320] If that's true, democratic politics is doomed.
[00:46:33.280 --> 00:46:36.560] Because think about the Twitbookian society.
[00:46:36.560 --> 00:46:39.760] Is it really going to be democratic?
[00:46:40.400 --> 00:46:44.800] Are they even going to be having rational disagreements?
[00:46:45.120 --> 00:46:45.760] No.
[00:46:46.400 --> 00:46:54.800] Twitbookians with different political views are really disagreeing with one another.
[00:46:54.800 --> 00:47:01.280] They're just bleeding at each other like sheep across a political divide.
[00:47:01.920 --> 00:47:11.040] So the question I want people to think-think hard about is: is our society Twitbookian?
[00:47:11.360 --> 00:47:15.680] And if it is, how do we get it to change?
[00:47:16.960 --> 00:47:25.520] Right, so here are you speaking to the myth of the rational voter that people are not utilitarian maximizers and so on.
[00:47:25.520 --> 00:47:28.160] They're just voting their team.
[00:47:28.160 --> 00:47:45.640] Yes, to some extent, as I said, and as you just noted, in political science, there's an open debate about whether voting itself is a rational enterprise.
[00:47:47.400 --> 00:47:58.360] I want to say, in the book, I say there's a lot to be said for that side of the argument.
[00:47:58.360 --> 00:48:00.040] But it's not inevitable.
[00:48:00.360 --> 00:48:02.440] It's not inevitable.
[00:48:02.760 --> 00:48:03.960] Or not.
[00:48:04.600 --> 00:48:09.640] The argument that's inevitable is an argument about human nature.
[00:48:09.640 --> 00:48:19.880] It's the argument that humans by nature are just incapable of being rational about things they care about.
[00:48:19.880 --> 00:48:21.880] David Hume thought that roughly.
[00:48:22.840 --> 00:48:26.920] Reason is the slave of the passions, he said.
[00:48:27.240 --> 00:48:33.240] Okay, what I want to say about Hume is like, yeah, most of the time, that's often right.
[00:48:33.880 --> 00:48:35.240] But you know what?
[00:48:36.520 --> 00:48:47.400] Over the centuries, we've found out methods for counteracting our tendency to be irrational.
[00:48:48.040 --> 00:48:50.120] And you know what those methods are?
[00:48:50.440 --> 00:48:53.880] You already mentioned them at the top of the hour.
[00:48:54.520 --> 00:49:00.760] They are methods like using evidence in the law.
[00:49:02.040 --> 00:49:12.200] Requiring journalists fact-check their sources to have more than one source.
[00:49:12.200 --> 00:49:18.880] Requiring scientists to publish in peer-reviewed journals.
[00:49:20.400 --> 00:49:22.720] All those rules.
[00:49:23.040 --> 00:49:26.000] Everything that I just said is a rule.
[00:49:26.320 --> 00:49:40.240] Those rules were set up for the sole purpose of correcting for what Kant called the crooked timber of humanity.
[00:49:40.240 --> 00:49:54.880] That is our bent nature, our tendencies, Michael, to just go with the flow, to favor our friends, to ignore evidence that contradicts our opinions.
[00:49:56.880 --> 00:50:11.040] Society, democratic society, requires that we play not just by the political rules of fair play, but by the epistemic rules of fair play.
[00:50:11.040 --> 00:50:13.920] The rules of evidence.
[00:50:13.920 --> 00:50:20.080] And what the Deutbuchians have forgotten is to play by those rules.
[00:50:20.400 --> 00:50:21.280] Right.
[00:50:21.600 --> 00:50:24.080] Yeah, so that's deeply buried in our human nature.
[00:50:24.080 --> 00:50:25.840] It's hard to get around it.
[00:50:25.840 --> 00:50:26.960] Possible.
[00:50:27.520 --> 00:50:28.320] Possible.
[00:50:28.640 --> 00:50:35.920] You know, this, you know, starting off with the idea of if there is a God, he's omniscient and omnipotent.
[00:50:36.480 --> 00:50:37.920] I don't know if there is or not.
[00:50:37.920 --> 00:50:40.960] I'm an atheist, but I know for sure I'm not God.
[00:50:40.960 --> 00:50:42.960] And I also know that you're not God.
[00:50:42.960 --> 00:50:44.320] Okay, so we can start there.
[00:50:44.320 --> 00:50:50.160] Or if you like the secular version, there is an objective reality, only I don't know what it is, and you don't either.
[00:50:51.440 --> 00:50:54.640] And so we have to start with there, you know, just the principle of fallibilism.
[00:50:54.640 --> 00:50:56.960] Okay, so we all might be wrong.
[00:50:56.960 --> 00:50:58.320] Okay, so then how do you know?
[00:50:58.320 --> 00:51:04.360] Well, then we have to set up rules of evidence, and then we're off and rolling into setting up these different systems.
[00:51:04.760 --> 00:51:09.240] And so to overcome that, all right, so let's let me go back to Hume.
[00:51:09.240 --> 00:51:13.640] You know, reason is the slave, is and always will be the slave of the passions.
[00:51:13.640 --> 00:51:25.640] Of course, he didn't mean it's, you know, it's it's time to go party like it's 1999, but that reason can tell you how to get to your goals, but what are the goals in the first place?
[00:51:25.640 --> 00:51:27.400] How did you get those goals?
[00:51:27.400 --> 00:51:31.080] Right there, he's separating is and not, I think, right?
[00:51:31.080 --> 00:51:32.520] Am I interpreting that right?
[00:51:32.520 --> 00:51:33.880] No, he's right.
[00:51:33.880 --> 00:51:34.920] You're right.
[00:51:34.920 --> 00:51:47.880] I mean, Michael, I think that's correct for Hume and lots of people after him, particularly researchers in political psychology.
[00:51:48.200 --> 00:51:57.080] They want to say, reason, as you just said, can help you determine how to get where you're going.
[00:51:57.400 --> 00:52:04.600] But your direction, your goal, is set by the heart, by the passions.
[00:52:05.880 --> 00:52:17.160] And if you think that, then you'll think that, well, as Hume did, you can really only debate rationally about means, means to ends.
[00:52:17.240 --> 00:52:22.120] Can't debate about the ends, the principles themselves.
[00:52:22.120 --> 00:52:26.120] Because that's for the heart to decide.
[00:52:26.440 --> 00:52:30.760] And people have different feelings, different sentiments.
[00:52:31.080 --> 00:52:33.080] And that's not rational.
[00:52:34.360 --> 00:52:35.960] I disagree with that.
[00:52:36.600 --> 00:52:48.640] I think, as we were already talking about, you can reason about ends because you can put them in the hopper of evidence.
[00:52:49.600 --> 00:52:51.680] Go back to Peirce.
[00:52:52.000 --> 00:52:56.800] What's true is what he called concordant.
[00:52:56.800 --> 00:53:01.920] He said true ideas are coordinate, concordant ideas.
[00:53:02.240 --> 00:53:11.440] And what he meant was: even with principles, we can see which stand up to scrutiny.
[00:53:17.440 --> 00:53:21.360] Go back to my original analogy.
[00:53:21.360 --> 00:53:27.440] Mathematics is a place for truth and falsity, if anything is.
[00:53:29.680 --> 00:53:35.360] But what makes claims about numbers?
[00:53:35.680 --> 00:53:38.160] Well, numbers.
[00:53:38.800 --> 00:53:39.920] What are those?
[00:53:40.240 --> 00:53:41.360] Good question.
[00:53:42.640 --> 00:53:47.760] You and I aren't going to determine, thank God, the answer to that question right now.
[00:53:48.080 --> 00:54:13.520] But whatever the truth is of those, whatever makes mathematical judgments true, even if we don't know the answer to that, we can agree that mathematical judgments can be true because we have ways of determining which stand up to scrutiny.
[00:54:13.520 --> 00:54:19.440] It's called mathematical proof.
[00:54:19.440 --> 00:54:24.000] In politics, we don't have mathematical proofs, sadly.
[00:54:25.280 --> 00:54:29.120] But we do have appeals to reasons.
[00:54:30.360 --> 00:54:47.240] Now, sometimes the best we can do in politics is just say, look, this political picture hangs together, coheres with the evidence.
[00:54:47.240 --> 00:54:48.360] Best.
[00:54:49.640 --> 00:54:53.960] And this political picture doesn't really fit together.
[00:54:53.960 --> 00:54:56.520] It's filled with contradictions.
[00:54:56.520 --> 00:54:59.960] And it contradicts the empirical evidence we have.
[00:54:59.960 --> 00:55:04.120] So you're combining a correspondence theory of truth and a coherence theory of truth.
[00:55:05.080 --> 00:55:16.760] I think in politics, well, my official view is truth comes in more than one form.
[00:55:17.080 --> 00:55:20.760] All truth is a matter of fitting with reality.
[00:55:20.760 --> 00:55:23.640] But what that means, difference.
[00:55:24.280 --> 00:55:39.640] When we're talking about trees and battleships and mountains, empirical reality, I think the correspondence theory, in one form or another, is the best theory.
[00:55:39.640 --> 00:55:40.920] Myself.
[00:55:41.560 --> 00:55:51.240] When we're talking about morality, then I think something a little bit more like the coherence theory makes sense.
[00:55:52.520 --> 00:55:55.560] And that's what I was just saying about politics.
[00:55:55.560 --> 00:56:07.640] When it comes to what I earlier called judgments about politics in the narrow or pure sense, judgments about what we ought to do as a society.
[00:56:09.240 --> 00:56:22.080] Often the best we could do is just see which of these political stories makes sense, which holds together, and which doesn't contradict the empirical facts.
[00:56:23.680 --> 00:56:29.440] If we could do that, even we must be getting closer to the truth.
[00:56:29.760 --> 00:56:34.240] As you said, look, let's be realistic.
[00:56:38.720 --> 00:56:41.600] It's not easy to know what's true.
[00:56:42.080 --> 00:56:46.400] Skeptic like yourself is going to say, we'll never know what's true.
[00:56:48.000 --> 00:56:51.840] You know, I'm one of these people that thinks sort of in between.
[00:56:51.840 --> 00:56:56.240] I think some things I know are true, or a push short are true.
[00:56:56.880 --> 00:57:02.480] Things I'm willing to bat my life on, for example, which is pretty high standard.
[00:57:03.120 --> 00:57:10.240] And then there are things that I have no idea are true.
[00:57:10.240 --> 00:57:13.040] And I don't think anybody knows are true.
[00:57:13.040 --> 00:57:18.400] Well, this is why I like Bayesian reasoning so much because it allows you to put a probability on it.
[00:57:18.960 --> 00:57:28.480] And then I like Cromwell's rule: never assign a one or zero to anything, just in case, because I beseech you in the bowels of Christ, you might be mistaken.
[00:57:28.960 --> 00:57:30.560] All right, so, okay, yeah.
[00:57:31.120 --> 00:57:46.880] But then, wait, I was going to say, oh, yeah, so I want to also add pragmatism to the study of what, of history and psychology, of what people actually do and have done in the past, and then map that onto what you're talking about.
[00:57:46.880 --> 00:57:51.040] So, the low-hanging fruit, would you rather live in North Korea or South Korea?
[00:57:51.120 --> 00:57:52.240] Everybody knows the answer.
[00:57:52.240 --> 00:57:53.120] Why?
[00:57:53.120 --> 00:57:56.480] You know, well, look at historically, people vote with their feet.
[00:57:56.720 --> 00:57:59.360] They don't want to be enslaved and so on.
[00:57:59.360 --> 00:58:04.840] And you can then put reason behind that, as Lincoln said: as I would not be a slave, I would not be a slave owner.
[00:58:04.840 --> 00:58:10.680] Yeah, that's a no-brainer in hindsight, but that's not what people thought at the time.
[00:58:10.680 --> 00:58:11.160] Right?
[00:58:11.160 --> 00:58:19.960] So, you know, the you know, reason has to be tempered because, you know, if you and I lived 200 years ago, we might be making rational arguments for why slavery is great.
[00:58:20.280 --> 00:58:20.920] Yeah.
[00:58:21.240 --> 00:58:26.920] But that just shows you that some ideas don't stand up.
[00:58:26.920 --> 00:58:27.640] Right, right.
[00:58:27.640 --> 00:58:27.880] Okay.
[00:58:28.200 --> 00:58:28.840] That's the point.
[00:58:28.840 --> 00:58:29.080] Right.
[00:58:29.080 --> 00:58:30.840] So over time, we're getting closer.
[00:58:31.080 --> 00:58:36.600] I'm just trying to make over time new information comes in.
[00:58:36.600 --> 00:58:36.920] Yeah.
[00:58:36.920 --> 00:58:37.400] Yeah.
[00:58:37.400 --> 00:58:42.520] Now, do we really need new information when it comes to slavery?
[00:58:42.520 --> 00:58:43.160] No.
[00:58:43.800 --> 00:58:57.640] But for all sorts of other more complicated questions, let's say about the tax code, like you talked about at the beginning, or complicated questions of a very prudential kind in politics.
[00:58:57.640 --> 00:59:02.440] A lot of politics is much more ditty-gritty than what we're talking about here.
[00:59:02.440 --> 00:59:08.200] You know, my wife served on the local town council.
[00:59:08.520 --> 00:59:11.880] We live in a small New England village.
[00:59:12.200 --> 00:59:21.720] Political questions they had to decide were like, should we build a bike lane or should we build a hockey path?
[00:59:21.720 --> 00:59:22.200] Right.
[00:59:22.520 --> 00:59:23.080] Yeah.
[00:59:23.080 --> 00:59:27.960] Now, you know, that's a political question for the village.
[00:59:28.280 --> 00:59:31.000] People got heated about it.
[00:59:33.160 --> 00:59:37.240] But it's not a grand glorious question.
[00:59:37.560 --> 00:59:50.960] But the right answer to that question is going to be the one that sort of stands up over time, given the goals of the village, the money they have, the taxes, and so forth.
[00:59:51.280 --> 00:59:53.040] It's complicated.
[00:59:53.360 --> 01:00:04.320] And the best you can do, often, in real life, is to, as you said, play the probabilities and hope that you get it right.
[01:00:04.640 --> 01:00:08.000] What you shouldn't do, and a lot of people do.
[01:00:08.960 --> 01:00:10.640] And I think you know this.
[01:00:10.960 --> 01:00:14.960] A lot of people engage in black and white thinking.
[01:00:14.960 --> 01:00:23.440] They think, oh, because it's not clear what the answer is, that means there's no answer.
[01:00:23.440 --> 01:00:26.800] That's the mistake that people make.
[01:00:27.120 --> 01:00:37.600] That is a mistake that political psychologists make when they say politics has nothing to do with truth.
[01:00:37.600 --> 01:00:39.040] Of course it does.
[01:00:40.240 --> 01:00:46.000] Just it's a realm where truth is hard to come by.
[01:00:46.640 --> 01:00:48.160] But guess what?
[01:00:48.800 --> 01:00:50.960] Truth is hard to come by.
[01:00:50.960 --> 01:00:53.440] And higher mathematics too.
[01:00:53.440 --> 01:00:55.200] Number theory.
[01:00:55.520 --> 01:00:57.600] It's not easy.
[01:00:58.240 --> 01:01:12.160] And just because some truths and mathematics are easy, like two and two or four, doesn't mean that what the mathematicians do is easy.
[01:01:12.480 --> 01:01:16.640] Some political truths, as you say, are easy.
[01:01:16.640 --> 01:01:18.880] Nobody wants to be a slave.
[01:01:19.840 --> 01:01:21.320] Real politics is hard.
[01:01:21.200 --> 01:01:21.480] Yeah.
[01:01:22.480 --> 01:01:26.240] But grow up and get used to it, people.
[01:01:26.560 --> 01:01:28.480] Don't give up on truth.
[01:01:28.480 --> 01:01:32.200] Don't give up on justice just because it's hard.
[01:01:29.520 --> 01:01:34.040] Nice.
[01:01:34.040 --> 01:01:35.080] Yeah, I like that.
[01:01:29.680 --> 01:01:36.440] I like that a lot.
[01:01:37.800 --> 01:01:45.240] Yeah, but in terms of these overall trends, like back to where we started, pretty much all countries have a social safety net now.
[01:01:45.240 --> 01:01:55.880] And all the Western democracies spend, you know, as I said, roughly, I don't know, 18% to 25% of their GDP on these social services.
[01:01:55.880 --> 01:01:59.880] So, and, you know, the conservatives want it to be lower, liberals want it to be higher.
[01:01:59.880 --> 01:02:00.440] Okay.
[01:02:00.440 --> 01:02:05.720] But no one's saying, you know, we should just go back to the way it was in the 18th century when there was nothing, right?
[01:02:06.040 --> 01:02:15.240] So would you say that pragmatically, your pragmatism, you know, historically, that's kind of a direction towards something that really is true.
[01:02:15.240 --> 01:02:18.280] This is the way societies really are better when they're that way.
[01:02:18.600 --> 01:02:21.320] You can argue about the, you know, the percentage.
[01:02:21.640 --> 01:02:22.360] Yes.
[01:02:22.680 --> 01:02:41.400] I think that there are lots of different examples of these sorts of claims where you can see some coalescing, some agreement, some overlapping consensus, as Rawl sort of said.
[01:02:42.360 --> 01:02:47.560] Now, the problem is, as I just said, in the details.
[01:02:47.560 --> 01:02:51.640] I mean, what is the percentage we should be spending?
[01:02:51.640 --> 01:02:52.920] That's harder.
[01:02:52.920 --> 01:02:53.480] Yeah.
[01:02:55.320 --> 01:03:03.800] But it is actually important for this conversation that you point out we can make progress.
[01:03:03.800 --> 01:03:07.480] And we can look back and see we've made progress.
[01:03:09.080 --> 01:03:11.000] Look.
[01:03:11.320 --> 01:03:29.120] One thing we need to remember right now, especially, is that political progress and rational progress requires what I earlier called these sort of epistemic rules, rules of evidence.
[01:03:29.600 --> 01:03:44.240] More importantly, it requires institutions that act by those rules, that educate people in those rules, that act as sort of the infrastructure of knowledge for a society.
[01:03:45.600 --> 01:03:48.000] Epistemic infrastructure.
[01:03:48.320 --> 01:03:50.800] Universities are part of that.
[01:03:51.120 --> 01:03:53.840] Primary schools are part of that.
[01:03:54.160 --> 01:03:56.960] The legal system is part of that.
[01:03:57.280 --> 01:04:01.440] The National Science Foundation is part of that.
[01:04:01.440 --> 01:04:02.400] And so on.
[01:04:02.400 --> 01:04:05.600] Journalism is part of that.
[01:04:05.920 --> 01:04:14.800] It provides the infrastructure that helps us, just like a bridge helps us get where we want to go physically.
[01:04:15.120 --> 01:04:28.560] Epistemic infrastructure provides a way for us to get to what's rational, to get to consensus using reason and evidence.
[01:04:29.840 --> 01:04:38.880] So if we want to have progress of the sort you were talking about, we better protect that infrastructure.
[01:04:38.880 --> 01:04:39.680] Yeah.
[01:04:40.640 --> 01:04:41.360] Yeah, I agree.
[01:04:41.360 --> 01:04:43.040] And that's what your book is all about.
[01:04:43.040 --> 01:04:46.160] And you've argued for so well for your career.
[01:04:46.160 --> 01:04:56.400] You know, one of my favorite quotes in this space comes from John Stuart Mill: a party of order and stability and a party of progress and change are both necessary for a civil society.
[01:04:56.400 --> 01:05:04.440] Are we past that window of flexibility with, I don't know, the mega right and the crazy far-woke progressive left?
[01:05:04.600 --> 01:05:10.200] And I don't know what craziness is going on these days, and not just here, but in other countries.
[01:05:12.440 --> 01:05:15.240] Well, I don't know.
[01:05:16.520 --> 01:05:33.880] But I do know that right now, in my country, in the U.S., our country, we're witnessing what I would call an unprecedented attack on our epistemic infrastructure.
[01:05:33.880 --> 01:05:37.720] On the infrastructure of science and the law.
[01:05:39.640 --> 01:06:00.360] I do think that once that happens, once that infrastructure is damaged to the point of no repair, the democracy will fade or even blink out.
[01:06:02.120 --> 01:06:03.000] Wow.
[01:06:03.640 --> 01:06:06.920] So I think we are at a state of crisis.
[01:06:06.920 --> 01:06:07.560] Yeah.
[01:06:08.200 --> 01:06:11.400] More than when I even wrote the book.
[01:06:13.000 --> 01:06:17.960] And I think we can see that with the defunding of science.
[01:06:18.280 --> 01:06:19.160] And I'm not.
[01:06:19.480 --> 01:06:21.080] I'm not a scientist.
[01:06:21.080 --> 01:06:23.080] I'm a philosopher.
[01:06:23.720 --> 01:06:29.560] But I'm a philosopher that respects science and history and the law.
[01:06:31.160 --> 01:06:34.760] So, are we past democracy?
[01:06:34.760 --> 01:06:36.440] No, we're not.
[01:06:36.400 --> 01:06:38.280] Of course, there's soap.
[01:06:38.920 --> 01:06:41.880] But there are warning signs.
[01:06:42.200 --> 01:06:49.120] And, you know how when it comes to real infrastructure, like bridges to roads.
[01:06:50.080 --> 01:06:54.160] If the bridge gets dangerous, we put up a warning sign.
[01:06:54.880 --> 01:06:56.240] Bridges out.
[01:06:56.640 --> 01:06:57.840] Be careful.
[01:06:58.160 --> 01:06:59.600] Right, hopefully.
[01:07:00.560 --> 01:07:05.440] That is, if our infrastructure is working, we put up that.
[01:07:07.040 --> 01:07:15.200] Well, the warning signs are blanking right now about our epistemic infrastructure.
[01:07:15.520 --> 01:07:32.080] The warning signs about whether going forward we're going to have rational progress as opposed to just blind change in our society.
[01:07:32.400 --> 01:07:35.040] Those signs are blanky red.
[01:07:35.040 --> 01:07:35.520] Yeah.
[01:07:36.080 --> 01:07:41.520] I think, if I can riff for just a moment, that those assaults come from both the right and the left.
[01:07:41.520 --> 01:07:54.800] I'm worried about the academy and that kind of that's where the whole post-truth thing comes from in the first place, post-modernism, and that all cultures are equal and all people's truths are equal and so on.
[01:07:55.200 --> 01:08:05.920] And then the whole crisis during the pandemic, you know, about masks and social distancing and school closures, all the, you know, there was a lot of misinformation.
[01:08:05.920 --> 01:08:09.520] I would have preferred our policymakers be more Bayesian.
[01:08:09.520 --> 01:08:11.920] Like, we think you should wear masks or social distance.
[01:08:11.920 --> 01:08:13.200] We're not sure, really.
[01:08:13.440 --> 01:08:14.800] But this is what we think this week.
[01:08:14.800 --> 01:08:16.000] This could all change next week.
[01:08:16.000 --> 01:08:18.960] Instead, they made these, you know, dramatic pronouncements.
[01:08:18.960 --> 01:08:20.560] Now we know this is true.
[01:08:20.560 --> 01:08:21.680] But they didn't know that.
[01:08:21.680 --> 01:08:23.280] And we now know they didn't know that.
[01:08:23.280 --> 01:08:31.320] So the public trust in the CDC and people like Anthony Fauci and Francis Collins, who were always trustworthy, I always had great respect for them.
[01:08:31.640 --> 01:08:33.240] You know, that's plummeted.
[01:08:33.240 --> 01:08:38.040] You know, so a lot of this backlash from the right is because of what happened on the left.
[01:08:38.120 --> 01:08:42.680] So it goes both ways, is all I'm trying to say and be politically neutral here, I guess.
[01:08:43.240 --> 01:08:47.080] Because I'm worried too, in both directions about that.
[01:08:47.080 --> 01:08:49.080] I guess it depends on the time horizon.
[01:08:49.080 --> 01:08:52.920] I tend to be an optimist, and I think this too will pass.
[01:08:52.920 --> 01:09:00.520] You know, I don't know, the trans movement, the MAGA movement, the this, that, these culture wars that I'm, you know, I waste way too much time on.
[01:09:00.520 --> 01:09:02.120] I'm one of your twit bookieans.
[01:09:02.120 --> 01:09:04.360] I don't do Facebook, but I do Twitter.
[01:09:04.680 --> 01:09:09.960] And last weekend, I was off social media for three days.
[01:09:09.960 --> 01:09:12.920] And it was just, I realized I don't really miss any of that.
[01:09:12.920 --> 01:09:15.000] It's just, you know, maybe we don't need all that.
[01:09:15.000 --> 01:09:16.040] But that could all change.
[01:09:16.040 --> 01:09:18.600] You know, I mean, the smartphone is only 2007.
[01:09:18.600 --> 01:09:20.840] Facebook, 2007, 2008.
[01:09:20.840 --> 01:09:25.320] Instagram is what, 2012, 13, something like that this is not that long ago, right?
[01:09:25.320 --> 01:09:28.600] This, maybe in 10, 20 years, none of this technology.
[01:09:28.600 --> 01:09:31.800] It'll be something completely different that you and I can't think of.
[01:09:31.800 --> 01:09:37.800] And then I'm hoping we're worried about much ado about nothing, but you know, we should pay attention.
[01:09:41.000 --> 01:09:41.880] Yes, sir.
[01:09:42.200 --> 01:09:43.800] I hope you're right.
[01:09:44.440 --> 01:09:45.400] Yes.
[01:09:45.720 --> 01:09:46.440] All right, Michael.
[01:09:46.440 --> 01:09:51.640] I know you got a hard out coming out in five minutes, and I want to save your voice in any case for your next interviews for your book.
[01:09:51.720 --> 01:09:54.440] So I'll just thank you and again, plug the book.
[01:09:54.600 --> 01:09:58.840] Really a great read on truth and politics, why democracy demands it.
[01:09:58.840 --> 01:09:59.560] Indeed, it does.
[01:09:59.560 --> 01:10:01.080] I listened to it on audio.
[01:10:01.080 --> 01:10:02.280] Nice read.
[01:10:02.280 --> 01:10:04.280] And read the book as well.
[01:10:04.280 --> 01:10:07.960] So, congratulations, and thanks for your work.
[01:10:08.280 --> 01:10:10.280] Thanks for having me, Michael.