Debug Information
Processing Details
- VTT File: skepticast2024-11-16.vtt
- Processing Time: September 09, 2025 at 08:39 AM
- Total Chunks: 3
- Transcript Length: 173,565 characters
- Caption Count: 1,722 captions
Prompts Used
Prompt 1: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 1 of 3 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
[00:00:01.040 --> 00:00:06.640] 16 years from today, Greg Gerstner will finally land the perfect cannonball.
[00:00:06.960 --> 00:00:21.920] Epic Splash, Unsuspecting Friends, a work of art only possible because Greg is already meeting all these same people at AARP volunteer and community events that keep him active and involved and help make sure his happiness lives as long as he does.
[00:00:21.920 --> 00:00:25.840] That's why the younger you are, the more you need AARP.
[00:00:25.840 --> 00:00:29.680] Learn more at AARP.org/slash local.
[00:00:33.200 --> 00:00:36.480] You're listening to the Skeptic's Guide to the Universe.
[00:00:36.480 --> 00:00:39.360] Your escape to reality.
[00:00:40.000 --> 00:00:42.800] Hello and welcome to the Skeptic's Guide to the Universe.
[00:00:42.800 --> 00:00:47.680] Today is Wednesday, November 13th, 2024, and this is your host, Stephen Novella.
[00:00:47.680 --> 00:00:49.600] Joining me this week are Bob Novella.
[00:00:49.600 --> 00:00:50.240] Hey, everybody.
[00:00:50.240 --> 00:00:51.520] Kara Santa Maria.
[00:00:51.520 --> 00:00:51.920] Howdy.
[00:00:52.000 --> 00:00:52.960] Jay Novella.
[00:00:52.960 --> 00:00:53.760] Hey, guys.
[00:00:53.760 --> 00:00:55.760] And Evan Bernstein.
[00:00:55.760 --> 00:00:57.040] Good evening, everyone.
[00:00:57.040 --> 00:00:58.640] The year is going fast.
[00:00:58.640 --> 00:01:01.120] It's almost the end of 2024.
[00:01:01.600 --> 00:01:05.120] My gosh, we will be recording our year-end episode before you know it.
[00:01:05.120 --> 00:01:05.760] That's right.
[00:01:06.160 --> 00:01:15.680] This is always a good time of the year to remind people that we have published two books: Skeptic's Guide to the Universe and The Skeptic's Guide to the Future.
[00:01:16.000 --> 00:01:20.480] So if you have joined us recently and they're not aware of this fact, check us out.
[00:01:20.960 --> 00:01:24.640] You could get to those books through our website or through Amazon.
[00:01:24.640 --> 00:01:26.880] They make fantastic holiday gifts.
[00:01:26.880 --> 00:01:31.120] You can spread the gift of skepticism to everyone in your life.
[00:01:31.120 --> 00:01:33.440] So check them out if you haven't read them already.
[00:01:33.440 --> 00:01:37.280] And some people have asked us, are they in digital form?
[00:01:37.280 --> 00:01:37.680] Yes.
[00:01:37.680 --> 00:01:39.040] Are they in audio form?
[00:01:39.040 --> 00:01:39.440] Yes.
[00:01:39.440 --> 00:01:41.120] You can get they are read.
[00:01:41.120 --> 00:01:42.800] Both are read by me.
[00:01:43.120 --> 00:01:47.280] But you can also, if you'd like to read on Kindle, you can get them in digital form as well.
[00:01:47.280 --> 00:01:48.240] So, check that out.
[00:01:48.240 --> 00:01:52.080] We're working on a version of Darth Vader reading it, so I'll let you know when that comes out.
[00:01:52.640 --> 00:01:58.960] Oh, have you guys seen any of the latest UAP congressional testimony today?
[00:01:58.960 --> 00:01:59.360] I have not.
[00:01:59.600 --> 00:02:00.120] I heard about it.
[00:01:59.760 --> 00:02:02.200] I didn't get a chance to watch it.
[00:02:02.200 --> 00:02:03.320] Yeah, Jay and I watched it.
[00:01:59.920 --> 00:02:05.640] I think we talked about it on the live stream today.
[00:02:05.960 --> 00:02:08.600] It's as worthless as it's been.
[00:02:08.840 --> 00:02:16.360] Essentially, the UAP is the new name for UFOs, unidentified anomalous phenomena.
[00:02:16.360 --> 00:02:16.920] Okay.
[00:02:16.920 --> 00:02:21.800] I guess it's technically more accurate, more inclusive.
[00:02:21.800 --> 00:02:22.920] But it's still the same thing.
[00:02:22.920 --> 00:02:27.800] It's the same people talking about the same crappy evidence for the last 50 years.
[00:02:27.800 --> 00:02:29.320] They had nothing new.
[00:02:29.320 --> 00:02:31.400] There's just simply nothing going on.
[00:02:31.880 --> 00:02:39.080] One of the bits that we saw today, the UFO proponent was saying, there was this, you know, it's the residue argument, what I call the residue arguments.
[00:02:39.160 --> 00:02:47.400] Like, yes, you could explain 98% of whatever of these sightings of UAP reports, but there's that subset that you can't explain.
[00:02:47.400 --> 00:02:51.880] As if, like, that means there must be something interesting going on.
[00:02:51.880 --> 00:02:54.520] But that's going to be true of anything.
[00:02:54.520 --> 00:03:03.240] Anytime you have hundreds or thousands of whatever, you're not going to have enough information to necessarily definitively explain everything.
[00:03:03.240 --> 00:03:05.960] There's always going to be some unusual cases.
[00:03:05.960 --> 00:03:06.760] And then for people.
[00:03:07.000 --> 00:03:08.440] Without good evidence about that.
[00:03:08.520 --> 00:03:09.320] Yeah, exactly.
[00:03:09.320 --> 00:03:17.640] So the people who have reviewed this said that the reason you can't explain that one or two percent is not because they're fantastical.
[00:03:17.640 --> 00:03:19.640] It's because the evidence is crappy.
[00:03:19.640 --> 00:03:33.560] It says we don't have enough evidence to know what they were, which fits something else that we say all the time, which is like with UFOs, just like with Bigfoot and with everything else, the ambiguity, right, the fuzziness is the phenomenon.
[00:03:33.560 --> 00:03:35.000] That is the phenomenon.
[00:03:35.320 --> 00:03:43.960] It is just the residue of low-quality evidence that or cases that are going to be there for any phenomenon.
[00:03:43.960 --> 00:03:46.400] Yeah, this is going to apply to like every field of every.
[00:03:46.400 --> 00:03:47.600] I think about medicine, right?
[00:03:44.840 --> 00:03:51.920] Like there's a drug, and you take the drug and you get better, but that percentage of people didn't get better.
[00:03:52.240 --> 00:03:54.720] And you look at the autopsy and you can't figure out why.
[00:03:54.720 --> 00:03:57.760] You're not going to assume they didn't get better because magic.
[00:03:57.760 --> 00:03:58.320] Right, exactly.
[00:03:58.720 --> 00:04:03.360] You're just going to assume you couldn't figure it out, but something biological was happening.
[00:04:04.000 --> 00:04:04.480] That's so good.
[00:04:04.640 --> 00:04:05.920] I believe we call this logical fallacy.
[00:04:06.160 --> 00:04:06.960] Good analogy.
[00:04:07.040 --> 00:04:08.720] God of the gaps fallacy.
[00:04:08.720 --> 00:04:09.680] Yes, exactly.
[00:04:09.680 --> 00:04:10.160] Yeah.
[00:04:10.160 --> 00:04:12.160] I can't explain it, therefore, magic.
[00:04:12.160 --> 00:04:14.800] Right, therefore, something fantastic.
[00:04:14.800 --> 00:04:15.280] Yeah.
[00:04:15.280 --> 00:04:16.320] Fill in the blank.
[00:04:16.320 --> 00:04:20.320] And that is where, but that is where all paranormal and all pseudoscience lives.
[00:04:20.320 --> 00:04:21.840] It lives in that gap.
[00:04:22.160 --> 00:04:23.360] Yeah, so nothing new.
[00:04:23.360 --> 00:04:24.560] Same old crap.
[00:04:24.560 --> 00:04:25.280] More of the same.
[00:04:25.280 --> 00:04:26.080] Thank you, Congress.
[00:04:26.080 --> 00:04:27.120] Waste our time, waste our money.
[00:04:27.680 --> 00:04:28.080] Once again.
[00:04:28.080 --> 00:04:35.120] All right, Evan, you're going to get us started with a quickie about advertising magic in Kyrgyzstan.
[00:04:35.840 --> 00:04:36.640] Kyrgyzstan.
[00:04:36.640 --> 00:04:37.440] Kyrgyzstan.
[00:04:37.760 --> 00:04:39.440] Good news, everyone.
[00:04:39.440 --> 00:04:41.520] We're all moving to Kyrgyzstan.
[00:04:42.640 --> 00:04:55.040] And I think I'm going to call these segments going forward good news, everyone, because, you know, occasionally there is some good news in the world of skepticism and rational thought to talk about, and this could be one of them.
[00:04:55.040 --> 00:05:12.080] 2024, yes, Kyrgyzstan enacted earlier this year legislation prohibiting the advertising of services related to the occult and mysticism, which includes things like fortune-telling, shamanism, divination, spiritualism, clairvoyance, and magic.
[00:05:12.080 --> 00:05:13.360] What were we just talking about?
[00:05:13.360 --> 00:05:14.000] Yep.
[00:05:14.000 --> 00:05:22.640] And this band extends across various media platforms, including the internet, outdoor advertising, radio, and television.
[00:05:22.640 --> 00:05:25.680] Whoa, does that mean they can't advertise church?
[00:05:26.000 --> 00:05:27.840] Well, that's interesting.
[00:05:28.160 --> 00:05:30.760] I was going to bring up that more towards the end of the day.
[00:05:31.080 --> 00:05:31.480] Okay, sorry.
[00:05:29.520 --> 00:05:32.760] But no, that's okay.
[00:05:29.840 --> 00:05:34.120] Since you brought it up, I will mention it.
[00:05:34.360 --> 00:05:47.240] So, this is a primarily Muslim country, and the banning of these kinds of practices is very much in accord with Muslim practices.
[00:05:47.240 --> 00:05:53.720] You're not supposed to be doing this thing because it is in violation of the tenets of the religion.
[00:05:53.720 --> 00:06:07.000] So, it's as much a protective measure, probably, for the people of the country, which it is, as it is a, you know, further sort of solidification of the religious culture that permeates.
[00:06:07.000 --> 00:06:09.160] So, it's kind of a combination of both.
[00:06:09.160 --> 00:06:09.880] But you know what?
[00:06:10.040 --> 00:06:14.280] I'll take it because I think other countries should also be doing this stuff.
[00:06:14.680 --> 00:06:16.600] Do we know where Kyrgyzstan is?
[00:06:16.760 --> 00:06:17.880] It's one of the stands.
[00:06:17.880 --> 00:06:18.440] It's one of the stands.
[00:06:18.520 --> 00:06:20.360] Yes, it is one of the stands.
[00:06:20.360 --> 00:06:22.840] And you've been to which one?
[00:06:22.840 --> 00:06:23.720] Kajakstan.
[00:06:23.720 --> 00:06:24.120] Kazakhstan.
[00:06:24.760 --> 00:06:25.480] Yeah, which I borderled.
[00:06:25.640 --> 00:06:26.600] Which borders China.
[00:06:26.920 --> 00:06:27.560] It does.
[00:06:27.560 --> 00:06:27.960] Yep.
[00:06:27.960 --> 00:06:28.520] Yep.
[00:06:28.520 --> 00:06:34.920] So, yeah, and this Kyrgyzstan does also border Kazakhstan, Ubekistan, Tajikistan, and China.
[00:06:34.920 --> 00:06:38.120] And it's 90% covered by mountains, this country.
[00:06:38.120 --> 00:06:40.680] So about 6 million people live there.
[00:06:40.680 --> 00:06:44.280] And they have a nomadic lifestyle traditionally in rural areas.
[00:06:44.280 --> 00:06:44.760] Yurts.
[00:06:44.760 --> 00:06:45.720] Have you ever heard of yurts?
[00:06:45.720 --> 00:06:46.920] Those are those tents.
[00:06:47.240 --> 00:06:53.720] I actually saw a, I don't know, one of these decoration shows, house decor shows, and they talked about yurts.
[00:06:53.720 --> 00:06:55.320] So it was the whole show was all about yurts.
[00:06:55.480 --> 00:06:56.360] I love a good yurt.
[00:06:56.360 --> 00:06:57.800] It was pretty fascinating.
[00:06:58.120 --> 00:06:58.600] I do.
[00:06:58.600 --> 00:06:59.960] I've stayed in a yurt before.
[00:06:59.960 --> 00:07:00.360] I like it.
[00:07:00.680 --> 00:07:01.320] Wow.
[00:07:01.320 --> 00:07:07.800] And, oh, they are the first country in the world to commit to protecting snow leopards, which is an endangered species found in their mountains.
[00:07:07.800 --> 00:07:09.880] So good on them for that as well.
[00:07:09.880 --> 00:07:15.000] Now, the ban extends to, yes, all the medias, broadcast media, social media.
[00:07:16.000 --> 00:07:28.240] The measure is designed to prevent unscrupulous citizens from taking advantage of vulnerable segments of society, says Marlon Mamotailayev, who's one of the sponsors of the amendments.
[00:07:28.240 --> 00:07:31.760] It aligns with similar actions in neighboring countries.
[00:07:32.080 --> 00:07:37.200] Tajikistan, as I mentioned before, they have criminalized sorcery and fortune-telling as well.
[00:07:37.200 --> 00:07:44.800] They characterize these practices as being perpetrated by fraudsters, and they implement strict measures against these activities.
[00:07:44.800 --> 00:07:54.160] In fact, the fines, if you're fined for this activity, it's usually about like a half a year's worth of your money that you would earn in a year.
[00:07:54.800 --> 00:07:59.120] So that's pretty steep, relatively speaking.
[00:07:59.120 --> 00:08:03.840] Here's one example of the kind of stuff that they've been doing with the fortune tellers, right?
[00:08:03.840 --> 00:08:12.880] So the fortune tellers will purport to offer anything from predicting the future to helping find suitable spouses and also to make your businesses flourish.
[00:08:12.880 --> 00:08:19.360] So there was this woman, her name is Mavzuna, a housewife from the northern city of Khunjand.
[00:08:19.600 --> 00:08:21.440] That's her first name, didn't offer her last name.
[00:08:21.440 --> 00:08:27.760] She paid $30 to help find appropriate husbands for her three daughters, who are aged 24 to 30.
[00:08:27.760 --> 00:08:33.840] The fortune teller conducted separate sessions with these women in order to trigger their luck.
[00:08:33.840 --> 00:08:35.840] Then here's what she said.
[00:08:35.840 --> 00:08:44.480] She gave me an egg to throw into the river, and now she's expecting eligible men to turn up to ask for her daughter's hands in marriage.
[00:08:44.480 --> 00:08:48.000] All kinds of things, apparently, this is this is seems to be a common practice.
[00:08:48.000 --> 00:09:01.240] And they'll take these eggs or whatever trinkets, mostly a lot of like talismans and metal jewelry and stuff, and they'll throw them into water bodies, into rivers, and into ponds and other things as part of this.
[00:09:01.560 --> 00:09:15.480] But it's getting the problem is getting so bad that authorities there are concerned that these metal pieces are lying at the bottom of these ponds, riverbeds, and other places where people go or animals go, and it's injuring them.
[00:09:15.480 --> 00:09:15.880] Right?
[00:09:15.880 --> 00:09:28.040] They said, authority, here's one example: authorities recovered half a cart of locks from the bed of one of the rivers that could have split the heads open of unsuspecting swimmers or people diving into the water.
[00:09:28.040 --> 00:09:34.120] And the locks were actually discovered accidentally by police who were looking for evidence of some other crime that was going on.
[00:09:34.120 --> 00:09:41.320] So, so many of these things that it's clogging, you know, that it's polluting their waterways and their water systems.
[00:09:41.320 --> 00:09:49.480] That's how many of these charms and these fortune tellers are selling these people for total, total, you know, garbage and things that just won't come true.
[00:09:49.480 --> 00:09:52.920] So, that is the good news from Kyrgyzstan.
[00:09:52.920 --> 00:10:02.040] And as I said, other countries, I think, should perhaps look to them as an example and try to help their citizens in any way they can from these fraudsters.
[00:10:02.600 --> 00:10:04.120] So, they're just banning the advertising.
[00:10:04.120 --> 00:10:05.560] They're not banning the practices of the case.
[00:10:05.800 --> 00:10:06.520] Correct.
[00:10:06.520 --> 00:10:07.560] Yes, yes.
[00:10:07.560 --> 00:10:09.080] Although they do, right?
[00:10:09.080 --> 00:10:19.080] Although, if you are caught, although if you are caught and determined to be fraudulent in whatever capacity that these courts decide, you can be in big time trouble.
[00:10:19.080 --> 00:10:22.280] So, it is considered kind of mostly an underground practice.
[00:10:23.000 --> 00:10:28.040] It's, you know, the authorities have their eyes on you if you are doing these things to begin with.
[00:10:28.040 --> 00:10:29.960] So, it's definitely frowned upon.
[00:10:29.960 --> 00:10:38.440] Oh, and by the way, fun fact: there's a Kurzig epic poem called The Epic of Mana of Manas, M-A-N-A-S, which is one of the longest poems in the world.
[00:10:38.440 --> 00:10:43.240] It has over 500,000 lines to this poem.
[00:10:43.800 --> 00:10:45.200] That is an epic poem.
[00:10:45.680 --> 00:10:48.880] It would take up three large-volume books, basically.
[00:10:44.760 --> 00:10:51.840] So imagine, it's kind of like, I envision it like memorize.
[00:10:52.000 --> 00:10:58.000] If you know this poem, it's like memorizing the Lord of the Rings trilogy, basically, as a poem, as one poem.
[00:10:58.000 --> 00:10:59.040] Oh, boy.
[00:10:59.040 --> 00:10:59.360] All right.
[00:10:59.360 --> 00:11:00.240] Thanks, Evan.
[00:11:00.240 --> 00:11:00.720] Yep.
[00:11:00.720 --> 00:11:05.840] Kara, does having an armed police officer in school make kids safer?
[00:11:05.840 --> 00:11:07.040] What do you guys think?
[00:11:07.040 --> 00:11:07.520] No.
[00:11:09.120 --> 00:11:11.920] I think we probably wouldn't be talking about it if the answer were in there.
[00:11:12.160 --> 00:11:13.200] Right, you're right.
[00:11:13.520 --> 00:11:14.880] Bob says no.
[00:11:14.880 --> 00:11:16.160] Evan says, um.
[00:11:17.280 --> 00:11:18.960] I have no information to base that.
[00:11:19.280 --> 00:11:21.680] I have no statistics to base that off.
[00:11:21.680 --> 00:11:23.520] I mean, to base the decision.
[00:11:23.520 --> 00:11:26.400] I would say that schools benefit.
[00:11:26.400 --> 00:11:28.000] I don't know to what degree.
[00:11:28.400 --> 00:11:29.040] Interesting.
[00:11:29.040 --> 00:11:29.440] Okay.
[00:11:29.760 --> 00:11:47.840] So, according to Education Week's school shooting tracker, as of the last update, which was November 11th, 2024, so far there have been 35 school shootings in the United States this year that have resulted in injuries or deaths.
[00:11:47.840 --> 00:11:50.960] 217 total since 2018.
[00:11:50.960 --> 00:11:53.280] There were 38 last year.
[00:11:53.280 --> 00:12:01.360] So this year, again, up to two days ago as of this recording, 35 school shootings with injuries or deaths, 65 people killed or injured.
[00:12:01.520 --> 00:12:06.960] So that's 16 people killed, seven students, nine employees, and 49 people injured.
[00:12:06.960 --> 00:12:09.600] And they keep a map of all the school shootings.
[00:12:09.920 --> 00:12:11.600] Very sad work.
[00:12:11.600 --> 00:12:16.880] So this is, as we know here in the States, a big problem, a scary problem.
[00:12:16.880 --> 00:12:19.040] And a lot of parents are afraid.
[00:12:19.040 --> 00:12:23.520] And a lot of people are clamoring to solve this problem.
[00:12:23.520 --> 00:12:38.200] And there have been policy decisions, some of which are misguided, some of which are based in the evidence, but there up to date really hasn't been a lot of evidence to answer that question.
[00:12:38.200 --> 00:12:43.800] You know, are schools with armed police officers actually safer?
[00:12:44.120 --> 00:12:48.680] Does deploying officers in public schools deter criminal activity?
[00:12:48.680 --> 00:12:58.360] And so I'm going to talk about a few studies today, which were compiled by Rod McMullum in Undark magazine.
[00:12:58.360 --> 00:13:01.480] He wrote kind of a long feature article about this.
[00:13:01.480 --> 00:13:14.200] First, a little bit of statisticales: more than 41,000 schools employ at least one law enforcement officer, or sometimes they're called school resource officers.
[00:13:14.520 --> 00:13:17.320] And there have been big changes.
[00:13:17.320 --> 00:13:29.240] During COVID, things got kind of interesting because, of course, we saw the Black Lives Matter movement kind of take a big upswing after the murder of George Floyd.
[00:13:29.240 --> 00:13:35.320] And we also saw a lot of schools, you know, closing or modifying their practices.
[00:13:35.320 --> 00:13:45.640] And so during this time, many school districts decided to decrease funding or end school resource officer programs altogether.
[00:13:45.640 --> 00:13:54.680] And others have gone in the opposite direction, increasing the presence of armed officers within their schools.
[00:13:54.680 --> 00:14:02.520] But this article talks specifically about Chicago, which was one of the largest public school systems in the U.S.
[00:14:02.840 --> 00:14:11.960] There are 325,000 students in Chicago schools that return to classes this year without armed officers present.
[00:14:11.960 --> 00:14:17.680] And this was the first time in Chicago since the year 1966.
[00:14:18.000 --> 00:14:25.600] And so they have already started to collect some data on the outcomes of not having armed resource officers present.
[00:14:25.840 --> 00:14:38.640] And in this model, they actually took the funding that they would have put towards paying these armed police officers and they put it towards having social workers, trauma-informed mental health interventions, and social and emotional learning.
[00:14:38.640 --> 00:14:46.960] So we're going to get to some of these Chicago studies because it's really early, the information is really early.
[00:14:46.960 --> 00:15:04.720] And we're going to turn to a couple of large studies, one of which is a systematic review that was published in November of 2023 called School-Based Law Enforcement Strategies to Reduce Crime, Increase Perceptions of Safety, and Improve Learning Outcomes in Primary and Secondary Schools: A Systematic Review.
[00:15:04.720 --> 00:15:13.840] This study looked at 32 different reports yielding 1,002 effect sizes to ask a number of different questions.
[00:15:13.840 --> 00:15:31.680] But one of the big outcomes that they found was that there was no evidence that there is a safety, and I'm quoting them directly, no evidence that there is a safety-promoting component of what they call SBLE, school-based law enforcement.
[00:15:31.680 --> 00:15:44.320] And the authors, based on this large study, come out in support of the criticism that school-based law enforcement actually criminalizes students and schools.
[00:15:44.320 --> 00:16:02.520] So, not only did they find that gun violence was not reduced, and actually, other forms of violence, specifically in this study, were not reduced, there was an increase in what is often referred to as the school-to-prison pipeline.
[00:16:02.840 --> 00:16:20.840] So, the types of disciplinary actions that were taken against students, which were especially high among black and Hispanic boys, led to either suspension or, in some cases, incarceration.
[00:16:21.080 --> 00:16:35.480] And there is a ton of evidence, and we won't get into like that whole body of literature, but there's a ton of evidence that sort of like punitive action at a young age leads to higher rates of incarceration later.
[00:16:35.480 --> 00:16:45.560] And this is, there have been randomized controlled trials where students are assigned to middle schools where there's stricter punishment versus middle schools without that strict punishment.
[00:16:45.560 --> 00:16:57.800] And they find that the kids who are assigned to those stricter middle schools on average are more likely to drop out of high school, less likely to attend a four-year college, and are more likely to become incarcerated as adults.
[00:16:57.800 --> 00:17:05.160] So, we know that these punitive practices have a detrimental effect on children's development and well-being.
[00:17:05.160 --> 00:17:16.200] And studies are showing that having sworn law enforcement officers in the schools not only do not reduce gun violence, and in some cases, they don't even reduce violent crime.
[00:17:16.200 --> 00:17:36.840] Another study showed that there was a small but significant decrease in fighting, probably because of just the physicality of the police presence, but that they do actually increase the number of infractions that are being reported against these children, which can have detrimental effects later in life.
[00:17:36.840 --> 00:17:38.920] There's a couple of other things I want to point out.
[00:17:38.920 --> 00:17:48.000] So, one study showed that over 90% of schools that have a law enforcement officer present have an officer who's routinely armed.
[00:17:48.000 --> 00:17:58.480] Although some of those officers have special training in violence prevention and youth mental health, requirements vary by state, and in many cases, there is no required special training in these areas.
[00:17:58.480 --> 00:18:01.520] This article does a really interesting job of going back to the history.
[00:18:01.520 --> 00:18:14.080] I didn't realize that resource officers started in 1953 in Flint, Michigan, and then efforts increased in the 60s in large cities like LA, Chicago, and Cincinnati.
[00:18:14.080 --> 00:18:19.920] And then there's a dramatic increase after Columbine in April of 1999.
[00:18:19.920 --> 00:18:28.240] And we saw that after this huge increase by 2022, about 48% of public schools have at least one school resource officer.
[00:18:28.240 --> 00:18:32.400] And again, like I said, more than 90% of those schools have an armed officer.
[00:18:32.400 --> 00:18:39.280] So that led to the 2019 to 2020 year, about 23,400 school resource officers being employed.
[00:18:39.280 --> 00:18:41.760] Now, let's cut back to Chicago.
[00:18:41.760 --> 00:18:46.720] In Chicago, they actually reduced that presence.
[00:18:46.720 --> 00:18:53.520] They, you know, this year was the first year that students went back to school without school resource officers present.
[00:18:53.520 --> 00:19:01.840] Removing, okay, the U of Chicago Consortium on School Research put out a brief in June of this year removing police officers from Chicago schools.
[00:19:01.840 --> 00:19:06.800] And they call it SRO in this case, school resource officers.
[00:19:06.800 --> 00:19:13.440] School resource officer removal was significantly related to having fewer high-level discipline infractions.
[00:19:13.440 --> 00:19:22.880] And it was not related to changes over time in perception of physical safety or student-teacher trust, either by students or teachers.
[00:19:22.880 --> 00:19:30.000] So, when school resource officers were removed, students did not say that they felt less safe, and neither did teachers.
[00:19:30.520 --> 00:19:41.320] They also found that schools that had resource officers, the ones that retained them, were more likely to serve predominantly black students.
[00:19:41.320 --> 00:19:45.960] Black students became more than twice as likely as other students to have a resource officer.
[00:19:45.960 --> 00:19:50.760] They tended to be smaller schools, but they tended to have higher suspension rates.
[00:19:50.760 --> 00:20:00.120] And they tended to be schools where students who were eligible for free or reduced lunch, non-English learners or kids who were in special education.
[00:20:00.440 --> 00:20:07.000] So, these are schools where obviously populations are more vulnerable, are more likely to have these resource officers there.
[00:20:07.000 --> 00:20:25.480] And those resource officers were not over time reducing or effective in reducing gun violence or gun crime, but they were effective in criminalization of students who were there to go to school because infractions were identified and magnified.
[00:20:25.480 --> 00:20:32.600] So, across the board, what we're starting to see, and there's not much literature on this subject, but it's gaining.
[00:20:32.600 --> 00:20:44.840] Across the board, we are starting to see that while some of the studies have cited kind of small benefits, most of them are showing very large detriments.
[00:20:44.840 --> 00:20:52.760] And there is no like detectable, there's no significant difference in outcomes when it comes to school safety.
[00:20:52.760 --> 00:21:02.760] So, the children are no safer, the children and the adults are no safer, and there is no detectable evidence saying that having police in schools is reducing gun violence at all.
[00:21:02.760 --> 00:21:03.960] Yeah, that's interesting.
[00:21:04.920 --> 00:21:06.280] There's a lot of good stuff here.
[00:21:06.280 --> 00:21:08.040] It's not surprising to me.
[00:21:08.040 --> 00:21:29.920] I think this is an area where I have like dug a little bit deep into the research in the past, but I think for some people it is surprising and it is pretty counterintuitive that putting police officers in school leads to more arrests and more expulsions of students, very often disproportionately black and brown students, and very often student disproportionately students with disabilities.
[00:21:29.920 --> 00:21:39.040] But what it doesn't do is it doesn't make them safer from physical harm and violence, which was very often the whole argument, right?
[00:21:39.040 --> 00:21:42.080] It's the whole reasoning behind having these police officers present.
[00:21:42.080 --> 00:21:50.320] Yeah, I mean, it's easy to imagine that for most schools, there's probably not going to be events that require an armed officer present.
[00:21:50.320 --> 00:21:55.280] And even if they are present, as we know, like from that, what was it in that Texas case?
[00:21:55.280 --> 00:21:57.040] It doesn't always help, right?
[00:21:57.360 --> 00:22:08.720] No, not only does it not always help, sometimes there is some evidence to show that when teachers are armed, it can actually lead to more detriment, not less.
[00:22:08.720 --> 00:22:12.400] But we are seeing that there are laws that are being passed.
[00:22:12.400 --> 00:22:17.600] Like there's a law in Texas right now that requires schools to have armed guards.
[00:22:17.600 --> 00:22:21.360] And the Florida law, I think, is even more intense.
[00:22:21.360 --> 00:22:24.240] It requires that somebody in the school be armed.
[00:22:24.240 --> 00:22:27.120] It doesn't even have to be a guard, it can be a teacher.
[00:22:27.120 --> 00:22:31.200] Well, I mean, I think it's also a good area to continue to do research.
[00:22:31.200 --> 00:22:35.600] I doubt this one study is going to change, or there are a few studies that are going to change policy.
[00:22:37.200 --> 00:22:39.200] And they are systematic reviews, by the way.
[00:22:39.200 --> 00:22:43.040] So these are reviews of multiple studies.
[00:22:43.040 --> 00:22:43.440] Right.
[00:22:43.440 --> 00:22:44.480] Okay, thanks, Kara.
[00:22:44.480 --> 00:22:44.640] Yep.
[00:22:44.800 --> 00:22:47.920] Jay, tell us about training medical robots.
[00:22:47.920 --> 00:22:49.520] Yeah, I really like this one, guys.
[00:22:49.520 --> 00:22:58.160] Johns Hopkins University researchers collaborated with Stanford University researchers, and they achieved something that's pretty significant.
[00:22:58.160 --> 00:23:01.240] They're calling it a breakthrough in medical robots.
[00:23:01.240 --> 00:23:10.520] So, you know, for the first time, they were able to train a robot to perform surgical procedures by using video recordings of human surgeons.
[00:23:10.520 --> 00:23:14.760] And this is definitely different than any other way this has been done in the past.
[00:23:14.760 --> 00:23:23.480] This method is called imitation learning, and it essentially eliminates the need for manual programming of the robotic movements, right?
[00:23:23.480 --> 00:23:29.560] This is like when a surgeon will actually, you know, a surgeon's movements will be recorded as they go, right?
[00:23:30.120 --> 00:23:31.720] Which is very difficult.
[00:23:31.720 --> 00:23:36.760] This moves robotic surgery a lot closer to full autonomy, like in the movies and stuff.
[00:23:36.760 --> 00:23:45.640] Like, you know, we've seen this in different science fiction movies where you're seeing like, you know, some type of extensive type of surgery being done, you know, in outer space or whatever.
[00:23:45.640 --> 00:23:52.440] What they now know can be done is robots can perform complex procedures without any human involvement.
[00:23:52.440 --> 00:23:56.680] I know that sounds kind of outlandish, but they were able to do it.
[00:23:56.680 --> 00:24:01.640] The researchers focused on an existing system called the Da Vinci Surgical System.
[00:24:01.640 --> 00:24:04.120] This is a widely used robotic platform.
[00:24:04.120 --> 00:24:07.240] It's, you know, it's considered to be cutting-edge modern medicine.
[00:24:07.240 --> 00:24:21.640] And to overcome that system's current limitations in precision, which is obviously a huge problem, they turned to video recordings captured by wrist cameras attached to the Da Vinci robot's arm during surgeries, right?
[00:24:21.640 --> 00:24:31.800] So, what this means is these videos were originally created for post-operative analysis, where they document techniques and movements of experienced surgeons, right?
[00:24:31.800 --> 00:24:43.320] With nearly 7,000 Da Vinci robots today in use globally and more than 50,000 trained surgeons, the researchers had access to a massive archive of surgical footage.
[00:24:43.320 --> 00:24:46.960] This was really the base of this whole achievement.
[00:24:44.840 --> 00:24:49.680] That footage was absolutely significant.
[00:24:50.000 --> 00:24:56.240] It proved to be an amazing amount of data for them to incorporate into computer learning.
[00:24:56.240 --> 00:24:58.480] So they used hundreds of these recordings.
[00:24:58.480 --> 00:25:00.720] I guess they picked the absolute best ones.
[00:25:00.720 --> 00:25:11.760] And the research team trained their robotic model to perform the basic surgical tasks, such as things like needle manipulation, something called tissue lifting, and suturing.
[00:25:11.760 --> 00:25:19.440] So unlike traditional programming methods which we have today, these require manually coding every step of a procedure.
[00:25:19.440 --> 00:25:20.800] You know, think about that.
[00:25:21.120 --> 00:25:31.200] In order for a robot to perform surgery the old way, they literally have to hand code in all those procedures, and it takes an incredibly long time to do it.
[00:25:31.200 --> 00:25:35.840] The new approach relies on machine learning, which is incredible at this.
[00:25:36.240 --> 00:25:39.920] Machine learning is so powerful when it comes to things like this.
[00:25:39.920 --> 00:25:47.600] So the model combines imitation learning with advanced AI architecture that's similar to those powering Chat GPT, of course.
[00:25:47.600 --> 00:25:54.400] However, instead of processing language, the model speaks in something called kinematics.
[00:25:54.400 --> 00:25:55.600] Have you guys ever heard of that?
[00:25:55.600 --> 00:25:56.640] Yeah, it's just like movement.
[00:25:57.200 --> 00:26:00.000] It's the mathematical representation of robotic movement.
[00:26:00.000 --> 00:26:01.280] So think about that.
[00:26:01.280 --> 00:26:09.040] A chat GPT-like AI system was programmed to speak in mathematical robotic movements.
[00:26:09.040 --> 00:26:10.480] That is so brilliant.
[00:26:10.480 --> 00:26:11.760] I just love that.
[00:26:11.760 --> 00:26:18.480] So historically, training the DaVinci robot to perform any single task was, like I said, incredibly labor-intensive.
[00:26:18.480 --> 00:26:20.480] And I want to give you more details about that.
[00:26:20.480 --> 00:26:24.720] It required programs to code every single movement meticulously.
[00:26:24.720 --> 00:26:36.680] And this took, you know, if you want to pick one specific movement that would be involved in one procedure, modeling the steps involved in suturing for particular types of surgery could take up to guess how many years, guys?
[00:26:36.680 --> 00:26:37.880] I would guess a decade.
[00:26:37.880 --> 00:26:39.160] Right, a decade.
[00:26:39.160 --> 00:26:39.640] Wow.
[00:26:39.640 --> 00:26:40.600] Yeah, I read the article.
[00:26:40.600 --> 00:26:40.760] Yes.
[00:26:41.080 --> 00:26:46.040] Yeah, that's to program every discrete step and all the exceptions and stuff.
[00:26:46.040 --> 00:26:54.440] Yeah, a decade still seems like a little long to me, but yeah, it's a hugely onerous process, especially compared to this little breakthrough they got here.
[00:26:54.440 --> 00:26:55.720] This seems pretty sweet.
[00:26:55.720 --> 00:27:03.800] Yeah, so that decade-long process was incredibly restrictive in the flexibility and scalability of robotic surgery.
[00:27:03.800 --> 00:27:08.200] It was one of the reasons why it didn't proliferate as much as we'd liked it to.
[00:27:08.200 --> 00:27:16.520] So by contrast, the new imitation learning methods allow robots to learn these complex tasks in just a few days, guys.
[00:27:16.520 --> 00:27:17.880] A few days.
[00:27:17.880 --> 00:27:19.880] 10 years to days.
[00:27:19.880 --> 00:27:21.320] Just incredibly unreal.
[00:27:21.320 --> 00:27:29.320] But don't forget, though, I think it said something that 100 videos is a good sample for it to get a good handle on what's required.
[00:27:29.320 --> 00:27:33.240] So 100 videos, that's 100 surgeries, essentially.
[00:27:33.640 --> 00:27:34.040] That's a lot.
[00:27:34.040 --> 00:27:34.520] That's a lot of surprise.
[00:27:34.600 --> 00:27:36.440] But those surgeries are being done anyway.
[00:27:36.760 --> 00:27:37.000] Right.
[00:27:37.640 --> 00:27:40.680] The fact is, we have those surgeries on video.
[00:27:40.680 --> 00:27:41.320] We have them.
[00:27:42.280 --> 00:27:43.080] It's amazing.
[00:27:43.320 --> 00:27:53.560] And as new surgery methods come out, of course, I'm sure that they would videotape them in the correct way to continue to let this happen, right?
[00:27:53.560 --> 00:27:59.400] So one of the key advancements reported lies in the robot's ability to adapt its movements, right?
[00:27:59.400 --> 00:28:05.080] Which is really important because every human body isn't the same and every single situation isn't going to be exactly the same.
[00:28:05.080 --> 00:28:13.640] So, traditional robotic systems had to rely on this rigid, pre-programmed actions, and that's very error-prone in the real world.
[00:28:13.640 --> 00:28:14.800] Oh, it sounds totally brittle.
[00:28:14.800 --> 00:28:15.120] Yeah.
[00:28:14.680 --> 00:28:16.640] Yeah, it's really scary, actually.
[00:28:17.280 --> 00:28:24.560] Training the models to use relative movements instead of fixed motions, the researchers enhanced the robot's precision and flexibility, right?
[00:28:24.560 --> 00:28:36.640] This allows the robot to perform tasks with skill comparable to experienced human surgeons and even recover from mistakes like retrieving a dropped needle and then continuing on with the procedure.
[00:28:36.640 --> 00:28:37.360] So, the break.
[00:28:37.520 --> 00:28:46.080] That one, Jay, that one caught my attention because, yeah, that's cool that it's kind of like some emergent behavior where it knew just to pick it up and continue.
[00:28:46.080 --> 00:28:49.680] But then I thought, well, wait a second, should that needle be sterilized now that it's fallen on the floor?
[00:28:50.160 --> 00:28:53.760] It fell in the person being operated on.
[00:28:53.760 --> 00:28:58.320] And I'm sure it would be an easy thing to fix if they said, you know, don't pick up anything that had hit the floor.
[00:28:58.960 --> 00:29:00.080] Unless it's a heart or something.
[00:29:00.400 --> 00:29:02.000] And you got to pick it up.
[00:29:02.000 --> 00:29:04.560] Yeah, well, those are the kinds of errors that I'd be afraid of.
[00:29:04.560 --> 00:29:14.400] Something that's, you know, that's unusual and rare, something you wouldn't necessarily find even within 100 surgeries, but still kind of like, oh boy, don't do that.
[00:29:14.400 --> 00:29:16.240] A human, any human would know not to do it.
[00:29:16.320 --> 00:29:18.320] Did organs really fall on the floor before?
[00:29:18.560 --> 00:29:20.480] But would you actually be afraid of that, Bob?
[00:29:20.480 --> 00:29:26.640] Because here's the thing: I think it's a little silly to have this standard be that there is no human involved at all.
[00:29:26.640 --> 00:29:28.880] There will always be a human involved.
[00:29:29.280 --> 00:29:33.360] But maybe they're just observing as opposed to actually operating.
[00:29:33.600 --> 00:29:40.800] Or even someone partially knowledgeable enough to know if something's really going off the rails would be nice too, but not necessarily a surgeon.
[00:29:40.800 --> 00:29:41.360] Because that's the thing.
[00:29:41.360 --> 00:29:41.600] Oh, yeah.
[00:29:41.840 --> 00:29:43.440] You need to have a surgeon involved.
[00:29:43.440 --> 00:29:44.080] But you're still right.
[00:29:44.320 --> 00:29:50.320] If there is a bleed, if somebody codes, like, and anesthesia is never going to be done this way.
[00:29:50.320 --> 00:29:52.240] There's going to be an anesthesiologist there.
[00:29:52.240 --> 00:29:53.280] Well, I wouldn't say never.
[00:29:53.280 --> 00:29:57.520] I mean, look, of course, we'd be able to achieve all these things with enough time.
[00:29:58.160 --> 00:29:58.960] Holy crap.
[00:29:58.960 --> 00:30:01.160] But I don't want that to happen.
[00:29:59.680 --> 00:30:01.800] I want there to be a lot of people.
[00:30:02.280 --> 00:30:04.760] You've got to think about it in a completely different way, right?
[00:30:04.760 --> 00:30:10.520] Because, yeah, ideally, you feel better with a human doing it for all these implied reasons.
[00:30:10.520 --> 00:30:23.480] But the point is, this will make surgeries probably cost less money and makes them much more readily available to people everywhere, including people in outer space or on ships that are crossing the ocean or whatever.
[00:30:23.480 --> 00:30:28.040] I mean, I think the benefits here dramatically outweigh any of the negatives.
[00:30:28.040 --> 00:30:32.040] That was the one good thing about Prometheus, the movie, was that medical pod thing.
[00:30:32.040 --> 00:30:33.480] Oh, yeah, but dude, dude, dude.
[00:30:34.280 --> 00:30:39.240] But still, they kind of blew it because this is like centuries in the future.
[00:30:39.240 --> 00:30:42.120] There's an auto-doc, which is so cool.
[00:30:42.120 --> 00:30:46.760] And then she has some abdominal surgery, and then it staples her gut glue.
[00:30:46.840 --> 00:30:47.240] It's terrible.
[00:30:47.480 --> 00:30:49.320] Like, really, you're using staples in two seconds.
[00:30:49.640 --> 00:30:52.280] How about a foam that heals the flesh?
[00:30:52.760 --> 00:30:54.280] Some bio-glue or something.
[00:30:54.280 --> 00:30:55.320] It was just like, oh, come on.
[00:30:55.720 --> 00:31:00.360] But more importantly than that, Bob, Bob, which Star Wars robot would you want surgery?
[00:31:00.360 --> 00:31:02.680] Would you want to be your robotic surgeon?
[00:31:03.000 --> 00:31:05.000] I'd rather pick a Neil Asher Poly universe.
[00:31:05.080 --> 00:31:07.000] I didn't give you that option.
[00:31:07.480 --> 00:31:09.720] It's got to be Star Wars, and you've got to answer in five seconds.
[00:31:09.800 --> 00:31:14.840] Well, whoever handled Luke's amputated hand, I'm sure, would probably be pretty damn good.
[00:31:14.920 --> 00:31:15.640] Oh, I thought you wanted.
[00:31:16.680 --> 00:31:18.600] You wanted IG, you know?
[00:31:18.600 --> 00:31:19.320] Oh, yeah.
[00:31:20.120 --> 00:31:22.200] I don't know if he's signed off on surgery.
[00:31:22.680 --> 00:31:25.880] If I go into combat, he'd probably tear you apart.
[00:31:26.280 --> 00:31:28.600] Yeah, if I go into combat, I want egg.
[00:31:28.600 --> 00:31:30.680] But for surgery, I'm not sure.
[00:31:30.680 --> 00:31:32.440] So, one last point here, guys.
[00:31:32.440 --> 00:31:41.240] So, what they were able to do was have the robot perform discrete pieces of surgery, and now they're, ha ha, they're stitching it all together.
[00:31:41.240 --> 00:31:45.920] So, they're working on making it be able to do the whole process from beginning to end.
[00:31:45.920 --> 00:31:53.680] And again, you know, when you're doing any kind of computer programming, you just break it down into smaller and smaller bits to make it easier.
[00:31:53.680 --> 00:31:56.160] And apparently, that's what they were doing here.
[00:31:56.160 --> 00:31:58.000] But I'm excited about it.
[00:31:58.000 --> 00:32:05.600] I mean, again, if people that can't afford expensive surgeries can have that be affordable to them, I think this would be fantastic.
[00:32:05.600 --> 00:32:15.600] Oh, yeah, I think this clearly has got a pretty bright future, especially after these robots seem to be doing better than what would have been anticipated.
[00:32:15.600 --> 00:32:21.440] Just after like 100 videos, it's got a solid handle on these complex surgical procedures.
[00:32:21.680 --> 00:32:22.720] That's fantastic.
[00:32:22.800 --> 00:32:27.280] Imagine when AI improves even more and you've got even more videos.
[00:32:27.280 --> 00:32:44.720] Imagine if they got a thousand or five thousand videos from like really high-end surgeons and then maybe throw in some like biological context of a body and what a body is and how it should behave like during surgeries and things like that so that there's some more context.
[00:32:44.720 --> 00:32:46.720] I think it would be even better.
[00:32:46.720 --> 00:32:47.840] All right, thank you, Jay.
[00:32:47.840 --> 00:32:50.240] Bob, tell us about these new imaging techniques.
[00:32:50.240 --> 00:32:52.960] This is another sort of medical breakthrough kind of topic.
[00:32:52.960 --> 00:32:54.640] Yeah, oh, don't get me started.
[00:32:54.640 --> 00:32:56.320] All right, I'll say it anyway.
[00:32:56.320 --> 00:33:02.000] So, yeah, this is a new imaging technology called HIP CT, which caught my attention this week.
[00:33:02.000 --> 00:33:12.240] This technology uses fourth-generation synchrotron radiation for medical and materials imaging, far better in many ways than others, like MRIs or CAT scans.
[00:33:12.240 --> 00:33:17.760] This started when I was reading a news item recently about imaging coelacanth fossils.
[00:33:17.720 --> 00:33:30.280] Now, the article kept going on and on about the coelacanths, but I kept thinking that they were kind of burying the lead with this revolutionary imaging technology that they just kind of mentioned almost as an aside.
[00:33:29.840 --> 00:33:32.040] And then, so I was looking at another article.
[00:33:32.600 --> 00:33:40.440] This one was from a couple of months ago, but it was an interesting news item about imaging human hearts with unprecedented detail.
[00:33:40.440 --> 00:33:43.720] And they mentioned the same imaging technology and they did the same thing.
[00:33:43.720 --> 00:33:45.320] It was just kind of like an aside.
[00:33:45.320 --> 00:33:49.160] So I'm like, I want to learn about this thing.
[00:33:49.400 --> 00:33:52.440] So I did some research where they weren't burying the lead.
[00:33:52.440 --> 00:33:58.360] And I learned about HIP CT or hierarchical phase contrast tomography.
[00:33:58.360 --> 00:34:00.040] And it was fascinating.
[00:34:00.040 --> 00:34:03.720] So HIPCT was funded by Chan Zuckerberg Initiative.
[00:34:03.720 --> 00:34:09.000] It's a multinational collaboration between scientists, mathematicians, clinicians, and more.
[00:34:09.000 --> 00:34:18.680] So these images, these imaging advances that they made are based on an upgrade of the European Synchrotron Radiation Facility to create brighter X-rays.
[00:34:18.680 --> 00:34:21.080] And that's kind of the crux of this whole news item.
[00:34:21.080 --> 00:34:27.400] And about these X-rays, it's all about, it turns out, synchrotron radiation, which was fascinating to study.
[00:34:27.560 --> 00:34:31.000] I knew a little bit about it, but I really deepened my understanding.
[00:34:31.160 --> 00:34:33.640] So synchrotron radiation is simply light.
[00:34:33.640 --> 00:34:39.880] It's electromagnetic radiation that's been emitted by charged particles like electrons that are accelerating.
[00:34:39.880 --> 00:34:46.760] But I'm not talking about accelerating like Steve's wife does in her high torque Tesla, going faster and faster on the highway.
[00:34:47.000 --> 00:34:49.800] In physics, that's called linear acceleration, right?
[00:34:49.800 --> 00:34:55.560] You're going faster or you're going slower, you're slowing down, which is it's still called acceleration.
[00:34:55.560 --> 00:34:57.160] It's just negative acceleration.
[00:34:57.400 --> 00:35:00.840] We, of course, call it deceleration, but it's still acceleration.
[00:35:00.840 --> 00:35:05.480] But there's another type of acceleration in physics called centripetal acceleration.
[00:35:05.480 --> 00:35:08.840] That means that essentially you're not going straight.
[00:35:08.840 --> 00:35:10.280] You're on a curved path.
[00:35:10.280 --> 00:35:20.800] So even if your velocity is the same and you're not accelerating classically, colloquially, but you're on a curving path, technically that's acceleration as well.
[00:35:20.800 --> 00:35:22.800] And I remember first learning that years ago.
[00:35:22.800 --> 00:35:30.960] And it's still hard for me to imagine that acceleration is moving on a curved path and having nothing to do with your velocity.
[00:35:30.960 --> 00:35:32.480] But that's what it is in physics.
[00:35:32.480 --> 00:35:40.960] Now, in electrons in the magnetic field going around and around a circular facility, they're experiencing this centripetal acceleration, right?
[00:35:40.960 --> 00:35:41.920] Because they're going around.
[00:35:42.560 --> 00:35:43.600] They're not going straight.
[00:35:43.600 --> 00:35:45.760] So this is a centripetal acceleration.
[00:35:45.760 --> 00:35:57.200] And if they're also moving very, very fast at relativistic speeds, then they will emit significant amounts of synchrotron radiation, which can be anything from infrared to hard x-rays.
[00:35:57.200 --> 00:35:58.560] So why does that happen?
[00:35:58.880 --> 00:35:59.840] Why is that even happening?
[00:35:59.840 --> 00:36:02.960] I was trying to figure out why does it release energy like that?
[00:36:03.200 --> 00:36:06.800] One reason is because that's what Maxwell's equations say will happen, and it does.
[00:36:06.800 --> 00:36:07.840] So he was right.
[00:36:07.840 --> 00:36:11.040] If you're not familiar with the Maxwell's equations, definitely check them out.
[00:36:11.040 --> 00:36:23.280] But another common explanation that you'll find is that if you imagine this electron going around and around, a tiny bit of its kinetic energy is being converted into that radiation as its path changes.
[00:36:23.280 --> 00:36:26.240] Okay, so that's one way to think of what's happening.
[00:36:26.240 --> 00:36:29.280] And that's why these electrons will slow down over time.
[00:36:29.280 --> 00:36:35.520] Because if you're losing some energy, some kinetic energy that's being converted into radiation, you are going to slow down.
[00:36:35.520 --> 00:36:46.880] And that's why these facilities that create synchrotron radiation, they have to continually pump more energy into the collider so that it will stay at the same linear velocity.
[00:36:46.880 --> 00:36:51.760] So, okay, so it's these synchrotron x-rays that are changing many fields of science.
[00:36:51.760 --> 00:36:59.280] They're millions to billions of times brighter than the x-rays that you got, Steve, when you broke both your arms playing Tarzan in that tree.
[00:36:59.280 --> 00:37:00.680] I'm sure you got an X-ray.
[00:37:00.680 --> 00:37:01.320] Oh, no.
[00:37:01.320 --> 00:37:03.080] My wrists, not my arms.
[00:37:03.080 --> 00:37:05.720] Well, wrists, arms, lower arms, whatever.
[00:37:06.920 --> 00:37:13.160] So the X-rays that you got were millions to billions of times dimmer than what they're creating here.
[00:37:13.400 --> 00:37:16.520] These are the brightest X-rays on Earth that I could find.
[00:37:16.520 --> 00:37:26.200] And the relatively new fourth-generation synchrotrons are even better than the Generation 3 that were around for years, kind of like going from Chat GPT-3 to 4.
[00:37:26.200 --> 00:37:32.200] These electrons are traveling at something like 99.99999% the speed of light.
[00:37:32.200 --> 00:37:35.560] So these are like ultra-relativistic, hyper-relativistic.
[00:37:35.560 --> 00:37:41.320] Now, this intense light can be focused to reveal super high-resolution images.
[00:37:41.320 --> 00:37:51.320] So it would be like using x-rays to not only see Steve's entire mangled wrist bone, but also the individual bone cells anywhere in that bone.
[00:37:51.320 --> 00:37:53.720] So that's kind of, we can't do that now.
[00:37:53.720 --> 00:38:07.720] We can't use this radiation to look at living people yet, but that's kind of like what we're dealing with here: from seeing this one image of this one discrete object to being able to also zoom down into the cellular level.
[00:38:08.040 --> 00:38:10.200] But it's not just the light intensity that's changing.
[00:38:10.440 --> 00:38:18.920] The x-ray beam itself can be tuned to specific wavelengths to analyze specific elements in the sample that you want to learn about.
[00:38:19.160 --> 00:38:26.520] The common film x-rays that you use to image your broken bones would use broad spectrum x-rays.
[00:38:26.520 --> 00:38:32.360] So these are using, it's kind of like almost kind of like a laser where you're tuning to very specific wavelengths.
[00:38:32.440 --> 00:38:43.240] So, the main benefit of this imaging technology is, if I had to distill it down to a few words, is that it decouples the field of view from resolution.
[00:38:43.240 --> 00:38:50.960] So, that means that a wide field of view, which historically would mean low resolution, no longer needs to be that way.
[00:38:44.760 --> 00:39:00.480] You could have a wide field of view that gets the whole image, though, say an entire human body can also be extremely high resolution.
[00:39:00.480 --> 00:39:02.960] And that's one of the major advances here.
[00:39:02.960 --> 00:39:05.040] So, how is this being used today?
[00:39:05.040 --> 00:39:11.120] So, I'll go back to the coelacanth story that they're using this technology to image coelacanth fossils.
[00:39:11.120 --> 00:39:13.600] Now, do you remember those coelacanth fish?
[00:39:13.920 --> 00:39:19.600] They were extinct, apparently, for millions of years until a fisherman caught one.
[00:39:19.600 --> 00:39:21.680] Like, what was that in the 70s, Steve?
[00:39:21.680 --> 00:39:23.760] Something like 60s or 70s.
[00:39:23.760 --> 00:39:24.640] Somebody caught one.
[00:39:24.960 --> 00:39:30.960] It's like pulling up a fossil that you thought that was long dead, that should not exist.
[00:39:30.960 --> 00:39:32.960] The guy found one from the depths.
[00:39:32.960 --> 00:39:44.720] So, they're using these synchrotron x-rays to image these new coelacanth fossils that were like 250 million years old that were still in the rock.
[00:39:44.960 --> 00:39:48.240] They didn't excavate this fossil, it was still in the rock.
[00:39:48.240 --> 00:40:00.480] So, they imaged every bone in the fossil that was in the rock, and they were able to create a full 3D model of the fossil bones with a level of detail that they've never seen with this type of fossil before.
[00:40:00.480 --> 00:40:05.680] They created a complete virtual skeleton with amazing detail.
[00:40:05.680 --> 00:40:19.600] Basically, they discovered that this was a third species of coelacanth that they become aware of: the two that they have found extant that are still living, and now this other fossil is a third fossil that they're aware of.
[00:40:19.600 --> 00:40:25.600] In this heart news item that I read about, the researchers were able to x-ray a heart with amazing detail.
[00:40:25.840 --> 00:40:31.640] Senior author Peter Lee DeFil, he's a professor of material science, University of College London.
[00:40:31.640 --> 00:40:37.400] He said, The atlas that we created in this study is like having Google Earth for the human heart.
[00:40:37.400 --> 00:40:45.240] It allows us to view the whole organ at global scale, then zoom into street level to look at cardiovascular features in unprecedented detail.
[00:40:45.240 --> 00:40:46.920] So that's a really cool analogy.
[00:40:46.920 --> 00:40:56.520] He continues: One of the major advantages of this technique is that it achieves a full 3D view of the organ that's around 25 times better than a clinical CT scanner.
[00:40:56.520 --> 00:41:07.240] In addition, it could zoom into cellular level in selected areas, which is 250 times better to achieve the same detail that we would get through a microscope, but without cutting the sample.
[00:41:07.240 --> 00:41:13.160] Being able to image whole organs like this reveals details and connections that were previously unknown.
[00:41:13.160 --> 00:41:14.120] Cool stuff.
[00:41:14.120 --> 00:41:25.000] So they say that the primary limiting factor was not even the image itself, but processing the huge amount of data that's created by this HIP CT technology.
[00:41:25.000 --> 00:41:30.280] Now, so the future of HIP CT looks even brighter than the x-rays it produces.
[00:41:30.600 --> 00:41:37.080] So, I mean, medical imaging, material science, biological research, they are already benefiting from this tool.
[00:41:37.080 --> 00:41:53.400] But going further in the future, as I love to do, of course, perhaps using a fifth-generation radiation source, which probably will eventually arrive at some point, the resolution will likely go from the already awesome micron scale, which is a millionth of a meter.
[00:41:53.400 --> 00:42:03.400] It could get even more awesome to the nanoscale, of course, a billionth of a meter, looking at molecular interactions and what's going on within cells.
[00:42:03.400 --> 00:42:13.560] So, as this becomes more radiation-efficient and safer, this could eventually allow for cellular-level detail on living humans without doing any nasty tissue sampling.
[00:42:13.560 --> 00:42:21.680] Because right now, it's X vivo, which means that you could take a living sample, but you've got to remove it from whatever you're, you know, whatever living creature.
[00:42:21.760 --> 00:42:24.800] So, if it were a human, you'd have to take a sample.
[00:42:24.800 --> 00:42:27.120] So, that's you can't do it on people yet.
[00:42:27.120 --> 00:42:31.440] So, of course, typical hospitals don't have access to synchrotron facilities.
[00:42:31.440 --> 00:42:36.720] So, in the future, accessibility for routine imaging is probably going to be one of the biggest hurdles.
[00:42:36.720 --> 00:42:44.320] You know, they will make this smaller, more efficient, and cheaper, but I think there's probably going to be a limit to how small it can be.
[00:42:44.320 --> 00:42:48.880] So, it's not going to be something that's going to be at a local small hospital.
[00:42:49.200 --> 00:42:49.840] But who knows?
[00:42:49.840 --> 00:42:51.200] Who knows what's going to happen?
[00:42:51.440 --> 00:42:53.120] But it's not just, this isn't just medical.
[00:42:53.120 --> 00:42:56.240] We're also talking material science as well.
[00:42:56.240 --> 00:43:05.280] We'll be able to use this to 3D image micro and nanostructures of composites, and that could lead to more advanced alloys, nanotubes, and polymers, and more.
[00:43:05.280 --> 00:43:09.440] And, like we've said on the show many times, material science is the shit.
[00:43:09.440 --> 00:43:18.000] That's the stuff that's the stuff if you make fundamental advances in material science that can impact society at such a huge level.
[00:43:18.000 --> 00:43:20.320] So many industries, so many levels of society.
[00:43:20.800 --> 00:43:23.440] Those are the real big changes that can really make it make a difference.
[00:43:23.440 --> 00:43:25.360] So, I'll definitely be tracking this tech.
[00:43:25.360 --> 00:43:25.760] All right.
[00:43:25.760 --> 00:43:26.880] Thanks, Bob.
[00:43:27.200 --> 00:43:28.080] All right, everyone.
[00:43:28.080 --> 00:43:32.240] We're going to take a quick break from our show to talk about our sponsor this week, Rocket Money.
[00:43:32.240 --> 00:43:43.360] Yeah, Steve, most Americans think that they spend about $62 per month on subscriptions, but the real number is closer to $300, and that adds up really fast over the course of a year.
[00:43:43.360 --> 00:43:49.280] If you just forget about a couple of subscriptions that you have going right now, that can cost you a lot of money.
[00:43:49.280 --> 00:43:52.960] Managing your finances can be complicated, and it's time-consuming.
[00:43:52.960 --> 00:44:00.760] Rocket Money simplifies your finances, they make it really easy to see exactly what's happening with all your finances and track your spending.
[00:43:59.760 --> 00:44:03.960] They give you full control over all of it right on your phone.
[00:44:04.280 --> 00:44:09.960] Rocket Money is essentially a personal finance app that helps find and cancel your unwanted subscriptions.
[00:44:09.960 --> 00:44:13.080] It also monitors your spending and helps lower your bills.
[00:44:13.080 --> 00:44:17.160] See all of your subscriptions in one place and know exactly where your money is going.
[00:44:17.160 --> 00:44:21.240] For anything you don't want anymore, Rocket Money can help you cancel them with a few taps.
[00:44:21.240 --> 00:44:26.120] Rocket Money will even try to negotiate lower bills for you, sometimes by up to 20%.
[00:44:26.120 --> 00:44:36.600] Rocket Money has over 5 million users and has saved a total of half a billion dollars in canceled subscriptions, saving members up to $740 a year when using all the app's features.
[00:44:36.600 --> 00:44:38.840] So stop wasting money on things you don't use.
[00:44:38.840 --> 00:44:44.920] Cancel your unwanted subscriptions by going to rocketmoney.com/slash SGU.
[00:44:44.920 --> 00:44:49.080] That's rocketmoney.com/slash SGU.
[00:44:49.080 --> 00:44:51.720] RocketMoney.com/slash SGU.
[00:44:51.720 --> 00:44:53.880] All right, guys, let's get back to the show.
[00:44:54.360 --> 00:44:55.480] Let me ask you guys a question.
[00:44:55.480 --> 00:44:57.640] You know, I like to start off with questions.
[00:44:57.960 --> 00:44:58.760] Yes, you do.
[00:44:58.760 --> 00:44:59.080] What?
[00:45:00.520 --> 00:45:06.600] Should we police what physicians say in public on healthcare topics?
[00:45:06.600 --> 00:45:14.040] Specifically, how should the regulatory agencies respond to physicians spreading medical misinformation?
[00:45:14.200 --> 00:45:15.960] I think they should be drawn and quartered.
[00:45:17.320 --> 00:45:18.600] I think they should follow it.
[00:45:18.600 --> 00:45:24.200] They should track it, and there should be ramifications for spreading misinformation or disinformation.
[00:45:24.200 --> 00:45:26.360] So who gets to decide if it's misinformation?
[00:45:26.360 --> 00:45:31.560] Well, how is that not already part of the ethical code of the medical board?
[00:45:31.560 --> 00:45:31.960] It is.
[00:45:32.280 --> 00:45:32.640] Absolutely.
[00:45:32.520 --> 00:45:32.800] Okay.
[00:45:33.320 --> 00:45:37.080] Then, yeah, the medical board should do something about it.
[00:45:37.080 --> 00:45:37.880] So, who is policing?
[00:45:38.040 --> 00:45:41.640] So, in the U.S., in the U.S., there are state medical boards, right?
[00:45:41.640 --> 00:45:45.760] They license physicians, they license all professionals at the state level.
[00:45:44.840 --> 00:45:50.320] And the medical board also regulates those licenses.
[00:45:50.480 --> 00:46:03.600] So, if you're a physician in Connecticut, the Connecticut State Medical Board can hold you to the standard of care and also to a code of ethics, and they can take actions against your license.
[00:46:03.600 --> 00:46:05.840] Yeah, they can fully revoke it if they do something back in that.
[00:46:05.840 --> 00:46:09.360] They can censor you, they could suspend your license, they could revoke your license.
[00:46:09.360 --> 00:46:21.600] There's a range of things that they could do if there is a complaint that you've violated the standard of care or the code of ethics, and an investigation finds that to be true.
[00:46:21.600 --> 00:46:28.960] So, the question is: should a physician making anti-vaccine statements, for example, should that count?
[00:46:28.960 --> 00:46:32.000] Should that be something that violates absolutely, absolutely.
[00:46:32.320 --> 00:46:35.680] What you're saying to the public is no different than what you say to a patient.
[00:46:36.320 --> 00:46:38.320] Yeah, so there's two contexts here.
[00:46:38.320 --> 00:46:46.320] One is a physician giving misinformation directly to their patient, and then another context is giving it to the public, right?
[00:46:46.320 --> 00:46:51.040] Not to somebody that is actually their patient, but just like making statements in public.
[00:46:51.280 --> 00:46:53.360] But this question was specifically about the public.
[00:46:53.360 --> 00:46:54.640] Well, it covers both.
[00:46:55.040 --> 00:46:57.840] No, I'm saying both of those things should be violations.
[00:46:57.840 --> 00:47:03.840] Yeah, I mean, yeah, I think there'd be maybe a little bit more leeway for public, but not that much.
[00:47:04.160 --> 00:47:08.000] And anything egregious absolutely should be dealt with.
[00:47:08.000 --> 00:47:10.480] Okay, so obviously I agree with you.
[00:47:10.480 --> 00:47:19.520] So, there's basically a discussion going on within the medical profession as well as the legal profession about this issue.
[00:47:19.520 --> 00:47:28.640] And there's also the question of: are state medical boards enforcing any kind of standard in terms of public information?
[00:47:29.120 --> 00:47:31.560] Let me quote from you a couple of sources here.
[00:47:31.560 --> 00:47:37.240] The Federation of State Medical Boards, this is the organization of state medical boards in the U.S.
[00:47:37.560 --> 00:47:39.240] Their position is the following.
[00:47:39.240 --> 00:47:50.120] Physicians who generate and spread COVID-19 vaccine misinformation or disinformation are risking disciplinary action by state medical boards, including the suspension or revocation of their medical license.
[00:47:50.120 --> 00:47:59.000] Due to their specialized knowledge and training, licensed physicians possess a high degree of public trust and therefore have a powerful platform in society, whether they recognize it or not.
[00:47:59.000 --> 00:48:11.400] They also have an ethical and professional responsibility to practice medicine in the best interests of their patients and must share information that is factual, scientifically grounded, and consensus-driven for the betterment of public health.
[00:48:11.400 --> 00:48:19.400] Spreading inaccurate COVID-19 vaccine information contradicts that responsibility, threatens to further erode public trust in the medical profession, and puts all patients at risk.
[00:48:19.400 --> 00:48:21.720] The AMA basically agrees with them, right?
[00:48:21.720 --> 00:48:25.240] That's they give basically the same different words, same opinion.
[00:48:25.480 --> 00:48:26.360] What's the problem?
[00:48:26.520 --> 00:48:28.840] The problem is they're not doing it.
[00:48:28.840 --> 00:48:29.960] Oh, geez.
[00:48:29.960 --> 00:48:39.800] So there was a recent study, which is what triggered this whole review, that asked the question: are state medical boards disciplining physicians for spreading misinformation?
[00:48:39.800 --> 00:48:43.000] Now, this is a difficult question to get at, right?
[00:48:43.000 --> 00:48:46.040] When you think about how would you research, how would you answer this question?
[00:48:46.040 --> 00:48:50.440] They basically looked where the light was available, right?
[00:48:50.440 --> 00:48:56.840] So they got the information that they were able to get, and then we have to use that as a way to kind of infer how they're doing.
[00:48:56.840 --> 00:48:58.360] So this is a JAMA article.
[00:48:58.360 --> 00:49:12.120] They looked at 3,128 medical board disciplinary proceedings involving physicians, and they found that only 0.1% of them were for spreading misinformation to the community.
[00:49:12.120 --> 00:49:14.280] That was the least common reason.
[00:49:14.280 --> 00:49:14.920] 0.1%.
[00:49:15.760 --> 00:49:26.160] Direct patient misinformation and inappropriate advertising were tied for the next two least common reasons at 0.3%.
[00:49:26.480 --> 00:49:32.000] So these are way below all the other reasons that physicians get disciplined.
[00:49:32.000 --> 00:49:32.960] Now, of course, we don't know.
[00:49:33.280 --> 00:49:34.080] Is that representative?
[00:49:34.400 --> 00:49:39.440] We don't know how many times physicians are violating this or how many complaints there were, what the percent.
[00:49:39.680 --> 00:49:52.320] The thing is that complaints against physicians, if they are found to not be worth acting upon, right, if the physicians are essentially cleared, that information is not made public.
[00:49:52.320 --> 00:49:52.640] That is.
[00:49:53.120 --> 00:49:55.840] Yeah, and the complaints have to be filed in the first place.
[00:49:56.080 --> 00:49:59.040] Nobody is monitoring or policing physicians.
[00:49:59.200 --> 00:50:00.000] Yeah, so we don't.
[00:50:00.160 --> 00:50:10.240] There's no way to know how often physicians are spreading misinformation and how often the complaints are being made and how they're being decided.
[00:50:10.240 --> 00:50:20.800] All we know is of those complaints that are made public, it's a very teeny tiny slice of 0.1% basically for spreading public misinformation.
[00:50:20.800 --> 00:50:26.240] So the authors of the study were saying, yeah, you know, this suggests that they're just not doing their job.
[00:50:26.880 --> 00:50:29.920] They also say that we have some indirect information.
[00:50:29.920 --> 00:50:38.400] So there is an increase in the number of complaints about physicians spreading misinformation.
[00:50:38.400 --> 00:50:39.360] Okay, that's good.
[00:50:39.680 --> 00:50:44.400] So we know that while that's happening, the number of disciplinary actions is not increasing.
[00:50:44.400 --> 00:50:44.960] That's not good.
[00:50:44.960 --> 00:50:47.680] Yeah, so it doesn't appear, there does appear to be a disconnect.
[00:50:47.680 --> 00:50:50.800] But yeah, so but you know, we don't have really perfect information here.
[00:50:51.120 --> 00:51:00.000] Also, how many I'm I'm curious, how many physicians, and I know that that's a special division of doctors that you hear talking online, are actual medical doctors.
[00:51:00.440 --> 00:51:07.960] How many of the COVID misinformation is being touted by people who never, who like didn't maintain their license?
[00:51:08.520 --> 00:51:13.000] I don't know what the percentage is, but I know a lot of the ones that I am aware of are MDs.
[00:51:13.720 --> 00:51:14.360] And they still have to do that.
[00:51:14.600 --> 00:51:16.360] They have their MD and they're practicing.
[00:51:16.360 --> 00:51:16.760] Yeah.
[00:51:17.160 --> 00:51:17.880] Yeah, that's scary.
[00:51:17.880 --> 00:51:18.680] That's really scary.
[00:51:18.680 --> 00:51:19.480] Yeah, it is.
[00:51:19.480 --> 00:51:32.120] The other side of the coin, though, is the question: if we think it's reasonable for medical boards to hold physicians accountable for spreading misinformation, are they the right people to do it?
[00:51:32.120 --> 00:51:33.880] And probably not.
[00:51:33.880 --> 00:51:40.440] Yeah, the consensus seems to be that whether or not you think they should be doing it, they don't have the resources to do it.
[00:51:40.440 --> 00:51:42.280] They're basically not in a position to do it.
[00:51:42.280 --> 00:52:03.800] They don't have the infrastructure of people who can, first of all, determine if something crosses over that line to, all right, this is demonstrable, egregious misinformation, not something in the gray zone, not a matter of opinion, not something about which an ethical physician can have a minority but reasonable opinion.
[00:52:03.800 --> 00:52:10.760] This is demonstrably wrong, harming the public health, and is something that we could say is over the line.
[00:52:11.080 --> 00:52:14.440] Well, and there's enough examples.
[00:52:14.440 --> 00:52:16.600] I mean, there are famous examples like Dr.
[00:52:16.600 --> 00:52:26.200] Death in Texas and like these different, but there are enough examples of individuals who it took far too long for them to be stripped of their license.
[00:52:26.200 --> 00:52:33.320] Like they caused so much harm before enough evidence was amassed and a license was actually revoked.
[00:52:33.320 --> 00:52:38.040] I don't think people are losing their licenses left and right for small things.
[00:52:38.040 --> 00:52:41.720] Like, it has to be really egregious, and a lot of people have to be harmed for it.
[00:52:41.880 --> 00:52:46.400] Generally speaking, state boards are very, very forgiving.
[00:52:47.280 --> 00:52:56.480] Yeah, and so, and so that's already worrisome that they would then be tasked with policing this public health issue because it really is a public health issue.
[00:52:57.040 --> 00:53:02.400] So, I think the answer is that they need to reconfigure themselves to be able to do this.
[00:53:02.400 --> 00:53:09.120] And the other way to think about this is that there are other resources out there that they could be availing themselves of.
[00:53:09.600 --> 00:53:10.160] Skeptic groups.
[00:53:10.320 --> 00:53:11.600] For example, no, actually, no.
[00:53:11.600 --> 00:53:14.480] What I would say is that this is not something that we should be doing.
[00:53:15.040 --> 00:53:24.560] We should be educating the public about this and maybe pointing the finger at individuals, but we can't be the ones to be deciding whose license gets acted against.
[00:53:24.560 --> 00:53:30.560] But there are other layers of quality control within medicine, right?
[00:53:30.560 --> 00:53:32.640] It's not just state medical boards.
[00:53:32.640 --> 00:53:34.960] There's also specialty boards, right?
[00:53:34.960 --> 00:53:46.560] So, if you're an internal medicine doctor, you can get board certified by the Board of Internal Medicine, which means they can withdraw your certification if you violate their standards.
[00:53:47.040 --> 00:53:48.800] You also have to be credentialed at your hospital.
[00:53:48.960 --> 00:53:51.040] You also have to be credentialed at your hospital.
[00:53:51.120 --> 00:53:54.640] Of course, you could have your own private practice somewhere affiliated with no other institution.
[00:53:54.640 --> 00:53:56.960] You don't have to be board certified.
[00:53:56.960 --> 00:53:59.760] You can just have a license and do whatever you want.
[00:53:59.760 --> 00:54:05.120] So, then there's always two layers: there's the state medical board, and then there's the law.
[00:54:05.120 --> 00:54:06.720] You could always be sued, right?
[00:54:06.720 --> 00:54:08.480] There's the malpractice standard.
[00:54:08.480 --> 00:54:14.160] Of course, that's not the one we want to rely upon because that only kicks in after harm has already been done, right?
[00:54:14.480 --> 00:54:20.480] But is there, should there be like some sort of CDC NIH, like some sort of like public?
[00:54:20.720 --> 00:54:23.920] There's no federal-level regulation of licenses.
[00:54:23.920 --> 00:54:35.800] This would be a no, I don't mean of licenses, I mean of taking some form of action against an individual who is causing a viable risk to public health.
[00:54:36.040 --> 00:54:39.640] Yeah, so again, there's no legal infrastructure right now for this to happen at the federal level.
[00:54:39.640 --> 00:54:49.080] So if we want to existing infrastructure, what we could have is, say, for example, the specialty boards are in a great position.
[00:54:49.080 --> 00:54:55.160] They're in the best position to determine what is misinformation because this is what they do all the time.
[00:54:55.160 --> 00:54:56.200] They are the experts.
[00:54:56.200 --> 00:55:01.880] They put together panels of experts to review the scientific evidence and come up with practice guidelines.
[00:55:01.880 --> 00:55:05.480] And that could include communication guidelines as well.
[00:55:05.480 --> 00:55:09.560] And then all the state has to do is say, yeah, whatever these guys say, right?
[00:55:09.560 --> 00:55:17.400] All they have to do is we're going to utilize the recognized American board certification organizations and their practice guidelines.
[00:55:17.400 --> 00:55:18.760] And they do this all the time.
[00:55:18.760 --> 00:55:22.280] The state boards don't reinvent the wheel 50 times for every state.
[00:55:22.440 --> 00:55:25.240] They use, you know, like the AMA ethical standards.
[00:55:25.240 --> 00:55:27.400] They say, yep, we'll adopt those, right?
[00:55:27.400 --> 00:55:40.520] They will use, when you're sued for, or there's an action, a complaint that you violated the standard of care, for example, they don't determine themselves what the standard of care is.
[00:55:40.520 --> 00:55:44.760] They refer to published guidelines or experts or whatever, other people.
[00:55:44.760 --> 00:56:10.040] So it's not a stretch to say that the specialty boards need to get way more involved with this and can be a resource for the state medical boards who then also need to be acting on these much more aggressively and availing themselves of the specialty boards to determine who is violating, who's spreading misinformation and disinformation.
[00:56:10.360 --> 00:56:18.480] But then there's another layer here, a wrinkle, and I don't know if you want to get into it, Steve, but there is a wrinkle that comes down to straight up political pressure.
[00:56:18.480 --> 00:56:36.160] And when the zeitgeist of the state, whether we're talking about the directives of the state attorney general or the directives of a surgeon general, are explicitly saying, no, we promote these practices, we promote this misinformation.
[00:56:36.160 --> 00:56:42.640] Now things get really murky because there may be political pressure being leveraged onto that state medical board.
[00:56:42.640 --> 00:56:49.920] Well, there shouldn't be, and almost by definition, there really can't be any pressure by any federal person or agency.
[00:56:50.080 --> 00:56:51.360] No, I'm talking about at the state level.
[00:56:51.360 --> 00:56:52.160] Oh, yeah, at the state level.
[00:56:52.160 --> 00:56:53.600] Yeah, it's all at the state level.
[00:56:53.600 --> 00:56:54.560] But that's what I'm saying.
[00:56:54.560 --> 00:56:58.400] There are states in which this is pro-misinformation.
[00:56:59.120 --> 00:57:04.640] Is the governor going to put pressure on the professional board who licenses the professionals that they should be independent?
[00:57:04.880 --> 00:57:08.720] You see this if it happens all the time.
[00:57:08.720 --> 00:57:10.880] Yeah, it should not happen.
[00:57:11.200 --> 00:57:17.440] The state attorney or the state surgeon general or whoever it is, do they have a surgeon general of each state?
[00:57:17.440 --> 00:57:17.680] No.
[00:57:17.680 --> 00:57:18.880] Usually I don't either.
[00:57:19.520 --> 00:57:21.040] Attorney General handles it.
[00:57:21.200 --> 00:57:22.080] Yeah, the AG.
[00:57:22.080 --> 00:57:30.080] So, but I mean, I saw this when I was living in Florida, these directives coming from state officials saying don't get vaccinated.
[00:57:30.080 --> 00:57:31.280] Oh, yeah, absolutely.
[00:57:31.840 --> 00:57:37.520] But that does not tie the hands of any professional organization.
[00:57:37.520 --> 00:57:43.600] And I have been involved with professional organizations essentially fighting with the state over this.
[00:57:43.600 --> 00:57:56.800] And also, like a state medical board, you know, again, sort of relying upon professional organizations in order to act against somebody who was being protected politically, for example.
[00:57:56.800 --> 00:58:00.360] So I think they are at cross purposes, which is the way it should be.
[00:58:00.360 --> 00:58:02.360] Yeah, which is good
Prompt 2: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 3: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
Prompt 5: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 2 of 3 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
leveraged onto that state medical board.
[00:56:42.640 --> 00:56:49.920] Well, there shouldn't be, and almost by definition, there really can't be any pressure by any federal person or agency.
[00:56:50.080 --> 00:56:51.360] No, I'm talking about at the state level.
[00:56:51.360 --> 00:56:52.160] Oh, yeah, at the state level.
[00:56:52.160 --> 00:56:53.600] Yeah, it's all at the state level.
[00:56:53.600 --> 00:56:54.560] But that's what I'm saying.
[00:56:54.560 --> 00:56:58.400] There are states in which this is pro-misinformation.
[00:56:59.120 --> 00:57:04.640] Is the governor going to put pressure on the professional board who licenses the professionals that they should be independent?
[00:57:04.880 --> 00:57:08.720] You see this if it happens all the time.
[00:57:08.720 --> 00:57:10.880] Yeah, it should not happen.
[00:57:11.200 --> 00:57:17.440] The state attorney or the state surgeon general or whoever it is, do they have a surgeon general of each state?
[00:57:17.440 --> 00:57:17.680] No.
[00:57:17.680 --> 00:57:18.880] Usually I don't either.
[00:57:19.520 --> 00:57:21.040] Attorney General handles it.
[00:57:21.200 --> 00:57:22.080] Yeah, the AG.
[00:57:22.080 --> 00:57:30.080] So, but I mean, I saw this when I was living in Florida, these directives coming from state officials saying don't get vaccinated.
[00:57:30.080 --> 00:57:31.280] Oh, yeah, absolutely.
[00:57:31.840 --> 00:57:37.520] But that does not tie the hands of any professional organization.
[00:57:37.520 --> 00:57:43.600] And I have been involved with professional organizations essentially fighting with the state over this.
[00:57:43.600 --> 00:57:56.800] And also, like a state medical board, you know, again, sort of relying upon professional organizations in order to act against somebody who was being protected politically, for example.
[00:57:56.800 --> 00:58:00.360] So I think they are at cross purposes, which is the way it should be.
[00:58:00.360 --> 00:58:02.360] Yeah, which is good because it avoids that conflict of the fact that it's not.
[00:57:59.840 --> 00:58:02.920] Yeah, right.
[00:58:03.000 --> 00:58:10.920] And then the national board certification boards, they are completely independent, right?
[00:58:10.920 --> 00:58:12.280] They are not even part of the government.
[00:58:12.280 --> 00:58:21.880] They are just academic kind of professional organizations whose purpose is really to just promote and certify high standards within their specialty.
[00:58:21.880 --> 00:58:26.200] Again, I think they're optimally positioned to deal with these issues.
[00:58:26.200 --> 00:58:31.080] And I think they just need to functionally connect with the state medical boards.
[00:58:31.800 --> 00:58:43.080] So, and unfortunately, they've been gun-shy, though, because they've been burned so often by both states and by individual quacks, right?
[00:58:43.080 --> 00:58:54.760] So, for example, and competing professional organizations, like the famously, like in the 80s, the chiropractors sued the AMA for practice infringement and won.
[00:58:54.760 --> 00:59:00.920] And so, the AMA has like been completely gun-shy about dealing with pseudoscience in medicine ever since then.
[00:59:00.920 --> 00:59:13.000] In Connecticut, for example, the state government went against the American Infectious Disease Society, right, on the question of chronic Lyme disease.
[00:59:13.000 --> 00:59:17.160] They sided with the quacks against the professional organization.
[00:59:17.160 --> 00:59:22.040] So, the states should absolutely not do that, but they absolutely do do that, right?
[00:59:22.040 --> 00:59:34.840] The legislature, the legislature takes on the regulatory roles or supermans the regulatory role for purely political purposes, always causing mischief, never in a good way, in my experience.
[00:59:34.840 --> 00:59:38.840] They should just basically let the professionals handle it and stay the freak out of it.
[00:59:38.840 --> 00:59:51.360] Well, and that's what I'm so afraid of: is deregulating and the increased distrust of these professional organizations of institutional knowledge as a whole.
[00:59:51.680 --> 01:00:03.840] What happens when you get to a scary place where in certain states, these types of licenses are no longer even regulated, and anybody can call themselves a doctor.
[01:00:03.840 --> 01:00:09.680] Yeah, well, if you have a complete erosion of expertise and professionalism and any kind of quality control, the game is over.
[01:00:09.680 --> 01:00:10.560] But we're not there.
[01:00:10.560 --> 01:00:11.440] We are not there yet.
[01:00:11.440 --> 01:00:13.520] We're not there, but we have holes.
[01:00:13.520 --> 01:00:14.640] Yeah, absolutely.
[01:00:14.640 --> 01:00:16.400] Yeah, and they're mainly legislative.
[01:00:16.400 --> 01:00:17.840] They are mainly legislative.
[01:00:17.840 --> 01:00:18.480] Yeah.
[01:00:18.480 --> 01:00:24.000] But that's where a lot of power lies, and that's the part that scares me because the deregulation is happening at the legislative level.
[01:00:24.000 --> 01:00:24.640] Yeah.
[01:00:24.640 --> 01:00:29.760] Yeah, and there's the healthcare freedom laws, which are all happening at the state legislative level.
[01:00:29.760 --> 01:00:32.000] That's where all the mischief is happening.
[01:00:32.000 --> 01:00:40.480] But while they still have the power to, you know, to protect the public with some level of ethical and quality control, they should do their job.
[01:00:40.480 --> 01:00:45.760] And right now they're really not doing it because I just think they're not set up for it.
[01:00:45.760 --> 01:00:50.160] You know, again, they have not adapted to this new world of misinformation, and they need to.
[01:00:50.160 --> 01:00:51.360] That's the bottom line.
[01:00:51.360 --> 01:01:02.320] 100%, because otherwise, who is going to protect vulnerable individuals who literally do not know any better and are trying to listen to smart people giving them good advice.
[01:01:02.640 --> 01:01:08.080] Yeah, you end up with the dueling experts, you know, and everyone just thinks, ah, whatever.
[01:01:08.080 --> 01:01:09.040] I could believe whatever I want.
[01:01:09.200 --> 01:01:13.040] I'll just go with my tribe, you know, because everyone has their own experts.
[01:01:13.040 --> 01:01:13.520] Exactly.
[01:01:13.520 --> 01:01:13.840] Yeah.
[01:01:13.840 --> 01:01:14.720] And they're expert.
[01:01:14.720 --> 01:01:15.760] It seems legitimate.
[01:01:15.760 --> 01:01:16.560] They're licensed.
[01:01:16.560 --> 01:01:17.280] They're practicing.
[01:01:17.280 --> 01:01:18.000] They're credentialed.
[01:01:18.000 --> 01:01:19.360] And they've never taken that away yet.
[01:01:19.600 --> 01:01:27.680] If you want the benefits of being licensed and credentialed and certified, you have to actually adhere to the standard.
[01:01:27.680 --> 01:01:28.960] You can't have it both ways.
[01:01:28.960 --> 01:01:39.640] You can't say, I'm a board-certified physician, so you can trust me, and not be liable to those very certifying entities saying, Nope, you can't say that.
[01:01:39.640 --> 01:01:40.680] That's not true.
[01:01:40.680 --> 01:01:42.360] That's misinformation.
[01:01:42.360 --> 01:01:45.800] So the whole, there is no First Amendment defense here.
[01:01:45.800 --> 01:01:48.840] There isn't, because this is not personal speech or private speech.
[01:01:48.840 --> 01:01:50.520] This is professional speech.
[01:01:50.520 --> 01:01:52.280] Nobody deserves to be licensed.
[01:01:52.280 --> 01:01:54.440] Nobody deserves to be certified.
[01:01:54.440 --> 01:01:56.040] And you're board certified.
[01:01:56.040 --> 01:02:03.000] And so, yes, you can absolutely lose your board certification if you violate whatever standard the board decides to have.
[01:02:03.000 --> 01:02:05.240] It's completely up to them, right?
[01:02:05.240 --> 01:02:09.560] As long as it's being applied fairly and not selectively, you know.
[01:02:09.560 --> 01:02:14.600] But if they say this is the standard, you have to abide by our standards in order to be board certified.
[01:02:14.600 --> 01:02:21.880] And that includes passing an exam and maintaining your continuing medical education and practicing within the standard of care, blah, blah, blah.
[01:02:21.880 --> 01:02:29.240] And for any reason they could say, nope, you no longer meet whatever we consider to be the requirements for certification, it can be removed.
[01:02:29.240 --> 01:02:46.200] So anyway, yeah, I think it's pretty clear, but the bottom line is there needs to be pressure to take a more active role in policing misinformation and disinformation coming from licensed and certified physicians.
[01:02:46.200 --> 01:02:48.920] Jay, it's who's that noisy time?
[01:02:48.920 --> 01:02:52.040] All right, guys, last week I played this noisy.
[01:03:05.640 --> 01:03:06.920] You guys have any guesses?
[01:03:06.920 --> 01:03:11.480] Yeah, shots of Ready Whip being passed around.
[01:03:11.480 --> 01:03:12.960] Everyone, open your mouth.
[01:03:15.920 --> 01:03:18.800] So, a listener named Matthew Morrison wrote in and said, Hi, Jay.
[01:03:18.800 --> 01:03:24.880] My wife, Nicole, daughter Nev, and I all agree this sounds like a fire extinguisher being used.
[01:03:24.880 --> 01:03:27.200] I think it would be louder than that if it were a fire extinguisher.
[01:03:27.280 --> 01:03:28.560] Those things can be loud.
[01:03:28.560 --> 01:03:31.280] Yeah, when you're right on there, sure, they're crazy loud.
[01:03:31.280 --> 01:03:34.640] It's not a fire extinguisher, but that's a good guess.
[01:03:34.640 --> 01:03:37.600] A listener named Joel Harding said, Hi, Skeptics Guide.
[01:03:37.600 --> 01:03:43.680] My six-year-old son Reed guesses that this who's that noisy for this week is an axolotl.
[01:03:44.000 --> 01:03:47.040] And I, you know, I have, I didn't know that they made noise.
[01:03:47.040 --> 01:03:47.840] Do they make noise?
[01:03:47.840 --> 01:03:48.480] I mean, do they?
[01:03:48.560 --> 01:03:50.160] I don't think they make noise like that.
[01:03:50.160 --> 01:03:51.360] They're like teeny tiny.
[01:03:51.360 --> 01:03:57.040] I mean, if they did make a noise, I suppose it would be something like that with them like screaming, my lungs are on the outside.
[01:03:57.920 --> 01:04:00.880] You know, it's a pretty scary thing that they got going on there.
[01:04:00.880 --> 01:04:06.080] Another listener named Adam Russell wrote in and said, This week's noisy is the mating call of a Minock.
[01:04:08.240 --> 01:04:09.120] Good job, right?
[01:04:09.120 --> 01:04:10.560] I don't know if everybody got that.
[01:04:10.560 --> 01:04:12.160] Anyone know what a Minock is?
[01:04:12.160 --> 01:04:12.480] I do.
[01:04:12.480 --> 01:04:13.600] Yeah, from Star Wars.
[01:04:13.920 --> 01:04:14.640] Star Wars.
[01:04:14.640 --> 01:04:16.560] Specifically, like the true on wirelines.
[01:04:16.960 --> 01:04:17.760] Yeah, the cables, you know.
[01:04:18.000 --> 01:04:20.080] Specifically, the Empire Strikes Back.
[01:04:20.080 --> 01:04:21.680] Another listener named Nick wrote in.
[01:04:21.680 --> 01:04:23.360] He said, Hello, second time guesser.
[01:04:23.360 --> 01:04:28.160] My guest this week is a brush-tail possum letting you know you're in its territory.
[01:04:28.160 --> 01:04:33.680] I've recently had two of these cute-looking monstrous sound creatures move in under my house.
[01:04:33.680 --> 01:04:39.120] They are not the brush-tail possum, and I wouldn't be happy if I met one of those.
[01:04:39.120 --> 01:04:40.560] We have a winner from last week, guys.
[01:04:40.560 --> 01:04:41.520] Are you prepared?
[01:04:41.760 --> 01:04:43.200] You're all brimming with excitement.
[01:04:43.200 --> 01:04:43.920] I could tell.
[01:04:43.920 --> 01:04:44.560] I'm not ready.
[01:04:44.960 --> 01:04:45.360] Now I'm ready.
[01:04:45.440 --> 01:04:50.560] Jesse Babonis wrote, I think this noisy is the call of a kiwi bird.
[01:04:51.200 --> 01:04:53.200] That is the sound of a kiwi bird.
[01:04:53.200 --> 01:04:54.160] Listen again.
[01:05:01.080 --> 01:05:02.440] Not a happy creature.
[01:05:02.440 --> 01:05:07.560] Steve, I have to ask you: did you have any idea that it was a bird?
[01:05:08.120 --> 01:05:09.960] That was my guess.
[01:05:09.960 --> 01:05:10.520] You did.
[01:05:11.080 --> 01:05:13.320] But you had no idea what the actual bird was.
[01:05:13.320 --> 01:05:15.160] Yeah, I haven't heard that one before.
[01:05:15.160 --> 01:05:16.120] Kiwi, huh?
[01:05:16.120 --> 01:05:16.840] Kiwi, huh?
[01:05:17.000 --> 01:05:21.000] That little thing that we saw at night in New Zealand?
[01:05:21.000 --> 01:05:23.400] Well, yeah, we actually saw it during the day, didn't we?
[01:05:23.400 --> 01:05:23.720] Did we?
[01:05:23.720 --> 01:05:24.600] I thought it was.
[01:05:24.680 --> 01:05:28.600] Well, it was day to us, but it was in a night every enclosure.
[01:05:28.600 --> 01:05:32.440] And the enclosure was flipped so that its night was our day so that we could see it, right?
[01:05:32.440 --> 01:05:32.920] Remember that?
[01:05:32.920 --> 01:05:33.800] Oh, it was like a trick.
[01:05:33.960 --> 01:05:35.320] Yeah, oh, yeah, yeah, yeah.
[01:05:35.320 --> 01:05:37.160] Yeah, well, apparently they make that noise.
[01:05:37.160 --> 01:05:39.960] So not the most inviting noise, huh?
[01:05:40.280 --> 01:05:41.080] No.
[01:05:41.720 --> 01:05:43.400] Sounds terrifying.
[01:05:43.400 --> 01:05:45.640] All right, guys, I have a new noisy for you this week.
[01:05:45.640 --> 01:05:48.040] I'm going to play it for you right now.
[01:06:06.840 --> 01:06:07.480] Yes.
[01:06:07.480 --> 01:06:08.600] It's not a bird.
[01:06:08.600 --> 01:06:10.280] That is most definitely not our bird.
[01:06:10.280 --> 01:06:13.160] I will give everybody that unbelievable hint.
[01:06:13.960 --> 01:06:20.200] So, guys, if you think you know what this week's noisy is, or if you have a cool noisy that you heard, give me an email.
[01:06:20.200 --> 01:06:21.480] Send me an email.
[01:06:21.480 --> 01:06:27.000] The only place I want you to send me the email is wtn at the skepticsguide.org.
[01:06:27.000 --> 01:06:28.200] A few quick things, Steve.
[01:06:28.200 --> 01:06:35.640] So you could always become a patron of the SGU in these times where the world needs skepticism.
[01:06:35.640 --> 01:06:49.600] I believe I wouldn't say now more than ever, but now is definitely an inflection point where we really feel the need to help educate people on how to think and how to be reasonable and how to use reason to come to your conclusions.
[01:06:44.600 --> 01:06:50.800] That's why we're here, guys.
[01:06:51.040 --> 01:06:57.680] So if you would like to support the work that we do, go to patreon.com forward slash skepticsguide, all one word.
[01:06:57.680 --> 01:06:59.200] You can also join our mailing list.
[01:06:59.200 --> 01:07:05.680] Every week we send out an email, it gives you a summary of all the things that the SGU has done in the last week.
[01:07:05.680 --> 01:07:11.360] And you could sign up going to theskepticsguide.org and on our homepage, you'll see a button to join there.
[01:07:11.360 --> 01:07:15.360] And we have three shows coming up: the DC, Washington, D.C.
[01:07:15.520 --> 01:07:21.280] private show, which is a live recording of the SGU, and also we do an extra hour of fun and audience interaction.
[01:07:21.280 --> 01:07:22.720] It's different every time.
[01:07:22.720 --> 01:07:25.040] If you haven't been to one, go to one.
[01:07:25.040 --> 01:07:28.800] There's only a certain number of these that we will ever do left, correct, guys?
[01:07:29.440 --> 01:07:32.160] It's an unknown number, but it is limited by the same.
[01:07:32.400 --> 01:07:33.920] That is a correct.
[01:07:34.000 --> 01:07:35.920] That is a correct statement.
[01:07:36.240 --> 01:07:43.760] We also, on the same day, this is happening all on December 7th of this year, we have a skeptical extravaganza.
[01:07:43.760 --> 01:07:45.440] This is our stage show.
[01:07:45.600 --> 01:07:50.320] This is a live performance that includes all of the SGU and George Robb.
[01:07:50.320 --> 01:07:52.560] Ian is manning the tech table.
[01:07:52.560 --> 01:07:57.200] And this is a show that we have been perfecting over the last 10 years.
[01:07:57.200 --> 01:07:58.560] It's a ton of fun.
[01:07:58.560 --> 01:08:02.240] I really do hope that you decide to come join us in DC.
[01:08:02.720 --> 01:08:06.880] We also will be scheduling more of these around the country in 2025.
[01:08:06.960 --> 01:08:09.760] We are happening, you know, the talks are happening right now.
[01:08:09.760 --> 01:08:11.520] So this is a great show to grab.
[01:08:11.520 --> 01:08:15.520] You can also go to the skepticsguide.org to buy tickets for that.
[01:08:15.520 --> 01:08:20.080] And then, and the thing that I'm most excited for is Notacon 2025.
[01:08:20.080 --> 01:08:21.640] This is our conference.
[01:08:21.640 --> 01:08:24.640] We started last year in 23.
[01:08:24.640 --> 01:08:27.520] And it is an SGU celebration.
[01:08:27.520 --> 01:08:29.360] That's the best way I could describe this.
[01:08:29.520 --> 01:08:37.880] The conference is largely around socializing, but we do, of course, provide an incredible amount of content during the two days that the conference is going on.
[01:08:37.880 --> 01:08:42.040] And there is going to be a lot of fun new things that we do this year.
[01:08:42.040 --> 01:08:46.440] If you're interested, you can go to nataconcon.com.
[01:08:46.440 --> 01:08:47.800] Evan, what was that URL?
[01:08:48.280 --> 01:08:51.880] NataconCon.com.
[01:08:51.880 --> 01:08:52.600] Correct.
[01:08:52.600 --> 01:08:53.240] Yes.
[01:08:53.240 --> 01:08:54.280] Please do join us.
[01:08:54.280 --> 01:08:55.400] It's an awesome time.
[01:08:55.400 --> 01:08:57.560] It's a great place to meet new people.
[01:08:57.560 --> 01:09:00.680] We had such a blast last year that people are still talking about it.
[01:09:00.680 --> 01:09:02.040] I know I am.
[01:09:02.040 --> 01:09:05.800] And Sharon made us a ton of Biscotti, right, guys?
[01:09:05.800 --> 01:09:06.600] Oh, boy.
[01:09:06.600 --> 01:09:07.080] Oh, boy.
[01:09:07.080 --> 01:09:08.120] That was just so good.
[01:09:08.120 --> 01:09:08.600] Yeah.
[01:09:08.600 --> 01:09:10.120] Yeah, but we ate it all.
[01:09:11.080 --> 01:09:11.960] That's why we're doing it again.
[01:09:12.040 --> 01:09:13.000] It's the only way to get it.
[01:09:13.720 --> 01:09:15.080] To do Natakon.
[01:09:15.080 --> 01:09:15.960] BiscottiCon.
[01:09:16.920 --> 01:09:18.200] That's a wrap, Steve.
[01:09:18.200 --> 01:09:19.320] All right, thanks, Jay.
[01:09:19.320 --> 01:09:20.280] One quick email.
[01:09:20.280 --> 01:09:26.840] We had a number of people write us to follow up with our mid-Atlantic accent discussion.
[01:09:26.840 --> 01:09:27.320] Cool.
[01:09:27.800 --> 01:09:29.000] I file this under.
[01:09:29.000 --> 01:09:32.120] It's always more complicated than you think.
[01:09:32.120 --> 01:09:33.400] It's a big file.
[01:09:33.400 --> 01:09:40.600] So mostly they were referring to a video by a phonetics expert, Dr.
[01:09:40.600 --> 01:09:48.840] Jeff Lindsay, who argues that the notion that the mid-Atlantic accent is fake is itself a myth.
[01:09:48.840 --> 01:09:57.160] And then he sets about to debunk that myth, citing a lot of evidence about the origins of the accent.
[01:09:57.160 --> 01:10:00.760] I got to say, I'm not convinced by his video.
[01:10:00.760 --> 01:10:16.480] I mean, clearly he understands a lot more about this than I do, but I looked up, just did, tried to research it as best as I could, and I could certainly find other experts that seemed to be as legitimate as this guy saying something extremely different.
[01:10:14.840 --> 01:10:21.680] So, at the very least, I think we need to say that this is a controversy or it's contested.
[01:10:21.840 --> 01:10:27.760] I don't know if this guy, this guy has a YouTube channel, right, where he has a lot of videos about language.
[01:10:27.760 --> 01:10:36.320] That doesn't necessarily mean that he's the definitive authority on everything just because he has a YouTube channel, but he certainly is an expert and he has a reasoned argument to make.
[01:10:36.320 --> 01:10:43.840] He's basically saying that the quote-unquote mid-Atlantic accent is a Northeastern regional accent.
[01:10:43.840 --> 01:10:52.560] And it's kind of what, you know, by the early 20th century, what was settled on as sort of a generic American accent.
[01:10:52.560 --> 01:11:02.800] And he cites evidence from like the aspects of the mid-Atlantic accent that exist in the regional accents of the Northeast.
[01:11:02.960 --> 01:11:04.640] Those features all pre-existed.
[01:11:04.640 --> 01:11:06.720] So he said, how could that be fake?
[01:11:07.120 --> 01:11:16.160] However, I also find references that say that not only was the mid-Atlantic accent manufactured, they named the guy who did it.
[01:11:16.160 --> 01:11:20.080] They said this Australian phonetician, William Henry Till.
[01:11:20.320 --> 01:11:21.040] Phonetician.
[01:11:21.040 --> 01:11:36.080] Yeah, phonetician, created the mid-Atlantic accent basically for the purpose of saying there should be an English, like a one accent that any educated person who speaks English anywhere in the world should speak, right?
[01:11:36.080 --> 01:11:39.040] And that's the quote-unquote mid-Atlantic accent.
[01:11:39.040 --> 01:11:56.320] So I'm not sure how to square the apparent history here that is being cited, you know, authoritatively by some experts with this guy's analysis who's saying that, no, it was a Northeastern American regional accent.
[01:11:56.320 --> 01:11:57.040] So I don't know.
[01:11:57.400 --> 01:11:58.960] It seems, it's complicated.
[01:11:58.960 --> 01:12:10.600] I do, there are some points which I think, some of which we made ourselves, and I've actually, interestingly, I first learned that the Mid-Atlantic accent was just a generic American accent.
[01:12:10.680 --> 01:12:13.320] It was picked because it was a generic American accent.
[01:12:13.320 --> 01:12:16.280] It was sort of the most neutral accent.
[01:12:16.280 --> 01:12:22.600] I only later read that it was sort of an invented or a crafted accent.
[01:12:22.600 --> 01:12:30.840] And now this guy's saying, no, it actually is just sort of the regional accent that is the most neutral, and that's why it was kind of preferred by actors.
[01:12:30.840 --> 01:12:39.480] Also, a lot of Hollywood actors were British, so that's why they might sound like half American, half British, et cetera.
[01:12:39.480 --> 01:12:51.080] I also have, you know, researching this, and I've read before that a lot of the quote-unquote regional accents laid on top of that is a socioeconomic accent.
[01:12:51.080 --> 01:13:05.960] So if you, what we might be hearing is just a northeastern upper society accent, which sounds more similar across regions than do the lower socioeconomic class accents, right?
[01:13:05.960 --> 01:13:19.720] So in other words, an upper class person living in Boston, New York, and London would sound a lot more similar to each other than a lower socioeconomic class person from those three cities, right?
[01:13:19.960 --> 01:13:33.400] Yeah, it's also interesting to think too, Steve, about like when did, like, when we think of a neutral broadcast accent today, kind of a generic American accent today, when did that actually start?
[01:13:33.720 --> 01:13:37.280] Yeah, I mean, because in the 1600s and 1700s, we were all British.
[01:13:37.160 --> 01:13:37.600] That was right.
[01:13:37.880 --> 01:13:38.520] You know what I mean?
[01:13:38.520 --> 01:13:41.800] Like that's how people talk about, or French, yeah, it depends on where they were.
[01:13:41.800 --> 01:13:46.720] But the English accent was probably a British accent.
[01:13:44.760 --> 01:13:49.280] And that would have been regional as well.
[01:13:49.600 --> 01:13:51.280] What do you mean by English?
[01:13:51.280 --> 01:13:52.640] I mean from England.
[01:13:52.640 --> 01:13:52.960] Yeah.
[01:13:53.680 --> 01:13:54.960] Yeah, of course it was a British accent.
[01:13:55.360 --> 01:13:56.160] That's what you're saying.
[01:13:56.480 --> 01:14:00.240] I'm saying that early Americans, I'm not talking about indigenous Americans.
[01:14:00.240 --> 01:14:00.640] Yeah.
[01:14:00.640 --> 01:14:01.760] They were settlers.
[01:14:01.760 --> 01:14:02.160] Oh, yeah.
[01:14:02.160 --> 01:14:02.320] So.
[01:14:02.560 --> 01:14:03.040] Colonial.
[01:14:03.040 --> 01:14:07.360] Well, we know that, for example, the New York accent is a Dutch accent.
[01:14:07.360 --> 01:14:07.600] Right.
[01:14:07.600 --> 01:14:07.920] So I'm saying that.
[01:14:08.000 --> 01:14:13.040] The Southern accent is actually closer to a British accent than the Northern accent.
[01:14:13.040 --> 01:14:15.520] And the Appalachian accent is very Scottish.
[01:14:15.520 --> 01:14:19.520] Like, at what point, though, did it become definitively American?
[01:14:19.760 --> 01:14:20.960] Well, that's what this guy is saying.
[01:14:21.840 --> 01:14:29.600] Around this time is when it sort of coalesced into a distinctive American accent, and this was just the Northeastern version of that.
[01:14:29.920 --> 01:14:31.520] That feels very late.
[01:14:31.520 --> 01:14:32.480] Yeah, I don't know.
[01:14:32.480 --> 01:14:35.200] Because it feels like we skipped the whole 1800s.
[01:14:35.200 --> 01:14:35.760] I don't know.
[01:14:36.320 --> 01:14:36.720] Yeah.
[01:14:36.880 --> 01:14:41.440] So I have not come to a comfortable place where I think I've wrapped my head around this.
[01:14:41.440 --> 01:14:42.640] I think it's complicated.
[01:14:42.640 --> 01:14:44.640] I think there are a lot of moving parts here.
[01:14:45.040 --> 01:14:52.880] It may not be so simple as this was a fake accent that was taught in finishing schools and affected by actors and actresses.
[01:14:52.880 --> 01:14:59.200] I think there's multiple things going on, but I'm not convinced that it's entirely a natural regional accent either.
[01:14:59.200 --> 01:15:05.360] I think there is an element of education and socioeconomic status in here as well.
[01:15:05.680 --> 01:15:06.720] And it may be both.
[01:15:06.720 --> 01:15:12.000] It may be that some people naturally spoke that way and other people go, I like the way that person is talking.
[01:15:12.000 --> 01:15:13.920] I'm going to emulate that.
[01:15:13.920 --> 01:15:14.480] Totally.
[01:15:14.480 --> 01:15:23.120] And I know, you know, that if you go to like radio broadcast school right now, they will teach you how to talk in a very specific way.
[01:15:23.640 --> 01:15:27.440] There is there is a broadcast, quote-unquote, accent today.
[01:15:27.440 --> 01:15:30.600] We just take it for granted because it's normal for us.
[01:15:31.000 --> 01:15:32.440] Yeah, it's the neutral American.
[01:15:32.760 --> 01:15:36.040] And it's what like I have friends who are actors, right?
[01:15:29.840 --> 01:15:37.240] But are from other countries.
[01:15:37.480 --> 01:15:43.080] So they go to dialect coaches and they learn how to speak neutral American, like broadcast.
[01:15:43.080 --> 01:15:45.320] They also learn how to speak in different accents.
[01:15:45.320 --> 01:15:45.480] Right.
[01:15:45.640 --> 01:15:46.680] Whale pilgrim.
[01:15:46.680 --> 01:15:53.400] So that neutral American accent is not fake, but it's not natural either.
[01:15:53.400 --> 01:15:58.520] It is affected, even though it is based upon regional accents.
[01:15:58.520 --> 01:15:59.720] So I think it's a complicated.
[01:15:59.800 --> 01:16:02.040] Yeah, that's an interesting way to even look at it, Steve.
[01:16:02.040 --> 01:16:06.280] Like I speak with more of a neutral broadcast accent.
[01:16:06.280 --> 01:16:11.080] I have never had a southern accent, but my entire family does.
[01:16:11.080 --> 01:16:11.560] Right.
[01:16:11.880 --> 01:16:13.800] So like you were adopted.
[01:16:13.800 --> 01:16:15.000] But did I fake that?
[01:16:15.000 --> 01:16:17.400] Did I choose that at a certain point in my life?
[01:16:17.400 --> 01:16:19.320] Did I unconsciously choose that?
[01:16:19.320 --> 01:16:21.160] Like it's very interesting.
[01:16:21.480 --> 01:16:24.040] Some people have thick accents and some people don't.
[01:16:24.120 --> 01:16:24.520] Could be.
[01:16:24.520 --> 01:16:26.440] Your peers may have influenced it.
[01:16:26.440 --> 01:16:28.120] I'm sure a lot of things influenced it.
[01:16:28.280 --> 01:16:29.880] I watched television growing up.
[01:16:29.880 --> 01:16:30.360] I don't know.
[01:16:30.680 --> 01:16:31.160] Well, it's funny.
[01:16:31.160 --> 01:16:37.960] I remember I had a conversation with somebody from the South, and we were talking about the accents on television.
[01:16:37.960 --> 01:16:43.880] And it was their impression that people on TV spoke in a southern accent.
[01:16:43.880 --> 01:16:56.120] While it was our impression, coming from Connecticut, that they spoke in a northern accent that the southern accent is the exception, and they thought the northern accent was the exception.
[01:16:56.920 --> 01:17:00.520] And I think that both of those are really obvious.
[01:17:00.520 --> 01:17:01.240] Yeah, right.
[01:17:02.200 --> 01:17:05.400] I hear your Kinnedicushian accents all the time on the show.
[01:17:05.400 --> 01:17:07.080] But my parents, I'm from Texas.
[01:17:07.080 --> 01:17:08.680] My parents sound southern.
[01:17:09.000 --> 01:17:10.680] I definitely hear their accents as well.
[01:17:10.680 --> 01:17:15.760] But it's funny, everyone thought their own accent was the rule, and the other ones were the exception.
[01:17:15.760 --> 01:17:16.240] Yeah.
[01:17:16.240 --> 01:17:20.000] You know, it's like, oh, yeah, that show, yeah, that's because that like Barney Miller, they're in New York.
[01:17:14.920 --> 01:17:21.520] So of course they're speaking with a New York accent.
[01:17:21.600 --> 01:17:23.200] That's not the TV accent, though.
[01:17:23.200 --> 01:17:25.200] The TV accent is my accent.
[01:17:25.760 --> 01:17:27.040] That's what everyone perceived.
[01:17:27.040 --> 01:17:27.840] That's funny.
[01:17:27.840 --> 01:17:28.640] Hey, Steve.
[01:17:28.640 --> 01:17:29.200] Yeah.
[01:17:29.520 --> 01:17:32.080] I forgot to mention something, and this one's pretty cool.
[01:17:32.080 --> 01:17:38.560] So we just turned on on our Patreon the ability to gift somebody a membership.
[01:17:38.560 --> 01:17:39.280] Whoa, neat.
[01:17:39.280 --> 01:17:39.840] Oh, man.
[01:17:40.000 --> 01:17:43.920] I thought this would be a cool thing to do for the holiday season.
[01:17:43.920 --> 01:17:45.440] So you just go to patreon.com.
[01:17:45.520 --> 01:17:47.680] Just buy them our books and give them a membership.
[01:17:48.160 --> 01:17:49.680] No accent required.
[01:17:49.680 --> 01:17:56.400] So all you got to do is go to patreon.com forward slash skepticsguide forward slash gift.
[01:17:56.400 --> 01:17:59.360] And that link will be on the SGU homepage as well.
[01:17:59.360 --> 01:18:01.280] And what you could do is you could do two things.
[01:18:01.280 --> 01:18:05.600] One, you could give it to anybody you want as a gift in any parameters that you want.
[01:18:05.600 --> 01:18:07.920] And it could be a cool Christmas present or whatever.
[01:18:07.920 --> 01:18:15.600] The other thing you could do is you could donate a gift membership to someone in the skeptical community that we will find.
[01:18:15.600 --> 01:18:17.920] You know, Ian and I are working on this right now.
[01:18:18.400 --> 01:18:21.440] It will be very easy to locate people that want a free membership.
[01:18:21.440 --> 01:18:26.560] And then they'll know who you are if you want them to as the person giving the gift.
[01:18:26.560 --> 01:18:27.680] So check it out.
[01:18:27.680 --> 01:18:31.440] I think it's a really cool thing to do and it also can support the SGU.
[01:18:31.440 --> 01:18:31.680] All right.
[01:18:31.680 --> 01:18:32.640] Thanks, Jay.
[01:18:32.640 --> 01:18:36.080] Guys, we have a great interview coming up with Michael Mann.
[01:18:36.080 --> 01:18:37.840] So let's go to that interview now.
[01:18:48.000 --> 01:18:50.080] We are joined now by Michael Mann.
[01:18:50.080 --> 01:18:52.160] Michael, welcome back to The Skeptics Guide.
[01:18:52.320 --> 01:18:52.800] Thanks, Steve.
[01:18:52.800 --> 01:18:53.440] It's great to be with you.
[01:18:53.440 --> 01:18:54.960] Yeah, always a pleasure to have you on the show.
[01:18:54.960 --> 01:19:08.120] So, you are obviously a famous climate scientist, and that's had both good and negative aspects in that you've been targeted because you have spoken publicly what the science says about climate change.
[01:19:08.120 --> 01:19:12.200] So, since we've talked to you last time, it's been a few years, actually, I think, since we spoke last time.
[01:19:12.520 --> 01:19:17.080] How do you think things are going in terms of like hearts and minds, climate change, et cetera?
[01:19:17.480 --> 01:19:18.600] It's a great question.
[01:19:18.920 --> 01:19:28.120] This was sort of the topic of my penultimate book, not the most recent one, our fragile moment, but the new climate war back in 2020.
[01:19:28.760 --> 01:19:37.960] Sort of, there's this evolution away from denialism because it's difficult to deny something that people can see with their own two eyes.
[01:19:38.360 --> 01:19:43.160] Literally, being told to not believe what you're seeing with your eyes, what you're hearing with your ears.
[01:19:43.160 --> 01:19:54.520] And so, I think polluters, other bad actors sort of promoting the fossil fuel agenda, if you will, recognize that denialism doesn't really cut it anymore.
[01:19:54.520 --> 01:19:57.160] So, they've turned to other tactics.
[01:19:57.160 --> 01:20:00.440] And among them, ironically, is doomism.
[01:20:01.080 --> 01:20:04.520] So, I mean, and I mentioned this in my talk today.
[01:20:05.160 --> 01:20:17.240] That's in some ways a greater threat today than denialism because you have people who would otherwise sort of be mobilizing, would be on the front lines demanding action, becoming convinced it's too late to do anything.
[01:20:17.240 --> 01:20:19.880] And it potentially leads them down this path of disengagement.
[01:20:19.880 --> 01:20:24.040] And you've got some bad actors who are sort of feeding the flames of doomism.
[01:20:24.200 --> 01:20:26.120] It's a really diabolical.
[01:20:26.440 --> 01:20:30.280] It's absolutely diabolical and Machiavellian.
[01:20:32.120 --> 01:20:33.800] And that's what we face today.
[01:20:33.800 --> 01:20:36.360] Yeah, and it's incredibly effective.
[01:20:36.360 --> 01:20:42.960] And in fact, even like among ourselves, we've had to be vigilant about not falling for the doomism.
[01:20:42.760 --> 01:20:49.760] Because just to put it, my perspective on this is that, and this is the same thing with anti-vaxxers or whatever.
[01:20:50.560 --> 01:20:56.720] The big tell is that they're a denialist, is that the answer is always do nothing.
[01:20:57.200 --> 01:21:03.520] Or it's always the negative, or it's always the vaccines, or it's like with climate change, the answer is always, well, we just shouldn't do anything.
[01:21:03.520 --> 01:21:06.160] And then reverse engineer what the argument is.
[01:21:06.320 --> 01:21:07.600] Yes, reverse engineer.
[01:21:07.600 --> 01:21:08.240] And it's different.
[01:21:08.240 --> 01:21:10.400] The argument changes, but the conclusion is always the same.
[01:21:10.400 --> 01:21:12.160] So at first it was like, it's not happening.
[01:21:12.720 --> 01:21:14.720] Then it's like, well, it's natural cycles.
[01:21:15.440 --> 01:21:18.640] And now it's like, well, there's nothing we can do about it anyway, so why bother?
[01:21:18.880 --> 01:21:22.960] And unfortunately, we've kind of been playing into that.
[01:21:23.520 --> 01:21:24.480] The people were like the...
[01:21:24.560 --> 01:21:24.960] That's right.
[01:21:24.960 --> 01:21:26.240] Yeah, we're saying, hey, we've got to do something.
[01:21:26.240 --> 01:21:27.120] It's going to get a bit too late.
[01:21:27.120 --> 01:21:27.680] It's going to be bad.
[01:21:27.680 --> 01:21:28.160] Blah, blah, blah.
[01:21:28.160 --> 01:21:29.040] Now they're saying, yeah, you're right.
[01:21:29.040 --> 01:21:30.160] It's like, in fact, it's too late.
[01:21:30.160 --> 01:21:31.600] So why bother doing anything?
[01:21:31.600 --> 01:21:35.760] It's like, shit, we've inadvertently been playing into that narrative.
[01:21:35.760 --> 01:21:39.360] And as you say, it undercuts the very people who are probably the most activists.
[01:21:39.840 --> 01:21:43.040] Yeah, it is truly pernicious.
[01:21:43.040 --> 01:22:06.960] And the good news, I suppose, is in a sense, it's an evolution towards the stages, evolving in terms of the stages of denialism or delay and towards something that we can hopefully, the point I make in terms of our messaging on climate, it's good that there's a sense of urgency.
[01:22:07.200 --> 01:22:09.200] We can make use of that.
[01:22:09.200 --> 01:22:11.440] People clearly understand the urgency.
[01:22:11.440 --> 01:22:13.920] It's the agency that we have to convince them of.
[01:22:13.920 --> 01:22:15.520] But we're sort of halfway there.
[01:22:16.240 --> 01:22:21.120] And we need to sort of deprogram them from the misinformation that they've been fed.
[01:22:21.280 --> 01:22:33.960] And that's what, again, is so sort of pernicious here is the way that misinformation has been used and weaponized to feed this sort of agenda of despair and doomism.
[01:22:34.280 --> 01:22:42.520] And, you know, so it's a matter of telling people: look, the science action, look what the models predicted and look what the observations show.
[01:22:42.520 --> 01:22:51.160] The models done a really good job, and those same models tell us that we can limit warming below catastrophic levels.
[01:22:51.160 --> 01:22:53.880] The only obstacle is politics, right?
[01:22:54.040 --> 01:22:54.520] Action.
[01:22:54.520 --> 01:22:55.880] It's not physics.
[01:22:55.880 --> 01:22:56.200] Right.
[01:22:56.200 --> 01:22:57.800] The physics doesn't tell us we can't do it.
[01:22:57.800 --> 01:22:58.440] The technology doesn't.
[01:22:58.600 --> 01:22:59.720] Right, we have the technology too.
[01:22:59.720 --> 01:23:00.600] I make that point too.
[01:23:00.600 --> 01:23:03.560] We actually don't need any technology that we don't already have.
[01:23:03.560 --> 01:23:03.960] That's right.
[01:23:03.960 --> 01:23:05.000] And it's only going to get better.
[01:23:05.800 --> 01:23:09.960] But even if we flatlined our technology, we could still solve it, and it's going to get better.
[01:23:10.200 --> 01:23:10.840] Yeah, exactly.
[01:23:10.840 --> 01:23:17.560] The economies of scale, there's learning by doing the new efficiencies that arise when you start deploying that new technology.
[01:23:17.560 --> 01:23:22.440] So yeah, we will do better than sort of the baseline scenario.
[01:23:22.440 --> 01:23:29.880] But even the baseline scenario of a shift towards these new energy options will get us there.
[01:23:29.880 --> 01:23:41.320] But how do you, maybe convince is the wrong word, how do you have a conversation with individuals who are despairing and are despairing for legitimate reasons, right?
[01:23:41.320 --> 01:23:49.800] Because they look at the political will and they look at every global summit and we're constantly falling short of these arbitrary goals.
[01:23:49.800 --> 01:23:53.080] And they go, like, yeah, we might have the technology, we might have whatever.
[01:23:53.080 --> 01:23:57.000] We're not going to get there because they don't want to enough, and I'm just a person and I can't control this.
[01:23:57.000 --> 01:23:57.880] Yeah, no, absolutely.
[01:23:57.880 --> 01:24:00.760] And this is often the first point I make.
[01:24:00.760 --> 01:24:06.600] We have to sort of parse out the different sort of sources of despair and doomism.
[01:24:06.600 --> 01:24:12.520] Because the doomism, based on the idea that we've triggered runaway warming, and I mean, that's all just nonsense.
[01:24:12.520 --> 01:24:14.680] And we've got to debunk that nonsense.
[01:24:15.200 --> 01:24:26.400] The argument that, well, look at how fraught our politics are, look at the election that you know we're in right now, that's much more difficult to combat because it's understandable.
[01:24:26.400 --> 01:24:40.960] Young folks, especially I teach at the University of Pennsylvania, I interact with Gen Z folks every day, and I understand that, and I understand where it comes from, the cynicism that we can ever rise to the challenge.
[01:24:41.280 --> 01:24:59.120] What I would say is that I remain sort of a stubborn optimist, is the expression I would use, in that I see in those same students a level of fortitude and just sort of Gen Zers.
[01:24:59.120 --> 01:25:03.600] To me, they get it in a way that previous generations didn't.
[01:25:03.600 --> 01:25:07.600] They understand that it's their future that's on the line.
[01:25:07.600 --> 01:25:09.120] They are very savvy.
[01:25:09.120 --> 01:25:20.080] They're very, when it comes to communication and me, because they grew up in this sort of social media world and they have tools and abilities that my generation didn't have, and they're using those tools to make a real difference.
[01:25:20.080 --> 01:25:28.320] So when I see the students that I teach, every day they sort of inspire me and they convince me that we can do it.
[01:25:29.520 --> 01:25:31.520] I've described myself as a pragmatic optimist.
[01:25:32.080 --> 01:25:32.480] Same way.
[01:25:32.480 --> 01:25:35.040] I'm an optimist because that's the only thing that's going to work.
[01:25:35.920 --> 01:25:39.280] Because being a doomist is self-defeating.
[01:25:39.920 --> 01:25:42.800] And the same thing with political, it's, oh, there's nothing we could do about it.
[01:25:42.800 --> 01:25:44.000] That's a self-fulfilling prophecy.
[01:25:44.240 --> 01:25:45.200] That's exactly right.
[01:25:45.200 --> 01:25:47.200] And that's the struggle here, right?
[01:25:47.200 --> 01:25:55.040] Is that as scientists and as critical thinkers, we understand that we've got to be truthful to what the science says.
[01:25:55.520 --> 01:26:08.840] We cannot violate that oath that we have to be truthful to the science, which means that if the science really did tell us it was too late to prevent runaway warming, we'd have to come forward and say so.
[01:26:09.160 --> 01:26:18.840] I mean, and so Steve Schneider referred to this as sort of an ethical double bind, that we have to be truthful to the science and we have to be effective at the same time.
[01:26:18.840 --> 01:26:21.240] It's looking forward as we ever had to do.
[01:26:21.800 --> 01:26:23.560] I often see a parallel.
[01:26:23.560 --> 01:26:35.320] It's interesting in my work working with cancer patients, I see a big parallel, and I learn a lot from my patients that there's often a hope for the best, but prepare for the worst mentality.
[01:26:35.320 --> 01:26:46.600] And understanding that there are some things that are written in stone, and there are other things where, when it comes to hope, I think this is the biggest thing that has kept me from diving into cynicism.
[01:26:46.600 --> 01:26:48.200] Hope is a moving target.
[01:26:48.200 --> 01:26:52.360] You can adjust and adapt your hope based on the available evidence.
[01:26:52.600 --> 01:26:54.920] And it doesn't have to be all the way over here.
[01:26:55.160 --> 01:26:57.240] You can have hope for something over here.
[01:26:57.240 --> 01:26:58.040] That's achievable.
[01:26:58.280 --> 01:27:04.040] Yeah, when you're constantly sort of allowing that, it can keep you looking at the next step and the next way.
[01:27:04.520 --> 01:27:06.680] We're sort of wired that way, aren't we?
[01:27:06.680 --> 01:27:15.960] Yeah, and along those lines, because I was going to ask you this question too, because one of the tactics that we also take is: yes, there are tipping points, but climate change also is a spectrum.
[01:27:15.960 --> 01:27:20.120] It's like, and yeah, even if you think it's going to get bad, it could get worse.
[01:27:20.440 --> 01:27:23.800] So at this point, our goal is always to make it less bad.
[01:27:24.760 --> 01:27:25.560] No, absolutely.
[01:27:25.720 --> 01:27:37.400] And this is, I always try to disabuse folks from the notion that there's like this cliff, like this tipping point, that we warm the planet to 1.5 Celsius and we go off the climate cliff.
[01:27:37.400 --> 01:27:38.840] That's not how it works.
[01:27:39.160 --> 01:27:42.120] I liken it to a highway.
[01:27:42.120 --> 01:27:47.440] We're going down this dangerous highway and we want to get off at the earliest exit that we can.
[01:27:44.840 --> 01:27:52.320] If we miss the 1.5 exit, we don't drive all the way to Topeka.
[01:27:52.960 --> 01:27:55.280] We get off at the 1.6 exit.
[01:27:55.840 --> 01:28:06.400] But isn't there, though, isn't there something, though, to the idea, though, that a tipping point could be achieved where the climate basically resets itself into a new stable configuration?
[01:28:06.400 --> 01:28:07.840] I mean, isn't that a thing, though?
[01:28:08.160 --> 01:28:09.520] Yeah, so let me clarify.
[01:28:09.520 --> 01:28:11.600] So there is no one tipping point.
[01:28:12.240 --> 01:28:17.360] There are many possible tipping points with various subsystems of the climate.
[01:28:17.360 --> 01:28:19.360] There are certain tipping elements.
[01:28:19.360 --> 01:28:24.400] There are aspects of the climate system that have tipping point dynamics.
[01:28:24.640 --> 01:28:26.720] But no one tipping point is catastrophic.
[01:28:26.720 --> 01:28:34.640] Yeah, so if you look at global temperature, for example, that behaves very linearly, at least as far as we can see with the observations and the models.
[01:28:35.440 --> 01:28:40.080] The amount of warming is linearly proportional to the carbon emissions.
[01:28:40.080 --> 01:28:41.840] No obvious tipping point.
[01:28:41.840 --> 01:28:47.360] There's no runaway methane feedback of the sort that some climate doomers argue.
[01:28:48.720 --> 01:28:52.880] The warming is a very linear function, and the models have predicted it very well.
[01:28:52.880 --> 01:28:59.200] Where there is the potential for tipping points is in the non-linear components of the system.
[01:28:59.200 --> 01:29:01.120] And the ice sheets are a good example.
[01:29:01.280 --> 01:29:08.960] There are feedback processes that once you start the collapse of the ice shelves and the ice starts surging into the ocean, it sort of feeds back on itself.
[01:29:08.960 --> 01:29:11.520] There's something called the ice cliff instability.
[01:29:11.520 --> 01:29:17.440] There are these various positive feedback mechanisms, and of course, positive feedback is not a good thing here.
[01:29:18.560 --> 01:29:21.040] And so, there are elements of the climate system.
[01:29:21.040 --> 01:29:23.040] Ice sheet collapse is one of them.
[01:29:23.040 --> 01:29:28.160] There's been a lot of press coverage over the last week or so about the ocean conveyor.
[01:29:28.720 --> 01:29:29.360] Yes, AMOC?
[01:29:29.440 --> 01:29:30.680] The AMOC, exactly.
[01:29:30.840 --> 01:29:34.760] Atlantic meridional overturning circulation.
[01:29:30.000 --> 01:29:36.440] You can see why we don't spell it out.
[01:29:37.240 --> 01:29:39.000] You were a researcher on that, weren't you?
[01:29:39.800 --> 01:29:45.400] I signed my name to a letter from a number of scientists.
[01:29:45.400 --> 01:29:54.920] Stefan Ramsdorf, a friend of mine from Germany, leading climate scientist from the Potsdam Institute, was sort of spearheaded this.
[01:29:54.920 --> 01:30:02.280] But I signed my name to it, and for some reason, in the States anyway, I got mentioned quite a bit in the coverage.
[01:30:02.280 --> 01:30:07.160] I was sort of like a minor signatory of it, and I agree with the overall tenor of what it said.
[01:30:07.160 --> 01:31:11.200] And there's some fine print that didn't get much coverage in the that you know it's speculative that we don't know where the the tipping point is there are uncertainties in the models the observations you know are incomplete and so it's more speculative than definitive and this is about like the Atlantic conveyor belt turning off is that yeah they said you saw the movie the day after tomorrow yeah I mean and what happens in that movie is unrealistic right you're not gonna have tornadoes destroy Los Angeles you're not gonna have super cooled plumes of air killing people or an ice sheet forming you know within three days over the northern half of North America and you're certainly not gonna have Jake Gyllenha winning an academic decatal law there's just no way that's gonna be at least believable I mean you I could not suspend my disproportionate debelief in that they did say, though, like it could be it could be within a few decades, or it could be, you know, by the end of the century.
[01:31:11.200 --> 01:31:13.920] You know, I mean, the time span is is big, but.
[01:31:14.200 --> 01:31:17.040] The models, there's a huge spread in the models that we use.
[01:31:17.040 --> 01:31:20.640] Some of them, it collapses within a couple decades.
[01:31:20.640 --> 01:31:24.080] Some of them, it goes along happily to the end of the century.
[01:31:24.080 --> 01:31:29.920] The other thing is, if it happens, the consequences aren't quite as catastrophic as you might think.
[01:31:30.160 --> 01:31:31.520] Okay, that's encouraging.
[01:31:31.520 --> 01:31:35.920] Yeah, I mean, so what would happen is you'd get cooling in the North Atlantic.
[01:31:35.920 --> 01:31:51.840] And we wrote an article in Nature back in 2015 showing that there's signs, like if you look at the global temperature pattern of the last century, there's warming just about everywhere, but there's this little hole south of Greenland in the North Atlantic where temperatures have actually been cooling.
[01:31:51.840 --> 01:31:52.800] Is that what they call it?
[01:31:52.800 --> 01:31:53.440] The cold blob?
[01:31:53.440 --> 01:31:54.000] The cold blob.
[01:31:54.000 --> 01:31:54.320] There you are.
[01:31:54.320 --> 01:31:54.560] Absolutely.
[01:31:54.720 --> 01:31:55.840] Is Ireland in the cold blob?
[01:31:56.000 --> 01:31:57.280] Because they complain a lot about this.
[01:31:57.760 --> 01:31:59.360] It's getting colder in Ireland.
[01:31:59.360 --> 01:32:03.840] Well, they're in the periphery of where the area that could see.
[01:32:03.840 --> 01:32:10.160] So, I mean, in reality, there's only a small area in the North Atlantic that would likely cool.
[01:32:10.640 --> 01:32:21.920] You would, yeah, and maybe reaching in, you know, to like England, Ireland, slight cooling that could happen, but it's competing against the overall warming.
[01:32:21.920 --> 01:32:23.440] And so it's a battle against.
[01:32:23.680 --> 01:32:28.800] And the further you get away from the bullseye, the more the warming is likely to win out over the cooling.
[01:32:28.960 --> 01:32:33.200] It's like the heat miser and cold misery.
[01:32:33.280 --> 01:32:34.800] Cold misery is a good idea.
[01:32:34.960 --> 01:32:35.280] Absolutely.
[01:32:35.280 --> 01:32:40.240] In fact, I talk about cold miser and heat miser in Our Fragile Moment.
[01:32:40.640 --> 01:32:41.360] As you should.
[01:32:41.360 --> 01:32:41.760] Yeah.
[01:32:42.800 --> 01:32:44.560] And you name it.
[01:32:44.560 --> 01:32:45.840] We talk about it in the book.
[01:32:46.720 --> 01:32:57.680] You mentioned this idea that you're already starting to see some evidence of this, that you use the analogy of this really dangerous freeway and wanting to get off of it as soon as possible.
[01:32:57.680 --> 01:33:06.760] And yes, there may be a couple of different identified cliffs to be relative points where there are cataclysmic or maybe less cataclysmic events happening.
[01:33:07.080 --> 01:33:12.920] But I think an important point to reinforce there is that we are on the freeway now.
[01:33:12.920 --> 01:33:21.960] And just because I am in a home with good air conditioning and I can wear a coat in the winter and I can wear a tank top in the summer and I have plenty of water at my disposal.
[01:33:22.120 --> 01:33:25.480] There are people whose lives are already destroyed by this.
[01:33:25.880 --> 01:33:27.800] And I think we often don't bring that.
[01:33:27.800 --> 01:33:30.920] We keep talking about later on instead of right now.
[01:33:31.160 --> 01:33:33.400] So the climate refugees you're talking about, right?
[01:33:33.400 --> 01:33:43.640] Like, yeah, we if you don't live, if you don't live near the equator, then not much has really changed for me for if you're, you know, there are all of these areas.
[01:33:43.880 --> 01:33:46.760] Yeah, there's people that are already just massively displaced.
[01:33:47.960 --> 01:33:56.680] I just covered this famine that's happening across multiple countries where there's no that high nutrition yield food available to these children.
[01:33:56.680 --> 01:33:57.800] This is climate change.
[01:33:58.360 --> 01:33:58.600] Absolutely.
[01:33:58.840 --> 01:33:59.880] It's directly related.
[01:34:00.200 --> 01:34:00.760] Absolutely.
[01:34:00.760 --> 01:34:08.520] I mean, this is something that I try to convey when I talk about where we are, that dangerous climate change.
[01:34:08.520 --> 01:34:14.280] There's this false notion that, again, that cliff, like dangerous climate change is 1.5 Celsius.
[01:34:14.280 --> 01:34:16.440] That's what dangerous climate change.
[01:34:16.920 --> 01:34:18.600] Dangerous climate change is here.
[01:34:19.560 --> 01:34:25.240] If you're in Puerto Rico, if you're California, if you're, I mean, North Carolina.
[01:34:25.240 --> 01:34:26.040] North Carolina.
[01:34:26.760 --> 01:34:32.840] Yeah, as you said, we can't ever say that hurricane was due to climate change, but we know that there's way too many things happening.
[01:34:33.080 --> 01:34:44.360] Well, we know that rainfall, there's a study already, an attribution study, that shows that there was a 50% increase in the rainfall associated with Hellene that can be attributed to the warming of the planet.
[01:34:44.360 --> 01:34:47.600] And that's that can be linked directly to human lives.
[01:34:48.400 --> 01:34:48.640] Absolutely.
[01:34:44.840 --> 01:34:52.080] Lives loss and huge amounts of property damage.
[01:34:52.720 --> 01:34:54.240] A friend of mine lives in Texas, right?
[01:34:54.240 --> 01:34:56.160] And he's been telling me about the summers recently.
[01:34:56.400 --> 01:34:58.240] And, you know, and he's like, it's getting bad.
[01:34:58.240 --> 01:35:00.640] You know, the grid down there sucks.
[01:35:00.640 --> 01:35:03.440] It can't handle anything extreme that's going on.
[01:35:03.440 --> 01:35:04.800] You know, it's going to be a long time.
[01:35:05.120 --> 01:35:05.520] Yeah.
[01:35:05.520 --> 01:35:07.680] Oh, no, he had they were way above 110.
[01:35:07.680 --> 01:35:10.400] They're way above 110 last year, right?
[01:35:10.400 --> 01:35:10.800] That's okay.
[01:35:10.800 --> 01:35:13.200] You can just fly off to Cancun if you're the center of the country.
[01:35:13.840 --> 01:35:14.160] Exactly.
[01:35:14.880 --> 01:35:21.520] So what's funny is I'm talking to him, and in the conversation, we're like, what's happening with the people who live in Mexico?
[01:35:21.520 --> 01:35:23.600] Or you keep going down to the equator.
[01:35:23.600 --> 01:35:26.160] I mean, it's like, yeah, it's bad in Texas, but we're north.
[01:35:26.480 --> 01:35:27.920] Well, and that's very real.
[01:35:27.920 --> 01:35:37.360] I mean, and so, like, you know, this leads us to, so all of these calamities mean that there are areas of the planet that are becoming increasingly unlivable.
[01:35:37.360 --> 01:35:46.080] And that means more people, 8.2 billion people, and growing, competing for less land, less food, less water.
[01:35:46.400 --> 01:35:50.880] And that is a prescription for a dystopian huge.
[01:35:51.120 --> 01:35:53.040] You think we're having an immigrant problem now?
[01:35:53.440 --> 01:35:54.160] Imagine what that's going to happen.
[01:35:54.240 --> 01:35:57.280] I think it's just going to be a lot of that even.
[01:35:57.600 --> 01:36:00.720] Like, there are people who need a place to live.
[01:36:00.720 --> 01:36:01.360] Well, that's the thing.
[01:36:01.600 --> 01:36:04.400] The thing that scares me the most is that, right?
[01:36:04.400 --> 01:36:10.080] When we get to a point, let's say, you know, and I don't want to catastrophize, but this basically is a catastrophe.
[01:36:10.080 --> 01:36:18.160] But the idea is at some point, there's something that there's going to be some point in time when a lot of people are going to be displaced.
[01:36:18.160 --> 01:36:22.880] And it can't be like, it's not going to be a regional fix, like, oh, they just moved a little north here, whatever.
[01:36:22.880 --> 01:36:25.280] It's going to be like, we have to move people out.
[01:36:25.280 --> 01:36:26.480] This is a global effort.
[01:36:26.960 --> 01:36:30.600] It could be a thing where we have to save 100 million people from dying, you know?
[01:36:32.440 --> 01:36:33.320] We are in that.
[01:36:33.320 --> 01:36:34.280] Yeah, exactly.
[01:36:34.600 --> 01:36:35.400] It's already happening.
[01:36:35.640 --> 01:36:35.880] Exactly.
[01:36:35.960 --> 01:36:37.800] And it's causing massive geopolitical.
[01:36:37.880 --> 01:36:39.400] But you know what they say, Kara?
[01:36:39.400 --> 01:36:46.120] They say, well, we're not, the United States isn't the biggest greenhouse gas emitter, and if China's not doing it, why should we do it?
[01:36:46.120 --> 01:36:46.920] So we are.
[01:36:47.160 --> 01:36:48.120] We put more.
[01:36:48.120 --> 01:36:52.760] So the climate only cares about the cumulative carbon pollution that's been added.
[01:36:52.760 --> 01:36:54.280] It doesn't care about the country at all.
[01:36:54.840 --> 01:36:55.720] It doesn't care about the country.
[01:36:55.800 --> 01:36:58.200] It doesn't care about what year it was added.
[01:36:58.200 --> 01:37:02.120] So the United States has put more carbon pollution into the atmosphere than any other country.
[01:37:02.600 --> 01:37:03.800] Cumulatively, we are the big.
[01:37:03.960 --> 01:37:04.120] Yeah.
[01:37:04.360 --> 01:37:04.680] Currently.
[01:37:05.880 --> 01:37:07.880] Per capita, we're also the biggest.
[01:37:08.280 --> 01:37:08.760] Exactly.
[01:37:08.760 --> 01:37:10.280] But China just has a big population.
[01:37:10.600 --> 01:37:11.720] They're currently at the moment.
[01:37:11.880 --> 01:37:12.760] They're currently, exactly.
[01:37:12.760 --> 01:37:13.000] Exactly.
[01:37:13.400 --> 01:37:16.120] So they're also kicking our ass on solar power and other things.
[01:37:16.360 --> 01:37:19.720] Yeah, so they may mitigate that long before they even reach us cumulatively.
[01:37:19.960 --> 01:37:20.840] That's exactly right.
[01:37:21.080 --> 01:37:24.120] They're ahead of us on the renewable energy front.
[01:37:24.120 --> 01:37:29.480] They are, yes, they may be emitting carbon, more carbon than us, but not on a per capita basis.
[01:37:29.800 --> 01:37:30.360] Not even close.
[01:37:30.520 --> 01:37:32.680] So it's a diversionary tactic, right?
[01:37:33.080 --> 01:37:37.160] When you hear that, oh, what about, it's what about is it what we call it?
[01:37:37.560 --> 01:37:37.800] Yeah, yeah.
[01:37:38.600 --> 01:37:41.960] I know, it's frustrating because you're like, well, it's got to start somewhere.
[01:37:41.960 --> 01:37:44.600] You know, it's like, it's not like, okay, on three, we're all going to start doing this.
[01:37:45.080 --> 01:37:47.480] Well, it's the exact same thing as the doomism.
[01:37:47.480 --> 01:37:50.280] It's like, well, unless everyone's doing it, what's the point?
[01:37:50.280 --> 01:37:52.120] It's like, yeah, but every bit counts.
[01:37:52.120 --> 01:37:53.400] It could always be worse.
[01:37:53.400 --> 01:37:54.440] Making it less bad.
[01:37:54.680 --> 01:37:55.560] We're doing our best.
[01:37:55.560 --> 01:37:56.040] All that stuff.
[01:37:56.040 --> 01:37:56.680] It's like it's.
[01:37:56.680 --> 01:37:59.480] And I think it is part of the same counteracting the doomism thing.
[01:37:59.560 --> 01:37:59.640] Yeah.
[01:37:59.640 --> 01:38:06.680] You hear the analogy of like being in a rowboat and there's holes in the floor of the rowboat and you're not going to plug your hole because that guy over there hasn't plugged his hole.
[01:38:06.680 --> 01:38:07.400] Like, what are you doing?
[01:38:07.880 --> 01:38:08.760] Of course, I'm going to plug my hole.
[01:38:08.760 --> 01:38:10.760] Then I'll go help you plug your hole once my hole is plugged.
[01:38:10.920 --> 01:38:17.280] And the governmental b bureaucracy, it's, you know, it's maddening because we see issues being discussed.
[01:38:17.280 --> 01:38:18.000] Like, yeah, I get it.
[01:38:18.000 --> 01:38:19.280] Yeah, that's not a bad issue.
[01:38:14.760 --> 01:38:20.000] That's not a bad issue.
[01:38:20.160 --> 01:38:25.920] But can global warming please take a front seat at some point in some recent administration, right?
[01:38:25.920 --> 01:38:27.520] Well, what do you ask you next?
[01:38:28.160 --> 01:38:33.520] I actually wanted to piggyback on what you just said because I want to extend the analogy further.
[01:38:33.680 --> 01:38:39.520] And then there are those who are telling us that, no, we don't need to plug the holes in the bottom of the boat.
[01:38:39.520 --> 01:38:42.800] We just need spoons to spoon out the water.
[01:38:43.440 --> 01:38:43.840] Right.
[01:38:44.160 --> 01:38:44.880] Yeah, yeah.
[01:38:45.120 --> 01:38:45.520] All these things.
[01:38:45.680 --> 01:38:46.400] Geoengineering.
[01:38:46.720 --> 01:38:48.720] It's so obvious what we can do, right?
[01:38:49.360 --> 01:38:55.200] I had somebody email me going, What if everything stays the same and we just get really good at carbon capture?
[01:38:55.200 --> 01:38:58.880] And I'm like, We will be very good at carbon capture in about 100 years.
[01:38:58.880 --> 01:39:01.360] We'll have massive, awesome technology.
[01:39:01.520 --> 01:39:02.640] We can't wait till then.
[01:39:02.640 --> 01:39:03.360] And that's exactly what we're doing.
[01:39:03.840 --> 01:39:16.800] Speaking about that, speaking about that, some people I've read have a point of view where they say that at some point, some of these, maybe some of these non-linear tipping points will be kind of imminent, apparently.
[01:39:16.800 --> 01:39:20.640] And merely decreasing the warming isn't going to be enough.
[01:39:20.640 --> 01:39:25.680] And we need to actively cool the planet with some form of geoengineering.
[01:39:25.680 --> 01:39:27.440] So, what are your thoughts on geoengineering?
[01:39:27.600 --> 01:39:30.880] Yes, I have a whole chapter in the new climate war.
[01:39:30.880 --> 01:39:33.120] I'll talk a little bit about it in our fragile moment.
[01:39:33.120 --> 01:39:36.320] Geoengineering or what could possibly go wrong.
[01:39:37.040 --> 01:39:37.200] Right.
[01:39:37.440 --> 01:39:38.800] Like Snow Skier Serviversia.
[01:39:39.760 --> 01:39:40.000] Right.
[01:39:40.160 --> 01:39:40.480] Yes.
[01:39:40.880 --> 01:39:41.200] Yeah.
[01:39:42.400 --> 01:39:56.400] And there was also an episode of this climate series that ran on one of the streaming services where they had a geoengineering sort of mishap.
[01:39:56.280 --> 01:39:57.600] Mishap, exactly.
[01:39:57.920 --> 01:39:58.240] Oops.
[01:39:58.400 --> 01:40:08.280] I think it involved the problem is that it's being used mostly as a crutch right now, as an excuse for business as usual.
[01:40:08.520 --> 01:40:12.040] I mean, who's really, you know, who is really pushing geoengineering?
[01:40:12.040 --> 01:40:15.640] It's Rex Tillerson, you know, former CEO of ExxonMobil.
[01:40:15.640 --> 01:40:18.040] Climate change is just an engineering problem.
[01:40:18.040 --> 01:40:25.320] This idea that we can engineer our way out of it is sort of a license to continue to pollute.
[01:40:25.320 --> 01:40:31.720] And as we've already sort of alluded to here, this is not proven technology at scale.
[01:40:31.720 --> 01:40:34.120] Like, yeah, it's theoretical at this point.
[01:40:34.360 --> 01:40:41.400] There's no demonstration that it can be done at scale on the timeframe necessary, like carbon capture.
[01:40:41.400 --> 01:40:46.520] Now, geoengineering other things like sulfate aerosol in the stratosphere.
[01:40:46.520 --> 01:40:50.680] I mean, that's something that we have lots of, we've seen it happen with volcanic eruptions.
[01:40:50.680 --> 01:40:54.840] We know what it can, the impact it can have to the Earth's albedo and stuff.
[01:40:54.840 --> 01:41:01.800] So those are, that seems more interesting to me because you put those up there, and some studies show that diamond dust apparently would be really efficient.
[01:41:02.040 --> 01:41:09.320] Put them up there, they last for a while, and they slowly have a lot of $100 trillion.
[01:41:09.640 --> 01:41:12.600] $100 trillion or $200 trillion, over 40, over 40 years.
[01:41:12.840 --> 01:41:13.640] 200 trillion.
[01:41:13.640 --> 01:41:14.440] Over 40 years.
[01:41:14.440 --> 01:41:18.040] But, I mean, it could be, they said 1.6 degrees cooling in that time.
[01:41:18.040 --> 01:41:19.400] But here's the part that's so infuriating.
[01:41:19.400 --> 01:41:22.360] It's like I think about these like almost like a medical analogy, right?
[01:41:22.360 --> 01:41:25.560] Like, if you don't want cavities, brush your teeth.
[01:41:25.560 --> 01:41:28.440] Don't just extract the tooth every time you get a cavity.
[01:41:28.760 --> 01:41:33.560] Like, if you're allergic to shellfish, don't just eat a bunch of shellfish and keep giving yourself a bunch of stuff.
[01:41:34.440 --> 01:41:45.360] I used that framing in a piece I wrote for BBC once that, you know, as sort of planetary doctors, if you will, we have an oath to first do no harm.
[01:41:44.840 --> 01:41:52.000] And this violates that oath, potentially, because what you're describing, like, yeah, it's like a volcano.
[01:41:52.080 --> 01:41:53.680] We know how during a volcanic eruption.
[01:41:53.840 --> 01:41:54.720] Acid rain, acid rain, isn't it?
[01:41:54.880 --> 01:41:58.480] Whether there's acid rain, and it's also the climate that you get.
[01:41:58.960 --> 01:42:05.840] So if all we cared about was the global average temperature, you can calculate how much sulfate aerosol you would need to pump in to cool off.
[01:42:06.960 --> 01:42:16.000] But the problem is, if you look at the pattern of the climate response to a volcanic eruption, it's not the inverse of the global warming pattern.
[01:42:16.480 --> 01:42:19.680] So some areas will warm even faster.
[01:42:19.680 --> 01:42:20.880] Some will cool.
[01:42:20.880 --> 01:42:25.760] The hydrological cycle globally over the continents is likely to slow down.
[01:42:25.760 --> 01:42:34.160] So you get drying over the continents and then ozone depletion and other potential unintended consequences.
[01:42:34.160 --> 01:42:36.720] So sort of the principle of unintended.
[01:42:36.880 --> 01:42:39.680] And there's uncertainty, and uncertainty isn't our friend here.
[01:42:39.920 --> 01:42:41.280] This is the largest system.
[01:42:41.280 --> 01:42:42.320] We are in this system.
[01:42:42.320 --> 01:42:43.840] It's a stochastic system.
[01:42:44.080 --> 01:42:45.760] Everything we do changes it.
[01:42:45.760 --> 01:42:49.200] And we got into this mess because we geoengineer.
[01:42:49.200 --> 01:42:50.800] I think that's the thing we have to remember.
[01:42:51.120 --> 01:42:54.000] The extractive practices, the utilization, the burning.
[01:42:54.000 --> 01:42:55.440] This is geoengineering.
[01:42:55.440 --> 01:43:01.360] And we think we're going to geoengineer our way out when what we could do is stop geoengineering so much.
[01:43:01.600 --> 01:43:04.480] Well, this actually came up at the VIP luncheon.
[01:43:04.800 --> 01:43:05.840] Were any of you at that?
[01:43:06.400 --> 01:43:06.560] Okay.
[01:43:07.280 --> 01:43:08.080] We're not VIP.
[01:43:08.800 --> 01:43:10.800] I had an extra ticket I would have given somebody.
[01:43:11.440 --> 01:43:12.240] We were interviewing people.
[01:43:12.320 --> 01:43:12.720] We were interviewing.
[01:43:12.880 --> 01:43:13.360] Yeah, no, exactly.
[01:43:13.440 --> 01:43:13.840] Got it.
[01:43:13.840 --> 01:43:14.240] Yeah.
[01:43:15.040 --> 01:43:17.360] We had a good sort of panel discussion.
[01:43:17.840 --> 01:43:24.000] And one of the things that came up was like we were asked a question about time travel.
[01:43:24.000 --> 01:43:24.720] I was going to say that.
[01:43:24.880 --> 01:43:25.920] I was just going to say that.
[01:43:25.920 --> 01:43:28.480] And Neil deGrasse Tyson had an answer.
[01:43:28.480 --> 01:43:31.640] Brian Cox had an answer.
[01:43:32.200 --> 01:43:35.000] And then the question was posed to the physicists.
[01:43:28.960 --> 01:43:35.880] I said, oh, wait a second.
[01:43:35.960 --> 01:43:37.400] That was all but dissertation physics.
[01:43:37.400 --> 01:43:40.520] So let me give an answer, like a story about time travel.
[01:43:41.000 --> 01:43:45.640] And the story that I used is like when people asked me, is there any techno fix to climate that would work?
[01:43:45.640 --> 01:43:46.840] I said, yeah, time machine.
[01:43:47.400 --> 01:43:47.960] Right.
[01:43:48.440 --> 01:43:51.880] That's the only techno fix that would be going to work.
[01:43:52.040 --> 01:43:52.760] It'll fully work.
[01:43:53.160 --> 01:44:06.840] I'm afraid, though, that 30, say 30 years when it's bad, really bad, because I'm not very, as you know, I'm not very optimistic that we're going to get our shit together enough to make this not even more disastrous.
[01:44:06.920 --> 01:44:11.240] 30 or 40 years, I could see some countries getting together and say, we've got to do something dramatic now.
[01:44:11.240 --> 01:44:12.680] Let's try this geoengineering.
[01:44:12.680 --> 01:44:13.320] Hell yeah.
[01:44:13.640 --> 01:44:14.520] You raise a good point.
[01:44:14.920 --> 01:44:15.560] That's scary.
[01:44:16.040 --> 01:44:17.400] We don't have a global government.
[01:44:17.800 --> 01:44:17.960] Yeah.
[01:44:18.200 --> 01:44:18.520] Exactly.
[01:44:18.760 --> 01:44:19.560] That's a problem.
[01:44:19.880 --> 01:44:24.040] And so you have people now looking at global governance issues.
[01:44:25.240 --> 01:44:26.440] And I support that.
[01:44:26.440 --> 01:44:37.080] Like, even if you don't support geoengineering, we should be thinking about what we do in the event that a rogue country, like China, decides it's favorable for its interests to.
[01:44:37.560 --> 01:44:57.640] And part of the problem here is one place cools at the expense of another place warming more, and so it becomes this very fraught sort of game theory problem if you have individual actors acting in a way to try to optimize their own climate in a way that may be destructive to the global climate.
[01:44:57.960 --> 01:45:01.800] Yeah, I don't want to be the sunny optimist Pollyannish about it.
[01:45:01.960 --> 01:45:06.280] I don't think it's too late to solve the climate crisis for the reasons that we've talked about.
[01:45:06.280 --> 01:45:09.480] But I think we have a monumental battle on our hands.
[01:45:09.480 --> 01:45:20.160] And the larger battle here is the battle to preserve democracy, to preserve fact-based discourse, and the notion of objective truth.
[01:45:22.640 --> 01:45:24.160] Which you guys are already doing.
[01:45:24.400 --> 01:45:27.200] You're saying the solution is to become a patron of the SGU.
[01:45:27.760 --> 01:45:28.640] That's what I'm hearing.
[01:45:29.440 --> 01:45:30.480] That's what I heard.
[01:45:30.480 --> 01:45:30.880] It is.
[01:45:32.640 --> 01:45:34.480] And I'll be looking for the check in the mail next week.
[01:45:34.720 --> 01:45:35.200] Thank you very much.
[01:45:37.120 --> 01:45:38.400] We pay in bubblegum now.
[01:45:40.240 --> 01:45:42.320] Well, Michael, thank you so much for sitting down with us.
[01:45:42.320 --> 01:45:43.040] This was great.
[01:45:43.040 --> 01:45:43.840] It was a pleasure, guys.
[01:45:43.840 --> 01:45:44.080] Thanks.
[01:45:44.080 --> 01:45:44.320] Thank you.
[01:45:44.480 --> 01:45:44.800] Thank you.
[01:45:44.800 --> 01:45:45.840] Thank you.
[01:45:48.400 --> 01:45:53.680] It's time for science or fiction.
[01:45:57.840 --> 01:46:03.040] Each week, I come up with three science news items or facts: two real, one fake.
[01:46:03.040 --> 01:46:07.040] And then I challenge my panel of skeptics to tell me which one is the fake.
[01:46:07.040 --> 01:46:08.560] Three regular news items.
[01:46:08.560 --> 01:46:09.760] You guys ready this week?
[01:46:09.760 --> 01:46:10.400] Let's do it.
[01:46:10.400 --> 01:46:11.520] All right, here we go.
[01:46:11.520 --> 01:46:23.280] Item number one: analysis of a Martian meteorite indicates the presence of bodies of liquid water on the surface of Mars as late as 742 million years ago.
[01:46:23.280 --> 01:46:35.280] Item number two: a recent study finds that doctors using ChatGPT Plus as an additional aid did no better at making diagnoses than those using only traditional methods.
[01:46:35.280 --> 01:46:44.080] And item number three: in an animal study, researchers have demonstrated the ability to deliver functional mRNA through inhalation.
[01:46:44.400 --> 01:46:45.920] Kara, go first.
[01:46:45.920 --> 01:46:59.920] Okay, so scientists analyzed a Martian meteorite and found the presence or found something indicative of the presence of bodies of liquid water on the surface of Mars as late, does that say 700?
[01:47:00.920 --> 01:47:01.480] Yep.
[01:47:01.640 --> 01:47:02.440] Million years ago.
[01:47:02.440 --> 01:47:06.600] Yeah, I have no idea when the liquid dried up, but I do think there was liquid.
[01:47:06.760 --> 01:47:13.000] A recent study finds that doctors using Chat GPT plus, I don't really know, is that just like the most recent one?
[01:47:13.000 --> 01:47:13.560] What's the difference?
[01:47:14.280 --> 01:47:14.760] Okay.
[01:47:14.760 --> 01:47:20.360] As an additional aid, did know better at making diagnoses than those using only traditional methods.
[01:47:20.360 --> 01:47:21.640] Well, there's so much here.
[01:47:21.640 --> 01:47:23.080] Like, what were they diagnosing?
[01:47:23.080 --> 01:47:24.440] Like a strep throat?
[01:47:24.440 --> 01:47:27.800] Or were these complex, complicated things?
[01:47:27.800 --> 01:47:35.800] And then item number three, in an animal study, researchers have demonstrated the ability to deliver functional mRNA through inhalation.
[01:47:35.800 --> 01:47:38.600] Meaning that it gets into the system and it functions.
[01:47:38.600 --> 01:47:39.800] It does what it's supposed to do.
[01:47:39.800 --> 01:47:41.880] It takes up shop, makes proteins.
[01:47:42.280 --> 01:47:45.240] So I could see them doing that with some sort of viral vector.
[01:47:45.240 --> 01:47:48.520] You inhale the virus and then it inserts itself.
[01:47:48.520 --> 01:47:48.920] I don't know.
[01:47:48.920 --> 01:47:50.600] That one doesn't bother me that much.
[01:47:50.600 --> 01:47:55.160] I have no idea the age of the water, though, so that's problematic.
[01:47:55.160 --> 01:47:58.840] And then the ChatGPT Plus, no better at making diet.
[01:47:59.000 --> 01:48:00.920] There's just too many unknowns in this one.
[01:48:00.920 --> 01:48:05.560] I believe that would be the case if they were diagnosing strep throat.
[01:48:05.560 --> 01:48:17.080] I do not believe that would be the case if they were diagnosing some sort of kind of infectious disease that required understanding of tropical medicine or differentials that weren't.
[01:48:17.320 --> 01:48:21.480] I mean, I'll say they were challenging diagnoses, otherwise it would have been a pointless study.
[01:48:22.040 --> 01:48:22.840] Yeah, I agree.
[01:48:22.840 --> 01:48:27.960] So I think in that case, then the Chat GPT plus one is the fiction.
[01:48:27.960 --> 01:48:31.960] I think that probably it did help to be able to ask.
[01:48:32.280 --> 01:48:38.040] Wait, real quick, though, when you say traditional methods, does that mean they were still able to look stuff up in books?
[01:48:38.040 --> 01:48:38.680] Yeah.
[01:48:38.680 --> 01:48:47.120] Oh, yeah, so they were still following a standard differential model and they still had access to all the same information.
[01:48:48.160 --> 01:48:49.520] Okay, then I'll go with Mars.
[01:48:49.520 --> 01:48:50.800] Maybe the date's wrong.
[01:48:44.760 --> 01:48:51.200] I don't know.
[01:48:52.960 --> 01:48:54.240] Yeah, I don't know.
[01:48:54.240 --> 01:48:55.200] Okay, Bob.
[01:48:55.520 --> 01:48:56.960] Yeah, the Martian meteorite.
[01:48:56.960 --> 01:48:58.080] Yeah, that kind of makes sense.
[01:48:58.080 --> 01:49:00.640] I don't ever see any problems with that.
[01:49:02.320 --> 01:49:08.000] Let's see, go to three, the inhaling mRNA.
[01:49:08.000 --> 01:49:09.520] Yeah, I agree with Kara.
[01:49:09.920 --> 01:49:12.720] If there's a viral vector, that makes a lot of sense.
[01:49:12.720 --> 01:49:14.800] But, I mean, viral vectors are scary.
[01:49:14.800 --> 01:49:16.240] Would they use that route?
[01:49:16.240 --> 01:49:21.760] And then if they had a viral vector, we wouldn't necessarily even need to use it for inhalation.
[01:49:21.760 --> 01:49:23.360] But it kind of makes sense.
[01:49:23.360 --> 01:49:34.160] But ChatGPT, yeah, I mean, I think there might be, I don't know, maybe, unless you're using ChatGPT with a specific medical database.
[01:49:34.160 --> 01:49:45.040] If you're just using, I'm not aware of the ins and outs of ChatGPT Plus, but I think there might be a lot of medical misinformation mixed in there.
[01:49:45.040 --> 01:49:48.480] So generic ChatGPT might not be a great source for that kind of stuff.
[01:49:48.480 --> 01:49:52.400] But it wouldn't surprise me either way, but I'd say ChatGPT is fiction.
[01:49:52.400 --> 01:49:53.520] Okay, Evan?
[01:49:53.520 --> 01:50:02.320] Yeah, so the Mars meteorite, as late as 742 million years ago, with indications of liquid water.
[01:50:02.640 --> 01:50:07.760] We know there has been liquid water on the surface of Mars, but when?
[01:50:07.760 --> 01:50:14.320] And a meteorite revealing that seems to, I don't really see anything wrong with this one.
[01:50:14.560 --> 01:50:18.960] Nothing is saying, hey, Red Alert, this doesn't jive.
[01:50:19.120 --> 01:50:22.640] I think that's in sync with what's understood about Mars.
[01:50:22.640 --> 01:50:24.400] But that we found it is kind of cool.
[01:50:24.400 --> 01:50:35.080] The ChatGPT one, this is the one I think lends itself the most to being the fiction, only because not for any technical reasons, just for a guess reason.
[01:50:35.720 --> 01:50:41.160] It reads here, as an additional aid, did no better at making diagnoses.
[01:50:41.160 --> 01:50:46.200] So that means it would be fiction if it did better or worse.
[01:50:46.200 --> 01:50:49.080] So you kind of have two directions you can go with this one.
[01:50:49.080 --> 01:50:52.760] So I think that opens up the possibility that that one's fiction a little bit more.
[01:50:52.760 --> 01:50:59.320] And the last one, I don't, I know nothing about mRNA through inhalation.
[01:50:59.320 --> 01:51:04.680] And just based on what I heard Kara and Bob say, yea
Prompt 6: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 7: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
Prompt 9: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 3 of 3 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
t kind of makes sense.
[01:49:23.360 --> 01:49:34.160] But ChatGPT, yeah, I mean, I think there might be, I don't know, maybe, unless you're using ChatGPT with a specific medical database.
[01:49:34.160 --> 01:49:45.040] If you're just using, I'm not aware of the ins and outs of ChatGPT Plus, but I think there might be a lot of medical misinformation mixed in there.
[01:49:45.040 --> 01:49:48.480] So generic ChatGPT might not be a great source for that kind of stuff.
[01:49:48.480 --> 01:49:52.400] But it wouldn't surprise me either way, but I'd say ChatGPT is fiction.
[01:49:52.400 --> 01:49:53.520] Okay, Evan?
[01:49:53.520 --> 01:50:02.320] Yeah, so the Mars meteorite, as late as 742 million years ago, with indications of liquid water.
[01:50:02.640 --> 01:50:07.760] We know there has been liquid water on the surface of Mars, but when?
[01:50:07.760 --> 01:50:14.320] And a meteorite revealing that seems to, I don't really see anything wrong with this one.
[01:50:14.560 --> 01:50:18.960] Nothing is saying, hey, Red Alert, this doesn't jive.
[01:50:19.120 --> 01:50:22.640] I think that's in sync with what's understood about Mars.
[01:50:22.640 --> 01:50:24.400] But that we found it is kind of cool.
[01:50:24.400 --> 01:50:35.080] The ChatGPT one, this is the one I think lends itself the most to being the fiction, only because not for any technical reasons, just for a guess reason.
[01:50:35.720 --> 01:50:41.160] It reads here, as an additional aid, did no better at making diagnoses.
[01:50:41.160 --> 01:50:46.200] So that means it would be fiction if it did better or worse.
[01:50:46.200 --> 01:50:49.080] So you kind of have two directions you can go with this one.
[01:50:49.080 --> 01:50:52.760] So I think that opens up the possibility that that one's fiction a little bit more.
[01:50:52.760 --> 01:50:59.320] And the last one, I don't, I know nothing about mRNA through inhalation.
[01:50:59.320 --> 01:51:04.680] And just based on what I heard Kara and Bob say, yeah, I'll say that one's science.
[01:51:04.680 --> 01:51:06.760] So chat GPT, I'll go with Bob.
[01:51:06.760 --> 01:51:07.560] Fiction.
[01:51:07.560 --> 01:51:08.440] And Jay.
[01:51:08.440 --> 01:51:18.600] The Martian one, I mean, I have no reason to think that the 742 million years ago water thing, like, I just, there's no reason to not believe that.
[01:51:18.600 --> 01:51:22.520] Like, there's just no information in here that you can contest other than the date.
[01:51:22.520 --> 01:51:24.200] And it makes sense, you know.
[01:51:24.200 --> 01:51:28.440] I mean, I can't contest it in any way, so I'm just going to assume that one is science.
[01:51:28.600 --> 01:51:34.440] The second one about the ChatGPT being used as a diagnostic tool.
[01:51:34.440 --> 01:51:36.440] Does it actually help physicians or not?
[01:51:36.440 --> 01:51:46.680] You know, if they have access to the latest information, it would be no different than asking ChatGPT if it has the latest information.
[01:51:46.680 --> 01:51:52.360] The thing that I wouldn't want ChatGPT to actually do, though, is give any kind of medical advice.
[01:51:52.360 --> 01:51:55.320] So I just don't think that's a good thing to do anyway.
[01:51:55.560 --> 01:52:04.040] Now, if they specifically programmed ChatGPT with very specific information and instructions, it might help.
[01:52:04.040 --> 01:52:07.480] I mean, I think it would definitely speed things up at the very least.
[01:52:07.480 --> 01:52:09.960] But I just don't think today that it's there.
[01:52:09.960 --> 01:52:11.160] I don't think that that's it.
[01:52:11.160 --> 01:52:19.600] So I'm definitely very suspicious about the breathing in an mRNA and have it function.
[01:52:20.400 --> 01:52:23.520] You know, I know that breathing certain things in is a vector, right?
[01:52:23.520 --> 01:52:24.880] It does work for certain things.
[01:52:24.880 --> 01:52:27.040] I just don't think it would work for this.
[01:52:27.200 --> 01:52:28.880] So I'm going to say that one's the fiction.
[01:52:28.880 --> 01:52:30.320] Okay, so we're spread out.
[01:52:30.640 --> 01:52:31.760] I hate one first.
[01:52:31.920 --> 01:52:33.520] I disagree with my answer.
[01:52:33.520 --> 01:52:34.640] You want to change the answer?
[01:52:34.720 --> 01:52:35.120] You want to change it?
[01:52:35.200 --> 01:52:36.320] I do, but I can't.
[01:52:37.920 --> 01:52:38.400] We don't do that.
[01:52:38.800 --> 01:52:41.920] Whoa, you never let people do that.
[01:52:41.920 --> 01:52:45.680] I will stick with my answer, but I think the chat GPT one is the fiction.
[01:52:45.680 --> 01:52:46.160] Okay.
[01:52:46.160 --> 01:52:47.440] Well, let's start with number three.
[01:52:48.240 --> 01:52:49.760] We'll go back.
[01:52:49.840 --> 01:52:50.640] Animals study.
[01:52:52.320 --> 01:52:53.360] Liver functional mRNA.
[01:52:53.440 --> 01:52:53.840] Can't say that.
[01:52:53.920 --> 01:52:54.640] Do inhalation.
[01:52:54.640 --> 01:52:56.160] Jay, you think this one is the fiction.
[01:52:56.160 --> 01:52:57.360] Everyone else thinks this one is science.
[01:52:57.360 --> 01:52:58.000] Not anymore, you know.
[01:52:58.240 --> 01:52:59.840] And this one is science.
[01:52:59.840 --> 01:53:01.280] Yeah, sorry, Jay.
[01:53:01.920 --> 01:53:02.880] What do you guys think?
[01:53:02.880 --> 01:53:03.840] It's not a virus.
[01:53:03.840 --> 01:53:08.960] And so if it's not a virus, what are they using as the carrier, as the carrier for the mRNA?
[01:53:08.960 --> 01:53:10.880] Because it can't just make it mRNA.
[01:53:10.960 --> 01:53:11.360] Lipids.
[01:53:11.360 --> 01:53:11.920] CRISPR?
[01:53:11.920 --> 01:53:13.520] Lipids, lipid what?
[01:53:13.840 --> 01:53:14.960] Oh, nanoparticles.
[01:53:15.040 --> 01:53:15.840] Nanosomes.
[01:53:15.840 --> 01:53:16.800] Nanoparticles.
[01:53:16.800 --> 01:53:18.000] Lipid nanoparticles.
[01:53:18.320 --> 01:53:19.680] Didn't you just do a lipid nanoparticle?
[01:53:19.760 --> 01:53:20.080] Yes, I did.
[01:53:20.960 --> 01:53:23.200] Yeah, lipid nanoparticles are magic, man.
[01:53:23.200 --> 01:53:24.160] No, they are great.
[01:53:24.160 --> 01:53:25.440] It's a great technology.
[01:53:25.440 --> 01:53:29.600] So they used lipid nanoparticles to house the mRNA.
[01:53:29.600 --> 01:53:33.840] So basically, it becomes aerosolized or nebulized, and you breathe it in.
[01:53:33.840 --> 01:53:35.120] It gets into the lungs.
[01:53:35.120 --> 01:53:36.000] It sets up shop.
[01:53:36.000 --> 01:53:40.400] And they tested it in mice, and it was just producing a marker.
[01:53:40.800 --> 01:53:46.320] And it was sustained production of the protein that it coded for.
[01:53:46.320 --> 01:53:48.400] I think it made it light up or something.
[01:53:48.400 --> 01:53:58.320] Now, this was also fixing an earlier problem because they had tried to use lipid nanoparticles previously for this aerosolized or nebulized delivery of mRNA.
[01:53:58.320 --> 01:54:03.400] The problem was that the nanoparticles clumped together and they got too big.
[01:53:59.920 --> 01:54:05.080] And then that would provoke an immune response.
[01:54:05.560 --> 01:54:10.680] So, which remember, that was the problem with the other lipid nanoparticle news item that I talked about.
[01:54:10.680 --> 01:54:21.240] So, this study was figuring out how to keep the lipid nanoparticles from clumping so that they could remain individual and they could deliver their mRNA to the lung cells.
[01:54:21.240 --> 01:54:22.200] And it worked.
[01:54:22.200 --> 01:54:22.920] Yay!
[01:54:22.920 --> 01:54:28.600] So, this could be a way, you know, in the future, we may like be inhaling our mRNA vaccines.
[01:54:28.600 --> 01:54:29.800] No more shots, right?
[01:54:29.800 --> 01:54:32.920] Or inhaling mRNA drug deliveries, you know.
[01:54:32.920 --> 01:54:34.120] So, this is a cool.
[01:54:34.280 --> 01:54:36.840] We're already inhaling, what, flu vaccines?
[01:54:37.160 --> 01:54:38.680] Yeah, yeah, yeah, cool, yeah.
[01:54:38.680 --> 01:54:39.480] Did that come back?
[01:54:39.480 --> 01:54:41.240] I thought they stopped doing that.
[01:54:41.240 --> 01:54:45.240] Oh, I don't know, but I mean, it has it has been we have the technology to do that, yeah, yeah.
[01:54:45.320 --> 01:54:47.960] If it's a live virus, only for live viruses, though.
[01:54:48.120 --> 01:55:02.440] Oh, I see, yeah, and that's why they didn't want to necessarily, it was because it's a live virus, not because if it was inhaled, um, so yeah, this is an animal study, so it's got to go through, they're going to, there was in mice, they got to do bigger animals, and then people, and then, so, you know, five to ten years, whatever.
[01:55:03.000 --> 01:55:04.120] I guess we'll keep going back.
[01:55:04.120 --> 01:55:12.680] A recent study finds that doctors using ChatGPT Plus as an additional aid did no better at making diagnoses than those using only traditional methods.
[01:55:12.680 --> 01:55:15.560] Bob and Evan think this one is the fiction.
[01:55:15.560 --> 01:55:18.600] Kara thinks it's the fiction, although she didn't choose it officially.
[01:55:18.600 --> 01:55:23.960] Jay thinks this one is science, and this one is science.
[01:55:23.960 --> 01:55:24.440] What?
[01:55:25.480 --> 01:55:26.280] Oh, no!
[01:55:26.520 --> 01:55:27.720] Oh, shit.
[01:55:29.560 --> 01:55:30.280] What the hell?
[01:55:30.280 --> 01:55:32.840] Kara, man, you managed to pull it out.
[01:55:32.840 --> 01:55:33.480] Oh, my God.
[01:55:33.800 --> 01:55:35.400] Wow, awesome.
[01:55:35.400 --> 01:55:36.480] Like, talking.
[01:55:36.800 --> 01:55:38.680] Being right despite being wrong.
[01:55:38.680 --> 01:55:40.280] That is amazing.
[01:55:40.280 --> 01:55:40.640] Yeah.
[01:55:40.440 --> 01:55:41.960] So, so yeah, they did no better.
[01:55:41.960 --> 01:55:53.280] Now, again, again, this is an additional aid, meaning that the doctors in the group using ChatGPT had access to everything the other group had, plus ChatGPT.
[01:55:53.920 --> 01:55:58.000] And it did not help them make any more accurate diagnoses.
[01:55:58.000 --> 01:56:02.240] But here's the interesting thing I didn't include in the science or fiction.
[01:56:02.240 --> 01:56:11.200] When they compared those two groups to just ChatGPT by itself, ChatGPT outperformed both groups of physicians.
[01:56:11.200 --> 01:56:12.080] You should have included that.
[01:56:12.560 --> 01:56:13.360] Fascinating.
[01:56:13.360 --> 01:56:22.560] Even the group of physicians using ChatGPT, which means that the introduction of a physician resulted in less accurate diagnoses.
[01:56:22.560 --> 01:56:23.920] Yep, goodbye, doctors.
[01:56:24.480 --> 01:56:26.320] So much for all those boards and stuff.
[01:56:26.640 --> 01:56:28.400] That is bananas.
[01:56:28.400 --> 01:56:29.200] So, I know, it's crazy.
[01:56:29.360 --> 01:56:36.800] So, they were asking it more specific questions by then, as opposed to just blanket saying, what does this person have based on these symptoms?
[01:56:36.800 --> 01:56:42.960] Well, what the authors concluded, they did not conclude that we should therefore be having ChatGPT make diagnoses.
[01:56:42.960 --> 01:56:53.200] What they said was, clearly, we need to teach physicians how to optimally use these large language models as a diagnostic assistant because they're not using it right.
[01:56:53.200 --> 01:56:57.200] Whatever, they were just, they didn't help them get to a more accurate diagnosis.
[01:56:57.200 --> 01:57:04.560] So, clearly, they were not leveraging that technology, even though it is capable of making a more accurate diagnosis by itself.
[01:57:04.560 --> 01:57:07.520] Which is, I don't know, it's incredible.
[01:57:07.520 --> 01:57:16.800] Which means that analysis of a Martian meteorite indicates the presence of bodies of liquid water on the surface of Mars as late as 742 million years ago is the fiction.
[01:57:17.120 --> 01:57:18.120] So, here I'm looking at, here.
[01:57:18.080 --> 01:57:24.240] I'm looking at a goddamn article that says, Meteorite contains evidence of liquid water on Mars 742 million years ago.
[01:57:24.240 --> 01:57:24.640] I know.
[01:57:25.040 --> 01:57:25.360] I know.
[01:57:25.360 --> 01:57:26.240] That's exactly that.
[01:57:26.240 --> 01:57:27.600] Is the title of the article?
[01:57:27.920 --> 01:57:29.760] Cool, then we were right.
[01:57:29.880 --> 01:57:52.040] And I was hoping people who read that headline but didn't read into the article itself, especially the actual article, would not have picked up on the fact that they're not saying that there was liquid water on Mars because liquid water on Mars dried up three billion years ago.
[01:57:52.040 --> 01:58:17.480] And what they're saying was that there was probably still some geologic activity, like some volcanic activity, which there still is in pockets on Mars, that that activity melted the permafrost, causing a pocket of liquid water, and that that water affected the crystallization of the minerals in the meteorite that ultimately became a meteorite on Earth, you know, from Mars.
[01:58:18.040 --> 01:58:18.760] Interesting.
[01:58:18.760 --> 01:58:21.080] Yeah, so that was that was the deception.
[01:58:21.080 --> 01:58:29.880] Um, but yeah, if you just read the headline, you might be um it was water, deep water interacting with rock, not water on the surface of Mars.
[01:58:29.880 --> 01:58:31.400] Evan, give us a quote.
[01:58:32.120 --> 01:58:33.400] Here it is.
[01:58:33.400 --> 01:58:35.720] Steve, you uh requested this quote.
[01:58:35.720 --> 01:58:37.160] Yes, I like this quote.
[01:58:37.480 --> 01:59:05.520] I have a foreboding of an America in my children's or grandchildren's time when the United States is a service and information economy, when nearly all the key manufacturing industries have slipped away to other countries, when awesome technological powers are in the hands of a very few, and no one representing the public interest can even grasp the issues, when the people have lost the ability to set their own agendas or knowledgeably question those in authority.
[01:59:05.520 --> 01:59:21.440] When clutching our crystals and nervously consulting our horoscopes, our critical faculties in decline, unable to distinguish between what feels good and what's true, we slide almost without noticing back into superstition and darkness.
[01:59:21.760 --> 01:59:44.800] The dumbing down of America is most evident in the slow decay of substantive content in the enormously influential media, the 30-second soundbites, now down to 10 seconds or less, lowest common denominator programming, credulous presentations on pseudoscience and superstition, but especially a kind of celebration of ignorance.
[01:59:44.800 --> 01:59:58.080] And we should all know that, as Carl Sagan from his book, The Demon Haunted World, if you do not have a copy, you must, must go get a copy, along with a copy of our book, The Skeptic's Guide to the University.
[01:59:58.080 --> 02:00:02.160] Written in the 90s and, you know, basically very prescient.
[02:00:02.160 --> 02:00:03.280] You know, my gosh.
[02:00:03.280 --> 02:00:05.600] In terms of, you basically nailed it.
[02:00:05.600 --> 02:00:09.280] In terms of the case, I may have shared that on my socials the day after the election.
[02:00:09.440 --> 02:00:09.680] I know.
[02:00:09.680 --> 02:00:10.960] This was very a lot of people were sharing.
[02:00:10.960 --> 02:00:14.880] It's like, yeah, you got to read this because this is unfortunately true.
[02:00:14.880 --> 02:00:31.840] The idea that this is my giant frustration with this year is that the media does not meaningfully inform the electorate that people do not understand the issues well enough to make an informed decision.
[02:00:31.840 --> 02:00:43.120] And basically, the majority of people are susceptible to propaganda and manipulation and this kind of lowest common denominator kind of arguments, and it's very disappointing.
[02:00:43.440 --> 02:00:44.240] Hugely.
[02:00:44.800 --> 02:00:49.040] We're going to have some choice things to say about it at our year-end review show in a few days.
[02:00:49.200 --> 02:00:50.000] Oh, yeah.
[02:00:50.640 --> 02:00:57.840] We have a lot of work to do we the the information ecosystem of America is broken, in my opinion.
[02:00:57.840 --> 02:00:59.600] And we've been saying this for years.
[02:00:59.600 --> 02:02:18.680] This is nothing new you know this is it's just getting worse you know with social media and everything uh the loss of good science journalism the loss of good journalism period the you know rise of essentially propaganda media it's just you know we've lost our ability to make informed decisions as voters in this country at least in sufficient numbers that reality holds sway you know the the influence of reality and facts on people's decisions is too small you know to be definitive like it should be unfortunate okay that's the world we're living in thank you all for joining me this week you're welcome all right steve and until next week this is your skeptic's guide to the universe skeptics guide to the universe is produced by sgu productions dedicated to promoting science and critical thinking for more information visit us at the skepticsguide.org send your questions to info at the skepticsguide.org and if you would like to support the show and all the work that we do go to patreon.com slash skepticsguide and consider becoming a patron and becoming part of the sgu community our listeners and supporters are what make sgu possible
Prompt 10: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 11: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
Full Transcript
[00:00:01.040 --> 00:00:06.640] 16 years from today, Greg Gerstner will finally land the perfect cannonball.
[00:00:06.960 --> 00:00:21.920] Epic Splash, Unsuspecting Friends, a work of art only possible because Greg is already meeting all these same people at AARP volunteer and community events that keep him active and involved and help make sure his happiness lives as long as he does.
[00:00:21.920 --> 00:00:25.840] That's why the younger you are, the more you need AARP.
[00:00:25.840 --> 00:00:29.680] Learn more at AARP.org/slash local.
[00:00:33.200 --> 00:00:36.480] You're listening to the Skeptic's Guide to the Universe.
[00:00:36.480 --> 00:00:39.360] Your escape to reality.
[00:00:40.000 --> 00:00:42.800] Hello and welcome to the Skeptic's Guide to the Universe.
[00:00:42.800 --> 00:00:47.680] Today is Wednesday, November 13th, 2024, and this is your host, Stephen Novella.
[00:00:47.680 --> 00:00:49.600] Joining me this week are Bob Novella.
[00:00:49.600 --> 00:00:50.240] Hey, everybody.
[00:00:50.240 --> 00:00:51.520] Kara Santa Maria.
[00:00:51.520 --> 00:00:51.920] Howdy.
[00:00:52.000 --> 00:00:52.960] Jay Novella.
[00:00:52.960 --> 00:00:53.760] Hey, guys.
[00:00:53.760 --> 00:00:55.760] And Evan Bernstein.
[00:00:55.760 --> 00:00:57.040] Good evening, everyone.
[00:00:57.040 --> 00:00:58.640] The year is going fast.
[00:00:58.640 --> 00:01:01.120] It's almost the end of 2024.
[00:01:01.600 --> 00:01:05.120] My gosh, we will be recording our year-end episode before you know it.
[00:01:05.120 --> 00:01:05.760] That's right.
[00:01:06.160 --> 00:01:15.680] This is always a good time of the year to remind people that we have published two books: Skeptic's Guide to the Universe and The Skeptic's Guide to the Future.
[00:01:16.000 --> 00:01:20.480] So if you have joined us recently and they're not aware of this fact, check us out.
[00:01:20.960 --> 00:01:24.640] You could get to those books through our website or through Amazon.
[00:01:24.640 --> 00:01:26.880] They make fantastic holiday gifts.
[00:01:26.880 --> 00:01:31.120] You can spread the gift of skepticism to everyone in your life.
[00:01:31.120 --> 00:01:33.440] So check them out if you haven't read them already.
[00:01:33.440 --> 00:01:37.280] And some people have asked us, are they in digital form?
[00:01:37.280 --> 00:01:37.680] Yes.
[00:01:37.680 --> 00:01:39.040] Are they in audio form?
[00:01:39.040 --> 00:01:39.440] Yes.
[00:01:39.440 --> 00:01:41.120] You can get they are read.
[00:01:41.120 --> 00:01:42.800] Both are read by me.
[00:01:43.120 --> 00:01:47.280] But you can also, if you'd like to read on Kindle, you can get them in digital form as well.
[00:01:47.280 --> 00:01:48.240] So, check that out.
[00:01:48.240 --> 00:01:52.080] We're working on a version of Darth Vader reading it, so I'll let you know when that comes out.
[00:01:52.640 --> 00:01:58.960] Oh, have you guys seen any of the latest UAP congressional testimony today?
[00:01:58.960 --> 00:01:59.360] I have not.
[00:01:59.600 --> 00:02:00.120] I heard about it.
[00:01:59.760 --> 00:02:02.200] I didn't get a chance to watch it.
[00:02:02.200 --> 00:02:03.320] Yeah, Jay and I watched it.
[00:01:59.920 --> 00:02:05.640] I think we talked about it on the live stream today.
[00:02:05.960 --> 00:02:08.600] It's as worthless as it's been.
[00:02:08.840 --> 00:02:16.360] Essentially, the UAP is the new name for UFOs, unidentified anomalous phenomena.
[00:02:16.360 --> 00:02:16.920] Okay.
[00:02:16.920 --> 00:02:21.800] I guess it's technically more accurate, more inclusive.
[00:02:21.800 --> 00:02:22.920] But it's still the same thing.
[00:02:22.920 --> 00:02:27.800] It's the same people talking about the same crappy evidence for the last 50 years.
[00:02:27.800 --> 00:02:29.320] They had nothing new.
[00:02:29.320 --> 00:02:31.400] There's just simply nothing going on.
[00:02:31.880 --> 00:02:39.080] One of the bits that we saw today, the UFO proponent was saying, there was this, you know, it's the residue argument, what I call the residue arguments.
[00:02:39.160 --> 00:02:47.400] Like, yes, you could explain 98% of whatever of these sightings of UAP reports, but there's that subset that you can't explain.
[00:02:47.400 --> 00:02:51.880] As if, like, that means there must be something interesting going on.
[00:02:51.880 --> 00:02:54.520] But that's going to be true of anything.
[00:02:54.520 --> 00:03:03.240] Anytime you have hundreds or thousands of whatever, you're not going to have enough information to necessarily definitively explain everything.
[00:03:03.240 --> 00:03:05.960] There's always going to be some unusual cases.
[00:03:05.960 --> 00:03:06.760] And then for people.
[00:03:07.000 --> 00:03:08.440] Without good evidence about that.
[00:03:08.520 --> 00:03:09.320] Yeah, exactly.
[00:03:09.320 --> 00:03:17.640] So the people who have reviewed this said that the reason you can't explain that one or two percent is not because they're fantastical.
[00:03:17.640 --> 00:03:19.640] It's because the evidence is crappy.
[00:03:19.640 --> 00:03:33.560] It says we don't have enough evidence to know what they were, which fits something else that we say all the time, which is like with UFOs, just like with Bigfoot and with everything else, the ambiguity, right, the fuzziness is the phenomenon.
[00:03:33.560 --> 00:03:35.000] That is the phenomenon.
[00:03:35.320 --> 00:03:43.960] It is just the residue of low-quality evidence that or cases that are going to be there for any phenomenon.
[00:03:43.960 --> 00:03:46.400] Yeah, this is going to apply to like every field of every.
[00:03:46.400 --> 00:03:47.600] I think about medicine, right?
[00:03:44.840 --> 00:03:51.920] Like there's a drug, and you take the drug and you get better, but that percentage of people didn't get better.
[00:03:52.240 --> 00:03:54.720] And you look at the autopsy and you can't figure out why.
[00:03:54.720 --> 00:03:57.760] You're not going to assume they didn't get better because magic.
[00:03:57.760 --> 00:03:58.320] Right, exactly.
[00:03:58.720 --> 00:04:03.360] You're just going to assume you couldn't figure it out, but something biological was happening.
[00:04:04.000 --> 00:04:04.480] That's so good.
[00:04:04.640 --> 00:04:05.920] I believe we call this logical fallacy.
[00:04:06.160 --> 00:04:06.960] Good analogy.
[00:04:07.040 --> 00:04:08.720] God of the gaps fallacy.
[00:04:08.720 --> 00:04:09.680] Yes, exactly.
[00:04:09.680 --> 00:04:10.160] Yeah.
[00:04:10.160 --> 00:04:12.160] I can't explain it, therefore, magic.
[00:04:12.160 --> 00:04:14.800] Right, therefore, something fantastic.
[00:04:14.800 --> 00:04:15.280] Yeah.
[00:04:15.280 --> 00:04:16.320] Fill in the blank.
[00:04:16.320 --> 00:04:20.320] And that is where, but that is where all paranormal and all pseudoscience lives.
[00:04:20.320 --> 00:04:21.840] It lives in that gap.
[00:04:22.160 --> 00:04:23.360] Yeah, so nothing new.
[00:04:23.360 --> 00:04:24.560] Same old crap.
[00:04:24.560 --> 00:04:25.280] More of the same.
[00:04:25.280 --> 00:04:26.080] Thank you, Congress.
[00:04:26.080 --> 00:04:27.120] Waste our time, waste our money.
[00:04:27.680 --> 00:04:28.080] Once again.
[00:04:28.080 --> 00:04:35.120] All right, Evan, you're going to get us started with a quickie about advertising magic in Kyrgyzstan.
[00:04:35.840 --> 00:04:36.640] Kyrgyzstan.
[00:04:36.640 --> 00:04:37.440] Kyrgyzstan.
[00:04:37.760 --> 00:04:39.440] Good news, everyone.
[00:04:39.440 --> 00:04:41.520] We're all moving to Kyrgyzstan.
[00:04:42.640 --> 00:04:55.040] And I think I'm going to call these segments going forward good news, everyone, because, you know, occasionally there is some good news in the world of skepticism and rational thought to talk about, and this could be one of them.
[00:04:55.040 --> 00:05:12.080] 2024, yes, Kyrgyzstan enacted earlier this year legislation prohibiting the advertising of services related to the occult and mysticism, which includes things like fortune-telling, shamanism, divination, spiritualism, clairvoyance, and magic.
[00:05:12.080 --> 00:05:13.360] What were we just talking about?
[00:05:13.360 --> 00:05:14.000] Yep.
[00:05:14.000 --> 00:05:22.640] And this band extends across various media platforms, including the internet, outdoor advertising, radio, and television.
[00:05:22.640 --> 00:05:25.680] Whoa, does that mean they can't advertise church?
[00:05:26.000 --> 00:05:27.840] Well, that's interesting.
[00:05:28.160 --> 00:05:30.760] I was going to bring up that more towards the end of the day.
[00:05:31.080 --> 00:05:31.480] Okay, sorry.
[00:05:29.520 --> 00:05:32.760] But no, that's okay.
[00:05:29.840 --> 00:05:34.120] Since you brought it up, I will mention it.
[00:05:34.360 --> 00:05:47.240] So, this is a primarily Muslim country, and the banning of these kinds of practices is very much in accord with Muslim practices.
[00:05:47.240 --> 00:05:53.720] You're not supposed to be doing this thing because it is in violation of the tenets of the religion.
[00:05:53.720 --> 00:06:07.000] So, it's as much a protective measure, probably, for the people of the country, which it is, as it is a, you know, further sort of solidification of the religious culture that permeates.
[00:06:07.000 --> 00:06:09.160] So, it's kind of a combination of both.
[00:06:09.160 --> 00:06:09.880] But you know what?
[00:06:10.040 --> 00:06:14.280] I'll take it because I think other countries should also be doing this stuff.
[00:06:14.680 --> 00:06:16.600] Do we know where Kyrgyzstan is?
[00:06:16.760 --> 00:06:17.880] It's one of the stands.
[00:06:17.880 --> 00:06:18.440] It's one of the stands.
[00:06:18.520 --> 00:06:20.360] Yes, it is one of the stands.
[00:06:20.360 --> 00:06:22.840] And you've been to which one?
[00:06:22.840 --> 00:06:23.720] Kajakstan.
[00:06:23.720 --> 00:06:24.120] Kazakhstan.
[00:06:24.760 --> 00:06:25.480] Yeah, which I borderled.
[00:06:25.640 --> 00:06:26.600] Which borders China.
[00:06:26.920 --> 00:06:27.560] It does.
[00:06:27.560 --> 00:06:27.960] Yep.
[00:06:27.960 --> 00:06:28.520] Yep.
[00:06:28.520 --> 00:06:34.920] So, yeah, and this Kyrgyzstan does also border Kazakhstan, Ubekistan, Tajikistan, and China.
[00:06:34.920 --> 00:06:38.120] And it's 90% covered by mountains, this country.
[00:06:38.120 --> 00:06:40.680] So about 6 million people live there.
[00:06:40.680 --> 00:06:44.280] And they have a nomadic lifestyle traditionally in rural areas.
[00:06:44.280 --> 00:06:44.760] Yurts.
[00:06:44.760 --> 00:06:45.720] Have you ever heard of yurts?
[00:06:45.720 --> 00:06:46.920] Those are those tents.
[00:06:47.240 --> 00:06:53.720] I actually saw a, I don't know, one of these decoration shows, house decor shows, and they talked about yurts.
[00:06:53.720 --> 00:06:55.320] So it was the whole show was all about yurts.
[00:06:55.480 --> 00:06:56.360] I love a good yurt.
[00:06:56.360 --> 00:06:57.800] It was pretty fascinating.
[00:06:58.120 --> 00:06:58.600] I do.
[00:06:58.600 --> 00:06:59.960] I've stayed in a yurt before.
[00:06:59.960 --> 00:07:00.360] I like it.
[00:07:00.680 --> 00:07:01.320] Wow.
[00:07:01.320 --> 00:07:07.800] And, oh, they are the first country in the world to commit to protecting snow leopards, which is an endangered species found in their mountains.
[00:07:07.800 --> 00:07:09.880] So good on them for that as well.
[00:07:09.880 --> 00:07:15.000] Now, the ban extends to, yes, all the medias, broadcast media, social media.
[00:07:16.000 --> 00:07:28.240] The measure is designed to prevent unscrupulous citizens from taking advantage of vulnerable segments of society, says Marlon Mamotailayev, who's one of the sponsors of the amendments.
[00:07:28.240 --> 00:07:31.760] It aligns with similar actions in neighboring countries.
[00:07:32.080 --> 00:07:37.200] Tajikistan, as I mentioned before, they have criminalized sorcery and fortune-telling as well.
[00:07:37.200 --> 00:07:44.800] They characterize these practices as being perpetrated by fraudsters, and they implement strict measures against these activities.
[00:07:44.800 --> 00:07:54.160] In fact, the fines, if you're fined for this activity, it's usually about like a half a year's worth of your money that you would earn in a year.
[00:07:54.800 --> 00:07:59.120] So that's pretty steep, relatively speaking.
[00:07:59.120 --> 00:08:03.840] Here's one example of the kind of stuff that they've been doing with the fortune tellers, right?
[00:08:03.840 --> 00:08:12.880] So the fortune tellers will purport to offer anything from predicting the future to helping find suitable spouses and also to make your businesses flourish.
[00:08:12.880 --> 00:08:19.360] So there was this woman, her name is Mavzuna, a housewife from the northern city of Khunjand.
[00:08:19.600 --> 00:08:21.440] That's her first name, didn't offer her last name.
[00:08:21.440 --> 00:08:27.760] She paid $30 to help find appropriate husbands for her three daughters, who are aged 24 to 30.
[00:08:27.760 --> 00:08:33.840] The fortune teller conducted separate sessions with these women in order to trigger their luck.
[00:08:33.840 --> 00:08:35.840] Then here's what she said.
[00:08:35.840 --> 00:08:44.480] She gave me an egg to throw into the river, and now she's expecting eligible men to turn up to ask for her daughter's hands in marriage.
[00:08:44.480 --> 00:08:48.000] All kinds of things, apparently, this is this is seems to be a common practice.
[00:08:48.000 --> 00:09:01.240] And they'll take these eggs or whatever trinkets, mostly a lot of like talismans and metal jewelry and stuff, and they'll throw them into water bodies, into rivers, and into ponds and other things as part of this.
[00:09:01.560 --> 00:09:15.480] But it's getting the problem is getting so bad that authorities there are concerned that these metal pieces are lying at the bottom of these ponds, riverbeds, and other places where people go or animals go, and it's injuring them.
[00:09:15.480 --> 00:09:15.880] Right?
[00:09:15.880 --> 00:09:28.040] They said, authority, here's one example: authorities recovered half a cart of locks from the bed of one of the rivers that could have split the heads open of unsuspecting swimmers or people diving into the water.
[00:09:28.040 --> 00:09:34.120] And the locks were actually discovered accidentally by police who were looking for evidence of some other crime that was going on.
[00:09:34.120 --> 00:09:41.320] So, so many of these things that it's clogging, you know, that it's polluting their waterways and their water systems.
[00:09:41.320 --> 00:09:49.480] That's how many of these charms and these fortune tellers are selling these people for total, total, you know, garbage and things that just won't come true.
[00:09:49.480 --> 00:09:52.920] So, that is the good news from Kyrgyzstan.
[00:09:52.920 --> 00:10:02.040] And as I said, other countries, I think, should perhaps look to them as an example and try to help their citizens in any way they can from these fraudsters.
[00:10:02.600 --> 00:10:04.120] So, they're just banning the advertising.
[00:10:04.120 --> 00:10:05.560] They're not banning the practices of the case.
[00:10:05.800 --> 00:10:06.520] Correct.
[00:10:06.520 --> 00:10:07.560] Yes, yes.
[00:10:07.560 --> 00:10:09.080] Although they do, right?
[00:10:09.080 --> 00:10:19.080] Although, if you are caught, although if you are caught and determined to be fraudulent in whatever capacity that these courts decide, you can be in big time trouble.
[00:10:19.080 --> 00:10:22.280] So, it is considered kind of mostly an underground practice.
[00:10:23.000 --> 00:10:28.040] It's, you know, the authorities have their eyes on you if you are doing these things to begin with.
[00:10:28.040 --> 00:10:29.960] So, it's definitely frowned upon.
[00:10:29.960 --> 00:10:38.440] Oh, and by the way, fun fact: there's a Kurzig epic poem called The Epic of Mana of Manas, M-A-N-A-S, which is one of the longest poems in the world.
[00:10:38.440 --> 00:10:43.240] It has over 500,000 lines to this poem.
[00:10:43.800 --> 00:10:45.200] That is an epic poem.
[00:10:45.680 --> 00:10:48.880] It would take up three large-volume books, basically.
[00:10:44.760 --> 00:10:51.840] So imagine, it's kind of like, I envision it like memorize.
[00:10:52.000 --> 00:10:58.000] If you know this poem, it's like memorizing the Lord of the Rings trilogy, basically, as a poem, as one poem.
[00:10:58.000 --> 00:10:59.040] Oh, boy.
[00:10:59.040 --> 00:10:59.360] All right.
[00:10:59.360 --> 00:11:00.240] Thanks, Evan.
[00:11:00.240 --> 00:11:00.720] Yep.
[00:11:00.720 --> 00:11:05.840] Kara, does having an armed police officer in school make kids safer?
[00:11:05.840 --> 00:11:07.040] What do you guys think?
[00:11:07.040 --> 00:11:07.520] No.
[00:11:09.120 --> 00:11:11.920] I think we probably wouldn't be talking about it if the answer were in there.
[00:11:12.160 --> 00:11:13.200] Right, you're right.
[00:11:13.520 --> 00:11:14.880] Bob says no.
[00:11:14.880 --> 00:11:16.160] Evan says, um.
[00:11:17.280 --> 00:11:18.960] I have no information to base that.
[00:11:19.280 --> 00:11:21.680] I have no statistics to base that off.
[00:11:21.680 --> 00:11:23.520] I mean, to base the decision.
[00:11:23.520 --> 00:11:26.400] I would say that schools benefit.
[00:11:26.400 --> 00:11:28.000] I don't know to what degree.
[00:11:28.400 --> 00:11:29.040] Interesting.
[00:11:29.040 --> 00:11:29.440] Okay.
[00:11:29.760 --> 00:11:47.840] So, according to Education Week's school shooting tracker, as of the last update, which was November 11th, 2024, so far there have been 35 school shootings in the United States this year that have resulted in injuries or deaths.
[00:11:47.840 --> 00:11:50.960] 217 total since 2018.
[00:11:50.960 --> 00:11:53.280] There were 38 last year.
[00:11:53.280 --> 00:12:01.360] So this year, again, up to two days ago as of this recording, 35 school shootings with injuries or deaths, 65 people killed or injured.
[00:12:01.520 --> 00:12:06.960] So that's 16 people killed, seven students, nine employees, and 49 people injured.
[00:12:06.960 --> 00:12:09.600] And they keep a map of all the school shootings.
[00:12:09.920 --> 00:12:11.600] Very sad work.
[00:12:11.600 --> 00:12:16.880] So this is, as we know here in the States, a big problem, a scary problem.
[00:12:16.880 --> 00:12:19.040] And a lot of parents are afraid.
[00:12:19.040 --> 00:12:23.520] And a lot of people are clamoring to solve this problem.
[00:12:23.520 --> 00:12:38.200] And there have been policy decisions, some of which are misguided, some of which are based in the evidence, but there up to date really hasn't been a lot of evidence to answer that question.
[00:12:38.200 --> 00:12:43.800] You know, are schools with armed police officers actually safer?
[00:12:44.120 --> 00:12:48.680] Does deploying officers in public schools deter criminal activity?
[00:12:48.680 --> 00:12:58.360] And so I'm going to talk about a few studies today, which were compiled by Rod McMullum in Undark magazine.
[00:12:58.360 --> 00:13:01.480] He wrote kind of a long feature article about this.
[00:13:01.480 --> 00:13:14.200] First, a little bit of statisticales: more than 41,000 schools employ at least one law enforcement officer, or sometimes they're called school resource officers.
[00:13:14.520 --> 00:13:17.320] And there have been big changes.
[00:13:17.320 --> 00:13:29.240] During COVID, things got kind of interesting because, of course, we saw the Black Lives Matter movement kind of take a big upswing after the murder of George Floyd.
[00:13:29.240 --> 00:13:35.320] And we also saw a lot of schools, you know, closing or modifying their practices.
[00:13:35.320 --> 00:13:45.640] And so during this time, many school districts decided to decrease funding or end school resource officer programs altogether.
[00:13:45.640 --> 00:13:54.680] And others have gone in the opposite direction, increasing the presence of armed officers within their schools.
[00:13:54.680 --> 00:14:02.520] But this article talks specifically about Chicago, which was one of the largest public school systems in the U.S.
[00:14:02.840 --> 00:14:11.960] There are 325,000 students in Chicago schools that return to classes this year without armed officers present.
[00:14:11.960 --> 00:14:17.680] And this was the first time in Chicago since the year 1966.
[00:14:18.000 --> 00:14:25.600] And so they have already started to collect some data on the outcomes of not having armed resource officers present.
[00:14:25.840 --> 00:14:38.640] And in this model, they actually took the funding that they would have put towards paying these armed police officers and they put it towards having social workers, trauma-informed mental health interventions, and social and emotional learning.
[00:14:38.640 --> 00:14:46.960] So we're going to get to some of these Chicago studies because it's really early, the information is really early.
[00:14:46.960 --> 00:15:04.720] And we're going to turn to a couple of large studies, one of which is a systematic review that was published in November of 2023 called School-Based Law Enforcement Strategies to Reduce Crime, Increase Perceptions of Safety, and Improve Learning Outcomes in Primary and Secondary Schools: A Systematic Review.
[00:15:04.720 --> 00:15:13.840] This study looked at 32 different reports yielding 1,002 effect sizes to ask a number of different questions.
[00:15:13.840 --> 00:15:31.680] But one of the big outcomes that they found was that there was no evidence that there is a safety, and I'm quoting them directly, no evidence that there is a safety-promoting component of what they call SBLE, school-based law enforcement.
[00:15:31.680 --> 00:15:44.320] And the authors, based on this large study, come out in support of the criticism that school-based law enforcement actually criminalizes students and schools.
[00:15:44.320 --> 00:16:02.520] So, not only did they find that gun violence was not reduced, and actually, other forms of violence, specifically in this study, were not reduced, there was an increase in what is often referred to as the school-to-prison pipeline.
[00:16:02.840 --> 00:16:20.840] So, the types of disciplinary actions that were taken against students, which were especially high among black and Hispanic boys, led to either suspension or, in some cases, incarceration.
[00:16:21.080 --> 00:16:35.480] And there is a ton of evidence, and we won't get into like that whole body of literature, but there's a ton of evidence that sort of like punitive action at a young age leads to higher rates of incarceration later.
[00:16:35.480 --> 00:16:45.560] And this is, there have been randomized controlled trials where students are assigned to middle schools where there's stricter punishment versus middle schools without that strict punishment.
[00:16:45.560 --> 00:16:57.800] And they find that the kids who are assigned to those stricter middle schools on average are more likely to drop out of high school, less likely to attend a four-year college, and are more likely to become incarcerated as adults.
[00:16:57.800 --> 00:17:05.160] So, we know that these punitive practices have a detrimental effect on children's development and well-being.
[00:17:05.160 --> 00:17:16.200] And studies are showing that having sworn law enforcement officers in the schools not only do not reduce gun violence, and in some cases, they don't even reduce violent crime.
[00:17:16.200 --> 00:17:36.840] Another study showed that there was a small but significant decrease in fighting, probably because of just the physicality of the police presence, but that they do actually increase the number of infractions that are being reported against these children, which can have detrimental effects later in life.
[00:17:36.840 --> 00:17:38.920] There's a couple of other things I want to point out.
[00:17:38.920 --> 00:17:48.000] So, one study showed that over 90% of schools that have a law enforcement officer present have an officer who's routinely armed.
[00:17:48.000 --> 00:17:58.480] Although some of those officers have special training in violence prevention and youth mental health, requirements vary by state, and in many cases, there is no required special training in these areas.
[00:17:58.480 --> 00:18:01.520] This article does a really interesting job of going back to the history.
[00:18:01.520 --> 00:18:14.080] I didn't realize that resource officers started in 1953 in Flint, Michigan, and then efforts increased in the 60s in large cities like LA, Chicago, and Cincinnati.
[00:18:14.080 --> 00:18:19.920] And then there's a dramatic increase after Columbine in April of 1999.
[00:18:19.920 --> 00:18:28.240] And we saw that after this huge increase by 2022, about 48% of public schools have at least one school resource officer.
[00:18:28.240 --> 00:18:32.400] And again, like I said, more than 90% of those schools have an armed officer.
[00:18:32.400 --> 00:18:39.280] So that led to the 2019 to 2020 year, about 23,400 school resource officers being employed.
[00:18:39.280 --> 00:18:41.760] Now, let's cut back to Chicago.
[00:18:41.760 --> 00:18:46.720] In Chicago, they actually reduced that presence.
[00:18:46.720 --> 00:18:53.520] They, you know, this year was the first year that students went back to school without school resource officers present.
[00:18:53.520 --> 00:19:01.840] Removing, okay, the U of Chicago Consortium on School Research put out a brief in June of this year removing police officers from Chicago schools.
[00:19:01.840 --> 00:19:06.800] And they call it SRO in this case, school resource officers.
[00:19:06.800 --> 00:19:13.440] School resource officer removal was significantly related to having fewer high-level discipline infractions.
[00:19:13.440 --> 00:19:22.880] And it was not related to changes over time in perception of physical safety or student-teacher trust, either by students or teachers.
[00:19:22.880 --> 00:19:30.000] So, when school resource officers were removed, students did not say that they felt less safe, and neither did teachers.
[00:19:30.520 --> 00:19:41.320] They also found that schools that had resource officers, the ones that retained them, were more likely to serve predominantly black students.
[00:19:41.320 --> 00:19:45.960] Black students became more than twice as likely as other students to have a resource officer.
[00:19:45.960 --> 00:19:50.760] They tended to be smaller schools, but they tended to have higher suspension rates.
[00:19:50.760 --> 00:20:00.120] And they tended to be schools where students who were eligible for free or reduced lunch, non-English learners or kids who were in special education.
[00:20:00.440 --> 00:20:07.000] So, these are schools where obviously populations are more vulnerable, are more likely to have these resource officers there.
[00:20:07.000 --> 00:20:25.480] And those resource officers were not over time reducing or effective in reducing gun violence or gun crime, but they were effective in criminalization of students who were there to go to school because infractions were identified and magnified.
[00:20:25.480 --> 00:20:32.600] So, across the board, what we're starting to see, and there's not much literature on this subject, but it's gaining.
[00:20:32.600 --> 00:20:44.840] Across the board, we are starting to see that while some of the studies have cited kind of small benefits, most of them are showing very large detriments.
[00:20:44.840 --> 00:20:52.760] And there is no like detectable, there's no significant difference in outcomes when it comes to school safety.
[00:20:52.760 --> 00:21:02.760] So, the children are no safer, the children and the adults are no safer, and there is no detectable evidence saying that having police in schools is reducing gun violence at all.
[00:21:02.760 --> 00:21:03.960] Yeah, that's interesting.
[00:21:04.920 --> 00:21:06.280] There's a lot of good stuff here.
[00:21:06.280 --> 00:21:08.040] It's not surprising to me.
[00:21:08.040 --> 00:21:29.920] I think this is an area where I have like dug a little bit deep into the research in the past, but I think for some people it is surprising and it is pretty counterintuitive that putting police officers in school leads to more arrests and more expulsions of students, very often disproportionately black and brown students, and very often student disproportionately students with disabilities.
[00:21:29.920 --> 00:21:39.040] But what it doesn't do is it doesn't make them safer from physical harm and violence, which was very often the whole argument, right?
[00:21:39.040 --> 00:21:42.080] It's the whole reasoning behind having these police officers present.
[00:21:42.080 --> 00:21:50.320] Yeah, I mean, it's easy to imagine that for most schools, there's probably not going to be events that require an armed officer present.
[00:21:50.320 --> 00:21:55.280] And even if they are present, as we know, like from that, what was it in that Texas case?
[00:21:55.280 --> 00:21:57.040] It doesn't always help, right?
[00:21:57.360 --> 00:22:08.720] No, not only does it not always help, sometimes there is some evidence to show that when teachers are armed, it can actually lead to more detriment, not less.
[00:22:08.720 --> 00:22:12.400] But we are seeing that there are laws that are being passed.
[00:22:12.400 --> 00:22:17.600] Like there's a law in Texas right now that requires schools to have armed guards.
[00:22:17.600 --> 00:22:21.360] And the Florida law, I think, is even more intense.
[00:22:21.360 --> 00:22:24.240] It requires that somebody in the school be armed.
[00:22:24.240 --> 00:22:27.120] It doesn't even have to be a guard, it can be a teacher.
[00:22:27.120 --> 00:22:31.200] Well, I mean, I think it's also a good area to continue to do research.
[00:22:31.200 --> 00:22:35.600] I doubt this one study is going to change, or there are a few studies that are going to change policy.
[00:22:37.200 --> 00:22:39.200] And they are systematic reviews, by the way.
[00:22:39.200 --> 00:22:43.040] So these are reviews of multiple studies.
[00:22:43.040 --> 00:22:43.440] Right.
[00:22:43.440 --> 00:22:44.480] Okay, thanks, Kara.
[00:22:44.480 --> 00:22:44.640] Yep.
[00:22:44.800 --> 00:22:47.920] Jay, tell us about training medical robots.
[00:22:47.920 --> 00:22:49.520] Yeah, I really like this one, guys.
[00:22:49.520 --> 00:22:58.160] Johns Hopkins University researchers collaborated with Stanford University researchers, and they achieved something that's pretty significant.
[00:22:58.160 --> 00:23:01.240] They're calling it a breakthrough in medical robots.
[00:23:01.240 --> 00:23:10.520] So, you know, for the first time, they were able to train a robot to perform surgical procedures by using video recordings of human surgeons.
[00:23:10.520 --> 00:23:14.760] And this is definitely different than any other way this has been done in the past.
[00:23:14.760 --> 00:23:23.480] This method is called imitation learning, and it essentially eliminates the need for manual programming of the robotic movements, right?
[00:23:23.480 --> 00:23:29.560] This is like when a surgeon will actually, you know, a surgeon's movements will be recorded as they go, right?
[00:23:30.120 --> 00:23:31.720] Which is very difficult.
[00:23:31.720 --> 00:23:36.760] This moves robotic surgery a lot closer to full autonomy, like in the movies and stuff.
[00:23:36.760 --> 00:23:45.640] Like, you know, we've seen this in different science fiction movies where you're seeing like, you know, some type of extensive type of surgery being done, you know, in outer space or whatever.
[00:23:45.640 --> 00:23:52.440] What they now know can be done is robots can perform complex procedures without any human involvement.
[00:23:52.440 --> 00:23:56.680] I know that sounds kind of outlandish, but they were able to do it.
[00:23:56.680 --> 00:24:01.640] The researchers focused on an existing system called the Da Vinci Surgical System.
[00:24:01.640 --> 00:24:04.120] This is a widely used robotic platform.
[00:24:04.120 --> 00:24:07.240] It's, you know, it's considered to be cutting-edge modern medicine.
[00:24:07.240 --> 00:24:21.640] And to overcome that system's current limitations in precision, which is obviously a huge problem, they turned to video recordings captured by wrist cameras attached to the Da Vinci robot's arm during surgeries, right?
[00:24:21.640 --> 00:24:31.800] So, what this means is these videos were originally created for post-operative analysis, where they document techniques and movements of experienced surgeons, right?
[00:24:31.800 --> 00:24:43.320] With nearly 7,000 Da Vinci robots today in use globally and more than 50,000 trained surgeons, the researchers had access to a massive archive of surgical footage.
[00:24:43.320 --> 00:24:46.960] This was really the base of this whole achievement.
[00:24:44.840 --> 00:24:49.680] That footage was absolutely significant.
[00:24:50.000 --> 00:24:56.240] It proved to be an amazing amount of data for them to incorporate into computer learning.
[00:24:56.240 --> 00:24:58.480] So they used hundreds of these recordings.
[00:24:58.480 --> 00:25:00.720] I guess they picked the absolute best ones.
[00:25:00.720 --> 00:25:11.760] And the research team trained their robotic model to perform the basic surgical tasks, such as things like needle manipulation, something called tissue lifting, and suturing.
[00:25:11.760 --> 00:25:19.440] So unlike traditional programming methods which we have today, these require manually coding every step of a procedure.
[00:25:19.440 --> 00:25:20.800] You know, think about that.
[00:25:21.120 --> 00:25:31.200] In order for a robot to perform surgery the old way, they literally have to hand code in all those procedures, and it takes an incredibly long time to do it.
[00:25:31.200 --> 00:25:35.840] The new approach relies on machine learning, which is incredible at this.
[00:25:36.240 --> 00:25:39.920] Machine learning is so powerful when it comes to things like this.
[00:25:39.920 --> 00:25:47.600] So the model combines imitation learning with advanced AI architecture that's similar to those powering Chat GPT, of course.
[00:25:47.600 --> 00:25:54.400] However, instead of processing language, the model speaks in something called kinematics.
[00:25:54.400 --> 00:25:55.600] Have you guys ever heard of that?
[00:25:55.600 --> 00:25:56.640] Yeah, it's just like movement.
[00:25:57.200 --> 00:26:00.000] It's the mathematical representation of robotic movement.
[00:26:00.000 --> 00:26:01.280] So think about that.
[00:26:01.280 --> 00:26:09.040] A chat GPT-like AI system was programmed to speak in mathematical robotic movements.
[00:26:09.040 --> 00:26:10.480] That is so brilliant.
[00:26:10.480 --> 00:26:11.760] I just love that.
[00:26:11.760 --> 00:26:18.480] So historically, training the DaVinci robot to perform any single task was, like I said, incredibly labor-intensive.
[00:26:18.480 --> 00:26:20.480] And I want to give you more details about that.
[00:26:20.480 --> 00:26:24.720] It required programs to code every single movement meticulously.
[00:26:24.720 --> 00:26:36.680] And this took, you know, if you want to pick one specific movement that would be involved in one procedure, modeling the steps involved in suturing for particular types of surgery could take up to guess how many years, guys?
[00:26:36.680 --> 00:26:37.880] I would guess a decade.
[00:26:37.880 --> 00:26:39.160] Right, a decade.
[00:26:39.160 --> 00:26:39.640] Wow.
[00:26:39.640 --> 00:26:40.600] Yeah, I read the article.
[00:26:40.600 --> 00:26:40.760] Yes.
[00:26:41.080 --> 00:26:46.040] Yeah, that's to program every discrete step and all the exceptions and stuff.
[00:26:46.040 --> 00:26:54.440] Yeah, a decade still seems like a little long to me, but yeah, it's a hugely onerous process, especially compared to this little breakthrough they got here.
[00:26:54.440 --> 00:26:55.720] This seems pretty sweet.
[00:26:55.720 --> 00:27:03.800] Yeah, so that decade-long process was incredibly restrictive in the flexibility and scalability of robotic surgery.
[00:27:03.800 --> 00:27:08.200] It was one of the reasons why it didn't proliferate as much as we'd liked it to.
[00:27:08.200 --> 00:27:16.520] So by contrast, the new imitation learning methods allow robots to learn these complex tasks in just a few days, guys.
[00:27:16.520 --> 00:27:17.880] A few days.
[00:27:17.880 --> 00:27:19.880] 10 years to days.
[00:27:19.880 --> 00:27:21.320] Just incredibly unreal.
[00:27:21.320 --> 00:27:29.320] But don't forget, though, I think it said something that 100 videos is a good sample for it to get a good handle on what's required.
[00:27:29.320 --> 00:27:33.240] So 100 videos, that's 100 surgeries, essentially.
[00:27:33.640 --> 00:27:34.040] That's a lot.
[00:27:34.040 --> 00:27:34.520] That's a lot of surprise.
[00:27:34.600 --> 00:27:36.440] But those surgeries are being done anyway.
[00:27:36.760 --> 00:27:37.000] Right.
[00:27:37.640 --> 00:27:40.680] The fact is, we have those surgeries on video.
[00:27:40.680 --> 00:27:41.320] We have them.
[00:27:42.280 --> 00:27:43.080] It's amazing.
[00:27:43.320 --> 00:27:53.560] And as new surgery methods come out, of course, I'm sure that they would videotape them in the correct way to continue to let this happen, right?
[00:27:53.560 --> 00:27:59.400] So one of the key advancements reported lies in the robot's ability to adapt its movements, right?
[00:27:59.400 --> 00:28:05.080] Which is really important because every human body isn't the same and every single situation isn't going to be exactly the same.
[00:28:05.080 --> 00:28:13.640] So, traditional robotic systems had to rely on this rigid, pre-programmed actions, and that's very error-prone in the real world.
[00:28:13.640 --> 00:28:14.800] Oh, it sounds totally brittle.
[00:28:14.800 --> 00:28:15.120] Yeah.
[00:28:14.680 --> 00:28:16.640] Yeah, it's really scary, actually.
[00:28:17.280 --> 00:28:24.560] Training the models to use relative movements instead of fixed motions, the researchers enhanced the robot's precision and flexibility, right?
[00:28:24.560 --> 00:28:36.640] This allows the robot to perform tasks with skill comparable to experienced human surgeons and even recover from mistakes like retrieving a dropped needle and then continuing on with the procedure.
[00:28:36.640 --> 00:28:37.360] So, the break.
[00:28:37.520 --> 00:28:46.080] That one, Jay, that one caught my attention because, yeah, that's cool that it's kind of like some emergent behavior where it knew just to pick it up and continue.
[00:28:46.080 --> 00:28:49.680] But then I thought, well, wait a second, should that needle be sterilized now that it's fallen on the floor?
[00:28:50.160 --> 00:28:53.760] It fell in the person being operated on.
[00:28:53.760 --> 00:28:58.320] And I'm sure it would be an easy thing to fix if they said, you know, don't pick up anything that had hit the floor.
[00:28:58.960 --> 00:29:00.080] Unless it's a heart or something.
[00:29:00.400 --> 00:29:02.000] And you got to pick it up.
[00:29:02.000 --> 00:29:04.560] Yeah, well, those are the kinds of errors that I'd be afraid of.
[00:29:04.560 --> 00:29:14.400] Something that's, you know, that's unusual and rare, something you wouldn't necessarily find even within 100 surgeries, but still kind of like, oh boy, don't do that.
[00:29:14.400 --> 00:29:16.240] A human, any human would know not to do it.
[00:29:16.320 --> 00:29:18.320] Did organs really fall on the floor before?
[00:29:18.560 --> 00:29:20.480] But would you actually be afraid of that, Bob?
[00:29:20.480 --> 00:29:26.640] Because here's the thing: I think it's a little silly to have this standard be that there is no human involved at all.
[00:29:26.640 --> 00:29:28.880] There will always be a human involved.
[00:29:29.280 --> 00:29:33.360] But maybe they're just observing as opposed to actually operating.
[00:29:33.600 --> 00:29:40.800] Or even someone partially knowledgeable enough to know if something's really going off the rails would be nice too, but not necessarily a surgeon.
[00:29:40.800 --> 00:29:41.360] Because that's the thing.
[00:29:41.360 --> 00:29:41.600] Oh, yeah.
[00:29:41.840 --> 00:29:43.440] You need to have a surgeon involved.
[00:29:43.440 --> 00:29:44.080] But you're still right.
[00:29:44.320 --> 00:29:50.320] If there is a bleed, if somebody codes, like, and anesthesia is never going to be done this way.
[00:29:50.320 --> 00:29:52.240] There's going to be an anesthesiologist there.
[00:29:52.240 --> 00:29:53.280] Well, I wouldn't say never.
[00:29:53.280 --> 00:29:57.520] I mean, look, of course, we'd be able to achieve all these things with enough time.
[00:29:58.160 --> 00:29:58.960] Holy crap.
[00:29:58.960 --> 00:30:01.160] But I don't want that to happen.
[00:29:59.680 --> 00:30:01.800] I want there to be a lot of people.
[00:30:02.280 --> 00:30:04.760] You've got to think about it in a completely different way, right?
[00:30:04.760 --> 00:30:10.520] Because, yeah, ideally, you feel better with a human doing it for all these implied reasons.
[00:30:10.520 --> 00:30:23.480] But the point is, this will make surgeries probably cost less money and makes them much more readily available to people everywhere, including people in outer space or on ships that are crossing the ocean or whatever.
[00:30:23.480 --> 00:30:28.040] I mean, I think the benefits here dramatically outweigh any of the negatives.
[00:30:28.040 --> 00:30:32.040] That was the one good thing about Prometheus, the movie, was that medical pod thing.
[00:30:32.040 --> 00:30:33.480] Oh, yeah, but dude, dude, dude.
[00:30:34.280 --> 00:30:39.240] But still, they kind of blew it because this is like centuries in the future.
[00:30:39.240 --> 00:30:42.120] There's an auto-doc, which is so cool.
[00:30:42.120 --> 00:30:46.760] And then she has some abdominal surgery, and then it staples her gut glue.
[00:30:46.840 --> 00:30:47.240] It's terrible.
[00:30:47.480 --> 00:30:49.320] Like, really, you're using staples in two seconds.
[00:30:49.640 --> 00:30:52.280] How about a foam that heals the flesh?
[00:30:52.760 --> 00:30:54.280] Some bio-glue or something.
[00:30:54.280 --> 00:30:55.320] It was just like, oh, come on.
[00:30:55.720 --> 00:31:00.360] But more importantly than that, Bob, Bob, which Star Wars robot would you want surgery?
[00:31:00.360 --> 00:31:02.680] Would you want to be your robotic surgeon?
[00:31:03.000 --> 00:31:05.000] I'd rather pick a Neil Asher Poly universe.
[00:31:05.080 --> 00:31:07.000] I didn't give you that option.
[00:31:07.480 --> 00:31:09.720] It's got to be Star Wars, and you've got to answer in five seconds.
[00:31:09.800 --> 00:31:14.840] Well, whoever handled Luke's amputated hand, I'm sure, would probably be pretty damn good.
[00:31:14.920 --> 00:31:15.640] Oh, I thought you wanted.
[00:31:16.680 --> 00:31:18.600] You wanted IG, you know?
[00:31:18.600 --> 00:31:19.320] Oh, yeah.
[00:31:20.120 --> 00:31:22.200] I don't know if he's signed off on surgery.
[00:31:22.680 --> 00:31:25.880] If I go into combat, he'd probably tear you apart.
[00:31:26.280 --> 00:31:28.600] Yeah, if I go into combat, I want egg.
[00:31:28.600 --> 00:31:30.680] But for surgery, I'm not sure.
[00:31:30.680 --> 00:31:32.440] So, one last point here, guys.
[00:31:32.440 --> 00:31:41.240] So, what they were able to do was have the robot perform discrete pieces of surgery, and now they're, ha ha, they're stitching it all together.
[00:31:41.240 --> 00:31:45.920] So, they're working on making it be able to do the whole process from beginning to end.
[00:31:45.920 --> 00:31:53.680] And again, you know, when you're doing any kind of computer programming, you just break it down into smaller and smaller bits to make it easier.
[00:31:53.680 --> 00:31:56.160] And apparently, that's what they were doing here.
[00:31:56.160 --> 00:31:58.000] But I'm excited about it.
[00:31:58.000 --> 00:32:05.600] I mean, again, if people that can't afford expensive surgeries can have that be affordable to them, I think this would be fantastic.
[00:32:05.600 --> 00:32:15.600] Oh, yeah, I think this clearly has got a pretty bright future, especially after these robots seem to be doing better than what would have been anticipated.
[00:32:15.600 --> 00:32:21.440] Just after like 100 videos, it's got a solid handle on these complex surgical procedures.
[00:32:21.680 --> 00:32:22.720] That's fantastic.
[00:32:22.800 --> 00:32:27.280] Imagine when AI improves even more and you've got even more videos.
[00:32:27.280 --> 00:32:44.720] Imagine if they got a thousand or five thousand videos from like really high-end surgeons and then maybe throw in some like biological context of a body and what a body is and how it should behave like during surgeries and things like that so that there's some more context.
[00:32:44.720 --> 00:32:46.720] I think it would be even better.
[00:32:46.720 --> 00:32:47.840] All right, thank you, Jay.
[00:32:47.840 --> 00:32:50.240] Bob, tell us about these new imaging techniques.
[00:32:50.240 --> 00:32:52.960] This is another sort of medical breakthrough kind of topic.
[00:32:52.960 --> 00:32:54.640] Yeah, oh, don't get me started.
[00:32:54.640 --> 00:32:56.320] All right, I'll say it anyway.
[00:32:56.320 --> 00:33:02.000] So, yeah, this is a new imaging technology called HIP CT, which caught my attention this week.
[00:33:02.000 --> 00:33:12.240] This technology uses fourth-generation synchrotron radiation for medical and materials imaging, far better in many ways than others, like MRIs or CAT scans.
[00:33:12.240 --> 00:33:17.760] This started when I was reading a news item recently about imaging coelacanth fossils.
[00:33:17.720 --> 00:33:30.280] Now, the article kept going on and on about the coelacanths, but I kept thinking that they were kind of burying the lead with this revolutionary imaging technology that they just kind of mentioned almost as an aside.
[00:33:29.840 --> 00:33:32.040] And then, so I was looking at another article.
[00:33:32.600 --> 00:33:40.440] This one was from a couple of months ago, but it was an interesting news item about imaging human hearts with unprecedented detail.
[00:33:40.440 --> 00:33:43.720] And they mentioned the same imaging technology and they did the same thing.
[00:33:43.720 --> 00:33:45.320] It was just kind of like an aside.
[00:33:45.320 --> 00:33:49.160] So I'm like, I want to learn about this thing.
[00:33:49.400 --> 00:33:52.440] So I did some research where they weren't burying the lead.
[00:33:52.440 --> 00:33:58.360] And I learned about HIP CT or hierarchical phase contrast tomography.
[00:33:58.360 --> 00:34:00.040] And it was fascinating.
[00:34:00.040 --> 00:34:03.720] So HIPCT was funded by Chan Zuckerberg Initiative.
[00:34:03.720 --> 00:34:09.000] It's a multinational collaboration between scientists, mathematicians, clinicians, and more.
[00:34:09.000 --> 00:34:18.680] So these images, these imaging advances that they made are based on an upgrade of the European Synchrotron Radiation Facility to create brighter X-rays.
[00:34:18.680 --> 00:34:21.080] And that's kind of the crux of this whole news item.
[00:34:21.080 --> 00:34:27.400] And about these X-rays, it's all about, it turns out, synchrotron radiation, which was fascinating to study.
[00:34:27.560 --> 00:34:31.000] I knew a little bit about it, but I really deepened my understanding.
[00:34:31.160 --> 00:34:33.640] So synchrotron radiation is simply light.
[00:34:33.640 --> 00:34:39.880] It's electromagnetic radiation that's been emitted by charged particles like electrons that are accelerating.
[00:34:39.880 --> 00:34:46.760] But I'm not talking about accelerating like Steve's wife does in her high torque Tesla, going faster and faster on the highway.
[00:34:47.000 --> 00:34:49.800] In physics, that's called linear acceleration, right?
[00:34:49.800 --> 00:34:55.560] You're going faster or you're going slower, you're slowing down, which is it's still called acceleration.
[00:34:55.560 --> 00:34:57.160] It's just negative acceleration.
[00:34:57.400 --> 00:35:00.840] We, of course, call it deceleration, but it's still acceleration.
[00:35:00.840 --> 00:35:05.480] But there's another type of acceleration in physics called centripetal acceleration.
[00:35:05.480 --> 00:35:08.840] That means that essentially you're not going straight.
[00:35:08.840 --> 00:35:10.280] You're on a curved path.
[00:35:10.280 --> 00:35:20.800] So even if your velocity is the same and you're not accelerating classically, colloquially, but you're on a curving path, technically that's acceleration as well.
[00:35:20.800 --> 00:35:22.800] And I remember first learning that years ago.
[00:35:22.800 --> 00:35:30.960] And it's still hard for me to imagine that acceleration is moving on a curved path and having nothing to do with your velocity.
[00:35:30.960 --> 00:35:32.480] But that's what it is in physics.
[00:35:32.480 --> 00:35:40.960] Now, in electrons in the magnetic field going around and around a circular facility, they're experiencing this centripetal acceleration, right?
[00:35:40.960 --> 00:35:41.920] Because they're going around.
[00:35:42.560 --> 00:35:43.600] They're not going straight.
[00:35:43.600 --> 00:35:45.760] So this is a centripetal acceleration.
[00:35:45.760 --> 00:35:57.200] And if they're also moving very, very fast at relativistic speeds, then they will emit significant amounts of synchrotron radiation, which can be anything from infrared to hard x-rays.
[00:35:57.200 --> 00:35:58.560] So why does that happen?
[00:35:58.880 --> 00:35:59.840] Why is that even happening?
[00:35:59.840 --> 00:36:02.960] I was trying to figure out why does it release energy like that?
[00:36:03.200 --> 00:36:06.800] One reason is because that's what Maxwell's equations say will happen, and it does.
[00:36:06.800 --> 00:36:07.840] So he was right.
[00:36:07.840 --> 00:36:11.040] If you're not familiar with the Maxwell's equations, definitely check them out.
[00:36:11.040 --> 00:36:23.280] But another common explanation that you'll find is that if you imagine this electron going around and around, a tiny bit of its kinetic energy is being converted into that radiation as its path changes.
[00:36:23.280 --> 00:36:26.240] Okay, so that's one way to think of what's happening.
[00:36:26.240 --> 00:36:29.280] And that's why these electrons will slow down over time.
[00:36:29.280 --> 00:36:35.520] Because if you're losing some energy, some kinetic energy that's being converted into radiation, you are going to slow down.
[00:36:35.520 --> 00:36:46.880] And that's why these facilities that create synchrotron radiation, they have to continually pump more energy into the collider so that it will stay at the same linear velocity.
[00:36:46.880 --> 00:36:51.760] So, okay, so it's these synchrotron x-rays that are changing many fields of science.
[00:36:51.760 --> 00:36:59.280] They're millions to billions of times brighter than the x-rays that you got, Steve, when you broke both your arms playing Tarzan in that tree.
[00:36:59.280 --> 00:37:00.680] I'm sure you got an X-ray.
[00:37:00.680 --> 00:37:01.320] Oh, no.
[00:37:01.320 --> 00:37:03.080] My wrists, not my arms.
[00:37:03.080 --> 00:37:05.720] Well, wrists, arms, lower arms, whatever.
[00:37:06.920 --> 00:37:13.160] So the X-rays that you got were millions to billions of times dimmer than what they're creating here.
[00:37:13.400 --> 00:37:16.520] These are the brightest X-rays on Earth that I could find.
[00:37:16.520 --> 00:37:26.200] And the relatively new fourth-generation synchrotrons are even better than the Generation 3 that were around for years, kind of like going from Chat GPT-3 to 4.
[00:37:26.200 --> 00:37:32.200] These electrons are traveling at something like 99.99999% the speed of light.
[00:37:32.200 --> 00:37:35.560] So these are like ultra-relativistic, hyper-relativistic.
[00:37:35.560 --> 00:37:41.320] Now, this intense light can be focused to reveal super high-resolution images.
[00:37:41.320 --> 00:37:51.320] So it would be like using x-rays to not only see Steve's entire mangled wrist bone, but also the individual bone cells anywhere in that bone.
[00:37:51.320 --> 00:37:53.720] So that's kind of, we can't do that now.
[00:37:53.720 --> 00:38:07.720] We can't use this radiation to look at living people yet, but that's kind of like what we're dealing with here: from seeing this one image of this one discrete object to being able to also zoom down into the cellular level.
[00:38:08.040 --> 00:38:10.200] But it's not just the light intensity that's changing.
[00:38:10.440 --> 00:38:18.920] The x-ray beam itself can be tuned to specific wavelengths to analyze specific elements in the sample that you want to learn about.
[00:38:19.160 --> 00:38:26.520] The common film x-rays that you use to image your broken bones would use broad spectrum x-rays.
[00:38:26.520 --> 00:38:32.360] So these are using, it's kind of like almost kind of like a laser where you're tuning to very specific wavelengths.
[00:38:32.440 --> 00:38:43.240] So, the main benefit of this imaging technology is, if I had to distill it down to a few words, is that it decouples the field of view from resolution.
[00:38:43.240 --> 00:38:50.960] So, that means that a wide field of view, which historically would mean low resolution, no longer needs to be that way.
[00:38:44.760 --> 00:39:00.480] You could have a wide field of view that gets the whole image, though, say an entire human body can also be extremely high resolution.
[00:39:00.480 --> 00:39:02.960] And that's one of the major advances here.
[00:39:02.960 --> 00:39:05.040] So, how is this being used today?
[00:39:05.040 --> 00:39:11.120] So, I'll go back to the coelacanth story that they're using this technology to image coelacanth fossils.
[00:39:11.120 --> 00:39:13.600] Now, do you remember those coelacanth fish?
[00:39:13.920 --> 00:39:19.600] They were extinct, apparently, for millions of years until a fisherman caught one.
[00:39:19.600 --> 00:39:21.680] Like, what was that in the 70s, Steve?
[00:39:21.680 --> 00:39:23.760] Something like 60s or 70s.
[00:39:23.760 --> 00:39:24.640] Somebody caught one.
[00:39:24.960 --> 00:39:30.960] It's like pulling up a fossil that you thought that was long dead, that should not exist.
[00:39:30.960 --> 00:39:32.960] The guy found one from the depths.
[00:39:32.960 --> 00:39:44.720] So, they're using these synchrotron x-rays to image these new coelacanth fossils that were like 250 million years old that were still in the rock.
[00:39:44.960 --> 00:39:48.240] They didn't excavate this fossil, it was still in the rock.
[00:39:48.240 --> 00:40:00.480] So, they imaged every bone in the fossil that was in the rock, and they were able to create a full 3D model of the fossil bones with a level of detail that they've never seen with this type of fossil before.
[00:40:00.480 --> 00:40:05.680] They created a complete virtual skeleton with amazing detail.
[00:40:05.680 --> 00:40:19.600] Basically, they discovered that this was a third species of coelacanth that they become aware of: the two that they have found extant that are still living, and now this other fossil is a third fossil that they're aware of.
[00:40:19.600 --> 00:40:25.600] In this heart news item that I read about, the researchers were able to x-ray a heart with amazing detail.
[00:40:25.840 --> 00:40:31.640] Senior author Peter Lee DeFil, he's a professor of material science, University of College London.
[00:40:31.640 --> 00:40:37.400] He said, The atlas that we created in this study is like having Google Earth for the human heart.
[00:40:37.400 --> 00:40:45.240] It allows us to view the whole organ at global scale, then zoom into street level to look at cardiovascular features in unprecedented detail.
[00:40:45.240 --> 00:40:46.920] So that's a really cool analogy.
[00:40:46.920 --> 00:40:56.520] He continues: One of the major advantages of this technique is that it achieves a full 3D view of the organ that's around 25 times better than a clinical CT scanner.
[00:40:56.520 --> 00:41:07.240] In addition, it could zoom into cellular level in selected areas, which is 250 times better to achieve the same detail that we would get through a microscope, but without cutting the sample.
[00:41:07.240 --> 00:41:13.160] Being able to image whole organs like this reveals details and connections that were previously unknown.
[00:41:13.160 --> 00:41:14.120] Cool stuff.
[00:41:14.120 --> 00:41:25.000] So they say that the primary limiting factor was not even the image itself, but processing the huge amount of data that's created by this HIP CT technology.
[00:41:25.000 --> 00:41:30.280] Now, so the future of HIP CT looks even brighter than the x-rays it produces.
[00:41:30.600 --> 00:41:37.080] So, I mean, medical imaging, material science, biological research, they are already benefiting from this tool.
[00:41:37.080 --> 00:41:53.400] But going further in the future, as I love to do, of course, perhaps using a fifth-generation radiation source, which probably will eventually arrive at some point, the resolution will likely go from the already awesome micron scale, which is a millionth of a meter.
[00:41:53.400 --> 00:42:03.400] It could get even more awesome to the nanoscale, of course, a billionth of a meter, looking at molecular interactions and what's going on within cells.
[00:42:03.400 --> 00:42:13.560] So, as this becomes more radiation-efficient and safer, this could eventually allow for cellular-level detail on living humans without doing any nasty tissue sampling.
[00:42:13.560 --> 00:42:21.680] Because right now, it's X vivo, which means that you could take a living sample, but you've got to remove it from whatever you're, you know, whatever living creature.
[00:42:21.760 --> 00:42:24.800] So, if it were a human, you'd have to take a sample.
[00:42:24.800 --> 00:42:27.120] So, that's you can't do it on people yet.
[00:42:27.120 --> 00:42:31.440] So, of course, typical hospitals don't have access to synchrotron facilities.
[00:42:31.440 --> 00:42:36.720] So, in the future, accessibility for routine imaging is probably going to be one of the biggest hurdles.
[00:42:36.720 --> 00:42:44.320] You know, they will make this smaller, more efficient, and cheaper, but I think there's probably going to be a limit to how small it can be.
[00:42:44.320 --> 00:42:48.880] So, it's not going to be something that's going to be at a local small hospital.
[00:42:49.200 --> 00:42:49.840] But who knows?
[00:42:49.840 --> 00:42:51.200] Who knows what's going to happen?
[00:42:51.440 --> 00:42:53.120] But it's not just, this isn't just medical.
[00:42:53.120 --> 00:42:56.240] We're also talking material science as well.
[00:42:56.240 --> 00:43:05.280] We'll be able to use this to 3D image micro and nanostructures of composites, and that could lead to more advanced alloys, nanotubes, and polymers, and more.
[00:43:05.280 --> 00:43:09.440] And, like we've said on the show many times, material science is the shit.
[00:43:09.440 --> 00:43:18.000] That's the stuff that's the stuff if you make fundamental advances in material science that can impact society at such a huge level.
[00:43:18.000 --> 00:43:20.320] So many industries, so many levels of society.
[00:43:20.800 --> 00:43:23.440] Those are the real big changes that can really make it make a difference.
[00:43:23.440 --> 00:43:25.360] So, I'll definitely be tracking this tech.
[00:43:25.360 --> 00:43:25.760] All right.
[00:43:25.760 --> 00:43:26.880] Thanks, Bob.
[00:43:27.200 --> 00:43:28.080] All right, everyone.
[00:43:28.080 --> 00:43:32.240] We're going to take a quick break from our show to talk about our sponsor this week, Rocket Money.
[00:43:32.240 --> 00:43:43.360] Yeah, Steve, most Americans think that they spend about $62 per month on subscriptions, but the real number is closer to $300, and that adds up really fast over the course of a year.
[00:43:43.360 --> 00:43:49.280] If you just forget about a couple of subscriptions that you have going right now, that can cost you a lot of money.
[00:43:49.280 --> 00:43:52.960] Managing your finances can be complicated, and it's time-consuming.
[00:43:52.960 --> 00:44:00.760] Rocket Money simplifies your finances, they make it really easy to see exactly what's happening with all your finances and track your spending.
[00:43:59.760 --> 00:44:03.960] They give you full control over all of it right on your phone.
[00:44:04.280 --> 00:44:09.960] Rocket Money is essentially a personal finance app that helps find and cancel your unwanted subscriptions.
[00:44:09.960 --> 00:44:13.080] It also monitors your spending and helps lower your bills.
[00:44:13.080 --> 00:44:17.160] See all of your subscriptions in one place and know exactly where your money is going.
[00:44:17.160 --> 00:44:21.240] For anything you don't want anymore, Rocket Money can help you cancel them with a few taps.
[00:44:21.240 --> 00:44:26.120] Rocket Money will even try to negotiate lower bills for you, sometimes by up to 20%.
[00:44:26.120 --> 00:44:36.600] Rocket Money has over 5 million users and has saved a total of half a billion dollars in canceled subscriptions, saving members up to $740 a year when using all the app's features.
[00:44:36.600 --> 00:44:38.840] So stop wasting money on things you don't use.
[00:44:38.840 --> 00:44:44.920] Cancel your unwanted subscriptions by going to rocketmoney.com/slash SGU.
[00:44:44.920 --> 00:44:49.080] That's rocketmoney.com/slash SGU.
[00:44:49.080 --> 00:44:51.720] RocketMoney.com/slash SGU.
[00:44:51.720 --> 00:44:53.880] All right, guys, let's get back to the show.
[00:44:54.360 --> 00:44:55.480] Let me ask you guys a question.
[00:44:55.480 --> 00:44:57.640] You know, I like to start off with questions.
[00:44:57.960 --> 00:44:58.760] Yes, you do.
[00:44:58.760 --> 00:44:59.080] What?
[00:45:00.520 --> 00:45:06.600] Should we police what physicians say in public on healthcare topics?
[00:45:06.600 --> 00:45:14.040] Specifically, how should the regulatory agencies respond to physicians spreading medical misinformation?
[00:45:14.200 --> 00:45:15.960] I think they should be drawn and quartered.
[00:45:17.320 --> 00:45:18.600] I think they should follow it.
[00:45:18.600 --> 00:45:24.200] They should track it, and there should be ramifications for spreading misinformation or disinformation.
[00:45:24.200 --> 00:45:26.360] So who gets to decide if it's misinformation?
[00:45:26.360 --> 00:45:31.560] Well, how is that not already part of the ethical code of the medical board?
[00:45:31.560 --> 00:45:31.960] It is.
[00:45:32.280 --> 00:45:32.640] Absolutely.
[00:45:32.520 --> 00:45:32.800] Okay.
[00:45:33.320 --> 00:45:37.080] Then, yeah, the medical board should do something about it.
[00:45:37.080 --> 00:45:37.880] So, who is policing?
[00:45:38.040 --> 00:45:41.640] So, in the U.S., in the U.S., there are state medical boards, right?
[00:45:41.640 --> 00:45:45.760] They license physicians, they license all professionals at the state level.
[00:45:44.840 --> 00:45:50.320] And the medical board also regulates those licenses.
[00:45:50.480 --> 00:46:03.600] So, if you're a physician in Connecticut, the Connecticut State Medical Board can hold you to the standard of care and also to a code of ethics, and they can take actions against your license.
[00:46:03.600 --> 00:46:05.840] Yeah, they can fully revoke it if they do something back in that.
[00:46:05.840 --> 00:46:09.360] They can censor you, they could suspend your license, they could revoke your license.
[00:46:09.360 --> 00:46:21.600] There's a range of things that they could do if there is a complaint that you've violated the standard of care or the code of ethics, and an investigation finds that to be true.
[00:46:21.600 --> 00:46:28.960] So, the question is: should a physician making anti-vaccine statements, for example, should that count?
[00:46:28.960 --> 00:46:32.000] Should that be something that violates absolutely, absolutely.
[00:46:32.320 --> 00:46:35.680] What you're saying to the public is no different than what you say to a patient.
[00:46:36.320 --> 00:46:38.320] Yeah, so there's two contexts here.
[00:46:38.320 --> 00:46:46.320] One is a physician giving misinformation directly to their patient, and then another context is giving it to the public, right?
[00:46:46.320 --> 00:46:51.040] Not to somebody that is actually their patient, but just like making statements in public.
[00:46:51.280 --> 00:46:53.360] But this question was specifically about the public.
[00:46:53.360 --> 00:46:54.640] Well, it covers both.
[00:46:55.040 --> 00:46:57.840] No, I'm saying both of those things should be violations.
[00:46:57.840 --> 00:47:03.840] Yeah, I mean, yeah, I think there'd be maybe a little bit more leeway for public, but not that much.
[00:47:04.160 --> 00:47:08.000] And anything egregious absolutely should be dealt with.
[00:47:08.000 --> 00:47:10.480] Okay, so obviously I agree with you.
[00:47:10.480 --> 00:47:19.520] So, there's basically a discussion going on within the medical profession as well as the legal profession about this issue.
[00:47:19.520 --> 00:47:28.640] And there's also the question of: are state medical boards enforcing any kind of standard in terms of public information?
[00:47:29.120 --> 00:47:31.560] Let me quote from you a couple of sources here.
[00:47:31.560 --> 00:47:37.240] The Federation of State Medical Boards, this is the organization of state medical boards in the U.S.
[00:47:37.560 --> 00:47:39.240] Their position is the following.
[00:47:39.240 --> 00:47:50.120] Physicians who generate and spread COVID-19 vaccine misinformation or disinformation are risking disciplinary action by state medical boards, including the suspension or revocation of their medical license.
[00:47:50.120 --> 00:47:59.000] Due to their specialized knowledge and training, licensed physicians possess a high degree of public trust and therefore have a powerful platform in society, whether they recognize it or not.
[00:47:59.000 --> 00:48:11.400] They also have an ethical and professional responsibility to practice medicine in the best interests of their patients and must share information that is factual, scientifically grounded, and consensus-driven for the betterment of public health.
[00:48:11.400 --> 00:48:19.400] Spreading inaccurate COVID-19 vaccine information contradicts that responsibility, threatens to further erode public trust in the medical profession, and puts all patients at risk.
[00:48:19.400 --> 00:48:21.720] The AMA basically agrees with them, right?
[00:48:21.720 --> 00:48:25.240] That's they give basically the same different words, same opinion.
[00:48:25.480 --> 00:48:26.360] What's the problem?
[00:48:26.520 --> 00:48:28.840] The problem is they're not doing it.
[00:48:28.840 --> 00:48:29.960] Oh, geez.
[00:48:29.960 --> 00:48:39.800] So there was a recent study, which is what triggered this whole review, that asked the question: are state medical boards disciplining physicians for spreading misinformation?
[00:48:39.800 --> 00:48:43.000] Now, this is a difficult question to get at, right?
[00:48:43.000 --> 00:48:46.040] When you think about how would you research, how would you answer this question?
[00:48:46.040 --> 00:48:50.440] They basically looked where the light was available, right?
[00:48:50.440 --> 00:48:56.840] So they got the information that they were able to get, and then we have to use that as a way to kind of infer how they're doing.
[00:48:56.840 --> 00:48:58.360] So this is a JAMA article.
[00:48:58.360 --> 00:49:12.120] They looked at 3,128 medical board disciplinary proceedings involving physicians, and they found that only 0.1% of them were for spreading misinformation to the community.
[00:49:12.120 --> 00:49:14.280] That was the least common reason.
[00:49:14.280 --> 00:49:14.920] 0.1%.
[00:49:15.760 --> 00:49:26.160] Direct patient misinformation and inappropriate advertising were tied for the next two least common reasons at 0.3%.
[00:49:26.480 --> 00:49:32.000] So these are way below all the other reasons that physicians get disciplined.
[00:49:32.000 --> 00:49:32.960] Now, of course, we don't know.
[00:49:33.280 --> 00:49:34.080] Is that representative?
[00:49:34.400 --> 00:49:39.440] We don't know how many times physicians are violating this or how many complaints there were, what the percent.
[00:49:39.680 --> 00:49:52.320] The thing is that complaints against physicians, if they are found to not be worth acting upon, right, if the physicians are essentially cleared, that information is not made public.
[00:49:52.320 --> 00:49:52.640] That is.
[00:49:53.120 --> 00:49:55.840] Yeah, and the complaints have to be filed in the first place.
[00:49:56.080 --> 00:49:59.040] Nobody is monitoring or policing physicians.
[00:49:59.200 --> 00:50:00.000] Yeah, so we don't.
[00:50:00.160 --> 00:50:10.240] There's no way to know how often physicians are spreading misinformation and how often the complaints are being made and how they're being decided.
[00:50:10.240 --> 00:50:20.800] All we know is of those complaints that are made public, it's a very teeny tiny slice of 0.1% basically for spreading public misinformation.
[00:50:20.800 --> 00:50:26.240] So the authors of the study were saying, yeah, you know, this suggests that they're just not doing their job.
[00:50:26.880 --> 00:50:29.920] They also say that we have some indirect information.
[00:50:29.920 --> 00:50:38.400] So there is an increase in the number of complaints about physicians spreading misinformation.
[00:50:38.400 --> 00:50:39.360] Okay, that's good.
[00:50:39.680 --> 00:50:44.400] So we know that while that's happening, the number of disciplinary actions is not increasing.
[00:50:44.400 --> 00:50:44.960] That's not good.
[00:50:44.960 --> 00:50:47.680] Yeah, so it doesn't appear, there does appear to be a disconnect.
[00:50:47.680 --> 00:50:50.800] But yeah, so but you know, we don't have really perfect information here.
[00:50:51.120 --> 00:51:00.000] Also, how many I'm I'm curious, how many physicians, and I know that that's a special division of doctors that you hear talking online, are actual medical doctors.
[00:51:00.440 --> 00:51:07.960] How many of the COVID misinformation is being touted by people who never, who like didn't maintain their license?
[00:51:08.520 --> 00:51:13.000] I don't know what the percentage is, but I know a lot of the ones that I am aware of are MDs.
[00:51:13.720 --> 00:51:14.360] And they still have to do that.
[00:51:14.600 --> 00:51:16.360] They have their MD and they're practicing.
[00:51:16.360 --> 00:51:16.760] Yeah.
[00:51:17.160 --> 00:51:17.880] Yeah, that's scary.
[00:51:17.880 --> 00:51:18.680] That's really scary.
[00:51:18.680 --> 00:51:19.480] Yeah, it is.
[00:51:19.480 --> 00:51:32.120] The other side of the coin, though, is the question: if we think it's reasonable for medical boards to hold physicians accountable for spreading misinformation, are they the right people to do it?
[00:51:32.120 --> 00:51:33.880] And probably not.
[00:51:33.880 --> 00:51:40.440] Yeah, the consensus seems to be that whether or not you think they should be doing it, they don't have the resources to do it.
[00:51:40.440 --> 00:51:42.280] They're basically not in a position to do it.
[00:51:42.280 --> 00:52:03.800] They don't have the infrastructure of people who can, first of all, determine if something crosses over that line to, all right, this is demonstrable, egregious misinformation, not something in the gray zone, not a matter of opinion, not something about which an ethical physician can have a minority but reasonable opinion.
[00:52:03.800 --> 00:52:10.760] This is demonstrably wrong, harming the public health, and is something that we could say is over the line.
[00:52:11.080 --> 00:52:14.440] Well, and there's enough examples.
[00:52:14.440 --> 00:52:16.600] I mean, there are famous examples like Dr.
[00:52:16.600 --> 00:52:26.200] Death in Texas and like these different, but there are enough examples of individuals who it took far too long for them to be stripped of their license.
[00:52:26.200 --> 00:52:33.320] Like they caused so much harm before enough evidence was amassed and a license was actually revoked.
[00:52:33.320 --> 00:52:38.040] I don't think people are losing their licenses left and right for small things.
[00:52:38.040 --> 00:52:41.720] Like, it has to be really egregious, and a lot of people have to be harmed for it.
[00:52:41.880 --> 00:52:46.400] Generally speaking, state boards are very, very forgiving.
[00:52:47.280 --> 00:52:56.480] Yeah, and so, and so that's already worrisome that they would then be tasked with policing this public health issue because it really is a public health issue.
[00:52:57.040 --> 00:53:02.400] So, I think the answer is that they need to reconfigure themselves to be able to do this.
[00:53:02.400 --> 00:53:09.120] And the other way to think about this is that there are other resources out there that they could be availing themselves of.
[00:53:09.600 --> 00:53:10.160] Skeptic groups.
[00:53:10.320 --> 00:53:11.600] For example, no, actually, no.
[00:53:11.600 --> 00:53:14.480] What I would say is that this is not something that we should be doing.
[00:53:15.040 --> 00:53:24.560] We should be educating the public about this and maybe pointing the finger at individuals, but we can't be the ones to be deciding whose license gets acted against.
[00:53:24.560 --> 00:53:30.560] But there are other layers of quality control within medicine, right?
[00:53:30.560 --> 00:53:32.640] It's not just state medical boards.
[00:53:32.640 --> 00:53:34.960] There's also specialty boards, right?
[00:53:34.960 --> 00:53:46.560] So, if you're an internal medicine doctor, you can get board certified by the Board of Internal Medicine, which means they can withdraw your certification if you violate their standards.
[00:53:47.040 --> 00:53:48.800] You also have to be credentialed at your hospital.
[00:53:48.960 --> 00:53:51.040] You also have to be credentialed at your hospital.
[00:53:51.120 --> 00:53:54.640] Of course, you could have your own private practice somewhere affiliated with no other institution.
[00:53:54.640 --> 00:53:56.960] You don't have to be board certified.
[00:53:56.960 --> 00:53:59.760] You can just have a license and do whatever you want.
[00:53:59.760 --> 00:54:05.120] So, then there's always two layers: there's the state medical board, and then there's the law.
[00:54:05.120 --> 00:54:06.720] You could always be sued, right?
[00:54:06.720 --> 00:54:08.480] There's the malpractice standard.
[00:54:08.480 --> 00:54:14.160] Of course, that's not the one we want to rely upon because that only kicks in after harm has already been done, right?
[00:54:14.480 --> 00:54:20.480] But is there, should there be like some sort of CDC NIH, like some sort of like public?
[00:54:20.720 --> 00:54:23.920] There's no federal-level regulation of licenses.
[00:54:23.920 --> 00:54:35.800] This would be a no, I don't mean of licenses, I mean of taking some form of action against an individual who is causing a viable risk to public health.
[00:54:36.040 --> 00:54:39.640] Yeah, so again, there's no legal infrastructure right now for this to happen at the federal level.
[00:54:39.640 --> 00:54:49.080] So if we want to existing infrastructure, what we could have is, say, for example, the specialty boards are in a great position.
[00:54:49.080 --> 00:54:55.160] They're in the best position to determine what is misinformation because this is what they do all the time.
[00:54:55.160 --> 00:54:56.200] They are the experts.
[00:54:56.200 --> 00:55:01.880] They put together panels of experts to review the scientific evidence and come up with practice guidelines.
[00:55:01.880 --> 00:55:05.480] And that could include communication guidelines as well.
[00:55:05.480 --> 00:55:09.560] And then all the state has to do is say, yeah, whatever these guys say, right?
[00:55:09.560 --> 00:55:17.400] All they have to do is we're going to utilize the recognized American board certification organizations and their practice guidelines.
[00:55:17.400 --> 00:55:18.760] And they do this all the time.
[00:55:18.760 --> 00:55:22.280] The state boards don't reinvent the wheel 50 times for every state.
[00:55:22.440 --> 00:55:25.240] They use, you know, like the AMA ethical standards.
[00:55:25.240 --> 00:55:27.400] They say, yep, we'll adopt those, right?
[00:55:27.400 --> 00:55:40.520] They will use, when you're sued for, or there's an action, a complaint that you violated the standard of care, for example, they don't determine themselves what the standard of care is.
[00:55:40.520 --> 00:55:44.760] They refer to published guidelines or experts or whatever, other people.
[00:55:44.760 --> 00:56:10.040] So it's not a stretch to say that the specialty boards need to get way more involved with this and can be a resource for the state medical boards who then also need to be acting on these much more aggressively and availing themselves of the specialty boards to determine who is violating, who's spreading misinformation and disinformation.
[00:56:10.360 --> 00:56:18.480] But then there's another layer here, a wrinkle, and I don't know if you want to get into it, Steve, but there is a wrinkle that comes down to straight up political pressure.
[00:56:18.480 --> 00:56:36.160] And when the zeitgeist of the state, whether we're talking about the directives of the state attorney general or the directives of a surgeon general, are explicitly saying, no, we promote these practices, we promote this misinformation.
[00:56:36.160 --> 00:56:42.640] Now things get really murky because there may be political pressure being leveraged onto that state medical board.
[00:56:42.640 --> 00:56:49.920] Well, there shouldn't be, and almost by definition, there really can't be any pressure by any federal person or agency.
[00:56:50.080 --> 00:56:51.360] No, I'm talking about at the state level.
[00:56:51.360 --> 00:56:52.160] Oh, yeah, at the state level.
[00:56:52.160 --> 00:56:53.600] Yeah, it's all at the state level.
[00:56:53.600 --> 00:56:54.560] But that's what I'm saying.
[00:56:54.560 --> 00:56:58.400] There are states in which this is pro-misinformation.
[00:56:59.120 --> 00:57:04.640] Is the governor going to put pressure on the professional board who licenses the professionals that they should be independent?
[00:57:04.880 --> 00:57:08.720] You see this if it happens all the time.
[00:57:08.720 --> 00:57:10.880] Yeah, it should not happen.
[00:57:11.200 --> 00:57:17.440] The state attorney or the state surgeon general or whoever it is, do they have a surgeon general of each state?
[00:57:17.440 --> 00:57:17.680] No.
[00:57:17.680 --> 00:57:18.880] Usually I don't either.
[00:57:19.520 --> 00:57:21.040] Attorney General handles it.
[00:57:21.200 --> 00:57:22.080] Yeah, the AG.
[00:57:22.080 --> 00:57:30.080] So, but I mean, I saw this when I was living in Florida, these directives coming from state officials saying don't get vaccinated.
[00:57:30.080 --> 00:57:31.280] Oh, yeah, absolutely.
[00:57:31.840 --> 00:57:37.520] But that does not tie the hands of any professional organization.
[00:57:37.520 --> 00:57:43.600] And I have been involved with professional organizations essentially fighting with the state over this.
[00:57:43.600 --> 00:57:56.800] And also, like a state medical board, you know, again, sort of relying upon professional organizations in order to act against somebody who was being protected politically, for example.
[00:57:56.800 --> 00:58:00.360] So I think they are at cross purposes, which is the way it should be.
[00:58:00.360 --> 00:58:02.360] Yeah, which is good because it avoids that conflict of the fact that it's not.
[00:57:59.840 --> 00:58:02.920] Yeah, right.
[00:58:03.000 --> 00:58:10.920] And then the national board certification boards, they are completely independent, right?
[00:58:10.920 --> 00:58:12.280] They are not even part of the government.
[00:58:12.280 --> 00:58:21.880] They are just academic kind of professional organizations whose purpose is really to just promote and certify high standards within their specialty.
[00:58:21.880 --> 00:58:26.200] Again, I think they're optimally positioned to deal with these issues.
[00:58:26.200 --> 00:58:31.080] And I think they just need to functionally connect with the state medical boards.
[00:58:31.800 --> 00:58:43.080] So, and unfortunately, they've been gun-shy, though, because they've been burned so often by both states and by individual quacks, right?
[00:58:43.080 --> 00:58:54.760] So, for example, and competing professional organizations, like the famously, like in the 80s, the chiropractors sued the AMA for practice infringement and won.
[00:58:54.760 --> 00:59:00.920] And so, the AMA has like been completely gun-shy about dealing with pseudoscience in medicine ever since then.
[00:59:00.920 --> 00:59:13.000] In Connecticut, for example, the state government went against the American Infectious Disease Society, right, on the question of chronic Lyme disease.
[00:59:13.000 --> 00:59:17.160] They sided with the quacks against the professional organization.
[00:59:17.160 --> 00:59:22.040] So, the states should absolutely not do that, but they absolutely do do that, right?
[00:59:22.040 --> 00:59:34.840] The legislature, the legislature takes on the regulatory roles or supermans the regulatory role for purely political purposes, always causing mischief, never in a good way, in my experience.
[00:59:34.840 --> 00:59:38.840] They should just basically let the professionals handle it and stay the freak out of it.
[00:59:38.840 --> 00:59:51.360] Well, and that's what I'm so afraid of: is deregulating and the increased distrust of these professional organizations of institutional knowledge as a whole.
[00:59:51.680 --> 01:00:03.840] What happens when you get to a scary place where in certain states, these types of licenses are no longer even regulated, and anybody can call themselves a doctor.
[01:00:03.840 --> 01:00:09.680] Yeah, well, if you have a complete erosion of expertise and professionalism and any kind of quality control, the game is over.
[01:00:09.680 --> 01:00:10.560] But we're not there.
[01:00:10.560 --> 01:00:11.440] We are not there yet.
[01:00:11.440 --> 01:00:13.520] We're not there, but we have holes.
[01:00:13.520 --> 01:00:14.640] Yeah, absolutely.
[01:00:14.640 --> 01:00:16.400] Yeah, and they're mainly legislative.
[01:00:16.400 --> 01:00:17.840] They are mainly legislative.
[01:00:17.840 --> 01:00:18.480] Yeah.
[01:00:18.480 --> 01:00:24.000] But that's where a lot of power lies, and that's the part that scares me because the deregulation is happening at the legislative level.
[01:00:24.000 --> 01:00:24.640] Yeah.
[01:00:24.640 --> 01:00:29.760] Yeah, and there's the healthcare freedom laws, which are all happening at the state legislative level.
[01:00:29.760 --> 01:00:32.000] That's where all the mischief is happening.
[01:00:32.000 --> 01:00:40.480] But while they still have the power to, you know, to protect the public with some level of ethical and quality control, they should do their job.
[01:00:40.480 --> 01:00:45.760] And right now they're really not doing it because I just think they're not set up for it.
[01:00:45.760 --> 01:00:50.160] You know, again, they have not adapted to this new world of misinformation, and they need to.
[01:00:50.160 --> 01:00:51.360] That's the bottom line.
[01:00:51.360 --> 01:01:02.320] 100%, because otherwise, who is going to protect vulnerable individuals who literally do not know any better and are trying to listen to smart people giving them good advice.
[01:01:02.640 --> 01:01:08.080] Yeah, you end up with the dueling experts, you know, and everyone just thinks, ah, whatever.
[01:01:08.080 --> 01:01:09.040] I could believe whatever I want.
[01:01:09.200 --> 01:01:13.040] I'll just go with my tribe, you know, because everyone has their own experts.
[01:01:13.040 --> 01:01:13.520] Exactly.
[01:01:13.520 --> 01:01:13.840] Yeah.
[01:01:13.840 --> 01:01:14.720] And they're expert.
[01:01:14.720 --> 01:01:15.760] It seems legitimate.
[01:01:15.760 --> 01:01:16.560] They're licensed.
[01:01:16.560 --> 01:01:17.280] They're practicing.
[01:01:17.280 --> 01:01:18.000] They're credentialed.
[01:01:18.000 --> 01:01:19.360] And they've never taken that away yet.
[01:01:19.600 --> 01:01:27.680] If you want the benefits of being licensed and credentialed and certified, you have to actually adhere to the standard.
[01:01:27.680 --> 01:01:28.960] You can't have it both ways.
[01:01:28.960 --> 01:01:39.640] You can't say, I'm a board-certified physician, so you can trust me, and not be liable to those very certifying entities saying, Nope, you can't say that.
[01:01:39.640 --> 01:01:40.680] That's not true.
[01:01:40.680 --> 01:01:42.360] That's misinformation.
[01:01:42.360 --> 01:01:45.800] So the whole, there is no First Amendment defense here.
[01:01:45.800 --> 01:01:48.840] There isn't, because this is not personal speech or private speech.
[01:01:48.840 --> 01:01:50.520] This is professional speech.
[01:01:50.520 --> 01:01:52.280] Nobody deserves to be licensed.
[01:01:52.280 --> 01:01:54.440] Nobody deserves to be certified.
[01:01:54.440 --> 01:01:56.040] And you're board certified.
[01:01:56.040 --> 01:02:03.000] And so, yes, you can absolutely lose your board certification if you violate whatever standard the board decides to have.
[01:02:03.000 --> 01:02:05.240] It's completely up to them, right?
[01:02:05.240 --> 01:02:09.560] As long as it's being applied fairly and not selectively, you know.
[01:02:09.560 --> 01:02:14.600] But if they say this is the standard, you have to abide by our standards in order to be board certified.
[01:02:14.600 --> 01:02:21.880] And that includes passing an exam and maintaining your continuing medical education and practicing within the standard of care, blah, blah, blah.
[01:02:21.880 --> 01:02:29.240] And for any reason they could say, nope, you no longer meet whatever we consider to be the requirements for certification, it can be removed.
[01:02:29.240 --> 01:02:46.200] So anyway, yeah, I think it's pretty clear, but the bottom line is there needs to be pressure to take a more active role in policing misinformation and disinformation coming from licensed and certified physicians.
[01:02:46.200 --> 01:02:48.920] Jay, it's who's that noisy time?
[01:02:48.920 --> 01:02:52.040] All right, guys, last week I played this noisy.
[01:03:05.640 --> 01:03:06.920] You guys have any guesses?
[01:03:06.920 --> 01:03:11.480] Yeah, shots of Ready Whip being passed around.
[01:03:11.480 --> 01:03:12.960] Everyone, open your mouth.
[01:03:15.920 --> 01:03:18.800] So, a listener named Matthew Morrison wrote in and said, Hi, Jay.
[01:03:18.800 --> 01:03:24.880] My wife, Nicole, daughter Nev, and I all agree this sounds like a fire extinguisher being used.
[01:03:24.880 --> 01:03:27.200] I think it would be louder than that if it were a fire extinguisher.
[01:03:27.280 --> 01:03:28.560] Those things can be loud.
[01:03:28.560 --> 01:03:31.280] Yeah, when you're right on there, sure, they're crazy loud.
[01:03:31.280 --> 01:03:34.640] It's not a fire extinguisher, but that's a good guess.
[01:03:34.640 --> 01:03:37.600] A listener named Joel Harding said, Hi, Skeptics Guide.
[01:03:37.600 --> 01:03:43.680] My six-year-old son Reed guesses that this who's that noisy for this week is an axolotl.
[01:03:44.000 --> 01:03:47.040] And I, you know, I have, I didn't know that they made noise.
[01:03:47.040 --> 01:03:47.840] Do they make noise?
[01:03:47.840 --> 01:03:48.480] I mean, do they?
[01:03:48.560 --> 01:03:50.160] I don't think they make noise like that.
[01:03:50.160 --> 01:03:51.360] They're like teeny tiny.
[01:03:51.360 --> 01:03:57.040] I mean, if they did make a noise, I suppose it would be something like that with them like screaming, my lungs are on the outside.
[01:03:57.920 --> 01:04:00.880] You know, it's a pretty scary thing that they got going on there.
[01:04:00.880 --> 01:04:06.080] Another listener named Adam Russell wrote in and said, This week's noisy is the mating call of a Minock.
[01:04:08.240 --> 01:04:09.120] Good job, right?
[01:04:09.120 --> 01:04:10.560] I don't know if everybody got that.
[01:04:10.560 --> 01:04:12.160] Anyone know what a Minock is?
[01:04:12.160 --> 01:04:12.480] I do.
[01:04:12.480 --> 01:04:13.600] Yeah, from Star Wars.
[01:04:13.920 --> 01:04:14.640] Star Wars.
[01:04:14.640 --> 01:04:16.560] Specifically, like the true on wirelines.
[01:04:16.960 --> 01:04:17.760] Yeah, the cables, you know.
[01:04:18.000 --> 01:04:20.080] Specifically, the Empire Strikes Back.
[01:04:20.080 --> 01:04:21.680] Another listener named Nick wrote in.
[01:04:21.680 --> 01:04:23.360] He said, Hello, second time guesser.
[01:04:23.360 --> 01:04:28.160] My guest this week is a brush-tail possum letting you know you're in its territory.
[01:04:28.160 --> 01:04:33.680] I've recently had two of these cute-looking monstrous sound creatures move in under my house.
[01:04:33.680 --> 01:04:39.120] They are not the brush-tail possum, and I wouldn't be happy if I met one of those.
[01:04:39.120 --> 01:04:40.560] We have a winner from last week, guys.
[01:04:40.560 --> 01:04:41.520] Are you prepared?
[01:04:41.760 --> 01:04:43.200] You're all brimming with excitement.
[01:04:43.200 --> 01:04:43.920] I could tell.
[01:04:43.920 --> 01:04:44.560] I'm not ready.
[01:04:44.960 --> 01:04:45.360] Now I'm ready.
[01:04:45.440 --> 01:04:50.560] Jesse Babonis wrote, I think this noisy is the call of a kiwi bird.
[01:04:51.200 --> 01:04:53.200] That is the sound of a kiwi bird.
[01:04:53.200 --> 01:04:54.160] Listen again.
[01:05:01.080 --> 01:05:02.440] Not a happy creature.
[01:05:02.440 --> 01:05:07.560] Steve, I have to ask you: did you have any idea that it was a bird?
[01:05:08.120 --> 01:05:09.960] That was my guess.
[01:05:09.960 --> 01:05:10.520] You did.
[01:05:11.080 --> 01:05:13.320] But you had no idea what the actual bird was.
[01:05:13.320 --> 01:05:15.160] Yeah, I haven't heard that one before.
[01:05:15.160 --> 01:05:16.120] Kiwi, huh?
[01:05:16.120 --> 01:05:16.840] Kiwi, huh?
[01:05:17.000 --> 01:05:21.000] That little thing that we saw at night in New Zealand?
[01:05:21.000 --> 01:05:23.400] Well, yeah, we actually saw it during the day, didn't we?
[01:05:23.400 --> 01:05:23.720] Did we?
[01:05:23.720 --> 01:05:24.600] I thought it was.
[01:05:24.680 --> 01:05:28.600] Well, it was day to us, but it was in a night every enclosure.
[01:05:28.600 --> 01:05:32.440] And the enclosure was flipped so that its night was our day so that we could see it, right?
[01:05:32.440 --> 01:05:32.920] Remember that?
[01:05:32.920 --> 01:05:33.800] Oh, it was like a trick.
[01:05:33.960 --> 01:05:35.320] Yeah, oh, yeah, yeah, yeah.
[01:05:35.320 --> 01:05:37.160] Yeah, well, apparently they make that noise.
[01:05:37.160 --> 01:05:39.960] So not the most inviting noise, huh?
[01:05:40.280 --> 01:05:41.080] No.
[01:05:41.720 --> 01:05:43.400] Sounds terrifying.
[01:05:43.400 --> 01:05:45.640] All right, guys, I have a new noisy for you this week.
[01:05:45.640 --> 01:05:48.040] I'm going to play it for you right now.
[01:06:06.840 --> 01:06:07.480] Yes.
[01:06:07.480 --> 01:06:08.600] It's not a bird.
[01:06:08.600 --> 01:06:10.280] That is most definitely not our bird.
[01:06:10.280 --> 01:06:13.160] I will give everybody that unbelievable hint.
[01:06:13.960 --> 01:06:20.200] So, guys, if you think you know what this week's noisy is, or if you have a cool noisy that you heard, give me an email.
[01:06:20.200 --> 01:06:21.480] Send me an email.
[01:06:21.480 --> 01:06:27.000] The only place I want you to send me the email is wtn at the skepticsguide.org.
[01:06:27.000 --> 01:06:28.200] A few quick things, Steve.
[01:06:28.200 --> 01:06:35.640] So you could always become a patron of the SGU in these times where the world needs skepticism.
[01:06:35.640 --> 01:06:49.600] I believe I wouldn't say now more than ever, but now is definitely an inflection point where we really feel the need to help educate people on how to think and how to be reasonable and how to use reason to come to your conclusions.
[01:06:44.600 --> 01:06:50.800] That's why we're here, guys.
[01:06:51.040 --> 01:06:57.680] So if you would like to support the work that we do, go to patreon.com forward slash skepticsguide, all one word.
[01:06:57.680 --> 01:06:59.200] You can also join our mailing list.
[01:06:59.200 --> 01:07:05.680] Every week we send out an email, it gives you a summary of all the things that the SGU has done in the last week.
[01:07:05.680 --> 01:07:11.360] And you could sign up going to theskepticsguide.org and on our homepage, you'll see a button to join there.
[01:07:11.360 --> 01:07:15.360] And we have three shows coming up: the DC, Washington, D.C.
[01:07:15.520 --> 01:07:21.280] private show, which is a live recording of the SGU, and also we do an extra hour of fun and audience interaction.
[01:07:21.280 --> 01:07:22.720] It's different every time.
[01:07:22.720 --> 01:07:25.040] If you haven't been to one, go to one.
[01:07:25.040 --> 01:07:28.800] There's only a certain number of these that we will ever do left, correct, guys?
[01:07:29.440 --> 01:07:32.160] It's an unknown number, but it is limited by the same.
[01:07:32.400 --> 01:07:33.920] That is a correct.
[01:07:34.000 --> 01:07:35.920] That is a correct statement.
[01:07:36.240 --> 01:07:43.760] We also, on the same day, this is happening all on December 7th of this year, we have a skeptical extravaganza.
[01:07:43.760 --> 01:07:45.440] This is our stage show.
[01:07:45.600 --> 01:07:50.320] This is a live performance that includes all of the SGU and George Robb.
[01:07:50.320 --> 01:07:52.560] Ian is manning the tech table.
[01:07:52.560 --> 01:07:57.200] And this is a show that we have been perfecting over the last 10 years.
[01:07:57.200 --> 01:07:58.560] It's a ton of fun.
[01:07:58.560 --> 01:08:02.240] I really do hope that you decide to come join us in DC.
[01:08:02.720 --> 01:08:06.880] We also will be scheduling more of these around the country in 2025.
[01:08:06.960 --> 01:08:09.760] We are happening, you know, the talks are happening right now.
[01:08:09.760 --> 01:08:11.520] So this is a great show to grab.
[01:08:11.520 --> 01:08:15.520] You can also go to the skepticsguide.org to buy tickets for that.
[01:08:15.520 --> 01:08:20.080] And then, and the thing that I'm most excited for is Notacon 2025.
[01:08:20.080 --> 01:08:21.640] This is our conference.
[01:08:21.640 --> 01:08:24.640] We started last year in 23.
[01:08:24.640 --> 01:08:27.520] And it is an SGU celebration.
[01:08:27.520 --> 01:08:29.360] That's the best way I could describe this.
[01:08:29.520 --> 01:08:37.880] The conference is largely around socializing, but we do, of course, provide an incredible amount of content during the two days that the conference is going on.
[01:08:37.880 --> 01:08:42.040] And there is going to be a lot of fun new things that we do this year.
[01:08:42.040 --> 01:08:46.440] If you're interested, you can go to nataconcon.com.
[01:08:46.440 --> 01:08:47.800] Evan, what was that URL?
[01:08:48.280 --> 01:08:51.880] NataconCon.com.
[01:08:51.880 --> 01:08:52.600] Correct.
[01:08:52.600 --> 01:08:53.240] Yes.
[01:08:53.240 --> 01:08:54.280] Please do join us.
[01:08:54.280 --> 01:08:55.400] It's an awesome time.
[01:08:55.400 --> 01:08:57.560] It's a great place to meet new people.
[01:08:57.560 --> 01:09:00.680] We had such a blast last year that people are still talking about it.
[01:09:00.680 --> 01:09:02.040] I know I am.
[01:09:02.040 --> 01:09:05.800] And Sharon made us a ton of Biscotti, right, guys?
[01:09:05.800 --> 01:09:06.600] Oh, boy.
[01:09:06.600 --> 01:09:07.080] Oh, boy.
[01:09:07.080 --> 01:09:08.120] That was just so good.
[01:09:08.120 --> 01:09:08.600] Yeah.
[01:09:08.600 --> 01:09:10.120] Yeah, but we ate it all.
[01:09:11.080 --> 01:09:11.960] That's why we're doing it again.
[01:09:12.040 --> 01:09:13.000] It's the only way to get it.
[01:09:13.720 --> 01:09:15.080] To do Natakon.
[01:09:15.080 --> 01:09:15.960] BiscottiCon.
[01:09:16.920 --> 01:09:18.200] That's a wrap, Steve.
[01:09:18.200 --> 01:09:19.320] All right, thanks, Jay.
[01:09:19.320 --> 01:09:20.280] One quick email.
[01:09:20.280 --> 01:09:26.840] We had a number of people write us to follow up with our mid-Atlantic accent discussion.
[01:09:26.840 --> 01:09:27.320] Cool.
[01:09:27.800 --> 01:09:29.000] I file this under.
[01:09:29.000 --> 01:09:32.120] It's always more complicated than you think.
[01:09:32.120 --> 01:09:33.400] It's a big file.
[01:09:33.400 --> 01:09:40.600] So mostly they were referring to a video by a phonetics expert, Dr.
[01:09:40.600 --> 01:09:48.840] Jeff Lindsay, who argues that the notion that the mid-Atlantic accent is fake is itself a myth.
[01:09:48.840 --> 01:09:57.160] And then he sets about to debunk that myth, citing a lot of evidence about the origins of the accent.
[01:09:57.160 --> 01:10:00.760] I got to say, I'm not convinced by his video.
[01:10:00.760 --> 01:10:16.480] I mean, clearly he understands a lot more about this than I do, but I looked up, just did, tried to research it as best as I could, and I could certainly find other experts that seemed to be as legitimate as this guy saying something extremely different.
[01:10:14.840 --> 01:10:21.680] So, at the very least, I think we need to say that this is a controversy or it's contested.
[01:10:21.840 --> 01:10:27.760] I don't know if this guy, this guy has a YouTube channel, right, where he has a lot of videos about language.
[01:10:27.760 --> 01:10:36.320] That doesn't necessarily mean that he's the definitive authority on everything just because he has a YouTube channel, but he certainly is an expert and he has a reasoned argument to make.
[01:10:36.320 --> 01:10:43.840] He's basically saying that the quote-unquote mid-Atlantic accent is a Northeastern regional accent.
[01:10:43.840 --> 01:10:52.560] And it's kind of what, you know, by the early 20th century, what was settled on as sort of a generic American accent.
[01:10:52.560 --> 01:11:02.800] And he cites evidence from like the aspects of the mid-Atlantic accent that exist in the regional accents of the Northeast.
[01:11:02.960 --> 01:11:04.640] Those features all pre-existed.
[01:11:04.640 --> 01:11:06.720] So he said, how could that be fake?
[01:11:07.120 --> 01:11:16.160] However, I also find references that say that not only was the mid-Atlantic accent manufactured, they named the guy who did it.
[01:11:16.160 --> 01:11:20.080] They said this Australian phonetician, William Henry Till.
[01:11:20.320 --> 01:11:21.040] Phonetician.
[01:11:21.040 --> 01:11:36.080] Yeah, phonetician, created the mid-Atlantic accent basically for the purpose of saying there should be an English, like a one accent that any educated person who speaks English anywhere in the world should speak, right?
[01:11:36.080 --> 01:11:39.040] And that's the quote-unquote mid-Atlantic accent.
[01:11:39.040 --> 01:11:56.320] So I'm not sure how to square the apparent history here that is being cited, you know, authoritatively by some experts with this guy's analysis who's saying that, no, it was a Northeastern American regional accent.
[01:11:56.320 --> 01:11:57.040] So I don't know.
[01:11:57.400 --> 01:11:58.960] It seems, it's complicated.
[01:11:58.960 --> 01:12:10.600] I do, there are some points which I think, some of which we made ourselves, and I've actually, interestingly, I first learned that the Mid-Atlantic accent was just a generic American accent.
[01:12:10.680 --> 01:12:13.320] It was picked because it was a generic American accent.
[01:12:13.320 --> 01:12:16.280] It was sort of the most neutral accent.
[01:12:16.280 --> 01:12:22.600] I only later read that it was sort of an invented or a crafted accent.
[01:12:22.600 --> 01:12:30.840] And now this guy's saying, no, it actually is just sort of the regional accent that is the most neutral, and that's why it was kind of preferred by actors.
[01:12:30.840 --> 01:12:39.480] Also, a lot of Hollywood actors were British, so that's why they might sound like half American, half British, et cetera.
[01:12:39.480 --> 01:12:51.080] I also have, you know, researching this, and I've read before that a lot of the quote-unquote regional accents laid on top of that is a socioeconomic accent.
[01:12:51.080 --> 01:13:05.960] So if you, what we might be hearing is just a northeastern upper society accent, which sounds more similar across regions than do the lower socioeconomic class accents, right?
[01:13:05.960 --> 01:13:19.720] So in other words, an upper class person living in Boston, New York, and London would sound a lot more similar to each other than a lower socioeconomic class person from those three cities, right?
[01:13:19.960 --> 01:13:33.400] Yeah, it's also interesting to think too, Steve, about like when did, like, when we think of a neutral broadcast accent today, kind of a generic American accent today, when did that actually start?
[01:13:33.720 --> 01:13:37.280] Yeah, I mean, because in the 1600s and 1700s, we were all British.
[01:13:37.160 --> 01:13:37.600] That was right.
[01:13:37.880 --> 01:13:38.520] You know what I mean?
[01:13:38.520 --> 01:13:41.800] Like that's how people talk about, or French, yeah, it depends on where they were.
[01:13:41.800 --> 01:13:46.720] But the English accent was probably a British accent.
[01:13:44.760 --> 01:13:49.280] And that would have been regional as well.
[01:13:49.600 --> 01:13:51.280] What do you mean by English?
[01:13:51.280 --> 01:13:52.640] I mean from England.
[01:13:52.640 --> 01:13:52.960] Yeah.
[01:13:53.680 --> 01:13:54.960] Yeah, of course it was a British accent.
[01:13:55.360 --> 01:13:56.160] That's what you're saying.
[01:13:56.480 --> 01:14:00.240] I'm saying that early Americans, I'm not talking about indigenous Americans.
[01:14:00.240 --> 01:14:00.640] Yeah.
[01:14:00.640 --> 01:14:01.760] They were settlers.
[01:14:01.760 --> 01:14:02.160] Oh, yeah.
[01:14:02.160 --> 01:14:02.320] So.
[01:14:02.560 --> 01:14:03.040] Colonial.
[01:14:03.040 --> 01:14:07.360] Well, we know that, for example, the New York accent is a Dutch accent.
[01:14:07.360 --> 01:14:07.600] Right.
[01:14:07.600 --> 01:14:07.920] So I'm saying that.
[01:14:08.000 --> 01:14:13.040] The Southern accent is actually closer to a British accent than the Northern accent.
[01:14:13.040 --> 01:14:15.520] And the Appalachian accent is very Scottish.
[01:14:15.520 --> 01:14:19.520] Like, at what point, though, did it become definitively American?
[01:14:19.760 --> 01:14:20.960] Well, that's what this guy is saying.
[01:14:21.840 --> 01:14:29.600] Around this time is when it sort of coalesced into a distinctive American accent, and this was just the Northeastern version of that.
[01:14:29.920 --> 01:14:31.520] That feels very late.
[01:14:31.520 --> 01:14:32.480] Yeah, I don't know.
[01:14:32.480 --> 01:14:35.200] Because it feels like we skipped the whole 1800s.
[01:14:35.200 --> 01:14:35.760] I don't know.
[01:14:36.320 --> 01:14:36.720] Yeah.
[01:14:36.880 --> 01:14:41.440] So I have not come to a comfortable place where I think I've wrapped my head around this.
[01:14:41.440 --> 01:14:42.640] I think it's complicated.
[01:14:42.640 --> 01:14:44.640] I think there are a lot of moving parts here.
[01:14:45.040 --> 01:14:52.880] It may not be so simple as this was a fake accent that was taught in finishing schools and affected by actors and actresses.
[01:14:52.880 --> 01:14:59.200] I think there's multiple things going on, but I'm not convinced that it's entirely a natural regional accent either.
[01:14:59.200 --> 01:15:05.360] I think there is an element of education and socioeconomic status in here as well.
[01:15:05.680 --> 01:15:06.720] And it may be both.
[01:15:06.720 --> 01:15:12.000] It may be that some people naturally spoke that way and other people go, I like the way that person is talking.
[01:15:12.000 --> 01:15:13.920] I'm going to emulate that.
[01:15:13.920 --> 01:15:14.480] Totally.
[01:15:14.480 --> 01:15:23.120] And I know, you know, that if you go to like radio broadcast school right now, they will teach you how to talk in a very specific way.
[01:15:23.640 --> 01:15:27.440] There is there is a broadcast, quote-unquote, accent today.
[01:15:27.440 --> 01:15:30.600] We just take it for granted because it's normal for us.
[01:15:31.000 --> 01:15:32.440] Yeah, it's the neutral American.
[01:15:32.760 --> 01:15:36.040] And it's what like I have friends who are actors, right?
[01:15:29.840 --> 01:15:37.240] But are from other countries.
[01:15:37.480 --> 01:15:43.080] So they go to dialect coaches and they learn how to speak neutral American, like broadcast.
[01:15:43.080 --> 01:15:45.320] They also learn how to speak in different accents.
[01:15:45.320 --> 01:15:45.480] Right.
[01:15:45.640 --> 01:15:46.680] Whale pilgrim.
[01:15:46.680 --> 01:15:53.400] So that neutral American accent is not fake, but it's not natural either.
[01:15:53.400 --> 01:15:58.520] It is affected, even though it is based upon regional accents.
[01:15:58.520 --> 01:15:59.720] So I think it's a complicated.
[01:15:59.800 --> 01:16:02.040] Yeah, that's an interesting way to even look at it, Steve.
[01:16:02.040 --> 01:16:06.280] Like I speak with more of a neutral broadcast accent.
[01:16:06.280 --> 01:16:11.080] I have never had a southern accent, but my entire family does.
[01:16:11.080 --> 01:16:11.560] Right.
[01:16:11.880 --> 01:16:13.800] So like you were adopted.
[01:16:13.800 --> 01:16:15.000] But did I fake that?
[01:16:15.000 --> 01:16:17.400] Did I choose that at a certain point in my life?
[01:16:17.400 --> 01:16:19.320] Did I unconsciously choose that?
[01:16:19.320 --> 01:16:21.160] Like it's very interesting.
[01:16:21.480 --> 01:16:24.040] Some people have thick accents and some people don't.
[01:16:24.120 --> 01:16:24.520] Could be.
[01:16:24.520 --> 01:16:26.440] Your peers may have influenced it.
[01:16:26.440 --> 01:16:28.120] I'm sure a lot of things influenced it.
[01:16:28.280 --> 01:16:29.880] I watched television growing up.
[01:16:29.880 --> 01:16:30.360] I don't know.
[01:16:30.680 --> 01:16:31.160] Well, it's funny.
[01:16:31.160 --> 01:16:37.960] I remember I had a conversation with somebody from the South, and we were talking about the accents on television.
[01:16:37.960 --> 01:16:43.880] And it was their impression that people on TV spoke in a southern accent.
[01:16:43.880 --> 01:16:56.120] While it was our impression, coming from Connecticut, that they spoke in a northern accent that the southern accent is the exception, and they thought the northern accent was the exception.
[01:16:56.920 --> 01:17:00.520] And I think that both of those are really obvious.
[01:17:00.520 --> 01:17:01.240] Yeah, right.
[01:17:02.200 --> 01:17:05.400] I hear your Kinnedicushian accents all the time on the show.
[01:17:05.400 --> 01:17:07.080] But my parents, I'm from Texas.
[01:17:07.080 --> 01:17:08.680] My parents sound southern.
[01:17:09.000 --> 01:17:10.680] I definitely hear their accents as well.
[01:17:10.680 --> 01:17:15.760] But it's funny, everyone thought their own accent was the rule, and the other ones were the exception.
[01:17:15.760 --> 01:17:16.240] Yeah.
[01:17:16.240 --> 01:17:20.000] You know, it's like, oh, yeah, that show, yeah, that's because that like Barney Miller, they're in New York.
[01:17:14.920 --> 01:17:21.520] So of course they're speaking with a New York accent.
[01:17:21.600 --> 01:17:23.200] That's not the TV accent, though.
[01:17:23.200 --> 01:17:25.200] The TV accent is my accent.
[01:17:25.760 --> 01:17:27.040] That's what everyone perceived.
[01:17:27.040 --> 01:17:27.840] That's funny.
[01:17:27.840 --> 01:17:28.640] Hey, Steve.
[01:17:28.640 --> 01:17:29.200] Yeah.
[01:17:29.520 --> 01:17:32.080] I forgot to mention something, and this one's pretty cool.
[01:17:32.080 --> 01:17:38.560] So we just turned on on our Patreon the ability to gift somebody a membership.
[01:17:38.560 --> 01:17:39.280] Whoa, neat.
[01:17:39.280 --> 01:17:39.840] Oh, man.
[01:17:40.000 --> 01:17:43.920] I thought this would be a cool thing to do for the holiday season.
[01:17:43.920 --> 01:17:45.440] So you just go to patreon.com.
[01:17:45.520 --> 01:17:47.680] Just buy them our books and give them a membership.
[01:17:48.160 --> 01:17:49.680] No accent required.
[01:17:49.680 --> 01:17:56.400] So all you got to do is go to patreon.com forward slash skepticsguide forward slash gift.
[01:17:56.400 --> 01:17:59.360] And that link will be on the SGU homepage as well.
[01:17:59.360 --> 01:18:01.280] And what you could do is you could do two things.
[01:18:01.280 --> 01:18:05.600] One, you could give it to anybody you want as a gift in any parameters that you want.
[01:18:05.600 --> 01:18:07.920] And it could be a cool Christmas present or whatever.
[01:18:07.920 --> 01:18:15.600] The other thing you could do is you could donate a gift membership to someone in the skeptical community that we will find.
[01:18:15.600 --> 01:18:17.920] You know, Ian and I are working on this right now.
[01:18:18.400 --> 01:18:21.440] It will be very easy to locate people that want a free membership.
[01:18:21.440 --> 01:18:26.560] And then they'll know who you are if you want them to as the person giving the gift.
[01:18:26.560 --> 01:18:27.680] So check it out.
[01:18:27.680 --> 01:18:31.440] I think it's a really cool thing to do and it also can support the SGU.
[01:18:31.440 --> 01:18:31.680] All right.
[01:18:31.680 --> 01:18:32.640] Thanks, Jay.
[01:18:32.640 --> 01:18:36.080] Guys, we have a great interview coming up with Michael Mann.
[01:18:36.080 --> 01:18:37.840] So let's go to that interview now.
[01:18:48.000 --> 01:18:50.080] We are joined now by Michael Mann.
[01:18:50.080 --> 01:18:52.160] Michael, welcome back to The Skeptics Guide.
[01:18:52.320 --> 01:18:52.800] Thanks, Steve.
[01:18:52.800 --> 01:18:53.440] It's great to be with you.
[01:18:53.440 --> 01:18:54.960] Yeah, always a pleasure to have you on the show.
[01:18:54.960 --> 01:19:08.120] So, you are obviously a famous climate scientist, and that's had both good and negative aspects in that you've been targeted because you have spoken publicly what the science says about climate change.
[01:19:08.120 --> 01:19:12.200] So, since we've talked to you last time, it's been a few years, actually, I think, since we spoke last time.
[01:19:12.520 --> 01:19:17.080] How do you think things are going in terms of like hearts and minds, climate change, et cetera?
[01:19:17.480 --> 01:19:18.600] It's a great question.
[01:19:18.920 --> 01:19:28.120] This was sort of the topic of my penultimate book, not the most recent one, our fragile moment, but the new climate war back in 2020.
[01:19:28.760 --> 01:19:37.960] Sort of, there's this evolution away from denialism because it's difficult to deny something that people can see with their own two eyes.
[01:19:38.360 --> 01:19:43.160] Literally, being told to not believe what you're seeing with your eyes, what you're hearing with your ears.
[01:19:43.160 --> 01:19:54.520] And so, I think polluters, other bad actors sort of promoting the fossil fuel agenda, if you will, recognize that denialism doesn't really cut it anymore.
[01:19:54.520 --> 01:19:57.160] So, they've turned to other tactics.
[01:19:57.160 --> 01:20:00.440] And among them, ironically, is doomism.
[01:20:01.080 --> 01:20:04.520] So, I mean, and I mentioned this in my talk today.
[01:20:05.160 --> 01:20:17.240] That's in some ways a greater threat today than denialism because you have people who would otherwise sort of be mobilizing, would be on the front lines demanding action, becoming convinced it's too late to do anything.
[01:20:17.240 --> 01:20:19.880] And it potentially leads them down this path of disengagement.
[01:20:19.880 --> 01:20:24.040] And you've got some bad actors who are sort of feeding the flames of doomism.
[01:20:24.200 --> 01:20:26.120] It's a really diabolical.
[01:20:26.440 --> 01:20:30.280] It's absolutely diabolical and Machiavellian.
[01:20:32.120 --> 01:20:33.800] And that's what we face today.
[01:20:33.800 --> 01:20:36.360] Yeah, and it's incredibly effective.
[01:20:36.360 --> 01:20:42.960] And in fact, even like among ourselves, we've had to be vigilant about not falling for the doomism.
[01:20:42.760 --> 01:20:49.760] Because just to put it, my perspective on this is that, and this is the same thing with anti-vaxxers or whatever.
[01:20:50.560 --> 01:20:56.720] The big tell is that they're a denialist, is that the answer is always do nothing.
[01:20:57.200 --> 01:21:03.520] Or it's always the negative, or it's always the vaccines, or it's like with climate change, the answer is always, well, we just shouldn't do anything.
[01:21:03.520 --> 01:21:06.160] And then reverse engineer what the argument is.
[01:21:06.320 --> 01:21:07.600] Yes, reverse engineer.
[01:21:07.600 --> 01:21:08.240] And it's different.
[01:21:08.240 --> 01:21:10.400] The argument changes, but the conclusion is always the same.
[01:21:10.400 --> 01:21:12.160] So at first it was like, it's not happening.
[01:21:12.720 --> 01:21:14.720] Then it's like, well, it's natural cycles.
[01:21:15.440 --> 01:21:18.640] And now it's like, well, there's nothing we can do about it anyway, so why bother?
[01:21:18.880 --> 01:21:22.960] And unfortunately, we've kind of been playing into that.
[01:21:23.520 --> 01:21:24.480] The people were like the...
[01:21:24.560 --> 01:21:24.960] That's right.
[01:21:24.960 --> 01:21:26.240] Yeah, we're saying, hey, we've got to do something.
[01:21:26.240 --> 01:21:27.120] It's going to get a bit too late.
[01:21:27.120 --> 01:21:27.680] It's going to be bad.
[01:21:27.680 --> 01:21:28.160] Blah, blah, blah.
[01:21:28.160 --> 01:21:29.040] Now they're saying, yeah, you're right.
[01:21:29.040 --> 01:21:30.160] It's like, in fact, it's too late.
[01:21:30.160 --> 01:21:31.600] So why bother doing anything?
[01:21:31.600 --> 01:21:35.760] It's like, shit, we've inadvertently been playing into that narrative.
[01:21:35.760 --> 01:21:39.360] And as you say, it undercuts the very people who are probably the most activists.
[01:21:39.840 --> 01:21:43.040] Yeah, it is truly pernicious.
[01:21:43.040 --> 01:22:06.960] And the good news, I suppose, is in a sense, it's an evolution towards the stages, evolving in terms of the stages of denialism or delay and towards something that we can hopefully, the point I make in terms of our messaging on climate, it's good that there's a sense of urgency.
[01:22:07.200 --> 01:22:09.200] We can make use of that.
[01:22:09.200 --> 01:22:11.440] People clearly understand the urgency.
[01:22:11.440 --> 01:22:13.920] It's the agency that we have to convince them of.
[01:22:13.920 --> 01:22:15.520] But we're sort of halfway there.
[01:22:16.240 --> 01:22:21.120] And we need to sort of deprogram them from the misinformation that they've been fed.
[01:22:21.280 --> 01:22:33.960] And that's what, again, is so sort of pernicious here is the way that misinformation has been used and weaponized to feed this sort of agenda of despair and doomism.
[01:22:34.280 --> 01:22:42.520] And, you know, so it's a matter of telling people: look, the science action, look what the models predicted and look what the observations show.
[01:22:42.520 --> 01:22:51.160] The models done a really good job, and those same models tell us that we can limit warming below catastrophic levels.
[01:22:51.160 --> 01:22:53.880] The only obstacle is politics, right?
[01:22:54.040 --> 01:22:54.520] Action.
[01:22:54.520 --> 01:22:55.880] It's not physics.
[01:22:55.880 --> 01:22:56.200] Right.
[01:22:56.200 --> 01:22:57.800] The physics doesn't tell us we can't do it.
[01:22:57.800 --> 01:22:58.440] The technology doesn't.
[01:22:58.600 --> 01:22:59.720] Right, we have the technology too.
[01:22:59.720 --> 01:23:00.600] I make that point too.
[01:23:00.600 --> 01:23:03.560] We actually don't need any technology that we don't already have.
[01:23:03.560 --> 01:23:03.960] That's right.
[01:23:03.960 --> 01:23:05.000] And it's only going to get better.
[01:23:05.800 --> 01:23:09.960] But even if we flatlined our technology, we could still solve it, and it's going to get better.
[01:23:10.200 --> 01:23:10.840] Yeah, exactly.
[01:23:10.840 --> 01:23:17.560] The economies of scale, there's learning by doing the new efficiencies that arise when you start deploying that new technology.
[01:23:17.560 --> 01:23:22.440] So yeah, we will do better than sort of the baseline scenario.
[01:23:22.440 --> 01:23:29.880] But even the baseline scenario of a shift towards these new energy options will get us there.
[01:23:29.880 --> 01:23:41.320] But how do you, maybe convince is the wrong word, how do you have a conversation with individuals who are despairing and are despairing for legitimate reasons, right?
[01:23:41.320 --> 01:23:49.800] Because they look at the political will and they look at every global summit and we're constantly falling short of these arbitrary goals.
[01:23:49.800 --> 01:23:53.080] And they go, like, yeah, we might have the technology, we might have whatever.
[01:23:53.080 --> 01:23:57.000] We're not going to get there because they don't want to enough, and I'm just a person and I can't control this.
[01:23:57.000 --> 01:23:57.880] Yeah, no, absolutely.
[01:23:57.880 --> 01:24:00.760] And this is often the first point I make.
[01:24:00.760 --> 01:24:06.600] We have to sort of parse out the different sort of sources of despair and doomism.
[01:24:06.600 --> 01:24:12.520] Because the doomism, based on the idea that we've triggered runaway warming, and I mean, that's all just nonsense.
[01:24:12.520 --> 01:24:14.680] And we've got to debunk that nonsense.
[01:24:15.200 --> 01:24:26.400] The argument that, well, look at how fraught our politics are, look at the election that you know we're in right now, that's much more difficult to combat because it's understandable.
[01:24:26.400 --> 01:24:40.960] Young folks, especially I teach at the University of Pennsylvania, I interact with Gen Z folks every day, and I understand that, and I understand where it comes from, the cynicism that we can ever rise to the challenge.
[01:24:41.280 --> 01:24:59.120] What I would say is that I remain sort of a stubborn optimist, is the expression I would use, in that I see in those same students a level of fortitude and just sort of Gen Zers.
[01:24:59.120 --> 01:25:03.600] To me, they get it in a way that previous generations didn't.
[01:25:03.600 --> 01:25:07.600] They understand that it's their future that's on the line.
[01:25:07.600 --> 01:25:09.120] They are very savvy.
[01:25:09.120 --> 01:25:20.080] They're very, when it comes to communication and me, because they grew up in this sort of social media world and they have tools and abilities that my generation didn't have, and they're using those tools to make a real difference.
[01:25:20.080 --> 01:25:28.320] So when I see the students that I teach, every day they sort of inspire me and they convince me that we can do it.
[01:25:29.520 --> 01:25:31.520] I've described myself as a pragmatic optimist.
[01:25:32.080 --> 01:25:32.480] Same way.
[01:25:32.480 --> 01:25:35.040] I'm an optimist because that's the only thing that's going to work.
[01:25:35.920 --> 01:25:39.280] Because being a doomist is self-defeating.
[01:25:39.920 --> 01:25:42.800] And the same thing with political, it's, oh, there's nothing we could do about it.
[01:25:42.800 --> 01:25:44.000] That's a self-fulfilling prophecy.
[01:25:44.240 --> 01:25:45.200] That's exactly right.
[01:25:45.200 --> 01:25:47.200] And that's the struggle here, right?
[01:25:47.200 --> 01:25:55.040] Is that as scientists and as critical thinkers, we understand that we've got to be truthful to what the science says.
[01:25:55.520 --> 01:26:08.840] We cannot violate that oath that we have to be truthful to the science, which means that if the science really did tell us it was too late to prevent runaway warming, we'd have to come forward and say so.
[01:26:09.160 --> 01:26:18.840] I mean, and so Steve Schneider referred to this as sort of an ethical double bind, that we have to be truthful to the science and we have to be effective at the same time.
[01:26:18.840 --> 01:26:21.240] It's looking forward as we ever had to do.
[01:26:21.800 --> 01:26:23.560] I often see a parallel.
[01:26:23.560 --> 01:26:35.320] It's interesting in my work working with cancer patients, I see a big parallel, and I learn a lot from my patients that there's often a hope for the best, but prepare for the worst mentality.
[01:26:35.320 --> 01:26:46.600] And understanding that there are some things that are written in stone, and there are other things where, when it comes to hope, I think this is the biggest thing that has kept me from diving into cynicism.
[01:26:46.600 --> 01:26:48.200] Hope is a moving target.
[01:26:48.200 --> 01:26:52.360] You can adjust and adapt your hope based on the available evidence.
[01:26:52.600 --> 01:26:54.920] And it doesn't have to be all the way over here.
[01:26:55.160 --> 01:26:57.240] You can have hope for something over here.
[01:26:57.240 --> 01:26:58.040] That's achievable.
[01:26:58.280 --> 01:27:04.040] Yeah, when you're constantly sort of allowing that, it can keep you looking at the next step and the next way.
[01:27:04.520 --> 01:27:06.680] We're sort of wired that way, aren't we?
[01:27:06.680 --> 01:27:15.960] Yeah, and along those lines, because I was going to ask you this question too, because one of the tactics that we also take is: yes, there are tipping points, but climate change also is a spectrum.
[01:27:15.960 --> 01:27:20.120] It's like, and yeah, even if you think it's going to get bad, it could get worse.
[01:27:20.440 --> 01:27:23.800] So at this point, our goal is always to make it less bad.
[01:27:24.760 --> 01:27:25.560] No, absolutely.
[01:27:25.720 --> 01:27:37.400] And this is, I always try to disabuse folks from the notion that there's like this cliff, like this tipping point, that we warm the planet to 1.5 Celsius and we go off the climate cliff.
[01:27:37.400 --> 01:27:38.840] That's not how it works.
[01:27:39.160 --> 01:27:42.120] I liken it to a highway.
[01:27:42.120 --> 01:27:47.440] We're going down this dangerous highway and we want to get off at the earliest exit that we can.
[01:27:44.840 --> 01:27:52.320] If we miss the 1.5 exit, we don't drive all the way to Topeka.
[01:27:52.960 --> 01:27:55.280] We get off at the 1.6 exit.
[01:27:55.840 --> 01:28:06.400] But isn't there, though, isn't there something, though, to the idea, though, that a tipping point could be achieved where the climate basically resets itself into a new stable configuration?
[01:28:06.400 --> 01:28:07.840] I mean, isn't that a thing, though?
[01:28:08.160 --> 01:28:09.520] Yeah, so let me clarify.
[01:28:09.520 --> 01:28:11.600] So there is no one tipping point.
[01:28:12.240 --> 01:28:17.360] There are many possible tipping points with various subsystems of the climate.
[01:28:17.360 --> 01:28:19.360] There are certain tipping elements.
[01:28:19.360 --> 01:28:24.400] There are aspects of the climate system that have tipping point dynamics.
[01:28:24.640 --> 01:28:26.720] But no one tipping point is catastrophic.
[01:28:26.720 --> 01:28:34.640] Yeah, so if you look at global temperature, for example, that behaves very linearly, at least as far as we can see with the observations and the models.
[01:28:35.440 --> 01:28:40.080] The amount of warming is linearly proportional to the carbon emissions.
[01:28:40.080 --> 01:28:41.840] No obvious tipping point.
[01:28:41.840 --> 01:28:47.360] There's no runaway methane feedback of the sort that some climate doomers argue.
[01:28:48.720 --> 01:28:52.880] The warming is a very linear function, and the models have predicted it very well.
[01:28:52.880 --> 01:28:59.200] Where there is the potential for tipping points is in the non-linear components of the system.
[01:28:59.200 --> 01:29:01.120] And the ice sheets are a good example.
[01:29:01.280 --> 01:29:08.960] There are feedback processes that once you start the collapse of the ice shelves and the ice starts surging into the ocean, it sort of feeds back on itself.
[01:29:08.960 --> 01:29:11.520] There's something called the ice cliff instability.
[01:29:11.520 --> 01:29:17.440] There are these various positive feedback mechanisms, and of course, positive feedback is not a good thing here.
[01:29:18.560 --> 01:29:21.040] And so, there are elements of the climate system.
[01:29:21.040 --> 01:29:23.040] Ice sheet collapse is one of them.
[01:29:23.040 --> 01:29:28.160] There's been a lot of press coverage over the last week or so about the ocean conveyor.
[01:29:28.720 --> 01:29:29.360] Yes, AMOC?
[01:29:29.440 --> 01:29:30.680] The AMOC, exactly.
[01:29:30.840 --> 01:29:34.760] Atlantic meridional overturning circulation.
[01:29:30.000 --> 01:29:36.440] You can see why we don't spell it out.
[01:29:37.240 --> 01:29:39.000] You were a researcher on that, weren't you?
[01:29:39.800 --> 01:29:45.400] I signed my name to a letter from a number of scientists.
[01:29:45.400 --> 01:29:54.920] Stefan Ramsdorf, a friend of mine from Germany, leading climate scientist from the Potsdam Institute, was sort of spearheaded this.
[01:29:54.920 --> 01:30:02.280] But I signed my name to it, and for some reason, in the States anyway, I got mentioned quite a bit in the coverage.
[01:30:02.280 --> 01:30:07.160] I was sort of like a minor signatory of it, and I agree with the overall tenor of what it said.
[01:30:07.160 --> 01:31:11.200] And there's some fine print that didn't get much coverage in the that you know it's speculative that we don't know where the the tipping point is there are uncertainties in the models the observations you know are incomplete and so it's more speculative than definitive and this is about like the Atlantic conveyor belt turning off is that yeah they said you saw the movie the day after tomorrow yeah I mean and what happens in that movie is unrealistic right you're not gonna have tornadoes destroy Los Angeles you're not gonna have super cooled plumes of air killing people or an ice sheet forming you know within three days over the northern half of North America and you're certainly not gonna have Jake Gyllenha winning an academic decatal law there's just no way that's gonna be at least believable I mean you I could not suspend my disproportionate debelief in that they did say, though, like it could be it could be within a few decades, or it could be, you know, by the end of the century.
[01:31:11.200 --> 01:31:13.920] You know, I mean, the time span is is big, but.
[01:31:14.200 --> 01:31:17.040] The models, there's a huge spread in the models that we use.
[01:31:17.040 --> 01:31:20.640] Some of them, it collapses within a couple decades.
[01:31:20.640 --> 01:31:24.080] Some of them, it goes along happily to the end of the century.
[01:31:24.080 --> 01:31:29.920] The other thing is, if it happens, the consequences aren't quite as catastrophic as you might think.
[01:31:30.160 --> 01:31:31.520] Okay, that's encouraging.
[01:31:31.520 --> 01:31:35.920] Yeah, I mean, so what would happen is you'd get cooling in the North Atlantic.
[01:31:35.920 --> 01:31:51.840] And we wrote an article in Nature back in 2015 showing that there's signs, like if you look at the global temperature pattern of the last century, there's warming just about everywhere, but there's this little hole south of Greenland in the North Atlantic where temperatures have actually been cooling.
[01:31:51.840 --> 01:31:52.800] Is that what they call it?
[01:31:52.800 --> 01:31:53.440] The cold blob?
[01:31:53.440 --> 01:31:54.000] The cold blob.
[01:31:54.000 --> 01:31:54.320] There you are.
[01:31:54.320 --> 01:31:54.560] Absolutely.
[01:31:54.720 --> 01:31:55.840] Is Ireland in the cold blob?
[01:31:56.000 --> 01:31:57.280] Because they complain a lot about this.
[01:31:57.760 --> 01:31:59.360] It's getting colder in Ireland.
[01:31:59.360 --> 01:32:03.840] Well, they're in the periphery of where the area that could see.
[01:32:03.840 --> 01:32:10.160] So, I mean, in reality, there's only a small area in the North Atlantic that would likely cool.
[01:32:10.640 --> 01:32:21.920] You would, yeah, and maybe reaching in, you know, to like England, Ireland, slight cooling that could happen, but it's competing against the overall warming.
[01:32:21.920 --> 01:32:23.440] And so it's a battle against.
[01:32:23.680 --> 01:32:28.800] And the further you get away from the bullseye, the more the warming is likely to win out over the cooling.
[01:32:28.960 --> 01:32:33.200] It's like the heat miser and cold misery.
[01:32:33.280 --> 01:32:34.800] Cold misery is a good idea.
[01:32:34.960 --> 01:32:35.280] Absolutely.
[01:32:35.280 --> 01:32:40.240] In fact, I talk about cold miser and heat miser in Our Fragile Moment.
[01:32:40.640 --> 01:32:41.360] As you should.
[01:32:41.360 --> 01:32:41.760] Yeah.
[01:32:42.800 --> 01:32:44.560] And you name it.
[01:32:44.560 --> 01:32:45.840] We talk about it in the book.
[01:32:46.720 --> 01:32:57.680] You mentioned this idea that you're already starting to see some evidence of this, that you use the analogy of this really dangerous freeway and wanting to get off of it as soon as possible.
[01:32:57.680 --> 01:33:06.760] And yes, there may be a couple of different identified cliffs to be relative points where there are cataclysmic or maybe less cataclysmic events happening.
[01:33:07.080 --> 01:33:12.920] But I think an important point to reinforce there is that we are on the freeway now.
[01:33:12.920 --> 01:33:21.960] And just because I am in a home with good air conditioning and I can wear a coat in the winter and I can wear a tank top in the summer and I have plenty of water at my disposal.
[01:33:22.120 --> 01:33:25.480] There are people whose lives are already destroyed by this.
[01:33:25.880 --> 01:33:27.800] And I think we often don't bring that.
[01:33:27.800 --> 01:33:30.920] We keep talking about later on instead of right now.
[01:33:31.160 --> 01:33:33.400] So the climate refugees you're talking about, right?
[01:33:33.400 --> 01:33:43.640] Like, yeah, we if you don't live, if you don't live near the equator, then not much has really changed for me for if you're, you know, there are all of these areas.
[01:33:43.880 --> 01:33:46.760] Yeah, there's people that are already just massively displaced.
[01:33:47.960 --> 01:33:56.680] I just covered this famine that's happening across multiple countries where there's no that high nutrition yield food available to these children.
[01:33:56.680 --> 01:33:57.800] This is climate change.
[01:33:58.360 --> 01:33:58.600] Absolutely.
[01:33:58.840 --> 01:33:59.880] It's directly related.
[01:34:00.200 --> 01:34:00.760] Absolutely.
[01:34:00.760 --> 01:34:08.520] I mean, this is something that I try to convey when I talk about where we are, that dangerous climate change.
[01:34:08.520 --> 01:34:14.280] There's this false notion that, again, that cliff, like dangerous climate change is 1.5 Celsius.
[01:34:14.280 --> 01:34:16.440] That's what dangerous climate change.
[01:34:16.920 --> 01:34:18.600] Dangerous climate change is here.
[01:34:19.560 --> 01:34:25.240] If you're in Puerto Rico, if you're California, if you're, I mean, North Carolina.
[01:34:25.240 --> 01:34:26.040] North Carolina.
[01:34:26.760 --> 01:34:32.840] Yeah, as you said, we can't ever say that hurricane was due to climate change, but we know that there's way too many things happening.
[01:34:33.080 --> 01:34:44.360] Well, we know that rainfall, there's a study already, an attribution study, that shows that there was a 50% increase in the rainfall associated with Hellene that can be attributed to the warming of the planet.
[01:34:44.360 --> 01:34:47.600] And that's that can be linked directly to human lives.
[01:34:48.400 --> 01:34:48.640] Absolutely.
[01:34:44.840 --> 01:34:52.080] Lives loss and huge amounts of property damage.
[01:34:52.720 --> 01:34:54.240] A friend of mine lives in Texas, right?
[01:34:54.240 --> 01:34:56.160] And he's been telling me about the summers recently.
[01:34:56.400 --> 01:34:58.240] And, you know, and he's like, it's getting bad.
[01:34:58.240 --> 01:35:00.640] You know, the grid down there sucks.
[01:35:00.640 --> 01:35:03.440] It can't handle anything extreme that's going on.
[01:35:03.440 --> 01:35:04.800] You know, it's going to be a long time.
[01:35:05.120 --> 01:35:05.520] Yeah.
[01:35:05.520 --> 01:35:07.680] Oh, no, he had they were way above 110.
[01:35:07.680 --> 01:35:10.400] They're way above 110 last year, right?
[01:35:10.400 --> 01:35:10.800] That's okay.
[01:35:10.800 --> 01:35:13.200] You can just fly off to Cancun if you're the center of the country.
[01:35:13.840 --> 01:35:14.160] Exactly.
[01:35:14.880 --> 01:35:21.520] So what's funny is I'm talking to him, and in the conversation, we're like, what's happening with the people who live in Mexico?
[01:35:21.520 --> 01:35:23.600] Or you keep going down to the equator.
[01:35:23.600 --> 01:35:26.160] I mean, it's like, yeah, it's bad in Texas, but we're north.
[01:35:26.480 --> 01:35:27.920] Well, and that's very real.
[01:35:27.920 --> 01:35:37.360] I mean, and so, like, you know, this leads us to, so all of these calamities mean that there are areas of the planet that are becoming increasingly unlivable.
[01:35:37.360 --> 01:35:46.080] And that means more people, 8.2 billion people, and growing, competing for less land, less food, less water.
[01:35:46.400 --> 01:35:50.880] And that is a prescription for a dystopian huge.
[01:35:51.120 --> 01:35:53.040] You think we're having an immigrant problem now?
[01:35:53.440 --> 01:35:54.160] Imagine what that's going to happen.
[01:35:54.240 --> 01:35:57.280] I think it's just going to be a lot of that even.
[01:35:57.600 --> 01:36:00.720] Like, there are people who need a place to live.
[01:36:00.720 --> 01:36:01.360] Well, that's the thing.
[01:36:01.600 --> 01:36:04.400] The thing that scares me the most is that, right?
[01:36:04.400 --> 01:36:10.080] When we get to a point, let's say, you know, and I don't want to catastrophize, but this basically is a catastrophe.
[01:36:10.080 --> 01:36:18.160] But the idea is at some point, there's something that there's going to be some point in time when a lot of people are going to be displaced.
[01:36:18.160 --> 01:36:22.880] And it can't be like, it's not going to be a regional fix, like, oh, they just moved a little north here, whatever.
[01:36:22.880 --> 01:36:25.280] It's going to be like, we have to move people out.
[01:36:25.280 --> 01:36:26.480] This is a global effort.
[01:36:26.960 --> 01:36:30.600] It could be a thing where we have to save 100 million people from dying, you know?
[01:36:32.440 --> 01:36:33.320] We are in that.
[01:36:33.320 --> 01:36:34.280] Yeah, exactly.
[01:36:34.600 --> 01:36:35.400] It's already happening.
[01:36:35.640 --> 01:36:35.880] Exactly.
[01:36:35.960 --> 01:36:37.800] And it's causing massive geopolitical.
[01:36:37.880 --> 01:36:39.400] But you know what they say, Kara?
[01:36:39.400 --> 01:36:46.120] They say, well, we're not, the United States isn't the biggest greenhouse gas emitter, and if China's not doing it, why should we do it?
[01:36:46.120 --> 01:36:46.920] So we are.
[01:36:47.160 --> 01:36:48.120] We put more.
[01:36:48.120 --> 01:36:52.760] So the climate only cares about the cumulative carbon pollution that's been added.
[01:36:52.760 --> 01:36:54.280] It doesn't care about the country at all.
[01:36:54.840 --> 01:36:55.720] It doesn't care about the country.
[01:36:55.800 --> 01:36:58.200] It doesn't care about what year it was added.
[01:36:58.200 --> 01:37:02.120] So the United States has put more carbon pollution into the atmosphere than any other country.
[01:37:02.600 --> 01:37:03.800] Cumulatively, we are the big.
[01:37:03.960 --> 01:37:04.120] Yeah.
[01:37:04.360 --> 01:37:04.680] Currently.
[01:37:05.880 --> 01:37:07.880] Per capita, we're also the biggest.
[01:37:08.280 --> 01:37:08.760] Exactly.
[01:37:08.760 --> 01:37:10.280] But China just has a big population.
[01:37:10.600 --> 01:37:11.720] They're currently at the moment.
[01:37:11.880 --> 01:37:12.760] They're currently, exactly.
[01:37:12.760 --> 01:37:13.000] Exactly.
[01:37:13.400 --> 01:37:16.120] So they're also kicking our ass on solar power and other things.
[01:37:16.360 --> 01:37:19.720] Yeah, so they may mitigate that long before they even reach us cumulatively.
[01:37:19.960 --> 01:37:20.840] That's exactly right.
[01:37:21.080 --> 01:37:24.120] They're ahead of us on the renewable energy front.
[01:37:24.120 --> 01:37:29.480] They are, yes, they may be emitting carbon, more carbon than us, but not on a per capita basis.
[01:37:29.800 --> 01:37:30.360] Not even close.
[01:37:30.520 --> 01:37:32.680] So it's a diversionary tactic, right?
[01:37:33.080 --> 01:37:37.160] When you hear that, oh, what about, it's what about is it what we call it?
[01:37:37.560 --> 01:37:37.800] Yeah, yeah.
[01:37:38.600 --> 01:37:41.960] I know, it's frustrating because you're like, well, it's got to start somewhere.
[01:37:41.960 --> 01:37:44.600] You know, it's like, it's not like, okay, on three, we're all going to start doing this.
[01:37:45.080 --> 01:37:47.480] Well, it's the exact same thing as the doomism.
[01:37:47.480 --> 01:37:50.280] It's like, well, unless everyone's doing it, what's the point?
[01:37:50.280 --> 01:37:52.120] It's like, yeah, but every bit counts.
[01:37:52.120 --> 01:37:53.400] It could always be worse.
[01:37:53.400 --> 01:37:54.440] Making it less bad.
[01:37:54.680 --> 01:37:55.560] We're doing our best.
[01:37:55.560 --> 01:37:56.040] All that stuff.
[01:37:56.040 --> 01:37:56.680] It's like it's.
[01:37:56.680 --> 01:37:59.480] And I think it is part of the same counteracting the doomism thing.
[01:37:59.560 --> 01:37:59.640] Yeah.
[01:37:59.640 --> 01:38:06.680] You hear the analogy of like being in a rowboat and there's holes in the floor of the rowboat and you're not going to plug your hole because that guy over there hasn't plugged his hole.
[01:38:06.680 --> 01:38:07.400] Like, what are you doing?
[01:38:07.880 --> 01:38:08.760] Of course, I'm going to plug my hole.
[01:38:08.760 --> 01:38:10.760] Then I'll go help you plug your hole once my hole is plugged.
[01:38:10.920 --> 01:38:17.280] And the governmental b bureaucracy, it's, you know, it's maddening because we see issues being discussed.
[01:38:17.280 --> 01:38:18.000] Like, yeah, I get it.
[01:38:18.000 --> 01:38:19.280] Yeah, that's not a bad issue.
[01:38:14.760 --> 01:38:20.000] That's not a bad issue.
[01:38:20.160 --> 01:38:25.920] But can global warming please take a front seat at some point in some recent administration, right?
[01:38:25.920 --> 01:38:27.520] Well, what do you ask you next?
[01:38:28.160 --> 01:38:33.520] I actually wanted to piggyback on what you just said because I want to extend the analogy further.
[01:38:33.680 --> 01:38:39.520] And then there are those who are telling us that, no, we don't need to plug the holes in the bottom of the boat.
[01:38:39.520 --> 01:38:42.800] We just need spoons to spoon out the water.
[01:38:43.440 --> 01:38:43.840] Right.
[01:38:44.160 --> 01:38:44.880] Yeah, yeah.
[01:38:45.120 --> 01:38:45.520] All these things.
[01:38:45.680 --> 01:38:46.400] Geoengineering.
[01:38:46.720 --> 01:38:48.720] It's so obvious what we can do, right?
[01:38:49.360 --> 01:38:55.200] I had somebody email me going, What if everything stays the same and we just get really good at carbon capture?
[01:38:55.200 --> 01:38:58.880] And I'm like, We will be very good at carbon capture in about 100 years.
[01:38:58.880 --> 01:39:01.360] We'll have massive, awesome technology.
[01:39:01.520 --> 01:39:02.640] We can't wait till then.
[01:39:02.640 --> 01:39:03.360] And that's exactly what we're doing.
[01:39:03.840 --> 01:39:16.800] Speaking about that, speaking about that, some people I've read have a point of view where they say that at some point, some of these, maybe some of these non-linear tipping points will be kind of imminent, apparently.
[01:39:16.800 --> 01:39:20.640] And merely decreasing the warming isn't going to be enough.
[01:39:20.640 --> 01:39:25.680] And we need to actively cool the planet with some form of geoengineering.
[01:39:25.680 --> 01:39:27.440] So, what are your thoughts on geoengineering?
[01:39:27.600 --> 01:39:30.880] Yes, I have a whole chapter in the new climate war.
[01:39:30.880 --> 01:39:33.120] I'll talk a little bit about it in our fragile moment.
[01:39:33.120 --> 01:39:36.320] Geoengineering or what could possibly go wrong.
[01:39:37.040 --> 01:39:37.200] Right.
[01:39:37.440 --> 01:39:38.800] Like Snow Skier Serviversia.
[01:39:39.760 --> 01:39:40.000] Right.
[01:39:40.160 --> 01:39:40.480] Yes.
[01:39:40.880 --> 01:39:41.200] Yeah.
[01:39:42.400 --> 01:39:56.400] And there was also an episode of this climate series that ran on one of the streaming services where they had a geoengineering sort of mishap.
[01:39:56.280 --> 01:39:57.600] Mishap, exactly.
[01:39:57.920 --> 01:39:58.240] Oops.
[01:39:58.400 --> 01:40:08.280] I think it involved the problem is that it's being used mostly as a crutch right now, as an excuse for business as usual.
[01:40:08.520 --> 01:40:12.040] I mean, who's really, you know, who is really pushing geoengineering?
[01:40:12.040 --> 01:40:15.640] It's Rex Tillerson, you know, former CEO of ExxonMobil.
[01:40:15.640 --> 01:40:18.040] Climate change is just an engineering problem.
[01:40:18.040 --> 01:40:25.320] This idea that we can engineer our way out of it is sort of a license to continue to pollute.
[01:40:25.320 --> 01:40:31.720] And as we've already sort of alluded to here, this is not proven technology at scale.
[01:40:31.720 --> 01:40:34.120] Like, yeah, it's theoretical at this point.
[01:40:34.360 --> 01:40:41.400] There's no demonstration that it can be done at scale on the timeframe necessary, like carbon capture.
[01:40:41.400 --> 01:40:46.520] Now, geoengineering other things like sulfate aerosol in the stratosphere.
[01:40:46.520 --> 01:40:50.680] I mean, that's something that we have lots of, we've seen it happen with volcanic eruptions.
[01:40:50.680 --> 01:40:54.840] We know what it can, the impact it can have to the Earth's albedo and stuff.
[01:40:54.840 --> 01:41:01.800] So those are, that seems more interesting to me because you put those up there, and some studies show that diamond dust apparently would be really efficient.
[01:41:02.040 --> 01:41:09.320] Put them up there, they last for a while, and they slowly have a lot of $100 trillion.
[01:41:09.640 --> 01:41:12.600] $100 trillion or $200 trillion, over 40, over 40 years.
[01:41:12.840 --> 01:41:13.640] 200 trillion.
[01:41:13.640 --> 01:41:14.440] Over 40 years.
[01:41:14.440 --> 01:41:18.040] But, I mean, it could be, they said 1.6 degrees cooling in that time.
[01:41:18.040 --> 01:41:19.400] But here's the part that's so infuriating.
[01:41:19.400 --> 01:41:22.360] It's like I think about these like almost like a medical analogy, right?
[01:41:22.360 --> 01:41:25.560] Like, if you don't want cavities, brush your teeth.
[01:41:25.560 --> 01:41:28.440] Don't just extract the tooth every time you get a cavity.
[01:41:28.760 --> 01:41:33.560] Like, if you're allergic to shellfish, don't just eat a bunch of shellfish and keep giving yourself a bunch of stuff.
[01:41:34.440 --> 01:41:45.360] I used that framing in a piece I wrote for BBC once that, you know, as sort of planetary doctors, if you will, we have an oath to first do no harm.
[01:41:44.840 --> 01:41:52.000] And this violates that oath, potentially, because what you're describing, like, yeah, it's like a volcano.
[01:41:52.080 --> 01:41:53.680] We know how during a volcanic eruption.
[01:41:53.840 --> 01:41:54.720] Acid rain, acid rain, isn't it?
[01:41:54.880 --> 01:41:58.480] Whether there's acid rain, and it's also the climate that you get.
[01:41:58.960 --> 01:42:05.840] So if all we cared about was the global average temperature, you can calculate how much sulfate aerosol you would need to pump in to cool off.
[01:42:06.960 --> 01:42:16.000] But the problem is, if you look at the pattern of the climate response to a volcanic eruption, it's not the inverse of the global warming pattern.
[01:42:16.480 --> 01:42:19.680] So some areas will warm even faster.
[01:42:19.680 --> 01:42:20.880] Some will cool.
[01:42:20.880 --> 01:42:25.760] The hydrological cycle globally over the continents is likely to slow down.
[01:42:25.760 --> 01:42:34.160] So you get drying over the continents and then ozone depletion and other potential unintended consequences.
[01:42:34.160 --> 01:42:36.720] So sort of the principle of unintended.
[01:42:36.880 --> 01:42:39.680] And there's uncertainty, and uncertainty isn't our friend here.
[01:42:39.920 --> 01:42:41.280] This is the largest system.
[01:42:41.280 --> 01:42:42.320] We are in this system.
[01:42:42.320 --> 01:42:43.840] It's a stochastic system.
[01:42:44.080 --> 01:42:45.760] Everything we do changes it.
[01:42:45.760 --> 01:42:49.200] And we got into this mess because we geoengineer.
[01:42:49.200 --> 01:42:50.800] I think that's the thing we have to remember.
[01:42:51.120 --> 01:42:54.000] The extractive practices, the utilization, the burning.
[01:42:54.000 --> 01:42:55.440] This is geoengineering.
[01:42:55.440 --> 01:43:01.360] And we think we're going to geoengineer our way out when what we could do is stop geoengineering so much.
[01:43:01.600 --> 01:43:04.480] Well, this actually came up at the VIP luncheon.
[01:43:04.800 --> 01:43:05.840] Were any of you at that?
[01:43:06.400 --> 01:43:06.560] Okay.
[01:43:07.280 --> 01:43:08.080] We're not VIP.
[01:43:08.800 --> 01:43:10.800] I had an extra ticket I would have given somebody.
[01:43:11.440 --> 01:43:12.240] We were interviewing people.
[01:43:12.320 --> 01:43:12.720] We were interviewing.
[01:43:12.880 --> 01:43:13.360] Yeah, no, exactly.
[01:43:13.440 --> 01:43:13.840] Got it.
[01:43:13.840 --> 01:43:14.240] Yeah.
[01:43:15.040 --> 01:43:17.360] We had a good sort of panel discussion.
[01:43:17.840 --> 01:43:24.000] And one of the things that came up was like we were asked a question about time travel.
[01:43:24.000 --> 01:43:24.720] I was going to say that.
[01:43:24.880 --> 01:43:25.920] I was just going to say that.
[01:43:25.920 --> 01:43:28.480] And Neil deGrasse Tyson had an answer.
[01:43:28.480 --> 01:43:31.640] Brian Cox had an answer.
[01:43:32.200 --> 01:43:35.000] And then the question was posed to the physicists.
[01:43:28.960 --> 01:43:35.880] I said, oh, wait a second.
[01:43:35.960 --> 01:43:37.400] That was all but dissertation physics.
[01:43:37.400 --> 01:43:40.520] So let me give an answer, like a story about time travel.
[01:43:41.000 --> 01:43:45.640] And the story that I used is like when people asked me, is there any techno fix to climate that would work?
[01:43:45.640 --> 01:43:46.840] I said, yeah, time machine.
[01:43:47.400 --> 01:43:47.960] Right.
[01:43:48.440 --> 01:43:51.880] That's the only techno fix that would be going to work.
[01:43:52.040 --> 01:43:52.760] It'll fully work.
[01:43:53.160 --> 01:44:06.840] I'm afraid, though, that 30, say 30 years when it's bad, really bad, because I'm not very, as you know, I'm not very optimistic that we're going to get our shit together enough to make this not even more disastrous.
[01:44:06.920 --> 01:44:11.240] 30 or 40 years, I could see some countries getting together and say, we've got to do something dramatic now.
[01:44:11.240 --> 01:44:12.680] Let's try this geoengineering.
[01:44:12.680 --> 01:44:13.320] Hell yeah.
[01:44:13.640 --> 01:44:14.520] You raise a good point.
[01:44:14.920 --> 01:44:15.560] That's scary.
[01:44:16.040 --> 01:44:17.400] We don't have a global government.
[01:44:17.800 --> 01:44:17.960] Yeah.
[01:44:18.200 --> 01:44:18.520] Exactly.
[01:44:18.760 --> 01:44:19.560] That's a problem.
[01:44:19.880 --> 01:44:24.040] And so you have people now looking at global governance issues.
[01:44:25.240 --> 01:44:26.440] And I support that.
[01:44:26.440 --> 01:44:37.080] Like, even if you don't support geoengineering, we should be thinking about what we do in the event that a rogue country, like China, decides it's favorable for its interests to.
[01:44:37.560 --> 01:44:57.640] And part of the problem here is one place cools at the expense of another place warming more, and so it becomes this very fraught sort of game theory problem if you have individual actors acting in a way to try to optimize their own climate in a way that may be destructive to the global climate.
[01:44:57.960 --> 01:45:01.800] Yeah, I don't want to be the sunny optimist Pollyannish about it.
[01:45:01.960 --> 01:45:06.280] I don't think it's too late to solve the climate crisis for the reasons that we've talked about.
[01:45:06.280 --> 01:45:09.480] But I think we have a monumental battle on our hands.
[01:45:09.480 --> 01:45:20.160] And the larger battle here is the battle to preserve democracy, to preserve fact-based discourse, and the notion of objective truth.
[01:45:22.640 --> 01:45:24.160] Which you guys are already doing.
[01:45:24.400 --> 01:45:27.200] You're saying the solution is to become a patron of the SGU.
[01:45:27.760 --> 01:45:28.640] That's what I'm hearing.
[01:45:29.440 --> 01:45:30.480] That's what I heard.
[01:45:30.480 --> 01:45:30.880] It is.
[01:45:32.640 --> 01:45:34.480] And I'll be looking for the check in the mail next week.
[01:45:34.720 --> 01:45:35.200] Thank you very much.
[01:45:37.120 --> 01:45:38.400] We pay in bubblegum now.
[01:45:40.240 --> 01:45:42.320] Well, Michael, thank you so much for sitting down with us.
[01:45:42.320 --> 01:45:43.040] This was great.
[01:45:43.040 --> 01:45:43.840] It was a pleasure, guys.
[01:45:43.840 --> 01:45:44.080] Thanks.
[01:45:44.080 --> 01:45:44.320] Thank you.
[01:45:44.480 --> 01:45:44.800] Thank you.
[01:45:44.800 --> 01:45:45.840] Thank you.
[01:45:48.400 --> 01:45:53.680] It's time for science or fiction.
[01:45:57.840 --> 01:46:03.040] Each week, I come up with three science news items or facts: two real, one fake.
[01:46:03.040 --> 01:46:07.040] And then I challenge my panel of skeptics to tell me which one is the fake.
[01:46:07.040 --> 01:46:08.560] Three regular news items.
[01:46:08.560 --> 01:46:09.760] You guys ready this week?
[01:46:09.760 --> 01:46:10.400] Let's do it.
[01:46:10.400 --> 01:46:11.520] All right, here we go.
[01:46:11.520 --> 01:46:23.280] Item number one: analysis of a Martian meteorite indicates the presence of bodies of liquid water on the surface of Mars as late as 742 million years ago.
[01:46:23.280 --> 01:46:35.280] Item number two: a recent study finds that doctors using ChatGPT Plus as an additional aid did no better at making diagnoses than those using only traditional methods.
[01:46:35.280 --> 01:46:44.080] And item number three: in an animal study, researchers have demonstrated the ability to deliver functional mRNA through inhalation.
[01:46:44.400 --> 01:46:45.920] Kara, go first.
[01:46:45.920 --> 01:46:59.920] Okay, so scientists analyzed a Martian meteorite and found the presence or found something indicative of the presence of bodies of liquid water on the surface of Mars as late, does that say 700?
[01:47:00.920 --> 01:47:01.480] Yep.
[01:47:01.640 --> 01:47:02.440] Million years ago.
[01:47:02.440 --> 01:47:06.600] Yeah, I have no idea when the liquid dried up, but I do think there was liquid.
[01:47:06.760 --> 01:47:13.000] A recent study finds that doctors using Chat GPT plus, I don't really know, is that just like the most recent one?
[01:47:13.000 --> 01:47:13.560] What's the difference?
[01:47:14.280 --> 01:47:14.760] Okay.
[01:47:14.760 --> 01:47:20.360] As an additional aid, did know better at making diagnoses than those using only traditional methods.
[01:47:20.360 --> 01:47:21.640] Well, there's so much here.
[01:47:21.640 --> 01:47:23.080] Like, what were they diagnosing?
[01:47:23.080 --> 01:47:24.440] Like a strep throat?
[01:47:24.440 --> 01:47:27.800] Or were these complex, complicated things?
[01:47:27.800 --> 01:47:35.800] And then item number three, in an animal study, researchers have demonstrated the ability to deliver functional mRNA through inhalation.
[01:47:35.800 --> 01:47:38.600] Meaning that it gets into the system and it functions.
[01:47:38.600 --> 01:47:39.800] It does what it's supposed to do.
[01:47:39.800 --> 01:47:41.880] It takes up shop, makes proteins.
[01:47:42.280 --> 01:47:45.240] So I could see them doing that with some sort of viral vector.
[01:47:45.240 --> 01:47:48.520] You inhale the virus and then it inserts itself.
[01:47:48.520 --> 01:47:48.920] I don't know.
[01:47:48.920 --> 01:47:50.600] That one doesn't bother me that much.
[01:47:50.600 --> 01:47:55.160] I have no idea the age of the water, though, so that's problematic.
[01:47:55.160 --> 01:47:58.840] And then the ChatGPT Plus, no better at making diet.
[01:47:59.000 --> 01:48:00.920] There's just too many unknowns in this one.
[01:48:00.920 --> 01:48:05.560] I believe that would be the case if they were diagnosing strep throat.
[01:48:05.560 --> 01:48:17.080] I do not believe that would be the case if they were diagnosing some sort of kind of infectious disease that required understanding of tropical medicine or differentials that weren't.
[01:48:17.320 --> 01:48:21.480] I mean, I'll say they were challenging diagnoses, otherwise it would have been a pointless study.
[01:48:22.040 --> 01:48:22.840] Yeah, I agree.
[01:48:22.840 --> 01:48:27.960] So I think in that case, then the Chat GPT plus one is the fiction.
[01:48:27.960 --> 01:48:31.960] I think that probably it did help to be able to ask.
[01:48:32.280 --> 01:48:38.040] Wait, real quick, though, when you say traditional methods, does that mean they were still able to look stuff up in books?
[01:48:38.040 --> 01:48:38.680] Yeah.
[01:48:38.680 --> 01:48:47.120] Oh, yeah, so they were still following a standard differential model and they still had access to all the same information.
[01:48:48.160 --> 01:48:49.520] Okay, then I'll go with Mars.
[01:48:49.520 --> 01:48:50.800] Maybe the date's wrong.
[01:48:44.760 --> 01:48:51.200] I don't know.
[01:48:52.960 --> 01:48:54.240] Yeah, I don't know.
[01:48:54.240 --> 01:48:55.200] Okay, Bob.
[01:48:55.520 --> 01:48:56.960] Yeah, the Martian meteorite.
[01:48:56.960 --> 01:48:58.080] Yeah, that kind of makes sense.
[01:48:58.080 --> 01:49:00.640] I don't ever see any problems with that.
[01:49:02.320 --> 01:49:08.000] Let's see, go to three, the inhaling mRNA.
[01:49:08.000 --> 01:49:09.520] Yeah, I agree with Kara.
[01:49:09.920 --> 01:49:12.720] If there's a viral vector, that makes a lot of sense.
[01:49:12.720 --> 01:49:14.800] But, I mean, viral vectors are scary.
[01:49:14.800 --> 01:49:16.240] Would they use that route?
[01:49:16.240 --> 01:49:21.760] And then if they had a viral vector, we wouldn't necessarily even need to use it for inhalation.
[01:49:21.760 --> 01:49:23.360] But it kind of makes sense.
[01:49:23.360 --> 01:49:34.160] But ChatGPT, yeah, I mean, I think there might be, I don't know, maybe, unless you're using ChatGPT with a specific medical database.
[01:49:34.160 --> 01:49:45.040] If you're just using, I'm not aware of the ins and outs of ChatGPT Plus, but I think there might be a lot of medical misinformation mixed in there.
[01:49:45.040 --> 01:49:48.480] So generic ChatGPT might not be a great source for that kind of stuff.
[01:49:48.480 --> 01:49:52.400] But it wouldn't surprise me either way, but I'd say ChatGPT is fiction.
[01:49:52.400 --> 01:49:53.520] Okay, Evan?
[01:49:53.520 --> 01:50:02.320] Yeah, so the Mars meteorite, as late as 742 million years ago, with indications of liquid water.
[01:50:02.640 --> 01:50:07.760] We know there has been liquid water on the surface of Mars, but when?
[01:50:07.760 --> 01:50:14.320] And a meteorite revealing that seems to, I don't really see anything wrong with this one.
[01:50:14.560 --> 01:50:18.960] Nothing is saying, hey, Red Alert, this doesn't jive.
[01:50:19.120 --> 01:50:22.640] I think that's in sync with what's understood about Mars.
[01:50:22.640 --> 01:50:24.400] But that we found it is kind of cool.
[01:50:24.400 --> 01:50:35.080] The ChatGPT one, this is the one I think lends itself the most to being the fiction, only because not for any technical reasons, just for a guess reason.
[01:50:35.720 --> 01:50:41.160] It reads here, as an additional aid, did no better at making diagnoses.
[01:50:41.160 --> 01:50:46.200] So that means it would be fiction if it did better or worse.
[01:50:46.200 --> 01:50:49.080] So you kind of have two directions you can go with this one.
[01:50:49.080 --> 01:50:52.760] So I think that opens up the possibility that that one's fiction a little bit more.
[01:50:52.760 --> 01:50:59.320] And the last one, I don't, I know nothing about mRNA through inhalation.
[01:50:59.320 --> 01:51:04.680] And just based on what I heard Kara and Bob say, yeah, I'll say that one's science.
[01:51:04.680 --> 01:51:06.760] So chat GPT, I'll go with Bob.
[01:51:06.760 --> 01:51:07.560] Fiction.
[01:51:07.560 --> 01:51:08.440] And Jay.
[01:51:08.440 --> 01:51:18.600] The Martian one, I mean, I have no reason to think that the 742 million years ago water thing, like, I just, there's no reason to not believe that.
[01:51:18.600 --> 01:51:22.520] Like, there's just no information in here that you can contest other than the date.
[01:51:22.520 --> 01:51:24.200] And it makes sense, you know.
[01:51:24.200 --> 01:51:28.440] I mean, I can't contest it in any way, so I'm just going to assume that one is science.
[01:51:28.600 --> 01:51:34.440] The second one about the ChatGPT being used as a diagnostic tool.
[01:51:34.440 --> 01:51:36.440] Does it actually help physicians or not?
[01:51:36.440 --> 01:51:46.680] You know, if they have access to the latest information, it would be no different than asking ChatGPT if it has the latest information.
[01:51:46.680 --> 01:51:52.360] The thing that I wouldn't want ChatGPT to actually do, though, is give any kind of medical advice.
[01:51:52.360 --> 01:51:55.320] So I just don't think that's a good thing to do anyway.
[01:51:55.560 --> 01:52:04.040] Now, if they specifically programmed ChatGPT with very specific information and instructions, it might help.
[01:52:04.040 --> 01:52:07.480] I mean, I think it would definitely speed things up at the very least.
[01:52:07.480 --> 01:52:09.960] But I just don't think today that it's there.
[01:52:09.960 --> 01:52:11.160] I don't think that that's it.
[01:52:11.160 --> 01:52:19.600] So I'm definitely very suspicious about the breathing in an mRNA and have it function.
[01:52:20.400 --> 01:52:23.520] You know, I know that breathing certain things in is a vector, right?
[01:52:23.520 --> 01:52:24.880] It does work for certain things.
[01:52:24.880 --> 01:52:27.040] I just don't think it would work for this.
[01:52:27.200 --> 01:52:28.880] So I'm going to say that one's the fiction.
[01:52:28.880 --> 01:52:30.320] Okay, so we're spread out.
[01:52:30.640 --> 01:52:31.760] I hate one first.
[01:52:31.920 --> 01:52:33.520] I disagree with my answer.
[01:52:33.520 --> 01:52:34.640] You want to change the answer?
[01:52:34.720 --> 01:52:35.120] You want to change it?
[01:52:35.200 --> 01:52:36.320] I do, but I can't.
[01:52:37.920 --> 01:52:38.400] We don't do that.
[01:52:38.800 --> 01:52:41.920] Whoa, you never let people do that.
[01:52:41.920 --> 01:52:45.680] I will stick with my answer, but I think the chat GPT one is the fiction.
[01:52:45.680 --> 01:52:46.160] Okay.
[01:52:46.160 --> 01:52:47.440] Well, let's start with number three.
[01:52:48.240 --> 01:52:49.760] We'll go back.
[01:52:49.840 --> 01:52:50.640] Animals study.
[01:52:52.320 --> 01:52:53.360] Liver functional mRNA.
[01:52:53.440 --> 01:52:53.840] Can't say that.
[01:52:53.920 --> 01:52:54.640] Do inhalation.
[01:52:54.640 --> 01:52:56.160] Jay, you think this one is the fiction.
[01:52:56.160 --> 01:52:57.360] Everyone else thinks this one is science.
[01:52:57.360 --> 01:52:58.000] Not anymore, you know.
[01:52:58.240 --> 01:52:59.840] And this one is science.
[01:52:59.840 --> 01:53:01.280] Yeah, sorry, Jay.
[01:53:01.920 --> 01:53:02.880] What do you guys think?
[01:53:02.880 --> 01:53:03.840] It's not a virus.
[01:53:03.840 --> 01:53:08.960] And so if it's not a virus, what are they using as the carrier, as the carrier for the mRNA?
[01:53:08.960 --> 01:53:10.880] Because it can't just make it mRNA.
[01:53:10.960 --> 01:53:11.360] Lipids.
[01:53:11.360 --> 01:53:11.920] CRISPR?
[01:53:11.920 --> 01:53:13.520] Lipids, lipid what?
[01:53:13.840 --> 01:53:14.960] Oh, nanoparticles.
[01:53:15.040 --> 01:53:15.840] Nanosomes.
[01:53:15.840 --> 01:53:16.800] Nanoparticles.
[01:53:16.800 --> 01:53:18.000] Lipid nanoparticles.
[01:53:18.320 --> 01:53:19.680] Didn't you just do a lipid nanoparticle?
[01:53:19.760 --> 01:53:20.080] Yes, I did.
[01:53:20.960 --> 01:53:23.200] Yeah, lipid nanoparticles are magic, man.
[01:53:23.200 --> 01:53:24.160] No, they are great.
[01:53:24.160 --> 01:53:25.440] It's a great technology.
[01:53:25.440 --> 01:53:29.600] So they used lipid nanoparticles to house the mRNA.
[01:53:29.600 --> 01:53:33.840] So basically, it becomes aerosolized or nebulized, and you breathe it in.
[01:53:33.840 --> 01:53:35.120] It gets into the lungs.
[01:53:35.120 --> 01:53:36.000] It sets up shop.
[01:53:36.000 --> 01:53:40.400] And they tested it in mice, and it was just producing a marker.
[01:53:40.800 --> 01:53:46.320] And it was sustained production of the protein that it coded for.
[01:53:46.320 --> 01:53:48.400] I think it made it light up or something.
[01:53:48.400 --> 01:53:58.320] Now, this was also fixing an earlier problem because they had tried to use lipid nanoparticles previously for this aerosolized or nebulized delivery of mRNA.
[01:53:58.320 --> 01:54:03.400] The problem was that the nanoparticles clumped together and they got too big.
[01:53:59.920 --> 01:54:05.080] And then that would provoke an immune response.
[01:54:05.560 --> 01:54:10.680] So, which remember, that was the problem with the other lipid nanoparticle news item that I talked about.
[01:54:10.680 --> 01:54:21.240] So, this study was figuring out how to keep the lipid nanoparticles from clumping so that they could remain individual and they could deliver their mRNA to the lung cells.
[01:54:21.240 --> 01:54:22.200] And it worked.
[01:54:22.200 --> 01:54:22.920] Yay!
[01:54:22.920 --> 01:54:28.600] So, this could be a way, you know, in the future, we may like be inhaling our mRNA vaccines.
[01:54:28.600 --> 01:54:29.800] No more shots, right?
[01:54:29.800 --> 01:54:32.920] Or inhaling mRNA drug deliveries, you know.
[01:54:32.920 --> 01:54:34.120] So, this is a cool.
[01:54:34.280 --> 01:54:36.840] We're already inhaling, what, flu vaccines?
[01:54:37.160 --> 01:54:38.680] Yeah, yeah, yeah, cool, yeah.
[01:54:38.680 --> 01:54:39.480] Did that come back?
[01:54:39.480 --> 01:54:41.240] I thought they stopped doing that.
[01:54:41.240 --> 01:54:45.240] Oh, I don't know, but I mean, it has it has been we have the technology to do that, yeah, yeah.
[01:54:45.320 --> 01:54:47.960] If it's a live virus, only for live viruses, though.
[01:54:48.120 --> 01:55:02.440] Oh, I see, yeah, and that's why they didn't want to necessarily, it was because it's a live virus, not because if it was inhaled, um, so yeah, this is an animal study, so it's got to go through, they're going to, there was in mice, they got to do bigger animals, and then people, and then, so, you know, five to ten years, whatever.
[01:55:03.000 --> 01:55:04.120] I guess we'll keep going back.
[01:55:04.120 --> 01:55:12.680] A recent study finds that doctors using ChatGPT Plus as an additional aid did no better at making diagnoses than those using only traditional methods.
[01:55:12.680 --> 01:55:15.560] Bob and Evan think this one is the fiction.
[01:55:15.560 --> 01:55:18.600] Kara thinks it's the fiction, although she didn't choose it officially.
[01:55:18.600 --> 01:55:23.960] Jay thinks this one is science, and this one is science.
[01:55:23.960 --> 01:55:24.440] What?
[01:55:25.480 --> 01:55:26.280] Oh, no!
[01:55:26.520 --> 01:55:27.720] Oh, shit.
[01:55:29.560 --> 01:55:30.280] What the hell?
[01:55:30.280 --> 01:55:32.840] Kara, man, you managed to pull it out.
[01:55:32.840 --> 01:55:33.480] Oh, my God.
[01:55:33.800 --> 01:55:35.400] Wow, awesome.
[01:55:35.400 --> 01:55:36.480] Like, talking.
[01:55:36.800 --> 01:55:38.680] Being right despite being wrong.
[01:55:38.680 --> 01:55:40.280] That is amazing.
[01:55:40.280 --> 01:55:40.640] Yeah.
[01:55:40.440 --> 01:55:41.960] So, so yeah, they did no better.
[01:55:41.960 --> 01:55:53.280] Now, again, again, this is an additional aid, meaning that the doctors in the group using ChatGPT had access to everything the other group had, plus ChatGPT.
[01:55:53.920 --> 01:55:58.000] And it did not help them make any more accurate diagnoses.
[01:55:58.000 --> 01:56:02.240] But here's the interesting thing I didn't include in the science or fiction.
[01:56:02.240 --> 01:56:11.200] When they compared those two groups to just ChatGPT by itself, ChatGPT outperformed both groups of physicians.
[01:56:11.200 --> 01:56:12.080] You should have included that.
[01:56:12.560 --> 01:56:13.360] Fascinating.
[01:56:13.360 --> 01:56:22.560] Even the group of physicians using ChatGPT, which means that the introduction of a physician resulted in less accurate diagnoses.
[01:56:22.560 --> 01:56:23.920] Yep, goodbye, doctors.
[01:56:24.480 --> 01:56:26.320] So much for all those boards and stuff.
[01:56:26.640 --> 01:56:28.400] That is bananas.
[01:56:28.400 --> 01:56:29.200] So, I know, it's crazy.
[01:56:29.360 --> 01:56:36.800] So, they were asking it more specific questions by then, as opposed to just blanket saying, what does this person have based on these symptoms?
[01:56:36.800 --> 01:56:42.960] Well, what the authors concluded, they did not conclude that we should therefore be having ChatGPT make diagnoses.
[01:56:42.960 --> 01:56:53.200] What they said was, clearly, we need to teach physicians how to optimally use these large language models as a diagnostic assistant because they're not using it right.
[01:56:53.200 --> 01:56:57.200] Whatever, they were just, they didn't help them get to a more accurate diagnosis.
[01:56:57.200 --> 01:57:04.560] So, clearly, they were not leveraging that technology, even though it is capable of making a more accurate diagnosis by itself.
[01:57:04.560 --> 01:57:07.520] Which is, I don't know, it's incredible.
[01:57:07.520 --> 01:57:16.800] Which means that analysis of a Martian meteorite indicates the presence of bodies of liquid water on the surface of Mars as late as 742 million years ago is the fiction.
[01:57:17.120 --> 01:57:18.120] So, here I'm looking at, here.
[01:57:18.080 --> 01:57:24.240] I'm looking at a goddamn article that says, Meteorite contains evidence of liquid water on Mars 742 million years ago.
[01:57:24.240 --> 01:57:24.640] I know.
[01:57:25.040 --> 01:57:25.360] I know.
[01:57:25.360 --> 01:57:26.240] That's exactly that.
[01:57:26.240 --> 01:57:27.600] Is the title of the article?
[01:57:27.920 --> 01:57:29.760] Cool, then we were right.
[01:57:29.880 --> 01:57:52.040] And I was hoping people who read that headline but didn't read into the article itself, especially the actual article, would not have picked up on the fact that they're not saying that there was liquid water on Mars because liquid water on Mars dried up three billion years ago.
[01:57:52.040 --> 01:58:17.480] And what they're saying was that there was probably still some geologic activity, like some volcanic activity, which there still is in pockets on Mars, that that activity melted the permafrost, causing a pocket of liquid water, and that that water affected the crystallization of the minerals in the meteorite that ultimately became a meteorite on Earth, you know, from Mars.
[01:58:18.040 --> 01:58:18.760] Interesting.
[01:58:18.760 --> 01:58:21.080] Yeah, so that was that was the deception.
[01:58:21.080 --> 01:58:29.880] Um, but yeah, if you just read the headline, you might be um it was water, deep water interacting with rock, not water on the surface of Mars.
[01:58:29.880 --> 01:58:31.400] Evan, give us a quote.
[01:58:32.120 --> 01:58:33.400] Here it is.
[01:58:33.400 --> 01:58:35.720] Steve, you uh requested this quote.
[01:58:35.720 --> 01:58:37.160] Yes, I like this quote.
[01:58:37.480 --> 01:59:05.520] I have a foreboding of an America in my children's or grandchildren's time when the United States is a service and information economy, when nearly all the key manufacturing industries have slipped away to other countries, when awesome technological powers are in the hands of a very few, and no one representing the public interest can even grasp the issues, when the people have lost the ability to set their own agendas or knowledgeably question those in authority.
[01:59:05.520 --> 01:59:21.440] When clutching our crystals and nervously consulting our horoscopes, our critical faculties in decline, unable to distinguish between what feels good and what's true, we slide almost without noticing back into superstition and darkness.
[01:59:21.760 --> 01:59:44.800] The dumbing down of America is most evident in the slow decay of substantive content in the enormously influential media, the 30-second soundbites, now down to 10 seconds or less, lowest common denominator programming, credulous presentations on pseudoscience and superstition, but especially a kind of celebration of ignorance.
[01:59:44.800 --> 01:59:58.080] And we should all know that, as Carl Sagan from his book, The Demon Haunted World, if you do not have a copy, you must, must go get a copy, along with a copy of our book, The Skeptic's Guide to the University.
[01:59:58.080 --> 02:00:02.160] Written in the 90s and, you know, basically very prescient.
[02:00:02.160 --> 02:00:03.280] You know, my gosh.
[02:00:03.280 --> 02:00:05.600] In terms of, you basically nailed it.
[02:00:05.600 --> 02:00:09.280] In terms of the case, I may have shared that on my socials the day after the election.
[02:00:09.440 --> 02:00:09.680] I know.
[02:00:09.680 --> 02:00:10.960] This was very a lot of people were sharing.
[02:00:10.960 --> 02:00:14.880] It's like, yeah, you got to read this because this is unfortunately true.
[02:00:14.880 --> 02:00:31.840] The idea that this is my giant frustration with this year is that the media does not meaningfully inform the electorate that people do not understand the issues well enough to make an informed decision.
[02:00:31.840 --> 02:00:43.120] And basically, the majority of people are susceptible to propaganda and manipulation and this kind of lowest common denominator kind of arguments, and it's very disappointing.
[02:00:43.440 --> 02:00:44.240] Hugely.
[02:00:44.800 --> 02:00:49.040] We're going to have some choice things to say about it at our year-end review show in a few days.
[02:00:49.200 --> 02:00:50.000] Oh, yeah.
[02:00:50.640 --> 02:00:57.840] We have a lot of work to do we the the information ecosystem of America is broken, in my opinion.
[02:00:57.840 --> 02:00:59.600] And we've been saying this for years.
[02:00:59.600 --> 02:02:18.680] This is nothing new you know this is it's just getting worse you know with social media and everything uh the loss of good science journalism the loss of good journalism period the you know rise of essentially propaganda media it's just you know we've lost our ability to make informed decisions as voters in this country at least in sufficient numbers that reality holds sway you know the the influence of reality and facts on people's decisions is too small you know to be definitive like it should be unfortunate okay that's the world we're living in thank you all for joining me this week you're welcome all right steve and until next week this is your skeptic's guide to the universe skeptics guide to the universe is produced by sgu productions dedicated to promoting science and critical thinking for more information visit us at the skepticsguide.org send your questions to info at the skepticsguide.org and if you would like to support the show and all the work that we do go to patreon.com slash skepticsguide and consider becoming a patron and becoming part of the sgu community our listeners and supporters are what make sgu possible