Debug Information
Processing Details
- VTT File: skepticast2024-08-17.vtt
- Processing Time: September 11, 2025 at 03:41 PM
- Total Chunks: 3
- Transcript Length: 160,411 characters
- Caption Count: 1,679 captions
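For context, a minimal sketch of how a pipeline like this might parse the VTT captions and split them into chunks for prompting. This is an illustration only, not the tool's actual code; the names parse_vtt and chunk_captions are invented, and it assumes standard WebVTT cue syntax (a timing line followed by one text line).

```python
import re

# Matches a WebVTT cue timing line such as "00:00:00.240 --> 00:00:05.680".
CUE_RE = re.compile(r"(\d{2}:\d{2}:\d{2}\.\d{3}) --> (\d{2}:\d{2}:\d{2}\.\d{3})")

def parse_vtt(text: str) -> list[tuple[str, str, str]]:
    """Return (start, end, caption) triples; assumes one text line per cue."""
    cues = []
    lines = text.splitlines()
    for i, line in enumerate(lines):
        m = CUE_RE.search(line)
        if m and i + 1 < len(lines):
            cues.append((m.group(1), m.group(2), lines[i + 1].strip()))
    return cues

def chunk_captions(cues: list, n_chunks: int = 3) -> list[list]:
    """Split the caption list into n roughly equal, contiguous chunks."""
    size = -(-len(cues) // n_chunks)  # ceiling division
    return [cues[i:i + size] for i in range(0, len(cues), size)]

# With 1,679 captions and 3 chunks, this yields chunks of 560, 560, and 559.
```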
Prompts Used
Prompt 1: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 1 of 3 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
[00:00:00.240 --> 00:00:05.680] I'm no tech genius, but I knew if I wanted my business to crush it, I needed a website now.
[00:00:05.680 --> 00:00:07.680] Thankfully, Bluehost made it easy.
[00:00:07.680 --> 00:00:12.400] I customized, optimized, and monetized everything exactly how I wanted with AI.
[00:00:12.400 --> 00:00:14.160] In minutes, my site was up.
[00:00:14.160 --> 00:00:15.200] I couldn't believe it.
[00:00:15.200 --> 00:00:18.240] The search engine tools even helped me get more site visitors.
[00:00:18.240 --> 00:00:21.680] Whatever your passion project is, you can set it up with Bluehost.
[00:00:21.680 --> 00:00:24.720] With their 30-day money-back guarantee, what do you got to lose?
[00:00:24.720 --> 00:00:26.400] Head to bluehost.com.
[00:00:26.400 --> 00:00:30.800] That's B-L-U-E-H-O-S-T.com to start now.
[00:00:33.680 --> 00:00:36.960] You're listening to the Skeptic's Guide to the Universe.
[00:00:36.960 --> 00:00:39.920] Your escape to reality.
[00:00:40.560 --> 00:00:43.120] Hello, and welcome to The Skeptic's Guide to the Universe.
[00:00:43.120 --> 00:00:48.080] Today is Tuesday, August 13th, 2024, and this is your host, Stephen Novella.
[00:00:48.080 --> 00:00:49.840] Joining me this week are Bob Novella.
[00:00:49.840 --> 00:00:50.400] Hey, everybody.
[00:00:50.400 --> 00:00:51.840] Cara Santa Maria.
[00:00:51.840 --> 00:00:52.240] Howdy.
[00:00:52.320 --> 00:00:53.280] Jay Novella.
[00:00:53.280 --> 00:00:53.840] Hey, guys.
[00:00:53.840 --> 00:00:55.520] And Evan Bernstein.
[00:00:55.520 --> 00:00:56.640] Good evening, everyone.
[00:00:56.640 --> 00:01:06.560] We're recording a day early because we are going to Chicago this weekend to do the extravaganza and to record the 1,000th episode.
[00:01:06.560 --> 00:01:10.080] So when this show goes out, there we will be.
[00:01:10.080 --> 00:01:14.880] That will be extravaganza day, and the day after the show goes out, we'll be recording the 1,000th episode.
[00:01:14.880 --> 00:01:18.000] So it's probably too late to get tickets.
[00:01:19.520 --> 00:01:20.960] Yeah, but try.
[00:01:20.960 --> 00:01:25.280] So, Jay, today is International Left-Handedness Day.
[00:01:25.280 --> 00:01:25.760] Yeah.
[00:01:26.080 --> 00:01:27.200] And you're left-handed.
[00:01:27.200 --> 00:01:28.480] So what am I supposed to do?
[00:01:28.480 --> 00:01:30.240] Like, let's really think about this.
[00:01:30.240 --> 00:01:32.720] How do I celebrate my left-handedness?
[00:01:33.440 --> 00:01:35.120] Like, what am I supposed to do?
[00:01:35.120 --> 00:01:37.360] You're supposed to give it a day off.
[00:01:37.360 --> 00:01:38.480] The rest of us are right-handed.
[00:01:38.480 --> 00:01:40.000] We have no idea how you think.
[00:01:41.280 --> 00:01:44.720] Go to the left-handed store, buy a bunch of left-handed items.
[00:01:45.200 --> 00:01:48.560] I'm a left-handed yank, drive on the left side of the road suddenly.
[00:01:48.880 --> 00:01:50.400] I kind of like being left-handed.
[00:01:51.600 --> 00:01:52.160] I don't know.
[00:01:52.160 --> 00:01:52.800] Jay.
[00:01:54.080 --> 00:02:00.440] Jay, I say this not to make you feel uncomfortable, because you are like my brother, so this is separate from you.
[00:01:59.760 --> 00:02:02.840] I am weirdly attracted to left-handed people.
[00:02:02.840 --> 00:02:03.160] Really?
[00:02:03.160 --> 00:02:03.640] That is weird.
[00:01:59.840 --> 00:02:04.760] I agree with you.
[00:01:59.920 --> 00:02:05.160] That is weird.
[00:02:05.480 --> 00:02:06.520] Has anybody else?
[00:02:06.600 --> 00:02:07.480] Have you noticed that?
[00:02:07.480 --> 00:02:08.280] And how do you know?
[00:02:08.680 --> 00:02:13.640] I've noticed that I think I date a higher-than-normal proportion of lefties.
[00:02:13.720 --> 00:02:15.400] You're sure you're not just... That's interesting, Cara.
[00:02:15.640 --> 00:02:20.040] I'm just wondering what other traits come along with being sinistral.
[00:02:20.200 --> 00:02:20.520] Sinister.
[00:02:20.600 --> 00:02:20.920] Sinister.
[00:02:21.160 --> 00:02:21.880] Sinistra.
[00:02:21.880 --> 00:02:22.440] Sinistra.
[00:02:22.520 --> 00:02:23.000] Sorry, Cara.
[00:02:23.000 --> 00:02:24.920] I didn't mean to kink shame you there, Cara.
[00:02:24.920 --> 00:02:25.240] Thank you.
[00:02:25.640 --> 00:02:31.560] What's funny is that that was an answer in a recent New York Times crossword puzzle.
[00:02:31.560 --> 00:02:32.200] Kinkshame.
[00:02:32.200 --> 00:02:33.320] I know, kink shame.
[00:02:33.320 --> 00:02:41.480] And my wife had never heard the word before, and she doubted it was real until I was proven correct when we finished the puzzle.
[00:02:41.800 --> 00:02:43.960] We got that one right away.
[00:02:44.920 --> 00:02:52.520] What I'm not sure about is why some people would decide, like, hey, we're going to push to have like a national, international left-handed day?
[00:02:52.520 --> 00:02:57.720] Like, you know, I wasn't teased or anything, so I don't need the support aspect of it.
[00:02:57.720 --> 00:02:58.680] I just don't get it.
[00:02:58.680 --> 00:03:01.000] Maybe people were teased back in the day a lot more.
[00:03:01.000 --> 00:03:02.520] I mean, a lot of people were forced to be right-handed.
[00:03:04.120 --> 00:03:08.200] I think when we were young, we were already past the anti-left-handed phase.
[00:03:08.360 --> 00:03:14.120] That was still, I remember when we were young, talking about, oh, in the past, people would force you to write right-handed and whatever, but that was already done.
[00:03:14.120 --> 00:03:14.680] Yeah.
[00:03:14.680 --> 00:03:28.760] But to be fair, even though in the past you were, you know, lefties were forced to write right-handed, Jay, you're still, just by virtue of the structures that you live in, forced to be right-handed about certain things, right?
[00:03:28.760 --> 00:03:30.120] I don't think so.
[00:03:30.120 --> 00:03:32.120] I mean, look, I have to use like scissors.
[00:03:32.280 --> 00:03:33.640] You have to use right-handed scissors.
[00:03:33.960 --> 00:03:38.520] I can cut with my right hand because most left-handed people learn to accommodate.
[00:03:38.840 --> 00:03:40.280] That's exactly what I mean.
[00:03:40.280 --> 00:03:45.760] There are things in this world that are handed, and you just have to learn how to use them.
[00:03:45.760 --> 00:03:49.360] Yeah, I mean, I could use both hands for lots of different things.
[00:03:50.880 --> 00:03:51.920] I mean, I.
[00:03:52.000 --> 00:03:54.080] I think lefties are even more likely to be ambidextrous.
[00:03:54.400 --> 00:03:55.920] I can shoot a gun both ways.
[00:03:55.920 --> 00:03:58.480] I can swing rackets both ways.
[00:04:00.000 --> 00:04:01.520] I can't golf both ways.
[00:04:02.640 --> 00:04:04.160] No swing, no backswing.
[00:04:04.160 --> 00:04:06.240] And I golf right-handed, by the way.
[00:04:06.880 --> 00:04:07.440] Oh, yeah?
[00:04:07.840 --> 00:04:09.520] But you shoot bows left-handed.
[00:04:09.520 --> 00:04:11.840] I shoot bows left-handed, but I could shoot the other way.
[00:04:11.840 --> 00:04:13.280] I could shoot just as good the other way.
[00:04:13.680 --> 00:04:20.640] So you probably can't remember, but I'm curious about y'all's experiences because Jay's the baby.
[00:04:20.640 --> 00:04:32.400] You know, my best friend has a little girl who's about to turn two, and it was obvious so early that she was left, that she is going, we say going to be, but whatever, that she is left-handed.
[00:04:32.400 --> 00:04:39.280] She just does things with her left hand that seem weird to us, but clearly are very natural for her.
[00:04:39.280 --> 00:04:40.400] When did you know?
[00:04:40.400 --> 00:04:43.760] Like, did you guys notice that Jay was left-handed when he was a little baby?
[00:04:43.760 --> 00:04:48.080] The first indication was when I was probably about three or four years old.
[00:04:48.080 --> 00:04:50.080] I was sucking my left thumb.
[00:04:50.480 --> 00:04:51.200] Oh, three or four.
[00:04:51.360 --> 00:04:53.360] That develops in the womb, thumb sucking.
[00:04:53.680 --> 00:04:55.280] I probably was doing that even earlier.
[00:04:56.000 --> 00:04:59.200] It emerges between 18 months and four years old, so that's not prenatal.
[00:04:59.440 --> 00:05:02.960] Yeah, this little girl is using her spoon with her left hand.
[00:05:03.440 --> 00:05:08.720] If you hand her a crayon and she goes to like take crayon to paper, she always grabs it with her left hand.
[00:05:08.880 --> 00:05:11.040] Will she cross the midline with her left hand?
[00:05:11.040 --> 00:05:11.520] I don't know.
[00:05:11.520 --> 00:05:12.080] We should check that.
[00:05:12.160 --> 00:05:12.800] That's the test.
[00:05:13.200 --> 00:05:13.440] Okay.
[00:05:13.760 --> 00:05:14.560] Cara, for sure.
[00:05:14.560 --> 00:05:16.720] Now that I think about it, we always knew.
[00:05:16.720 --> 00:05:17.200] Yeah.
[00:05:17.200 --> 00:05:17.440] Yeah.
[00:05:17.440 --> 00:05:19.760] My mom and I always knew there was something different.
[00:05:19.760 --> 00:05:39.640] And so, did you know that, okay, I think, and Steve, maybe you can fact-check me on this, or somebody can Google furiously, that left-handedness occurs in about 10% of the population, and that about 10% of those lefties have their language center of their brain lateralized to the right.
[00:05:40.280 --> 00:05:40.680] Something like that.
[00:05:40.680 --> 00:05:41.080] It might even be.
[00:05:41.240 --> 00:05:42.360] It's something like 10 of 10.
[00:05:42.360 --> 00:05:43.800] Yeah, it might not be a perfect number.
[00:05:44.280 --> 00:05:47.320] So, I wonder, Jay, have you ever had like an fMRI?
[00:05:47.320 --> 00:05:48.360] I don't think about this.
[00:05:48.360 --> 00:05:50.680] No, I've had MRIs, but nothing.
[00:05:50.760 --> 00:05:53.720] But they're not like a brain scan that shows function.
[00:05:53.720 --> 00:05:55.560] I would be so curious.
[00:05:56.040 --> 00:05:58.200] The estimates are between 15 and 30 percent.
[00:05:58.520 --> 00:05:59.000] Oh, wow, that's right.
[00:05:59.240 --> 00:06:01.880] Right hemisphere dominant for language in left-handed people.
[00:06:01.880 --> 00:06:04.760] And it's like 2 to 5% in right-handed people.
[00:06:04.760 --> 00:06:06.120] Oh, that's much higher than I thought.
[00:06:06.120 --> 00:06:06.920] That's amazing.
[00:06:06.920 --> 00:06:07.400] Yeah.
[00:06:07.400 --> 00:06:08.280] Yeah, that's so good.
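As a quick back-of-the-envelope check, the figures just quoted (10% lefties; 15-30% of them right-hemisphere dominant for language; 2-5% of righties) can be combined, taking midpoints. These are the discussion's rough estimates, not numbers from any single study.

```python
# Rough share of the whole population with right-hemisphere
# language dominance, using midpoints of the quoted ranges.
left_handed = 0.10           # ~10% of people are left-handed
right_dom_if_left = 0.225    # midpoint of the 15-30% estimate
right_dom_if_right = 0.035   # midpoint of the 2-5% estimate

overall = (left_handed * right_dom_if_left
           + (1 - left_handed) * right_dom_if_right)
print(f"~{overall:.1%} of the population")  # ~5.4%
```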
[00:06:08.440 --> 00:06:09.000] We have to know that.
[00:06:09.000 --> 00:06:15.720] Like, when you do a neurological exam, you're supposed to ask and record the handedness of the person for a couple of reasons.
[00:06:15.720 --> 00:06:24.360] One is when you're interpreting the exam, like they should be a little bit better with their dominant hand and their non-dominant hand on some of the things we have them do.
[00:06:24.360 --> 00:06:31.000] And if it's reversed, that could mean that they have like a stroke, you know, like that they're impaired with what should be their dominant hand.
[00:06:31.000 --> 00:06:39.960] But also, when we're localizing like a stroke, for example, you have to know if there is a reasonable probability that they're cross-dominant, right?
[00:06:39.960 --> 00:06:41.080] That their right hemisphere is dominant.
[00:06:41.240 --> 00:06:45.000] Yeah, you need to know functionally how those things are going to affect them then.
[00:06:45.000 --> 00:06:54.520] And also, from a neuropsych perspective, the neuropsych assessments that are given, handedness matters for a lot of the motor assessments because they're normed.
[00:06:54.520 --> 00:07:04.280] And so, if you don't know about their handedness and you look at the norms tables with an assumption that somebody is right-handed, you may misinterpret their scoring.
[00:07:04.280 --> 00:07:05.840] Yeah, no, it's interesting.
[00:07:05.480 --> 00:07:06.160] Yeah.
[00:07:06.440 --> 00:07:14.480] And, Evan, before we started the show, you were asking about the genetics of handedness, and the short answer is that it's complicated.
[00:07:14.040 --> 00:07:17.600] It is not a one-gene, one-trait thing.
[00:07:17.760 --> 00:07:26.800] It's not even necessarily just like it's a multi-I mean, it's definitely multi-gene, but it's nothing we could predict based upon the genes because it's partly developmental, too.
[00:07:26.800 --> 00:07:27.280] Yeah.
[00:07:27.280 --> 00:07:27.600] Right?
[00:07:27.840 --> 00:07:31.920] Meaning that it's not 100% determined just by the genes.
[00:07:31.920 --> 00:07:34.960] So there's no simple inheritance pattern as the bottom line.
[00:07:34.960 --> 00:07:38.080] Like for Jay, everyone else in our family is right-handed.
[00:07:38.080 --> 00:07:40.640] Jay's an isolated left-hander, you know.
[00:07:40.640 --> 00:07:43.440] And Jay, what were you saying about you and your wife and your kids?
[00:07:43.440 --> 00:07:47.040] Yeah, so my wife is left-handed, but my kids are both right-handed.
[00:07:47.040 --> 00:07:48.480] That's fascinating.
[00:07:48.480 --> 00:07:59.520] Yeah, children of two left-handed parents are still more likely to be right-handed, even though they're more likely to be left-handed than people born of right-handed parents.
[00:07:59.760 --> 00:08:03.440] So there is an increased probability, but it's still the minority.
[00:08:03.760 --> 00:08:16.960] Oh, and even just recently, like there was a study that was literally just published in April of this year where a rare genetic variant was identified that presents in fewer than 1% of people.
[00:08:16.960 --> 00:08:22.720] But in those people, it's 2.7 times more likely to be found in lefties than righties.
[00:08:22.960 --> 00:08:25.760] So it's like they're still identifying the complex genetics.
[00:08:25.760 --> 00:08:26.160] Yeah.
[00:08:26.480 --> 00:08:38.320] And identical twins have a higher chance of matching handedness, but they can have opposite handedness, which tells you it's not 100% genetic; otherwise, it would have to be the same every time.
[00:08:38.320 --> 00:08:39.200] But it's not.
[00:08:39.200 --> 00:08:41.520] And usually they also would look with twin studies, right?
[00:08:41.520 --> 00:08:51.920] You want to look at those really interesting studies where they're monozygotic, but they were raised separately and see if that varies based on raised separately versus raised together.
[00:08:51.920 --> 00:08:54.120] So that's left-handedness, Dang, for you.
[00:08:53.920 --> 00:08:54.280] Huh.
[00:08:54.560 --> 00:08:55.200] Yeah.
[00:08:55.200 --> 00:08:58.160] Wasn't it Flanders?
[00:08:58.160 --> 00:08:59.360] Was he left-handed?
[00:08:59.360 --> 00:08:59.880] Yes.
[00:08:59.880 --> 00:09:01.560] Ned Flanders, from The Simpsons.
[00:09:01.640 --> 00:09:02.520] Yeah, he had the Leftorium.
[00:09:02.680 --> 00:09:03.240] The left-handed store.
[00:09:03.400 --> 00:09:04.120] Yeah, the Leftorium.
[00:08:59.520 --> 00:09:05.160] The Leftorium.
[00:09:05.480 --> 00:09:08.520] I think there's one of those in like at Universal Studios here.
[00:09:09.720 --> 00:09:10.440] Like a left-handed.
[00:09:10.600 --> 00:09:13.560] With legitimate left-hand devices for purchase.
[00:09:13.880 --> 00:09:15.240] Oh, my gosh.
[00:09:15.560 --> 00:09:19.000] Don't quote me on that, but I'm pretty sure there is, or maybe it's at Disney.
[00:09:19.320 --> 00:09:21.320] I wouldn't be surprised.
[00:09:21.640 --> 00:09:35.160] Jay, the thing that, to me, seems like it would be the one that would annoy me most about living in a right-handed world, if I were left-handed, is that when you write, you have to push your letters rather than pull them.
[00:09:35.480 --> 00:09:38.920] And you have to move your hand over what you've already written.
[00:09:38.920 --> 00:09:40.280] Yeah, so you're smudging.
[00:09:40.440 --> 00:09:42.040] Side of your hand's always smudgy.
[00:09:42.040 --> 00:09:42.600] Always.
[00:09:42.600 --> 00:09:47.000] Always get blue on my pinky and palm, like side of my palm.
[00:09:47.000 --> 00:09:48.440] And I have sloppy handwriting.
[00:09:48.440 --> 00:09:49.560] I think a lot of lefties do.
[00:09:49.560 --> 00:09:52.200] It's very difficult to write neatly.
[00:09:52.200 --> 00:09:54.040] That said, you know, I don't care.
[00:09:54.040 --> 00:09:54.520] Yeah.
[00:09:54.520 --> 00:09:56.520] Well, you use a keyboard more than you write most likely.
[00:09:56.760 --> 00:09:57.880] No, it's not as big a deal.
[00:09:58.200 --> 00:10:00.120] I love left-handed dominant keyboards.
[00:10:00.440 --> 00:10:02.120] I love writing with pen and paper, though.
[00:10:02.120 --> 00:10:04.920] I love using pencils and pens and whatever.
[00:10:05.240 --> 00:10:06.600] I've always loved to do that.
[00:10:06.600 --> 00:10:10.040] I love to draw, even though I don't draw well, but I do it all the time.
[00:10:10.040 --> 00:10:15.240] Are you one of those left-handers, Jay, who writes above the words rather than below?
[00:10:15.240 --> 00:10:15.480] Oh.
[00:10:15.800 --> 00:10:16.760] Oh, interesting.
[00:10:16.760 --> 00:10:17.320] That's weird.
[00:10:17.320 --> 00:10:17.960] I remember that.
[00:10:17.960 --> 00:10:18.520] What do you mean?
[00:10:19.320 --> 00:10:22.360] Like, when I write, my hand is below the sentence I'm writing.
[00:10:22.360 --> 00:10:27.480] But I've seen left-handers who hold their hand above the sentence that they're writing.
[00:10:27.560 --> 00:10:28.600] Mine's parallel-ish a little bit.
[00:10:28.680 --> 00:10:33.080] They bend their wrist down so that they're looking at it from the top rather than writing from below.
[00:10:33.720 --> 00:10:34.840] Yeah, mine's parallel.
[00:10:34.840 --> 00:10:35.400] Oh, my gosh.
[00:10:35.400 --> 00:10:37.480] Okay, so I had a question come up.
[00:10:38.120 --> 00:10:42.920] I had a question come up, and I just Googled it, and BBC Science Focus covered it.
[00:10:42.920 --> 00:10:50.400] I was wondering: are there more left-handed people in cultures where they write right to left?
[00:10:50.720 --> 00:10:54.080] Because it's Arabic, Hebraic, yeah, exactly.
[00:10:54.400 --> 00:11:01.920] And apparently, according to this article in BBC Science Focus, written by Luis Villazon, quite the reverse.
[00:11:01.920 --> 00:11:12.560] Various surveys have found that the highest incidence of left-handedness is in Western countries with about 13% of the populations in the USA, Canada, and the Netherlands.
[00:11:12.560 --> 00:11:14.080] The UK is just behind it.
[00:11:14.080 --> 00:11:22.640] Most of the countries that write right to left are predominantly Asian and Arabic, and they have left-handedness rates below 6%.
[00:11:22.640 --> 00:11:29.440] So it says in Muslim countries, the advantage of smudge-free handwriting is outweighed by the fact that the left hand is considered unclean.
[00:11:29.440 --> 00:11:30.320] Right, it's cultural.
[00:11:30.640 --> 00:11:32.880] Yeah, fascinating.
[00:11:32.880 --> 00:11:34.720] I love learning new things.
[00:11:34.720 --> 00:11:35.200] No, you do.
[00:11:36.560 --> 00:11:40.080] We didn't even know until 20 minutes ago that it was Left-Handedness Day, frankly.
[00:11:40.640 --> 00:11:41.840] And here we are.
[00:11:42.160 --> 00:11:43.360] Here we are.
[00:11:43.360 --> 00:11:44.000] Learning.
[00:11:44.320 --> 00:11:52.640] I wonder if that also has to do with them killing left-handed people more in the past and selecting against it.
[00:11:53.280 --> 00:11:54.160] Those little genetic.
[00:11:54.240 --> 00:11:56.480] And then also just trained out of it culturally.
[00:11:57.040 --> 00:11:59.440] But that wouldn't really affect the genetics of it.
[00:12:00.640 --> 00:12:02.560] Yeah, if it's still not accepted.
[00:12:02.560 --> 00:12:12.480] But I'm saying, like, throughout history, because this has been true in many cultures, even in Western cultures, that being left-handed could be a sign of the devil or the sign of whatever.
[00:12:12.480 --> 00:12:16.960] And they were more likely to be persecuted, more likely to be punished and executed.
[00:12:16.960 --> 00:12:27.920] And I wonder if there are just cultural differences in how aggressively they slaughtered left-handed people that actually affect the probability, yeah, the incidence of right-handed versus left-handed.
[00:12:28.240 --> 00:12:33.640] Yeah, it makes you wonder, if that history weren't there, would it be more like half and half?
[00:12:33.640 --> 00:12:36.520] Or, you know, what would be the variation?
[00:12:29.760 --> 00:12:37.160] All right.
[00:12:37.400 --> 00:12:41.160] Jay, you're going to get us started with a discussion of a new scam.
[00:12:41.160 --> 00:12:41.560] All right.
[00:12:41.560 --> 00:12:46.440] So I picked a particular topic of so many.
[00:12:46.440 --> 00:12:48.200] I was just talking to Bob before the show.
[00:12:48.200 --> 00:12:54.760] I was reading about a scam on elderly people to rob them of their retirement, which I'll talk about in another show.
[00:12:54.760 --> 00:13:02.840] But I'm going to talk about election-based scams and how they manifest and how they could affect just average people.
[00:13:02.840 --> 00:13:07.800] So one thing that they do, you're probably aware that they send phishing emails.
[00:13:07.800 --> 00:13:12.120] They'll send emails; they create websites that look like official websites.
[00:13:12.120 --> 00:13:18.280] They might ask for personal information like your social security number, your voter registration details.
[00:13:18.280 --> 00:13:23.800] They might say to you that they want to confirm that you're registered or are you eligible to vote.
[00:13:23.800 --> 00:13:26.360] They're just trying to get personal information from you.
[00:13:26.360 --> 00:13:28.680] And then with all of these, I'm going to read to you.
[00:13:28.680 --> 00:13:34.120] You got to be careful and make sure that you're on the actual official websites.
[00:13:34.120 --> 00:13:37.640] This is hard because they can make the website look exactly like the original.
[00:13:37.960 --> 00:13:38.680] Never click a link.
[00:13:38.920 --> 00:13:39.960] Go there directly.
[00:13:39.960 --> 00:13:43.560] Yeah, so you need to be very, very fastidious about this.
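As an illustration of why "never click a link, go there directly" matters: the text a link displays and the URL it actually targets can disagree. A toy sketch follows; both domains are invented, and the last-two-labels check is a crude stand-in for a real public-suffix-aware comparison.

```python
from urllib.parse import urlparse

# Toy example: the text a link displays vs. where it actually points.
# Both URLs are invented for illustration.
visible_text = "https://www.vote.example.gov/register"
actual_href = "https://vote-example-gov.sketchy.example/register"

def registered_domain(url: str) -> str:
    """Crude last-two-labels domain; real code should use a public suffix list."""
    host = urlparse(url).hostname or ""
    return ".".join(host.split(".")[-2:])

if registered_domain(visible_text) != registered_domain(actual_href):
    print("Warning: this link does not go where its text claims.")
```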
[00:13:44.840 --> 00:13:46.680] They have fake donation pages.
[00:13:47.320 --> 00:13:49.320] This is very common.
[00:13:49.320 --> 00:13:52.040] Like we get a lot of texts asking for money.
[00:13:52.360 --> 00:13:56.600] Scammers will set up these fake donation websites for political campaigns.
[00:13:56.600 --> 00:13:58.520] And the pages look very convincing.
[00:13:58.520 --> 00:14:00.520] They use official-looking logos.
[00:14:00.520 --> 00:14:02.520] They might be using the exact real logos.
[00:14:02.520 --> 00:14:04.840] They can just download them off the other websites.
[00:14:04.840 --> 00:14:08.840] And they're going to try to get them to you through text and email, social media.
[00:14:08.840 --> 00:14:19.680] The money that they collect does not go to anything but the scammer; it's not going to the political cause, and you cannot deduct it on your taxes, which you normally can for other types of donations.
[00:14:19.680 --> 00:14:22.720] They have voter suppression tactics.
[00:14:22.720 --> 00:14:39.040] Some scams aim simply to suppress voter turnout by spreading misinformation: things about voting procedures, incorrect polling locations, fake deadlines, misleading information about voter ID requirements.
[00:14:39.280 --> 00:14:41.680] These are all out there just to confuse people.
[00:14:41.680 --> 00:14:46.160] Again, go to your state or town website, find the information yourself.
[00:14:46.560 --> 00:14:49.120] Don't let people send you the information.
[00:14:49.120 --> 00:14:50.640] Go get it yourself.
[00:14:50.880 --> 00:14:53.040] Phone scams pretending to be pollsters.
[00:14:53.040 --> 00:14:54.480] This is very common.
[00:14:54.480 --> 00:15:03.920] Scammers will call people pretending to be conducting political polls, and during that call, they might ask you for sensitive information saying it's necessary for the poll.
[00:15:03.920 --> 00:15:05.040] Make sure it's unique.
[00:15:05.040 --> 00:15:08.080] They don't, you know, double-ask people questions, all that stuff.
[00:15:08.080 --> 00:15:09.440] It's all BS.
[00:15:09.760 --> 00:15:15.360] They're just collecting your personal data and they want to commit identity fraud on you.
[00:15:15.360 --> 00:15:18.560] The text message scams, I get a ton of these.
[00:15:18.560 --> 00:15:28.720] Scammers will send text messages pretending to be from, again, you know what I'm going to say, political campaigns, websites; it could be from the person themselves.
[00:15:28.720 --> 00:15:34.560] It could be Trump or Harris that is going to be texting you directly with your name.
[00:15:34.560 --> 00:15:37.680] Hey, Steve, blah, blah, blah, send me money, right?
[00:15:37.680 --> 00:15:38.960] You got to be really careful.
[00:15:38.960 --> 00:15:49.600] If you want to donate money to a political campaign, just make sure you go to the official websites, do some searching yourself, ask, you know, ChatGPT, whatever, but do the research yourself.
[00:15:49.600 --> 00:15:52.800] Don't respond to things that people send you.
[00:15:52.800 --> 00:15:56.080] And then fake voting assistance services.
[00:15:56.400 --> 00:16:06.920] Scammers may offer services that are there to assist you with voting, such as helping you fill out absentee ballots, offering to deliver the ballots on your behalf, blah, blah, blah.
[00:16:07.000 --> 00:16:12.760] These services are typically fraudulent and they may steal your vote and personal information.
[00:16:12.760 --> 00:16:14.120] Imagine that.
[00:16:14.120 --> 00:16:19.800] You find out that they cast a ballot for the exact opposite person you want to be in office.
[00:16:19.800 --> 00:16:30.200] So to protect yourself from these scams, election scams, other scams, it's important to verify any election-related communication through the actual official channels.
[00:16:30.200 --> 00:16:34.520] Going to the state websites is a good idea, or your town websites.
[00:16:34.520 --> 00:16:35.880] Verify the links.
[00:16:35.880 --> 00:16:37.320] Just be fastidious.
[00:16:37.320 --> 00:16:42.520] The other thing is, never provide personal information in response to unsolicited requests.
[00:16:42.520 --> 00:16:44.040] Ever, ever.
[00:16:44.040 --> 00:16:59.000] If your social security number is in play, if they want bank information, if they want anything that feels like they shouldn't need it in order to do what they're doing, make a huge red flag go up and do not do it.
[00:16:59.000 --> 00:17:00.280] Always call back.
[00:17:00.280 --> 00:17:04.360] Tell them, hey, I'm going to call the official phone number on the website.
[00:17:04.360 --> 00:17:06.520] If they give you grief about that, it's a scam.
[00:17:06.520 --> 00:17:07.160] You know what I mean?
[00:17:07.320 --> 00:17:09.000] You just got to be really careful.
[00:17:09.000 --> 00:17:11.080] And then, this is really important.
[00:17:11.080 --> 00:17:17.080] And I've said this before: tell everybody in your life, especially old people, what I just said.
[00:17:17.080 --> 00:17:18.360] All of this stuff.
[00:17:18.360 --> 00:17:24.040] Because statistically, the older people are, the more likely they are to fall for these scams.
[00:17:24.040 --> 00:17:29.880] And it's just really easy for you to call up your mom or your dad or whoever and just be like, hey, guess what?
[00:17:30.280 --> 00:17:31.240] You got to watch out.
[00:17:31.240 --> 00:17:34.200] People are, you know, we're in election season here.
[00:17:34.200 --> 00:17:37.400] People are going to be soliciting you for personal information.
[00:17:37.720 --> 00:17:39.480] They could do lots of different things to you.
[00:17:39.480 --> 00:17:44.120] Just don't respond to anything and say things like, hey, if you want to donate any money, I'll do it for you.
[00:17:44.120 --> 00:17:45.920] Let me know and I'll take care of it for you.
[00:17:45.920 --> 00:17:46.320] All right.
[00:17:46.320 --> 00:17:47.520] Thank you, Jay.
[00:17:47.520 --> 00:17:49.920] All right, Cara, tell us about childhood vaccines.
[00:17:49.920 --> 00:17:50.880] Are these a good thing?
[00:17:51.520 --> 00:17:52.000] I don't know.
[00:17:52.000 --> 00:17:52.880] What are you going to say?
[00:17:53.680 --> 00:17:55.040] I might have an opinion on that.
[00:17:55.440 --> 00:17:56.240] Maybe a little one.
[00:17:56.480 --> 00:18:13.120] So the CDC just released in their Morbidity and Mortality Weekly Report, on August 8th, a report titled Health and Economic Benefits of Routine Childhood Immunizations in the Era of the Vaccines for Children Program, United States, 1994 to 2023.
[00:18:13.120 --> 00:18:29.040] So if you didn't know, since 1994 here in the United States, there's been a program called Vaccines for Children, VFC, which has covered the cost of vaccines for children if their families would not otherwise have been able to afford the vaccines.
[00:18:29.040 --> 00:18:44.560] And so this study looked at the health benefits and the economic impact of routine immunization among both VFC eligible and non-eligible children born between 1994 and 2023.
[00:18:44.560 --> 00:18:48.160] They looked at nine different childhood vaccines.
[00:18:48.160 --> 00:18:52.000] Ooh, am I going to know what these are based on their initialisms?
[00:18:52.000 --> 00:18:52.560] Let's see.
[00:18:53.440 --> 00:18:54.400] Thank you so much.
[00:18:54.400 --> 00:18:54.720] Help.
[00:18:55.040 --> 00:18:56.560] You're welcome, Bob.
[00:18:57.200 --> 00:19:07.360] I think this is diphtheria, tetanus, and pertussis, so DTP or DTaP; Hib, Haemophilus influenzae type b.
[00:19:07.360 --> 00:19:08.800] I think that's what that is.
[00:19:08.800 --> 00:19:11.600] OPV/IPV.
[00:19:11.600 --> 00:19:16.400] That is oral and inactivated poliovirus vaccine.
[00:19:16.720 --> 00:19:23.760] Also, MMR, measles, mumps, rubella, hepatitis B, VAR.
[00:19:23.760 --> 00:19:25.040] Oh, that's Varicella.
[00:19:25.040 --> 00:19:26.640] Oh, I'm so jealous.
[00:19:26.960 --> 00:19:29.360] I wish we had Varicella vaccines.
[00:19:29.360 --> 00:19:31.400] Yeah, we all had to get the chicken pox.
[00:19:31.880 --> 00:19:32.920] And then rota.
[00:19:32.920 --> 00:19:36.120] So that's the rotavirus, I'm assuming.
[00:19:36.680 --> 00:19:39.080] So, what do you think they found?
[00:19:41.000 --> 00:19:41.800] It helped, right?
[00:19:41.800 --> 00:19:44.120] No, but let's look at the magnitude of these findings.
[00:19:44.120 --> 00:19:50.040] And then I want to talk a little bit about something that is very worrisome based on a recent Gallup poll.
[00:19:50.360 --> 00:20:08.040] So among those children born 94 to 2023, based on the analyses in this report, routine childhood vaccines would have prevented approximately, or had prevented approximately 508 million cases of illness.
[00:20:08.040 --> 00:20:08.600] Wow.
[00:20:08.600 --> 00:20:16.040] 32 million hospitalizations and 1.129 million deaths.
[00:20:16.040 --> 00:20:17.880] 1,129,000 deaths.
[00:20:19.080 --> 00:20:22.440] And now let's talk over, that's a long span.
[00:20:22.440 --> 00:20:24.600] That's 20, almost 20 years.
[00:20:24.600 --> 00:20:25.000] Okay.
[00:20:25.000 --> 00:20:25.480] Wow.
[00:20:25.480 --> 00:20:28.120] No, almost 30 years, 94 to 2023.
[00:20:28.120 --> 00:20:31.080] Let's talk about the cost savings.
[00:20:31.080 --> 00:20:39.880] You know, it always feels a little icky when we talk about cost savings when something that is, you know, as extreme as illness and death.
[00:20:40.200 --> 00:20:49.400] But it does cost money to treat disease, and it costs money to try to save a child who has a vaccine-preventable illness.
[00:20:49.720 --> 00:20:56.360] So, according to this report, there are two different ways that they looked at these cost savings.
[00:20:56.360 --> 00:21:07.640] The first one was direct health care costs, and the direct health care costs, they saw a savings of $540 billion.
[00:21:07.640 --> 00:21:15.000] So, $540 billion in just the direct health care that would have been required if these children had not been vaccinated.
[00:21:16.080 --> 00:21:29.120] But then they looked at something even larger and they said, okay, well, even more than that, what are these other costs, these kind of externalized costs, these societal costs?
[00:21:29.120 --> 00:21:35.200] And they estimate that at $2.7 trillion.
[00:21:35.920 --> 00:21:45.440] And that includes things like: what if your parents have to stay home from work to care for the child while they're sick and they lose wages from that?
[00:21:45.440 --> 00:21:50.720] What if a child becomes permanently disabled from an infection, which can definitely happen?
[00:21:50.720 --> 00:21:55.520] And, you know, then there are costs throughout the lifespan of supporting that child.
[00:21:55.520 --> 00:21:59.920] So those additional costs bump that estimate up to 2.7 trillion.
[00:21:59.920 --> 00:22:11.040] That seems to be on par with savings from like curing hepatitis C or savings from the legislation like the Clean Air Act.
[00:22:11.040 --> 00:22:16.640] The ROI on childhood vaccines is literally bananas.
[00:22:16.640 --> 00:22:17.040] Yeah, it is.
[00:22:17.280 --> 00:22:18.080] The amount of lives saved.
[00:22:18.640 --> 00:22:22.400] It is the most cost-effective public health measure.
[00:22:22.880 --> 00:22:23.600] Prevention?
[00:22:23.600 --> 00:22:24.080] Always.
[00:22:24.080 --> 00:22:24.640] 100%.
[00:22:25.040 --> 00:22:28.000] When is prevention not one of the best health savings?
[00:22:28.000 --> 00:22:28.240] Right.
[00:22:28.240 --> 00:22:29.840] And vaccines, especially.
[00:22:29.840 --> 00:22:34.400] I mean, because it's like really well-established prevention.
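To put the report's 30-year totals quoted above on an annual scale, here is simple division of those figures; nothing beyond arithmetic is assumed.

```python
# Annualizing the 1994-2023 totals quoted from the CDC report.
years = 30
cases_prevented = 508_000_000
hospitalizations_prevented = 32_000_000
deaths_prevented = 1_129_000
direct_savings_usd = 540e9
societal_savings_usd = 2.7e12

print(f"~{cases_prevented / years:,.0f} cases averted per year")
print(f"~{hospitalizations_prevented / years:,.0f} hospitalizations averted per year")
print(f"~{deaths_prevented / years:,.0f} deaths averted per year")
print(f"~${direct_savings_usd / years / 1e9:.0f}B direct savings per year")
print(f"~${societal_savings_usd / years / 1e9:.0f}B societal savings per year")
```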
[00:22:34.400 --> 00:22:35.760] So here's the thing.
[00:22:35.760 --> 00:22:42.800] That may be the reality of the situation, but a recent Gallup poll, and when I say recent, this was published August 7th.
[00:22:42.800 --> 00:22:47.840] The polling was done from like July 1st to the 21st, 2024.
[00:22:47.840 --> 00:22:48.880] So this is pretty recent.
[00:22:48.880 --> 00:22:53.840] A Gallup poll showed that there have been a lot of changes in how U.S.
[00:22:53.840 --> 00:22:57.440] adults view childhood vaccinations.
[00:22:57.440 --> 00:22:59.440] And this is pretty scary.
[00:22:59.440 --> 00:23:08.600] So, when asked if it is important for parents to have their children vaccinated, we saw a huge change.
[00:23:08.600 --> 00:23:15.000] And that change was heavily lateralized according to political affiliation.
[00:23:15.000 --> 00:23:17.880] And where do you think the biggest changes have taken place?
[00:23:17.880 --> 00:23:20.440] This data was from 2002 to 2024.
[00:23:20.440 --> 00:23:22.520] We can look at trends across that line.
[00:23:24.040 --> 00:23:27.480] What political party do you think saw the biggest change in their views?
[00:23:27.960 --> 00:23:29.240] It's got to be the Republican.
[00:23:29.480 --> 00:23:32.360] Democrat, Republican.
[00:23:33.000 --> 00:23:33.160] Right.
[00:23:33.160 --> 00:23:46.280] So historically, we often think of vaccine hesitancy, vaccine denialism, as sort of being like an anti-science or a pseudoscience problem on the left.
[00:23:46.280 --> 00:23:53.800] But what's interesting is that it's actually long been pretty consistent across political parties.
[00:23:54.360 --> 00:24:03.080] What's really interesting is that in the past only like four years, there's been a massive shift in Republican views.
[00:24:03.080 --> 00:24:08.280] So when asked, for example, how important is it that parents get their children vaccinated?
[00:24:08.280 --> 00:24:13.640] Extremely important, very important, somewhat important, not very important, or not at all important.
[00:24:13.960 --> 00:24:22.600] Those who answered extremely important in 2002, it was 66% of Democrats and 62% of Republicans.
[00:24:22.600 --> 00:24:29.880] In 2020, it was 67% of Democrats and 52% of Republicans.
[00:24:30.680 --> 00:24:37.880] In 2024, 63% of Democrats, 26% of Republicans.
[00:24:37.880 --> 00:24:39.320] Yeah, it's tanking.
[00:24:39.640 --> 00:24:40.600] Tanking.
[00:24:40.600 --> 00:24:41.480] It's tanking.
[00:24:41.960 --> 00:24:43.440] It's got to be, what were they asking?
[00:24:43.640 --> 00:24:44.560] It has to be the COVID.
[00:24:45.040 --> 00:24:49.600] They asked, how important is it that parents get their children vaccinated?
[00:24:44.440 --> 00:24:50.880] Wow.
[00:24:52.080 --> 00:25:08.000] So again, back in 2001, on average, 94% of respondents said that it was extremely or very important, and only 69% of respondents are saying that right now.
[00:25:08.640 --> 00:25:11.920] Massive decline in these viewpoints.
[00:25:11.920 --> 00:25:30.880] We've also seen, you know, some other trends and changes across the board, but I think this one is the most telling and the most worrisome, because nothing changed in, actually, sorry, when I gave you those numbers before as 2020 to 2024, it was really late 2019 to 2024.
[00:25:30.880 --> 00:25:33.920] So it was, of course, pre-COVID versus after.
[00:25:33.920 --> 00:25:34.880] Right, okay.
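Laid out as data, the "extremely important" figures quoted above look like this (the middle wave is labeled 2019 per the correction just made; values are as quoted from Gallup in the discussion).

```python
# Percent answering "extremely important" that parents vaccinate their
# children, per the Gallup figures quoted in the discussion.
extremely_important = {
    "Democrats":   {2002: 66, 2019: 67, 2024: 63},
    "Republicans": {2002: 62, 2019: 52, 2024: 26},
}

for party, series in extremely_important.items():
    print(f"{party}: {series[2002]}% -> {series[2019]}% -> {series[2024]}% "
          f"({series[2002] - series[2024]}-point drop since 2002)")
```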
[00:25:34.880 --> 00:25:38.720] So nothing changed except the rhetoric, right?
[00:25:38.720 --> 00:25:44.080] The outcomes of childhood vaccinations were just as strong.
[00:25:44.080 --> 00:25:51.840] The health consequences, those lives saved, the amount of money saved within the economy, just as robust.
[00:25:51.840 --> 00:25:54.400] The only thing that changed was the rhetoric.
[00:25:54.400 --> 00:25:58.560] And then here's another big one that I think I would be remiss if I didn't mention.
[00:25:58.560 --> 00:26:00.400] Here's another question that was asked.
[00:26:00.400 --> 00:26:06.640] Do you think vaccines are more dangerous than the diseases they are designed to prevent?
[00:26:06.640 --> 00:26:12.240] So once again, do you think vaccines are more dangerous than the diseases they are designed to prevent?
[00:26:12.240 --> 00:26:17.360] Overall, in 2001, only 6% of people said yes.
[00:26:17.360 --> 00:26:20.000] They are more dangerous than the diseases themselves.
[00:26:20.000 --> 00:26:23.760] In 2024, 20% of people said yes.
[00:26:23.760 --> 00:26:24.640] It's crazy.
[00:26:24.640 --> 00:26:40.520] And when you look at the difference between Democrats and Democratic-leaning individuals and Republicans and Republican-leaning individuals: in 2001, 6% of Republicans said yes, the vaccines are more dangerous than the diseases they're designed to prevent.
[00:26:40.520 --> 00:26:42.840] 5% of Democrats said yes.
[00:26:42.840 --> 00:26:46.440] 2019, 12% of Republicans said yes.
[00:26:46.440 --> 00:26:48.200] 10% of Democrats said yes.
[00:26:48.520 --> 00:26:56.760] 2024, 31% of Republicans said yes, 5% of Democrats said yes.
[00:26:57.880 --> 00:27:12.920] 31% of Democrats who, I'm sorry, of Republican respondents to this Gallup poll in July of 2024 said that vaccines are more dangerous than the diseases they are designed to prevent.
[00:27:13.240 --> 00:27:13.640] Amazing.
[00:27:14.120 --> 00:27:17.240] That's some powerful influence.
[00:27:17.240 --> 00:27:38.760] You know, like not only is this a beautiful example of a theme that we often talk about on this show, which is that the data, the actual data, which is easily accessible, is completely out of step with the rhetoric and the beliefs on these highly kind of charged political topics.
[00:27:38.760 --> 00:27:56.600] But it's also a beautiful example of how much influence certain individuals can have on the viewpoints of many people in this country and how those changing viewpoints will directly affect public policy.
[00:27:56.600 --> 00:28:05.640] Oh my gosh, we've known this, especially since the "vaccines cause autism" scare from the late 90s and early 2000s.
[00:28:05.640 --> 00:28:07.080] We've seen this.
[00:28:07.720 --> 00:28:17.440] But, and you say that, Evan, that we have seen this, but if you actually look at the data, the vaccines cause autism scare in the 90s and 2000s.
[00:28:17.440 --> 00:28:20.000] Yes, it did affect trust.
[00:28:20.000 --> 00:28:24.240] You do see a change, but I don't think it was this extreme.
[00:28:24.240 --> 00:28:29.200] Because the internet wasn't as widely accessible back then as it is today.
[00:28:29.200 --> 00:28:31.040] So that's definitely part of it.
[00:28:31.040 --> 00:28:47.120] It also shows this tendency that we have, this sort of cognitive bias, to take fear-mongering information and then apply it to other aspects of our thinking.
[00:28:47.120 --> 00:29:00.000] Because most of the rhetoric was COVID vaccine rhetoric, yet we are seeing a massive impact on questions about whether parents should vaccinate their children against childhood diseases.
[00:29:00.000 --> 00:29:02.960] Yeah, well, it's a deterioration of trust in authority.
[00:29:03.280 --> 00:29:04.960] Yeah, it's really scary.
[00:29:04.960 --> 00:29:16.560] You know, I think that this is a big deal because what we're seeing now, this study about the Vaccines for Children program is showing us the real impact of vaccinations.
[00:29:16.560 --> 00:29:21.440] This Gallup poll is showing us what people think right now.
[00:29:21.440 --> 00:29:33.520] So I'm kind of scared to see the outcome studies in five to ten years of children who were refused their vaccines by their parents and how they fared.
[00:29:33.840 --> 00:29:36.960] Will we start to see the re-emergence of some things that we're talking about?
[00:29:37.120 --> 00:29:38.320] We already have.
[00:29:38.400 --> 00:29:40.720] I mean, we've been seeing measles outbreaks.
[00:29:40.720 --> 00:29:41.120] That would have been a lot of fun.
[00:29:41.280 --> 00:29:42.800] Yeah, we have seen those before.
[00:29:42.800 --> 00:29:44.800] You know, we've talked about the California, right?
[00:29:44.800 --> 00:29:45.920] The whole Disneyland thing.
[00:29:45.920 --> 00:29:50.880] We've talked about Minnesota kind of being a hotbed for that occurring every now and again.
[00:29:50.880 --> 00:29:54.240] And I can tell you, like, and this is just, you know, a personal experience thing.
[00:29:54.240 --> 00:29:58.720] So all of us on this podcast are old enough that we got chickenpox, right?
[00:29:58.720 --> 00:29:59.040] Yep.
[00:29:59.040 --> 00:29:59.600] I did, yep.
[00:29:59.600 --> 00:30:00.600] Age 11.
[00:30:00.600 --> 00:30:03.400] Did any of you, have any of you had shingles yet?
[00:30:03.400 --> 00:30:03.800] I did.
[00:30:03.800 --> 00:30:04.600] No, I'm supposed to.
[00:30:04.760 --> 00:30:05.240] I didn't get it.
[00:29:59.920 --> 00:30:06.120] I got the vaccine.
[00:30:06.360 --> 00:30:08.120] Yeah, I just got mine soon.
[00:30:08.120 --> 00:30:08.600] Yeah.
[00:30:08.600 --> 00:30:11.080] I'm not eligible yet for my vaccine.
[00:30:11.080 --> 00:30:13.160] So I'm in the scary spot.
[00:30:13.160 --> 00:30:15.320] My kid sister had shingles recently.
[00:30:15.320 --> 00:30:19.400] Multiple patients that I work with in the cancer center have had shingles.
[00:30:19.400 --> 00:30:21.560] It is violently painful.
[00:30:21.560 --> 00:30:22.040] Like it is.
[00:30:22.280 --> 00:30:23.080] Oh, yeah.
[00:30:23.080 --> 00:30:23.720] Gosh.
[00:30:23.720 --> 00:30:24.200] I don't care.
[00:30:24.440 --> 00:30:32.600] And especially if you're already struggling with diseases that affect your immune system, if you're already struggling with, you know, these other issues.
[00:30:32.600 --> 00:30:37.320] Like, if you are a cancer patient and you have shingles, it is not a picnic.
[00:30:37.560 --> 00:30:38.600] And I see it.
[00:30:38.600 --> 00:30:39.800] I see it weekly.
[00:30:39.800 --> 00:30:42.040] And it's just devastating.
[00:30:42.040 --> 00:30:49.480] And yes, we are all from a cohort where we didn't have access to the vaccine, or for some of them, they're younger, right?
[00:30:49.480 --> 00:30:51.080] But we all have the virus in us.
[00:30:51.080 --> 00:30:54.920] And so the vaccine is there to help with that flare-up.
[00:30:54.920 --> 00:30:58.520] But kids today will never know that pain.
[00:30:58.520 --> 00:31:01.000] And to deny them that, you know, that vaccine.
[00:31:01.000 --> 00:31:05.960] And of course, I have a vaccine-preventable illness that caused cancer.
[00:31:05.960 --> 00:31:08.280] I had to have a drastic surgery.
[00:31:08.280 --> 00:31:16.120] I had to have a hysterectomy to remove my cervix because my cancer was caused by a vaccine-preventable virus.
[00:31:16.120 --> 00:31:23.000] That vaccine didn't exist when I was exposed, but you know, it happens.
[00:31:23.000 --> 00:31:26.120] And like, I wish I was lucky enough to be vaccinated.
[00:31:26.120 --> 00:31:30.680] I'm vaccinated against that disease now, but I wish I was lucky enough to be vaccinated.
[00:31:30.680 --> 00:31:44.800] And it just, it really, like, I think it's more infuriating, and it feels that much more of a dig when you see people saying, I'm not going to vaccinate my kids, or vaccines aren't important, or the vaccine is worse than the disease.
[00:31:44.680 --> 00:31:47.280] When it's like, you don't have the disease, dude.
[00:31:47.280 --> 00:31:48.160] Let me tell you.
[00:31:48.160 --> 00:31:48.400] Right.
[00:31:48.480 --> 00:31:48.960] Yeah.
[00:31:44.840 --> 00:31:51.280] I'd much rather not have this disease.
[00:31:51.600 --> 00:31:53.040] And I have both with this.
[00:31:53.040 --> 00:31:55.920] I am vaccinated and I have the disease, right?
[00:31:55.920 --> 00:31:57.920] I can tell you the disease was worse.
[00:31:57.920 --> 00:32:00.160] It's, yeah, it's infuriating.
[00:32:00.160 --> 00:32:04.480] I think this is part of a bigger trend of distrust in authority.
[00:32:04.800 --> 00:32:10.160] And at some critical point of distrust, people are like, well, screw it.
[00:32:10.160 --> 00:32:11.680] I'm just going to think whatever I want to think.
[00:32:11.680 --> 00:32:12.320] You know what I mean?
[00:32:12.320 --> 00:32:12.640] Exactly.
[00:32:12.640 --> 00:32:30.080] I'm just going to go along with my tribe or, again, just let confirmation bias completely overwhelm what I believe because I don't have to accommodate or listen to expert opinion because the experts are not trustworthy, you know, or institutions are not trustworthy.
[00:32:30.080 --> 00:32:40.720] And that, I think, is the biggest negative effect cumulatively of a lot of the divisiveness that we have where institutions become collateral damage.
[00:32:40.720 --> 00:32:43.600] I think in a lot of political fights that we have.
[00:32:43.600 --> 00:32:46.800] And I think also social media has done this as well.
[00:32:46.800 --> 00:32:48.880] Oh, yeah, everyone thinks they're an expert.
[00:32:50.000 --> 00:32:57.840] And it's a perfect storm because it's occurring against a background where the thing is working, so we don't notice it.
[00:32:58.480 --> 00:33:00.960] You know, like, well, why do I need to vaccinate my kids?
[00:33:00.960 --> 00:33:02.160] Nobody's sick out there.
[00:33:02.160 --> 00:33:04.160] Well, yeah, because of the vaccines.
[00:33:04.160 --> 00:33:04.640] Right.
[00:33:04.960 --> 00:33:10.720] And that's like that negative consequence that comes, like the hidden value in these things.
[00:33:10.720 --> 00:33:13.520] You don't see polio in the United States anymore.
[00:33:13.520 --> 00:33:15.200] You don't see them. They're non-events.
[00:33:15.200 --> 00:33:15.360] Yeah.
[00:33:15.600 --> 00:33:16.480] It's like Y2K.
[00:33:16.480 --> 00:33:18.000] Oh, we made a big deal over nothing.
[00:33:18.000 --> 00:33:20.240] No, it was nothing because we made a big deal.
[00:33:20.240 --> 00:33:21.200] Exactly.
[00:33:21.200 --> 00:33:21.760] Yeah.
[00:33:21.760 --> 00:33:29.120] Ask these same people to take down all their protection services for their computers and see how long they last online without getting infected, right?
[00:33:29.440 --> 00:33:36.280] I'll bet you they wouldn't be so quick to poo-poo virus protection and other things.
[00:33:36.600 --> 00:33:38.200] And that's your laptop.
[00:33:38.200 --> 00:33:40.440] That's not your body.
[00:33:40.440 --> 00:33:40.600] Right.
[00:33:41.720 --> 00:33:43.000] Your livelihood.
[00:33:43.640 --> 00:33:47.720] I mean, these diseases kill children.
[00:33:47.720 --> 00:33:51.240] And the children they don't kill, they often maim.
[00:33:51.560 --> 00:33:54.360] They often cause permanent disability.
[00:33:55.320 --> 00:33:55.880] Yeah.
[00:33:57.320 --> 00:33:58.840] This is not, you know.
[00:33:59.480 --> 00:34:02.920] I'm emotionally invested in this topic, as we all should be, damn it.
[00:34:03.080 --> 00:34:09.160] Are there any suggestions for what can be done, or do we have to let it play out and hope the culture changes?
[00:34:09.480 --> 00:34:10.280] Well, I mean, that's the thing.
[00:34:10.280 --> 00:34:13.400] Like, the things that need to be done are being done.
[00:34:13.400 --> 00:34:16.120] Like, again, we have a program in the U.S.
[00:34:16.280 --> 00:34:21.240] where children who would not otherwise be able to afford to be vaccinated can get vaccinated for free.
[00:34:21.240 --> 00:34:22.520] As long as their parents.
[00:34:23.160 --> 00:34:23.880] Exactly.
[00:34:23.880 --> 00:34:26.840] The CDC understands the need for this.
[00:34:26.840 --> 00:34:32.520] Many schools have vaccine requirements for your kids to be able to mingle with other kids in those schools.
[00:34:32.840 --> 00:34:38.200] So there are, you know, systems in place to protect us from a public health perspective.
[00:34:38.200 --> 00:34:45.640] But, you know, at a certain point, if you want to get out of this requirement, you're going to figure out a way to do it.
[00:34:45.640 --> 00:34:46.920] You're going to homeschool your kids.
[00:34:46.920 --> 00:34:49.640] You're going to, you know, whatever the case may be.
[00:34:49.800 --> 00:34:50.840] Can't force it.
[00:34:50.840 --> 00:34:51.960] So what do we do?
[00:34:52.200 --> 00:34:57.640] We improve education, and we hope that, you know, we hope that the tides start to change.
[00:34:58.280 --> 00:35:01.320] And we have to mandate vaccines, is the other thing.
[00:35:01.480 --> 00:35:06.040] It's perfectly reasonable to require certain vaccines to attend public school.
[00:35:06.040 --> 00:35:13.760] I think companies have the right to require vaccinations if it could impact the safety of their fellow workers or their customers.
[00:35:16.480 --> 00:35:17.680] I can't go to work without my vaccination.
[00:35:14.200 --> 00:35:18.000] We do now.
[00:35:19.120 --> 00:35:23.760] It's now a plank of one of our major parties to remove that requirement, right?
[00:35:23.760 --> 00:35:29.920] That is, remove all federal funding for any public school that mandates any vaccine.
[00:35:29.920 --> 00:35:34.240] There's an actual reasonable chance that that could become law in this country.
[00:35:34.560 --> 00:35:37.280] And it's in perfect lockstep with this Gallup poll.
[00:35:37.280 --> 00:35:37.440] Yeah.
[00:35:37.920 --> 00:35:39.200] That's the rhetoric.
[00:35:40.560 --> 00:35:41.040] Disaster.
[00:35:41.040 --> 00:35:42.000] That would be a disaster.
[00:35:42.160 --> 00:35:42.800] Disastrous.
[00:35:43.120 --> 00:35:44.000] Thanks, Cara.
[00:35:44.000 --> 00:35:44.640] All right.
[00:35:44.640 --> 00:35:47.280] Bob, tell us about alien solar panels.
[00:35:47.280 --> 00:35:49.920] Yeah, that sounds like a much happier note, Bob.
[00:35:49.920 --> 00:35:50.240] Yeah.
[00:35:50.240 --> 00:35:50.800] Yeah.
[00:35:51.280 --> 00:35:52.240] Get me out of the dumps.
[00:35:52.400 --> 00:35:54.640] I didn't know aliens had solar panels.
[00:35:54.640 --> 00:35:56.000] Well, that's the question.
[00:35:56.880 --> 00:35:59.440] The question is: if they did, could we detect them?
[00:35:59.440 --> 00:36:00.240] That's the question.
[00:36:00.240 --> 00:36:12.080] So, in a recent interesting study, the researchers created a model of an Earth-like planet, and then they covered it with modern solar panels that would meet all of our current and near-future energy needs.
[00:36:12.640 --> 00:36:25.280] They then imagined that it was a distant alien exoplanet to see how detectable it was with near-future telescopes and how much planet-wide panel coverage aliens might need.
[00:36:25.280 --> 00:36:29.040] And that seems much more of a complicated sentence than I intended.
[00:36:29.040 --> 00:36:30.560] But what did they find?
[00:36:30.560 --> 00:36:32.160] And wouldn't you like to know?
[00:36:32.160 --> 00:36:39.680] The study, published in the Astrophysical Journal, is called Detectability of Solar Panels as a Technosignature.
[00:36:40.000 --> 00:36:44.720] A lot of the search for extraterrestrial life focuses on biosignatures, right?
[00:36:44.720 --> 00:36:57.040] We've said that on the show many times, such as the molecules of life in exo-atmospheres or specific frequencies of light reflected or emitted from surfaces that point to some sort of life forms.
[00:36:57.040 --> 00:36:59.680] My favorite, though, is not biosignatures.
[00:36:59.880 --> 00:37:03.560] As my grandfather used to say, biosignatures, pet!
[00:37:03.560 --> 00:37:09.320] I love technosignatures, the manifestations of alien technology that we can observe.
[00:37:09.560 --> 00:37:11.320] That's what a technosignature is.
[00:37:11.320 --> 00:37:15.320] I mean, who doesn't like the idea of actual alien technology?
[00:37:15.560 --> 00:37:18.200] To me, that's just like, oh boy, more of that.
[00:37:18.200 --> 00:37:25.800] So, this study focuses on potential technosignatures from the energy sources that power these alien civilizations, if they exist.
[00:37:25.800 --> 00:37:29.800] In this case, they were focusing on silicon-based solar panels.
[00:37:29.800 --> 00:37:37.480] Now, they focus on silicon instead of other photovoltaics using things like germanium or gallium, which are much lower in abundance universe-wide.
[00:37:37.480 --> 00:37:43.880] So, that's why it seems kind of all right, if you're going to go with a material that these solar panels are made of, silicon sounds like a good idea.
[00:37:43.880 --> 00:37:52.920] All right, so the first reason they chose silicon is, like I said, that it has a high cosmic abundance.
[00:37:52.920 --> 00:37:58.840] It's all over the place, so there's a higher probability that aliens would probably be using that.
[00:37:58.840 --> 00:38:00.280] Reasonable, I guess.
[00:38:00.280 --> 00:38:05.000] Also, number two, the electronic structure of silicon, its so-called band gap.
[00:38:05.000 --> 00:38:05.960] You may have heard of that.
[00:38:06.360 --> 00:38:11.000] That works especially well with the radiation emitted by sun-like stars.
[00:38:11.000 --> 00:38:15.640] And finally, number three, silicon also happens to be very cost-effective.
[00:38:15.640 --> 00:38:19.480] When you refine it, process it, manufacture it into solar cells.
[00:38:19.480 --> 00:38:20.440] It's relatively easy.
[00:38:20.440 --> 00:38:22.840] So, those three reasons seem reasonable.
[00:38:22.840 --> 00:38:26.120] Okay, maybe aliens would use silicon if they're going to use that at all.
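The band-gap point above can be made with one line of arithmetic: a photon is usable by silicon only if its energy exceeds the gap, i.e. its wavelength is below hc/E_gap. The 1.12 eV figure is the standard textbook value for crystalline silicon, not a number from this study.

```python
# Cutoff wavelength for absorption in silicon: lambda = hc / E_gap.
h_c_eV_nm = 1239.84      # Planck constant times speed of light, in eV*nm
band_gap_si_eV = 1.12    # textbook band gap of crystalline silicon

cutoff_nm = h_c_eV_nm / band_gap_si_eV
print(f"Silicon absorbs light shortward of ~{cutoff_nm:.0f} nm")  # ~1107 nm
# Sunlight peaks near 500 nm, so most of a Sun-like star's output
# falls below this cutoff and can drive a silicon cell.
```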
[00:38:26.120 --> 00:38:31.640] So, then the researchers also relied on this idea of what's called artificial spectral edges.
[00:38:31.640 --> 00:38:33.960] I've never heard of that, but it makes sense.
[00:38:33.960 --> 00:38:36.520] I want you to see where they're coming from.
[00:38:36.520 --> 00:38:43.880] This means that there's a dramatic and detectable change in the reflectance distinguishing silicon from non-silicon, right?
[00:38:43.880 --> 00:38:46.880] And this is specifically in UV wavelengths.
[00:38:47.120 --> 00:38:59.440] So imagine you're scanning the surface of an alien planet, and you know, a big chunk of the planet's got all this dirt, and next to the dirt is this sandy desert area, and then next to that are these gargantuan solar panels.
[00:38:59.440 --> 00:39:10.400] So, as it's being scanned or observed from a distant telescope, there would be a rapid increase in reflected UV light that would be easier to detect, right?
[00:39:10.400 --> 00:39:18.320] So, as you're transitioning from one area, like a desert, to this area with lots of solar panels, you're going to see a spike in the UV wavelengths.
[00:39:18.320 --> 00:39:21.200] Like, oh, that might be a solar panel.
[00:39:21.200 --> 00:39:22.480] So, there's that.
[00:39:22.480 --> 00:39:26.640] And it's similar to satellites detecting vegetation on Earth.
[00:39:26.880 --> 00:39:30.880] In this situation, there's something called the vegetation red edge.
[00:39:30.880 --> 00:39:34.160] So, that's because chlorophyll absorbs visible red light, while vegetation strongly reflects near-infrared.
[00:39:34.160 --> 00:39:37.600] So, the reflectance jumps sharply at the red edge when you look at vegetation.
[00:39:37.600 --> 00:39:42.480] So, as satellites scan the Earth, you're going to see a big jump in reflectance wherever there's vegetation.
[00:39:42.480 --> 00:39:47.600] That's one big indication that, oh, yeah, there's vegetation there, and that's called the vegetation red edge.
[00:39:47.600 --> 00:39:51.360] So, it's kind of very similar to this artificial spectral edge.
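The edge-detection idea here is simple enough to sketch. Below is a toy Python example (all reflectance values are invented for illustration, not taken from the paper) that flags a sharp bin-to-bin jump in a reflectance spectrum, which is the basic signal an artificial-spectral-edge search keys on.

```python
import numpy as np

# Toy wavelength grid: 10 nm bins from UV (200 nm) to near-IR (1000 nm).
wavelengths_nm = np.linspace(200, 1000, 81)

def toy_reflectance(material: str) -> np.ndarray:
    """Made-up reflectance curves; real ones would come from lab spectra."""
    if material == "silicon_panel":
        # Hypothetical step: bright in the UV, dimmer past ~400 nm.
        return np.where(wavelengths_nm < 400, 0.60, 0.15)
    if material == "bare_soil":
        return np.full_like(wavelengths_nm, 0.25)
    raise ValueError(f"unknown material: {material}")

def find_spectral_edges(reflectance: np.ndarray, threshold: float = 0.2) -> np.ndarray:
    """Indices where reflectance jumps sharply between adjacent bins."""
    return np.where(np.abs(np.diff(reflectance)) > threshold)[0]

for i in find_spectral_edges(toy_reflectance("silicon_panel")):
    print(f"edge between {wavelengths_nm[i]:.0f} and {wavelengths_nm[i + 1]:.0f} nm")
```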
[00:39:51.360 --> 00:39:58.480] The researchers then made a model of an Earth-like exoplanet, and to this exoplanet, they added these different land categories.
[00:39:58.480 --> 00:40:04.320] They added ocean, snow, and ice, they added grass, forest, and bare soil.
[00:40:04.640 --> 00:40:07.840] And then they added a sixth category, solar panels.
[00:40:07.840 --> 00:40:12.480] So, that makes six land categories in all.
[00:40:12.480 --> 00:40:14.960] Of course, there's a huge bias there, right?
[00:40:14.960 --> 00:40:20.160] Those are land categories that we have on Earth, but who knows what the exoplanets would have.
[00:40:20.160 --> 00:40:26.000] Although, I assume that it's not unreasonable to pick those, especially for a planet that has some life.
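For what it's worth, the core of a model like this is just an area-weighted sum of surface reflectances. A minimal sketch with made-up coverage fractions and single-band UV reflectance values (the paper's actual inputs differ):

```python
# Hypothetical coverage fractions (must sum to 1) and single-band UV
# reflectance values; all numbers invented for illustration.
surface_fractions = {
    "ocean": 0.60, "snow_ice": 0.10, "grass": 0.10,
    "forest": 0.10, "bare_soil": 0.05, "solar_panels": 0.05,
}
uv_reflectance = {
    "ocean": 0.07, "snow_ice": 0.80, "grass": 0.05,
    "forest": 0.04, "bare_soil": 0.25, "solar_panels": 0.60,
}

# Disk-averaged UV reflectance: each category weighted by its coverage.
with_panels = sum(surface_fractions[k] * uv_reflectance[k] for k in surface_fractions)

# The same planet with the panel area reassigned to bare soil, for contrast.
without = with_panels + surface_fractions["solar_panels"] * (
    uv_reflectance["bare_soil"] - uv_reflectance["solar_panels"])

print(f"with panels: {with_panels:.3f}, without: {without:.3f}, "
      f"panel signal: {with_panels - without:.3f}")
```

With these toy numbers the panel signal is a change of less than 0.02 in disk-averaged reflectance, which gives a feel for why the detection is so hard.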
[00:40:26.000 --> 00:40:34.200] Then, they moved their model planet 30 light-years away, and they determined what they would see if they looked at it with a really cool telescope.
[00:40:34.440 --> 00:40:42.520] So the really cool telescope that they chose was NASA's planned Habitable Worlds Observatory.
[00:40:42.520 --> 00:40:45.160] So this is planned to be launched in 2040.
[00:40:45.160 --> 00:40:51.480] It's going to hopefully have a 6.8-meter mirror, and it will look for life on Earth-like exoplanets.
[00:40:51.480 --> 00:40:53.880] So that's a big one that they're planning.
[00:40:53.880 --> 00:40:55.800] I hope the plans come to fruition.
[00:40:55.800 --> 00:40:59.320] Even though I might be dead by then, it'd still be nice.
[00:40:59.640 --> 00:41:01.080] So that was the scenario.
[00:41:01.080 --> 00:41:11.080] You know, they have this model planet, this exoplanet that's similar to Earth with all these land areas, and one of the land areas consists of lots of solar panels.
[00:41:11.080 --> 00:41:15.000] And they wanted to see, can we see this thing from 30 light years away?
[00:41:15.000 --> 00:41:30.760] So they did all of their work, and they concluded that detecting such panels, given the hypothetical scenario they created, would still be very hard, even at 30 light-years, which to me sounds damn close, even though it is extremely far away.
[00:41:30.760 --> 00:41:39.960] But still, it's relatively close, and it was still very, very hard to see these solar panels, even when there was a huge number of them.
[00:41:39.960 --> 00:41:50.200] They concluded that solar panels would often be indistinguishable from non-panel features on the exoplanet, like the other land categories.
[00:41:50.520 --> 00:41:52.840] It would still be hard to distinguish them.
[00:41:53.160 --> 00:41:59.720] They said that detecting the large panels would likely require several hundred hours of observation.
[00:41:59.720 --> 00:42:06.120] So that's a lot of observation, several hundred hours to be focused on a planet looking for solar panels.
[00:42:06.120 --> 00:42:07.320] That's a lot of time.
[00:42:07.320 --> 00:42:17.840] And then also, if the planet happened to have solar panels covering a quarter of its surface, 25%, of course, that would make it easier to detect them.
[00:42:17.840 --> 00:42:23.040] But the scientists said that even that would be difficult to do.
[00:42:23.040 --> 00:42:32.720] So, bottom line, it would still be extremely hard to detect these solar panels, even though they might be bright in the UV and there might be a lot of them.
[00:42:32.720 --> 00:42:36.160] It's still hard, even with a really cool advanced telescope.
[00:42:36.160 --> 00:42:37.040] It would be hard.
[00:42:37.040 --> 00:42:39.680] But they also found something else that was interesting.
[00:42:39.680 --> 00:42:50.640] The lead author of the paper, Ravi Kopparapu of NASA's Goddard Space Flight Center, said, We found that... spice.
[00:42:50.640 --> 00:42:51.280] Yeah, I thought you were going to say that.
[00:42:51.440 --> 00:42:52.560] Do you like spice?
[00:42:53.120 --> 00:42:54.000] I love spice.
[00:42:54.000 --> 00:42:55.120] No, outer spice.
[00:42:55.360 --> 00:43:11.200] All right, we found that even if our current population of about 8 billion stabilizes at 30 billion, and I think that number, 30 billion, is the number from some study that said, yeah, if we're going to max out people on Earth, it's going to be about 30 billion.
[00:43:11.200 --> 00:43:13.840] Let me start this quote again because I'm throwing in my own words.
[00:43:13.840 --> 00:43:14.320] All right.
[00:43:14.320 --> 00:43:30.160] Ravi said, We found that even if our current population of about 8 billion stabilizes at 30 billion with a high standard of living and we only use solar energy for power, we still use way less energy than that provided by all the sunlight illuminating our planet.
[00:43:30.160 --> 00:43:30.480] Okay?
[00:43:30.880 --> 00:43:47.920] So that means that many advanced civilizations with tens of billions, literally tens of billions of inhabitants, might not need anything ridiculous like Dyson swarms around their parent star collecting a significant fraction of light from their sun.
[00:43:48.480 --> 00:44:00.000] The researchers showed that covering 8.9% of a planet's land surface in solar panels would be enough for 30 billion people to live at a very high standard of living.
[00:44:00.520 --> 00:44:03.480] And we still wouldn't be able to detect those panels.
[00:44:03.480 --> 00:44:15.880] So you could literally have 30 billion aliens 30 light-years away with 9% of their land mass covered with highly reflective solar panels, and we still wouldn't be able to detect them.
[00:44:15.880 --> 00:44:20.520] Now, remember, though, those panels could be on the water, they could be in orbit.
[00:44:20.520 --> 00:44:21.880] It doesn't have to be on the land.
[00:44:21.880 --> 00:44:24.440] I mean, 8.9% is still a lot.
[00:44:24.440 --> 00:44:29.480] I mean, that's essentially like covering China and India with solar panels.
[00:44:29.480 --> 00:44:32.040] I mean, that's a lot of solar panels.
[00:44:32.280 --> 00:44:38.120] But what if that could supply your entire planet with power for centuries?
[00:44:38.120 --> 00:44:39.480] That'd be interesting.
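The arithmetic behind that claim is easy to sanity-check. Here's a back-of-envelope sketch using rough assumed numbers (10 kW per person, 20% panel efficiency, about 200 W/m^2 average surface insolation), not the paper's exact inputs:

```python
import math

SOLAR_CONSTANT = 1361.0      # W/m^2 at Earth's orbit
EARTH_RADIUS = 6.371e6       # m
LAND_FRACTION = 0.29         # fraction of Earth's surface that is land

# Total sunlight intercepted by the planet's cross-section: ~1.7e17 W.
sunlight = SOLAR_CONSTANT * math.pi * EARTH_RADIUS**2

# Assumed demand: 30 billion people at an assumed 10 kW each (~3e14 W).
demand = 30e9 * 10e3

# Assumed supply: 8.9% of the land area, ~200 W/m^2 average insolation,
# 20% conversion efficiency.
land_area = 4 * math.pi * EARTH_RADIUS**2 * LAND_FRACTION
supply = 0.089 * land_area * 200.0 * 0.20

print(f"demand:  {demand:.1e} W")
print(f"supply:  {supply:.1e} W")
print(f"sunlight hitting planet: {sunlight:.1e} W")
```

Under these assumptions the panel supply comes out comfortably above the demand, and both are hundreds of times smaller than the total sunlight hitting the planet.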
[00:44:39.720 --> 00:44:43.720] Kopparapu also ties this study to the Fermi paradox.
[00:44:43.720 --> 00:44:48.120] And the Fermi paradox, if you don't remember, asks the question: where are the aliens?
[00:44:48.120 --> 00:44:52.840] Why haven't they spread all over the galaxy by now or the universe by now?
[00:44:52.840 --> 00:44:56.600] They've had literally hundreds of millions or billions of years to do so.
[00:44:56.600 --> 00:44:57.640] Where are they?
[00:44:57.640 --> 00:45:12.920] So, regarding that, Kopparapu said: The implication is that civilizations may not feel compelled to expand all over the galaxy because they may achieve sustainable population and energy usage levels, even if they choose a very high standard of living.
[00:45:12.920 --> 00:45:21.320] They may expand within their stellar system or even within nearby star systems, but a galaxy-spanning civilization may not exist.
[00:45:21.320 --> 00:45:23.400] So that was the study.
[00:45:23.720 --> 00:45:30.120] You know, it was a fun read, and it was interesting as far as it goes.
[00:45:30.120 --> 00:45:32.120] But I mean, it's obvious.
[00:45:32.120 --> 00:45:34.760] The human bias, of course, is written all over this, right?
[00:45:34.760 --> 00:45:50.880] I mean, as we know, with only one data point of life, it's hard to extrapolate to other alien civilizations: what they would be like, what they would need, how much energy they would need to live comfortable lives.
[00:45:51.120 --> 00:45:52.000] Would they spread?
[00:45:52.000 --> 00:45:52.720] We have no idea.
[00:45:52.720 --> 00:45:53.680] We have no idea.
[00:45:53.680 --> 00:46:01.600] So I think this study is maybe more predictive about not what aliens would do, but what humanity would do.
[00:46:01.920 --> 00:46:10.960] Maybe for the next two or three centuries, you know, with solar panels and fusion energy, we may not be very detectable at large distances either.
[00:46:12.160 --> 00:46:20.080] And we could have plenty of energy available to us without any gaudy Dyson swarm around our star.
[00:46:20.080 --> 00:46:32.320] And so maybe that's one of the takeaways: using some solar panels in orbit, some on the Earth, and also fusion, of course, and whatever other energy sources we may use.
[00:46:32.320 --> 00:46:37.600] I mean, we could have plenty of energy, even if we go up to 20 billion or even 30 billion.
[00:46:37.920 --> 00:46:39.360] So that's an interesting angle.
[00:46:39.600 --> 00:46:46.240] But of course, the more alien your aliens are, then I think all bets are kind of off to a certain degree.
[00:46:46.240 --> 00:46:51.520] And the more godlike their technology is, all bets are off because you have no idea what they're doing.
[00:46:52.160 --> 00:46:56.720] I mean, imagine a Kardashev level 3 civilization.
[00:46:56.720 --> 00:47:01.600] We would really have no idea what they would do, how much energy would they need?
[00:47:01.600 --> 00:47:02.640] What would they do with it?
[00:47:03.120 --> 00:47:05.040] Would we be able to even detect it?
[00:47:05.040 --> 00:47:10.160] And it's kind of like ants trying to figure out, what are those two people doing over there?
[00:47:11.760 --> 00:47:12.720] It's like no idea.
[00:47:12.720 --> 00:47:21.280] It's kind of silly to try to think we could even come close to understanding or figuring out what that would be like.
[00:47:21.280 --> 00:47:23.280] But still, an interesting study.
[00:47:23.840 --> 00:47:26.480] So it's indiscernible, in other words.
[00:47:26.480 --> 00:47:26.960] Yes.
[00:47:27.280 --> 00:47:28.400] It'd be mistaken for something natural.
[00:47:28.400 --> 00:47:31.000] You could never suss out the fact that all these things are there.
[00:47:31.080 --> 00:47:34.040] Well, they would have to be close and massive, is what I'm hearing.
[00:47:34.040 --> 00:47:34.520] Yeah.
[00:47:35.000 --> 00:47:42.600] So maybe this does explain it, you know, even though they have done some decent searches for technosignatures like Dyson swarms.
[00:47:42.600 --> 00:47:49.400] And we've had lots of news items about them, but nothing definitive, or even close to being definitive at all.
[00:47:50.520 --> 00:47:55.160] Yeah, but don't they know that we want to know them and they should broadcast to us so we can find them?
[00:47:55.400 --> 00:47:56.040] Yeah.
[00:47:56.040 --> 00:47:57.400] Aliens tend to be that way.
[00:47:57.400 --> 00:48:01.400] Well, if we're basically alone in our galaxy, that is not unrealistic.
[00:48:01.400 --> 00:48:02.760] No, absolutely.
[00:48:03.080 --> 00:48:04.120] But we have no idea.
[00:48:04.120 --> 00:48:04.920] We have no idea.
[00:48:04.920 --> 00:48:07.960] But yeah, it absolutely could be just our galaxy.
[00:48:08.280 --> 00:48:18.040] Imagine the nearest life being, you know, a galaxy over, or far outside our, you know, our local group.
[00:48:18.040 --> 00:48:19.000] I mean, imagine that.
[00:48:19.000 --> 00:48:25.560] But I suspect, Steven, I'm sure you do too as well, that there's life all over the place, but more microbial life.
[00:48:25.560 --> 00:48:35.240] This is really about advanced, sapient life that can manipulate technology in ways we would be impressed with.
[00:48:35.560 --> 00:48:39.880] Bob, what do you think would happen if they found a real Dyson sphere?
[00:48:39.880 --> 00:48:43.240] What do you think would happen socially around the world?
[00:48:44.200 --> 00:48:45.800] I think a lot of people wouldn't believe it.
[00:48:46.200 --> 00:48:52.840] Ultimately, I don't think it would have much of an impact because there'd always be some doubt that it was natural.
[00:48:53.240 --> 00:49:02.440] And even if we eventually, like the scientific consensus landed on, okay, this is definitely a techno signature, that would take a lot of time.
[00:49:02.440 --> 00:49:05.800] This wouldn't be like an instant affirmative conclusion.
[00:49:05.800 --> 00:49:07.720] I'd be like, there's a potential one.
[00:49:07.720 --> 00:49:11.400] And we would say, yeah, like the last thousand potential ones, they all crapped out.
[00:49:11.400 --> 00:49:14.520] And say, wait a minute, this one, you know, it would survive more and more tests.
[00:49:14.520 --> 00:49:18.320] It would probably be a process that plays itself out over years.
[00:49:19.200 --> 00:49:23.840] Yeah, we would creep up on the consensus.
[00:49:23.840 --> 00:49:25.920] It would be, it wouldn't be a sudden shock.
[00:49:25.920 --> 00:49:26.720] You know what I mean?
[00:49:26.720 --> 00:49:28.560] But I don't think it would happen.
[00:49:28.880 --> 00:49:35.520] Like, you're right scientifically, but I think that socially, like, religions would be formed around this.
[00:49:35.520 --> 00:49:37.440] Like, weird cults would start.
[00:49:37.440 --> 00:49:38.400] Eventually, maybe.
[00:49:38.400 --> 00:49:41.680] Yeah, there'd be all sorts of like weird pseudoscience on that.
[00:49:41.920 --> 00:49:42.800] Which wouldn't surprise me.
[00:49:42.800 --> 00:50:04.080] But I think, once we become increasingly convinced that it was a real technosignature, I could imagine a lot of money spent on projects to really observe it, with a tool that can home in on that thing and try to confirm it.
[00:50:04.080 --> 00:50:06.080] Yeah, that would happen.
[00:50:06.400 --> 00:50:08.640] But I think it would take a while to get there.
[00:50:08.640 --> 00:50:11.040] So when I ask, what is Odoo?
[00:50:11.040 --> 00:50:12.560] What comes to mind?
[00:50:12.560 --> 00:50:15.200] Well, Odoo is a bit of everything.
[00:50:15.200 --> 00:50:22.320] Odoo is a suite of business management software that some people say is like fertilizer because of the way it promotes growth.
[00:50:22.320 --> 00:50:30.800] But you know, some people also say Odoo is like a magic beanstalk because it grows with your company and is also magically affordable.
[00:50:30.800 --> 00:50:37.520] But then again, you could look at Odoo in terms of how its individual software programs are a lot like building blocks.
[00:50:37.520 --> 00:50:40.160] I mean, whatever your business needs.
[00:50:40.160 --> 00:50:43.040] Manufacturing, accounting, HR programs.
[00:50:43.040 --> 00:50:47.360] You can build a custom software suite that's perfect for your company.
[00:50:47.360 --> 00:50:49.200] So, what is Odoo?
[00:50:49.200 --> 00:50:52.240] Well, I guess Odoo is a bit of everything.
[00:50:52.240 --> 00:50:57.760] Odoo is fertilizer, a magic beanstalk, building blocks for business.
[00:50:57.760 --> 00:50:59.360] Yeah, that's it.
[00:50:59.360 --> 00:51:03.160] Which means that Odoo is exactly what every business needs.
[00:50:59.840 --> 00:51:05.960] Learn more and sign up now at Odoo.com.
[00:51:06.120 --> 00:51:08.920] That's ODOO.com.
[00:51:09.560 --> 00:51:13.080] Celebrating is about embracing the moments that matter.
[00:51:13.080 --> 00:51:15.320] It's blowing out candles with family.
[00:51:15.320 --> 00:51:19.480] It's sharing laughter at a barbecue or saying I do.
[00:51:19.480 --> 00:51:24.920] And before the drinks begin, there's one step worth taking: Z-Biotics Pre-Alcohol.
[00:51:24.920 --> 00:51:28.920] Drink it before you celebrate, so you can wake up ready for what's next.
[00:51:28.920 --> 00:51:32.600] Turn every great night into the start of an extraordinary tomorrow.
[00:51:32.600 --> 00:51:36.360] Z-Biotics Pre-Alcohol, the science behind your next great morning.
[00:51:36.360 --> 00:51:38.920] Learn more at zbiotics.com.
[00:51:39.240 --> 00:51:44.760] Well, everyone, we're going to take a quick break from our show to talk about one of our sponsors this week, Rocket Money.
[00:51:44.760 --> 00:51:50.360] Rocket Money is a personal finance app that finds and cancels your unwanted subscriptions.
[00:51:50.360 --> 00:51:54.920] It monitors your spending and helps lower your bills so that you can grow your savings.
[00:51:54.920 --> 00:51:58.920] So, guys, how much do you think you're paying in subscriptions every month?
[00:51:58.920 --> 00:52:00.520] That's a serious question.
[00:52:00.520 --> 00:52:01.480] You just don't know.
[00:52:01.480 --> 00:52:05.720] So, over 74% of people have subscriptions they've forgotten about.
[00:52:05.720 --> 00:52:06.840] This includes me.
[00:52:06.840 --> 00:52:08.920] I've had this happen to me many times.
[00:52:08.920 --> 00:52:17.560] And most Americans think they spend about $62 per month on subscriptions, but get this: the real number is closer to $300, which is scary.
[00:52:17.560 --> 00:52:22.360] This is literally thousands of dollars a year, half of which you've forgotten about, like me.
[00:52:22.360 --> 00:52:34.520] Yeah, and not only will Rocket Money help you with that by finding these things, canceling them with just a few taps, they'll also even try to negotiate lower bills for you by up to 20%.
[00:52:34.520 --> 00:52:36.840] All you have to do is submit a picture of your bill.
[00:52:36.840 --> 00:52:38.680] Rocket Money takes care of the rest.
[00:52:38.680 --> 00:52:49.600] And Rocket Money has over 5 million users and has saved a total of $500 million in canceled subscriptions, saving members up to $740 a year when using all of the app's features.
[00:52:49.600 --> 00:52:51.760] Stop wasting money on things you don't use.
[00:52:51.760 --> 00:52:58.160] Cancel your unwanted subscriptions by going to rocketmoney.com/SGU.
[00:52:58.160 --> 00:53:01.600] That's rocketmoney.com/SGU.
[00:53:01.600 --> 00:53:04.960] RocketMoney.com/SGU.
[00:53:05.280 --> 00:53:07.600] All right, guys, let's get back to the show.
[00:53:07.600 --> 00:53:12.480] All right, Jay, tell us about these astronauts that are stranded on the International Space Station.
[00:53:12.480 --> 00:53:18.240] Yeah, we have two NASA astronauts, Sunita Williams and Barry "Butch" Wilmore.
[00:53:18.240 --> 00:53:20.720] How come they didn't give her a space name?
[00:53:20.720 --> 00:53:21.840] You know, I don't know.
[00:53:21.840 --> 00:53:22.960] Like a nickname?
[00:53:22.960 --> 00:53:23.360] Yeah.
[00:53:23.760 --> 00:53:27.600] You know, they all have to have a cool moniker.
[00:53:27.600 --> 00:53:28.800] What was her name again?
[00:53:29.360 --> 00:53:30.560] Sunita Williams.
[00:53:30.560 --> 00:53:31.840] Sunita Williams.
[00:53:31.840 --> 00:53:33.120] Super Sunita Williams.
[00:53:33.120 --> 00:53:33.760] I like it.
[00:53:33.920 --> 00:53:42.160] So they're currently stuck on the International Space Station, and this is due to ongoing technical issues with the Boeing Starliner spacecraft.
[00:53:42.160 --> 00:53:44.560] Not a great position to be in.
[00:53:44.560 --> 00:53:52.480] So Williams and Wilmore, they were delivered to the space station in June 2024 aboard Boeing's Starliner.
[00:53:52.480 --> 00:53:59.120] Now, this was a spacecraft that was developed as part of NASA's commercial crew program, which we've talked about.
[00:53:59.120 --> 00:54:06.720] The mission was supposed to be pretty straightforward, with the astronauts' stay on the ISS scheduled for about a week.
[00:54:06.720 --> 00:54:17.680] And even before the launch, the Starliner encountered some technical difficulties, and then new issues with the thrusters emerged after docking at the ISS.
[00:54:17.680 --> 00:54:21.280] What happened was they just discovered that there were problems with the thrusters.
[00:54:21.280 --> 00:54:28.000] They docked, and NASA is now testing the Starliner spacecraft while it's docked at the ISS.
[00:54:28.000 --> 00:54:29.720] I didn't know that this was possible.
[00:54:29.280 --> 00:54:33.320] The testing is focused on the spacecraft's thruster system.
[00:54:33.480 --> 00:54:37.880] Like I said, NASA engineers, you know, they're carefully running these tests.
[00:54:38.200 --> 00:54:49.960] They're trying to evaluate the performance and the reliability of the thrusters before they make any decisions regarding how the astronauts are going to return to Earth, which is a vital piece of information that they're trying to get to.
[00:54:49.960 --> 00:54:59.720] Now, these tests include monitoring the spacecraft systems and simulating various scenarios that would ensure that the Starliner can safely undock and re-enter Earth's atmosphere.
[00:54:59.720 --> 00:55:07.880] So the results of the tests will determine whether the Starliner will be used or if they have to, you know, resort to an alternate spacecraft.
[00:55:07.880 --> 00:55:12.760] And some of the options could be a SpaceX Crew Dragon or a Russian Soyuz.
[00:55:12.760 --> 00:55:19.800] So right now, they don't know exactly what the problem is still, as far as what I could find out as of today.
[00:55:19.800 --> 00:55:23.320] They're feeling like they need to continue testing.
[00:55:23.320 --> 00:55:31.000] And I was personally very surprised that they could test this as thoroughly as they are testing it while it's docked at the space station.
[00:55:31.000 --> 00:55:32.200] That's awesome.
[00:55:32.200 --> 00:55:39.240] And, you know, the genius of the engineers and the foresight of the engineers is what is hopefully going to pay off here.
[00:55:39.240 --> 00:55:42.360] So the astronauts themselves, they're both in good spirits.
[00:55:42.360 --> 00:55:47.240] They have a lot of confidence that the testing and the safety of the Starliner is going to come out on top.
[00:55:47.240 --> 00:55:56.920] Now, in case of an emergency, NASA and Boeing publicly stated that the spacecraft is equipped to undock from the ISS and return the crew to Earth safely.
[00:55:56.920 --> 00:55:58.520] That's what they're saying today.
[00:55:58.840 --> 00:56:05.480] But under normal circumstances, the spacecraft will not be cleared for return until the thruster issue is fully resolved.
[00:56:05.480 --> 00:56:09.320] So, in the meantime, the other spacecraft docked at the ISS.
[00:56:09.320 --> 00:56:16.400] I think they're now this article I read said s
Prompt 2: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 3: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
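That HH:MM:SS constraint is mechanical enough to enforce in code. A minimal sketch, assuming the segments JSON has already been parsed; the function names here are illustrative, not part of the actual pipeline:

```python
import re

TIMESTAMP = re.compile(r"^(\d{2}):([0-5]\d):([0-5]\d)$")

def to_seconds(ts: str) -> int:
    """Parse an HH:MM:SS timestamp, rejecting malformed strings."""
    m = TIMESTAMP.match(ts)
    if not m:
        raise ValueError(f"bad timestamp: {ts!r}")
    h, mi, s = map(int, m.groups())
    return h * 3600 + mi * 60 + s

def validate_segments(segments: list[dict], audio_seconds: int) -> None:
    """Reject any segment whose start timestamp exceeds the audio length."""
    for seg in segments:
        if to_seconds(seg["timestamp"]) > audio_seconds:
            raise ValueError(f"timestamp beyond audio: {seg['timestamp']}")

validate_segments([{"timestamp": "01:15:30"}], audio_seconds=2 * 3600)
```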
Prompt 4: Media Mentions
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
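Likewise, the category whitelist and the use-the-earlier-timestamp rule above are straightforward to validate. A hedged sketch with hypothetical helper names:

```python
import json

def earlier_timestamp(vtt_range: str) -> str:
    """'01:13:42.520 --> 01:13:46.720' -> '01:13:42' (HH:MM:SS)."""
    start = vtt_range.split("-->")[0].strip()
    return start.split(".")[0]

def parse_media_mentions(raw: str) -> list[dict]:
    """Parse the model's JSON and enforce the category whitelist."""
    data = json.loads(raw)                    # raises on invalid JSON
    mentions = data.get("media_mentions", [])
    allowed = {"Book", "Movie", "TV Show", "Music"}
    for m in mentions:
        if m["category"] not in allowed:
            raise ValueError(f"bad category: {m['category']}")
    return mentions

print(earlier_timestamp("01:13:42.520 --> 01:13:46.720"))   # 01:13:42
print(parse_media_mentions('{"media_mentions": []}'))        # []
```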
Prompt 5: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 2 of 3 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
going to return to Earth, which is a vital piece of information that they're trying to get to.
[00:54:49.960 --> 00:54:59.720] Now, these tests include monitoring the spacecraft systems and simulating various scenarios that would ensure that the Starliner can safely undock and re-enter Earth's atmosphere.
[00:54:59.720 --> 00:55:07.880] So the results of the tests will determine whether the Starliner will be used or if they have to, you know, resort to an alternate spacecraft.
[00:55:07.880 --> 00:55:12.760] And some of the options could be a SpaceX Crew Dragon or a Russian Soyuz.
[00:55:12.760 --> 00:55:19.800] So right now, they don't know exactly what the problem is still, as far as what I could find out as of today.
[00:55:19.800 --> 00:55:23.320] They're feeling like they need to continue testing.
[00:55:23.320 --> 00:55:31.000] And I was personally very surprised that they could test this as thoroughly as they are testing it while it's docked at the space station.
[00:55:31.000 --> 00:55:32.200] That's awesome.
[00:55:32.200 --> 00:55:39.240] And, you know, the genius of the engineers and the foresight of the engineers is what is hopefully going to pay off here.
[00:55:39.240 --> 00:55:42.360] So the astronauts themselves, they're both in good spirits.
[00:55:42.360 --> 00:55:47.240] They have a lot of confidence that the testing and the safety of the Starliner is going to come out on top.
[00:55:47.240 --> 00:55:56.920] Now, in case of an emergency, NASA and Boeing publicly stated that the spacecraft is equipped to undock from the ISS and return the crew to Earth safely.
[00:55:56.920 --> 00:55:58.520] That's what they're saying today.
[00:55:58.840 --> 00:56:05.480] But under normal circumstances, the spacecraft will not be cleared for return until the thruster issue is fully resolved.
[00:56:05.480 --> 00:56:09.320] So, in the meantime, the other spacecraft docked at the ISS.
[00:56:09.320 --> 00:56:16.400] So, this article I read said something that a second article didn't, and I don't know exactly what's going on.
[00:56:16.720 --> 00:56:22.800] Now, is there currently a SpaceX, Crew Dragon, and Russian Soyuz at the space station right now?
[00:56:22.800 --> 00:56:25.200] There were inconsistencies with these articles.
[00:56:25.200 --> 00:56:29.040] So, anyway, they're trying to assess all these things at the same time.
[00:56:29.040 --> 00:56:32.240] The engineers are taking a very careful approach.
[00:56:32.240 --> 00:56:38.880] They want to have a possible return window of mid-August, which is, you know, now coming up very soon.
[00:56:38.880 --> 00:56:43.760] But this could extend and they could stay there longer depending on how the tests go.
[00:56:43.760 --> 00:56:50.960] Now, if the technical problems are not resolved in time and there's no alternative spacecraft available, this is the worst case scenario.
[00:56:50.960 --> 00:56:55.280] There is a possibility that the astronauts could remain on the ISS longer than planned.
[00:56:55.280 --> 00:56:59.600] And some reports have mentioned that the stay could extend into 2025.
[00:56:59.600 --> 00:57:01.040] It's a pretty extreme position.
[00:57:01.840 --> 00:57:04.160] But it is a speculative outcome.
[00:57:04.160 --> 00:57:07.120] They're talking about it, of course, just in case it happens.
[00:57:07.120 --> 00:57:09.680] Bottom line is we have to wait a little bit longer.
[00:57:09.680 --> 00:57:14.160] Now, the Starliner is approved to remain docked at the ISS for up to 45 days.
[00:57:14.160 --> 00:57:17.760] This is largely because of the batteries aboard.
[00:57:17.760 --> 00:57:20.880] They could extend that to 90 days using backup systems.
[00:57:20.880 --> 00:57:24.880] You know, the health of those lithium-ion batteries is always a question here.
[00:57:24.880 --> 00:57:31.040] These batteries have previously caused problems and concerns, and that adds another layer of complexity here.
[00:57:31.040 --> 00:57:34.240] So, an interesting thing I found too, this was pretty cool.
[00:57:34.240 --> 00:57:36.880] So, a Russian satellite broke apart.
[00:57:36.880 --> 00:57:50.560] It created debris that briefly threatened the space station, and during that incident, the astronauts, both of them, had to go into their spacecraft for protection, which I found to be very curious and interesting.
[00:57:50.560 --> 00:57:55.840] I guess there's only a certain number of places that they consider to be safe, and the spacecraft are very safe.
[00:57:55.840 --> 00:57:57.840] So, we will see what happens here.
[00:57:57.840 --> 00:57:59.200] I mean, I'm not worried at all.
[00:57:59.200 --> 00:58:05.480] I feel like NASA's got this, and the future here is going to probably reveal itself within the next couple of weeks.
[00:58:05.480 --> 00:58:26.280] Jay, what I'm finding is that the ISS currently has six spaceships docked to it: the Boeing Starliner spacecraft, the SpaceX Dragon Endeavour spacecraft, a Northrop Grumman resupply ship, the Soyuz MS-25 crew ship, and the Progress 87 and 88 resupply ships.
[00:58:26.280 --> 00:58:26.680] Wow.
[00:58:26.680 --> 00:58:28.600] All docked to the ISS right now.
[00:58:28.600 --> 00:58:29.080] Wow.
[00:58:29.480 --> 00:58:31.080] What's in docking Bay 94?
[00:58:31.800 --> 00:58:33.560] So they have options.
[00:58:34.040 --> 00:58:37.320] They have a Dragon crew capsule and a Soyuz.
[00:58:37.640 --> 00:58:41.640] And then the rest are resupply ships, I guess are not crew approved.
[00:58:41.640 --> 00:58:45.800] So, guys, what do you think of when I tell you the word framing?
[00:58:46.120 --> 00:58:46.600] Framing?
[00:58:46.600 --> 00:58:48.200] In the context of science communication.
[00:58:48.680 --> 00:58:50.680] What do you think of the term framing?
[00:58:50.680 --> 00:58:51.160] Oh, yeah.
[00:58:51.160 --> 00:59:03.400] Like, it's a strategy for ensuring that the listener is going to approach it in the way that you want them to.
[00:59:03.640 --> 00:59:06.840] It's a persuasion strategy, I guess you could say.
[00:59:06.840 --> 00:59:08.200] So, do you guys remember?
[00:59:08.200 --> 00:59:25.640] This might have been a little bit before you were actively paying attention to the skeptical movement, Kara, but maybe 15, 20 years ago, there was a very brief debate within skeptical circles about whether framing was a legitimate scientific practice or more of a deceptive one.
[00:59:25.640 --> 00:59:35.640] Like, there was a couple of people talking about we have to frame our messages appropriately, and other people, other skeptics, like, that's deception, that's politics, that's propaganda.
[00:59:35.960 --> 00:59:36.280] What?
[00:59:36.280 --> 00:59:38.040] All science communication is persuasion.
[00:59:38.040 --> 00:59:41.000] Well, I mean, the other thing is, yes, I agree with you.
[00:59:41.000 --> 00:59:44.440] But also, you are always framing your message.
[00:59:44.680 --> 00:59:45.040] True.
[00:59:45.040 --> 00:59:47.680] Yeah, whether you think you are or not.
[00:59:44.760 --> 00:59:51.600] So it's like saying, oh, we're doing science, not philosophy.
[00:59:51.760 --> 00:59:53.760] It's like, no, you're doing philosophy.
[00:59:53.760 --> 00:59:55.360] You just don't know it.
[00:59:55.360 --> 00:59:55.600] Right.
[00:59:55.600 --> 00:59:59.040] When somebody's like, no, but this is pure information.
[00:59:59.040 --> 00:59:59.440] Yeah.
[00:59:59.920 --> 01:00:00.640] Or whatever.
[01:00:00.960 --> 01:00:09.760] You're just not aware of the, or thoughtful, therefore, about the philosophy that you're using, or in this case, the framing that you're using.
[01:00:10.160 --> 01:00:12.000] The bias you're bringing to the table.
[01:00:12.000 --> 01:00:12.720] Exactly.
[01:00:12.720 --> 01:00:17.680] So you're using an unconscious or assumed framing without even knowing you're doing it.
[01:00:17.680 --> 01:00:21.440] That doesn't make it in any way better or purer or whatever.
[01:00:22.080 --> 01:00:26.000] So it was both naive and I think misplaced, but it was very short-lived.
[01:00:26.000 --> 01:00:33.360] I think it was like very quickly the internal conversation was like, no, this is like a standard part of science communication.
[01:00:34.400 --> 01:00:44.080] So what's prompting me to talk about this is a recent study looking at the framing of global warming, talking to the public about global warming.
[01:00:44.080 --> 01:01:03.840] And they wanted to see what people's reactions were to the terms climate change versus global warming versus climate crisis versus climate emergency versus climate justice.
[01:01:04.160 --> 01:01:04.720] Interesting.
[01:01:05.600 --> 01:01:09.920] And they essentially asked two questions or two types of questions.
[01:01:09.920 --> 01:01:13.120] One was just, are you familiar with this term?
[01:01:13.120 --> 01:01:16.640] Like have you ever heard this term before, do you know what it means?
[01:01:16.640 --> 01:01:19.760] And the results are pretty predictable.
[01:01:19.760 --> 01:01:27.120] So climate change and global warming were both very high, about 88%, roughly the same.
[01:01:27.120 --> 01:01:30.520] Climate crisis, climate emergency, were less so.
[01:01:29.840 --> 01:01:34.200] Crisis was 76%, climate emergency, 58%.
[01:01:34.760 --> 01:01:38.120] And climate justice was the least at 33%.
[01:01:38.600 --> 01:01:44.520] So people had not heard of that term and weren't really sure what it meant, but very familiar with climate change, global warming.
[01:01:44.520 --> 01:01:52.120] Then they asked people to say whether or not this is something that you are concerned about.
[01:01:52.120 --> 01:01:54.040] Like, are you concerned about climate change?
[01:01:54.040 --> 01:01:56.040] Are you concerned about global warming?
[01:01:56.040 --> 01:01:58.280] And the numbers were very similar.
[01:01:58.840 --> 01:01:59.640] Interesting.
[01:01:59.640 --> 01:02:00.280] Yeah.
[01:02:00.840 --> 01:02:03.240] To the familiarity one.
[01:02:03.240 --> 01:02:26.120] So they concluded that when you're framing the discussion, whether around the concept of global warming or climate change or whatever, people's familiarity with the terms you're using has a dramatic effect on their receptiveness and their attitudes toward that topic.
[01:02:26.120 --> 01:02:35.080] They also, of course, broke it down by political persuasion with the absolutely predictable results, right?
[01:02:35.640 --> 01:02:42.440] Of Democrats being more concerned, Independents less so, Republicans less so.
[01:02:42.440 --> 01:02:45.080] But did they say who responds better to what framing?
[01:02:45.560 --> 01:02:51.480] It was just by, I think familiarity was the determinative factor.
[01:02:51.480 --> 01:03:01.080] And otherwise, it was like the average numbers were similar to the familiarity numbers, but then they would break out based upon political ideology.
[01:03:01.400 --> 01:03:11.720] That's interesting because I guess my understanding was that there was a big cultural shift from global warming to climate change to make it more palatable.
[01:03:11.960 --> 01:03:14.680] They're both there, but they're both the same in this study.
[01:03:15.200 --> 01:03:15.840] That's interesting.
[01:03:15.920 --> 01:03:25.440] So that either shows that the strategy didn't work, or that it did work at the time but it's become so commonplace.
[01:03:25.840 --> 01:03:26.560] Yeah, it's been normal.
[01:03:27.280 --> 01:03:34.560] And honestly, that to me makes sense because I find that climate change and global warming are used pretty interchangeably.
[01:03:34.560 --> 01:03:35.440] Yeah, they are.
[01:03:35.440 --> 01:03:37.600] But at the time, it was really intentional.
[01:03:37.600 --> 01:03:41.120] Like if we say climate change, people will not be reactive.
[01:03:41.120 --> 01:03:50.960] Well, I think the idea also is that climate change is more accurate because global warming makes it sound like it's warming everywhere always as opposed to the climate's changing.
[01:03:50.960 --> 01:03:58.080] It's warming on average, but some places are getting warmer, other places not so much, and it's inconsistent year to year, whatever.
[01:03:58.080 --> 01:03:59.840] So climate change is just more accurate.
[01:03:59.840 --> 01:04:07.360] So it was just to avoid the straw man of, well, it's not getting warmer here now, so how can you call it global warming?
[01:04:07.360 --> 01:04:09.120] Like so much for global warming.
[01:04:09.120 --> 01:04:15.840] I don't think that that actually worked because the people who are making that argument are being disingenuous anyway.
[01:04:16.400 --> 01:04:27.680] Yeah, that's what's interesting about it is that, you know, my take would be that calling it climate change, while it might be more, quote, accurate, it's actually more innocuous as well.
[01:04:28.000 --> 01:04:31.360] And so it's much easier to be like, well, who cares if the climate changes?
[01:04:31.360 --> 01:04:31.760] Right.
[01:04:31.760 --> 01:04:33.040] Is it changing for the worst?
[01:04:33.040 --> 01:04:34.880] Is it dangerous that it's changing?
[01:04:34.880 --> 01:04:41.040] Whereas things like climate emergency, they communicate a value judgment.
[01:04:41.040 --> 01:04:41.520] Crisis.
[01:04:41.520 --> 01:04:43.200] I like climate crisis.
[01:04:44.240 --> 01:04:53.680] I think you can make a reasonable argument for not using like climate emergency or climate crisis because of you know all the reasons that the authors here state.
[01:04:53.680 --> 01:05:03.080] But I, in addition, because it has that value judgment, like you say climate crisis, it's much easier to straw man that as a climate alarmist position.
[01:05:03.400 --> 01:05:05.160] Like, is it really a crisis?
[01:05:05.160 --> 01:05:22.280] You know, and so you're asking people to accept not only that there's global warming, but that it's a crisis at the same time rather than easing them in, maybe saying there is you know climate change, it involves on average warming, and it could be a problem, you know, going forward.
[01:05:22.760 --> 01:05:36.600] I'm actually asking, I'm texting a friend right now, she works for a firm, and she's the comms person at the firm, and their job is to partner with agencies, often government agencies.
[01:05:36.600 --> 01:05:48.200] Her specialty is ocean science, but they actually draft policy and they do work to try to make sure that like budgets are protected and that initiatives are going forward.
[01:05:48.200 --> 01:05:49.800] And so they have to be bipartisan.
[01:05:50.120 --> 01:05:53.400] And so I'm actually asking her, do you avoid that kind of language?
[01:05:53.400 --> 01:05:54.680] Like what is your strategy?
[01:05:54.680 --> 01:05:58.440] Because I'm curious how it actually works in Washington.
[01:05:59.400 --> 01:06:08.120] Now, here's another thought that I had is when you use a term like climate crisis or climate emergency, those are emotional words, right?
[01:06:08.360 --> 01:06:12.040] Those are more of value-laden words.
[01:06:12.280 --> 01:06:20.680] So people might think that those terms are biased, and therefore they would assume that the messenger is biased, right?
[01:06:20.680 --> 01:06:29.880] The other thing is that when people hear emotional language, I think there is a defensive sort of knee-jerk reaction because you feel like I'm being manipulated.
[01:06:29.880 --> 01:06:34.120] And that's actually a perfectly reasonable assumption most of the time.
[01:06:34.440 --> 01:06:35.080] Right?
[01:06:35.080 --> 01:06:42.280] So this is, and we get, we have discussed this sort of in general, you know, previously in terms of our messaging.
[01:06:42.280 --> 01:06:48.800] It's like we want to accurately reflect how serious the topics are that we're talking about.
[01:06:49.120 --> 01:06:57.440] Like if we're talking about vaccine denial or anti-vaccine rhetoric, right, Kara, like you were saying, we want to say this is...
[01:06:57.760 --> 01:06:58.800] very serious.
[01:06:58.800 --> 01:07:02.480] This could result in kids dying of otherwise preventable diseases.
[01:07:02.480 --> 01:07:07.680] But if we push that framing too far, you get a backlash.
[01:07:07.920 --> 01:07:12.560] Well, now you're just emotionally trying to manipulate me rather than giving me objective science.
[01:07:12.560 --> 01:07:21.920] So how do you keep the aura of objective science, but still convey the seriousness of the topic that you're talking about, right?
[01:07:21.920 --> 01:07:23.760] I mean, it's a huge problem during COVID.
[01:07:23.760 --> 01:07:24.320] A huge problem.
[01:07:24.560 --> 01:07:25.200] Absolutely.
[01:07:25.200 --> 01:07:28.880] And we also run into the same problem when we're talking about con artists.
[01:07:28.880 --> 01:07:35.680] And pseudoscience, we say like homeopathy, which we're going to talk about in a moment, is rank pseudoscience.
[01:07:35.680 --> 01:07:38.240] There's really no other way to accurately portray it.
[01:07:38.320 --> 01:07:40.560] It is complete and utter nonsense.
[01:07:40.560 --> 01:07:47.040] But when you talk that way, even though it's true, you don't sound like a scientist.
[01:07:47.040 --> 01:07:56.000] But if you talk like people think a scientist is supposed to sound like, you lend false credibility by damning with faint criticism in a way, right?
[01:07:56.000 --> 01:08:00.720] You're not criticizing homeopathy enough with your language.
[01:08:00.720 --> 01:08:05.200] And so people think that it's like something that reasonable scientists could disagree about.
[01:08:05.200 --> 01:08:07.760] It's like, well, the evidence doesn't support homeopathy.
[01:08:07.760 --> 01:08:11.680] But rather than saying this is really complete nonsense.
[01:08:11.680 --> 01:08:12.480] You know what I mean?
[01:08:13.440 --> 01:08:23.520] And there's the layer too, Steve, that to like, because we're talking about science communication and we're often talking about scientists communicating their research or people whose sole job it is to communicate science.
[01:08:23.520 --> 01:08:25.120] But there's a whole other wrinkle in this.
[01:08:25.120 --> 01:08:33.400] And I remember having debates about this for years, which is that science journalism is very different from science communication.
[01:08:29.840 --> 01:08:33.560] Yes.
[01:08:33.880 --> 01:08:38.200] Science communication, often by definition, is agenda-driven.
[01:08:38.440 --> 01:08:38.760] Persuasive.
[01:08:39.320 --> 01:08:40.440] Yes, it's persuasive.
[01:08:40.440 --> 01:08:45.640] Whereas science journalism, I mean, talk about exactly what you opened this conversation with.
[01:08:46.040 --> 01:08:54.600] Their attempt, knowing that there's always going to be some amount of bias, is to be aware of that and to sniff it out and to reduce it wherever possible.
[01:08:55.720 --> 01:08:57.720] That is a journalistic ethic.
[01:08:57.720 --> 01:09:05.800] It's to report what is without saying how it should be, without saying how I want it to be, without saying how I like it to be.
[01:09:06.120 --> 01:09:07.080] This just is.
[01:09:07.080 --> 01:09:10.280] Yeah, well, what we do would go in the opinion section of newspapers.
[01:09:10.440 --> 01:09:10.680] Exactly.
[01:09:12.920 --> 01:09:14.120] And a lot of SciCom is opinion.
[01:09:14.840 --> 01:09:16.280] It's analysis, right?
[01:09:16.280 --> 01:09:18.360] We're giving an analysis, an opinion.
[01:09:18.360 --> 01:09:19.240] And that's fine.
[01:09:19.240 --> 01:09:19.960] You know what I mean?
[01:09:19.960 --> 01:09:24.760] Because we're trying to help people make sense of the science, putting the data in context.
[01:09:24.840 --> 01:09:27.320] And the thing is, we're not dishonest about it.
[01:09:27.320 --> 01:09:31.800] No, that does not necessarily imply dishonesty or deception or whatever.
[01:09:31.800 --> 01:09:32.680] It is framing.
[01:09:33.480 --> 01:09:36.040] It is saying, what's the context of the communication?
[01:09:36.280 --> 01:09:37.240] Who am I talking to?
[01:09:37.240 --> 01:09:38.840] What am I trying to convey?
[01:09:38.840 --> 01:09:44.120] And then accurately portraying the science while putting it into some kind of meaningful perspective.
[01:09:44.120 --> 01:09:48.360] And that meaningful perspective is the framing, right?
[01:09:48.920 --> 01:09:51.720] Yeah, and we say we are pro-scientific skepticism.
[01:09:51.720 --> 01:09:52.840] That is the goal here.
[01:09:53.080 --> 01:09:59.480] We are trying to promote science and counter pseudoscience, not just talk about it neutrally like an observer from another planet, you know what I mean?
[01:09:59.480 --> 01:09:59.880] Yeah.
[01:09:59.880 --> 01:10:01.080] No, absolutely.
[01:10:01.080 --> 01:10:06.360] And so I've written about framing over the years on many topics, like with GMOs, right?
[01:10:06.360 --> 01:10:13.320] So there's like this pure scientific question about what is the technology, what is genetic engineering, what does the evidence say.
[01:10:13.320 --> 01:10:20.400] But we could also frame a discussion around GMOs from a perspective of how should the government regulate it?
[01:10:20.400 --> 01:10:25.920] Or what is the behavior of corporations who use GMOs?
[01:10:25.920 --> 01:10:32.160] Or what is the pseudoscientific claim surrounding GMOs, the conspiracy theories, et cetera?
[01:10:32.160 --> 01:10:33.040] You know what I mean?
[01:10:33.040 --> 01:10:39.280] So those framings are all part of the overall GMO phenomenon.
[01:10:39.280 --> 01:10:52.480] And you can't just say, here's the dry science in complete absence of any context about the conspiracy theories about it or the actually anti-GMO agenda-driven communication that's out there about it.
[01:10:52.800 --> 01:10:53.920] This is so funny, Steve.
[01:10:53.920 --> 01:10:55.440] I just got a text back from my friend.
[01:10:55.840 --> 01:10:59.840] I said, Do you guys avoid words like climate crisis when you write policy?
[01:10:59.840 --> 01:11:04.000] Like, do you tend to avoid emotional language because it's more biased, or do you use it because it's more motivating?
[01:11:04.000 --> 01:11:05.680] We're talking about this on SGU right now.
[01:11:06.080 --> 01:11:12.080] And she responded directly to, Do you guys avoid words like climate crisis when you write policy?
[01:11:12.240 --> 01:11:17.440] And she goes, Oh my god, I'm laughing because I literally just told one of our clients that they need to do this.
[01:11:18.240 --> 01:11:19.280] They need to avoid it.
[01:11:19.280 --> 01:11:19.760] Yeah.
[01:11:19.760 --> 01:11:23.520] They need to avoid using words like climate crisis when they're drafting policy.
[01:11:23.680 --> 01:11:24.160] Yeah, exactly.
[01:11:24.800 --> 01:11:26.640] But also, but that makes sense.
[01:11:26.960 --> 01:11:29.680] There is a darker side to framing, you know, too.
[01:11:29.680 --> 01:11:31.680] Well, first of all, there's a technical side to it.
[01:11:31.680 --> 01:11:36.240] So, for example, when we talk about framing, you could be talking about a little detail.
[01:11:36.240 --> 01:11:40.240] Like, for example, this is sort of the classic example from literature.
[01:11:40.240 --> 01:11:58.080] If a surgeon tells a patient that the surgical procedure they're recommending has a 98% survival rate, the person is much more likely to agree to the procedure than if you say it has a 2% fatality rate, even though that's exactly the same information.
[01:11:58.080 --> 01:11:58.560] Yeah.
[01:11:58.560 --> 01:11:58.800] Right?
[01:11:58.800 --> 01:12:02.600] So, that's the sort of technical framing about how you convey information.
[01:12:02.760 --> 01:12:06.600] And do you say it from a positive perspective or a negative perspective?
[01:12:06.600 --> 01:12:15.000] But then there's also political framing, which I think is what people reacted to, because that is inherently like spin and deception.
[01:12:15.000 --> 01:12:24.040] Like when you refer to the inheritance tax as a death tax, that's a political framing.
[01:12:24.280 --> 01:12:26.520] That is mentioned in all political.
[01:12:28.040 --> 01:12:37.320] If you talk about a hospital ethics committee, if you label it a death panel, that's a framing that's inherently, I think, deceptive.
[01:12:37.560 --> 01:12:41.800] So you can use frame, like a lot of things, you can use it for good or you can use it for evil.
[01:12:41.800 --> 01:12:45.080] But otherwise, it's just a tool of communication.
[01:12:45.080 --> 01:12:55.400] So, for example, you can use framing to rig any discussion or debate by baking in assumptions that favor your conclusion, right?
[01:12:55.400 --> 01:13:02.360] You could essentially frame a topic in such a way that your conclusion is baked into the discussion.
[01:13:02.360 --> 01:13:05.480] And that, of course, is an inherently deceptive use of framing.
[01:13:05.480 --> 01:13:08.360] You have to sort of challenge the framing.
[01:13:08.360 --> 01:13:11.480] You can't just have a conversation within that framing.
[01:13:11.480 --> 01:13:11.960] You know what I mean?
[01:13:12.040 --> 01:13:14.520] Yeah, I mean, it's also like a logical fallacy.
[01:13:14.680 --> 01:13:17.080] So it's like you're using framing to the extent.
[01:13:17.080 --> 01:13:17.320] Yeah.
[01:13:17.560 --> 01:13:19.320] To make a circular argument almost.
[01:13:19.320 --> 01:13:19.800] Yeah.
[01:13:19.800 --> 01:13:20.120] Yeah.
[01:13:20.360 --> 01:13:21.000] Sounds innocent.
[01:13:21.080 --> 01:13:22.280] Sounds inescapable.
[01:13:22.280 --> 01:13:24.360] Well, yeah, we always frame everything.
[01:13:24.360 --> 01:13:24.840] That's the thing.
[01:13:24.840 --> 01:13:28.040] It's like you're always doing philosophy whether you know it or not.
[01:13:28.040 --> 01:13:28.520] You know what I mean?
[01:13:28.760 --> 01:13:35.240] There are some inherent assumptions about knowledge and facts and whatever and the universe in any discussion that you're having.
[01:13:35.640 --> 01:13:39.640] But it's better to be consciously aware of that and the implications of it.
[01:13:40.120 --> 01:13:47.760] Same thing, you're constantly framing your discussion in some context, whether you explicitly know it and express it or not.
[01:13:48.320 --> 01:14:00.080] But when you consciously frame a subject, you should be doing it to optimize understanding and communication while being fair and accurate and scientific.
[01:14:00.080 --> 01:14:07.760] But of course, people will use framing to be deceptive and persuasive, to sell you stuff and to get your vote and whatever, to make you afraid, you know.
[01:14:07.760 --> 01:14:08.400] Absolutely.
[01:14:09.280 --> 01:14:22.000] And the truth is, it's deeply disingenuous if you actively work as a science communicator or identify as a science communicator, if your job is to do journalism, if your job is to do communications.
[01:14:22.000 --> 01:14:25.600] It's deeply disingenuous to not be aware of this.
[01:14:26.320 --> 01:14:28.480] Like, you have to be explicit about it.
[01:14:28.480 --> 01:14:36.400] I have to say, real quick, before you wrap it up, she added a little bit to the text thread, which I think is just so fascinating.
[01:14:36.400 --> 01:14:51.200] She said, we don't avoid the word climate, but we lean into language like strengthen coastal resilience, mitigate climate impacts, et cetera, that talk about specific issues that climate has for communities, ecosystems, and industries.
[01:14:51.200 --> 01:14:55.440] It's boring, but it's just less alienating and gets less of a reaction.
[01:14:56.160 --> 01:15:03.520] Truth is, most people are on board at this point that climate change is real, but they just get their hackles up when we talk about it as a crisis.
[01:15:03.520 --> 01:15:03.840] Right.
[01:15:04.080 --> 01:15:04.800] Fascinating.
[01:15:04.800 --> 01:15:06.720] And climate justice, forget about it.
[01:15:06.720 --> 01:15:12.720] Because so I think you would think justice, like, isn't justice for all, like, isn't this like an American concept?
[01:15:12.720 --> 01:15:14.640] But the word has so much baggage on it.
[01:15:14.640 --> 01:15:23.600] And I think what the specific baggage is, is that people have been convinced that justice is a zero-sum game.
[01:15:23.600 --> 01:15:29.760] And that's one person's justice is at the expense of injustice to somebody else.
[01:15:30.280 --> 01:15:32.280] And again, that's a framing, right?
[01:15:32.280 --> 01:15:40.840] So the whole concept of justice has been framed in this zero-sum way, which makes it have this psychological baggage.
[01:15:40.840 --> 01:15:44.840] Whereas that wasn't true historically, justice was like Superman, right?
[01:15:44.840 --> 01:15:47.480] You know, it was truth, justice, and the American way.
[01:15:47.880 --> 01:15:49.240] Yeah, something everyone could agree on.
[01:15:49.480 --> 01:15:50.440] Most people could agree on.
[01:15:50.520 --> 01:15:51.080] Right, right.
[01:15:51.480 --> 01:15:57.320] But now it's been made divisive because it's been framed as a zero-sum game.
[01:15:57.320 --> 01:16:01.400] Well, we saw this with the word skeptic as well, over our 20-plus years of doing this.
[01:16:01.560 --> 01:16:01.880] Right.
[01:16:01.880 --> 01:16:08.200] Well, and to me, climate justice, and maybe this is my own bias or my own framing, is not the same thing as climate science.
[01:16:08.200 --> 01:16:08.680] No.
[01:16:08.680 --> 01:16:13.400] It's also not the same thing as a climate crisis or climate change.
[01:16:13.400 --> 01:16:16.120] Climate justice is decision-making.
[01:16:16.200 --> 01:16:17.320] It's inherently political.
[01:16:17.320 --> 01:16:18.760] It's inherently political.
[01:16:18.760 --> 01:16:24.680] It's saying climate change affects different people disproportionately, and we want to mitigate those effects.
[01:16:24.920 --> 01:16:27.080] That's wildly different than just saying climate change exists.
[01:16:27.240 --> 01:16:28.360] It's happening, right?
[01:16:28.360 --> 01:16:29.240] Yeah, yeah.
[01:16:29.560 --> 01:16:30.280] All right.
[01:16:30.280 --> 01:16:34.840] Evan, you're going to give us that promised discussion of homeopathy.
[01:16:34.840 --> 01:16:36.280] Oh, yes, yes.
[01:16:36.280 --> 01:16:43.640] And I'm going to give it to you from, well, where I found it, because it came creeping across my feed recently.
[01:16:43.640 --> 01:16:45.960] Anytime homeopathy in the news does.
[01:16:45.960 --> 01:16:47.960] A magazine called Health.
[01:16:47.960 --> 01:16:53.240] I don't know if you guys remember Health magazine, seeing it on the rack in the supermarket or wherever, right?
[01:16:53.240 --> 01:16:55.000] Founded in 1981.
[01:16:55.000 --> 01:16:56.920] Back then it was called In Health.
[01:16:56.920 --> 01:17:01.240] And this magazine focuses primarily on women's health.
[01:17:01.560 --> 01:17:18.080] Common topics include improper diet, dealing with life issues such as weak relationships and growing stress, not to mention latest fashion and exclusive beauty tips, various healthy food recipes, and other related articles that encourage people to be happy and healthy.
[01:17:14.920 --> 01:17:18.400] Okay.
[01:17:19.280 --> 01:17:27.040] Yeah, magazine was around for a long time, still is, only it's all digital now, and they made that switch over in February of 2022.
[01:17:27.040 --> 01:17:32.240] They ended the print publication and it is digital, ones and zeros all the way.
[01:17:32.240 --> 01:17:36.000] So now we get to enjoy the website called health.com.
[01:17:36.000 --> 01:17:40.560] That's a pretty nice domain name to have if you can secure it, and they did.
[01:17:40.880 --> 01:17:48.000] But as you know, we can't judge a book by its cover, and you can't judge a magazine by its title or the pretty faces on it.
[01:17:48.000 --> 01:17:59.520] And even less so when your magazine is now 100% online, where a person might think a website called health.com might be a reliable source of good health information.
[01:17:59.520 --> 01:18:02.880] In fact, there's a news rating site called NewsGuard.
[01:18:02.880 --> 01:18:11.760] They rated the publication as reliable, noting that it generally avoids deceptive headlines and does not repeatedly publish false content.
[01:18:11.760 --> 01:18:13.120] Okay, all right.
[01:18:13.120 --> 01:18:15.920] Now, this sets up the background for this news item.
[01:18:16.160 --> 01:18:21.360] Headline reads: Everything you need to know about homeopathy and its safety.
[01:18:21.360 --> 01:18:23.280] Courtesy of health.com.
[01:18:23.280 --> 01:18:23.840] Okay.
[01:18:24.160 --> 01:18:33.040] I am sure, as we delve into this article, I mean, this article, homeopathy and its safety, it is going to warn the reader about homeopathy, right?
[01:18:33.040 --> 01:18:37.920] It's going to explain about the dubious nature of the claim, like cures like.
[01:18:37.920 --> 01:18:45.840] You know, where a molecule of something that makes you sick can be diluted out of existence, and yet magically its memory will remain to cure what's bad for you.
[01:18:45.840 --> 01:18:53.440] And the article, I'm sure, is going to speak to the implausibility of the suggested mechanisms that define all of homeopathy.
[01:18:53.440 --> 01:18:59.440] And I'm sure they're going to warn their readers that homeopathy can only delay legitimate treatments that actually work at helping sick people.
[01:18:59.440 --> 01:19:00.600] Okay, here we go.
[01:19:01.000 --> 01:19:02.520] This was written by Dr.
[01:19:02.840 --> 01:19:08.520] I'm sorry, registered nurse, R-N, Sarah Jividin, J-I-V-I-D-E-N.
[01:19:08.520 --> 01:19:10.440] Sorry if I mispronounced that.
[01:19:10.440 --> 01:19:16.920] And it was reviewed, medically reviewed by Arno Kroener with the letters D-A-O-M.
[01:19:17.240 --> 01:19:18.680] Steve, you're MD, right?
[01:19:18.680 --> 01:19:21.480] That's your, and Kara, your Ph.D.?
[01:19:22.200 --> 01:19:22.680] Okay.
[01:19:22.680 --> 01:19:26.360] Have you ever heard of D-A-O-M for letters?
[01:19:26.360 --> 01:19:28.200] I know what D-O is.
[01:19:29.320 --> 01:19:34.520] Well, a D-A-O-M is a doctorate of acupuncture and oriental medicine.
[01:19:34.520 --> 01:19:35.080] Oh, no.
[01:19:35.160 --> 01:19:35.880] Oh, boy.
[01:19:36.040 --> 01:19:40.920] Described as a degree program that combines traditional Chinese medicine and biomedical concepts.
[01:19:40.920 --> 01:19:42.760] Okay, so there's your medical review.
[01:19:42.760 --> 01:19:43.960] Here we go.
[01:19:43.960 --> 01:19:53.720] Homeopathy, a holistic medical practice that involves treating people, I'm reading the article now, with highly diluted substances to trigger the body's natural healing responses.
[01:19:53.720 --> 01:19:58.280] It was developed in the late 18th century by German physician Samuel Hahnemann.
[01:19:58.280 --> 01:19:59.320] We know all that.
[01:19:59.320 --> 01:20:09.480] Homeopathy is based on the principle of like cures like, meaning a substance causing symptoms in a healthy person can, in minute amounts, treat similar symptoms in a sick person.
[01:20:09.480 --> 01:20:19.960] People might consider homeopathy as an alternative or complementary therapy due to its individualized approach and the belief that it stimulates the body's own healing processes.
[01:20:19.960 --> 01:20:25.000] Its effectiveness is a subject of ongoing debate and research.
[01:20:25.320 --> 01:20:25.800] Right?
[01:20:26.200 --> 01:20:27.000] That's the frame.
[01:20:28.840 --> 01:20:29.800] Right, here we go.
[01:20:29.800 --> 01:20:33.560] We're about to talk about some things that we just talked about again.
[01:20:34.040 --> 01:20:35.160] Homeopathy.
[01:20:35.160 --> 01:20:38.280] Its approach is based on two unconventional theories.
[01:20:38.280 --> 01:20:40.360] At least the unconventional got in there, fine.
[01:20:40.520 --> 01:20:42.440] But unconventional is not even accurate.
[01:20:42.440 --> 01:20:44.720] Again, that's horrible framing.
[01:20:44.720 --> 01:20:47.120] They're pseudoscientific principles.
[01:20:47.120 --> 01:20:50.560] They are magical principles, not unconventional.
[01:20:44.280 --> 01:20:51.680] Right.
[01:20:51.840 --> 01:20:58.720] So, Steve, you got to take your red pen to this and cross out the words and write in the correct ones here.
[01:20:58.720 --> 01:21:02.240] The law of similars and the law of infinitesimals, right?
[01:21:02.720 --> 01:21:03.840] Those nice little theories.
[01:21:04.400 --> 01:21:06.000] Infinite, say that again.
[01:21:06.000 --> 01:21:06.800] Infinitesimals.
[01:21:06.880 --> 01:21:08.080] Infinitesimals.
[01:21:08.080 --> 01:21:10.160] Infinitesimals.
[01:21:10.480 --> 01:21:10.960] Yeah.
[01:21:10.960 --> 01:21:13.440] The law of infinitesimals.
[01:21:13.440 --> 01:21:13.920] Yes.
[01:21:13.920 --> 01:21:15.920] So, first, the law of similars, right?
[01:21:16.400 --> 01:21:18.160] Well, like cures like.
[01:21:19.040 --> 01:21:24.400] Substance that causes symptoms in a healthy person can be used to treat similar symptoms in a sick person.
[01:21:24.400 --> 01:21:25.200] Yep.
[01:21:25.200 --> 01:21:30.080] And then the second law: extreme dilutions of substances.
[01:21:30.080 --> 01:21:34.480] The belief that the more a substance is diluted, the more potent it becomes.
[01:21:34.480 --> 01:21:34.880] How?
[01:21:34.880 --> 01:21:35.440] I mean, why?
[01:21:35.440 --> 01:21:35.840] I don't know.
[01:21:36.160 --> 01:21:36.560] Yeah.
[01:21:36.560 --> 01:21:40.960] These dilutions often reach a point where no molecules of the original substance remain.
[01:21:40.960 --> 01:21:45.600] It is believed that the solution retains a quote memory of the substance.
[01:21:45.600 --> 01:21:48.000] Yes, that magical memory again.
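For a sense of scale on those dilutions, here is a minimal Python sketch, assuming a 30C potency (a common homeopathic dilution; the article doesn't name a specific one), of how many molecules of the original substance you would expect to survive:

AVOGADRO = 6.022e23  # molecules in one mole of substance

def molecules_remaining(moles: float, c_potency: int) -> float:
    # Each "C" step is a 1:100 dilution, so 30C dilutes by 100**30 = 1e60.
    return moles * AVOGADRO / (100.0 ** c_potency)

print(molecules_remaining(1.0, 30))  # ~6e-37, i.e. effectively zero molecules

Even starting from a full mole of active ingredient, the expected number of surviving molecules is around 10^-37, which is the arithmetic behind the hosts' point that only a magical "memory" could carry any effect.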
[01:21:48.320 --> 01:21:50.960] And homeopathic treatments are highly personalized.
[01:21:50.960 --> 01:21:51.680] Did you know that?
[01:21:51.680 --> 01:21:56.800] Practitioners consider a person's physical, emotional, and psychological state to choose the most appropriate remedy.
[01:21:56.800 --> 01:21:58.080] Doctors never do that.
[01:21:58.080 --> 01:22:00.880] Some people with the same condition receive different treatments.
[01:22:00.880 --> 01:22:08.560] Homeopathy views symptoms as expressions of the body's attempt to heal itself and treats the individual as a whole rather than focusing solely on the disease.
[01:22:08.560 --> 01:22:11.920] Steve, you only focus on the disease, okay?
[01:22:11.920 --> 01:22:14.400] You do not treat the individual as a whole, right?
[01:22:14.720 --> 01:22:15.200] How dare you?
[01:22:15.680 --> 01:22:17.040] How dare you do this?
[01:22:17.040 --> 01:22:24.800] And then they talk about the various things that it can be, you know, made from, in a sense, or extracted from, you know, but we won't go over that.
[01:22:24.800 --> 01:22:26.640] They kind of just give you a list of the things.
[01:22:26.640 --> 01:22:28.960] Now, here's what they do start talking about.
[01:22:29.440 --> 01:22:30.520] What is it good for?
[01:22:31.080 --> 01:22:38.600] Some studies and clinical trials suggest that homeopathy may be helpful or provide symptom relief for certain conditions, such as childhood diarrhea.
[01:22:38.600 --> 01:22:39.960] Adults, you're out of luck.
[01:22:40.200 --> 01:22:59.320] Ear infections, asthma, menopausal symptoms, sore muscles, colds and flu, allergies, but it might also help reduce symptoms of mental health conditions such as major depressive disorder, MDD, and premenstrual dysphoric disorder, PMDD.
[01:22:59.640 --> 01:23:00.680] Hmm.
[01:23:01.560 --> 01:23:02.040] Okay.
[01:23:02.600 --> 01:23:03.880] I'm surprised.
[01:23:03.880 --> 01:23:09.080] It's probably an even longer list than all these things in reality, is what they claim.
[01:23:09.320 --> 01:23:10.360] Okay, research.
[01:23:10.360 --> 01:23:12.680] And so we're, you know, most of the way through the article.
[01:23:12.680 --> 01:23:14.920] And we're, you know, now we get to research and evidence.
[01:23:14.920 --> 01:23:19.640] Critics argue the benefits of homeopathy are primarily due to the placebo effect.
[01:23:19.640 --> 01:23:21.240] I don't even know if that's right.
[01:23:21.240 --> 01:23:22.040] Don't the major.
[01:23:22.520 --> 01:23:23.560] Yeah, like there are no benefits.
[01:23:23.800 --> 01:23:29.160] Yeah, by saying the benefits, guys, because then the next line is any therapeutic effects.
[01:23:29.160 --> 01:23:31.000] There are no therapeutic effects.
[01:23:31.000 --> 01:23:31.640] Right.
[01:23:31.640 --> 01:23:34.280] Yeah, if there are any, quote, therapeutic effect.
[01:23:34.520 --> 01:23:35.080] Yeah, you're right.
[01:23:35.080 --> 01:23:36.040] There are no therapeutic effects.
[01:23:36.040 --> 01:23:36.680] It's all non-beneficial.
[01:23:37.160 --> 01:23:40.040] It's all symptomatic, and it's all placebo.
[01:23:40.440 --> 01:23:51.720] Well, and beyond that, the same things that come anytime there is any clinical trial, for example, seeing a person who seems to care about you, who says, I am here to treat you.
[01:23:52.040 --> 01:23:54.440] Those things can have positive psychological benefits.
[01:23:55.880 --> 01:23:56.920] Placebo effects, by definition.
[01:23:57.080 --> 01:23:58.200] Yeah, they're non-specific effects.
[01:23:58.440 --> 01:23:59.800] Right, it doesn't have to be homeopathy.
[01:23:59.800 --> 01:24:01.000] It could be bee venom therapy.
[01:24:01.000 --> 01:24:02.200] It could be something else.
[01:24:02.200 --> 01:24:02.920] Exactly.
[01:24:03.320 --> 01:24:09.480] And then they start, okay, then they start to get into the organizations that are basically saying this is not recommended.
[01:24:09.480 --> 01:24:16.320] In 2017, the National Health Service in England recommended that healthcare providers not prescribe homeopathy.
[01:24:15.000 --> 01:24:21.600] NICE, the National Institute for Health and Care Excellence, does not recommend it either.
[01:24:22.080 --> 01:24:27.760] In fact, they report that homeopathic remedies can be unsafe and some may contain substances that interfere with medications.
[01:24:27.760 --> 01:24:29.360] We've talked about that before.
[01:24:29.360 --> 01:24:34.880] They further advise not using homeopathy for life-threatening illnesses or emergencies in place of vaccines.
[01:24:34.880 --> 01:24:36.400] Absolutely correct.
[01:24:36.720 --> 01:24:38.240] Safety and regulation.
[01:24:38.240 --> 01:24:42.960] According to the National Institutes of Health, there may be safety considerations related to homeopathy.
[01:24:42.960 --> 01:24:43.760] Absolutely.
[01:24:43.760 --> 01:24:54.000] Some homeopathic products labeled as highly diluted might contain significant amounts of active ingredients, and that could lead to potential side effects or drug interactions.
[01:24:54.400 --> 01:24:55.280] And they continue.
[01:24:55.280 --> 01:25:06.080] What would have been better in this article, I think, is if you took these, because they finally do address some of the concerns here, that should have been at the top of the article, right?
[01:25:06.080 --> 01:25:12.400] And basically say, look, here's what this is, and then we can go on and then describe it in that context, right?
[01:25:12.400 --> 01:25:13.440] Framing.
[01:25:13.440 --> 01:25:19.680] So that would have made this a more, somewhat more palatable article to me.
[01:25:19.680 --> 01:25:23.200] But instead, you know, they get right in there because I don't know.
[01:25:23.200 --> 01:25:28.160] I imagine there is a first paragraph bias among people who read articles and things.
[01:25:28.160 --> 01:25:29.280] They're going to, right?
[01:25:29.600 --> 01:25:30.960] And a headline bias, certainly.
[01:25:30.960 --> 01:25:31.840] They'll read a headline.
[01:25:31.840 --> 01:25:35.200] They might read the first couple of paragraphs to really kind of absorb what's going on.
[01:25:35.200 --> 01:25:35.680] They're a good deal.
[01:25:35.920 --> 01:25:39.600] And then by the time you get to paragraph 28, you don't remember that.
[01:25:39.600 --> 01:25:43.280] You kind of remember what was at the top of the article.
[01:25:43.280 --> 01:25:50.400] So I would have much rather seen those lead this article instead of kind of being, well, buried, frankly, towards the bottom.
[01:25:50.400 --> 01:26:00.440] That doesn't make up for the horrible reporting, because the bottom-line conclusion was: homeopathy may be beneficial for minor health concerns or in conjunction with more conventional treatment.
[01:26:00.440 --> 01:26:02.520] That's their bottom-line recommendation.
[01:26:02.520 --> 01:26:03.000] Right.
[01:26:03.000 --> 01:26:05.160] Wait, when in conjunction with more?
[01:25:59.840 --> 01:26:06.680] That's like, this is part of a balanced breakfast.
[01:26:07.080 --> 01:26:07.720] Yeah, that's part of the thing.
[01:26:07.800 --> 01:26:10.040] When eating with nutrition.
[01:26:10.040 --> 01:26:11.960] When eaten with a nutritious breakfast.
[01:26:13.720 --> 01:26:14.840] Straight out of The Simpsons.
[01:26:14.840 --> 01:26:17.640] Yeah, but again, it's a perfect example of what I was talking about about framing.
[01:26:17.640 --> 01:26:20.600] They're framing it as it's a natural treatment.
[01:26:20.600 --> 01:26:24.200] I actually chose FDU because of the small classroom sizes.
[01:26:24.200 --> 01:26:26.040] That's something that was extremely important to me.
[01:26:26.040 --> 01:26:28.680] I wanted to have a connection and a relationship with my professors.
[01:26:28.680 --> 01:26:30.840] I didn't want to just be a number in a classroom.
[01:26:30.840 --> 01:26:34.680] I seized the moment at FDU and found my purpose.
[01:26:36.280 --> 01:26:45.880] Teaching is rewarding and exhausting, but with trustworthy, AI-powered tools from Cengage, you can manage the chaos and make a bigger impact on your students.
[01:26:45.880 --> 01:26:55.240] Our tools give you real-time insights into student performance, help you pinpoint challenges, and provide personalized, scalable support without adding to your workload.
[01:26:55.240 --> 01:26:58.680] It's innovation that makes a real difference for you and your students.
[01:26:58.680 --> 01:27:04.280] Visit Cengage.com to learn how to teach with less stress and more impact.
[01:27:05.560 --> 01:27:07.720] That helps the body heal itself.
[01:27:07.720 --> 01:27:09.960] It's used for all these things.
[01:27:09.960 --> 01:27:12.440] You know, people say it works, blah, blah, blah, blah.
[01:27:12.440 --> 01:27:19.480] And then deep buried, the critics and skeptics say that, you know, whatever, blah, blah, blah, evidence, logic, you know.
[01:27:19.480 --> 01:27:21.720] And then, but we recommend it at the end, right?
[01:27:21.720 --> 01:27:35.960] Then they come back to a recommendation for it rather than a skeptical framing, which I think is way more accurate and more meaningful, which is: this is dangerous pseudoscience from beginning to end.
[01:27:36.280 --> 01:27:38.200] There are no redeeming qualities.
[01:27:38.200 --> 01:27:39.720] It is not a matter of debate.
[01:27:39.720 --> 01:27:41.800] It does not require any further research.
[01:27:41.800 --> 01:27:43.560] You should not use it at all.
[01:27:43.560 --> 01:27:47.120] It is an expensive and wasteful distraction.
[01:27:44.680 --> 01:27:50.080] That's, I think, an accurate and fair framing.
[01:27:50.080 --> 01:27:56.000] And you could say that in a way that is not off-putting and doesn't sound like you're putting anybody down or anything.
[01:27:56.000 --> 01:27:57.200] It's just like, these are the facts.
[01:27:57.200 --> 01:27:59.360] These are the objective scientific facts.
[01:27:59.360 --> 01:28:01.440] But yet, this article is horrible.
[01:28:02.800 --> 01:28:03.920] It could be worse.
[01:28:05.680 --> 01:28:08.960] They did include the skepticism buried deep in the article.
[01:28:08.960 --> 01:28:12.000] So they get half a point for that.
[01:28:12.000 --> 01:28:14.480] But as you say, Evan, yeah, it's buried.
[01:28:14.480 --> 01:28:18.400] It's contradicted by other parts of the article.
[01:28:20.240 --> 01:28:22.400] Bottom line is it's terrible.
[01:28:22.400 --> 01:28:24.000] Health.com, everyone.
[01:28:24.000 --> 01:28:25.920] Beware the title.
[01:28:25.920 --> 01:28:27.440] Yeah, they get health.com.
[01:28:27.440 --> 01:28:28.240] That is terrible.
[01:28:28.240 --> 01:28:28.800] I know.
[01:28:28.800 --> 01:28:33.120] That's a certain injustice, if I may use the word, for that too.
[01:28:33.440 --> 01:28:33.840] All right.
[01:28:33.840 --> 01:28:34.720] Thanks, Evan.
[01:28:34.720 --> 01:28:37.200] Jay, it's who's that noisy time.
[01:28:37.200 --> 01:28:39.680] All right, guys, last week I played This Noisy.
[01:28:55.680 --> 01:28:57.600] That pause is there for a reason.
[01:28:57.600 --> 01:28:58.240] Is it?
[01:28:58.240 --> 01:28:58.800] Yes.
[01:28:58.800 --> 01:29:03.440] So I, every week, I try to figure out what will people guess, you know?
[01:29:03.440 --> 01:29:07.040] And I thought of a couple of things I thought people would guess, and people did guess them.
[01:29:07.040 --> 01:29:08.080] And I'll tell you what they are.
[01:29:08.080 --> 01:29:12.160] So a listener named Brian Sinkowski.
[01:29:12.480 --> 01:29:13.520] Sinkowski's.
[01:29:13.520 --> 01:29:13.920] There it is.
[01:29:13.920 --> 01:29:14.880] He gave me the phonetics.
[01:29:14.880 --> 01:29:15.920] Thank you, Brian.
[01:29:15.920 --> 01:29:16.960] He said, Hi, Jay.
[01:29:16.960 --> 01:29:20.080] I'm sure you've heard this guess already, but this sounds like a rainstick.
[01:29:20.080 --> 01:29:21.840] He said his college roommate had one.
[01:29:21.840 --> 01:29:22.800] Of course, he did.
[01:29:22.800 --> 01:29:29.120] Everybody was getting high, and the rainstick gets pulled out, along with that head scratcher thing made out of metal.
[01:29:29.200 --> 01:29:31.400] Looks like a hanger, a bunch of hangers.
[01:29:31.640 --> 01:29:39.160] Anyway, I knew people were going to guess rainstick, and this absolutely does sound exactly like a rainstick, but it's not a rainstick.
[01:29:39.160 --> 01:29:43.960] The next listener who guessed, this is Tor Tor Strasburg.
[01:29:43.960 --> 01:29:48.360] He said, Hi, this week's Noisy Sounds Like the Quarter Dispenser at the Laundromat.
[01:29:48.360 --> 01:29:50.040] Put in a bill and get quarters.
[01:29:50.040 --> 01:29:51.960] This could be for 20 bucks.
[01:29:51.960 --> 01:29:52.920] Kind of.
[01:29:52.920 --> 01:29:58.760] I mean, you know, it's usually louder and a little bit more high-pitched, but not a horrible guess.
[01:29:58.760 --> 01:30:01.480] I have another guesser here named Edward Myers.
[01:30:01.480 --> 01:30:02.440] Edward says, Hi, Jay.
[01:30:02.920 --> 01:30:05.960] This week, the noisy sounds like dominoes toppling.
[01:30:05.960 --> 01:30:07.480] I'm not sure what color they are.
[01:30:07.480 --> 01:30:09.800] Well, if you could tell that by the sound.
[01:30:09.800 --> 01:30:13.560] Sure, it has a domino kind of effect, but that's not the answer.
[01:30:13.560 --> 01:30:17.240] Emily Markwell said, a sperm whale using echolocation.
[01:30:17.240 --> 01:30:18.120] Whoa.
[01:30:18.120 --> 01:30:19.560] That's a cool guess.
[01:30:19.560 --> 01:30:23.640] I mean, I love echolocation and I love whales, but that is not correct.
[01:30:23.640 --> 01:30:25.960] I'm going to give you one more guess here.
[01:30:25.960 --> 01:30:29.480] Another listener named Micah Swindles said, Hi, Jay.
[01:30:29.720 --> 01:30:33.400] Micah, here from, dude, you know I can't pronounce that.
[01:30:33.400 --> 01:30:36.520] Wayne Wainuomate.
[01:30:36.520 --> 01:30:39.000] Waynui Omata in New Zealand.
[01:30:39.000 --> 01:30:40.920] Wai Nui Omata.
[01:30:40.920 --> 01:30:43.480] Okay, he gave me the phonetic Wai Nui omata.
[01:30:43.480 --> 01:30:45.240] Okay, if you say so.
[01:30:45.480 --> 01:30:49.960] I've seen lots of different videos online recently of big hailstones falling.
[01:30:49.960 --> 01:30:52.920] Multiple people guessed this one, by the way.
[01:30:52.920 --> 01:30:56.360] So he says he thinks it's hailstones, but he doesn't know what they're falling on.
[01:30:56.360 --> 01:30:57.560] That's not a horrible guess either.
[01:30:57.560 --> 01:30:58.200] I like that guess.
[01:30:58.200 --> 01:31:00.120] I think hailstones are cool, but that is not it.
[01:31:00.120 --> 01:31:01.160] Okay, I have a winner.
[01:31:01.160 --> 01:31:09.080] The winner is Jake Bellevoyne, and he says, Hey, all this week's noisy, I believe, is bobbin lace making.
[01:31:09.080 --> 01:31:11.880] The noise is cool, but the video of this art is awesome.
[01:31:11.880 --> 01:31:13.240] Thanks for everything you do.
[01:31:13.240 --> 01:31:15.600] Okay, so how do I describe this?
[01:31:15.760 --> 01:31:37.120] Imagine a bunch of metal, I'm sorry, a bunch of wood dowels that are, let's say they're about five to six inches long, and they have like a bobbin, kind of like a bulbous end on them, kind of weigh them down, and they all are connected to the textile that lace is made out of, right?
[01:31:37.760 --> 01:31:49.200] Each one of them has a line on it, and you use those dowels to move the different threads to make the weave, and there's a lot of them.
[01:31:49.200 --> 01:31:57.680] And the people who do this are so good at it that they are ripping through these dowels with the threads attached to them super fast.
[01:31:57.680 --> 01:32:00.640] Toss them two to the left, two to the right, one under, one over, blah, blah, blah, blah.
[01:32:00.720 --> 01:32:01.200] You know what I mean?
[01:32:01.200 --> 01:32:02.080] It goes super quick.
[01:32:02.080 --> 01:32:04.240] You got to look it up and watch the video.
[01:32:04.240 --> 01:32:10.800] It looks like the person's having like, you know, a neurological incident going on, but they're not because it's all real.
[01:32:10.800 --> 01:32:13.280] They're really doing it and their hands are moving that fast.
[01:32:13.280 --> 01:32:17.920] So let's listen to that again as you can hopefully visualize my explanation.
[01:32:25.280 --> 01:32:35.600] So the noise you're hearing is them whipping those pieces of wood around that have the line attached to them, moving them back and forth and to the right and over and under and all this stuff.
[01:32:35.600 --> 01:32:36.960] It's very complicated.
[01:32:36.960 --> 01:32:45.600] And when you look up and you see where all the threads are going and you see the lace that's partially created, you're like, wow, like really complicated.
[01:32:45.600 --> 01:32:51.040] You have to have a lot of dexterity, probably an enormous amount of patience to just learn how to do that.
[01:32:51.040 --> 01:32:53.600] Anyway, very cool thing that people do.
[01:32:53.600 --> 01:32:57.920] I'm sure machines do it now, but it's still wonderful that people do it with their hands.
[01:32:57.920 --> 01:32:59.280] Anyway, good job, everybody.
[01:32:59.280 --> 01:33:01.480] Lots of fun guesses in there this week.
[01:33:01.720 --> 01:33:03.720] I have a new noisy for you guys this week.
[01:33:03.720 --> 01:33:09.560] This noisy was sent in by a listener named Daniel Berliner, Berliner, and here it is.
[01:33:24.600 --> 01:33:35.640] All right, guys, if you think you know this week's noisy or if you heard something cool, there's only one thing you got to do: email me at wtn at the skepticsguide.org.
[01:33:35.800 --> 01:33:39.320] Guys, as you know, we are recording our 1,000th show this weekend.
[01:33:39.320 --> 01:33:41.880] Tickets are sold, everything is done.
[01:33:42.360 --> 01:33:44.520] The show is just about to happen.
[01:33:44.520 --> 01:33:51.400] But if you would like to show your support for the SGU and the work that we've done, and you couldn't make it to the 1,000th show, become a patron.
[01:33:51.400 --> 01:33:51.960] Please do.
[01:33:51.960 --> 01:33:53.880] Please consider becoming a patron.
[01:33:53.880 --> 01:34:00.440] Not only do we appreciate your support, but all the contributions that we get definitely help us expand on what we do.
[01:34:00.440 --> 01:34:08.200] And on top of that, if you are a patron at the $5 level or higher, you will have access to this five-hour show.
[01:34:08.200 --> 01:34:09.400] You will have access to it.
[01:34:09.400 --> 01:34:12.120] You'll have access to the video, the audio, everything.
[01:34:12.120 --> 01:34:13.480] Both live and recorded.
[01:34:13.720 --> 01:34:15.560] It'll all be there for you, right, Steve?
[01:34:15.560 --> 01:34:15.880] Yep.
[01:34:16.200 --> 01:34:18.040] Anyway, guys, we're looking forward to this weekend.
[01:34:18.040 --> 01:34:19.720] Congratulations to everybody, guys.
[01:34:19.720 --> 01:34:20.280] We did it.
[01:34:20.280 --> 01:34:20.760] We did it.
[01:34:20.760 --> 01:34:21.560] Yay.
[01:34:21.880 --> 01:34:22.840] Great job, Jay.
[01:34:22.840 --> 01:34:23.400] Made it to it.
[01:34:23.640 --> 01:34:23.960] Yep.
[01:34:23.960 --> 01:34:25.160] Made it to a thousand.
[01:34:25.160 --> 01:34:30.040] And I reserve the right to amend that compliment after the show.
[01:34:31.320 --> 01:34:34.120] But I won't, because I'm sure it's going to be kick-ass.
[01:34:34.120 --> 01:34:37.320] All right, guys, let's go on with science or fiction.
[01:34:39.560 --> 01:34:44.120] It's time for science or fiction.
[01:34:48.800 --> 01:34:56.400] Each week, I come up with three science news items or facts: two genuine and one fictitious, and I challenge my panel of skeptics to tell me which one is the fake.
[01:34:56.400 --> 01:34:58.240] We got three regular news items.
[01:34:58.240 --> 01:34:59.120] You guys ready?
[01:34:59.120 --> 01:34:59.520] Yep.
[01:34:59.920 --> 01:35:11.840] Item number one: a new analysis concludes that large language model AIs do not display any truly emergent abilities and therefore do not pose a novel risk.
[01:35:11.840 --> 01:35:21.440] Item number two: researchers modeling data from the InSight Mars lander conclude that the mid-crust of Mars is saturated with liquid water.
[01:35:21.760 --> 01:35:31.760] And item number three, scientists have demonstrated that a set of routine biological markers can diagnose long COVID with over 90% sensitivity and specificity.
[01:35:31.760 --> 01:35:33.040] Kara, go first.
[01:35:33.040 --> 01:35:40.000] A new analysis concludes that large language model AI do not display any truly emergent abilities.
[01:35:40.000 --> 01:35:41.520] Okay, can you define that?
[01:35:41.520 --> 01:35:46.080] So an emergent ability is something that was not specifically trained or programmed or whatever.
[01:35:46.080 --> 01:35:47.760] They just sort of develop this.
[01:35:47.760 --> 01:35:54.960] They extend the boundaries of what they're capable of doing, sort of as an emergent complexity rises.
[01:35:55.120 --> 01:35:56.480] You couldn't have predicted it.
[01:35:56.800 --> 01:36:00.000] Oh, I thought that was kind of the point, but maybe not.
[01:36:00.000 --> 01:36:02.320] So therefore, do not pose a novel risk.
[01:36:02.320 --> 01:36:11.920] Like they could run away potentially with the things that they've been fed and new combinations and I guess permutations of the inputs.
[01:36:12.160 --> 01:36:18.720] They can, you know, solve problems better, but they're not going to, like, I guess, achieve this novel echelon.
[01:36:18.720 --> 01:36:21.120] They're not going to run away and start murdering people.
[01:36:21.120 --> 01:36:21.520] Okay.
[01:36:21.520 --> 01:36:27.360] Researchers modeling data from the InSight Mars lander conclude that the mid-crust of Mars is saturated with liquid water.
[01:36:28.160 --> 01:36:28.560] Okay.
[01:36:28.880 --> 01:36:40.600] Scientists have demonstrated that a set of routine biological markers can diagnose long COVID with over 90% sensitivity and specificity.
[01:36:40.600 --> 01:36:46.840] Okay, so the question with this one is these routine biological markers, you may or may not be able to answer.
[01:36:47.000 --> 01:36:49.320] Blood tests, you know, or x-rays or whatever.
[01:36:49.640 --> 01:36:54.840] So you're talking about people after they have it? You're not talking about things that make them more susceptible?
[01:36:55.000 --> 01:37:04.120] No, no, this is just diagnosing that somebody has it based upon kind of laboratory workup that we would normally do in medicine.
[01:37:05.080 --> 01:37:12.840] So I thought that we were sort of not sure why people get long COVID.
[01:37:13.160 --> 01:37:16.360] 90% sensitivity and specificity.
[01:37:16.600 --> 01:37:18.760] I mean, that would be a game changer.
[01:37:18.760 --> 01:37:25.640] Like if we could just diagnose it, know immediately this is long COVID, develop treatments for it, have this class of people.
[01:37:25.640 --> 01:37:29.480] But I thought this was more of one of those things where it's like, is it long COVID?
[01:37:29.480 --> 01:37:29.960] Is it not?
[01:37:29.960 --> 01:37:31.400] How do we define it?
[01:37:31.400 --> 01:37:35.400] Is it actually, you know, this, this, is it POTS?
[01:37:35.400 --> 01:37:36.040] Is it this?
[01:37:36.040 --> 01:37:38.120] Is it a constellation of factors?
[01:37:38.120 --> 01:37:42.520] So yeah, that one's bugging me as the, I think that one might be the fiction.
[01:37:42.520 --> 01:37:50.520] I don't think that long COVID is a quote, I mean, it is a disease, but I think, you know, how there's, we often make this distinction, right?
[01:37:50.520 --> 01:38:01.400] Maybe you can help me with the terminology between like a constellation of symptoms, like a disorder, and a specific disease, like, you know, you catch an infection or this, this genetic thing changed.
[01:38:01.560 --> 01:38:04.600] So, yeah, I think that one might be the fiction because of that.
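For context on why 90% sensitivity and specificity would be such a game changer, and what it still wouldn't settle: the predictive value of any positive result depends on how common the condition is in the tested population. A minimal Python sketch (the prevalence figures below are illustrative assumptions, not from the study):

def positive_predictive_value(sensitivity: float, specificity: float, prevalence: float) -> float:
    # Fraction of positive results that are true positives, via Bayes' rule.
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

print(positive_predictive_value(0.90, 0.90, 0.10))  # ~0.50: half the positives are false alarms
print(positive_predictive_value(0.90, 0.90, 0.30))  # ~0.79 when the condition is more common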
[01:38:05.240 --> 01:38:06.600] Okay, Jay.
[01:38:06.600 --> 01:38:15.040] Okay, the first one about the large language models and that they don't have emergent behavior.
[01:38:13.400 --> 01:38:16.880] Truly emergent behavior.
[01:38:14.600 --> 01:38:17.680] I mean, whatever.
[01:38:19.040 --> 01:38:23.680] I'm not sure what Steve is using that word for, but they do have emergent behavior.
[01:38:24.000 --> 01:38:29.760] You know, as far as I understand, so I just can't see how that could be science right there.
[01:38:29.760 --> 01:38:31.280] Bob says truly, though.
[01:38:31.280 --> 01:38:32.400] Yeah, that's tough.
[01:38:32.800 --> 01:38:34.400] Let me go on to the next one.
[01:38:34.400 --> 01:38:39.200] The Insight Mars lander concluded that the mid-crust of Mars is saturated with liquid water.
[01:38:39.200 --> 01:38:40.960] Yeah, I think this one is science.
[01:38:40.960 --> 01:38:47.040] You know, we've been searching for water for so long, and I know that we've been drilling and all sorts of things.
[01:38:47.040 --> 01:38:50.480] And I wouldn't be surprised if I found out that this is the truth there.
[01:38:50.480 --> 01:38:57.360] And then the last one, scientists have demonstrated that a set of routine biological markers can diagnose long COVID.
[01:38:57.360 --> 01:38:59.840] A routine set of biological markers.
[01:38:59.840 --> 01:39:01.440] I wonder what the hell that is.
[01:39:01.440 --> 01:39:03.600] It's 90% specific.
[01:39:04.400 --> 01:39:06.400] All right, that's interesting.
[01:39:06.400 --> 01:39:09.360] Yeah, I'm going to say that the first one is fiction.
[01:39:09.840 --> 01:39:15.120] I'm fairly certain that large AI models have emergent capabilities.
[01:39:15.120 --> 01:39:16.080] Okay, Bob.
[01:39:16.480 --> 01:39:17.440] This is tough.
[01:39:18.240 --> 01:39:24.880] The Mars one, I buy the Mars one, but the routine biological markers for long COVID, man.
[01:39:25.200 --> 01:39:26.320] That one's tough.
[01:39:26.320 --> 01:39:30.320] I would think it would be something more extensive.
[01:39:30.320 --> 01:39:31.360] So there's no level.
[01:39:31.440 --> 01:39:37.040] So the implication is that there's no real genetic assessment going on.
[01:39:37.040 --> 01:39:42.560] It's just, right, are they actually looking at the level of the genome or is it just...
[01:39:42.720 --> 01:39:44.080] You said blood tests.
[01:39:44.080 --> 01:39:44.720] Yeah, root.
[01:39:44.800 --> 01:39:49.200] Yeah, it's not going to be like fancy genetic testing because that wouldn't be routine.
[01:39:49.200 --> 01:39:49.680] Right.
[01:39:49.680 --> 01:39:51.360] I just wanted to make that clear.
[01:39:51.360 --> 01:40:01.080] But the thing with number one here, the large language models and the truly emergent behavior: the "truly" is, to me, potentially a key word.
[01:40:01.720 --> 01:40:03.960] So then how do you define emergent?
[01:40:03.960 --> 01:40:14.360] You know, and if you define truly emergent as something really, you know, really dramatic and emergent, that's truly.
[01:40:14.680 --> 01:40:22.440] And so the other lowly, merely minimally emergent behaviors don't really count because they're not truly emergent.
[01:40:22.760 --> 01:40:27.240] You know, it's just like, I don't know, I'm just like picking away at this needlessly.
[01:40:28.120 --> 01:40:37.240] But I remember GPT-4o when that came out, I mean, I was reading some scientists' assessments, and they're like, yeah, this was not anticipated.
[01:40:37.240 --> 01:40:40.040] Some of these behaviors were not anticipated.
[01:40:41.160 --> 01:40:45.880] And back then, you know, they were saying that, yeah, this definitely is emergent.
[01:40:45.880 --> 01:40:55.960] But now that it's kind of taken time, time has passed, they look deeper, maybe it's really, they can explain it, and it's not really emergent.
[01:40:55.960 --> 01:41:01.320] So that wouldn't surprise me because I haven't taken a deep dive in that in quite a while, many, many months.
[01:41:01.320 --> 01:41:03.640] So who knows what they could have discovered?
[01:41:03.640 --> 01:41:09.240] Ah, I think I'll go with, so I can kind of explain the emergent abilities.
[01:41:09.240 --> 01:41:17.000] So then I'll have to default to the routine biological markers detecting long COVID, with Kara.
[01:41:17.000 --> 01:41:17.160] Okay.
[01:41:17.800 --> 01:41:18.760] And Evan.
[01:41:20.200 --> 01:41:21.800] Yeah, all the same reasons.
[01:41:21.800 --> 01:41:31.960] Everyone else said. I don't think I have anything really much to add here, except that the mid-crust of Mars may not be saturated with liquid water.
[01:41:31.960 --> 01:41:35.480] It could be a marshmallow or cream filling instead.
[01:41:35.640 --> 01:41:37.320] That could be the fiction for that one.
[01:41:37.320 --> 01:41:39.160] You didn't think about that.
[01:41:39.160 --> 01:41:41.400] Otherwise, everything else was expressed.
[01:41:41.400 --> 01:41:42.360] So, okay, right.
[01:41:42.360 --> 01:41:44.680] Is it going to be the AI or is it going to be the COVID one?
[01:41:44.960 --> 01:41:45.680] I don't know.
[01:41:47.040 --> 01:41:50.080] Again, emergent, truly emergent, right?
[01:41:50.080 --> 01:41:50.720] So, hmm.
[01:41:51.120 --> 01:41:54.640] And therefore, do not pose a novel risk.
[01:41:54.640 --> 01:42:06.160] So, you kind of got two pieces in this one about the AI that could either throw it off, or if one's dependent on the other, bring it all in one direction, if you know what I mean.
[01:42:06.160 --> 01:42:10.800] But I will choose... Kara, Bob, I'll go with you guys.
[01:42:10.800 --> 01:42:11.840] I'll join the COVID.
[01:42:11.840 --> 01:42:14.240] Jay, I almost almost went AI.
[01:42:14.240 --> 01:42:16.640] It was really practically a coin flip.
[01:42:16.640 --> 01:42:18.240] So, I'm sorry to leave you there.
[01:42:18.240 --> 01:42:19.840] All right, so we'll start with number two.
[01:42:19.840 --> 01:42:27.440] Researchers modeling data from the InSight Mars lander conclude that the mid-crust of Mars is saturated with liquid water.
[01:42:27.440 --> 01:42:33.360] You all think this one is science, and this one is science.
[01:42:33.360 --> 01:42:34.720] Is this science?
[01:42:34.720 --> 01:42:37.920] So, yes, saturated, saturated with water.
[01:42:37.920 --> 01:42:39.040] Now, why do they think this?
[01:42:39.680 --> 01:42:41.680] They were sort of examining data.
[01:42:41.680 --> 01:42:50.560] Basically, this is like seismic data, and they're looking at how the seismic waves move through the crust.
[01:42:50.560 --> 01:42:51.680] And when they interpret.
[01:42:52.000 --> 01:42:54.560] What causes the seismic waves in this case?
[01:42:54.880 --> 01:42:58.320] I guess there are Marsquakes, and they measure them.
[01:42:59.200 --> 01:43:10.800] So, essentially, when they interpret the data, they say the best explanation for the data that we have is that the mid-crust is saturated with liquid water.
[01:43:10.800 --> 01:43:18.480] So, it means the upper crust has dried out, but there's still a substantial amount of water in the mid-crust.
[01:43:18.480 --> 01:43:22.560] The mid-crust is 11.5 to 20 kilometers down.
[01:43:22.880 --> 01:43:33.800] So, that's pretty deep, but it means if we could drill 12 kilometers down from the surface on Mars, we might have access to liquid water, right?
[01:43:33.800 --> 01:43:34.760] Basically, a well.
[01:43:29.680 --> 01:43:36.520] That's still pretty deep, though.
[01:43:36.840 --> 01:43:38.680] Yeah, quite deep.
[01:43:38.680 --> 01:43:39.240] All right.
[01:43:39.240 --> 01:43:41.080] Which one should we go to next?
[01:43:41.080 --> 01:43:42.760] Let's go back to number one.
[01:43:42.760 --> 01:43:50.920] A new analysis concludes that large language model AIs do not display any truly emergent abilities and therefore do not pose a novel risk.
[01:43:50.920 --> 01:43:52.760] Jay, you think this one is the fiction.
[01:43:52.760 --> 01:43:54.920] Everyone else thinks this one is science.
[01:43:54.920 --> 01:44:04.840] What if I told you that truly emergent is as opposed to apparently emergent, right?
[01:44:04.840 --> 01:44:12.040] So not minimally emergent, but just that it appeared to be emergent, but it really isn't.
[01:44:12.040 --> 01:44:14.120] Yeah, like they figured, yeah, they figured it out.
[01:44:14.360 --> 01:44:21.240] Because if you can figure out how it arose, really, in a sense, then it isn't truly emergent, by definition.
[01:44:21.320 --> 01:44:23.000] Does that change your choice at all?
[01:44:23.000 --> 01:44:23.720] If I said that?
[01:44:23.720 --> 01:44:24.760] But yeah, potentially.
[01:44:25.400 --> 01:44:26.040] Yeah.
[01:44:26.360 --> 01:44:31.480] Well, I mean, I wouldn't necessarily switch, but that makes sense.
[01:44:31.480 --> 01:44:35.560] Well, this one is science.
[01:44:35.560 --> 01:44:36.600] Sorry, Jay.
[01:44:37.480 --> 01:44:45.640] Yeah, so what they found was that the behavior that was thought to be emergent actually isn't.
[01:44:45.640 --> 01:44:58.360] That it says our findings suggest that purported emergent abilities are not truly emergent, but result from a combination of in-context learning, model memory, and linguistic knowledge.
[01:44:58.360 --> 01:45:05.640] So the in-context learning is basically, training on a specific question, right, or a specific body of knowledge.
[01:45:05.640 --> 01:45:12.840] So, the point of this was to understand how these models work and how they perform, right?
[01:45:12.840 --> 01:45:19.040] So, understanding how it gets to the behaviors that it displays was sort of the point.
[01:45:14.200 --> 01:45:25.440] And they're saying, Yeah, this is, when you look at it, it's not going outside the boundaries of what is already there.
[01:45:25.440 --> 01:45:28.000] It is not a truly new emergent behavior.
[01:45:28.000 --> 01:45:36.880] It is just doing what it does in the context of this, you know, again, this context learning model, memory, et cetera, explains everything.
[01:45:37.440 --> 01:45:50.240] And then the bit about therefore doesn't pose a novel risk, that was more spin, but that just basically means it doesn't pose the risk that you would think truly emergent behavior does because that's completely unpredictable, right?
[01:45:50.240 --> 01:46:00.880] It's meaning because it's something truly emergent by definition you can't predict because it's not extrapolated from what it already could do, right?
[01:46:01.200 --> 01:46:06.560] But of course, that's the part that the news releases went with.
[01:46:06.560 --> 01:46:10.640] And then, when I went to the study, it wasn't about that at all, you know?
[01:46:10.960 --> 01:46:18.800] So, like, the news release says AI poses no existential threat to humanity, new study finds.
[01:46:19.120 --> 01:46:24.720] The study says: Are emergent abilities in large language models just in-context learning?
[01:46:24.720 --> 01:46:26.160] That was their headline.
[01:46:26.160 --> 01:46:27.680] Completely different, you know?
[01:46:28.160 --> 01:46:28.800] What?
[01:46:28.960 --> 01:46:30.000] So always the case.
[01:46:30.560 --> 01:46:31.360] Yeah, amazing.
[01:46:31.680 --> 01:46:36.640] I'm really hating those titles even more than I used to.
[01:46:36.640 --> 01:46:55.600] All right, this means that scientists have demonstrated that a set of routine biological markers can diagnose long COVID with over 90% sensitivity and specificity is the fiction because the truth is, what the study shows is that there are no biological markers that could tell you if you have long COVID or not.
[01:46:55.600 --> 01:46:57.760] We have no markers for long COVID.
[01:46:58.080 --> 01:46:58.720] So, how do we?
[01:46:59.200 --> 01:47:01.720] It's a clinical diagnosis, as you were saying, right?
[01:47:01.720 --> 01:47:02.840] It's a syndrome or whatever.
[01:47:02.920 --> 01:47:03.640] It's just a bunch of symptoms.
[01:46:59.920 --> 01:47:04.840] Syndrome, that's what it is.
[01:47:04.920 --> 01:47:13.080] It's a clinical diagnosis, and it's not, we don't understand the pathophysiology, and therefore there are no pathophysiological markers, right?
[01:47:13.080 --> 01:47:13.640] Like the functionality.
[01:47:13.800 --> 01:47:15.800] But that doesn't mean that we won't in the future.
[01:47:15.800 --> 01:47:21.480] It doesn't, but there are certainly entities that have remained in that category for decades.
[01:47:21.480 --> 01:47:22.680] So, you know what I mean?
[01:47:22.920 --> 01:47:26.360] It also means that then the boundaries are always going to be blurry, right?
[01:47:27.080 --> 01:47:28.840] Like, this person might, this something.
[01:47:29.080 --> 01:47:34.360] It's like if we rule out everything else, this is maybe what's going on, but we don't really know.
[01:47:34.360 --> 01:47:37.720] I mean, it may just be that there's nothing pathological to find.
[01:47:37.720 --> 01:47:45.880] You know, sometimes there are post-infectious syndromes where people have symptoms, and it's probably based on the fact that their nerves are functioning differently.
[01:47:45.880 --> 01:47:48.760] It's a functional, you know, problem at that point.
[01:47:48.760 --> 01:47:56.200] It's like, yeah, at a cellular level, at a neurological level, a metabolic level, things are a bit off, but there's nothing measurable, right?
[01:47:56.200 --> 01:47:59.320] There's no pathophysiological smoking gun.
[01:47:59.320 --> 01:48:02.120] And therefore, it will always be in that category.
[01:48:02.120 --> 01:48:06.680] There's never going to be a blood test for like a migraine, for example.
[01:48:07.080 --> 01:48:07.480] Yeah.
[01:48:07.480 --> 01:48:07.960] But yes.
[01:48:08.120 --> 01:48:16.440] I think it's sad, though, because there are certain things that, like, if you say, if you describe your migraine and it's like follows the textbook, it's like, yeah, you have a migraine.
[01:48:16.520 --> 01:48:17.000] It could be very similar to the same thing.
[01:48:17.160 --> 01:48:28.920] With long COVID, very similar to like fibromyalgia, very similar to, like, a lot of people just aren't believed, they aren't really listened to, they're sort of minimized, but they don't feel well.
[01:48:28.920 --> 01:48:31.720] And they know that they don't feel well, and they're trying to get answers.
[01:48:31.880 --> 01:48:45.840] But I will say, I do think, even over the course of my career, over the last 20 years, the sort of acceptance of these diagnoses, and the skepticism towards patients who have them, I think, has improved.
[01:48:46.160 --> 01:48:53.600] Meaning, it's basically accepted that, yeah, people have these syndromes, even though we can't find any pathological smoking gun.
[01:48:53.600 --> 01:48:56.480] And they're just as real, you know, and whatever.
[01:48:56.480 --> 01:48:57.680] We need to take them seriously.
[01:48:57.680 --> 01:49:03.920] We need to, you know, basically being thinking that it's all in the patient's head or that they're faking doesn't get you anywhere.
[01:49:03.920 --> 01:49:08.480] It's not really accurate, and it's just an unwarranted negative judgment.
[01:49:08.480 --> 01:49:13.520] Yeah, and that may be the trend when you look at these things systemically.
[01:49:13.520 --> 01:49:21.040] But I can tell you, being a psychologist in a healthcare setting, every, not every, but many women that I talked to had experiences.
[01:49:21.280 --> 01:49:21.520] Totally.
[01:49:21.520 --> 01:49:23.120] It is more targeted towards women.
[01:49:23.120 --> 01:49:23.760] Absolutely.
[01:49:23.760 --> 01:49:27.440] More targeted towards women, more targeted towards people of color, where they were not believed.
[01:49:27.440 --> 01:49:33.440] Like very often, you know, patients with cancer diagnoses, it's caught later in women, it's caught later in people of color.
[01:49:33.440 --> 01:49:38.400] Heart attacks are caught later in women because they present atypically and they're not believed as much.
[01:49:38.400 --> 01:49:38.720] Absolutely.
[01:49:39.200 --> 01:49:41.840] And these are things that we do have measurements for.
[01:49:41.840 --> 01:49:43.520] So now you talk about something where there are no measurements.
[01:49:43.680 --> 01:49:46.080] They're just not doing the test when they should.
[01:49:46.080 --> 01:49:47.520] But in this case, there isn't a test.
[01:49:47.520 --> 01:49:48.880] It's all clinical syndrome.
[01:49:49.120 --> 01:49:53.200] And so you can imagine how much harder it is to be heard and believed when that's the case.
[01:49:53.200 --> 01:49:53.600] Yeah.
[01:49:53.600 --> 01:49:54.480] It's a bummer.
[01:49:54.480 --> 01:49:56.640] Yeah, we have to constantly remind ourselves of this fact.
[01:49:56.640 --> 01:50:01.440] This is like, you know, part of being a good clinician is sort of beating all of your biases out of yourself.
[01:50:01.440 --> 01:50:02.000] You know what I mean?
[01:50:02.000 --> 01:50:05.120] You really, those cultural biases are not good.
[01:50:05.840 --> 01:50:10.960] They do interfere with your ability to objectively and accurately assess your patient.
[01:50:12.480 --> 01:50:12.960] All right.
[01:50:12.960 --> 01:50:14.160] Evan, give us a quote.
[01:50:14.480 --> 01:50:17.760] It is important to promote a scientific attitude.
[01:50:17.760 --> 01:50:22.640] This is much more important than really understanding all concepts about science.
[01:50:22.640 --> 01:50:30.280] If you understand how science works and you develop a scientific attitude towards the world, you are going to be a skeptic.
[01:50:30.600 --> 01:50:32.520] Natalia Pasternak.
[01:50:29.520 --> 01:50:35.720] She's a microbiologist, an author, a science communicator.
[01:50:36.040 --> 01:50:40.680] She's from Brazil, but she'll be with us in Las Vegas in October at CSICon.
[01:50:41.080 --> 01:50:41.960] Yeah, she's wonderful.
[01:50:41.960 --> 01:50:43.400] I had her on my show a while back.
Prompt 6: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
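A pipeline consuming this reply would normally validate it before storing anything; here is a minimal Python sketch of such a check (the function name is hypothetical, not part of the prompt):

import json

def parse_key_takeaways(raw_reply: str) -> list[str]:
    # json.loads rejects malformed JSON (unbalanced braces, trailing commas, etc.).
    data = json.loads(raw_reply)
    takeaways = data.get("key_takeaways")
    if not isinstance(takeaways, list) or not all(isinstance(t, str) for t in takeaways):
        raise ValueError("expected {'key_takeaways': [list of strings]}")
    if len(takeaways) > 3:
        raise ValueError("prompt allows at most 3 takeaways")
    return takeaways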
Prompt 7: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
Prompt 8: Media Mentions
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
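Implementing the "use the EARLIER of the 2 timestamps" rule above is straightforward once the caption range is parsed; a minimal Python sketch (the function name is hypothetical):

import re

VTT_RANGE = re.compile(r"(\d{2}:\d{2}:\d{2})\.\d{3} --> (\d{2}:\d{2}:\d{2})\.\d{3}")

def mention_timestamp(range_text: str) -> str:
    # Return the start of the caption range, truncated to HH:MM:SS.
    m = VTT_RANGE.search(range_text)
    if m is None:
        raise ValueError("not a VTT timestamp range: " + repr(range_text))
    return m.group(1)

print(mention_timestamp("01:13:42.520 --> 01:13:46.720"))  # prints 01:13:42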
Prompt 9: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 3 of 3 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
caught later in women because they present atypically and they're not believed as much.
[01:49:38.400 --> 01:49:38.720] Absolutely.
[01:49:39.200 --> 01:49:41.840] And these are things that we do have measurements for.
[01:49:41.840 --> 01:49:43.520] So now you talk about something where there are no measurements.
[01:49:43.680 --> 01:49:46.080] They're just not doing the test when they should.
[01:49:46.080 --> 01:49:47.520] But in this case, there isn't a test.
[01:49:47.520 --> 01:49:48.880] It's all clinical syndrome.
[01:49:49.120 --> 01:49:53.200] And so you can imagine how much harder it is to be heard and believed when that's the case.
[01:49:53.200 --> 01:49:53.600] Yeah.
[01:49:53.600 --> 01:49:54.480] It's a bummer.
[01:49:54.480 --> 01:49:56.640] Yeah, we have to constantly remind ourselves of this fact.
[01:49:56.640 --> 01:50:01.440] This is like, you know, part of being a good clinician is sort of beating all of your biases out of yourself.
[01:50:01.440 --> 01:50:02.000] You know what I mean?
[01:50:02.000 --> 01:50:05.120] You really, those cultural biases are not good.
[01:50:05.840 --> 01:50:10.960] They do interfere with your ability to objectively and accurately assess your patient.
[01:50:12.480 --> 01:50:12.960] All right.
[01:50:12.960 --> 01:50:14.160] Evan, give us a quote.
[01:50:14.480 --> 01:50:17.760] It is important to promote a scientific attitude.
[01:50:17.760 --> 01:50:22.640] This is much more important than really understanding all concepts about science.
[01:50:22.640 --> 01:50:30.280] If you understand how science works and you develop a scientific attitude towards the world, you are going to be a skeptic.
[01:50:30.600 --> 01:50:32.520] Natalia Pasternak.
[01:50:29.520 --> 01:50:35.720] She's a microbiologist, an author, a science communicator.
[01:50:36.040 --> 01:50:40.680] She's from Brazil, but she'll be with us in Las Vegas in October at CSICon.
[01:50:41.080 --> 01:50:41.960] Yeah, she's wonderful.
[01:50:41.960 --> 01:50:43.400] I had her on my show a while back.
[01:50:43.400 --> 01:50:48.920] She was kind of responsible for like a lot of pushback against Bolsonaro's COVID policy.
[01:50:48.920 --> 01:50:49.240] Wow.
[01:50:49.240 --> 01:50:51.160] I think she made huge waves in Brazil.
[01:50:51.160 --> 01:50:51.960] That's awesome.
[01:50:51.960 --> 01:50:52.520] Yeah.
[01:50:52.520 --> 01:50:58.200] Yeah, we're looking forward to meeting all these people in Vegas and seeing all their talks.
[01:50:58.200 --> 01:50:58.760] Yeah.
[01:50:58.760 --> 01:51:01.720] I know we're always busy, but I do try to get to some of the good ones.
[01:51:01.720 --> 01:51:02.120] Definitely.
[01:51:02.120 --> 01:51:04.040] There'll be a lot of good ones this year.
[01:51:04.040 --> 01:51:04.680] All right, guys.
[01:51:04.680 --> 01:51:06.920] Well, thank you all for joining me this week.
[01:51:06.920 --> 01:51:07.800] You're welcome, brother.
[01:51:07.960 --> 01:51:08.200] Thanks, guys.
[01:51:08.280 --> 01:51:09.000] Thanks, Doctor.
[01:51:09.000 --> 01:51:13.480] And until next week, this is your Skeptic's Guide to the Universe.
[01:51:15.720 --> 01:51:22.440] Skeptic's Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking.
[01:51:22.440 --> 01:51:27.080] For more information, visit us at theskepticsguide.org.
[01:51:27.080 --> 01:51:31.000] Send your questions to info at the skepticsguide.org.
[01:51:31.000 --> 01:51:41.720] And if you would like to support the show and all the work that we do, go to patreon.com/slash skepticsguide and consider becoming a patron and becoming part of the SGU community.
[01:51:41.720 --> 01:51:45.080] Our listeners and supporters are what make SGU possible.
[01:51:52.760 --> 01:51:55.800] You don't have to wait for a milestone or holiday.
[01:51:55.800 --> 01:51:56.920] Celebrate now.
[01:51:56.920 --> 01:52:04.200] Take a pause from the everyday and allow yourself the gift of wellness and rejuvenation in a truly awe-inspiring setting.
[01:52:04.200 --> 01:52:12.200] Guided nature hikes, wellness classes, evening entertainment, and locally sourced cuisine, all included in your stay package.
[01:52:12.200 --> 01:52:18.400] Feel worlds away and experience for yourself why Mohonk Mountain House was voted number one resort.
[01:52:18.400 --> 01:52:23.040] Now is the perfect time to plan your getaway at mohonk.com.
Prompt 10: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 11: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
  "segments": [
    {
      "segment_title": "Topic Discussion",
      "timestamp": "01:15:30",
      "key_takeaway": "main point from this segment",
      "segment_summary": "brief description of what was discussed"
    }
  ]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
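A minimal sketch of the timestamp sanity check the prompt describes, assuming a Python post-processing step (Python, the function names, and the audio-length source here are my assumptions, not part of the recorded pipeline):

def hhmmss_to_seconds(ts: str) -> float:
    # Parse "HH:MM:SS" (optionally with a ".mmm" fraction) into seconds.
    h, m, s = ts.split(":")
    return int(h) * 3600 + int(m) * 60 + float(s)

def validate_segments(segments: list, audio_length: str) -> list:
    # Keep only segments whose START timestamp lies within the audio;
    # per the prompt, a timestamp can never exceed the total audio length.
    limit = hhmmss_to_seconds(audio_length)
    return [s for s in segments if hhmmss_to_seconds(s["timestamp"]) <= limit]

For this episode the last caption ends at 01:52:23.040, so validate_segments(parsed["segments"], "01:52:24") would drop any hallucinated timestamp past the end of the audio.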
Prompt 12: Media Mentions
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
  "media_mentions": [
    {
      "title": "Exact Title as Mentioned",
      "category": "Book",
      "author_artist": "N/A",
      "context": "Brief context of why it was mentioned",
      "context_phrase": "The exact sentence or phrase where it was mentioned",
      "timestamp": "estimated time like 01:15:30"
    }
  ]
}
If no media is mentioned, return: {"media_mentions": []}
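Since the prompt requires the EARLIER endpoint of each caption range, a helper along these lines (same hedged Python assumption as above) can normalize a VTT range into the HH:MM:SS mention timestamp:

def mention_timestamp(caption_range: str) -> str:
    # "01:13:42.520 --> 01:13:46.720" -> "01:13:42"
    # (earlier endpoint of the range, truncated to HH:MM:SS).
    start = caption_range.split("-->")[0].strip()
    return start.split(".")[0]

So mention_timestamp("01:13:42.520 --> 01:13:46.720") returns "01:13:42", matching the format the prompt asks for.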
Full Transcript
[00:01:51.600 --> 00:01:52.160] I don't know.
[00:01:52.160 --> 00:01:52.800] Jay.
[00:01:54.080 --> 00:02:00.440] Jay, I say this not to make you feel uncomfortable because this is you are like my brother, so this is separate to you.
[00:01:59.760 --> 00:02:02.840] I am weirdly attracted to left-handed people.
[00:02:02.840 --> 00:02:03.160] Really?
[00:02:03.160 --> 00:02:03.640] That is weird.
[00:01:59.840 --> 00:02:04.760] I agree with you.
[00:01:59.920 --> 00:02:05.160] That is weird.
[00:02:05.480 --> 00:02:06.520] Has anybody else?
[00:02:06.600 --> 00:02:07.480] Have you noticed that?
[00:02:07.480 --> 00:02:08.280] And how do you know?
[00:02:08.680 --> 00:02:13.640] I've noticed that I think I date a higher-than-normal proportion of lefties.
[00:02:13.720 --> 00:02:15.400] You're sure you're not just that's interesting, Kara.
[00:02:15.640 --> 00:02:20.040] I'm just wondering what other traits come along with being sinistral.
[00:02:20.200 --> 00:02:20.520] Sinister.
[00:02:20.600 --> 00:02:20.920] Sinister.
[00:02:21.160 --> 00:02:21.880] Sinistra.
[00:02:21.880 --> 00:02:22.440] Sinistra.
[00:02:22.520 --> 00:02:23.000] Sorry, Kara.
[00:02:23.000 --> 00:02:24.920] I didn't mean to kink shame you there, Kara.
[00:02:24.920 --> 00:02:25.240] Thank you.
[00:02:25.640 --> 00:02:31.560] What's funny is that that was an answer in a recent New York Times crossword puzzle.
[00:02:31.560 --> 00:02:32.200] Kinkshame.
[00:02:32.200 --> 00:02:33.320] I know, kink shame.
[00:02:33.320 --> 00:02:41.480] And my wife had never heard the word before, and she doubted it was real until I was proven correct when we finished the puzzle.
[00:02:41.800 --> 00:02:43.960] We got that one right away.
[00:02:44.920 --> 00:02:52.520] What I'm not sure about is why would some people decide, like, hey, we're going to push to have like a national international left-handed today?
[00:02:52.520 --> 00:02:57.720] Like, you know, I wasn't teased or anything, so I don't need the support aspect of it.
[00:02:57.720 --> 00:02:58.680] I just don't get it.
[00:02:58.680 --> 00:03:01.000] Maybe people were back in the day a lot more.
[00:03:01.000 --> 00:03:02.520] I mean, a lot of people were forced to be right-handed.
[00:03:04.120 --> 00:03:08.200] I think when we were young, we were already past the anti-left-handed phase.
[00:03:08.360 --> 00:03:14.120] That was still, I remember when we were young, talking about, oh, in the past, people would force you to write right-handed and whatever, but that was already done.
[00:03:14.120 --> 00:03:14.680] Yeah.
[00:03:14.680 --> 00:03:28.760] But to be fair, even though in the past you were, you know, lefties were forced to write right-handed, Jay, you're still, just by virtue of the structures that you live in, forced to be right-handed about certain things, right?
[00:03:28.760 --> 00:03:30.120] I don't think so.
[00:03:30.120 --> 00:03:32.120] I mean, look, I have to use like scissors.
[00:03:32.280 --> 00:03:33.640] You have to use right-handed scissors.
[00:03:33.960 --> 00:03:38.520] I can cut with my right hand because most left-handed people learn to accommodate.
[00:03:38.840 --> 00:03:40.280] That's exactly what I mean.
[00:03:40.280 --> 00:03:45.760] There are things in this world that are handed, and you just have to learn how to use them.
[00:03:45.760 --> 00:03:49.360] Yeah, I mean, I could use both hands for lots of different things.
[00:03:50.880 --> 00:03:51.920] I mean, I.
[00:03:52.000 --> 00:03:54.080] I think that's even more likely to be ambidextrous.
[00:03:54.400 --> 00:03:55.920] I can shoot a gun both ways.
[00:03:55.920 --> 00:03:58.480] I can swing rackets both ways.
[00:04:00.000 --> 00:04:01.520] I can't golf both ways.
[00:04:02.640 --> 00:04:04.160] No swing, no backswing.
[00:04:04.160 --> 00:04:06.240] And I golf right-handed, by the way.
[00:04:06.880 --> 00:04:07.440] Oh, yeah?
[00:04:07.840 --> 00:04:09.520] But you shoot bows left-handed.
[00:04:09.520 --> 00:04:11.840] I shoot bows left-handed, but I could shoot the other way.
[00:04:11.840 --> 00:04:13.280] I could shoot just as good the other way.
[00:04:13.680 --> 00:04:20.640] So you probably can't remember, but I'm curious about y'all's experiences because Jay's the baby.
[00:04:20.640 --> 00:04:32.400] You know, my best friend has a little girl who's about to turn two, and it was obvious so early that she was left, that she is going, we say going to be, but whatever, that she is left-handed.
[00:04:32.400 --> 00:04:39.280] She just does things with her left hand that seem weird to us, but clearly are very natural for her.
[00:04:39.280 --> 00:04:40.400] When did you know?
[00:04:40.400 --> 00:04:43.760] Like, did you guys notice that Jay was left-handed when he was a little baby?
[00:04:43.760 --> 00:04:48.080] The first indication was when I was probably about three or four years old.
[00:04:48.080 --> 00:04:50.080] I was sucking my left thumb.
[00:04:50.480 --> 00:04:51.200] Oh, three or four.
[00:04:51.360 --> 00:04:53.360] That develops in the womb, thumb sucking.
[00:04:53.680 --> 00:04:55.280] I probably was doing that even earlier.
[00:04:56.000 --> 00:04:59.200] It emerges between 18 months and four years old, so that's not a natural.
[00:04:59.440 --> 00:05:02.960] Yeah, this little girl is using her spoon with her left hand.
[00:05:03.440 --> 00:05:08.720] If you hand her a crayon and she goes to like take crayon to paper, she always grabs it with her left hand.
[00:05:08.880 --> 00:05:11.040] Will she cross the midline with her left hand?
[00:05:11.040 --> 00:05:11.520] I don't know.
[00:05:11.520 --> 00:05:12.080] We should check that.
[00:05:12.160 --> 00:05:12.800] That's the test.
[00:05:13.200 --> 00:05:13.440] Okay.
[00:05:13.760 --> 00:05:14.560] Kara for sure.
[00:05:14.560 --> 00:05:16.720] Now that I think about it, we always knew.
[00:05:16.720 --> 00:05:17.200] Yeah.
[00:05:17.200 --> 00:05:17.440] Yeah.
[00:05:17.440 --> 00:05:19.760] My mom and I always knew there was something different.
[00:05:19.760 --> 00:05:39.640] And so, did you know that, okay, I think, and Steve, maybe you can fact-check me on this, or somebody can Google furiously, that left-handedness occurs in about 10% of the population, and that about 10% of those lefties have their language center of their brain lateralized to the right.
[00:05:40.280 --> 00:05:40.680] Something like that.
[00:05:40.680 --> 00:05:41.080] It might even be.
[00:05:41.240 --> 00:05:42.360] It's something like 10 of 10.
[00:05:42.360 --> 00:05:43.800] Yeah, it might not be a perfect number.
[00:05:44.280 --> 00:05:47.320] So, I wonder, Jay, have you ever had like an fMRI?
[00:05:47.320 --> 00:05:48.360] I don't think about this.
[00:05:48.360 --> 00:05:50.680] No, I've had MRIs, but nothing.
[00:05:50.760 --> 00:05:53.720] But they're not like a brain, a brain scan that shows function.
[00:05:53.720 --> 00:05:55.560] I would be so curious.
[00:05:56.040 --> 00:05:58.200] The estimates are between 15 and 30 percent.
[00:05:58.520 --> 00:05:59.000] Oh, wow, that's right.
[00:05:59.240 --> 00:06:01.880] That the right hemisphere is dominant for language in left-handed people.
[00:06:01.880 --> 00:06:04.760] And it's like 2 to 5% in right-handed people.
[00:06:04.760 --> 00:06:06.120] Oh, that's much higher than I thought.
[00:06:06.120 --> 00:06:06.920] That's amazing.
[00:06:06.920 --> 00:06:07.400] Yeah.
[00:06:07.400 --> 00:06:08.280] Yeah, that's so good.
[00:06:08.440 --> 00:06:09.000] We have to know that.
[00:06:09.000 --> 00:06:15.720] Like, when you do a neurological exam, you're supposed to ask and record the handedness of the person for a couple of reasons.
[00:06:15.720 --> 00:06:24.360] One is when you're interpreting the exam, like they should be a little bit better with their dominant hand and their non-dominant hand on some of the things we have them do.
[00:06:24.360 --> 00:06:31.000] And if it's reversed, that could mean that they have like a stroke, you know, like that they're impaired with what should be their dominant hand.
[00:06:31.000 --> 00:06:39.960] But also, when we're localizing like a stroke, for example, you have to know if there is a reasonable probability that they're cross-dominant, right?
[00:06:39.960 --> 00:06:41.080] That their right hemisphere is dominant.
[00:06:41.240 --> 00:06:45.000] Yeah, you need to know functionally how those things are going to affect them then.
[00:06:45.000 --> 00:06:54.520] And also, from a neuropsych perspective, the neuropsych assessments that are given, handedness matters for a lot of the motor assessments because they're normed.
[00:06:54.520 --> 00:07:04.280] And so, if you don't know about their handedness and you look at the norms tables with an assumption that somebody is right-handed, you may misinterpret their scoring.
[00:07:04.280 --> 00:07:05.840] Yeah, no, it's interesting.
[00:07:05.480 --> 00:07:06.160] Yeah.
[00:07:06.440 --> 00:07:14.480] And, Evan, before we started the show, you were asking about the genetics of handedness, and the short answer is that it's complicated.
[00:07:14.040 --> 00:07:17.600] It is not a one-gene, one-trait thing.
[00:07:17.760 --> 00:07:26.800] It's not even necessarily just like it's a multi-I mean, it's definitely multi-gene, but it's nothing we could predict based upon the genes because it's partly developmental, too.
[00:07:26.800 --> 00:07:27.280] Yeah.
[00:07:27.280 --> 00:07:27.600] Right?
[00:07:27.840 --> 00:07:31.920] Meaning that it's not 100% determined just by the genes.
[00:07:31.920 --> 00:07:34.960] So there's no simple inheritance pattern as the bottom line.
[00:07:34.960 --> 00:07:38.080] Like for Jay, everyone else in our family is right-handed.
[00:07:38.080 --> 00:07:40.640] Jay's an isolated left-hander, you know.
[00:07:40.640 --> 00:07:43.440] And Jay, what were you saying about you and your wife and your kids?
[00:07:43.440 --> 00:07:47.040] Yeah, so my wife is left-handed, but my kids are both right-handed.
[00:07:47.040 --> 00:07:48.480] That's fascinating.
[00:07:48.480 --> 00:07:59.520] Yeah, children of two left-handed parents are still more likely to be right-handed, even though they're more likely to be left-handed than people born of right-handed parents.
[00:07:59.760 --> 00:08:03.440] So there is an increased probability, but it's still the minority.
[00:08:03.760 --> 00:08:16.960] Oh, and even just recently, like there was a study that was literally just published in April of this year where a rare genetic variant was identified that presents in fewer than 1% of people.
[00:08:16.960 --> 00:08:22.720] But those people are 2.7 times more likely to be lefties than righties.
[00:08:22.960 --> 00:08:25.760] So it's like they're still identifying the complex genetics.
[00:08:25.760 --> 00:08:26.160] Yeah.
[00:08:26.480 --> 00:08:38.320] And identical twins, they're more likely a higher chance of matching their handedness, but they can have opposite-handedness, which tells you it's not 100% genetic, otherwise, it would have to be the same every time.
[00:08:38.320 --> 00:08:39.200] But it's not.
[00:08:39.200 --> 00:08:41.520] And usually they also would look with twin studies, right?
[00:08:41.520 --> 00:08:51.920] You want to look at those really interesting studies where they're monozygotic, but they were raised separately and see if that varies based on raised separately versus raised together.
[00:08:51.920 --> 00:08:54.120] So that's left-handedness, Dang, for you.
[00:08:53.920 --> 00:08:54.280] Huh.
[00:08:54.560 --> 00:08:55.200] Yeah.
[00:08:55.200 --> 00:08:58.160] Wasn't it Flanders?
[00:08:58.160 --> 00:08:59.360] Was he left-handed?
[00:08:59.360 --> 00:08:59.880] Yes.
[00:08:59.880 --> 00:09:01.560] Ned Flanders was, yes.
[00:09:01.640 --> 00:09:02.520] Yeah, he had the Leftorium.
[00:09:02.680 --> 00:09:03.240] The left-handed store.
[00:09:03.400 --> 00:09:04.120] Yeah, the Leftorium.
[00:08:59.520 --> 00:09:05.160] The Leftorium.
[00:09:05.480 --> 00:09:08.520] I think there's one of those in like at Universal Studios here.
[00:09:09.720 --> 00:09:10.440] Like a left-handed.
[00:09:10.600 --> 00:09:13.560] With legitimate left-hand devices for purchase.
[00:09:13.880 --> 00:09:15.240] Oh, my gosh.
[00:09:15.560 --> 00:09:19.000] Don't quote me on that, but I'm pretty sure there is, or maybe it's at Disney.
[00:09:19.320 --> 00:09:21.320] I wouldn't be surprised.
[00:09:21.640 --> 00:09:35.160] Jay, the thing that's always, to me, would seem like this would be the one that the thing that would annoy me most about living in a right-handed world, if you were left-handed, is that when you write, you have to push your letters rather than pull them.
[00:09:35.480 --> 00:09:38.920] And you have to move your hand over what you've already written.
[00:09:38.920 --> 00:09:40.280] Yeah, so you're smudging.
[00:09:40.440 --> 00:09:42.040] Side of your hand's always smudgy.
[00:09:42.040 --> 00:09:42.600] Always.
[00:09:42.600 --> 00:09:47.000] Always get blue on my pinky and palm, like side of my palm.
[00:09:47.000 --> 00:09:48.440] And I have sloppy handwriting.
[00:09:48.440 --> 00:09:49.560] I think a lot of lefties do.
[00:09:49.560 --> 00:09:52.200] It's very difficult to write neatly.
[00:09:52.200 --> 00:09:54.040] That said, you know, I don't care.
[00:09:54.040 --> 00:09:54.520] Yeah.
[00:09:54.520 --> 00:09:56.520] Well, you use a keyboard more than you write most likely.
[00:09:56.760 --> 00:09:57.880] No, it's not as big a deal.
[00:09:58.200 --> 00:10:00.120] I love left-handed dominant keyboards.
[00:10:00.440 --> 00:10:02.120] I love writing with pen and paper, though.
[00:10:02.120 --> 00:10:04.920] I love using pencils and pens and whatever.
[00:10:05.240 --> 00:10:06.600] I've always loved to do that.
[00:10:06.600 --> 00:10:10.040] I love to draw, even though I don't draw well, but I do it all the time.
[00:10:10.040 --> 00:10:15.240] Are you one of those people, Jay, who left-handers who writes above the words rather than below?
[00:10:15.240 --> 00:10:15.480] Oh.
[00:10:15.800 --> 00:10:16.760] Oh, interesting.
[00:10:16.760 --> 00:10:17.320] That's weird.
[00:10:17.320 --> 00:10:17.960] I remember that.
[00:10:17.960 --> 00:10:18.520] What do you mean?
[00:10:19.320 --> 00:10:22.360] Like, when I write, my hand is below the sentence I'm writing.
[00:10:22.360 --> 00:10:27.480] But I've seen left-handers who hold their hand above the sentence that they're writing.
[00:10:27.560 --> 00:10:28.600] Mine's parallel-ish a little bit.
[00:10:28.680 --> 00:10:33.080] They bend their wrist down that they're looking at from the top rather than just being from below.
[00:10:33.720 --> 00:10:34.840] Yeah, mine's parallel.
[00:10:34.840 --> 00:10:35.400] Oh, my gosh.
[00:10:35.400 --> 00:10:37.480] Okay, so I had a question come up.
[00:10:38.120 --> 00:10:42.920] I had a question come up, and I just Googled it, and BBC Science Focus covered it.
[00:10:42.920 --> 00:10:50.400] I was wondering: are there more left-handed people in cultures where they write right to left?
[00:10:50.720 --> 00:10:54.080] Because it's Arabic, Hebraic, yeah, exactly.
[00:10:54.400 --> 00:11:01.920] And apparently, according to this article in BBC Science Focus, written by Luis Villazon, it's quite the reverse.
[00:11:01.920 --> 00:11:12.560] Various surveys have found that the highest incidence of left-handedness is in Western countries with about 13% of the populations in the USA, Canada, and the Netherlands.
[00:11:12.560 --> 00:11:14.080] The UK is just behind it.
[00:11:14.080 --> 00:11:22.640] Most of the countries that use right to left are predominantly Asian and Arabic, and they have left-handedness rates below 6%.
[00:11:22.640 --> 00:11:29.440] So it says in Muslim countries, the advantage of smudge-free handwriting is outweighed by the fact that the left-hand is considered unclean.
[00:11:29.440 --> 00:11:30.320] Right, it's cultural.
[00:11:30.640 --> 00:11:32.880] Yeah, fascinating.
[00:11:32.880 --> 00:11:34.720] I love learning new things.
[00:11:34.720 --> 00:11:35.200] No, you do.
[00:11:36.560 --> 00:11:40.080] We didn't even know until 20 minutes ago that it was Left-Handedness Day, frankly.
[00:11:40.640 --> 00:11:41.840] And here we are.
[00:11:42.160 --> 00:11:43.360] Here we are.
[00:11:43.360 --> 00:11:44.000] Learning.
[00:11:44.320 --> 00:11:52.640] I wonder if that also has to do with the fact that they killed left-handed people more in the past and selected against it.
[00:11:53.280 --> 00:11:54.160] Those little genetic.
[00:11:54.240 --> 00:11:56.480] And then also just trained out of it culturally.
[00:11:57.040 --> 00:11:59.440] But that wouldn't really affect the genetics of it.
[00:12:00.640 --> 00:12:02.560] Yeah, if it's still not accepted.
[00:12:02.560 --> 00:12:12.480] But I'm saying, like, throughout history, because this has been true in many cultures, even in Western cultures, that being left-handed could be a sign of the devil or the sign of whatever.
[00:12:12.480 --> 00:12:16.960] And they were more likely to be persecuted, more likely to be punished and executed.
[00:12:16.960 --> 00:12:27.920] And I wonder if different cultural differences in how aggressively they slaughtered left-handed people actually affect the probability, yeah, the incidence of right-handed versus left-handed.
[00:12:28.240 --> 00:12:33.640] Yeah, it makes you wonder if that history wasn't there, would it be more like half-half?
[00:12:33.640 --> 00:12:36.520] Or, you know, what would be the variation?
[00:12:29.760 --> 00:12:37.160] All right.
[00:12:37.400 --> 00:12:41.160] Jay, you're going to get us started with a discussion of a new scam.
[00:12:41.160 --> 00:12:41.560] All right.
[00:12:41.560 --> 00:12:46.440] So I picked a particular topic of so many.
[00:12:46.440 --> 00:12:48.200] I was just talking to Bob before the show.
[00:12:48.200 --> 00:12:54.760] I was reading about a scam on elderly people to rob them of their retirement, which I'll talk about in another show.
[00:12:54.760 --> 00:13:02.840] But I'm going to talk about election-based scams and how they manifest and how they could affect just average people.
[00:13:02.840 --> 00:13:07.800] So one thing that they do, you're probably aware that they send phishing emails.
[00:13:07.800 --> 00:13:12.120] They'll send emails to they create websites that appear like official websites.
[00:13:12.120 --> 00:13:18.280] They might ask for personal information like your social security number, your voter registration details.
[00:13:18.280 --> 00:13:23.800] They might say to you that they want to confirm that you're registered or are you eligible to vote.
[00:13:23.800 --> 00:13:26.360] They're just trying to get personal information from you.
[00:13:26.360 --> 00:13:28.680] And then with all of these, I'm going to read to you.
[00:13:28.680 --> 00:13:34.120] You got to be careful and make sure that you're on the actual official websites.
[00:13:34.120 --> 00:13:37.640] This is hard because they can make the website look exactly like the original.
[00:13:37.960 --> 00:13:38.680] Never click a link.
[00:13:38.920 --> 00:13:39.960] Go there directly.
[00:13:39.960 --> 00:13:43.560] Yeah, so you need to be very, very fastidious about this.
[00:13:44.840 --> 00:13:46.680] They have fake donation pages.
[00:13:47.320 --> 00:13:49.320] This is very common.
[00:13:49.320 --> 00:13:52.040] Like we get a lot of texts asking for money.
[00:13:52.360 --> 00:13:56.600] Scammers will set up these fake donation websites for political campaigns.
[00:13:56.600 --> 00:13:58.520] And the pages look very convincing.
[00:13:58.520 --> 00:14:00.520] They use official-looking logos.
[00:14:00.520 --> 00:14:02.520] They might be using the exact real logos.
[00:14:02.520 --> 00:14:04.840] They can just download them off the other websites.
[00:14:04.840 --> 00:14:08.840] And they're going to try to get them to you through text and email, social media.
[00:14:08.840 --> 00:14:19.680] The money that they collect does not go to anything but the scammer and it's not going to the political cause and you cannot get taxed on that, which you normally can for other types of donations.
[00:14:19.680 --> 00:14:22.720] They have voter suppression tactics.
[00:14:22.720 --> 00:14:39.040] Some scams are aimed to simply suppress voter turnout by spreading misinformation, things about voting procedures, incorrect polling locations, fake deadlines, misleading information about your ID, voter ID requirements.
[00:14:39.280 --> 00:14:41.680] These are all out there just to confuse people.
[00:14:41.680 --> 00:14:46.160] Again, go to your state or town website, find the information yourself.
[00:14:46.560 --> 00:14:49.120] Don't let people send you the information.
[00:14:49.120 --> 00:14:50.640] Go get it yourself.
[00:14:50.880 --> 00:14:53.040] Phone scams pretending to be pollsters.
[00:14:53.040 --> 00:14:54.480] This is very common.
[00:14:54.480 --> 00:15:03.920] Scammers will call people pretending to be conducting political polls, and during that call, they might ask you for sensitive information saying it's necessary for the poll.
[00:15:03.920 --> 00:15:05.040] Make sure it's unique.
[00:15:05.040 --> 00:15:08.080] They don't double, you know, ask people questions, all that stuff.
[00:15:08.080 --> 00:15:09.440] It's all BS.
[00:15:09.760 --> 00:15:15.360] They're just collecting your personal data and they want to commit identity fraud on you.
[00:15:15.360 --> 00:15:18.560] The text message scams, I get a ton of these.
[00:15:18.560 --> 00:15:28.720] Scammers will send text messages that they're pretending to be from, again, you know what I'm going to say, political campaigns, websites could be from the person themselves.
[00:15:28.720 --> 00:15:34.560] It could be Trump or Harris that is going to be texting you directly with your name.
[00:15:34.560 --> 00:15:37.680] Hey, Steve, blah, blah, blah, send me money, right?
[00:15:37.680 --> 00:15:38.960] You got to be really careful.
[00:15:38.960 --> 00:15:49.600] If you want to donate money to a political campaign, just make sure you go to the official websites, do some searching yourself, ask, you know, ChatGPT, whatever, but do, you know, do the research yourself.
[00:15:49.600 --> 00:15:52.800] Don't respond to things that people send you.
[00:15:52.800 --> 00:15:56.080] And then fake voting assistance services.
[00:15:56.400 --> 00:16:06.920] Scammers may offer services that are there to assist you with voting, such as helping you filling out absentee ballots, offering to deliver the ballots on your behalf, blah, blah, blah.
[00:16:07.000 --> 00:16:12.760] These services are typically fraudulent and they may steal your vote and personal information.
[00:16:12.760 --> 00:16:14.120] Imagine that.
[00:16:14.120 --> 00:16:19.800] You find out that they cast a ballot for the exact opposite person you want to be in office.
[00:16:19.800 --> 00:16:30.200] So to protect yourself from these scams, election scams, other scams, it's important to verify any election-related communication through the actual official channels.
[00:16:30.200 --> 00:16:34.520] Going to the state websites is a good idea, or your town websites.
[00:16:34.520 --> 00:16:35.880] Verify the links.
[00:16:35.880 --> 00:16:37.320] Just be fastidious.
[00:16:37.320 --> 00:16:42.520] The other thing is, never provide personal information in response to unsolicited requests.
[00:16:42.520 --> 00:16:44.040] Ever, ever.
[00:16:44.040 --> 00:16:59.000] If your social security number is in play, if they want bank information, if they want anything that feels like they shouldn't need it in order to do what they're doing, make a huge red flag go up and do not do it.
[00:16:59.000 --> 00:17:00.280] Always call back.
[00:17:00.280 --> 00:17:04.360] Tell them, hey, I'm going to call the official phone number on the website.
[00:17:04.360 --> 00:17:06.520] If they give you grief about that, it's a scam.
[00:17:06.520 --> 00:17:07.160] You know what I mean?
[00:17:07.320 --> 00:17:09.000] You just got to be really careful.
[00:17:09.000 --> 00:17:11.080] And then, this is really important.
[00:17:11.080 --> 00:17:17.080] And I've said this before: tell everybody in your life, especially old people, what I just said.
[00:17:17.080 --> 00:17:18.360] All of this stuff.
[00:17:18.360 --> 00:17:24.040] Because statistically, the older people are, the more likely they are to fall for these scams.
[00:17:24.040 --> 00:17:29.880] And it's just really easy for you to call up your mom or your dad or whoever and just be like, hey, guess what?
[00:17:30.280 --> 00:17:31.240] You got to watch out.
[00:17:31.240 --> 00:17:34.200] People are, you know, we're in election season here.
[00:17:34.200 --> 00:17:37.400] People are going to be soliciting you for personal information.
[00:17:37.720 --> 00:17:39.480] They could do lots of different things to you.
[00:17:39.480 --> 00:17:44.120] Just don't respond to anything and say things like, hey, if you want to donate any money, I'll do it for you.
[00:17:44.120 --> 00:17:45.920] Let me know and I'll take care of it for you.
[00:17:45.920 --> 00:17:46.320] All right.
[00:17:46.320 --> 00:17:47.520] Thank you, Jay.
[00:17:47.520 --> 00:17:49.920] All right, Kara, tell us about childhood vaccines.
[00:17:49.920 --> 00:17:50.880] Are these a good thing?
[00:17:51.520 --> 00:17:52.000] I don't know.
[00:17:52.000 --> 00:17:52.880] What are you going to say?
[00:17:53.680 --> 00:17:55.040] I might have an opinion on that.
[00:17:55.440 --> 00:17:56.240] Maybe a little one.
[00:17:56.480 --> 00:18:13.120] So the CDC just released in their Morbidity and Mortality Weekly report a report on August 8th titled Health and Economic Benefits of Routine Childhood Immunizations in the Era of the Vaccines for Children program, United States, 1994 to 2023.
[00:18:13.120 --> 00:18:29.040] So if you didn't know, since 1994 here in the United States, there's been a program called Vaccines for Children, VFC, which has covered the cost of vaccines for children if their families would not otherwise have been able to afford the vaccines.
[00:18:29.040 --> 00:18:44.560] And so this study looked at the health benefits and the economic impact of routine immunization among both VFC eligible and non-eligible children born between 1994 and 2023.
[00:18:44.560 --> 00:18:48.160] They looked at nine different childhood vaccines.
[00:18:48.160 --> 00:18:52.000] Ooh, am I going to know what these are based on their initialisms?
[00:18:52.000 --> 00:18:52.560] Let's see.
[00:18:53.440 --> 00:18:54.400] Thank you so much.
[00:18:54.400 --> 00:18:54.720] Help.
[00:18:55.040 --> 00:18:56.560] You're welcome, Bob.
[00:18:57.200 --> 00:19:07.360] I think this is diphtheria, tetanus, and pertussis, so DTP or DTaP; Hib, Haemophilus influenzae type b.
[00:19:07.360 --> 00:19:08.800] I think that's what that is.
[00:19:08.800 --> 00:19:11.600] OPV/IPV.
[00:19:11.600 --> 00:19:16.400] That is oral and inactivated poliovirus vaccine.
[00:19:16.720 --> 00:19:23.760] Also, MMR, measles, mumps, rubella, hepatitis B, VAR.
[00:19:23.760 --> 00:19:25.040] Oh, that's Varicella.
[00:19:25.040 --> 00:19:26.640] Oh, I'm so jealous.
[00:19:26.960 --> 00:19:29.360] I wish we had Varicella vaccines.
[00:19:29.360 --> 00:19:31.400] Yeah, we all had to get the chicken pox.
[00:19:31.880 --> 00:19:32.920] And then Rota.
[00:19:32.920 --> 00:19:36.120] So that's the rotavirus, I'm assuming.
[00:19:36.680 --> 00:19:39.080] So, what do you think they found?
[00:19:41.000 --> 00:19:41.800] It helped, right?
[00:19:41.800 --> 00:19:44.120] No, but let's look at the magnitude of these findings.
[00:19:44.120 --> 00:19:50.040] And then I want to talk a little bit about something that is very worrisome based on a recent Gallup poll.
[00:19:50.360 --> 00:20:08.040] So among those children born 94 to 2023, based on the analyses in this report, routine childhood vaccines would have prevented approximately, or had prevented approximately 508 million cases of illness.
[00:20:08.040 --> 00:20:08.600] Wow.
[00:20:08.600 --> 00:20:16.040] 32 million hospitalizations and 1.129 million deaths.
[00:20:16.040 --> 00:20:17.880] 1,129,000 deaths.
[00:20:19.080 --> 00:20:22.440] And now let's talk over, that's a long span.
[00:20:22.440 --> 00:20:24.600] That's 20, almost 20 years.
[00:20:24.600 --> 00:20:25.000] Okay.
[00:20:25.000 --> 00:20:25.480] Wow.
[00:20:25.480 --> 00:20:28.120] No, almost 30 years, 94 to 2023.
[00:20:28.120 --> 00:20:31.080] Let's talk about the cost savings.
[00:20:31.080 --> 00:20:39.880] You know, it always feels a little icky when we talk about cost savings when something that is, you know, as extreme as illness and death.
[00:20:40.200 --> 00:20:49.400] But it does cost money to treat disease, and it costs money to try to save a child who has a vaccine-preventable illness.
[00:20:49.720 --> 00:20:56.360] So, according to this report, there are two different ways that they looked at these cost savings.
[00:20:56.360 --> 00:21:07.640] The first one was direct health care costs, and the direct health care costs, they saw a savings of $540 billion.
[00:21:07.640 --> 00:21:15.000] So, $540 billion in just the direct health care that would have been required if these children had not been vaccinated.
[00:21:16.080 --> 00:21:29.120] But then they looked at something even larger and they said, okay, well, even more than that, what are these other costs, these kind of externalized costs, these societal costs?
[00:21:29.120 --> 00:21:35.200] And they estimate that at $2.7 trillion.
[00:21:35.920 --> 00:21:45.440] And that includes things like: what if your parents have to stay home from work to care for the child while they're sick and they lose wages from that?
[00:21:45.440 --> 00:21:50.720] What if a child becomes permanently disabled from an infection, which can definitely happen?
[00:21:50.720 --> 00:21:55.520] And, you know, then there are costs throughout the lifespan of supporting that child.
[00:21:55.520 --> 00:21:59.920] So those additional costs bump that estimate up to 2.7 trillion.
[00:21:59.920 --> 00:22:11.040] That seems to be on par with savings from like curing hepatitis C or savings from the legislation like the Clean Air Act.
[00:22:11.040 --> 00:22:16.640] The ROI on childhood vaccines is literally bananas.
[00:22:16.640 --> 00:22:17.040] Yeah, it is.
[00:22:17.280 --> 00:22:18.080] The amount of lives saved.
[00:22:18.640 --> 00:22:22.400] It is the most cost-effective public health measure.
[00:22:22.880 --> 00:22:23.600] Prevention?
[00:22:23.600 --> 00:22:24.080] Always.
[00:22:24.080 --> 00:22:24.640] 100%.
[00:22:25.040 --> 00:22:28.000] When is prevention not one of the best health savings?
[00:22:28.000 --> 00:22:28.240] Right.
[00:22:28.240 --> 00:22:29.840] And vaccines, especially.
[00:22:29.840 --> 00:22:34.400] I mean, because it's like really well-established prevention.
[00:22:34.400 --> 00:22:35.760] So here's the thing.
[00:22:35.760 --> 00:22:42.800] That may be the reality of the situation, but a recent Gallup poll, and when I say recent, this was published August 7th.
[00:22:42.800 --> 00:22:47.840] The polling was done from like July 1st to the 21st, 2024.
[00:22:47.840 --> 00:22:48.880] So this is pretty recent.
[00:22:48.880 --> 00:22:53.840] A Gallup poll showed that there have been a lot of changes in how U.S.
[00:22:53.840 --> 00:22:57.440] adults view childhood vaccinations.
[00:22:57.440 --> 00:22:59.440] And this is pretty scary.
[00:22:59.440 --> 00:23:08.600] So, when asked if it is important for parents to have their children vaccinated, we saw a huge change.
[00:23:08.600 --> 00:23:15.000] And that change was heavily lateralized according to political affiliation.
[00:23:15.000 --> 00:23:17.880] And where do you think the biggest changes have taken place?
[00:23:17.880 --> 00:23:20.440] This data was from 2002 to 2024.
[00:23:20.440 --> 00:23:22.520] We can look at trends across that line.
[00:23:24.040 --> 00:23:27.480] What political party do you think saw the biggest change in their views?
[00:23:27.960 --> 00:23:29.240] It's got to be the Republican.
[00:23:29.480 --> 00:23:32.360] Democrat, Republican.
[00:23:33.000 --> 00:23:33.160] Right.
[00:23:33.160 --> 00:23:46.280] So historically, we often think of vaccine hesitancy, vaccine denialism, as sort of being like an anti-science or a pseudoscience problem on the left.
[00:23:46.280 --> 00:23:53.800] But what's interesting is that it's actually long been pretty consistent across political parties.
[00:23:54.360 --> 00:24:03.080] What's really interesting is that in the past only like four years, there's been a massive shift in Republican views.
[00:24:03.080 --> 00:24:08.280] So when asked, for example, how important is it that parents get their children vaccinated?
[00:24:08.280 --> 00:24:13.640] Extremely important, very important, somewhat important, not very important, or not at all important.
[00:24:13.960 --> 00:24:22.600] Those who answered extremely important in 2002, it was 66% of Democrats and 62% of Republicans.
[00:24:22.600 --> 00:24:29.880] In 2020, it was 67% of Democrats and 52% of Republicans.
[00:24:30.680 --> 00:24:37.880] In 2024, 63% of Democrats, 26% of Republicans.
[00:24:37.880 --> 00:24:39.320] Yeah, it's tanking.
[00:24:39.640 --> 00:24:40.600] Tanking.
[00:24:40.600 --> 00:24:41.480] It's tanking.
[00:24:41.960 --> 00:24:43.440] It's got to depend on what they were asking.
[00:24:43.640 --> 00:24:44.560] It has to be the COVID.
[00:24:45.040 --> 00:24:49.600] They asked, how important is it that parents get their children vaccinated?
[00:24:44.440 --> 00:24:50.880] Wow.
[00:24:52.080 --> 00:25:08.000] So again, back in 2001, on average, 94% of respondents said that it was extremely important, and only 69% of respondents are saying extremely important right now.
[00:25:08.640 --> 00:25:11.920] Massive decline in these viewpoints.
[00:25:11.920 --> 00:25:30.880] We've also seen, you know, some change, some other trends and changes across the board, but I think this one is the most telling and it's the most worrisome because nothing changed in early, actually, this was sorry, when I gave you those numbers before the 2020 to 2024, it was really late 2019 to 2024.
[00:25:30.880 --> 00:25:33.920] So it was, of course, pre-COVID versus after.
[00:25:33.920 --> 00:25:34.880] Right, okay.
[00:25:34.880 --> 00:25:38.720] So nothing changed except the rhetoric, right?
[00:25:38.720 --> 00:25:44.080] The outcomes of childhood vaccinations were just as strong.
[00:25:44.080 --> 00:25:51.840] The health consequences, those lives saved, the amount of money saved within the economy, just as robust.
[00:25:51.840 --> 00:25:54.400] The only thing that changed was the rhetoric.
[00:25:54.400 --> 00:25:58.560] And then here's another big one that I think I would be remiss if I didn't mention.
[00:25:58.560 --> 00:26:00.400] Here's another question that was asked.
[00:26:00.400 --> 00:26:06.640] Do you think vaccines are more dangerous than the diseases they are designed to prevent?
[00:26:06.640 --> 00:26:12.240] So once again, do you think vaccines are more dangerous than the diseases they are designed to prevent?
[00:26:12.240 --> 00:26:17.360] Overall, in 2001, only 6% of people said yes.
[00:26:17.360 --> 00:26:20.000] They are more dangerous than the diseases themselves.
[00:26:20.000 --> 00:26:23.760] In 2024, 20% of people said yes.
[00:26:23.760 --> 00:26:24.640] It's crazy.
[00:26:24.640 --> 00:26:40.520] And when you look at the difference between Democrats and Democratic-leaning individuals and Republicans and Republican-leaning individuals: in 2001, 6% of Republicans said yes, the vaccines are more dangerous than the diseases they're designed to prevent.
[00:26:40.520 --> 00:26:42.840] 5% of Democrats said yes.
[00:26:42.840 --> 00:26:46.440] 2019, 12% of Republicans said yes.
[00:26:46.440 --> 00:26:48.200] 10% of Democrats said yes.
[00:26:48.520 --> 00:26:56.760] 2024, 31% of Republicans said yes, 5% of Democrats said yes.
[00:26:57.880 --> 00:27:12.920] 31% of Democrats who, I'm sorry, of Republican respondents to this Gallup poll in July of 2024 said that vaccines are more dangerous than the diseases they are designed to prevent.
[00:27:13.240 --> 00:27:13.640] Amazing.
[00:27:14.120 --> 00:27:17.240] That's some powerful influence.
[00:27:17.240 --> 00:27:38.760] You know, like not only is this a beautiful example of a theme that we often talk about on this show, which is that the data, the actual data, which is easily accessible, is completely out of step with the rhetoric and the beliefs on these highly kind of charged political topics.
[00:27:38.760 --> 00:27:56.600] But it's also a beautiful example of how much influence certain individuals can have on the viewpoints of many people in this country and how those changing viewpoints will directly affect public policy.
[00:27:56.600 --> 00:28:05.640] Oh my gosh, we've known this, especially since the vaccines-cause-autism scare of the late 90s and early 2000s.
[00:28:05.640 --> 00:28:07.080] We've seen this.
[00:28:07.720 --> 00:28:17.440] But, and you say that, Evan, that we have seen this, but if you actually look at the data, the vaccines cause autism scare in the 90s and 2000s.
[00:28:17.440 --> 00:28:20.000] Yes, it did affect trust.
[00:28:20.000 --> 00:28:24.240] You do see a change, but I don't think it was this extreme.
[00:28:24.240 --> 00:28:29.200] Because the internet wasn't as widely accessible back then as it is today.
[00:28:29.200 --> 00:28:31.040] So that's definitely part of it.
[00:28:31.040 --> 00:28:47.120] It also shows this tendency that we have, this sort of cognitive bias that we have, to take fear-mongering, information that was fear-mongering, and then apply it to other aspects of our thinking.
[00:28:47.120 --> 00:29:00.000] Because most of the rhetoric was COVID vaccine rhetoric, yet we are seeing a massive impact on questions about whether parents should vaccinate their children against childhood diseases.
[00:29:00.000 --> 00:29:02.960] Yeah, well, it's a deterioration of trust and authority.
[00:29:03.280 --> 00:29:04.960] Yeah, it's really scary.
[00:29:04.960 --> 00:29:16.560] You know, I think that this is a big deal because what we're seeing now, this study about the Vaccines for Children program is showing us the real impact of vaccinations.
[00:29:16.560 --> 00:29:21.440] This Gallup poll is showing us what people think right now.
[00:29:21.440 --> 00:29:33.520] So I'm kind of scared to see the outcome studies in five to ten years of children who were refused their vaccines by their parents and how they fared.
[00:29:33.840 --> 00:29:36.960] Will we start to see the re-emergence of some things that we're talking about?
[00:29:37.120 --> 00:29:38.320] We already have.
[00:29:38.400 --> 00:29:40.720] I mean, we've been seeing measles outbreaks.
[00:29:40.720 --> 00:29:41.120] That would have been a lot of fun.
[00:29:41.280 --> 00:29:42.800] Yeah, we have seen those before.
[00:29:42.800 --> 00:29:44.800] You know, we've talked about the California, right?
[00:29:44.800 --> 00:29:45.920] The whole Disneyland thing.
[00:29:45.920 --> 00:29:50.880] We've talked about Minnesota kind of being a hotbed for that occurring every now and again.
[00:29:50.880 --> 00:29:54.240] And I can tell you, like, and this is just, you know, a personal experience thing.
[00:29:54.240 --> 00:29:58.720] So all of us on this podcast are old enough that we got chickenpox, right?
[00:29:58.720 --> 00:29:59.040] Yep.
[00:29:59.040 --> 00:29:59.600] I did, yep.
[00:29:59.600 --> 00:30:00.600] Age 11.
[00:30:00.600 --> 00:30:03.400] Did any of you, have any of you had shingles yet?
[00:30:03.400 --> 00:30:03.800] I did.
[00:30:03.800 --> 00:30:04.600] No, I'm supposed to.
[00:30:04.760 --> 00:30:05.240] I didn't get it.
[00:29:59.920 --> 00:30:06.120] I got the vaccine.
[00:30:06.360 --> 00:30:08.120] Yeah, I just got mine soon.
[00:30:08.120 --> 00:30:08.600] Yeah.
[00:30:08.600 --> 00:30:11.080] I'm not eligible yet for my vaccine.
[00:30:11.080 --> 00:30:13.160] So I'm in the scary spot.
[00:30:13.160 --> 00:30:15.320] My kid sister had shingles recently.
[00:30:15.320 --> 00:30:19.400] Multiple patients that I work with in the cancer center have had shingles.
[00:30:19.400 --> 00:30:21.560] It is violently painful.
[00:30:21.560 --> 00:30:22.040] Like it is.
[00:30:22.280 --> 00:30:23.080] Oh, yeah.
[00:30:23.080 --> 00:30:23.720] Gosh.
[00:30:23.720 --> 00:30:24.200] I don't care.
[00:30:24.440 --> 00:30:32.600] And especially if you're already struggling with diseases that affect your immune system, if you're already struggling with, you know, these other issues.
[00:30:32.600 --> 00:30:37.320] Like, if you are a cancer patient and you have shingles, it is not a picnic.
[00:30:37.560 --> 00:30:38.600] And I see it.
[00:30:38.600 --> 00:30:39.800] I see it weekly.
[00:30:39.800 --> 00:30:42.040] And it's just devastating.
[00:30:42.040 --> 00:30:49.480] And yes, we are all from a cohort where we didn't have access to the vaccine or the vaccine, or for some of them, they're younger, right?
[00:30:49.480 --> 00:30:51.080] But we all have the virus in us.
[00:30:51.080 --> 00:30:54.920] And so the vaccine is there to help with that flare-up.
[00:30:54.920 --> 00:30:58.520] But kids today will never know that pain.
[00:30:58.520 --> 00:31:01.000] And to deny them that, you know, that vaccine.
[00:31:01.000 --> 00:31:05.960] And of course, I have a vaccine-preventable illness that caused cancer.
[00:31:05.960 --> 00:31:08.280] I had to have a drastic surgery.
[00:31:08.280 --> 00:31:16.120] I had to have a hysterectomy to remove my cervix because my cancer was caused by a vaccine-preventable virus.
[00:31:16.120 --> 00:31:23.000] That vaccine didn't exist when I was exposed, but you know, it happens.
[00:31:23.000 --> 00:31:26.120] And like, I wish I was lucky enough to be vaccinated.
[00:31:26.120 --> 00:31:30.680] I'm vaccinated against that disease now, but I wish I was lucky enough to be vaccinated.
[00:31:30.680 --> 00:31:44.800] And it just, it really, like, I think it's more infuriating, and it feels that much more of a dig when you see people saying, I'm not going to vaccinate my kids, or vaccines aren't important, or the vaccine is worse than the disease.
[00:31:44.680 --> 00:31:47.280] When it's like, you don't have the disease, dude.
[00:31:47.280 --> 00:31:48.160] Let me tell you.
[00:31:48.160 --> 00:31:48.400] Right.
[00:31:48.480 --> 00:31:48.960] Yeah.
[00:31:44.840 --> 00:31:51.280] I'd much rather not have this disease.
[00:31:51.600 --> 00:31:53.040] And I have both with this.
[00:31:53.040 --> 00:31:55.920] I am vaccinated and I have the disease, right?
[00:31:55.920 --> 00:31:57.920] I can tell you the disease was worse.
[00:31:57.920 --> 00:32:00.160] It's, yeah, it's infuriating.
[00:32:00.160 --> 00:32:04.480] I think this is part of a bigger trend of distrust and authority.
[00:32:04.800 --> 00:32:10.160] And at some critical point of distrust, people are like, well, screw it.
[00:32:10.160 --> 00:32:11.680] I'm just going to think whatever I want to think.
[00:32:11.680 --> 00:32:12.320] You know what I mean?
[00:32:12.320 --> 00:32:12.640] Exactly.
[00:32:12.640 --> 00:32:30.080] I'm just going to go along with my tribe or, again, just let confirmation bias completely overwhelm what I believe because I don't have to accommodate or listen to expert opinion because the experts are not trustworthy, you know, or institutions are not trustworthy.
[00:32:30.080 --> 00:32:40.720] And that, I think, is the biggest negative effect cumulatively of a lot of the divisiveness that we have where institutions become collateral damage.
[00:32:40.720 --> 00:32:43.600] I think in a lot of political fights that we have.
[00:32:43.600 --> 00:32:46.800] And I think also social media has done this as well.
[00:32:46.800 --> 00:32:48.880] Oh, yeah, everyone thinks they're an expert.
[00:32:50.000 --> 00:32:57.840] And it's a perfect storm because it's occurring against a background where the thing is working, so we don't notice it.
[00:32:58.480 --> 00:33:00.960] You know, like, well, why do I need to vaccinate my kids?
[00:33:00.960 --> 00:33:02.160] Nobody's sick out there.
[00:33:02.160 --> 00:33:04.160] Well, yeah, because of the vaccines.
[00:33:04.160 --> 00:33:04.640] Right.
[00:33:04.960 --> 00:33:10.720] And that's like that negative consequence that comes, like the hidden value in these things.
[00:33:10.720 --> 00:33:13.520] You don't see polio in the United States anymore.
[00:33:13.520 --> 00:33:15.200] You don't see them; they're non-events.
[00:33:15.200 --> 00:33:15.360] Yeah.
[00:33:15.600 --> 00:33:16.480] It's like Y2K.
[00:33:16.480 --> 00:33:18.000] Oh, we made a big deal over nothing.
[00:33:18.000 --> 00:33:20.240] No, it was nothing because we made a big deal.
[00:33:20.240 --> 00:33:21.200] Exactly.
[00:33:21.200 --> 00:33:21.760] Yeah.
[00:33:21.760 --> 00:33:29.120] Ask these same people to take down all their protection services for their computers and see how long they last without getting infected probably online, right?
[00:33:29.440 --> 00:33:36.280] I'll bet you they wouldn't be so quick to poo-poo virus protection and other things.
[00:33:36.600 --> 00:33:38.200] And that's your laptop.
[00:33:38.200 --> 00:33:40.440] That's not your body.
[00:33:40.440 --> 00:33:40.600] Right.
[00:33:41.720 --> 00:33:43.000] Your livelihood.
[00:33:43.640 --> 00:33:47.720] I mean, these diseases kill children.
[00:33:47.720 --> 00:33:51.240] And the children they don't kill, they often maim.
[00:33:51.560 --> 00:33:54.360] They often cause permanent disability.
[00:33:55.320 --> 00:33:55.880] Yeah.
[00:33:57.320 --> 00:33:58.840] This is not, you know.
[00:33:59.480 --> 00:34:02.920] I'm emotionally invested in this topic, as we all should be, damn it.
[00:34:03.080 --> 00:34:09.160] Are there any suggestions for what can be done, or do we have to let it play out and hope the culture changes?
[00:34:09.480 --> 00:34:10.280] Well, I mean, that's the thing.
[00:34:10.280 --> 00:34:13.400] Like, the things that need to be done are being done.
[00:34:13.400 --> 00:34:16.120] Like, again, we have a program in the U.S.
[00:34:16.280 --> 00:34:21.240] where children who would not otherwise be able to afford to be vaccinated can get vaccinated for free.
[00:34:21.240 --> 00:34:22.520] As long as their parents.
[00:34:23.160 --> 00:34:23.880] Exactly.
[00:34:23.880 --> 00:34:26.840] The CDC understands the need for this.
[00:34:26.840 --> 00:34:32.520] Many schools have vaccine requirements for your kids to be able to mingle with other kids in those schools.
[00:34:32.840 --> 00:34:38.200] So there are, you know, systems in place to protect us from a public health perspective.
[00:34:38.200 --> 00:34:45.640] But, you know, at a certain point, if you want to get out of this requirement, you're going to figure out a way to do it.
[00:34:45.640 --> 00:34:46.920] You're going to homeschool your kids.
[00:34:46.920 --> 00:34:49.640] You're going to, you know, whatever the case may be.
[00:34:49.800 --> 00:34:50.840] Can't force it.
[00:34:50.840 --> 00:34:51.960] So what do we do?
[00:34:52.200 --> 00:34:57.640] We improve education, and we hope that, you know, we hope that the tides start to change.
[00:34:58.280 --> 00:35:01.320] And we have to mandate vaccines, is the other thing.
[00:35:01.480 --> 00:35:06.040] It's perfectly reasonable to require certain vaccines to attend public school.
[00:35:06.040 --> 00:35:13.760] I think companies have the right to require vaccinations if it could impact the safety of their fellow workers or their customers.
[00:35:16.480 --> 00:35:17.680] I can't go to work without my vaccination.
[00:35:14.200 --> 00:35:18.000] We do now.
[00:35:19.120 --> 00:35:23.760] It's now a plank of one of our major parties to remove that requirement, right?
[00:35:23.760 --> 00:35:29.920] To remove all federal funding from any public school that mandates any vaccine.
[00:35:29.920 --> 00:35:34.240] There's an actual reasonable chance that that could become law in this country.
[00:35:34.560 --> 00:35:37.280] And it's in perfect lockstep with this Gallup poll.
[00:35:37.280 --> 00:35:37.440] Yeah.
[00:35:37.920 --> 00:35:39.200] That's the rhetoric.
[00:35:40.560 --> 00:35:41.040] Disaster.
[00:35:41.040 --> 00:35:42.000] That would be a disaster.
[00:35:42.160 --> 00:35:42.800] Disastrous.
[00:35:43.120 --> 00:35:44.000] Thanks, Kara.
[00:35:44.000 --> 00:35:44.640] All right.
[00:35:44.640 --> 00:35:47.280] Bob, tell us about alien solar panels.
[00:35:47.280 --> 00:35:49.920] Yeah, that sounds like a much happier note, Bob.
[00:35:49.920 --> 00:35:50.240] Yeah.
[00:35:50.240 --> 00:35:50.800] Yeah.
[00:35:51.280 --> 00:35:52.240] Get me out of the dumps.
[00:35:52.400 --> 00:35:54.640] I didn't know aliens had solar panels.
[00:35:54.640 --> 00:35:56.000] Well, that's the question.
[00:35:56.880 --> 00:35:59.440] The question is: if they did, could we detect them?
[00:35:59.440 --> 00:36:00.240] That's the question.
[00:36:00.240 --> 00:36:12.080] So, in a recent interesting study, the researchers created a model of an Earth-like planet, and then they covered it with modern solar panels that would meet all of our current and near-future energy needs.
[00:36:12.640 --> 00:36:25.280] They then imagined that it was a distant alien exoplanet to see how detectable it was with near-future telescopes and how much planet-wide panel coverage aliens might need.
[00:36:25.280 --> 00:36:29.040] And that seems much more of a complicated sentence than I intended.
[00:36:29.040 --> 00:36:30.560] But what did they find?
[00:36:30.560 --> 00:36:32.160] And wouldn't you like to know?
[00:36:32.160 --> 00:36:39.680] The study was published in the Astrophysical Journal called, the study is called Detectability of Solar Panels as a Techno-Signature.
[00:36:40.000 --> 00:36:44.720] A lot of research for extraterrestrial life focuses on biosignatures, right?
[00:36:44.720 --> 00:36:57.040] We've said that on the show many times, such as the molecules of life in exo-atmospheres or specific frequencies of light reflected or emitted from surfaces that point to some sort of life forms.
[00:36:57.040 --> 00:36:59.680] My favorite, though, is not biosignatures.
[00:36:59.880 --> 00:37:03.560] As my grandfather used to say, biosignatures, pet!
[00:37:03.560 --> 00:37:09.320] I love technosignatures, the manifestations of alien technology that we can observe.
[00:37:09.560 --> 00:37:11.320] That's what a technosignature is.
[00:37:11.320 --> 00:37:15.320] I mean, who doesn't like the idea of actual alien technology?
[00:37:15.560 --> 00:37:18.200] To me, that's just like, oh boy, more of that.
[00:37:18.200 --> 00:37:25.800] So, this study focuses on potential techno signatures that are used to power these alien civilizations if they exist.
[00:37:25.800 --> 00:37:29.800] In this case, they were focusing on silicon-based solar panels.
[00:37:29.800 --> 00:37:37.480] Now, they focus on silicon instead of other photovoltaics using things like germanium or gallium, which are much lower in abundance universe-wide.
[00:37:37.480 --> 00:37:43.880] So, that's why it seems kind of all right, if you're going to go with a material that these solar panels are made of, silicon sounds like a good idea.
[00:37:43.880 --> 00:37:52.920] All right, so the first reason they chose silicon is, like I said, that it has a high cosmic abundance.
[00:37:52.920 --> 00:37:58.840] It's all over the place, so there's a higher probability that aliens would be using it.
[00:37:58.840 --> 00:38:00.280] Reasonable, I guess.
[00:38:00.280 --> 00:38:05.000] Also, number two, the electronic structure of silicon, its so-called band gap.
[00:38:05.000 --> 00:38:05.960] You may have heard of that.
[00:38:06.360 --> 00:38:11.000] That works especially well with the radiation emitted by sun-like stars.
[00:38:11.000 --> 00:38:15.640] And finally, number three, silicon also happens to be very cost-effective
[00:38:15.640 --> 00:38:19.480] to refine, process, and manufacture into solar cells.
[00:38:19.480 --> 00:38:20.440] It's relatively easy.
[00:38:20.440 --> 00:38:22.840] So, those three reasons seem reasonable.
[00:38:22.840 --> 00:38:26.120] Okay, maybe aliens would use silicon if they're going to use that at all.
[00:38:26.120 --> 00:38:31.640] So, then the researchers also relied on this idea of what's called artificial spectral edges.
[00:38:31.640 --> 00:38:33.960] I'd never heard of that, but it makes sense.
[00:38:33.960 --> 00:38:36.520] You can see where they're coming from.
[00:38:36.520 --> 00:38:43.880] This means that there's a dramatic and detectable change in the reflectance distinguishing silicon from non-silicon, right?
[00:38:43.880 --> 00:38:46.880] And this is specifically in UV wavelengths.
[00:38:47.120 --> 00:38:59.440] So imagine you're scanning the surface of an alien planet, and you know, a big chunk of the planet's got all this dirt, and next to the dirt is this sandy desert area, and then next to that are these gargantuan solar panels.
[00:38:59.440 --> 00:39:10.400] So, as it's being scanned or observed from a distant telescope, there would be a rapid increase in reflected UV light that would be easier to detect, right?
[00:39:10.400 --> 00:39:18.320] So, as you're transitioning from one area, like a desert, to this area with lots of solar panels, you're going to see a spike in the UV wavelengths.
[00:39:18.320 --> 00:39:21.200] Like, oh, that might be a solar panel.
[00:39:21.200 --> 00:39:22.480] So, there's that.
[00:39:22.480 --> 00:39:26.640] And it's similar to satellites detecting vegetation on Earth.
[00:39:26.880 --> 00:39:30.880] In this situation, there's what's called the vegetation red edge.
[00:39:30.880 --> 00:39:34.160] That's because chlorophyll absorbs a lot of red light, while leaves strongly reflect near-infrared.
[00:39:34.160 --> 00:39:37.600] So, the reflectance jumps sharply from red to near-infrared when you look at vegetation.
[00:39:37.600 --> 00:39:42.480] So, as satellites are scanning the Earth, you're going to see a big spike at that edge over vegetation.
[00:39:42.480 --> 00:39:47.600] That's one big indication that, oh, yeah, there's vegetation there, and that's called the vegetation red edge.
[00:39:47.600 --> 00:39:51.360] So, it's kind of very similar to this artificial spectral edge.
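To make the "edge" idea concrete, here is a minimal sketch of how you might flag a spectral edge: scan a reflectance spectrum for the largest jump between adjacent wavelength bands. The wavelengths and reflectance values below are invented for illustration; they are not from the study.

```python
# Minimal sketch of spectral-edge detection: find the largest jump in
# reflectance between adjacent wavelength bands. All values are invented
# for illustration, not taken from the study.

wavelengths_nm = [300, 340, 380, 420, 460, 500, 540, 580, 620, 660]
# A silicon-like surface: reflectance rises sharply in the UV/blue region.
reflectance = [0.04, 0.05, 0.30, 0.33, 0.32, 0.31, 0.30, 0.29, 0.28, 0.27]

# Jump between each pair of neighboring bands, tagged with the band edges.
jumps = [
    (reflectance[i + 1] - reflectance[i], wavelengths_nm[i], wavelengths_nm[i + 1])
    for i in range(len(reflectance) - 1)
]
biggest, low_nm, high_nm = max(jumps)  # tuples compare on the jump first

print(f"Steepest edge: +{biggest:.2f} reflectance between {low_nm} and {high_nm} nm")
```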
[00:39:51.360 --> 00:39:58.480] The researchers then made a model of an Earth-like exoplanet, and to this exoplanet, they added these different land categories.
[00:39:58.480 --> 00:40:04.320] They added ocean, snow, and ice, they added grass, forest, and bare soil.
[00:40:04.640 --> 00:40:07.840] And then they added a sixth category, solar panels.
[00:40:07.840 --> 00:40:12.480] So, that's six land categories in total, counting the solar panels.
[00:40:12.480 --> 00:40:14.960] Of course, there's a huge bias there, right?
[00:40:14.960 --> 00:40:20.160] Those are land categories that we have on Earth, but who knows what the exoplanets would have.
[00:40:20.160 --> 00:40:26.000] Although, I assume that it's not unreasonable to pick those, especially for a planet that has some life.
[00:40:26.000 --> 00:40:34.200] Then, they moved their model planet 30 light-years away and determined what they would see if they looked at it with a really cool telescope.
[00:40:34.440 --> 00:40:42.520] So the really cool telescope that they chose was NASA's Habitable Worlds Observatory.
[00:40:42.520 --> 00:40:45.160] So this is planned to be launched in 2040.
[00:40:45.160 --> 00:40:51.480] It's going to hopefully have a 6.8-meter mirror, and it will look for life on Earth-like exoplanets.
[00:40:51.480 --> 00:40:53.880] So that's a big one that they're planning.
[00:40:53.880 --> 00:40:55.800] I hope the plans come to fruition.
[00:40:55.800 --> 00:40:59.320] Even though I might be dead by then, it'd still be nice.
[00:40:59.640 --> 00:41:01.080] So that was the scenario.
[00:41:01.080 --> 00:41:11.080] You know, they have this model planet, this exoplanet that's similar to Earth with all these land areas, and one of the land areas consists of lots of solar panels.
[00:41:11.080 --> 00:41:15.000] And they wanted to see, can we see this thing from 30 light years away?
[00:41:15.000 --> 00:41:30.760] So they did all of their work, and they concluded that detecting such panels, given the hypothetical scenario they created, would still be very hard, even at 30 light-years, which to me sounds damn close but is actually extremely far away.
[00:41:30.760 --> 00:41:39.960] But still, it's relatively close, but it was still very, very hard to see these solar panels, even when there was a huge amount of them.
[00:41:39.960 --> 00:41:50.200] They concluded that solar panels would often be indistinguishable from non-panel features on the exoplanet, such as the other land categories.
[00:41:50.520 --> 00:41:52.840] It would still be hard to distinguish them.
[00:41:53.160 --> 00:41:59.720] They said that detecting the large panels would likely require several hundred hours of observation.
[00:41:59.720 --> 00:42:06.120] So that's a lot of observation, several hundred hours to be focused on a planet looking for solar panels.
[00:42:06.120 --> 00:42:07.320] That's a lot of time.
[00:42:07.320 --> 00:42:17.840] And then also, if the planet happened to have solar panels covering a quarter of its surface, say 25%, of course, that would make it easier to detect them.
[00:42:17.840 --> 00:42:23.040] But the scientists said that even that would be difficult to do, even if there was a lot of them.
[00:42:23.040 --> 00:42:32.720] So, bottom line, it would still be extremely hard to detect these solar panels, even though they might be bright in the UV and there might be a lot of them.
[00:42:32.720 --> 00:42:36.160] It's still hard, even with a really cool advanced telescope.
[00:42:36.160 --> 00:42:37.040] It would be hard.
[00:42:37.040 --> 00:42:39.680] But they also found something else that was interesting.
[00:42:39.680 --> 00:42:50.640] A lead author of the paper, Ravi Kopparapu of NASA's Goddard Space Flight Center, said, We found that... spice.
[00:42:50.640 --> 00:42:51.280] Yeah, I thought you were going to say that.
[00:42:51.440 --> 00:42:52.560] Do you like spice?
[00:42:53.120 --> 00:42:54.000] I love spice.
[00:42:54.000 --> 00:42:55.120] No, outer spice.
[00:42:55.360 --> 00:43:11.200] All right, we found that even if our current population of about 8 billion stabilizes at 30 billion, and I think that number, 30 billion, is the number that, you know, they did some study and said, yeah, if we're going to max out people on Earth, it's going to be about 30 billion.
[00:43:11.200 --> 00:43:13.840] Let me start this quote again because I'm throwing in my own words.
[00:43:13.840 --> 00:43:14.320] All right.
[00:43:14.320 --> 00:43:30.160] Ravi said, We found that even if our current population of about 8 billion stabilizes at 30 billion with a high standard of living and we only use solar energy for power, we still use way less energy than that provided by all the sunlight illuminating our planet.
[00:43:30.160 --> 00:43:30.480] Okay?
[00:43:30.880 --> 00:43:47.920] So that means that many advanced civilizations with tens of billions, literally tens of billions of inhabitants, might not need anything ridiculous like Dyson swarms around their parent star collecting a significant fraction of light from their sun.
[00:43:48.480 --> 00:44:00.000] The researchers showed that with 8.9% of the land surface of a planet covered in solar panels, that would be enough for 30 billion people to live to a very high standard of living.
[00:44:00.520 --> 00:44:03.480] And we still wouldn't be able to detect those panels.
[00:44:03.480 --> 00:44:15.880] So you could literally have 30 billion aliens 30 light-years away with 9% of their land mass covered with highly reflective solar panels, and we still wouldn't be able to detect them.
[00:44:15.880 --> 00:44:20.520] Now, remember, though, those panels could be on the water, they could be in orbit.
[00:44:20.520 --> 00:44:21.880] It doesn't have to be on the land.
[00:44:21.880 --> 00:44:24.440] I mean, 8.9% is still a lot.
[00:44:24.440 --> 00:44:29.480] I mean, that's essentially what, like covering China and India with solar panels.
[00:44:29.480 --> 00:44:32.040] I mean, that's a lot of solar panels.
[00:44:32.280 --> 00:44:38.120] But what if that could supply your entire planet with power for centuries?
[00:44:38.120 --> 00:44:39.480] That'd be interesting.
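As a rough sanity check of that 8.9% figure, here is a back-of-the-envelope sketch. The land area, average insolation, and panel efficiency below are assumptions for illustration, not values from the paper.

```python
# Back-of-the-envelope check of the "8.9% of land for 30 billion people"
# claim. Not the paper's model; every constant here is a rough assumption.

LAND_AREA_M2 = 1.49e14     # Earth's land area (~149 million km^2)
COVERAGE = 0.089           # fraction of land under panels, from the quote
INSOLATION_W_M2 = 200.0    # day/night/weather-averaged sunlight (assumed)
EFFICIENCY = 0.20          # typical silicon panel efficiency (assumed)
POPULATION = 30e9          # hypothetical stabilized population

panel_area_m2 = LAND_AREA_M2 * COVERAGE
power_w = panel_area_m2 * INSOLATION_W_M2 * EFFICIENCY

print(f"Panel area:  {panel_area_m2 / 1e12:.1f} million km^2")  # ~13.3, roughly China plus India
print(f"Total power: {power_w / 1e12:.0f} TW")                  # ~530 TW
print(f"Per capita:  {power_w / POPULATION / 1e3:.1f} kW")      # ~18 kW vs ~10 kW US primary use
```

On those rough numbers, the per-capita figure comes out comfortably above current US primary energy use per person, which is consistent with the "high standard of living" framing in the quote, and with the China-plus-India comparison above.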
[00:44:39.720 --> 00:44:43.720] Kopparapu also ties this study to the Fermi paradox.
[00:44:43.720 --> 00:44:48.120] And the Fermi paradox, if you don't remember, asks the question: where are the aliens?
[00:44:48.120 --> 00:44:52.840] Why haven't they spread all over the galaxy by now or the universe by now?
[00:44:52.840 --> 00:44:56.600] They've had literally hundreds of millions or billions of years to do so.
[00:44:56.600 --> 00:44:57.640] Where are they?
[00:44:57.640 --> 00:45:12.920] So Kopparapu said, regarding that: The implication is that civilizations may not feel compelled to expand all over the galaxy because they may achieve sustainable population and energy usage levels, even if they choose a very high standard of living.
[00:45:12.920 --> 00:45:21.320] They may expand within their stellar system or even within nearby star systems, but a galaxy-spanning civilization may not exist.
[00:45:21.320 --> 00:45:23.400] So that was the study.
[00:45:23.720 --> 00:45:30.120] You know, it was a fun read, and it was interesting as far as it goes.
[00:45:30.120 --> 00:45:32.120] But I mean, it's obviously clear.
[00:45:32.120 --> 00:45:34.760] The human bias, of course, is written all over this, right?
[00:45:34.760 --> 00:45:50.880] I mean, it's hard to extrapolate, as we know, from one data point of life what other alien civilizations would be like, what they would need, you know, how much power, how much energy they would need to live comfortable lives.
[00:45:51.120 --> 00:45:52.000] Would they spread?
[00:45:52.000 --> 00:45:52.720] We have no idea.
[00:45:52.720 --> 00:45:53.680] We have no idea.
[00:45:53.680 --> 00:46:01.600] So I think this study is maybe more predictive, not about what aliens would do, but about what humanity would do.
[00:46:01.920 --> 00:46:10.960] Maybe for the next two or three centuries, you know, with solar panels and fusion energy, we may not be very detectable at large distances either.
[00:46:12.160 --> 00:46:20.080] And we could have plenty of energy available to us without any gaudy Dyson swarm around our star.
[00:46:20.080 --> 00:46:32.320] So maybe that's one of the takeaways: with some solar panels in orbit, some on the Earth, and also fusion, of course, and whatever other energy sources we may use.
[00:46:32.320 --> 00:46:37.600] I mean, we could have plenty of energy, even if we go up to 20 billion or even 30 billion.
[00:46:37.920 --> 00:46:39.360] So that's an interesting angle.
[00:46:39.600 --> 00:46:46.240] But of course, the more alien your aliens are, then I think all bets are kind of off to a certain degree.
[00:46:46.240 --> 00:46:51.520] And the more godlike their technology is, all bets are off because you have no idea what they're doing.
[00:46:52.160 --> 00:46:56.720] I mean, imagine a Kardashev level 3 civilization.
[00:46:56.720 --> 00:47:01.600] We would really have no idea what they would do, how much energy would they need?
[00:47:01.600 --> 00:47:02.640] What would they do with it?
[00:47:03.120 --> 00:47:05.040] Would we be able to even detect it?
[00:47:05.040 --> 00:47:10.160] And it's kind of like ants trying to figure out, what are those two people doing over there?
[00:47:11.760 --> 00:47:12.720] It's like no idea.
[00:47:12.720 --> 00:47:21.280] It's kind of silly to try to think we could even come close to understanding or figuring out what that would be like.
[00:47:21.280 --> 00:47:23.280] But still, an interesting study.
[00:47:23.840 --> 00:47:26.480] So it's indiscernible, in other words.
[00:47:26.480 --> 00:47:26.960] Yes.
[00:47:27.280 --> 00:47:28.400] It could be mistaken.
[00:47:28.400 --> 00:47:31.000] You could never suss out the fact that there's all these things there.
[00:47:31.080 --> 00:47:34.040] Well, they would have to be close and massive, is what I'm hearing.
[00:47:34.040 --> 00:47:34.760] Yeah, yeah.
[00:47:35.000 --> 00:47:42.600] So maybe this does explain it, you know; maybe that's why the decent searches they've done for technosignatures like Dyson swarms have come up empty.
[00:47:42.600 --> 00:47:49.400] And we've had lots of news items about them, but nothing definitive, or even close to definitive at all.
[00:47:50.520 --> 00:47:55.160] Yeah, but don't they know that we want to know them and they should broadcast to us so we can find them?
[00:47:55.400 --> 00:47:56.040] Yeah.
[00:47:56.040 --> 00:47:57.400] Aliens tend to be that way.
[00:47:57.400 --> 00:48:01.400] Well, if we're basically alone in our galaxy, that is not unrealistic.
[00:48:01.400 --> 00:48:02.760] No, it's absolutely.
[00:48:03.080 --> 00:48:04.120] But we have no idea.
[00:48:04.120 --> 00:48:04.920] We have no idea.
[00:48:04.920 --> 00:48:07.960] But yeah, it absolutely could be just us in our galaxy.
[00:48:08.280 --> 00:48:18.040] Imagine the next life being, you know, a galaxy over, or 100,000 light-years away, far outside our, you know, our local group.
[00:48:18.040 --> 00:48:19.000] I mean, imagine that.
[00:48:19.000 --> 00:48:25.560] But I suspect, Stephen, and I'm sure you do too, that there's life all over the place, but more microbial life.
[00:48:25.560 --> 00:48:35.240] This is really about advanced, sapient life that can manipulate technology in ways we would be impressed with.
[00:48:35.560 --> 00:48:39.880] Bob, what do you think would happen if they found a real Dyson sphere?
[00:48:39.880 --> 00:48:43.240] What do you think would happen socially around the world?
[00:48:44.200 --> 00:48:45.800] I think a lot of people wouldn't believe it.
[00:48:46.200 --> 00:48:52.840] Ultimately, I don't think it would have much of an impact because there'd always be some doubt that it was natural.
[00:48:53.240 --> 00:49:02.440] And even if we eventually, like the scientific consensus landed on, okay, this is definitely a techno signature, that would take a lot of time.
[00:49:02.440 --> 00:49:05.800] This wouldn't be like an instant affirmative conclusion.
[00:49:05.800 --> 00:49:07.720] I'd be like, there's a potential one.
[00:49:07.720 --> 00:49:11.400] And we would say, yeah, like the last thousand potential ones, they all crapped out.
[00:49:11.400 --> 00:49:14.520] And say, wait a minute, this one, you know, it would survive more and more tests.
[00:49:14.520 --> 00:49:18.320] It would probably be a process that plays itself out over years.
[00:49:19.200 --> 00:49:23.840] Yeah, we would creep up on the consensus.
[00:49:23.840 --> 00:49:25.920] It wouldn't be a sudden shock.
[00:49:25.920 --> 00:49:26.720] You know what I mean?
[00:49:26.720 --> 00:49:28.560] But I don't think it would happen.
[00:49:28.880 --> 00:49:35.520] Like, you're right scientifically, but I think that socially, like, religions would be formed around this.
[00:49:35.520 --> 00:49:37.440] Like, weird cults would start.
[00:49:37.440 --> 00:49:38.400] Eventually, maybe.
[00:49:38.400 --> 00:49:41.680] Yeah, there'd be all sorts of like weird pseudoscience on that.
[00:49:41.920 --> 00:49:42.800] Which wouldn't surprise me.
[00:49:42.800 --> 00:50:04.080] But I think, once we become increasingly convinced that it was a real technosignature, I could imagine a lot of money spent on projects to really observe it, with a tool that is going to home in on that thing and try to confirm it.
[00:50:04.080 --> 00:50:06.080] Yeah, that would happen.
[00:50:06.400 --> 00:50:08.640] But I think it would take a while to get there.
[00:50:08.640 --> 00:50:11.040] So when I ask, what is Odoo?
[00:50:11.040 --> 00:50:12.560] What comes to mind?
[00:50:12.560 --> 00:50:15.200] Well, Odoo is a bit of everything.
[00:50:15.200 --> 00:50:22.320] Odoo is a suite of business management software that some people say is like fertilizer because of the way it promotes growth.
[00:50:22.320 --> 00:50:30.800] But you know, some people also say Odoo is like a magic beanstalk because it grows with your company and is also magically affordable.
[00:50:30.800 --> 00:50:37.520] But then again, you could look at Odoo in terms of how its individual software programs are a lot like building blocks.
[00:50:37.520 --> 00:50:40.160] I mean, whatever your business needs.
[00:50:40.160 --> 00:50:43.040] Manufacturing, accounting, HR programs.
[00:50:43.040 --> 00:50:47.360] You can build a custom software suite that's perfect for your company.
[00:50:47.360 --> 00:50:49.200] So, what is Odoo?
[00:50:49.200 --> 00:50:52.240] Well, I guess Odoo is a bit of everything.
[00:50:52.240 --> 00:50:57.760] Odoo is a fertilizer, a magic beanstalk, building blocks for business.
[00:50:57.760 --> 00:50:59.360] Yeah, that's it.
[00:50:59.360 --> 00:51:03.160] Which means that Odoo is exactly what every business needs.
[00:50:59.840 --> 00:51:05.960] Learn more and sign up now at Odoo.com.
[00:51:06.120 --> 00:51:08.920] That's ODOO.com.
[00:51:09.560 --> 00:51:13.080] Celebrating is about embracing the moments that matter.
[00:51:13.080 --> 00:51:15.320] It's blowing out candles with family.
[00:51:15.320 --> 00:51:19.480] It's sharing laughter at a barbecue or saying I do.
[00:51:19.480 --> 00:51:24.920] And before the drinks begin, there's one step worth taking: Z-Biotics Pre-Alcohol.
[00:51:24.920 --> 00:51:28.920] Drink it before you celebrate, so you can wake up ready for what's next.
[00:51:28.920 --> 00:51:32.600] Turn every great night into the start of an extraordinary tomorrow.
[00:51:32.600 --> 00:51:36.360] Z-Biotics Pre-Alcohol, the science behind your next great morning.
[00:51:36.360 --> 00:51:38.920] Learn more at zbiotics.com.
[00:51:39.240 --> 00:51:44.760] Well, everyone, we're going to take a quick break from our show to talk about one of our sponsors this week, Rocket Money.
[00:51:44.760 --> 00:51:50.360] Rocket Money is a personal finance app that finds and cancels your unwanted subscriptions.
[00:51:50.360 --> 00:51:54.920] It monitors your spending and helps lower your bills so that you can grow your savings.
[00:51:54.920 --> 00:51:58.920] So, guys, how much do you think you're paying in subscriptions every month?
[00:51:58.920 --> 00:52:00.520] That's a serious question.
[00:52:00.520 --> 00:52:01.480] You just don't know.
[00:52:01.480 --> 00:52:05.720] So, over 74% of people have subscriptions they've forgotten about.
[00:52:05.720 --> 00:52:06.840] This includes me.
[00:52:06.840 --> 00:52:08.920] I've had this happen to me many times.
[00:52:08.920 --> 00:52:17.560] And most Americans think they spend about $62 per month on subscriptions, but get this: the real number is closer to $300, which is scary.
[00:52:17.560 --> 00:52:22.360] This is literally thousands of dollars a year, half of which you've forgotten about, like me.
[00:52:22.360 --> 00:52:34.520] Yeah, and not only will Rocket Money help you with that by finding these things, canceling them with just a few taps, they'll also even try to negotiate lower bills for you by up to 20%.
[00:52:34.520 --> 00:52:36.840] All you have to do is submit a picture of your bill.
[00:52:36.840 --> 00:52:38.680] Rocket Money takes care of the rest.
[00:52:38.680 --> 00:52:49.600] And Rocket Money has over 5 million users and has saved a total of $500 million in canceled subscriptions, saving members up to $740 a year when using all of the app's features.
[00:52:49.600 --> 00:52:51.760] Stop wasting money on things you don't use.
[00:52:51.760 --> 00:52:58.160] Cancel your unwanted subscriptions by going to rocketmoney.com/SGU.
[00:52:58.160 --> 00:53:01.600] That's rocketmoney.com/SGU.
[00:53:01.600 --> 00:53:04.960] RocketMoney.com/SGU.
[00:53:05.280 --> 00:53:07.600] All right, guys, let's get back to the show.
[00:53:07.600 --> 00:53:12.480] All right, Jay, tell us about these astronauts that are stranded on the International Space Station.
[00:53:12.480 --> 00:53:18.240] Yeah, we have two NASA astronauts, Sunita Williams and Barry "Butch" Wilmore.
[00:53:18.240 --> 00:53:20.720] How come they didn't give her a space name?
[00:53:20.720 --> 00:53:21.840] You know, I don't know.
[00:53:21.840 --> 00:53:22.960] Like a nickname?
[00:53:22.960 --> 00:53:23.360] Yeah.
[00:53:23.760 --> 00:53:27.600] You know, they all have to have a cool moniker.
[00:53:27.600 --> 00:53:28.800] What was her name again?
[00:53:29.360 --> 00:53:30.560] Sunita Williams.
[00:53:30.560 --> 00:53:31.840] Sunita Williams.
[00:53:31.840 --> 00:53:33.120] Super Sunita Williams.
[00:53:33.120 --> 00:53:33.760] I like it.
[00:53:33.920 --> 00:53:42.160] So they're currently stuck on the International Space Station, and this is due to ongoing technical issues with the Boeing Starliner spacecraft.
[00:53:42.160 --> 00:53:44.560] Not a great position to be in.
[00:53:44.560 --> 00:53:52.480] So Williams and Wilmore were delivered to the space station in June 2024 aboard the Boeing Starliner.
[00:53:52.480 --> 00:53:59.120] Now, this was a spacecraft that was developed as part of NASA's commercial crew program, which we've talked about.
[00:53:59.120 --> 00:54:06.720] The mission was supposed to be pretty straightforward with the astronauts staying on the ISS scheduled for about a week.
[00:54:06.720 --> 00:54:17.680] And even before the launch, the Starliner encountered some technical difficulties, and then new issues with the thrusters emerged after docking at the ISS.
[00:54:17.680 --> 00:54:21.280] What happened was they just discovered that there were problems with the thrusters.
[00:54:21.280 --> 00:54:28.000] They docked, and NASA is currently testing the Starliner spacecraft while it's docked at the ISS.
[00:54:28.000 --> 00:54:29.720] I didn't know that this was possible.
[00:54:29.280 --> 00:54:33.320] The testing is focused on the spacecraft's thruster system.
[00:54:33.480 --> 00:54:37.880] Like I said, NASA engineers, you know, they're carefully running these tests.
[00:54:38.200 --> 00:54:49.960] They're trying to evaluate the performance and the reliability of the thrusters before they make any decisions regarding how the astronauts are going to return to Earth, which is a vital piece of information that they're trying to get to.
[00:54:49.960 --> 00:54:59.720] Now, these tests include monitoring the spacecraft systems and simulating various scenarios that would ensure that the Starliner can safely undock and re-enter Earth's atmosphere.
[00:54:59.720 --> 00:55:07.880] So the results of the tests will determine whether the Starliner will be used or if they have to resort, you know, to an alternate spacecraft.
[00:55:07.880 --> 00:55:12.760] And some of the options could be a SpaceX Crew Dragon or a Russian Soyuz.
[00:55:12.760 --> 00:55:19.800] So right now, they still don't know exactly what the problem is, as far as I could find out as of today.
[00:55:19.800 --> 00:55:23.320] They're feeling like they need to continue testing.
[00:55:23.320 --> 00:55:31.000] And I was personally very surprised that they could test this as thoroughly as they are testing it while it's docked at the space station.
[00:55:31.000 --> 00:55:32.200] That's awesome.
[00:55:32.200 --> 00:55:39.240] And, you know, the genius of the engineers and the foresight of the engineers is what is hopefully going to pay off here.
[00:55:39.240 --> 00:55:42.360] So the astronauts themselves, they're both in good spirits.
[00:55:42.360 --> 00:55:47.240] They have a lot of confidence that the testing and the safety of the Starliner is going to come out on top.
[00:55:47.240 --> 00:55:56.920] Now, in case of an emergency, NASA and Boeing publicly stated that the spacecraft is equipped to undock from the ISS and return the crew to Earth safely.
[00:55:56.920 --> 00:55:58.520] That's what they're saying today.
[00:55:58.840 --> 00:56:05.480] But under normal circumstances, the spacecraft will not be cleared for return until the thruster issue is fully resolved.
[00:56:05.480 --> 00:56:09.320] So, in the meantime, there are other spacecraft docked at the ISS.
[00:56:09.320 --> 00:56:16.400] Now, the article I read said something that a second article didn't, and I don't know exactly what's going on.
[00:56:16.720 --> 00:56:22.800] Is there currently a SpaceX Crew Dragon and a Russian Soyuz at the space station right now?
[00:56:22.800 --> 00:56:25.200] There were inconsistencies with these articles.
[00:56:25.200 --> 00:56:29.040] So, anyway, they're trying to assess all these things at the same time.
[00:56:29.040 --> 00:56:32.240] The engineers are taking a very careful approach.
[00:56:32.240 --> 00:56:38.880] They want to have a possible return window of mid-August, which is, you know, now coming up very soon.
[00:56:38.880 --> 00:56:43.760] But this could extend and they could stay there longer depending on how the tests go.
[00:56:43.760 --> 00:56:50.960] Now, if the technical problems are not resolved in time and there's no alternative spacecraft available, this is the worst case scenario.
[00:56:50.960 --> 00:56:55.280] There is a possibility that the astronauts could remain on the ISS longer than planned.
[00:56:55.280 --> 00:56:59.600] And some reports have mentioned that the stay could extend into 2025.
[00:56:59.600 --> 00:57:01.040] It's a pretty extreme position.
[00:57:01.840 --> 00:57:04.160] But it is a speculative outcome.
[00:57:04.160 --> 00:57:07.120] They're talking about it, of course, just in case it happens.
[00:57:07.120 --> 00:57:09.680] Bottom line is we have to wait a little bit longer.
[00:57:09.680 --> 00:57:14.160] Now, the Starliner is approved to remain docked at the ISS for up to 45 days.
[00:57:14.160 --> 00:57:17.760] This is largely because of the batteries aboard.
[00:57:17.760 --> 00:57:20.880] They could extend that to 90 days using backup systems.
[00:57:20.880 --> 00:57:24.880] You know, the health of those lithium-ion batteries is always a question here.
[00:57:24.880 --> 00:57:31.040] These batteries have previously caused problems and concerns, and that adds another layer of complexity here.
[00:57:31.040 --> 00:57:34.240] So, an interesting thing I found too, this was pretty cool.
[00:57:34.240 --> 00:57:36.880] So, a Russian satellite broke apart.
[00:57:36.880 --> 00:57:50.560] It created debris that briefly threatened the space station, and during that incident, the astronauts, both of them, had to go into their spacecraft for protection, which I found to be very curious and interesting.
[00:57:50.560 --> 00:57:55.840] I guess there's only a certain number of places that they consider to be safe, and the spacecraft are very safe.
[00:57:55.840 --> 00:57:57.840] So, we will see what happens here.
[00:57:57.840 --> 00:57:59.200] I mean, I'm not worried at all.
[00:57:59.200 --> 00:58:05.480] I feel like NASA's got this, and the future here is going to probably reveal itself within the next couple of weeks.
[00:58:05.480 --> 00:58:26.280] Jay, what I'm finding is that the ISS currently has six spaceships docked to it: the Boeing Starliner spacecraft, the SpaceX Dragon Endeavour spacecraft, a Northrop Grumman resupply ship, the Soyuz MS-25 crew ship, and the Progress 87 and 88 resupply ships.
[00:58:26.280 --> 00:58:26.680] Wow.
[00:58:26.680 --> 00:58:28.600] All docked to the ISS right now.
[00:58:28.600 --> 00:58:29.080] Wow.
[00:58:29.480 --> 00:58:31.080] What's in docking Bay 94?
[00:58:31.800 --> 00:58:33.560] So they have options.
[00:58:34.040 --> 00:58:37.320] They have a Dragon crew capsule and a Soyuz.
[00:58:37.640 --> 00:58:41.640] And then the rest are resupply ships, which I guess are not crew approved.
[00:58:41.640 --> 00:58:45.800] So, guys, what do you think of when I tell you the word framing?
[00:58:46.120 --> 00:58:46.600] Framing?
[00:58:46.600 --> 00:58:48.200] In the context of science communication.
[00:58:48.680 --> 00:58:50.680] What do you think of the term framing?
[00:58:50.680 --> 00:58:51.160] Oh, yeah.
[00:58:51.160 --> 00:59:03.400] Like, it's a strategy for ensuring that the listener is going to approach it in a way that, you know, that you want them to.
[00:59:03.640 --> 00:59:06.840] It's a persuasion strategy, I guess you could say.
[00:59:06.840 --> 00:59:08.200] So, do you guys remember?
[00:59:08.200 --> 00:59:25.640] This might have been a little bit before you were actively paying attention to the skeptical movement, Kara, but maybe 15, 20 years ago, there was a very brief debate within skeptical circles about whether or not framing was a scientific or more of a deceptive practice.
[00:59:25.640 --> 00:59:35.640] Like, there was a couple of people talking about we have to frame our messages appropriately, and other people, other skeptics, like, that's deception, that's politics, that's propaganda.
[00:59:35.960 --> 00:59:36.280] What?
[00:59:36.280 --> 00:59:38.040] All science communication is persuasion.
[00:59:38.040 --> 00:59:41.000] Well, I mean, it's all the other thing is, yes, I agree with you.
[00:59:41.000 --> 00:59:44.440] But also, you are always framing your message.
[00:59:44.680 --> 00:59:45.040] True.
[00:59:45.040 --> 00:59:47.680] Yeah, whether you think you are or not.
[00:59:44.760 --> 00:59:51.600] So it's like saying, oh, we're doing science, not philosophy.
[00:59:51.760 --> 00:59:53.760] It's like, no, you're doing philosophy.
[00:59:53.760 --> 00:59:55.360] You just don't know it.
[00:59:55.360 --> 00:59:55.600] Right.
[00:59:55.600 --> 00:59:59.040] When somebody's like, no, but this is pure information.
[00:59:59.040 --> 00:59:59.440] Yeah.
[00:59:59.920 --> 01:00:00.640] Or whatever.
[01:00:00.960 --> 01:00:09.760] You're just not aware of, or thoughtful about, the philosophy that you're using, or in this case, the framing that you're using.
[01:00:10.160 --> 01:00:12.000] The bias you're bringing to the table.
[01:00:12.000 --> 01:00:12.720] Exactly.
[01:00:12.720 --> 01:00:17.680] So you're using an unconscious or assumed framing without even knowing you're doing it.
[01:00:17.680 --> 01:00:21.440] That doesn't make it in any way better or purer or whatever.
[01:00:22.080 --> 01:00:26.000] So it was both naive and I think misplaced, but it was very short-lived.
[01:00:26.000 --> 01:00:33.360] I think it was like very quickly the internal conversation was like, no, this is like a standard part of science communication.
[01:00:34.400 --> 01:00:44.080] So what's prompting me to talk about this is a recent study looking at the framing of global warming, talking to the public about global warming.
[01:00:44.080 --> 01:01:03.840] And they are focusing on, they wanted to see what people's reactions were to just the term climate change versus global warming versus climate crisis versus climate emergency versus climate justice.
[01:01:04.160 --> 01:01:04.720] Interesting.
[01:01:05.600 --> 01:01:09.920] And they essentially asked two questions or two types of questions.
[01:01:09.920 --> 01:01:13.120] One was just, are you familiar with this term?
[01:01:13.120 --> 01:01:16.640] Like have you ever heard this term before, do you know what it means?
[01:01:16.640 --> 01:01:19.760] And the results are pretty predictable.
[01:01:19.760 --> 01:01:27.120] So climate change and global warming were both very high, about 88%, roughly the same.
[01:01:27.120 --> 01:01:30.520] Climate crisis, climate emergency, were less so.
[01:01:29.840 --> 01:01:34.200] Crisis was 76%, climate emergency, 58%.
[01:01:34.760 --> 01:01:38.120] And climate justice was the least at 33%.
[01:01:38.600 --> 01:01:44.520] So people had not heard of that term and weren't really sure what it meant, but very familiar with climate change, global warming.
[01:01:44.520 --> 01:01:52.120] Then they asked people to say whether or not this is something that you are concerned about.
[01:01:52.120 --> 01:01:54.040] Like, are you concerned about climate change?
[01:01:54.040 --> 01:01:56.040] Are you concerned about global warming?
[01:01:56.040 --> 01:01:58.280] And the numbers were very similar.
[01:01:58.840 --> 01:01:59.640] Interesting.
[01:01:59.640 --> 01:02:00.280] Yeah.
[01:02:00.840 --> 01:02:03.240] To the familiarity one.
[01:02:03.240 --> 01:02:26.120] So they concluded that there is a big effect, you know, when you're framing the discussion, like whether you frame this issue around the concept of global warming versus climate change, whatever: people's familiarity with the terms you're using has a dramatic effect on their receptiveness and their attitudes toward that topic.
[01:02:26.120 --> 01:02:35.080] They also, of course, broke it down by political persuasion with the absolutely predictable results, right?
[01:02:35.640 --> 01:02:42.440] Of Democrats being more concerned, Independents less so, Republicans less so.
[01:02:42.440 --> 01:02:45.080] But did they say who responds better to what framing?
[01:02:45.560 --> 01:02:51.480] I think familiarity was the determinative factor.
[01:02:51.480 --> 01:03:01.080] And otherwise, it was like the average numbers were similar to the familiarity numbers, but then they would break out based upon political ideology.
[01:03:01.400 --> 01:03:11.720] That's interesting because I guess my understanding was that there was a big cultural shift from global warming to climate change to make it more palatable.
[01:03:11.960 --> 01:03:14.680] They're both there, but they're both the same in this study.
[01:03:15.200 --> 01:03:15.840] That's interesting.
[01:03:15.920 --> 01:03:25.440] So that almost shows either that that strategy didn't work, or that it did work at the time but has since become commonplace.
[01:03:25.840 --> 01:03:26.560] Yeah, it's been normalized.
[01:03:27.280 --> 01:03:34.560] And honestly, that to me makes sense because I find that climate change and global warming are used pretty interchangeably.
[01:03:34.560 --> 01:03:35.440] Yeah, they are.
[01:03:35.440 --> 01:03:37.600] But at the time, it was really intentional.
[01:03:37.600 --> 01:03:41.120] Like if we say climate change, people will not be reactive.
[01:03:41.120 --> 01:03:50.960] Well, I think the idea also is that climate change is more accurate because global warming makes it sound like it's warming everywhere always as opposed to the climate's changing.
[01:03:50.960 --> 01:03:58.080] It's warming on average, but some places are getting warmer, other places not so much, and it's inconsistent year to year, whatever.
[01:03:58.080 --> 01:03:59.840] So climate change is just more accurate.
[01:03:59.840 --> 01:04:07.360] So it was just to avoid the straw man of, well, it's not getting warmer here now, so how can you call it global warming?
[01:04:07.360 --> 01:04:09.120] Like so much for global warming.
[01:04:09.120 --> 01:04:15.840] I don't think that that actually worked because the people who are making that argument are being disingenuous anyway.
[01:04:16.400 --> 01:04:27.680] Yeah, that's what's interesting about it is that, you know, my take would be that calling it climate change, while it might be more, quote, accurate, it's actually more innocuous as well.
[01:04:28.000 --> 01:04:31.360] And so it's much easier to be like, well, who cares if the climate changes?
[01:04:31.360 --> 01:04:31.760] Right.
[01:04:31.760 --> 01:04:33.040] Is it changing for the worst?
[01:04:33.040 --> 01:04:34.880] Is it dangerous that it's changing?
[01:04:34.880 --> 01:04:41.040] Whereas things like climate emergency, they communicate a value judgment.
[01:04:41.040 --> 01:04:41.520] Crisis.
[01:04:41.520 --> 01:04:43.200] I like climate crisis.
[01:04:44.240 --> 01:04:53.680] I think you can make a reasonable argument for not using like climate emergency or climate crisis because of you know all the reasons that the authors here state.
[01:04:53.680 --> 01:05:03.080] But I, in addition, because it has that value judgment, like you say climate crisis, it's much easier to straw man that as a climate alarmist position.
[01:05:03.400 --> 01:05:05.160] Like, is it really a crisis?
[01:05:05.160 --> 01:05:22.280] You know, and so you're asking people to accept not only that there's global warming, but that it's a crisis at the same time rather than easing them in, maybe saying there is you know climate change, it involves on average warming, and it could be a problem, you know, going forward.
[01:05:22.760 --> 01:05:36.600] I'm actually asking, I'm texting a friend right now, she works for a firm, and she's the comms person at the firm, and their job is to partner with agencies, often government agencies.
[01:05:36.600 --> 01:05:48.200] Her specialty is ocean science, but they actually draft policy and they do work to try to make sure that like budgets are protected and that initiatives are going forward.
[01:05:48.200 --> 01:05:49.800] And so they have to be bipartisan.
[01:05:50.120 --> 01:05:53.400] And so I'm actually asking her, do you avoid that kind of language?
[01:05:53.400 --> 01:05:54.680] Like what is your strategy?
[01:05:54.680 --> 01:05:58.440] Because I'm curious how it actually works in Washington.
[01:05:59.400 --> 01:06:08.120] Now, here's another thought that I had is when you use a term like climate crisis or climate emergency, those are emotional words, right?
[01:06:08.360 --> 01:06:12.040] Those are more value-laden words.
[01:06:12.280 --> 01:06:20.680] So people might think that those terms are biased, and therefore they would assume that the messenger is biased, right?
[01:06:20.680 --> 01:06:29.880] The other thing is that when people hear emotional language, I think there is a defensive sort of knee-jerk reaction because you feel like I'm being manipulated.
[01:06:29.880 --> 01:06:34.120] And that's actually a perfectly reasonable assumption most of the time.
[01:06:34.440 --> 01:06:35.080] Right?
[01:06:35.080 --> 01:06:42.280] So this is, and we get, we have discussed this sort of in general, you know, previously in terms of our messaging.
[01:06:42.280 --> 01:06:48.800] It's like we want to accurately reflect how serious the topics are that we're talking about.
[01:06:49.120 --> 01:06:57.440] Like if we're talking about vaccine denial or anti-vaccine rhetoric, right, Kara, like you were saying, we want to say this is...
[01:06:57.760 --> 01:06:58.800] very serious.
[01:06:58.800 --> 01:07:02.480] This could result in kids dying of otherwise preventable diseases.
[01:07:02.480 --> 01:07:07.680] But if we push that framing too far, you get a backlash.
[01:07:07.920 --> 01:07:12.560] Well, now you're just emotionally trying to manipulate me rather than giving me objective science.
[01:07:12.560 --> 01:07:21.920] So how do you keep the aura of objective science, but still convey the seriousness of the topic that you're talking about, right?
[01:07:21.920 --> 01:07:23.760] I mean, it's a huge problem during COVID.
[01:07:23.760 --> 01:07:24.320] A huge problem.
[01:07:24.560 --> 01:07:25.200] Absolutely.
[01:07:25.200 --> 01:07:28.880] And we also run into the same problem when we're talking about con artists.
[01:07:28.880 --> 01:07:35.680] And pseudoscience, we say like homeopathy, which we're going to talk about in a moment, is rank pseudoscience.
[01:07:35.680 --> 01:07:38.240] There's really no other way to accurately portray it.
[01:07:38.320 --> 01:07:40.560] It is complete and utter nonsense.
[01:07:40.560 --> 01:07:47.040] But when you talk that way, even though it's true, you don't sound like a scientist.
[01:07:47.040 --> 01:07:56.000] But if you talk like people think a scientist is supposed to sound like, you lend false credibility by damning with faint criticism in a way, right?
[01:07:56.000 --> 01:08:00.720] You're not criticizing homeopathy enough with your language.
[01:08:00.720 --> 01:08:05.200] And so people think that it's like something that reasonable scientists could disagree about.
[01:08:05.200 --> 01:08:07.760] It's like, well, the evidence doesn't support homeopathy.
[01:08:07.760 --> 01:08:11.680] But rather than saying this is really complete nonsense.
[01:08:11.680 --> 01:08:12.480] You know what I mean?
[01:08:13.440 --> 01:08:23.520] And there's another layer too, Steve, because we're talking about science communication, and we're often talking about scientists communicating their research, or people whose sole job it is to communicate science.
[01:08:23.520 --> 01:08:25.120] But there's a whole other wrinkle in this.
[01:08:25.120 --> 01:08:33.400] And I remember having debates about this for years, which is that science journalism is very different from science communication.
[01:08:29.840 --> 01:08:33.560] Yes.
[01:08:33.880 --> 01:08:38.200] Science communication, often by definition, is agenda-driven.
[01:08:38.440 --> 01:08:38.760] Persuasive.
[01:08:39.320 --> 01:08:40.440] Yes, it's persuasive.
[01:08:40.440 --> 01:08:45.640] Whereas science journalism, I mean, talk about exactly what you opened this conversation with.
[01:08:46.040 --> 01:08:54.600] Their attempt, knowing that there's always going to be some amount of bias, is to be aware of that and to sniff it out and to reduce it wherever possible.
[01:08:55.720 --> 01:08:57.720] That is a journalistic ethic.
[01:08:57.720 --> 01:09:05.800] It's to report what is without saying how it should be, without saying how I want it to be, without saying how I like it to be.
[01:09:06.120 --> 01:09:07.080] This just is.
[01:09:07.080 --> 01:09:10.280] Yeah, well, what we do would go in the opinion section of newspapers.
[01:09:10.440 --> 01:09:10.680] Exactly.
[01:09:12.920 --> 01:09:14.120] And a lot of SciComm is opinion.
[01:09:14.840 --> 01:09:16.280] It's analysis, right?
[01:09:16.280 --> 01:09:18.360] We're giving an analysis, an opinion.
[01:09:18.360 --> 01:09:19.240] And that's fine.
[01:09:19.240 --> 01:09:19.960] You know what I mean?
[01:09:19.960 --> 01:09:24.760] Because we're trying to help people make sense of the science, putting it in context.
[01:09:24.840 --> 01:09:27.320] And the thing is, we're not dishonest about it.
[01:09:27.320 --> 01:09:31.800] No, that does not necessarily imply dishonesty or deception or whatever.
[01:09:31.800 --> 01:09:32.680] It is framing.
[01:09:33.480 --> 01:09:36.040] It is saying, what's the context of the communication?
[01:09:36.280 --> 01:09:37.240] Who am I talking to?
[01:09:37.240 --> 01:09:38.840] What am I trying to convey?
[01:09:38.840 --> 01:09:44.120] And then accurately portraying the science while putting it into some kind of meaningful perspective.
[01:09:44.120 --> 01:09:48.360] And that meaningful perspective is the framing, right?
[01:09:48.920 --> 01:09:51.720] Yeah, and we say we are pro-scientific skepticism.
[01:09:51.720 --> 01:09:52.840] That is the goal here.
[01:09:53.080 --> 01:09:59.480] We are trying to be pro-science and anti-pseudoscience, not just talk about it neutrally like an observer from another planet, you know what I mean?
[01:09:59.480 --> 01:09:59.880] Yeah.
[01:09:59.880 --> 01:10:01.080] No, absolutely.
[01:10:01.080 --> 01:10:06.360] And so I've written about framing over the years on many topics, like with GMOs, right?
[01:10:06.360 --> 01:10:13.320] So there's like this pure scientific question about what the technology is, what genetic engineering is, what the evidence says.
[01:10:13.320 --> 01:10:20.400] But we could also frame a discussion around GMOs from a perspective of how should the government regulate it?
[01:10:20.400 --> 01:10:25.920] Or what is the behavior of corporations who use GMOs?
[01:10:25.920 --> 01:10:32.160] Or what is the pseudoscientific claim surrounding GMOs, the conspiracy theories, et cetera?
[01:10:32.160 --> 01:10:33.040] You know what I mean?
[01:10:33.040 --> 01:10:39.280] So those framings are all part of the overall GMO phenomenon.
[01:10:39.280 --> 01:10:52.480] And you can't just say, here's the dry science in complete absence of any context about the conspiracy theories about it or the actually anti-GMO agenda-driven communication that's out there about it.
[01:10:52.800 --> 01:10:53.920] This is so funny, Steve.
[01:10:53.920 --> 01:10:55.440] I just got a text back from my friend.
[01:10:55.840 --> 01:10:59.840] I said, Do you guys avoid words like climate crisis when you write policy?
[01:10:59.840 --> 01:11:04.000] Like, do you tend to avoid emotional language because it's more biased, or do you use it because it's more motivating?
[01:11:04.000 --> 01:11:05.680] We're talking about this on SGU right now.
[01:11:06.080 --> 01:11:12.080] And she responded directly to, Do you guys avoid words like climate crisis when you write policy?
[01:11:12.240 --> 01:11:17.440] And she goes, Oh my god, I'm laughing because I literally just told one of our clients that they need to do this.
[01:11:18.240 --> 01:11:19.280] They need to avoid it.
[01:11:19.280 --> 01:11:19.760] Yeah.
[01:11:19.760 --> 01:11:23.520] They need to avoid using words like climate crisis when they're drafting policy.
[01:11:23.680 --> 01:11:24.160] Yeah, exactly.
[01:11:24.800 --> 01:11:26.640] But also, but that makes sense.
[01:11:26.960 --> 01:11:29.680] There is a darker side to framing, you know, too.
[01:11:29.680 --> 01:11:31.680] Well, first of all, there's a technical side to it.
[01:11:31.680 --> 01:11:36.240] So, for example, when we talk about framing, you could be talking about a little detail.
[01:11:36.240 --> 01:11:40.240] Like, for example, this is sort of the classic example from literature.
[01:11:40.240 --> 01:11:58.080] If a surgeon tells a patient that the surgical procedure they're recommending has a 98% survival rate, the person is much more likely to agree to the procedure than if you say it has a 2% fatality rate, even though that's exactly the same information.
[01:11:58.080 --> 01:11:58.560] Yeah.
[01:11:58.560 --> 01:11:58.800] Right?
[01:11:58.800 --> 01:12:02.600] So, that's sort of that sort of technical framing about how you convey information.
[01:12:02.760 --> 01:12:06.600] And do you say it from a positive perspective or a negative perspective?
[01:12:06.600 --> 01:12:15.000] But then there's also political framing, which I think is what people reacted to, because that is inherently like spin and deception.
[01:12:15.000 --> 01:12:24.040] Like when you refer to the inheritance tax as a death tax, that's a political framing.
[01:12:24.280 --> 01:12:26.520] That is intentional; it's all political.
[01:12:28.040 --> 01:12:37.320] If you talk about a hospital ethics committee, if you label it a death panel, that's a framing that's inherently, I think, deceptive.
[01:12:37.560 --> 01:12:41.800] So you can use framing, like a lot of things: you can use it for good or you can use it for evil.
[01:12:41.800 --> 01:12:45.080] But otherwise, it's just a tool of communication.
[01:12:45.080 --> 01:12:55.400] So, for example, you can use framing to rig any discussion or debate by baking in assumptions that favor your conclusion, right?
[01:12:55.400 --> 01:13:02.360] You could essentially frame a topic in such a way that your conclusion is baked into the discussion.
[01:13:02.360 --> 01:13:05.480] And that, of course, is an inherently deceptive use of framing.
[01:13:05.480 --> 01:13:08.360] You have to sort of challenge the framing.
[01:13:08.360 --> 01:13:11.480] You can't just have a conversation within that framing.
[01:13:11.480 --> 01:13:11.960] You know what I mean?
[01:13:12.040 --> 01:13:14.520] Yeah, I mean, it's also like a logical fallacy.
[01:13:14.680 --> 01:13:17.080] So it's like you're using framing to the extent.
[01:13:17.080 --> 01:13:17.320] Yeah.
[01:13:17.560 --> 01:13:19.320] To make a circular argument almost.
[01:13:19.320 --> 01:13:19.800] Yeah.
[01:13:19.800 --> 01:13:20.120] Yeah.
[01:13:20.360 --> 01:13:21.000] Sounds innocent.
[01:13:21.080 --> 01:13:22.280] Sounds inescapable.
[01:13:22.280 --> 01:13:24.360] Well, yeah, we always frame everything.
[01:13:24.360 --> 01:13:24.840] That's the thing.
[01:13:24.840 --> 01:13:28.040] It's like you're always doing philosophy whether you know it or not.
[01:13:28.040 --> 01:13:28.520] You know what I mean?
[01:13:28.760 --> 01:13:35.240] There are some inherent assumptions about knowledge and facts and whatever and the universe in any discussion that you're having.
[01:13:35.640 --> 01:13:39.640] But it's better to be consciously aware of that and the implications of it.
[01:13:40.120 --> 01:13:47.760] Same thing, you're constantly framing your discussion in some context, whether you explicitly know it and express it or not.
[01:13:48.320 --> 01:14:00.080] But when you consciously frame a subject, you should be doing it to optimize understanding and communication while being fair and accurate and scientific.
[01:14:00.080 --> 01:14:07.760] But of course, people will use framing to be deceptive and persuasive, to sell you stuff and to get your vote and whatever, to make you afraid, you know.
[01:14:07.760 --> 01:14:08.400] Absolutely.
[01:14:09.280 --> 01:14:22.000] And the truth is, it's deeply disingenuous if you actively work as a science communicator or identify as a science communicator, if your job is to do journalism, if your job is to do communications.
[01:14:22.000 --> 01:14:25.600] It's deeply disingenuous to not be aware of this.
[01:14:26.320 --> 01:14:28.480] Like, you have to be explicit about it.
[01:14:28.480 --> 01:14:36.400] I have to say, real quick, before you wrap it up, she added a little bit to the text thread, which I think is just so fascinating.
[01:14:36.400 --> 01:14:51.200] She said, we don't avoid the word climate, but we lean into language like strengthen coastal resilience, mitigate climate impacts, et cetera, that talk about specific issues that climate has for communities, ecosystems, and industries.
[01:14:51.200 --> 01:14:55.440] It's boring, but it's just less alienating and gets less of a reaction.
[01:14:56.160 --> 01:15:03.520] Truth is, most people are on board at this point that climate change is real, but they just get their hackles up when we talk about it as a crisis.
[01:15:03.520 --> 01:15:03.840] Right.
[01:15:04.080 --> 01:15:04.800] Fascinating.
[01:15:04.800 --> 01:15:06.720] And climate justice, forget about it.
[01:15:06.720 --> 01:15:12.720] Because, I mean, you would think justice, like, isn't justice for all an American concept?
[01:15:12.720 --> 01:15:14.640] But the word has so much baggage on it.
[01:15:14.640 --> 01:15:23.600] And I think what the specific baggage is, is that people have been convinced that justice is a zero-sum game.
[01:15:23.600 --> 01:15:29.760] And that one person's justice comes at the expense of injustice to somebody else.
[01:15:30.280 --> 01:15:32.280] And again, that's a framing, right?
[01:15:32.280 --> 01:15:40.840] So the whole concept of justice has been framed in this zero-sum way, which makes it have this psychological baggage.
[01:15:40.840 --> 01:15:44.840] Whereas that wasn't true historically, justice was like Superman, right?
[01:15:44.840 --> 01:15:47.480] You know, it was truth, justice, and the American way.
[01:15:47.880 --> 01:15:49.240] Yeah, something everyone could agree on.
[01:15:49.480 --> 01:15:50.440] Most people could agree on.
[01:15:50.520 --> 01:15:51.080] Right, right.
[01:15:51.480 --> 01:15:57.320] But now it's been made divisive because it's been framed as a zero-sum game.
[01:15:57.320 --> 01:16:01.400] Well, we saw this with the word skeptic as well over our 20-plus years of doing this.
[01:16:01.560 --> 01:16:01.880] Right.
[01:16:01.880 --> 01:16:08.200] Well, and to me, climate justice, and maybe this is my own bias or my own framing, is not the same thing as climate science.
[01:16:08.200 --> 01:16:08.680] No.
[01:16:08.680 --> 01:16:13.400] It's also not the same thing as a climate crisis or climate change.
[01:16:13.400 --> 01:16:16.120] Climate justice is decision-making.
[01:16:16.200 --> 01:16:17.320] It's inherently political.
[01:16:17.320 --> 01:16:18.760] It's inherently political.
[01:16:18.760 --> 01:16:24.680] It's saying climate change affects different people disproportionately, and we want to mitigate those effects.
[01:16:24.920 --> 01:16:27.080] That's wildly different than just saying climate change exists.
[01:16:27.240 --> 01:16:28.360] It's happening, right?
[01:16:28.360 --> 01:16:29.240] Yeah, yeah.
[01:16:29.560 --> 01:16:30.280] All right.
[01:16:30.280 --> 01:16:34.840] Evan, you're going to give us that promised discussion of homeopathy.
[01:16:34.840 --> 01:16:36.280] Oh, yes, yes.
[01:16:36.280 --> 01:16:43.640] And I'm going to give it to you from, well, where I found it, because it came creeping across my feed recently.
[01:16:43.640 --> 01:16:45.960] Anytime homeopathy in the news does.
[01:16:45.960 --> 01:16:47.960] A magazine called Health.
[01:16:47.960 --> 01:16:53.240] I don't know if you guys remember Health magazine, seeing it on the rack in the supermarket or wherever, right?
[01:16:53.240 --> 01:16:55.000] Founded in 1981.
[01:16:55.000 --> 01:16:56.920] Back then it was called In Health.
[01:16:56.920 --> 01:17:01.240] And this magazine focuses primarily on women's health.
[01:17:01.560 --> 01:17:18.080] Common topics include improper diet, dealing with life issues such as weak relationships and growing stress, not to mention latest fashion and exclusive beauty tips, various healthy food recipes, and other related articles that encourage people to be happy and healthy.
[01:17:14.920 --> 01:17:18.400] Okay.
[01:17:19.280 --> 01:17:27.040] Yeah, the magazine was around for a long time, still is, only it's all digital now, and they made that switch over in February of 2022.
[01:17:27.040 --> 01:17:32.240] They ended the print publication and it is digital, ones and zeros all the way.
[01:17:32.240 --> 01:17:36.000] So now we get to enjoy the website called health.com.
[01:17:36.000 --> 01:17:40.560] That's a pretty nice domain name to have if you can secure it, and they did.
[01:17:40.880 --> 01:17:48.000] But as you know, we can't judge a book by its cover, and you can't judge a magazine by its title or the pretty faces on it.
[01:17:48.000 --> 01:17:59.520] And even less so when your magazine is now 100% online, where a person might think a website called health.com might be a reliable source of good health information.
[01:17:59.520 --> 01:18:02.880] In fact, there's a news rating site called NewsGuard.
[01:18:02.880 --> 01:18:11.760] They rated the publication as reliable, noting that it generally avoids deceptive headlines and does not repeatedly publish false content.
[01:18:11.760 --> 01:18:13.120] Okay, all right.
[01:18:13.120 --> 01:18:15.920] Now, this sets up the background for this news item.
[01:18:16.160 --> 01:18:21.360] Headline reads: Everything you need to know about homeopathy and its safety.
[01:18:21.360 --> 01:18:23.280] Courtesy of health.com.
[01:18:23.280 --> 01:18:23.840] Okay.
[01:18:24.160 --> 01:18:33.040] I am sure, as we delve into this article, I mean, this article, homeopathy and its safety, it is going to warn the reader about homeopathy, right?
[01:18:33.040 --> 01:18:37.920] It's going to explain about the dubious nature of the claim, like cures like.
[01:18:37.920 --> 01:18:45.840] You know, where a molecule of something that makes you sick can be diluted out of existence, and yet magically its memory will remain to cure what's bad for you.
[01:18:45.840 --> 01:18:53.440] And the article, I'm sure, is going to speak to the implausibility of the suggested mechanisms that define all of homeopathy.
[01:18:53.440 --> 01:18:59.440] And I'm sure they're going to warn their readers that homeopathy can only delay legitimate treatments that actually work to help sick people.
[01:18:59.440 --> 01:19:00.600] Okay, here we go.
[01:19:01.000 --> 01:19:02.520] This was written by Dr.
[01:19:02.840 --> 01:19:08.520] I'm sorry, registered nurse, R-N, Sarah Jividin, J-I-V-I-D-E-N.
[01:19:08.520 --> 01:19:10.440] Sorry if I mispronounced that.
[01:19:10.440 --> 01:19:16.920] And it was reviewed, medically reviewed by Arno Kroener with the letters D-A-O-M.
[01:19:17.240 --> 01:19:18.680] Steve, you're MD, right?
[01:19:18.680 --> 01:19:21.480] That's your, and Kara, your Ph.D.?
[01:19:22.200 --> 01:19:22.680] Okay.
[01:19:22.680 --> 01:19:26.360] Have you ever heard of the letters D-A-O-M?
[01:19:26.360 --> 01:19:28.200] I know what D-O is.
[01:19:29.320 --> 01:19:34.520] Well, a D-A-O-M is a doctorate of acupuncture and oriental medicine.
[01:19:34.520 --> 01:19:35.080] Oh, no.
[01:19:35.160 --> 01:19:35.880] Oh, boy.
[01:19:36.040 --> 01:19:40.920] Described as a degree program that combines traditional Chinese medicine and biomedical concepts.
[01:19:40.920 --> 01:19:42.760] Okay, so there's your medical review.
[01:19:42.760 --> 01:19:43.960] Here we go.
[01:19:43.960 --> 01:19:53.720] Homeopathy, a holistic medical practice that involves treating people, I'm reading the article now, with highly diluted substances to trigger the body's natural healing responses.
[01:19:53.720 --> 01:19:58.280] It was developed in the late 18th century by German physician Samuel Hahnemann.
[01:19:58.280 --> 01:19:59.320] We know all that.
[01:19:59.320 --> 01:20:09.480] Homeopathy is based on the principle of like cures like, meaning a substance causing symptoms in a healthy person can, in minute amounts, treat similar symptoms in a sick person.
[01:20:09.480 --> 01:20:19.960] People might consider homeopathy as an alternative or complementary therapy due to its individualized approach and the belief that it stimulates the body's own healing processes.
[01:20:19.960 --> 01:20:25.000] Its effectiveness is a subject of ongoing debate and research.
[01:20:25.320 --> 01:20:25.800] Right?
[01:20:26.200 --> 01:20:27.000] That's the frame.
[01:20:28.840 --> 01:20:29.800] Right, here we go.
[01:20:29.800 --> 01:20:33.560] We're about to talk about some things that we just talked about again.
[01:20:34.040 --> 01:20:35.160] Homeopathy.
[01:20:35.160 --> 01:20:38.280] Its approach is based on two unconventional theories.
[01:20:38.280 --> 01:20:40.360] At least the unconventional got in there, fine.
[01:20:40.520 --> 01:20:42.440] But unconventional is not even accurate.
[01:20:42.440 --> 01:20:44.720] Again, that's horrible framing.
[01:20:44.720 --> 01:20:47.120] They're pseudoscientific principles.
[01:20:47.120 --> 01:20:50.560] They are magical principles, not unconventional.
[01:20:44.280 --> 01:20:51.680] Right.
[01:20:51.840 --> 01:20:58.720] So, Steve, you got to take your red pen to this and cross out the words and write in the correct ones here.
[01:20:58.720 --> 01:21:02.240] The law of similars and the law of infinitesimals, right?
[01:21:02.720 --> 01:21:03.840] Those nice little theories.
[01:21:04.400 --> 01:21:06.000] Infinite, say that again.
[01:21:06.000 --> 01:21:06.800] Infinitesimals.
[01:21:06.880 --> 01:21:08.080] Infinitesimals.
[01:21:08.080 --> 01:21:10.160] Infinitesimals.
[01:21:10.480 --> 01:21:10.960] Yeah.
[01:21:10.960 --> 01:21:13.440] The law of infinitesimals.
[01:21:13.440 --> 01:21:13.920] Yes.
[01:21:13.920 --> 01:21:15.920] So, first, the law of similars, right?
[01:21:16.400 --> 01:21:18.160] Well, like cures like.
[01:21:19.040 --> 01:21:24.400] Substance that causes symptoms in a healthy person can be used to treat similar symptoms in a sick person.
[01:21:24.400 --> 01:21:25.200] Yep.
[01:21:25.200 --> 01:21:30.080] And then the second law: extreme dilutions of substances.
[01:21:30.080 --> 01:21:34.480] The belief that the more a substance is diluted, the more potent it becomes.
[01:21:34.480 --> 01:21:34.880] How?
[01:21:34.880 --> 01:21:35.440] I mean, why?
[01:21:35.440 --> 01:21:35.840] I don't know.
[01:21:36.160 --> 01:21:36.560] Yeah.
[01:21:36.560 --> 01:21:40.960] These dilutions often reach a point where no molecules of the original substance remain.
[01:21:40.960 --> 01:21:45.600] It is believed that the solution retains a quote memory of the substance.
[01:21:45.600 --> 01:21:48.000] Yes, that magical memory again.
[01:21:48.320 --> 01:21:50.960] And homeopathic treatments are highly personalized.
[01:21:50.960 --> 01:21:51.680] Did you know that?
[01:21:51.680 --> 01:21:56.800] Practitioners consider a person's physical, emotional, and psychological state to choose the most appropriate remedy.
[01:21:56.800 --> 01:21:58.080] Doctors never do that.
[01:21:58.080 --> 01:22:00.880] Some people with the same condition receive different treatments.
[01:22:00.880 --> 01:22:08.560] Homeopathy views symptoms as expressions of the body's attempt to heal itself and treats the individual as a whole rather than focusing solely on the disease.
[01:22:08.560 --> 01:22:11.920] Steve, you only focus on the disease, okay?
[01:22:11.920 --> 01:22:14.400] You do not treat the individual as a whole, right?
[01:22:14.720 --> 01:22:15.200] How dare you?
[01:22:15.680 --> 01:22:17.040] How dare you do this?
[01:22:17.040 --> 01:22:24.800] And then they talk about the various things that it can be, you know, made from, in a sense, or extracted from, you know, but we won't go over that.
[01:22:24.800 --> 01:22:26.640] They kind of just give you a list of the things.
[01:22:26.640 --> 01:22:28.960] Now, here's what they do start talking about.
[01:22:29.440 --> 01:22:30.520] What is it good for?
[01:22:31.080 --> 01:22:38.600] Some studies and clinical trials suggest that homeopathy may be helpful or provide symptom relief for certain conditions, such as childhood diarrhea.
[01:22:38.600 --> 01:22:39.960] Adults, you're out of luck.
[01:22:40.200 --> 01:22:59.320] Ear infections, asthma, menopausal symptoms, sore muscles, colds and flu, allergies, but it might also help reduce symptoms of mental health conditions such as major depressive disorder, MDD, and premenstrual dysphoric disorder, PMDD.
[01:22:59.640 --> 01:23:00.680] Hmm.
[01:23:01.560 --> 01:23:02.040] Okay.
[01:23:02.600 --> 01:23:03.880] I'm surprised.
[01:23:03.880 --> 01:23:09.080] In reality, the list of things they claim it helps is probably even longer than all of these.
[01:23:09.320 --> 01:23:10.360] Okay, research.
[01:23:10.360 --> 01:23:12.680] And so we're, you know, most of the way through the article.
[01:23:12.680 --> 01:23:14.920] And we're, you know, now we get to research and evidence.
[01:23:14.920 --> 01:23:19.640] Critics argue the benefits of homeopathy are primarily due to the placebo effect.
[01:23:19.640 --> 01:23:21.240] I don't even know if that's right.
[01:23:21.240 --> 01:23:22.040] Don't the major.
[01:23:22.520 --> 01:23:23.560] Yeah, like there are no benefits.
[01:23:23.800 --> 01:23:29.160] Yeah, by saying "the benefits," guys, because then the next line is "any therapeutic effects."
[01:23:29.160 --> 01:23:31.000] There are no therapeutic effects.
[01:23:31.000 --> 01:23:31.640] Right.
[01:23:31.640 --> 01:23:34.280] Yeah, if there are any, quote, therapeutic effect.
[01:23:34.520 --> 01:23:35.080] Yeah, you're right.
[01:23:35.080 --> 01:23:36.040] There are no therapeutic effects.
[01:23:36.040 --> 01:23:36.680] It's all non-beneficial.
[01:23:37.160 --> 01:23:40.040] It's all symptomatic, and it's all placebo.
[01:23:40.440 --> 01:23:51.720] Well, and beyond that, the same things that come anytime there is any clinical trial, for example, seeing a person who seems to care about you, who says, I am here to treat you.
[01:23:52.040 --> 01:23:54.440] Those things can have positive psychological benefits.
[01:23:55.880 --> 01:23:56.920] Placebo effects, by definition.
[01:23:57.080 --> 01:23:58.200] Yeah, they're non-specific effects.
[01:23:58.440 --> 01:23:59.800] Right, it doesn't have to be homeopathy.
[01:23:59.800 --> 01:24:01.000] It could be bee venom therapy.
[01:24:01.000 --> 01:24:02.200] It could be something else.
[01:24:02.200 --> 01:24:02.920] Exactly.
[01:24:03.320 --> 01:24:09.480] And then they start, okay, then they start to get into the organizations that are basically saying this is not recommended.
[01:24:09.480 --> 01:24:16.320] 2017, National Health Service in England recommended that healthcare providers do not prescribe homeopathy.
[01:24:15.000 --> 01:24:21.600] The NICE, the National Institute for Health and Care Excellence, does not recommend it either.
[01:24:22.080 --> 01:24:27.760] In fact, they report that homeopathic remedies can be unsafe and some may contain substances that interfere with medications.
[01:24:27.760 --> 01:24:29.360] We've talked about that before.
[01:24:29.360 --> 01:24:34.880] They further advise not using homeopathy for life-threatening illnesses or emergencies in place of vaccines.
[01:24:34.880 --> 01:24:36.400] Absolutely correct.
[01:24:36.720 --> 01:24:38.240] Safety and regulation.
[01:24:38.240 --> 01:24:42.960] According to the National Institutes of Health, there may be safety considerations related to homeopathy.
[01:24:42.960 --> 01:24:43.760] Absolutely.
[01:24:43.760 --> 01:24:54.000] Some homeopathic products labeled as highly diluted might contain significant amounts of active ingredients, and that could lead to potential side effects or drug interactions.
[01:24:54.400 --> 01:24:55.280] And they continue.
[01:24:55.280 --> 01:25:06.080] What would have been better in this article, I think, is if you took these, because they finally do address some of the concerns here, that should have been at the top of the article, right?
[01:25:06.080 --> 01:25:12.400] And basically say, look, here's what this is, and then we can go on and then describe it in that context, right?
[01:25:12.400 --> 01:25:13.440] Framing.
[01:25:13.440 --> 01:25:19.680] So that would have made this a more, somewhat more palatable article to me.
[01:25:19.680 --> 01:25:23.200] But instead, you know, they get right in there because I don't know.
[01:25:23.200 --> 01:25:28.160] I imagine there is a first paragraph bias among people who read articles and things.
[01:25:28.160 --> 01:25:29.280] They're going to, right?
[01:25:29.600 --> 01:25:30.960] And a headline bias, certainly.
[01:25:30.960 --> 01:25:31.840] They'll read a headline.
[01:25:31.840 --> 01:25:35.200] They might read the first couple of paragraphs to really kind of absorb what's going on.
[01:25:35.200 --> 01:25:35.680] They're a good deal.
[01:25:35.920 --> 01:25:39.600] And then by the time you get to paragraph 28, you don't remember that.
[01:25:39.600 --> 01:25:43.280] You kind of remember what was at the top of the article.
[01:25:43.280 --> 01:25:50.400] So I would have much rather seen those lead this article instead of kind of being, well, buried, frankly, towards the bottom.
[01:25:50.400 --> 01:26:00.440] That doesn't make up for the horrible reporting, because the bottom-line conclusion was: homeopathy may be beneficial for minor health concerns or in conjunction with more conventional treatment.
[01:26:00.440 --> 01:26:02.520] That's their bottom-line recommendation.
[01:26:02.520 --> 01:26:03.000] Right.
[01:26:03.000 --> 01:26:05.160] Wait, when in conjunction with more?
[01:25:59.840 --> 01:26:06.680] That's like saying this is balanced.
[01:26:07.080 --> 01:26:07.720] Yeah, that's part of the thing.
[01:26:07.800 --> 01:26:10.040] When eaten with nutrition.
[01:26:10.040 --> 01:26:11.960] When eaten with a nutritious breakfast.
[01:26:13.720 --> 01:26:14.840] Straight out of The Simpsons.
[01:26:14.840 --> 01:26:17.640] Yeah, but again, it's a perfect example of what I was talking about about framing.
[01:26:17.640 --> 01:26:20.600] They're framing it as it's a natural treatment.
[01:26:20.600 --> 01:26:24.200] I actually chose FDU because of the small classroom sizes.
[01:26:24.200 --> 01:26:26.040] That's something that was extremely important to me.
[01:26:26.040 --> 01:26:28.680] I wanted to have a connection and a relationship with my professors.
[01:26:28.680 --> 01:26:30.840] I didn't want to just be a number in a classroom.
[01:26:30.840 --> 01:26:34.680] I seized the moment at FDU and found my purpose.
[01:26:36.280 --> 01:26:45.880] Teaching is rewarding and exhausting, but with trustworthy, AI-powered tools from Cengage, you can manage the chaos and make a bigger impact on your students.
[01:26:45.880 --> 01:26:55.240] Our tools give you real-time insights into student performance, help you pinpoint challenges, and provide personalized, scalable support without adding to your workload.
[01:26:55.240 --> 01:26:58.680] It's innovation that makes a real difference for you and your students.
[01:26:58.680 --> 01:27:04.280] Visit Cengage.com to learn how to teach with less stress and more impact.
[01:27:05.560 --> 01:27:07.720] That helps the body heal itself.
[01:27:07.720 --> 01:27:09.960] It's used for all these things.
[01:27:09.960 --> 01:27:12.440] You know, people say it works, blah, blah, blah, blah.
[01:27:12.440 --> 01:27:19.480] And then deep buried, the critics and skeptics say that, you know, whatever, blah, blah, blah, evidence, logic, you know.
[01:27:19.480 --> 01:27:21.720] And then, but we recommend it at the end, right?
[01:27:21.720 --> 01:27:35.960] Then they come back to a recommendation for it rather than a skeptical framing, which I think is way more accurate and more meaningful, which is this is dangerous, pseudoscience from beginning to end.
[01:27:36.280 --> 01:27:38.200] There are no redeeming qualities.
[01:27:38.200 --> 01:27:39.720] It is not a matter of debate.
[01:27:39.720 --> 01:27:41.800] It does not require any further research.
[01:27:41.800 --> 01:27:43.560] You should not use it at all.
[01:27:43.560 --> 01:27:47.120] It is an expensive and wasteful distraction.
[01:27:44.680 --> 01:27:50.080] That's, I think, an accurate and fair framing.
[01:27:50.080 --> 01:27:56.000] And you could say that in a way that is not off-putting and doesn't sound like you're putting anybody down or anything.
[01:27:56.000 --> 01:27:57.200] It's just like, these are the facts.
[01:27:57.200 --> 01:27:59.360] These are the objective scientific facts.
[01:27:59.360 --> 01:28:01.440] But yet, this article is horrible.
[01:28:02.800 --> 01:28:03.920] It could be worse.
[01:28:05.680 --> 01:28:08.960] They did include the skepticism buried deep in the article.
[01:28:08.960 --> 01:28:12.000] So they get half a point for that.
[01:28:12.000 --> 01:28:14.480] But as you say, Evan, yeah, it's buried.
[01:28:14.480 --> 01:28:18.400] It's contradicted by other parts of the article.
[01:28:20.240 --> 01:28:22.400] Bottom line is it's terrible.
[01:28:22.400 --> 01:28:24.000] Health.com, everyone.
[01:28:24.000 --> 01:28:25.920] Beware the title.
[01:28:25.920 --> 01:28:27.440] Yeah, they get health.com.
[01:28:27.440 --> 01:28:28.240] That is terrible.
[01:28:28.240 --> 01:28:28.800] I know.
[01:28:28.800 --> 01:28:33.120] That's a certain injustice, if I can use that word for it, too.
[01:28:33.440 --> 01:28:33.840] All right.
[01:28:33.840 --> 01:28:34.720] Thanks, Evan.
[01:28:34.720 --> 01:28:37.200] Jay, it's who's that noisy time.
[01:28:37.200 --> 01:28:39.680] All right, guys, last week I played This Noisy.
[01:28:55.680 --> 01:28:57.600] That pause is there for a reason.
[01:28:57.600 --> 01:28:58.240] Is it?
[01:28:58.240 --> 01:28:58.800] Yes.
[01:28:58.800 --> 01:29:03.440] So I, every week, I try to figure out what will people guess, you know?
[01:29:03.440 --> 01:29:07.040] And I thought of a couple of things I thought people would guess, and people did guess them.
[01:29:07.040 --> 01:29:08.080] And I'll tell you what they are.
[01:29:08.080 --> 01:29:12.160] So a listener named Brian Sinkowski.
[01:29:12.480 --> 01:29:13.520] Sinkowski's.
[01:29:13.520 --> 01:29:13.920] There it is.
[01:29:13.920 --> 01:29:14.880] He gave me the phonetics.
[01:29:14.880 --> 01:29:15.920] Thank you, Brian.
[01:29:15.920 --> 01:29:16.960] He said, Hi, Jay.
[01:29:16.960 --> 01:29:20.080] I'm sure you've heard this guess already, but this sounds like a rainstick.
[01:29:20.080 --> 01:29:21.840] He said his college roommate had one.
[01:29:21.840 --> 01:29:22.800] Of course, he did.
[01:29:22.800 --> 01:29:29.120] Everybody was getting high, and the rainstick gets pulled out, along with that head scratcher thing made out of metal.
[01:29:29.200 --> 01:29:31.400] Looks like a hanger, a bunch of hangers.
[01:29:31.640 --> 01:29:39.160] Anyway, I knew people were going to guess rainstick, and this absolutely does sound exactly like a rainstick, but it's not a rainstick.
[01:29:39.160 --> 01:29:43.960] The next listener who guessed, this is Tor Tor Strasburg.
[01:29:43.960 --> 01:29:48.360] He said, Hi, this week's Noisy Sounds Like the Quarter Dispenser at the Laundromat.
[01:29:48.360 --> 01:29:50.040] Put in a bill and get quarters.
[01:29:50.040 --> 01:29:51.960] This could be for 20 bucks.
[01:29:51.960 --> 01:29:52.920] Kind of.
[01:29:52.920 --> 01:29:58.760] I mean, you know, it's usually louder and a little bit more high-pitched, but not a horrible guess.
[01:29:58.760 --> 01:30:01.480] I have another guesser here named Edward Myers.
[01:30:01.480 --> 01:30:02.440] Edward says, Hi, Jay.
[01:30:02.920 --> 01:30:05.960] This week, the noisy sounds like dominoes toppling.
[01:30:05.960 --> 01:30:07.480] I'm not sure what color they are.
[01:30:07.480 --> 01:30:09.800] Well, if you could tell that by the sound.
[01:30:09.800 --> 01:30:13.560] Sure, it has a domino kind of effect, but that's not the answer.
[01:30:13.560 --> 01:30:17.240] Emily Markwell said, a sperm whale using echolocation.
[01:30:17.240 --> 01:30:18.120] Whoa.
[01:30:18.120 --> 01:30:19.560] That's a cool guess.
[01:30:19.560 --> 01:30:23.640] I mean, I love echolocation and I love whales, but that is not correct.
[01:30:23.640 --> 01:30:25.960] I'm going to give you one more guess here.
[01:30:25.960 --> 01:30:29.480] Another listener named Micah Swindles said, Hi, Jay.
[01:30:29.720 --> 01:30:33.400] Micah, here from, dude, you know I can't pronounce that.
[01:30:33.400 --> 01:30:36.520] Wayne Wainuomate.
[01:30:36.520 --> 01:30:39.000] Waynui Omata in New Zealand.
[01:30:39.000 --> 01:30:40.920] Wai Nui Omata.
[01:30:40.920 --> 01:30:43.480] Okay, he gave me the phonetic Wai Nui omata.
[01:30:43.480 --> 01:30:45.240] Okay, if you say so.
[01:30:45.480 --> 01:30:49.960] I've seen lots of different videos online recently of big hailstones falling.
[01:30:49.960 --> 01:30:52.920] Multiple people guessed this one, by the way.
[01:30:52.920 --> 01:30:56.360] So he says he thinks it's hailstones, but he doesn't know what they're falling on.
[01:30:56.360 --> 01:30:57.560] That's not a horrible guess either.
[01:30:57.560 --> 01:30:58.200] I like that guess.
[01:30:58.200 --> 01:31:00.120] I think hailstones are cool, but that is not it.
[01:31:00.120 --> 01:31:01.160] Okay, I have a winner.
[01:31:01.160 --> 01:31:09.080] The winner is Jake Bellevoyne, and he says, Hey, all this week's noisy, I believe, is bobbin lace making.
[01:31:09.080 --> 01:31:11.880] The noise is cool, but the video of this art is awesome.
[01:31:11.880 --> 01:31:13.240] Thanks for everything you do.
[01:31:13.240 --> 01:31:15.600] Okay, so how do I describe this?
[01:31:15.760 --> 01:31:37.120] Imagine a bunch of metal, I'm sorry, a bunch of wood dowels that are, let's say they're about five to six inches long, and they have like a bobbin, kind of like a bulbous end on them, kind of weigh them down, and they all are connected to the textile that lace is made out of, right?
[01:31:37.760 --> 01:31:49.200] Each one of them has a line on it, and you use those dowels to move the different threads to make the weave, and there's a lot of them.
[01:31:49.200 --> 01:31:57.680] And the people who do this are so good at it that they are ripping through these dowels with the threads attached to them super fast.
[01:31:57.680 --> 01:32:00.640] Toss them two to the left, two to the right, one under, one over, blah, blah, blah, blah.
[01:32:00.720 --> 01:32:01.200] You know what I mean?
[01:32:01.200 --> 01:32:02.080] It goes super quick.
[01:32:02.080 --> 01:32:04.240] You got to look it up and watch the video.
[01:32:04.240 --> 01:32:10.800] It looks like the person's having like, you know, a neurological incident going on, but they're not because it's all real.
[01:32:10.800 --> 01:32:13.280] They're really doing it and their hands are moving that fast.
[01:32:13.280 --> 01:32:17.920] So let's listen to that again as you can hopefully visualize my explanation.
[01:32:25.280 --> 01:32:35.600] So the noise you're hearing is them whipping those pieces of wood around that have the line attached to them, moving them back and forth and to the right and over and under and all this stuff.
[01:32:35.600 --> 01:32:36.960] It's very complicated.
[01:32:36.960 --> 01:32:45.600] And when you look up and you see where all the threads are going and you see the lace that's partially created, you're like, wow, like really complicated.
[01:32:45.600 --> 01:32:51.040] You have to have a lot of dexterity, probably an enormous amount of patience to just learn how to do that.
[01:32:51.040 --> 01:32:53.600] Anyway, very cool thing that people do.
[01:32:53.600 --> 01:32:57.920] I'm sure machines do it now, but it's still wonderful that people do it with their hands.
[01:32:57.920 --> 01:32:59.280] Anyway, good job, everybody.
[01:32:59.280 --> 01:33:01.480] Lots of fun guesses in there this week.
[01:33:01.720 --> 01:33:03.720] I have a new noisy for you guys this week.
[01:33:03.720 --> 01:33:09.560] This noisy was sent in by a listener named Daniel Berliner, Berliner, and here it is.
[01:33:24.600 --> 01:33:35.640] All right, guys, if you think you know this week's noisy or if you heard something cool, there's only one thing you got to do: email me at wtn at the skepticsguide.org.
[01:33:35.800 --> 01:33:39.320] Guys, as you know, we are recording our 1,000th show this weekend.
[01:33:39.320 --> 01:33:41.880] Tickets are sold, everything is done.
[01:33:42.360 --> 01:33:44.520] The show is just about to happen.
[01:33:44.520 --> 01:33:51.400] But if you would like to show your support for the SGU and the work that we've done, and you couldn't make it to the 1,000th show, become a patron.
[01:33:51.400 --> 01:33:51.960] Please do.
[01:33:51.960 --> 01:33:53.880] Please consider becoming a patron.
[01:33:53.880 --> 01:34:00.440] Not only do we appreciate your support, but all the contributions that we get definitely help us expand on what we do.
[01:34:00.440 --> 01:34:08.200] And on top of that, if you are a patron at the $5 level or higher, you will have access to this five-hour show.
[01:34:08.200 --> 01:34:09.400] You will have access to it.
[01:34:09.400 --> 01:34:12.120] You'll have access to the video, the audio, everything.
[01:34:12.120 --> 01:34:13.480] Both live and recorded.
[01:34:13.720 --> 01:34:15.560] All be there for you, right, Steve?
[01:34:15.560 --> 01:34:15.880] Yep.
[01:34:16.200 --> 01:34:18.040] Anyway, guys, we're looking forward to this weekend.
[01:34:18.040 --> 01:34:19.720] Congratulations to everybody, guys.
[01:34:19.720 --> 01:34:20.280] We did it.
[01:34:20.280 --> 01:34:20.760] We did it.
[01:34:20.760 --> 01:34:21.560] Yay.
[01:34:21.880 --> 01:34:22.840] Great job, Jay.
[01:34:22.840 --> 01:34:23.400] Made it to it.
[01:34:23.640 --> 01:34:23.960] Yep.
[01:34:23.960 --> 01:34:25.160] Made it to a thousand.
[01:34:25.160 --> 01:34:30.040] And I reserve the right to amend that compliment after the show.
[01:34:31.320 --> 01:34:34.120] But I won't, because I'm sure it's going to be kick-ass.
[01:34:34.120 --> 01:34:37.320] All right, guys, let's go on with science or fiction.
[01:34:39.560 --> 01:34:44.120] It's time for science or fiction.
[01:34:48.800 --> 01:34:56.400] Each week, I come up with three science news items or facts: two genuine and one fictitious, and I challenge my panel of skeptics to tell me which one is the fake.
[01:34:56.400 --> 01:34:58.240] We got three regular news items.
[01:34:58.240 --> 01:34:59.120] You guys ready?
[01:34:59.120 --> 01:34:59.520] Yep.
[01:34:59.920 --> 01:35:11.840] Item number one: a new analysis concludes that large language model AIs do not display any truly emergent abilities and therefore do not pose a novel risk.
[01:35:11.840 --> 01:35:21.440] Item number two, researchers modeling data from the InSight Mars lander conclude that the mid-crust of Mars is saturated with liquid water.
[01:35:21.760 --> 01:35:31.760] And item number three, scientists have demonstrated that a set of routine biological markers can diagnose long COVID with over 90% sensitivity and specificity.
[01:35:31.760 --> 01:35:33.040] Kara, go first.
[01:35:33.040 --> 01:35:40.000] A new analysis concludes that large language model AI do not display any truly emergent abilities.
[01:35:40.000 --> 01:35:41.520] Okay, can you define that?
[01:35:41.520 --> 01:35:46.080] So an emergent ability is something that was not specifically trained or programmed or whatever.
[01:35:46.080 --> 01:35:47.760] They just sort of develop this.
[01:35:47.760 --> 01:35:54.960] They extend the boundaries of what they're capable of doing, sort of as an emergent complexity rises.
[01:35:55.120 --> 01:35:56.480] You couldn't have predicted it.
[01:35:56.800 --> 01:36:00.000] Oh, I thought that was kind of the point, but maybe not.
[01:36:00.000 --> 01:36:02.320] So therefore, do not pose a novel risk.
[01:36:02.320 --> 01:36:11.920] Like they could run away potentially with the things that they've been fed and new combinations and I guess permutations of the inputs.
[01:36:12.160 --> 01:36:18.720] They can, you know, solve problems better, but they're not going to, like, I guess, achieve this novel echelon.
[01:36:18.720 --> 01:36:21.120] They're not going to run away and start murdering people.
[01:36:21.120 --> 01:36:21.520] Okay.
[01:36:21.520 --> 01:36:27.360] Researchers modeling data from the InSight Mars lander conclude that the mid-crust of Mars is saturated with liquid water.
[01:36:28.160 --> 01:36:28.560] Okay.
[01:36:28.880 --> 01:36:40.600] Scientists have demonstrated that a set of routine biological markers can diagnose long COVID with over 90% sensitivity and specificity.
[01:36:40.600 --> 01:36:46.840] Okay, so the question with this one is these routine biological markers, you may or may not be able to answer.
[01:36:47.000 --> 01:36:49.320] Blood tests, you know, or x-rays or whatever.
[01:36:49.640 --> 01:36:54.840] So you're talking about people after they're sick? You're not talking about things that make them more susceptible?
[01:36:55.000 --> 01:37:04.120] No, no, this is just diagnosing that somebody has it based upon kind of laboratory workup that we would normally do in medicine.
[01:37:05.080 --> 01:37:12.840] So I thought that we were sort of not sure why people get long COVID.
[01:37:13.160 --> 01:37:16.360] 90% sensitivity and specificity.
[01:37:16.600 --> 01:37:18.760] I mean, that would be a game changer.
[01:37:18.760 --> 01:37:25.640] Like if we could just diagnose it, know immediately this is long COVID, develop treatments for it, have this class of people.
[01:37:25.640 --> 01:37:29.480] But I thought this was more of one of those things where it's like, is it long COVID?
[01:37:29.480 --> 01:37:29.960] Is it not?
[01:37:29.960 --> 01:37:31.400] How do we define it?
[01:37:31.400 --> 01:37:35.400] Is it actually, you know, this, this, is it POTS?
[01:37:35.400 --> 01:37:36.040] Is it this?
[01:37:36.040 --> 01:37:38.120] Is it a constellation of factors?
[01:37:38.120 --> 01:37:42.520] So yeah, that one's bugging me as the, I think that one might be the fiction.
[01:37:42.520 --> 01:37:50.520] I don't think that long COVID is a quote, I mean, it is a disease, but I think, you know, how there's, we often make this distinction, right?
[01:37:50.520 --> 01:38:01.400] Maybe you can help me with the terminology between like a constellation of symptoms, like a disorder, and a specific disease, like, you know, you catch an infection or this, this genetic thing changed.
[01:38:01.560 --> 01:38:04.600] So, yeah, I think that one might be the fiction because of that.
[01:38:05.240 --> 01:38:06.600] Okay, Jay.
[01:38:06.600 --> 01:38:15.040] Okay, the first one about the large language models and that they don't have emergent behavior.
[01:38:13.400 --> 01:38:16.880] Truly emergent behavior.
[01:38:14.600 --> 01:38:17.680] I mean, whatever.
[01:38:19.040 --> 01:38:23.680] I'm not sure what Steve is using that word for, but they do have emergent behavior.
[01:38:24.000 --> 01:38:29.760] You know, as far as I understand, so I just can't see how that could be science right there.
[01:38:29.760 --> 01:38:31.280] Bob says truly, though.
[01:38:31.280 --> 01:38:32.400] Yeah, that's tough.
[01:38:32.800 --> 01:38:34.400] Let me go on to the next one.
[01:38:34.400 --> 01:38:39.200] The Insight Mars lander concluded that the mid-crust of Mars is saturated with liquid water.
[01:38:39.200 --> 01:38:40.960] Yeah, I think this one is science.
[01:38:40.960 --> 01:38:47.040] You know, we've been searching for water for so long, and I know that we've been drilling and all sorts of things.
[01:38:47.040 --> 01:38:50.480] And I wouldn't be surprised if I found out that this is the truth there.
[01:38:50.480 --> 01:38:57.360] And then the last one, scientists have demonstrated that a set of routine biological markers can diagnose long COVID.
[01:38:57.360 --> 01:38:59.840] A routine set of biological markers.
[01:38:59.840 --> 01:39:01.440] I wonder what the hell that is.
[01:39:01.440 --> 01:39:03.600] It's 90% specific.
[01:39:04.400 --> 01:39:06.400] All right, that's interesting.
[01:39:06.400 --> 01:39:09.360] Yeah, I'm going to say that the first one is fiction.
[01:39:09.840 --> 01:39:15.120] I'm fairly certain that large AI models have emergent capabilities.
[01:39:15.120 --> 01:39:16.080] Okay, Bob.
[01:39:16.480 --> 01:39:17.440] This is tough.
[01:39:18.240 --> 01:39:24.880] The Mars one, I buy the Mars one, but the routine biological markers for long COVID, man.
[01:39:25.200 --> 01:39:26.320] That one's tough.
[01:39:26.320 --> 01:39:30.320] I would think it would be something more extensive.
[01:39:30.320 --> 01:39:31.360] So there's no level.
[01:39:31.440 --> 01:39:37.040] So the implication is that there's no real genetic assessment going on.
[01:39:37.040 --> 01:39:42.560] It's just, right, are they actually looking at the level of the genome or is it just...
[01:39:42.720 --> 01:39:44.080] You said blood tests.
[01:39:44.080 --> 01:39:44.720] Yeah, root.
[01:39:44.800 --> 01:39:49.200] Yeah, it's not going to be like fancy genetic testing because that wouldn't be routine.
[01:39:49.200 --> 01:39:49.680] Right.
[01:39:49.680 --> 01:39:51.360] I just wanted to make that clear.
[01:39:51.360 --> 01:40:01.080] But the thing with number one here, the large language models and the truly emergent behavior, is that the truly is, to me, potentially a key word.
[01:40:01.720 --> 01:40:03.960] So then how do you define emergent?
[01:40:03.960 --> 01:40:14.360] You know, and if you define truly emergent as something really, you know, really dramatic and emergent, that's truly.
[01:40:14.680 --> 01:40:22.440] And so the other lowly, merely minimally emergent behaviors don't really count because they're not truly emergent.
[01:40:22.760 --> 01:40:27.240] You know, it's just like, I don't know, I'm just like picking away at this needlessly.
[01:40:28.120 --> 01:40:37.240] But I remember GPT-4o, when that came out, I mean, I was reading some scientists' assessments, and they're like, yeah, this was not anticipated.
[01:40:37.240 --> 01:40:40.040] Some of these behaviors were not anticipated.
[01:40:41.160 --> 01:40:45.880] And back then, you know, they were saying that, yeah, this definitely is emergent.
[01:40:45.880 --> 01:40:55.960] But now that it's kind of taken time, time has passed, they look deeper, maybe it's really, they can explain it, and it's not really emergent.
[01:40:55.960 --> 01:41:01.320] So that wouldn't surprise me because I haven't taken a deep dive in that in quite a while, many, many months.
[01:41:01.320 --> 01:41:03.640] So who knows what they could have discovered?
[01:41:03.640 --> 01:41:09.240] Ah, I think I'll go with, so I can kind of explain the emergent abilities.
[01:41:09.240 --> 01:41:17.000] So then I'll have to default to the routine biological markers detecting long COVID, with Kara.
[01:41:17.000 --> 01:41:17.160] Okay.
[01:41:17.800 --> 01:41:18.760] And Evan.
[01:41:20.200 --> 01:41:21.800] Yeah, all the same reasons.
[01:41:21.800 --> 01:41:31.960] Everyone else said I don't think I have anything really much to add here, except that the mid-crust of Mars may not be saturated with liquid water.
[01:41:31.960 --> 01:41:35.480] It could be a marshmallow or cream filling instead.
[01:41:35.640 --> 01:41:37.320] That could be the fiction for that one.
[01:41:37.320 --> 01:41:39.160] You didn't think about that.
[01:41:39.160 --> 01:41:41.400] Otherwise, everything else was expressed.
[01:41:41.400 --> 01:41:42.360] So, okay, right.
[01:41:42.360 --> 01:41:44.680] Is it going to be the AI or is it going to be the COVID one?
[01:41:44.960 --> 01:41:45.680] I don't know.
[01:41:47.040 --> 01:41:50.080] Again, emergent, truly emergent, right?
[01:41:50.080 --> 01:41:50.720] So, hmm.
[01:41:51.120 --> 01:41:54.640] And therefore, do not pose a novel risk.
[01:41:54.640 --> 01:42:06.160] So, you kind of got two pieces in this one about the AI that could either throw it off, or if one's dependent on the other, bring it all in one direction, if you know what I mean.
[01:42:06.160 --> 01:42:10.800] But I will choose, Kara, Bob, I'll go with you guys.
[01:42:10.800 --> 01:42:11.840] I'll join the COVID.
[01:42:11.840 --> 01:42:14.240] Jay, I almost almost went AI.
[01:42:14.240 --> 01:42:16.640] It was really practically a coin flip.
[01:42:16.640 --> 01:42:18.240] So, I'm sorry to leave you there.
[01:42:18.240 --> 01:42:19.840] All right, so we'll start with number two.
[01:42:19.840 --> 01:42:27.440] Researchers modeling data from the InSight Mars lander conclude that the mid-crust of Mars is saturated with liquid water.
[01:42:27.440 --> 01:42:33.360] You all think this one is science, and this one is science.
[01:42:33.360 --> 01:42:34.720] Is this science?
[01:42:34.720 --> 01:42:37.920] So, yes, saturated, saturated with water.
[01:42:37.920 --> 01:42:39.040] Now, why do they think this?
[01:42:39.680 --> 01:42:41.680] They were sort of examining data.
[01:42:41.680 --> 01:42:50.560] Basically, this is like seismic data, and they're looking at how the seismic waves move through the crust.
[01:42:50.560 --> 01:42:51.680] And when they interpret.
[01:42:52.000 --> 01:42:54.560] What causes the seismic waves in this case?
[01:42:54.880 --> 01:42:58.320] I guess there are Marsquakes, and they measure them.
[01:42:59.200 --> 01:43:10.800] So, essentially, when they interpret the data, they say the best explanation for the data that we have is that the mid-crust is saturated with liquid water.
[01:43:10.800 --> 01:43:18.480] So, it means the upper crust has dried out, but there's still a substantial amount of water in the mid-crust.
[01:43:18.480 --> 01:43:22.560] The mid-crust is 11.5 to 20 kilometers down.
[01:43:22.880 --> 01:43:33.800] So, that's pretty deep, but it means if we could drill 12 kilometers down from the surface on Mars, we might have access to liquid water, right?
[01:43:33.800 --> 01:43:34.760] Basically, a well.
[01:43:29.680 --> 01:43:36.520] That's still pretty deep, though.
[01:43:36.840 --> 01:43:38.680] Yeah, quite deep.
[01:43:38.680 --> 01:43:39.240] All right.
[01:43:39.240 --> 01:43:41.080] Which one should we go to next?
[01:43:41.080 --> 01:43:42.760] Let's go back to number one.
[01:43:42.760 --> 01:43:50.920] A new analysis concludes that large language model AIs do not display any truly emergent abilities and therefore do not pose a novel risk.
[01:43:50.920 --> 01:43:52.760] Jay, you think this one is the fiction.
[01:43:52.760 --> 01:43:54.920] Everyone else thinks this one is science.
[01:43:54.920 --> 01:44:04.840] What if I told you that truly emergent is as opposed to apparently emergent, right?
[01:44:04.840 --> 01:44:12.040] So not minimally emergent, but just that it appeared to be emergent, but it really isn't.
[01:44:12.040 --> 01:44:14.120] Yeah, like they figured, yeah, they figured it out.
[01:44:14.360 --> 01:44:21.240] Because if you can figure out how it arose, really, in a sense, then it's not truly emergent, by the current definition.
[01:44:21.320 --> 01:44:23.000] Does that change your choice at all?
[01:44:23.000 --> 01:44:23.720] If I said that?
[01:44:23.720 --> 01:44:24.760] But yeah, potentially.
[01:44:25.400 --> 01:44:26.040] Yeah.
[01:44:26.360 --> 01:44:31.480] Well, I mean, I wouldn't necessarily switch, but that makes sense.
[01:44:31.480 --> 01:44:35.560] Well, this one is science.
[01:44:35.560 --> 01:44:36.600] Sorry, Jay.
[01:44:37.480 --> 01:44:45.640] Yeah, so what they found was that the behavior that was thought to be emergent actually isn't.
[01:44:45.640 --> 01:44:58.360] It says: our findings suggest that purported emergent abilities are not truly emergent, but result from a combination of in-context learning, model memory, and linguistic knowledge.
[01:44:58.360 --> 01:45:05.640] So the in-context learning is basically, training on a specific question, right, or a specific body of knowledge.
[01:45:05.640 --> 01:45:12.840] So, the point of this was to understand how these models work and how they perform, right?
[01:45:12.840 --> 01:45:19.040] So, understanding how it gets to the behaviors that it displays was sort of the point.
[01:45:14.200 --> 01:45:25.440] And they're saying, Yeah, this is, when you look at it, it's not going outside the boundaries of what is already there.
[01:45:25.440 --> 01:45:28.000] It is not a truly new emergent behavior.
[01:45:28.000 --> 01:45:36.880] It is just doing what it does in the context of this, you know, again, the in-context learning, model memory, et cetera, explains everything.
[01:45:37.440 --> 01:45:50.240] And then the bit about therefore doesn't pose a novel risk, that was more spin, but that just basically means it doesn't pose the risk that you would think truly emergent behavior does because that's completely unpredictable, right?
[01:45:50.240 --> 01:46:00.880] It's meaning because it's something truly emergent by definition you can't predict because it's not extrapolated from what it already could do, right?
[01:46:01.200 --> 01:46:06.560] But of course, that's the part that the news releases went with.
[01:46:06.560 --> 01:46:10.640] And then, when I went to the study, it wasn't about that at all, you know?
[01:46:10.960 --> 01:46:18.800] So, like, the news release says AI poses no existential threat to humanity, new study finds.
[01:46:19.120 --> 01:46:24.720] The study says, are emergent abilities and large language models just in context learning?
[01:46:24.720 --> 01:46:26.160] That was their headline.
[01:46:26.160 --> 01:46:27.680] Completely different, you know?
[01:46:28.160 --> 01:46:28.800] What?
[01:46:28.960 --> 01:46:30.000] So always the case.
[01:46:30.560 --> 01:46:31.360] Yeah, amazing.
[01:46:31.680 --> 01:46:36.640] I'm really hating those titles even more than I used to.
[01:46:36.640 --> 01:46:55.600] All right, this means that "scientists have demonstrated that a set of routine biological markers can diagnose long COVID with over 90% sensitivity and specificity" is the fiction, because the truth is, what the study shows is that there are no biological markers that could tell you if you have long COVID or not.
[01:46:55.600 --> 01:46:57.760] We have no markers for long COVID.
[01:46:58.080 --> 01:46:58.720] So, how do we?
[01:46:59.200 --> 01:47:01.720] It's a clinical diagnosis, as you were saying, right?
[01:47:01.720 --> 01:47:02.840] It's a syndrome or whatever.
[01:47:02.920 --> 01:47:03.640] It's just a bunch of symptoms.
[01:46:59.920 --> 01:47:04.840] Syndrome, that's what it is.
[01:47:04.920 --> 01:47:13.080] It's a clinical diagnosis, and it's not, we don't understand the pathophysiology, and therefore there are no pathophysiological markers, right?
[01:47:13.080 --> 01:47:13.640] Like the functionality.
[01:47:13.800 --> 01:47:15.800] But that doesn't mean that we won't in the future.
[01:47:15.800 --> 01:47:21.480] It doesn't, but there are certainly entities that have remained in that category for decades.
[01:47:21.480 --> 01:47:22.680] So, you know what I mean?
[01:47:22.920 --> 01:47:26.360] It also means that then the boundaries are always going to be blurry, right?
[01:47:27.080 --> 01:47:28.840] Like, this person might, this something.
[01:47:29.080 --> 01:47:34.360] It's like if we rule out everything else, this is maybe what's going on, but we don't really know.
[01:47:34.360 --> 01:47:37.720] I mean, it may just be that there's nothing pathological to find.
[01:47:37.720 --> 01:47:45.880] You know, sometimes there are post-infectious syndromes where people have symptoms, and it's probably based on the fact that their nerves are functioning differently.
[01:47:45.880 --> 01:47:48.760] It's a functional, you know, problem at that point.
[01:47:48.760 --> 01:47:56.200] It's like, yeah, at a cellular level, at a neurological level, a metabolic level, things are a bit off, but there's nothing measurable, right?
[01:47:56.200 --> 01:47:59.320] There's no pathophysiological smoking gun.
[01:47:59.320 --> 01:48:02.120] And therefore, it will always be in that category.
[01:48:02.120 --> 01:48:06.680] There's never going to be a blood test for like a migraine, for example.
[01:48:07.080 --> 01:48:07.480] Yeah.
[01:48:07.480 --> 01:48:07.960] But yes.
[01:48:08.120 --> 01:48:16.440] I think it's sad, though, because there are certain things where, like, if you describe your migraine and it follows the textbook, it's like, yeah, you have a migraine.
[01:48:16.520 --> 01:48:17.000] It could be very similar to the same thing.
[01:48:17.160 --> 01:48:28.920] With long COVID, very similar to like fibromyalgia, very similar to, like, a lot of people just aren't believed, they aren't really listened to, they're sort of minimized, but they don't feel well.
[01:48:28.920 --> 01:48:31.720] And they know that they don't feel well, and they're trying to get answers.
[01:48:31.880 --> 01:48:45.840] But I will say, I do think, even over the course of my career, over the last 20 years, the sort of acceptance of these diagnoses, and the skepticism towards patients who have them, I think, has improved.
[01:48:46.160 --> 01:48:53.600] Meaning, it's basically accepted that, yeah, people have these syndromes, even though we can't find any pathological smoking gun.
[01:48:53.600 --> 01:48:56.480] And they're just as real, you know, and whatever.
[01:48:56.480 --> 01:48:57.680] We need to take them seriously.
[01:48:57.680 --> 01:49:03.920] We need to, you know... basically, thinking that it's all in the patient's head or that they're faking doesn't get you anywhere.
[01:49:03.920 --> 01:49:08.480] It's not really accurate, and it's just an unwarranted negative judgment.
[01:49:08.480 --> 01:49:13.520] Yeah, and that may be the trend when you look at these things systemically.
[01:49:13.520 --> 01:49:21.040] But I can tell you, being a psychologist in a healthcare setting, every, not every, but many women that I talked to had experiences.
[01:49:21.280 --> 01:49:21.520] Totally.
[01:49:21.520 --> 01:49:23.120] It is more targeted towards women.
[01:49:23.120 --> 01:49:23.760] Absolutely.
[01:49:23.760 --> 01:49:27.440] More targeted towards women, more targeted towards people of color, where they were not believed.
[01:49:27.440 --> 01:49:33.440] Like very often, you know, patients with cancer diagnoses, it's caught later in women, it's caught later in people of color.
[01:49:33.440 --> 01:49:38.400] Heart attacks are caught later in women because they present atypically and they're not believed as much.
[01:49:38.400 --> 01:49:38.720] Absolutely.
[01:49:39.200 --> 01:49:41.840] And these are things that we do have measurements for.
[01:49:41.840 --> 01:49:43.520] So now you talk about something where there are no measurements.
[01:49:43.680 --> 01:49:46.080] They're just not doing the test when they should.
[01:49:46.080 --> 01:49:47.520] But in this case, there isn't a test.
[01:49:47.520 --> 01:49:48.880] It's all clinical syndrome.
[01:49:49.120 --> 01:49:53.200] And so you can imagine how much harder it is to be heard and believed when that's the case.
[01:49:53.200 --> 01:49:53.600] Yeah.
[01:49:53.600 --> 01:49:54.480] It's a bummer.
[01:49:54.480 --> 01:49:56.640] Yeah, we have to constantly remind ourselves of this fact.
[01:49:56.640 --> 01:50:01.440] This is like, you know, part of being a good clinician is sort of beating all of your biases out of yourself.
[01:50:01.440 --> 01:50:02.000] You know what I mean?
[01:50:02.000 --> 01:50:05.120] You really, those cultural biases are not good.
[01:50:05.840 --> 01:50:10.960] They do interfere with your ability to objectively and accurately assess your patient.
[01:50:12.480 --> 01:50:12.960] All right.
[01:50:12.960 --> 01:50:14.160] Evan, give us a quote.
[01:50:14.480 --> 01:50:17.760] It is important to promote a scientific attitude.
[01:50:17.760 --> 01:50:22.640] This is much more important than really understanding all concepts about science.
[01:50:22.640 --> 01:50:30.280] If you understand how science works and you develop a scientific attitude towards the world, you are going to be a skeptic.
[01:50:30.600 --> 01:50:32.520] Natalia Pasternak.
[01:50:29.520 --> 01:50:35.720] She's a microbiologist, an author, a science communicator.
[01:50:36.040 --> 01:50:40.680] She's from Brazil, but she'll be with us in Las Vegas in October at CSICon.
[01:50:41.080 --> 01:50:41.960] Yeah, she's wonderful.
[01:50:41.960 --> 01:50:43.400] I had her on my show a while back.
[01:50:43.400 --> 01:50:48.920] She was kind of responsible for like a lot of pushback against Bolsonaro's COVID policy.
[01:50:48.920 --> 01:50:49.240] Wow.
[01:50:49.240 --> 01:50:51.160] I think she made huge waves in Brazil.
[01:50:51.160 --> 01:50:51.960] That's awesome.
[01:50:51.960 --> 01:50:52.520] Yeah.
[01:50:52.520 --> 01:50:58.200] Yeah, we're looking forward to meeting all these people in Vegas and seeing all their talks.
[01:50:58.200 --> 01:50:58.760] Yeah.
[01:50:58.760 --> 01:51:01.720] I know we're always busy, but I do try to get to some of the good ones.
[01:51:01.720 --> 01:51:02.120] Definitely.
[01:51:02.120 --> 01:51:04.040] There'll be a lot of good ones this year.
[01:51:04.040 --> 01:51:04.680] All right, guys.
[01:51:04.680 --> 01:51:06.920] Well, thank you all for joining me this week.
[01:51:06.920 --> 01:51:07.800] You're welcome, brother.
[01:51:07.960 --> 01:51:08.200] Thanks, guys.
[01:51:08.280 --> 01:51:09.000] Thanks, Doctor.
[01:51:09.000 --> 01:51:13.480] And until next week, this is your Skeptic's Guide to the Universe.
[01:51:15.720 --> 01:51:22.440] Skeptic's Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking.
[01:51:22.440 --> 01:51:27.080] For more information, visit us at theskepticsguide.org.
[01:51:27.080 --> 01:51:31.000] Send your questions to info at the skepticsguide.org.
[01:51:31.000 --> 01:51:41.720] And if you would like to support the show and all the work that we do, go to patreon.com/slash skepticsguide and consider becoming a patron and becoming part of the SGU community.
[01:51:41.720 --> 01:51:45.080] Our listeners and supporters are what make SGU possible.
[01:51:52.760 --> 01:51:55.800] You don't have to wait for a milestone or holiday.
[01:51:55.800 --> 01:51:56.920] Celebrate now.
[01:51:56.920 --> 01:52:04.200] Take a pause from the everyday and allow yourself the gift of wellness and rejuvenation in a truly awe-inspiring setting.
[01:52:04.200 --> 01:52:12.200] Guided nature hikes, wellness classes, evening entertainment, and locally sourced cuisine, all included in your stay package.
[01:52:12.200 --> 01:52:18.400] Feel worlds away and experience for yourself why Mohonk Mountain House was voted number one resort.
[01:52:18.400 --> 01:52:23.040] Now is the perfect time to plan your getaway at mohonk.com.