Debug Information
Processing Details
- VTT File: skepticast2024-12-07.vtt
- Processing Time: September 09, 2025 at 08:40 AM
- Total Chunks: 2
- Transcript Length: 135,039 characters
- Caption Count: 1,417 captions
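The figures above (caption count, transcript length, chunk count) come from the VTT preprocessing step. The actual pipeline code is not included in this debug output, so the snippet below is only a minimal sketch of how such numbers could be derived; the file name is taken from the details above, while the function names and the 70,000-character chunk size are assumptions for illustration.

```python
# Hypothetical sketch of the VTT preprocessing step; names and chunk size are assumptions.
import re

def parse_vtt(path):
    """Return caption text lines from a WEBVTT file, skipping the header, cue ids, and timestamps."""
    captions = []
    timestamp = re.compile(r"^\d{2}:\d{2}:\d{2}\.\d{3} --> ")
    with open(path, encoding="utf-8") as f:
        for raw in f:
            line = raw.strip()
            if not line or line == "WEBVTT" or line.isdigit() or timestamp.match(line):
                continue
            captions.append(line)
    return captions

def chunk_text(text, max_chars=70_000):
    """Split the flattened transcript into roughly equal character-based chunks for prompting."""
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]

if __name__ == "__main__":
    captions = parse_vtt("skepticast2024-12-07.vtt")
    transcript = "\n".join(captions)
    chunks = chunk_text(transcript)
    print(f"Caption Count: {len(captions)}")        # reported above as 1,417
    print(f"Transcript Length: {len(transcript)}")  # reported above as 135,039
    print(f"Total Chunks: {len(chunks)}")           # reported above as 2
```

With a two-chunk split like the one reported above, each chunk would be sent with a context-setup prompt like Prompt 1 below before the extraction prompts follow.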
Prompts Used
Prompt 1: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 1 of 2 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
[00:00:00.320 --> 00:00:04.640] The Skeptic's Guide to the Universe is brought to you by Progressive Insurance.
[00:00:04.640 --> 00:00:08.960] Do you ever think about switching insurance companies to see if you could save some cash?
[00:00:08.960 --> 00:00:14.480] Progressive makes it easy to see if you could save when you bundle your home and auto policies.
[00:00:14.480 --> 00:00:17.120] Try it at progressive.com.
[00:00:17.120 --> 00:00:19.920] Progressive casualty insurance company and affiliates.
[00:00:19.920 --> 00:00:21.440] Potential savings will vary.
[00:00:21.440 --> 00:00:23.600] Not available in all states.
[00:00:27.120 --> 00:00:30.320] You're listening to the Skeptic's Guide to the Universe.
[00:00:30.320 --> 00:00:33.200] Your escape to reality.
[00:00:33.840 --> 00:00:36.800] Hello and welcome to the Skeptic's Guide to the Universe.
[00:00:36.800 --> 00:00:41.520] Today is Tuesday, December 3rd, 2024, and this is your host, Stephen Novella.
[00:00:41.520 --> 00:00:43.280] Joining me this week are Bob Novella.
[00:00:43.280 --> 00:00:43.840] Hey, everybody.
[00:00:43.840 --> 00:00:45.200] Kara Santa Maria.
[00:00:45.200 --> 00:00:45.600] Howdy.
[00:00:45.680 --> 00:00:46.720] Jay Novella.
[00:00:46.720 --> 00:00:47.360] Hey, guys.
[00:00:47.360 --> 00:00:49.120] And Evan Bernstein.
[00:00:49.120 --> 00:00:50.080] Good evening, everyone.
[00:00:50.080 --> 00:00:52.080] So, guys, we're going to DC this weekend.
[00:00:52.080 --> 00:00:59.120] When the show comes out, we will be basically in the middle of our private show in D.C., and then we have the extravaganza.
[00:00:59.120 --> 00:01:06.240] And it's going to be a little tricky because, as some of you already know, I fell and cracked a rib last week.
[00:01:06.240 --> 00:01:08.240] Oh, Steve.
[00:01:08.720 --> 00:01:09.920] Don't do that, by the way.
[00:01:09.920 --> 00:01:13.360] It's like I recommend you do not fracture a rib.
[00:01:13.360 --> 00:01:14.800] It's not what happened.
[00:01:14.800 --> 00:01:15.360] I hear that hurt.
[00:01:16.320 --> 00:01:17.120] It was nothing.
[00:01:17.360 --> 00:01:18.480] It was completely stupid.
[00:01:18.480 --> 00:01:29.040] I was, you know, outside at night at my mother-in-law's place, and I tripped on a stone patio thing, fell on my right side, and that's it.
[00:01:29.520 --> 00:01:30.240] You were drunk.
[00:01:30.240 --> 00:01:30.880] Come on.
[00:01:33.680 --> 00:01:37.520] So, yeah, unfortunately, one of my ribs took the majority of the impact.
[00:01:37.520 --> 00:01:38.080] Oof.
[00:01:38.400 --> 00:01:41.520] So, how bad does it hurt for those of us who have not broken a rib?
[00:01:41.520 --> 00:01:42.960] So, it's not bad.
[00:01:42.960 --> 00:01:46.560] I haven't been x-rayed or anything.
[00:01:46.560 --> 00:01:51.680] I'm just diagnosing myself based upon symptoms, but it's probably a non-displaced fracture.
[00:01:51.680 --> 00:01:52.720] I could feel the rib.
[00:01:52.720 --> 00:01:54.000] Everything feels good.
[00:01:54.000 --> 00:01:58.360] Yes, I think I could take a deep breath without any problems.
[00:01:58.840 --> 00:01:59.880] That's the big damage anyway.
[00:02:00.040 --> 00:02:00.680] Yeah, the lungs.
[00:02:01.160 --> 00:02:01.640] That's huge.
[00:01:59.200 --> 00:02:03.240] Yeah, well, it's been a week now.
[00:02:05.480 --> 00:02:09.240] But it's just I am very restricted in my movements.
[00:02:09.240 --> 00:02:09.880] You know what I mean?
[00:02:09.880 --> 00:02:13.080] I can't bear any weight with my right arm, which is very limiting.
[00:02:13.080 --> 00:02:15.320] So you guys are going to have to do all my heavy lifting this weekend.
[00:02:15.800 --> 00:02:17.160] It's basically what I'm telling you.
[00:02:17.160 --> 00:02:18.920] No interpretive dance at all.
[00:02:18.920 --> 00:02:23.400] Yeah, like, so remember the last extravaganza where I did that Russian goose-stepping dance, whatever?
[00:02:23.720 --> 00:02:24.200] That ain't happening.
[00:02:24.280 --> 00:02:26.280] That's not happening this time.
[00:02:26.840 --> 00:02:28.680] No physicality for me.
[00:02:28.680 --> 00:02:29.080] Oh, no.
[00:02:29.320 --> 00:02:31.880] Yeah, you just gotta be.
[00:02:32.200 --> 00:02:33.880] Well, you'll be guessing the freeze frames.
[00:02:34.360 --> 00:02:37.560] Or, I mean, I could, as long as I'm vertical, I'm fine.
[00:02:37.560 --> 00:02:38.760] Like, I'm not going on the floor.
[00:02:38.760 --> 00:02:41.480] I'm not doing anything ambitious.
[00:02:41.640 --> 00:02:43.880] It really, just, it's very, it's just miserable.
[00:02:43.880 --> 00:02:44.600] It's, you know what I mean?
[00:02:44.600 --> 00:02:47.240] Because I have to get into very specific positions.
[00:02:47.240 --> 00:02:50.600] Like, there can't be any tension on my ribs, you know?
[00:02:50.840 --> 00:02:56.920] It's one of those core muscles, you know, one of those positional things where it's like try to put your back completely at rest.
[00:02:56.920 --> 00:02:59.000] Like, it's very, very difficult to do it.
[00:02:59.240 --> 00:03:00.440] Sleeping is terrible.
[00:03:01.560 --> 00:03:05.640] I guess you're going to try really hard not to get into a fist fight with someone at the show, right?
[00:03:05.720 --> 00:03:05.800] Yeah.
[00:03:05.880 --> 00:03:06.360] Some audience members.
[00:03:06.600 --> 00:03:07.400] Very, very hard.
[00:03:07.400 --> 00:03:10.920] I know usually I get into two or three fist fights with audience members.
[00:03:11.560 --> 00:03:12.520] I know, yeah.
[00:03:12.520 --> 00:03:14.680] Well, what the hell go for that right side?
[00:03:14.920 --> 00:03:17.640] And I can't really give like a full cough.
[00:03:17.640 --> 00:03:19.720] So I'm always doing these half coughs.
[00:03:20.200 --> 00:03:21.400] Oh, I hate those.
[00:03:23.960 --> 00:03:24.280] I know.
[00:03:24.280 --> 00:03:29.240] I got to, it's like just enough to because you're constantly clearing gunk out of your lungs.
[00:03:29.640 --> 00:03:30.040] I know.
[00:03:30.040 --> 00:03:31.400] Humans are disgusting.
[00:03:31.400 --> 00:03:38.600] That's part of the problem with anything that impairs your ability to cough is that you run the risk of the gunk getting backed up.
[00:03:38.840 --> 00:03:43.000] Well, Steve, you know, the trick that I was taught before.
[00:03:43.480 --> 00:03:45.520] Yeah, is to push into a pillow.
[00:03:45.520 --> 00:03:47.120] So after my hysterectomy, I had to.
[00:03:47.120 --> 00:03:48.640] I mean, you still have to cough sometimes.
[00:03:48.640 --> 00:03:49.280] It's brutal.
[00:03:45.000 --> 00:03:51.600] Yeah, but this isn't an abdominal problem.
[00:03:51.840 --> 00:03:53.360] So that helps with the abdominal.
[00:03:53.360 --> 00:03:54.160] It's higher.
[00:03:54.480 --> 00:03:57.040] It's and on the side.
[00:03:57.040 --> 00:04:01.920] And it's like try breathing or coughing without moving your ribs, you know.
[00:04:01.920 --> 00:04:04.720] Yeah, but I mean, you can't breathe or cough without moving your abs either.
[00:04:04.720 --> 00:04:05.120] I know.
[00:04:05.120 --> 00:04:05.520] I'm actually trying to get a lot of things.
[00:04:05.680 --> 00:04:06.480] You know, they're all applied.
[00:04:06.800 --> 00:04:11.040] I'm trying to do abdominal breathing rather than rib breathing.
[00:04:11.120 --> 00:04:11.680] That's good for you.
[00:04:12.000 --> 00:04:14.960] I teach my patients how to do that, like diaphragmatic breathing all the time.
[00:04:14.960 --> 00:04:15.520] Right, exactly.
[00:04:15.520 --> 00:04:16.640] It's good for your blood pressure.
[00:04:16.640 --> 00:04:17.600] Oh, is that right?
[00:04:20.080 --> 00:04:22.000] Well, I got some interesting news, guys.
[00:04:22.000 --> 00:04:22.480] Oh, yeah?
[00:04:22.800 --> 00:04:24.160] What'd you break, Jay?
[00:04:24.160 --> 00:04:26.400] We now have SGU dice.
[00:04:26.400 --> 00:04:26.960] Ooh.
[00:04:27.600 --> 00:04:27.840] Yeah.
[00:04:28.000 --> 00:04:28.560] Can't wait to see.
[00:04:28.720 --> 00:04:30.080] Not what I was expecting you to say.
[00:04:30.080 --> 00:04:40.640] Yeah, so what happened was Brian Wecht, you know, he and I talk all the time, and he's constantly, like, he's such a good friend that he is constantly throwing me ideas, whatever.
[00:04:40.640 --> 00:04:43.040] It could be anything from show ideas to whatever.
[00:04:43.040 --> 00:04:45.840] One day he said, hey, you guys ever sell dice?
[00:04:45.840 --> 00:04:46.560] I'm like, no.
[00:04:46.560 --> 00:04:47.440] And I'm like, why?
[00:04:47.440 --> 00:04:51.520] I'm like, you think people that listen to a skeptical podcast would want to buy dice?
[00:04:51.520 --> 00:04:56.720] And he goes, he goes, look, I'm in an 80s rock band, and we sold a ton of dice.
[00:04:56.720 --> 00:04:57.440] You should try to do it.
[00:04:57.600 --> 00:05:00.000] I think he said they were his best-selling swag item.
[00:05:00.000 --> 00:05:00.400] Yeah.
[00:05:00.400 --> 00:05:00.720] He did.
[00:05:01.120 --> 00:05:02.480] Yeah, so then I'm like, all right.
[00:05:02.480 --> 00:05:06.880] I mean, look, because I always like it when I like the swag that we buy.
[00:05:07.280 --> 00:05:14.320] Typically, I like everything because I only buy things that are high quality and like something that I would want or buy because that's a good marker for me.
[00:05:14.320 --> 00:05:15.120] A Jay test.
[00:05:15.120 --> 00:05:19.760] So, you know, I scrubbed the internet and I found a manufacturer that I liked.
[00:05:19.760 --> 00:05:22.640] I mean, I really dig this company that we're buying dice from.
[00:05:22.640 --> 00:05:23.120] And that's it.
[00:05:23.120 --> 00:05:23.400] Steve.
[00:05:23.400 --> 00:05:25.840] And I picked out a couple of styles and everything.
[00:05:25.840 --> 00:05:30.440] And so they're going to be at the DC show with us, which is, you know, people can buy those there if they want them.
[00:05:30.440 --> 00:05:32.040] They'll have our logo on them, obviously.
[00:05:29.680 --> 00:05:35.800] Yeah, they have the 20, the D20 on the 20 has an SGU logo.
[00:05:35.960 --> 00:05:36.920] Woohoo!
[00:05:36.920 --> 00:05:38.280] Yeah, we got a bunch of new swag for this.
[00:05:38.520 --> 00:05:40.680] Yeah, yeah, we just reloaded our swag.
[00:05:40.680 --> 00:05:44.120] It was a good time to buy it because lots of things were on sale because of the holiday.
[00:05:44.280 --> 00:05:46.200] That was basically the impetus right there.
[00:05:46.200 --> 00:05:49.480] Kara, have you ever rolled a 20-sided die before?
[00:05:50.040 --> 00:05:51.240] Like an icosahedron?
[00:05:51.560 --> 00:05:54.840] How many sides does the die from Scattergories have?
[00:05:54.840 --> 00:05:56.440] It doesn't have 20 letters on it.
[00:05:56.440 --> 00:05:57.880] I'm about to find out.
[00:05:58.120 --> 00:06:00.040] Scattergories die is a 20-sided die.
[00:06:00.040 --> 00:06:00.520] Then, yes, I have to do it.
[00:06:00.680 --> 00:06:02.520] So you have rolled shot.
[00:06:03.000 --> 00:06:04.920] Well, anybody who's played Scattergories has.
[00:06:04.920 --> 00:06:05.480] They're fun.
[00:06:05.480 --> 00:06:06.280] Yeah, I think I have.
[00:06:06.280 --> 00:06:06.760] I've played it.
[00:06:06.760 --> 00:06:08.040] It's been a while, I guess.
[00:06:08.040 --> 00:06:11.080] You'll never forget your first D20, I promise you.
[00:06:11.400 --> 00:06:12.120] Totally forgot.
[00:06:12.120 --> 00:06:12.520] That's why I have a lot of fun.
[00:06:12.760 --> 00:06:13.800] I still have mine.
[00:06:13.800 --> 00:06:14.840] I was 10 years old.
[00:06:14.840 --> 00:06:15.320] I still have mine.
[00:06:15.640 --> 00:06:18.280] Kara, now you can have an SGU D20.
[00:06:18.280 --> 00:06:18.680] Okay.
[00:06:19.400 --> 00:06:20.520] That's a keepsake.
[00:06:20.520 --> 00:06:21.000] Yeah.
[00:06:21.000 --> 00:06:24.280] All right, let's get on with some skeptical content here, Bob.
[00:06:24.440 --> 00:06:26.440] You're going to start us off with a quickie.
[00:06:26.440 --> 00:06:27.320] Thank you, Steve.
[00:06:27.320 --> 00:06:28.760] This is your quickie with Bob.
[00:06:28.760 --> 00:06:30.040] All right, how cool is this, guys?
[00:06:30.040 --> 00:06:37.160] Imagine putting something under a microscope and being able to easily manipulate its surface with atomic precision.
[00:06:37.160 --> 00:06:41.880] This is exactly what researchers claim in a study published in Applied Surface Science.
[00:06:41.880 --> 00:06:43.240] The lead researcher, Dr.
[00:06:43.560 --> 00:06:51.880] Motaba Moshkani, said, Our laser method provides atomic level control over diamond surfaces in a standard air environment.
[00:06:51.880 --> 00:06:56.280] This level of precision is typically only possible with large, complex vacuum equipment.
[00:06:56.280 --> 00:07:00.200] The ability to achieve it with a simple laser process is truly remarkable.
[00:07:00.200 --> 00:07:01.240] Remarkable?
[00:07:01.240 --> 00:07:03.320] So, yeah, this is really slick.
[00:07:03.320 --> 00:07:09.640] To do this, they use a deep UV light laser to precisely deliver pulses of light onto the diamond surface.
[00:07:09.640 --> 00:07:16.240] And these pulses trigger chemical reactions that can precisely remove carbon atoms from this top atomic layer.
[00:07:16.560 --> 00:07:27.680] Now, this could have tremendous impact on electronics, quantum computers, advanced manufacturing, specifically in materials where minor changes to the surface can have dramatic performance improvements.
[00:07:27.680 --> 00:07:36.640] For example, they showed, and MIT Lincoln Laboratory confirmed that this technique was able to increase diamond surface conductivity by a factor of seven.
[00:07:36.640 --> 00:07:40.560] They say, Professor Richard Mildren says, he's the team lead.
[00:07:40.560 --> 00:07:46.480] We were amazed that such a minor adjustment to the surface could yield such a substantial boost in conductivity.
[00:07:46.640 --> 00:07:50.080] Now, you may be thinking, yeah, but doing this atom by atom, right?
[00:07:50.080 --> 00:07:51.200] Isn't that how you would do it?
[00:07:51.200 --> 00:07:52.800] It would just take forever.
[00:07:52.800 --> 00:07:56.720] But no, this technique is precise and fast at the same time.
[00:07:57.040 --> 00:08:04.160] They said that they were able to remove 1% of a surface monolayer in 0.2 milliseconds.
[00:08:04.160 --> 00:08:10.160] Now, I don't know how big this surface monolayer was, but speed is not an issue with this technique, apparently.
[00:08:10.400 --> 00:08:10.640] Dr.
[00:08:10.640 --> 00:08:14.640] Moshkani said: We've shown that the process is both rapid and scalable.
[00:08:14.640 --> 00:08:19.920] It's a compelling option for industries requiring advanced material processing.
[00:08:19.920 --> 00:08:22.160] Professor Mildren said, This is just the beginning.
[00:08:22.160 --> 00:08:31.120] We're excited to explore how this technique can be optimized further to unlock the full potential of diamonds in electronics, quantum technologies, and beyond.
[00:08:31.120 --> 00:08:34.720] So, this is yet another technology that I will be following closely.
[00:08:34.720 --> 00:08:38.000] So, this has been your atomically precise quickie with Bob.
[00:08:38.000 --> 00:08:39.120] Back to you, Steve.
[00:08:39.120 --> 00:08:40.480] Thank you, Bob.
[00:08:40.480 --> 00:08:46.640] All right, Kara, this one's intriguing, and this is different than this cuts against what I would have thought.
[00:08:46.640 --> 00:08:51.120] So, tell me about whether or not humans have innate morality.
[00:08:51.120 --> 00:08:58.480] Well, this cuts against what the researchers thought as well, which is why this is kind of a really interesting story.
[00:08:58.480 --> 00:09:04.920] And in a way, it relates back to: I've got to ask, do you guys remember when we were at CSICon?
[00:09:04.920 --> 00:09:10.600] I know it feels like a long time ago to us now, but to our listeners, I think it doesn't because didn't that episode just air last year?
[00:09:11.080 --> 00:09:11.720] That's perfect.
[00:09:13.000 --> 00:09:13.800] Time travel.
[00:09:13.800 --> 00:09:14.680] Yeah, time travel.
[00:09:14.680 --> 00:09:32.200] But the story that I covered was about how to sort of keep researchers honest and how to ensure that what we are doing in the publication world is appropriately vetted.
[00:09:32.200 --> 00:09:40.680] And we talked about pre-registering studies and we talked about open access and we talked about having raw data available for other researchers to look at.
[00:09:40.680 --> 00:09:45.720] And I think in some ways, this study is a bit of a case study in being able to do that.
[00:09:45.720 --> 00:10:01.880] So, published in Developmental Science on November 26th, 2024 is a paper called Infant Social Evaluation of Helpers and Hinderers, a large-scale multi-lab coordinated replication study.
[00:10:01.880 --> 00:10:11.800] Researchers in this study looked at a seminal study that was published by Hamlin and colleagues in 2007.
[00:10:11.800 --> 00:10:40.040] And in that study, that a lot of people have cited, researchers found that forming social evaluations, as they put it, based on third-party interactions, that's like a very wonky way to say, observing a social engagement with another either human or toy or character, that within the first year of life, infants would choose a helper over a hinderer.
[00:10:40.040 --> 00:10:52.400] And they used a paradigm in which an object with eyes was trying to climb a hill, and another object with eyes either gave it a boost or prevented it from going up.
[00:10:52.400 --> 00:11:10.000] And using that paradigm, in this study from 2007, researchers showed, hey, look, even within the first year of life, based on their eye gazes, these infants are spending more time looking at the helpers, less time looking at the hinderers, therefore they're preferring the helpers over the hinderers.
[00:11:10.000 --> 00:11:15.280] Therefore, there must be some sort of innate decision-making around helping.
[00:11:15.280 --> 00:11:17.200] If we want to call that morality, we can.
[00:11:17.200 --> 00:11:18.480] The researchers don't.
[00:11:18.480 --> 00:11:20.400] Just all the write-ups do.
[00:11:21.040 --> 00:11:23.120] So, which is not uncommon, right?
[00:11:23.120 --> 00:11:29.680] So, this study said, okay, this is interesting, because since that 2007 study, a bunch of studies said, hey, look, we found the same thing.
[00:11:29.680 --> 00:11:31.600] But a bunch of other studies were like, not so fast.
[00:11:31.600 --> 00:11:34.000] I don't think we found the same thing, and I'm confused.
[00:11:34.000 --> 00:11:35.760] So they did something interesting.
[00:11:35.760 --> 00:11:41.360] They said, okay, we want to do a different kind of experiment altogether.
[00:11:41.360 --> 00:11:49.520] And in this study, they looked at a project across multiple labs in a really large scale.
[00:11:49.520 --> 00:11:59.840] Ultimately, they ended up seeing over a thousand infants across 40 different developmental psychology teams.
[00:11:59.840 --> 00:12:03.360] They ended up using about half of that data for good reason.
[00:12:03.360 --> 00:12:05.120] We can talk about that in a bit.
[00:12:05.120 --> 00:12:18.960] But they looked at over a thousand children between the ages of five and a half and ten and a half months across multiple labs, and they developed a paradigm that was based on that Hill paradigm that I described before.
[00:12:18.960 --> 00:12:23.360] So there are four different conditions in the helping and hindering paradigm.
[00:12:23.360 --> 00:12:26.240] There are social conditions and non-social conditions.
[00:12:26.240 --> 00:12:28.080] Two of them are social, two are non-social.
[00:12:28.080 --> 00:12:32.840] And they're helping or up conditions and hindering or down conditions.
[00:12:32.840 --> 00:12:35.160] Two of them are helping, two of them are hindering.
[00:12:29.920 --> 00:12:37.160] So we'll start with the helping up condition.
[00:12:37.480 --> 00:12:45.160] Imagine a little yellow triangle with eyes and a little red circle about the same size with eyes.
[00:12:45.160 --> 00:12:49.640] The helping condition, the yellow triangle, helps the red circle get up the hill.
[00:12:49.640 --> 00:12:51.160] The red circle can't go on its own.
[00:12:51.160 --> 00:12:55.720] The yellow triangle comes up and gives it a boost from underneath and then it makes it up the hill.
[00:12:55.720 --> 00:13:03.320] That same paradigm is shown in a non-social condition where the little red character doesn't have eyes.
[00:13:03.320 --> 00:13:05.800] So now it's just a ball rolling up a hill.
[00:13:05.800 --> 00:13:09.960] And the helper, the yellow triangle with eyes, moves that red ball up the hill.
[00:13:09.960 --> 00:13:11.320] So it's non-social, right?
[00:13:11.320 --> 00:13:13.960] It's moving an object, not another character.
[00:13:13.960 --> 00:13:24.520] In the other condition, the hindering condition, the red ball is trying to go up the hill, but a blue square comes from above it and pushes it down, preventing it from going up the hill.
[00:13:24.520 --> 00:13:32.600] So the helper helps it up from behind, the yellow triangle, the hinderer, the blue square, pushes it down from above.
[00:13:32.600 --> 00:13:37.960] And there's also the non-social version of that, where the red character doesn't have eyes anymore, and now it's just a red ball.
[00:13:37.960 --> 00:13:41.560] So the googly eyes make it social, if that makes sense.
[00:13:41.560 --> 00:13:43.160] Now these are infants, remember.
[00:13:43.160 --> 00:13:47.080] So they're bright-colored, very simple displays.
[00:13:47.080 --> 00:13:50.360] And they showed these different conditions to the infants.
[00:13:50.360 --> 00:13:56.360] Now, in the original study, they just looked at eye gazes, but in this replication study, they did something even more interesting.
[00:13:56.360 --> 00:14:09.640] They added a behavioral component where after the infants looked at these different conditions in randomized order, they then were shown a board with cutouts of the characters.
[00:14:09.640 --> 00:14:11.880] And they were asked to pick which character they liked.
[00:14:11.880 --> 00:14:17.280] They were looking for, researchers were, and this is based on a lot of research about working with infants.
[00:14:17.600 --> 00:14:24.480] They were looking to see which of the characters the infants grabbed for and also looked at at the same time.
[00:14:24.480 --> 00:14:25.360] Does that make sense?
[00:14:25.600 --> 00:14:25.920] Yeah.
[00:14:25.920 --> 00:14:27.040] So they needed to look at it.
[00:14:27.040 --> 00:14:28.640] They also needed to grab for it.
[00:14:28.640 --> 00:14:33.040] Now, a lot of them were thrown out because some infants just didn't pick one.
[00:14:33.040 --> 00:14:36.560] They either went back and forth between the two or they looked at one and grabbed the other.
[00:14:36.560 --> 00:14:40.400] And it was, you know, inconsistent which they were showing a preference for.
[00:14:40.400 --> 00:14:41.680] So they couldn't use that data.
[00:14:41.680 --> 00:14:50.240] In the situations where it was obvious which character the infants went for, what do you think they found?
[00:14:50.960 --> 00:14:51.680] The same.
[00:14:52.320 --> 00:14:52.800] It was the same.
[00:14:52.880 --> 00:14:53.600] Yeah, same.
[00:14:53.600 --> 00:14:54.480] Fine.
[00:14:54.480 --> 00:14:55.040] It was the same.
[00:14:55.840 --> 00:15:00.000] So overall, 49.34% of the infants chose the helpers.
[00:15:00.000 --> 00:15:00.320] Yeah.
[00:15:00.320 --> 00:15:02.240] Oh, and that was in the social condition.
[00:15:02.240 --> 00:15:09.440] 55.85% chose the helpers in the non-social condition.
[00:15:09.440 --> 00:15:12.560] And when they compared the two, there was no significant difference.
[00:15:12.560 --> 00:15:14.000] Helpers or hinderers.
[00:15:14.000 --> 00:15:16.960] It was no different from each other, no different from chance alone.
[00:15:16.960 --> 00:15:17.680] Coin flip.
[00:15:17.680 --> 00:15:21.440] And so the researchers were like, whoa, because that was not their hypothesis.
[00:15:22.080 --> 00:15:23.840] They were like, wait, we did this before.
[00:15:23.840 --> 00:15:27.520] We showed that there was this pro-social thing that appears to be innate.
[00:15:27.520 --> 00:15:34.000] Not only do infants go for the helpers, but we're also hoping that they go for the social group a little bit more too.
[00:15:34.000 --> 00:15:43.280] They did find, I think, a slight preference for the social group, but they didn't find any significant difference between the helpers and the hinderers.
[00:15:43.280 --> 00:15:46.160] So they then talk about why this might be the case.
[00:15:46.160 --> 00:15:51.360] They have some different ideas, but of course, this is in the discussion section of a study.
[00:15:51.360 --> 00:15:52.720] We just don't know.
[00:15:52.720 --> 00:15:54.960] We are speculating.
[00:15:54.960 --> 00:16:02.440] They are wondering if maybe this skill or this preference is present, just not this early.
[00:15:59.840 --> 00:16:04.920] Maybe it's going to develop after 10 and a half months.
[00:16:05.240 --> 00:16:10.040] Or maybe there is no innate preference at all.
[00:16:10.040 --> 00:16:17.160] Or maybe this study isn't tapping in to the best way to look at an innate preference.
[00:16:17.160 --> 00:16:30.200] Maybe they don't see a significant difference between the groups using the Hill paradigm, but maybe there are other sort of more naturalistic studies where they would see that pro-social or that quote moral behavior.
[00:16:30.200 --> 00:16:36.760] But what they like is that this was proof of concept for using these active measures so that the infants had a choice.
[00:16:36.760 --> 00:16:48.520] They got to choose the character at the end instead of the classic way that researchers study where they just kind of count how long an infant gazes at an object and then they infer that that means they're more interested in that object.
[00:16:48.520 --> 00:16:59.880] They're calling this a first of its kind study that shows proof of concept for using an active behavioral measure that is a manual choice in a large-scale multi-lab project studying infants.
[00:16:59.880 --> 00:17:08.440] Oh, and one thing I didn't mention that I should have is that every aspect of this study was pre-registered and all of the data is available open access.
[00:17:08.440 --> 00:17:09.000] That's good.
[00:17:09.800 --> 00:17:15.320] So they really did check a lot of those boxes because they were worried about the replication crisis.
[00:17:15.320 --> 00:17:30.760] They were worried that this original study, which has been cited so many times that so many people were taking as truth, was kind of showing, I guess we could say, ambiguous or maybe ambivalent, you know, like, sometimes it's supported, sometimes it's not.
[00:17:30.760 --> 00:17:31.800] It's kind of wiggly.
[00:17:31.800 --> 00:17:36.760] Okay, well, let's just look at it once and for all across multiple labs, across multiple countries.
[00:17:36.760 --> 00:17:38.840] We'll all do the exact same thing.
[00:17:38.840 --> 00:17:42.440] We'll all collaborate on this study, and we'll see if we find the same thing.
[00:17:42.440 --> 00:17:53.840] So, you know, they talk quite a bit in the paper about the difference between doing a meta-analysis and doing a large-scale multi-lab project and how there are different ways to approach it statistically.
[00:17:53.840 --> 00:17:59.440] And, you know, there are strengths and weaknesses of both, but it's a pretty robust paradigm.
[00:17:59.440 --> 00:18:07.840] And the researchers were surprised that they failed to replicate that study.
[00:18:07.840 --> 00:18:17.440] And I think it's also a great example of publishing negative results because just because the results are quote negative doesn't mean there's not a lot of really interesting stuff here.
[00:18:17.440 --> 00:18:19.120] Yeah, negative data is data.
[00:18:19.120 --> 00:18:19.440] Yeah.
[00:18:20.080 --> 00:18:20.880] It tells us a lot.
[00:18:20.880 --> 00:18:21.760] Yeah, this is interesting.
[00:18:21.760 --> 00:18:27.760] I mean, one interpretation of this is that, well, maybe the effect went away because it was all p-hacking to begin with.
[00:18:27.760 --> 00:18:31.440] And then when they control for p-hacking, it goes away.
[00:18:31.440 --> 00:18:33.120] That's one interpretation.
[00:18:33.280 --> 00:18:36.960] The other one is that it's just not a good paradigm for what they're looking for.
[00:18:37.360 --> 00:18:39.280] As we say, everything's a construct, right?
[00:18:39.280 --> 00:18:42.960] You can't, we don't know that this is a marker of morality in kids.
[00:18:42.960 --> 00:18:46.320] It's just one way that they're choosing to look at it.
[00:18:46.320 --> 00:18:48.320] And it could indicate that.
[00:18:48.320 --> 00:18:52.400] Because you could, the thing is, it's like one of those things where you could rationalize either way.
[00:18:52.400 --> 00:19:01.920] Are kids more interested in the pro-social character because they're drawn to it, or are they more interested in the anti-social character because that's more fascinating?
[00:19:01.920 --> 00:19:03.760] And it's almost like a morbid curiosity.
[00:19:03.760 --> 00:19:06.080] It's like, there's something wrong with this character.
[00:19:06.080 --> 00:19:12.640] What's going on with this thing that he's being you know, so we can't tell what's going on in the minds of these infants.
[00:19:12.640 --> 00:19:14.640] Obviously, they can't talk.
[00:19:14.960 --> 00:19:21.840] And then the other thing I was thinking when I was reading this was like, well, you can't make statements like there is no evidence for innate morality.
[00:19:21.840 --> 00:19:24.240] I mean, that I think is massively overcalling this.
[00:19:24.280 --> 00:19:29.760] But as you say, you know, it could just be, well, maybe it doesn't really manifest until two years old, you know?
[00:19:29.880 --> 00:19:31.240] Yeah, and they talk about that too.
[00:19:29.760 --> 00:19:31.720] Yeah.
[00:19:29.840 --> 00:19:33.960] The brain is still maturing at this point.
[00:19:34.840 --> 00:19:39.720] That could be one of those sophisticated frontal lobe things that doesn't kick in for a while, you know?
[00:19:40.040 --> 00:19:40.680] Absolutely.
[00:19:40.680 --> 00:19:49.160] They say, you know, that their first interpretation is that infants between five and ten months of age don't prefer pro-social characters over antisocial characters after all.
[00:19:49.160 --> 00:19:54.760] And then they talk about why those results could have been different, as you mentioned, p-hacking being one of them.
[00:19:54.760 --> 00:20:02.360] A second interpretation they say is that infants don't prefer helpers in the Hill paradigm, but maybe would prefer them in some other context.
[00:20:02.680 --> 00:20:14.280] And they talk about a lot of the studies that have been published, including a meta-analysis, that did find a preference for agents performing different kinds of pro-social actions.
[00:20:14.280 --> 00:20:35.960] And so they say for those reasons, it seems plausible that even if infants don't prefer helpers over hinderers in the Hill paradigm by 10 and a half months, they might nonetheless prefer, I'm quoting them directly, helpers or pro-social agents more generally in other scenarios, perhaps when the intentions are overt, or perhaps it's going to happen, you know, later in development.
[00:20:35.960 --> 00:20:49.160] But to sit on the assumption that infants between the ages of five and a half and ten and a half months are intrinsically moral versus the assumption that they are intrinsically not moral.
[00:20:49.160 --> 00:20:52.680] Neither of those things is really borne out by the literature.
[00:20:52.680 --> 00:21:00.120] And so we have to be really careful when we interpret these studies to interpret them based on what they actually tell us.
[00:21:00.120 --> 00:21:05.400] And I think that's a big difficulty in psychology because we're always working with constructs.
[00:21:05.400 --> 00:21:08.280] There's also different types of morality.
[00:21:08.280 --> 00:21:11.320] I mean, there's justice and fairness, right?
[00:21:11.320 --> 00:21:14.520] This is more pro-social versus antisocial behavior.
[00:21:14.520 --> 00:21:16.960] So, again, you can't make sweeping statements about.
[00:21:17.040 --> 00:21:18.560] Yeah, and they, you know, they're very careful.
[00:21:18.560 --> 00:21:24.000] They do talk about morality in the discussion, but they don't talk about it very often when they're talking through the study.
[00:21:24.000 --> 00:21:29.760] They're saying helpers, hinderers, helpers, social, pro-social, non-social.
[00:21:29.760 --> 00:21:30.720] Internet.
[00:21:30.720 --> 00:21:30.960] All right.
[00:21:30.960 --> 00:21:32.080] Thanks, Kara.
[00:21:32.080 --> 00:21:34.320] Jay, how's our space station doing?
[00:21:34.320 --> 00:21:38.160] Well, on a scale of one to ten, ten is brand new.
[00:21:38.160 --> 00:21:39.520] It's probably at a two.
[00:21:39.760 --> 00:21:43.920] Yeah, there's some significant issues with it.
[00:21:43.920 --> 00:21:47.440] I mean, let's face it, it's lasted longer than they expected it to.
[00:21:47.440 --> 00:21:48.720] But let me give you the update.
[00:21:48.720 --> 00:21:57.840] So, first of all, you know, the International Space Station, it's been around since 1998, which means that they started building the components for it, you know, many, many years before that.
[00:21:57.840 --> 00:22:04.560] So it's been in low Earth orbit for 26 years, and, you know, unfortunately, it's really showing its age now.
[00:22:04.880 --> 00:22:08.480] And when they initially launched it, it started with just one module.
[00:22:08.480 --> 00:22:09.920] Do you guys know what the name of that was?
[00:22:09.920 --> 00:22:10.720] Freedom.
[00:22:10.720 --> 00:22:12.240] It's a Russian module.
[00:22:12.240 --> 00:22:12.800] Oh, wow.
[00:22:12.800 --> 00:22:13.280] Oh.
[00:22:14.080 --> 00:22:15.360] It was a Zarya.
[00:22:15.360 --> 00:22:16.880] Z-A-R-Y-A.
[00:22:16.880 --> 00:22:17.920] Zarya.
[00:22:17.920 --> 00:22:21.600] Yeah, the ISS today has 43 module facilities.
[00:22:21.600 --> 00:22:22.560] And there's a lot of different.
[00:22:22.880 --> 00:22:23.760] I mean, it's really cool.
[00:22:23.760 --> 00:22:26.640] They have so many different sections that do different things.
[00:22:26.640 --> 00:22:34.560] They have airlocks, robotic arms, power, life support, communication, habitation modules, research facilities.
[00:22:34.880 --> 00:22:37.440] There's a lot of different modules up there.
[00:22:37.440 --> 00:22:40.080] Very complicated and unbelievably expensive.
[00:22:40.080 --> 00:22:45.360] There's been an unprecedented and I think very noteworthy collaboration between nations.
[00:22:45.360 --> 00:22:55.360] We have the United States, Russia, European nations, Japan, Canada, Italy, Brazil, all being major contributors to the space station.
[00:22:55.360 --> 00:23:06.600] But, you know, like I said, it's on limited time today, and most of the partner countries I listed, they plan to retire the station on or around 2030, which is coming up pretty quick.
[00:23:06.920 --> 00:23:13.640] And Russia is also now saying that it might even go earlier for them and they might want to withdraw by 2028.
[00:23:13.640 --> 00:23:16.120] Yeah, they've been talking about that for many years now.
[00:23:16.120 --> 00:23:19.800] Yeah, it's interesting to try to work out the details of what that actually means.
[00:23:19.800 --> 00:23:21.880] If they pull out, like, what does that actually mean?
[00:23:21.880 --> 00:23:25.000] You know, is it like they're not going to give any financial support or what?
[00:23:25.000 --> 00:23:32.520] I think they'll no longer be responsible for the upgrades or the maintenance of their portion of the station.
[00:23:32.520 --> 00:23:33.880] Yeah, okay, that makes sense.
[00:23:33.880 --> 00:23:43.960] So NASA has already instructed SpaceX to design a deorbiting spacecraft that will guide the ISS safely into Earth's atmosphere.
[00:23:43.960 --> 00:23:45.880] You know, like it's a big deal.
[00:23:45.880 --> 00:23:50.280] This is a really big deal to deorbit that thing and to have it land in the ocean.
[00:23:50.280 --> 00:23:52.600] You know, they want a controlled descent.
[00:23:52.600 --> 00:23:54.760] And of course, this thing can't land in a city.
[00:23:54.760 --> 00:23:57.000] It would do an amazing amount of damage.
[00:23:57.000 --> 00:24:00.040] You know, it could be very dangerous.
[00:24:00.520 --> 00:24:03.400] I think that they are totally up to the task.
[00:24:03.400 --> 00:24:08.040] And I think, you know, NASA and SpaceX know exactly what to do.
[00:24:08.040 --> 00:24:09.560] But it's going to be expensive.
[00:24:09.560 --> 00:24:11.160] But what kind of a show that will be, guys?
[00:24:11.480 --> 00:24:15.080] I mean, if it happens at night or during the day, you're going to see it.
[00:24:17.160 --> 00:24:22.280] If you're in line of sight, you're going to see a huge, huge show.
[00:24:22.920 --> 00:24:25.640] That's going to burn up and really be visible.
[00:24:25.640 --> 00:24:33.080] And even worse, guys, there's issues that might accelerate this 2028 or 2030 timeline.
[00:24:33.080 --> 00:24:38.440] There's been these persistent air leaks in the Russian Zvezda module.
[00:24:38.680 --> 00:24:43.720] It's really got some serious air leaks, and they were first discovered, you know, back in 2019.
[00:24:43.720 --> 00:24:45.360] That seems like a long time ago.
[00:24:44.680 --> 00:24:49.840] So they've escalated this situation to a very critical level.
[00:24:50.160 --> 00:24:59.120] The leaks are now rated five out of five on NASA's danger scale, and it is the number one top safety concern that's going on, and it is significant.
[00:24:59.120 --> 00:25:13.280] The ISS is losing about 3.7 pounds of air per day, and that costs NASA about $4,000 a day, about $1.6 million a year to actually replace the atmosphere because it has to be brought up there.
[00:25:13.280 --> 00:25:13.600] Really?
[00:25:14.240 --> 00:25:23.120] And these figures, they're estimates, but they show clearly that there's a growing operational burden here of maintaining the station.
[00:25:23.120 --> 00:25:25.200] That's not the only problem that they have.
[00:25:25.200 --> 00:25:31.600] It is serious enough where the astronauts were told by NASA to spend more time in the American section of the station.
[00:25:31.600 --> 00:25:33.280] Probably having fewer problems, yeah.
[00:25:33.280 --> 00:25:33.840] Yeah, right.
[00:25:33.840 --> 00:25:40.400] So to prepare for a worst-case scenario, SpaceX is developing an emergency evacuation plan for the crew.
[00:25:40.400 --> 00:25:47.520] I would imagine that that is some type of craft that has, you know, that'll be attached and will be ready to go at a moment's notice.
[00:25:47.520 --> 00:25:52.560] The structural issues on the station have made it a top priority to replace the ISS.
[00:25:52.560 --> 00:25:58.560] NASA already turned to the private sector, which is a really good idea because there's a ton of money in the private sector.
[00:25:58.960 --> 00:26:04.000] They've funded multiple initiatives to help keep a human presence in low Earth orbit.
[00:26:04.000 --> 00:26:05.120] I'm totally for this.
[00:26:05.120 --> 00:26:09.200] I'm sure most of us science enthusiasts, we want people up there.
[00:26:09.200 --> 00:26:12.320] There's a lot of significant science that's being done there.
[00:26:12.320 --> 00:26:14.320] It also is just human nature.
[00:26:14.320 --> 00:26:20.720] I think we have to respect the fact that we're explorers, and you know, it is the final frontier.
[00:26:20.720 --> 00:26:25.440] There are four major projects, and I think you'll find these interesting if you haven't heard of them.
[00:26:25.440 --> 00:26:35.000] One of them is Vast Space, and they want to build their own independent station to go in low-earth orbit, and it's going to focus on modular and scalable designs.
[00:26:35.320 --> 00:26:37.880] I'm sure most of you have heard of Axiom Space.
[00:26:37.880 --> 00:26:45.720] They plan to attach their modular station to the ISS initially, and then over time, it'll detach and function as a standalone station.
[00:26:45.720 --> 00:26:52.760] So, it might pull some of the modules with it, and then they'll slowly get rid of them as they bring up their own new modules.
[00:26:52.760 --> 00:27:00.920] There's a coalition led by Blue Origin, Sierra Space, Boeing, and Redwire, and they're developing something called Orbital Reef.
[00:27:00.920 --> 00:27:05.720] This is a commercial station marketed as a mixed-use business park in space.
[00:27:05.720 --> 00:27:07.640] I think that one is really interesting.
[00:27:07.640 --> 00:27:09.640] And then the last one is Voyager Space.
[00:27:09.640 --> 00:27:16.760] They're partnering with Airbus, Northrop Grumman, and Hilton Hotels, which might sound odd, but yeah, they're in the game.
[00:27:16.760 --> 00:27:22.680] And they're working on Starlab, which is a futuristic station with scientific and commercial applications, right?
[00:27:23.000 --> 00:27:25.960] I think it would be wonderful if more than one of these made it.
[00:27:25.960 --> 00:27:30.120] As much as this is a Herculean effort, it's costing an incredible amount of money.
[00:27:30.120 --> 00:27:33.080] You know, NASA is trying to help these four projects.
[00:27:33.080 --> 00:27:37.400] I think it's possible that we could have more than one, you know, within the next 10 years.
[00:27:37.400 --> 00:27:44.760] The experts have outlined that they have significant doubts about whether any of the replacements will be operational by the 2030 deadline.
[00:27:44.760 --> 00:27:51.160] That might be the case, but that doesn't mean that they're not going to end up getting up there eventually.
[00:27:51.160 --> 00:27:55.240] Axiom Space, they faced financial issues.
[00:27:55.240 --> 00:28:00.040] They had to delay payments to suppliers, and they were struggling to pay their employees.
[00:28:00.200 --> 00:28:05.720] That's, you know, a situation that could kill a project very easily.
[00:28:05.720 --> 00:28:09.720] And NASA, like I said, is supporting them as best they can.
[00:28:09.720 --> 00:29:46.120] They want them to keep going, of course, so NASA's going to give them as much money as they can to keep them going. NASA also plans to award additional development contracts in 2026, and they've already invested over $500 million to help companies refine their designs, build their prototypes, and just keep those projects breathing air. So in the short term, NASA is probably going to need to implement some temporary fixes; I think that's pretty obvious. Those could involve sealing off the Zvezda transfer tunnel, just closing it down and making it so no one can go into that module anymore, because that is where the leaks are the most severe. Abandoning it would be a difficult thing, since it's a needed module, but what else can they do? These aren't ideal situations and they don't have ideal solutions, but they could extend the station's lifespan long enough to bridge the gap. We want there to be a continuous space station for a lot of reasons. It would be great if they could move a module over to one of the new space stations and then move all the gear and all the expensive stuff; that's a good way to transport things from one station to the other. We don't want to have to pull the entire space station down with everything in it. That would be a really unfortunate reality if it happens. Just from a science perspective, I think it's worth it, because we can definitely run tests and do things in outer space. NASA does pass on a lot of its technology to the private sector for free in the United States.
[00:29:46.120 --> 00:29:48.360] It'll just hand companies, you know, here you go.
[00:29:48.360 --> 00:29:55.240] This is, you know, a product line that you could sell, you know, things like Velcro, right?
[00:29:55.240 --> 00:30:01.240] Like, you know, they were created, you know, duct tape, like things you wouldn't imagine, but very useful stuff.
[00:30:01.240 --> 00:30:03.000] And, you know, it does boost the U.S.
[00:30:03.000 --> 00:30:03.480] economy.
[00:30:03.480 --> 00:30:04.600] It's not insignificant.
[00:30:04.600 --> 00:30:08.520] So I do think it's definitely worth every penny we spend on it.
[00:30:08.520 --> 00:30:12.280] So, but the future is, guys, it's all private investors.
[00:30:12.280 --> 00:30:21.160] And, you know, there has to be a commercial component now to any of these endeavors going on in the future because, you know, single governments can't afford to do this.
[00:30:21.160 --> 00:30:24.440] So there has to be some type of private industry involved.
[00:30:24.440 --> 00:30:27.240] And I think that's great, you know, because there's a ton of money out there.
[00:30:27.240 --> 00:30:29.560] There's a lot of companies, you know, look at like Hilton Hotels.
[00:30:29.560 --> 00:30:31.720] They have a ton of money and they want to get in it.
[00:30:31.720 --> 00:30:32.120] Great.
[00:30:32.440 --> 00:30:33.720] Why wouldn't we encourage that?
[00:30:33.720 --> 00:30:39.960] I think it's an awesome collaboration between private companies and NASA to do this.
[00:30:39.960 --> 00:30:43.880] Yeah, I'm interested in the Voyager space station or Voyager station.
[00:30:43.880 --> 00:30:47.400] This is the first one that is in the works.
[00:30:47.400 --> 00:30:52.840] It's the one that's planned to have a rotating wheel design that would provide artificial gravity.
[00:30:52.840 --> 00:30:53.800] Ooh, hello.
[00:30:53.800 --> 00:30:58.040] But they said, hey, we're going to start construction in 2026, but they don't have funding yet.
[00:30:59.080 --> 00:31:02.840] Until they have funding, or at least they haven't announced any funding.
[00:31:02.840 --> 00:31:05.800] So until that happens, naming a date is worthless.
[00:31:05.800 --> 00:31:06.520] You need a kickstart.
[00:31:06.600 --> 00:31:10.680] I'm skeptical that they're going to try to incorporate artificial gravity in that way.
[00:31:11.800 --> 00:31:14.280] That's the design for the Voyager station.
[00:31:14.280 --> 00:31:15.400] That's the whole point.
[00:31:15.400 --> 00:31:18.840] They want it to be a space hotel, basically.
[00:31:18.840 --> 00:31:19.880] Yeah, a little 2001.
[00:31:20.200 --> 00:31:24.680] And Ev, if you look at the pictures, man, it does kind of look like that.
[00:31:25.480 --> 00:31:29.640] These are renderings, and I'm sure that there are real plans out there.
[00:31:29.640 --> 00:31:34.760] I mean, it's not like they're just sitting around a room and they're like looking at science fiction drawings.
[00:31:35.160 --> 00:31:36.840] They're probably thinking, oh, that looks cool.
[00:31:36.840 --> 00:31:37.880] They know what they want.
[00:31:37.880 --> 00:31:39.000] They know what it's going to be.
[00:31:39.000 --> 00:31:42.600] But all of the artwork that I've seen, it does have that vibe.
[00:31:42.600 --> 00:31:47.520] And man, could you imagine if they're shuttling people up to a space hotel and that becomes commonplace?
[00:31:47.520 --> 00:31:48.640] That's amazing.
[00:31:48.640 --> 00:31:50.240] I still wouldn't go.
[00:31:44.760 --> 00:31:52.000] Not in a million years.
[00:31:52.320 --> 00:31:56.240] No, it'll happen, but I'll be long dead.
[00:31:56.240 --> 00:31:56.960] Yep.
[00:31:56.960 --> 00:31:57.520] Oh.
[00:31:57.840 --> 00:31:58.080] All right.
[00:31:58.080 --> 00:31:59.600] Well, Bob's a freshman.
[00:31:59.760 --> 00:32:01.440] That's pretty messy.
[00:32:02.400 --> 00:32:03.120] Bob.
[00:32:03.440 --> 00:32:05.680] This shit never pans out the way we want it to pan out.
[00:32:05.760 --> 00:32:06.160] It always doesn't.
[00:32:07.440 --> 00:32:08.560] Not in your life.
[00:32:08.720 --> 00:32:09.040] Forever.
[00:32:10.800 --> 00:32:11.200] All right.
[00:32:11.200 --> 00:32:11.760] Thanks, Jay.
[00:32:12.240 --> 00:32:12.800] Yep.
[00:32:13.040 --> 00:32:15.840] But let's switch topics before Bob gets too mad.
[00:32:15.840 --> 00:32:16.320] Yeah.
[00:32:16.880 --> 00:32:17.520] All right, Bob.
[00:32:17.520 --> 00:32:18.880] I'm going to make you feel better.
[00:32:18.880 --> 00:32:21.040] Global warming is even worse than we thought it was going to be.
[00:32:21.280 --> 00:32:21.680] Great.
[00:32:21.680 --> 00:32:22.080] There you go.
[00:32:22.240 --> 00:32:23.520] No surprise there.
[00:32:24.480 --> 00:32:28.480] There's a recent study looking at climate hotspots.
[00:32:29.120 --> 00:32:45.840] And so the idea here is they wanted to, they looked at a lot of data, obviously climate data, but regional temperature data to try to identify regions where there are more likely to be heat waves.
[00:32:45.840 --> 00:32:55.600] And they identified, they basically made a map of the world, identifying those regions, Northern Africa, Europe, Northwestern U.S.
[00:32:55.680 --> 00:33:04.080] and Canada, where there have been statistically outlier heat waves in the last 20 years.
[00:33:04.320 --> 00:33:07.040] Their conclusion is that a couple of things.
[00:33:07.040 --> 00:33:17.200] Well, first of all, statistically speaking, these are multiple standard deviations away from average temperatures, like what the models would predict.
[00:33:17.440 --> 00:33:30.760] You may remember, like, you know, in northwestern Canada, they had temperatures that were literally 50 degrees above the average temperature for that location in that time of year.
[00:33:29.760 --> 00:33:36.360] Just massively outside of any statistical distribution.
[00:33:36.680 --> 00:33:42.680] So that's one observation that they made, that there's these hot spots where we're seeing extreme heat waves.
[00:33:42.680 --> 00:33:46.920] And the second thing is that the models don't really predict this.
[00:33:46.920 --> 00:33:50.440] But it's not that the models, they don't contradict the models.
[00:33:50.440 --> 00:33:53.400] The models don't drill down to this level of detail.
[00:33:53.400 --> 00:33:59.640] The models are unable to tell us like what this regional, you know, these regional variations will be.
[00:33:59.640 --> 00:34:06.120] And so in other words, we can't predict these extreme weather events, and yet they're happening.
[00:34:06.120 --> 00:34:14.040] They're saying that basically what's happening is that as the temperature increases, the variability is also increasing.
[00:34:14.200 --> 00:34:16.920] So you're getting more extreme events.
[00:34:16.920 --> 00:34:20.440] That bell curve, if you will, is spreading out.
[00:34:20.440 --> 00:34:24.520] And this includes both hot and cold temperatures.
[00:34:24.520 --> 00:34:32.360] But of course, as the average temperature increases, it's worse at the hot end of that spectrum.
[00:34:32.360 --> 00:35:03.560] And what this could mean is that earlier than the models would have predicted, we could be seeing these hotspots around the world with increasing frequency of increasing heat waves, like heat waves that are significantly above the average temperatures and that are obviously spiking deaths due to heat waves and are making these regions, at least for the duration of the heat wave, they could be, like, if you don't have air conditioning, they're basically unlivable.
[00:35:03.560 --> 00:35:06.680] So, these are parts of the world where people don't traditionally have air conditioning, right?
[00:35:06.680 --> 00:35:08.920] Because it doesn't get hot enough to require air conditioning.
[00:35:08.920 --> 00:35:17.040] And now they're having heat waves, like 110, 120 degree temperatures, and they don't have air conditioning, you know, because they've never seen temperatures like this.
[00:35:14.920 --> 00:35:21.200] These are tens of degrees higher than anything they've ever seen before.
[00:35:21.520 --> 00:35:26.160] Obviously, the solution to this is to try to mitigate climate change as much as possible.
[00:35:26.160 --> 00:35:27.120] Oh, yeah, that's a good idea.
[00:35:27.120 --> 00:35:27.840] Yeah, let's try that.
[00:35:27.840 --> 00:35:31.520] But, you know, what it means, so the authors are recommending two things.
[00:35:31.760 --> 00:35:39.760] First, we need to figure out how to predict these regional changes, even though the models are not designed to do that.
[00:35:39.760 --> 00:35:40.800] We need new models, right?
[00:35:40.800 --> 00:35:43.200] We need new ways of trying to do this.
[00:35:43.200 --> 00:35:50.640] Now, here we're just looking back at what's happened, which is not necessarily the same thing as predicting what's going to happen in the future.
[00:35:50.800 --> 00:35:54.080] And that's because this is the difference between climate and weather, right?
[00:35:54.080 --> 00:35:57.120] We're starting to blur the lines here between climate and weather.
[00:35:57.120 --> 00:36:01.280] You know, the weather is much more difficult to predict.
[00:36:01.280 --> 00:36:19.120] We're not just looking at average global temperatures, which the models have been very good at predicting, but there are lots of complicated changes to the way weather is behaving on the world with increasing temperature, like increasing energy in the system.
[00:36:19.120 --> 00:36:29.360] I know we talked on the show that we aired last week about the great clot the cold blob and shutting down the conveyor belt, the ocean currents.
[00:36:29.840 --> 00:36:35.360] These things just become harder and harder to predict because we're dealing with a very complicated system.
[00:36:35.360 --> 00:36:50.240] So this could mean that we will be seeing extreme heat wave events earlier than the models predicted because regional variability is increasing.
[00:36:50.240 --> 00:36:57.040] It's like this is a separate phenomenon they're observing, this increasing, the tails are spreading out basically.
[00:36:57.040 --> 00:36:59.800] And that wasn't something that the models predicted.
[00:36:59.800 --> 00:37:00.600] But we're seeing it.
[00:37:00.600 --> 00:37:01.240] It's happening.
[00:36:59.360 --> 00:37:02.440] So that's not a good thing.
[00:37:02.760 --> 00:37:08.840] Well, imagine seeing such variation in a place that already occasionally sees like 120.
[00:37:08.840 --> 00:37:09.240] Yeah.
[00:37:09.240 --> 00:37:13.240] You know, imagine like, oh, oh, yeah, tomorrow it's going to be 140.
[00:37:13.240 --> 00:37:15.320] I wonder if it could get quite that extreme.
[00:37:15.560 --> 00:37:16.280] Yeah, yeah.
[00:37:16.280 --> 00:37:16.920] Oh, my God.
[00:37:16.920 --> 00:37:17.320] Wow.
[00:37:18.200 --> 00:37:28.360] We talked about this with Michael Mann, you know, at CSICon in the show that aired a few weeks ago: that the effects of climate change are not evenly distributed, right?
[00:37:28.360 --> 00:37:34.040] And the effects of our attempts at mitigating it won't necessarily be evenly distributed either.
[00:37:35.000 --> 00:37:40.280] There's going to be, you know, random winners and losers, you know, in this kind of scenario.
[00:37:40.280 --> 00:37:44.280] I think we're all losers, but more losers and less losers, I guess.
[00:37:44.280 --> 00:37:47.640] But no one is safe, I guess, is the other idea here.
[00:37:47.640 --> 00:37:51.640] Like, Canada, you know, had these ridiculous heat waves.
[00:37:51.640 --> 00:37:59.240] North Carolina, deep in the mountains, had flooding, like a place that would, nobody, there's no flooding infrastructure here.
[00:37:59.240 --> 00:38:03.320] They don't have floods in this region of the country.
[00:38:03.320 --> 00:38:13.640] But because of, you know, the increased moisture in the air, increased the rainfall due to Hurricane Helene, they had a massive flood that they weren't prepared for.
[00:38:13.640 --> 00:38:20.440] So it's not just that, oh, yeah, the hottest places on the earth are going to get even hotter and they're going to be feeling it first.
[00:38:20.440 --> 00:38:23.880] Like, yeah, that is true, but it's not the only thing that's true.
[00:38:23.880 --> 00:38:29.800] It's also going to be bringing extreme weather events to pretty much any part of the world.
[00:38:29.800 --> 00:38:32.680] And it's kind of unpredictable how that will happen.
[00:38:32.840 --> 00:38:36.040] It's not a simple extrapolation from what you're currently seeing.
[00:38:36.040 --> 00:38:37.720] Right, exactly.
[00:38:38.040 --> 00:38:43.000] Well, everyone, we're going to take a quick break from our show to talk about one of our sponsors this week, Aura Frames.
[00:38:43.000 --> 00:38:48.640] This year, guys, one of the best gifts that you can give someone is the Aura Digital Picture Frame.
[00:38:48.640 --> 00:38:52.400] Aura Frames are the number one digital photo frame by Wirecutter.
[00:38:52.400 --> 00:38:53.920] This is a really awesome platform.
[00:38:54.160 --> 00:38:55.360] It's incredibly smart.
[00:38:55.360 --> 00:38:56.560] It's easy to use.
[00:38:56.560 --> 00:39:00.960] You can upload an unlimited number of photos and videos from your phone to the frame.
[00:39:00.960 --> 00:39:09.920] Plus, you can order the frame online and you can preload it with the photos and videos that you want to share using the Aura app so it's ready to go right out of the box.
[00:39:09.920 --> 00:39:16.400] Yeah, I helped set up my personal Aura frame and my mom's, and it was just so ridiculously easy.
[00:39:16.400 --> 00:39:21.040] And it's really one of the best presents that I've ever given or received.
[00:39:21.040 --> 00:39:21.760] It truly is.
[00:39:21.760 --> 00:39:25.440] With the Aura Frame, you can upload your favorite photos and they are there.
[00:39:25.440 --> 00:39:28.480] It's really convenient, fantastic, and a lot of fun.
[00:39:28.720 --> 00:39:31.840] You'll be looking at it many times during the day, guaranteed.
[00:39:31.840 --> 00:39:43.440] So save on the perfect gift by visiting auraframes.com to get $35 off Aura's best-selling Carver Mat Frames by using promo code skeptics at checkout.
[00:39:43.440 --> 00:39:48.560] That's A-U-R-A Frames.com, promo code skeptics.
[00:39:48.560 --> 00:39:53.040] This deal is exclusive to listeners, so get yours now in time for the holidays.
[00:39:53.040 --> 00:39:54.800] Terms and conditions apply.
[00:39:54.800 --> 00:39:56.960] All right, guys, let's get back to the show.
[00:39:57.280 --> 00:40:02.240] Okay, Evan, tell us about Orcas wearing hats.
[00:40:02.480 --> 00:40:03.600] Yes.
[00:40:05.760 --> 00:40:08.000] You don't run across that headline every day.
[00:40:08.480 --> 00:40:11.760] Have you guys ever seen that nature documentary film?
[00:40:11.760 --> 00:40:14.160] It's called Star Trek IV: The Voyage Home?
[00:40:15.120 --> 00:40:22.240] Where Captain of Engineering Montgomery Scott famously exclaimed, Admiral, there be whales here.
[00:40:22.880 --> 00:40:24.160] Can you imagine if you said that?
[00:40:24.480 --> 00:40:27.360] And they're wearing salmons as hats.
[00:40:28.480 --> 00:40:31.480] Scotty, you've been hitting the brandy again, I think.
[00:40:31.480 --> 00:40:32.120] Sorry, I'm not sure.
[00:40:33.800 --> 00:40:36.120] That movie came out in 1986.
[00:40:36.440 --> 00:40:37.000] All right.
[00:40:37.000 --> 00:40:48.040] But in 1987, researchers, for the first time, had noticed that a population of orcas had begun swimming around with dead fish on their heads.
[00:40:48.040 --> 00:40:49.320] Salmons on their heads.
[00:40:49.320 --> 00:40:52.520] This was the first recorded observation of this behavior.
[00:40:52.920 --> 00:40:57.960] I have to set it up like this because I need to talk for just a moment about groups of whales, family groups.
[00:40:57.960 --> 00:40:59.960] They're called pods.
[00:41:00.280 --> 00:41:04.760] So a pod of whales is led by the oldest female, the matriarch.
[00:41:04.760 --> 00:41:11.080] And the matriarch passes down knowledge about hunting, migration routes, social interactions.
[00:41:11.080 --> 00:41:13.960] You know, they run the family, essentially.
[00:41:13.960 --> 00:41:20.760] And the offspring, both the male and female offspring, remain with their mothers for life, pretty much.
[00:41:20.760 --> 00:41:23.960] And that's a tight-knit family, really, really tight bond.
[00:41:23.960 --> 00:41:32.200] But pods can range from a small group of a few individuals to a large group of dozens or even in some cases, hundreds of whales.
[00:41:32.200 --> 00:41:38.600] And scientists will categorize these pods by letter, you know, A pod, B pod, and so forth.
[00:41:38.600 --> 00:41:40.040] And this is in 1987.
[00:41:40.040 --> 00:41:47.880] It was observed that there was a female from K pod who started wearing a dead salmon on her head.
[00:41:47.880 --> 00:41:48.440] Wow.
[00:41:48.440 --> 00:42:01.960] And then they noticed within a few weeks other pods, other individuals in other pods, pods J and pods L, they also started to wear fish hats as well, fish on their head.
[00:42:01.960 --> 00:42:13.480] And it was apparently something of a fad because that behavior soon afterwards, I guess, had stopped or had not been seen again until last month, November 2024.
[00:42:13.480 --> 00:42:14.040] Yep.
[00:42:14.040 --> 00:42:25.680] When one of the whales from J pod, the same pod, the same family, now a 32-year-old male, so that particular whale was not yet born in 1987, but regardless, it's from the same family.
[00:42:25.680 --> 00:42:33.920] So J27 Blackberry, 32-year-old male, was photographed exhibiting the same behavior of wearing the dead salmon on its head.
[00:42:33.920 --> 00:42:39.360] And this was at Point No Point, Washington, off Whidbey Island in Puget Sound.
[00:42:39.360 --> 00:42:44.400] And this took the observers once again, scientists, by surprise.
[00:42:44.400 --> 00:42:44.960] Yeah.
[00:42:45.600 --> 00:43:01.040] Andrew Foote, who is an evolutionary ecologist at the University of Oslo in Norway, said it does seem possible that some individuals that experienced the behavior for the first time or their family members may have started it again.
[00:43:01.360 --> 00:43:08.640] And then there's Deborah Giles, who's a science researcher and a director at the nonprofit Wild Orca.
[00:43:08.640 --> 00:43:19.200] She says, we've seen mammal-eating killer whales carry large chunks of food before under their pectoral fin, kind of tucked into their bodies.
[00:43:19.520 --> 00:43:26.480] But the phenomenon by which they will also wear it on their head could just be another form of this.
[00:43:26.480 --> 00:43:29.280] Like there's like storing food.
[00:43:29.280 --> 00:43:37.360] They have extra food, so they're, you know, just hanging on to it in a way until they're ready to eat it at some point later.
[00:43:37.680 --> 00:43:48.800] And this can happen because there will be times during the course of the year when there's an abundance of food for some of these pods, more than they can eat on a regular basis.
[00:43:48.800 --> 00:43:53.200] So they'll, you know, start storing it somewhere in and amongst their body.
[00:43:53.200 --> 00:44:02.680] So this could be, although they're not 100% sure, this could be that behavior, just another form of them kind of storing this food for a little while.
[00:44:03.080 --> 00:44:10.760] But it's fascinating, though, that it's almost a cultural trend, which is the way they've described it.
[00:44:11.480 --> 00:44:25.880] And again, one pod of whales will exhibit a certain behavior, and then other families will also learn one way or another from that pod, and they will also start exhibiting this behavior.
[00:44:25.880 --> 00:44:31.960] And they think that this happens when orcas will go after boats, right?
[00:44:31.960 --> 00:44:35.320] And attack or throw their bodies onto boats.
[00:44:35.320 --> 00:44:43.800] That one group will do it, another pod will see that, and they'll imitate or they'll go ahead and exhibit that behavior as well.
[00:44:44.920 --> 00:44:46.040] It's very fascinating.
[00:44:46.040 --> 00:44:48.840] It's very nuanced, and it's quite complex.
[00:44:48.840 --> 00:44:59.960] I mean, these are mammals that are just, you know, have been studied so long, and they have very complex social networks and collaborative behaviors, emotional and social bonds.
[00:44:59.960 --> 00:45:04.280] I mean, you know, not to be underestimated.
[00:45:04.280 --> 00:45:09.400] But, I mean, anytime you see a headline where whales wearing fish on their heads, I mean, forget it.
[00:45:09.400 --> 00:45:10.760] And the internet goes crazy.
[00:45:10.760 --> 00:45:16.280] And there's memes and there's songs and there's videos and a million people are talking about it all over the place.
[00:45:16.280 --> 00:45:24.200] You know, I know that this must be the most common question, but how the hell are they swimming while keeping a dead fish on their forehead?
[00:45:24.200 --> 00:45:36.120] Well, I guess if that's the medium you live in, things like that become easier than they would perhaps seem to people who don't live their existence in the water like that.
[00:45:36.280 --> 00:45:39.800] Just from the physics of it, though, I mean, you know, their skin is smooth.
[00:45:39.800 --> 00:45:42.360] You'd think that it would just slip right off their head, you know?
[00:45:42.600 --> 00:45:42.960] Yeah, you see.
[00:45:43.200 --> 00:45:45.440] If you swim against it, then you swim against it.
[00:45:44.520 --> 00:45:46.800] They'd be able to battle.
[00:45:47.120 --> 00:45:53.120] They're able to have an equilibrium, a balance of some sort in which they're able to.
[00:45:53.280 --> 00:45:55.440] Yeah, they don't have pockets, so they got to wear it.
[00:45:55.520 --> 00:45:56.800] They've got to put it somewhere.
[00:45:56.800 --> 00:46:00.240] Could this be the orcas playing with their food again?
[00:46:00.720 --> 00:46:10.800] You've seen the videos of orcas tossing seals like ragdolls, you know, 30, 40 feet up in the air and down.
[00:46:10.800 --> 00:46:13.040] Could this just be them like, ah, look what I killed.
[00:46:13.040 --> 00:46:14.720] I'm going to keep it here for a while.
[00:46:14.720 --> 00:46:16.560] Could that be just food play?
[00:46:16.800 --> 00:46:19.360] I mean, I suppose it could be.
[00:46:20.000 --> 00:46:21.600] They're looking into it more, Bob.
[00:46:22.400 --> 00:46:28.080] They don't seem to have, you know, there's not a lot of data on this particular behavior.
[00:46:28.080 --> 00:46:36.640] Obviously, you know, they saw it happen in 1987, and now it's 2024, and they're only seeing it again now, you know, observing it.
[00:46:36.640 --> 00:46:40.480] So they are still really trying to figure it out.
[00:46:40.480 --> 00:46:44.160] But yeah, it could be playfulness, I suppose.
[00:46:44.160 --> 00:46:59.200] But again, you know, I think perhaps Deborah Giles might be onto something when she observes that the whales will tuck dead fish into other parts of their bodies as well, and this could just be one more thing.
[00:46:59.280 --> 00:47:00.560] Well, they eventually eat it.
[00:47:00.880 --> 00:47:02.160] Yeah, eventually they will eat it.
[00:47:02.320 --> 00:47:02.480] Yeah.
[00:47:02.800 --> 00:47:03.120] Whatever.
[00:47:04.480 --> 00:47:06.080] Bob, you know what it reminds me of?
[00:47:06.640 --> 00:47:11.680] Look how I take this dead rat and turn it into a delightful hat.
[00:47:12.320 --> 00:47:13.840] Oh my God, is that Dr. Seuss?
[00:47:14.320 --> 00:47:16.000] That's The Nightmare Before Christmas.
[00:47:16.480 --> 00:47:17.680] Oh, gosh.
[00:47:17.680 --> 00:47:27.360] Do you guys remember one of the first, I think in the first year of our podcast, we talked about a story where dolphins were observed wearing sponges on their nose.
[00:47:27.360 --> 00:47:38.040] And they do that so that when they forage for fish down on the bottom, they protect their nose from coral and rocks or whatever on the sea floor.
[00:47:38.440 --> 00:47:40.120] Yeah, as they're foraging for fish.
[00:47:40.600 --> 00:47:42.760] Again, cetacean wearing other.
[00:47:43.080 --> 00:47:46.920] This is using it more as a tool, not just carrying it with them to eat later.
[00:47:46.920 --> 00:47:51.240] This is actually even more sophisticated than what the orcas are doing, arguably.
[00:47:51.880 --> 00:47:56.280] But yeah, not unusual behavior for these types of creatures.
[00:47:56.280 --> 00:47:58.600] They're very intelligent, obviously.
[00:47:58.600 --> 00:47:59.640] All right, Bob.
[00:47:59.640 --> 00:48:03.160] Bob, has there ever been water on the surface of Venus?
[00:48:03.160 --> 00:48:05.240] Ah, listen and find out.
[00:48:05.240 --> 00:48:10.680] Earth's twin planet Venus is a hellish place now, but has it always been inhospitable?
[00:48:10.680 --> 00:48:19.960] Some models say yes, some say no, but a new theory using clues from Venus's atmosphere points to a world that has always been evil and inhospitable.
[00:48:19.960 --> 00:48:22.440] This is from researchers at the University of Cambridge.
[00:48:22.440 --> 00:48:28.680] They published in the journal Nature Astronomy a paper called A Dry Venusian Interior Constrained by Atmospheric Chemistry.
[00:48:28.680 --> 00:48:29.880] Okay, what does that mean?
[00:48:29.880 --> 00:48:39.960] All right, so we know Venus is a nasty place, but it's so similar to Earth that it's often referred to as our twin or sister planet.
[00:48:39.960 --> 00:48:41.400] And there's lots of good reasons for that.
[00:48:41.880 --> 00:48:42.440] Same size.
[00:48:42.520 --> 00:48:47.640] Same size, like Venus is 95% of Earth's diameter, similar mass, similar density.
[00:48:47.640 --> 00:48:49.960] The internal structures are similar as well.
[00:48:49.960 --> 00:48:53.480] They both have a rocky mantle surrounding an iron core.
[00:48:53.480 --> 00:49:00.440] They formed at roughly the same time in the history of the solar system from similar materials in the inner solar system.
[00:49:00.760 --> 00:49:03.880] They both have volcanic activity now and in the past.
[00:49:03.880 --> 00:49:06.200] But Venus is many other things.
[00:49:06.200 --> 00:49:07.960] It's ridiculously hot.
[00:49:07.960 --> 00:49:16.800] Venus is the hottest planet in the solar system until, of course, you know, at some point Earth will beat those records the way we're going.
[00:49:17.600 --> 00:49:23.360] But it's hotter than Mercury, at 477 C, or about 890 Fahrenheit.
[00:49:23.360 --> 00:49:25.680] It would melt lead on the surface of Venus.
[00:49:26.240 --> 00:49:28.640] And the Soviets launched a Venera.
[00:49:30.160 --> 00:49:31.040] Is that the one?
[00:49:31.040 --> 00:49:33.600] It actually landed on the surface.
[00:49:33.600 --> 00:49:35.680] But what, like moments later?
[00:49:35.840 --> 00:49:36.960] It survived for minutes, I think.
[00:49:37.280 --> 00:49:37.680] Yeah, it was.
[00:49:37.840 --> 00:49:44.800] Yeah, you can get longer than minutes, but yeah, anything we land on there will not be lasting very long.
[00:49:45.040 --> 00:49:47.680] I heard, I saw one figure of maybe hours.
[00:49:47.680 --> 00:49:51.920] I'm not sure about that one, but yeah, you're not going to have anything that's going to last there.
[00:49:51.920 --> 00:49:55.840] You said Venera 13 lasted for 127 minutes.
[00:49:55.840 --> 00:49:56.720] Yeah, okay.
[00:49:56.720 --> 00:49:57.280] Wow.
[00:49:57.280 --> 00:49:58.400] So yeah, it's two hours.
[00:49:59.360 --> 00:50:02.560] But then it's not just the heat, it's the atmospheric pressure.
[00:50:02.560 --> 00:50:05.520] The atmosphere is notoriously thick on Venus.
[00:50:05.840 --> 00:50:10.320] So if Earth's atmospheric pressure at sea level is what, what's that called?
[00:50:10.320 --> 00:50:11.360] One bar, right?
[00:50:11.360 --> 00:50:15.360] That translates to 14.7 pounds per square inch.
[00:50:15.360 --> 00:50:18.080] Venus is not one bar, but 92 bar.
[00:50:18.320 --> 00:50:24.080] So at its surface, instead of 14.7 pounds per square inch, it's 1,350 pounds per square inch.
[00:50:25.360 --> 00:50:28.000] It's like a small car on every square inch of your body.
[00:50:28.640 --> 00:50:32.000] It's like going down a kilometer below the surface of the ocean.
[00:50:32.000 --> 00:50:35.280] That's the kind of pressure we're talking about on the surface of Venus.
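For reference, the pressure figures quoted above check out, using the rounded 14.7 psi per bar conversion:
\[ 92\ \text{bar} \times 14.7\ \tfrac{\text{psi}}{\text{bar}} \approx 1352\ \text{psi} \approx 1{,}350\ \text{psi} \]
That also squares with the kilometer-under-the-ocean comparison, since water pressure rises by roughly 1 bar per 10 meters of depth, putting 1 km at on the order of 100 bar.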
[00:50:35.280 --> 00:50:36.480] And it's also toxic.
[00:50:36.480 --> 00:50:37.840] Let's just throw that in there.
[00:50:37.840 --> 00:50:44.240] The atmosphere is mostly carbon dioxide, and there's thick clouds of sulfuric acid that shroud the entire planet.
[00:50:44.720 --> 00:50:48.400] The only saving grace is that the acid rain never even reaches the surface.
[00:50:48.400 --> 00:50:52.720] But then again, it doesn't reach the surface because it evaporates from all the intense heat.
[00:50:52.720 --> 00:50:55.200] It's so hot that it can't even get down to the surface.
[00:50:55.200 --> 00:50:56.720] It just evaporates away.
[00:50:56.720 --> 00:50:58.480] And even the winds are ridiculous.
[00:50:58.480 --> 00:51:04.040] 100 meters per second at some altitudes, and that's 60 times the planet's speed of rotation.
[00:50:59.680 --> 00:51:05.880] So, yeah, there's no Starbucks.
[00:51:06.440 --> 00:51:09.560] Yeah, hellish is like the perfect word for Venus.
[00:51:09.560 --> 00:51:10.840] It's nasty.
[00:51:10.840 --> 00:51:15.880] So, how could such initially similar twins diverge so dramatically?
[00:51:15.880 --> 00:51:22.200] To answer this, scientists have used climate modeling to tell us what Venus was like when it was younger.
[00:51:22.200 --> 00:51:24.920] So, two models have been duking it out for a long time.
[00:51:24.920 --> 00:51:36.920] One model contends that Venus was more temperate and Earth-like in the past, enough to even have liquid water on its surface, and probably even shallow seas, this specific model contends.
[00:51:36.920 --> 00:51:51.480] But planet-wide volcanic activity eventually, after maybe even a couple of billion years, eventually spewed enough carbon dioxide to cause the famous runaway greenhouse effect that made the planet hotter and hotter, eventually becoming what we see today.
[00:51:51.480 --> 00:51:52.680] So, that's one model.
[00:51:52.680 --> 00:51:56.440] The other model describes a dry Venus scenario.
[00:51:56.440 --> 00:52:11.800] In this one, Venus is close enough to the sun that its initial magma ocean, you know, when a planet has just formed and it's basically magma everywhere, takes a long time to cool, far longer than Earth's, and that desiccates the planet.
[00:52:11.800 --> 00:52:17.560] It removes most of the water, which was then never able to collect as liquid water on its surface.
[00:52:17.560 --> 00:52:19.320] So, those are the two models.
[00:52:19.320 --> 00:52:32.680] First author Tereza Constantinou, a PhD student at Cambridge's Institute of Astronomy, said both of those theories are based on climate models, but we want to take a different approach based on observations of Venus's current atmospheric chemistry.
[00:52:32.680 --> 00:52:35.000] So, that's what their paper is about.
[00:52:35.000 --> 00:52:43.160] And the goal was to study Venus's atmosphere to see how it might interact with the planet's interior, what that loop is like.
[00:52:43.160 --> 00:52:47.360] So, to do that, they looked at the chemicals in the atmosphere that were being destroyed.
[00:52:44.840 --> 00:52:51.920] And a couple of those, right off the bat, were carbon dioxide and water.
[00:52:52.240 --> 00:53:01.120] So for Venus to have a stable atmosphere, it's going to have to replace the water and carbon that were getting destroyed by various atmospheric processes.
[00:53:01.120 --> 00:53:02.480] So those chemicals need to be replaced.
[00:53:02.480 --> 00:53:09.840] So they looked at volcanic outgassing on Venus, because that's something that could replace some of these lost chemicals in the atmosphere.
[00:53:09.840 --> 00:53:16.320] And the critical finding here was that the volcanic gases on Venus were only 6% water.
[00:53:16.640 --> 00:53:21.120] And that was enough to replace what was being lost in the atmosphere.
[00:53:21.120 --> 00:53:22.800] But only 6% water.
[00:53:23.120 --> 00:53:29.360] Now remember, when magma rises from the deep within a planet, it brings the chemicals and gases that are down there to the surface.
[00:53:29.920 --> 00:53:33.280] So whatever is outgassed is very indicative of what's down there.
[00:53:33.280 --> 00:53:42.080] On Earth, volcanic gases are mostly steam, which is one way to show that, or to prove that the interior of the Earth is water-rich.
[00:53:42.080 --> 00:53:49.600] If Venus is outgassing only 6% water, its interior is likely to be just as dry as the surface.
[00:53:49.680 --> 00:53:57.280] And they say in the paper, the volcanic resupply to Venus's atmosphere, therefore, indicates that the planet has never been liquid water habitable.
[00:53:58.080 --> 00:54:24.080] So if you were dozing there for the past few minutes, the TLDR, or perhaps TLDL, too long didn't listen: this study shows that Venus was likely close enough to the Sun that the magma ocean of its early epoch took a long time to solidify, giving it a lot of time to lose its water, meaning that it never had a temperate Earth-like history and it probably never had water on its surface.
[00:54:24.080 --> 00:54:34.840] Now, luckily, the Earth was farther from the Sun, causing our magma ocean to solidify on the surface earlier than Venus did, and that cap held in all the water that we had at that time.
[00:54:34.840 --> 00:54:37.960] And that might be the key difference between Earth and Venus right there.
[00:54:37.960 --> 00:54:39.080] And it was the distance.
[00:54:39.080 --> 00:54:42.360] We knew that Venus was on the edge of habitability, right?
[00:54:42.360 --> 00:54:46.280] We were not sure if it was over that line or not.
[00:54:46.280 --> 00:54:58.520] And this is one bit of evidence showing that, yeah, Venus is a little bit out of the habitable zone because, and primarily because the magma ocean is going to just last too long and all the water is going to go away, basically.
[00:54:58.520 --> 00:55:04.680] Now, confirming this, though, to a higher degree of certainty, it's going to require waiting for future orbiters and even landers.
[00:55:04.680 --> 00:55:07.400] And there's one actually planned for the end of this decade.
[00:55:07.400 --> 00:55:09.480] DAVINCI? Hadn't heard about that.
[00:55:09.960 --> 00:55:14.920] So, the DAVINCI mission will have orbiters and a lander.
[00:55:14.920 --> 00:55:21.080] We'll see how long that one lasts, and maybe we'll even be able to confirm some of the theories these researchers are proposing.
[00:55:21.080 --> 00:55:30.680] So a reasonable takeaway from this is that when we look for exoplanets that may have had or have life, we should not look at Venus-like planets.
[00:55:30.680 --> 00:55:32.040] We should narrow our focus.
[00:55:32.040 --> 00:55:43.080] These scientists contend we should narrow our focus to exoplanets that are more similar to Earth, if what you're interested in is looking for worlds, exoplanets, that have life.
Prompt 2: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 3: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
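Because the prompt stresses that a segment's start timestamp can never exceed the total audio length, a small validation pass like the following could be run on the parsed segments; this is only an illustrative sketch with hypothetical names, assuming HH:MM:SS strings and a known audio duration in seconds:

def hhmmss_to_seconds(ts: str) -> int:
    # Convert an HH:MM:SS timestamp into a total number of seconds.
    hours, minutes, seconds = (int(part) for part in ts.split(":"))
    return hours * 3600 + minutes * 60 + seconds

def validate_segments(segments: list[dict], audio_length_seconds: int) -> None:
    # Reject any segment whose start time falls past the end of the audio.
    for seg in segments:
        start = hhmmss_to_seconds(seg["timestamp"])
        if start > audio_length_seconds:
            raise ValueError(f"segment {seg['segment_title']!r} starts at {seg['timestamp']}, past the end of the audio")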
Prompt 4: Media Mentions
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
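Two of this prompt's constraints are easy to check mechanically: the category must be one of four allowed values, and the timestamp should come from the earlier side of the caption range. A hedged sketch under those assumptions (illustrative names only, assuming VTT-style "HH:MM:SS.mmm --> HH:MM:SS.mmm" ranges):

ALLOWED_CATEGORIES = {"Book", "Movie", "TV Show", "Music"}

def earlier_timestamp(caption_range: str) -> str:
    # Take a range like "01:13:42.520 --> 01:13:46.720" and return the earlier side as HH:MM:SS.
    start = caption_range.split("-->")[0].strip()
    return start.split(".")[0]

def validate_media_mentions(mentions: list[dict]) -> None:
    # Ensure every extracted mention uses one of the four allowed categories.
    for mention in mentions:
        if mention["category"] not in ALLOWED_CATEGORIES:
            raise ValueError(f"unexpected category: {mention['category']!r}")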
Prompt 5: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 2 of 2 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
ing it a lot of time to lose its water, meaning that it never had a temperate Earth-like history and it probably never had water on its surface.
[00:54:24.080 --> 00:54:34.840] Now, luckily, the Earth was farther from the Sun, causing our magma ocean to solidify on the surface earlier than Venus did, and that cap held in all the water that we had at that time.
[00:54:34.840 --> 00:54:37.960] And that might be the key difference between Earth and Venus right there.
[00:54:37.960 --> 00:54:39.080] And it was the distance.
[00:54:39.080 --> 00:54:42.360] We knew that Venus was on the edge of habitability, right?
[00:54:42.360 --> 00:54:46.280] We were not sure if it was over that line or not.
[00:54:46.280 --> 00:54:58.520] And this is one bit of evidence showing that, yeah, Venus is a little bit out of the habitable zone because, and primarily because the magma ocean is going to just last too long and all the water is going to go away, basically.
[00:54:58.520 --> 00:55:04.680] Now, confirming this, though, to a higher degree of certainty, it's going to require waiting for future orbiters and even landers.
[00:55:04.680 --> 00:55:07.400] And there's one actually planned for the end of this decade.
[00:55:07.400 --> 00:55:09.480] DAVINCI? Hadn't heard about that.
[00:55:09.960 --> 00:55:14.920] So, the DAVINCI mission will have orbiters and a lander.
[00:55:14.920 --> 00:55:21.080] We'll see how long that one lasts, and maybe we'll even be able to confirm some of the theories these researchers are proposing.
[00:55:21.080 --> 00:55:30.680] So a reasonable takeaway from this is that when we look for exoplanets that may have had or have life, we should not look at Venus-like planets.
[00:55:30.680 --> 00:55:32.040] We should narrow our focus.
[00:55:32.040 --> 00:55:43.080] These scientists contend we should narrow our focus to exoplanets that are more similar to Earth, if what you're interested in is looking for worlds, exoplanets, that have life.
[00:55:43.080 --> 00:55:44.760] I'll finish with a quote here from the paper.
[00:55:45.000 --> 00:56:00.520] We would have loved to have found that Venus was once a planet much closer to our own, so it's kind of sad in a way to find out that it wasn't, but ultimately it's more useful to focus the search on planets that are most likely to be able to support life, at least life as we know it, which I love when scientists throw that in there at the end.
[00:56:00.520 --> 00:56:05.320] Life as we know it, yes, we've got one data point, so that's really a good way to say that.
[00:56:05.320 --> 00:56:06.760] So, yeah, interesting stuff.
[00:56:06.760 --> 00:56:08.760] Yeah, Venus is very interesting.
[00:56:08.760 --> 00:56:14.880] And the idea that there may be life in the upper atmosphere isn't faring so well.
[00:56:14.600 --> 00:56:17.280] You haven't heard too much good stuff about that either.
[00:56:18.480 --> 00:56:22.560] But if it is up there, it's not relying on water.
[00:56:22.880 --> 00:56:23.360] Yeah, probably.
[00:56:24.400 --> 00:56:24.640] All right.
[00:56:24.640 --> 00:56:25.440] Thanks, Bob.
[00:56:25.440 --> 00:56:26.160] All right, Jay.
[00:56:26.160 --> 00:56:27.760] It's who's that noisy time?
[00:56:27.760 --> 00:56:30.080] Okay, guys, last week I played This Noisy.
[00:56:38.160 --> 00:56:39.600] You guys have any guesses?
[00:56:39.600 --> 00:56:40.720] Cetacean.
[00:56:41.040 --> 00:56:42.560] Okay, well, we got.
[00:56:43.520 --> 00:56:47.280] I'll say right out of the gate, a ton of listeners knew exactly what this was.
[00:56:47.280 --> 00:56:51.520] So a listener named Joe Lanardria.
[00:56:53.360 --> 00:56:57.760] It's as if this guy's last name was created just so I couldn't pronounce it.
[00:56:58.400 --> 00:57:05.840] He said, This week's Noisy immediately brought to mind images of crested dinosaurs wading around in some prehistoric swamp.
[00:57:05.840 --> 00:57:12.000] So I'm going to guess this sound was from a computer's reproduction of what a parasolophilic.
[00:57:12.080 --> 00:57:13.200] Paraserolephus.
[00:57:13.280 --> 00:57:14.400] Parasiurolephus.
[00:57:14.400 --> 00:57:15.600] Paraserolephus.
[00:57:15.920 --> 00:57:24.640] It's like, you know, you get to like the second, as soon as I hit the second syllable, it just, like, my brain just goes, ha ha, you know.
[00:57:24.960 --> 00:57:33.600] All right, so the dinosaur or related species would have sounded like based on a 3D model of fossilized skull, Paris Cierolephus.
[00:57:33.600 --> 00:57:34.080] I got it.
[00:57:34.080 --> 00:57:34.320] Okay.
[00:57:34.320 --> 00:57:35.280] Thank you, Kara.
[00:57:35.280 --> 00:57:35.760] Yep.
[00:57:35.760 --> 00:57:39.440] I wish I just had you in my head where I could just hear you say it and then I could say it.
[00:57:39.440 --> 00:57:41.920] Listener named Amanda Lee wrote in and said, Hi, Jay.
[00:57:41.920 --> 00:57:46.720] I am absolutely convinced that this week's noisy is a cetacean, but which one?
[00:57:46.880 --> 00:57:49.920] My guess is a toothed whale, maybe an orca.
[00:57:49.920 --> 00:57:55.440] Okay, so that is not correct, but we did mention orcas in this show, so I give you two points.
[00:57:55.760 --> 00:57:59.120] Leah Zich said, Hey, all, new listener here.
[00:57:59.120 --> 00:58:04.680] I found you a few weeks ago, but I've gone back into the archives and have listened to hours and hours of the show.
[00:57:59.920 --> 00:58:06.280] I truly enjoy every episode.
[00:58:06.520 --> 00:58:10.280] I'm excited that I think I finally know what this noise is.
[00:58:10.280 --> 00:58:12.760] I'm quite confident that it's a zebra.
[00:58:12.760 --> 00:58:17.160] She goes on, If you ever take NOTACON internationally, I'll be the first in line for tickets.
[00:58:17.160 --> 00:58:18.200] I'm in Italy.
[00:58:18.200 --> 00:58:20.760] That's awesome, and I would love to have it in Italy.
[00:58:20.760 --> 00:58:29.720] And you should contact me to let me know if you think that there are enough English-speaking skeptics out there that actually would come to the conference.
[00:58:29.720 --> 00:58:33.160] You can email us at info at theskepticsguide.org.
[00:58:33.160 --> 00:58:34.040] I appreciate it.
[00:58:34.040 --> 00:58:35.560] And send me some good meatballs.
[00:58:35.560 --> 00:58:38.200] Okay, moving on to a listener named Aiden.
[00:58:38.200 --> 00:58:42.200] He goes, Jay, yes, he wrote my name with a crazy number of A's in it.
[00:58:42.200 --> 00:58:46.040] That sounded like a deer, probably a reindeer, considering the holiday season.
[00:58:46.040 --> 00:58:48.680] Not a bad guess at all, but not correct.
[00:58:48.680 --> 00:58:50.520] Let's get right to the answer here.
[00:58:50.520 --> 00:58:54.440] Crystal Haka was the first one to answer correctly.
[00:58:54.440 --> 00:58:57.720] And we also had another correct guess by Shane Hillier.
[00:58:57.720 --> 00:59:01.800] These two people basically guessed within seconds of each other, which I think is always funny.
[00:59:01.800 --> 00:59:09.480] But Crystal writes, Hey, this week's noisy definitely sounds like a bull elk in rut, maybe two about to fight, right?
[00:59:09.480 --> 00:59:14.520] So, you know, this is an elk, and that call is called bugling, right?
[00:59:14.520 --> 00:59:18.360] And when they're in rut, it's like mating season, basically.
[00:59:18.360 --> 00:59:19.880] And they're trying to find each other.
[00:59:19.880 --> 00:59:21.080] They're trying to fight each other.
[00:59:21.080 --> 00:59:22.200] They're looking for alcohol.
[00:59:22.200 --> 00:59:23.800] It gets really messy out there.
[00:59:23.800 --> 00:59:26.600] I kept thinking you were going to say something different.
[00:59:26.600 --> 00:59:27.080] Yeah.
[00:59:27.400 --> 00:59:29.640] But the bottom line is, that is an elk.
[00:59:29.640 --> 00:59:40.280] Lots of listeners apparently lived near elk, or knew someone that had elk, or had some type of interaction between an elk and their vehicle.
[00:59:40.280 --> 00:59:41.800] That's all I'm going to say.
[00:59:42.120 --> 00:59:43.720] So, thank you all for writing in.
[00:59:43.720 --> 00:59:48.720] I really enjoyed your guesses this week, and I'm going to move right in to next week's Noisy.
[00:59:49.040 --> 00:59:53.280] Guys, this noisy was sent in by a listener named Bobby Duke.
[01:00:00.560 --> 01:00:10.320] Now, lots of people might have some initial guesses, but a really good guess would be that that's the noise Bob makes when he's putting away his Halloween decorations, right, Bob?
[01:00:10.640 --> 01:00:15.120] Yeah, I thought he was sobbing in between those cries, and yes.
[01:00:15.440 --> 01:00:23.440] All right, so guys, if you think you know what this week's noisy is or you heard something cool, you can email us at WTN at the skepticsguide.org.
[01:00:23.440 --> 01:00:25.120] Steve, hold on a second.
[01:00:25.120 --> 01:00:25.600] Yes.
[01:00:25.600 --> 01:00:28.800] All right, so first of all, it's the end of the year, right?
[01:00:29.440 --> 01:00:30.480] It's Christmas time.
[01:00:30.480 --> 01:00:31.520] Well, with Christmas coming up.
[01:00:32.000 --> 01:00:45.840] If you guys know someone that listens to the show, you know, a sibling, a parent, a friend, a neighbor maybe, you could gift them an SGU patron membership.
[01:00:45.840 --> 01:00:47.280] It's very easy to do.
[01:00:47.280 --> 01:00:56.080] So just go to patreon.com forward slash skepticsguide, and you will see that there is an option to gift someone a membership to the SGU.
[01:00:56.080 --> 01:01:00.400] That would be a great way to give someone a nice gift and also support your favorite podcast.
[01:01:00.400 --> 01:01:01.920] We would really appreciate it.
[01:01:01.920 --> 01:01:04.080] Or you could become a patron on your own.
[01:01:04.080 --> 01:01:09.360] You could also buy either the Skeptics Guide to the Universe or the Skeptics Guide to the Future books.
[01:01:09.360 --> 01:01:10.720] We have two books out there.
[01:01:10.720 --> 01:01:11.840] They make great gifts.
[01:01:11.840 --> 01:01:21.680] If you're giving somebody a gift Patreon membership, you could represent that with a physical item: one or both of those books.
[01:01:21.680 --> 01:01:24.720] And if you bring them to a live event, we will sign them for you.
[01:01:24.720 --> 01:01:25.280] Yes.
[01:01:25.280 --> 01:01:26.640] And I'll sign your forehead.
[01:01:26.640 --> 01:01:27.680] Whatever you want, we'll do it.
[01:01:27.840 --> 01:01:30.280] So, anyway, you could join our mailing list.
[01:01:29.760 --> 01:01:40.360] You can go to the skepticsguide.org and become a mailing list member, which basically means that every week we will send you an email that has all the details about everything that we've done the previous week.
[01:01:40.360 --> 01:01:47.960] Also, if you find a way to give our show a rating, we encourage you to do it because that helps new people find our podcast.
[01:01:47.960 --> 01:01:51.800] As you hear this show, it'll be Saturday the seventh.
[01:01:51.800 --> 01:01:55.560] We will be doing two live shows, so it's probably too late to buy tickets.
[01:01:55.560 --> 01:02:03.160] But if there's some reason where it isn't, you could buy tickets to our private show and the extravaganza, which is at night.
[01:02:03.160 --> 01:02:06.840] You can go to theskepticsguide.org and buy tickets there.
[01:02:06.840 --> 01:02:12.120] And the big one, which I'm putting a lot of time into, is NOTACON 2025.
[01:02:12.920 --> 01:02:16.440] This is going to be the weekend of May 15th.
[01:02:16.440 --> 01:02:18.920] And, you know, you've heard me talk about this.
[01:02:18.920 --> 01:02:22.760] It's a wonderful conference, tons of socializing.
[01:02:22.760 --> 01:02:26.360] If you come there and you want to make friends, you will make friends.
[01:02:26.360 --> 01:02:34.120] There's an extraordinary group of people that are, you know, the core patrons of the SGU that are going to attend and they're awesome.
[01:02:34.120 --> 01:02:35.720] And there's a lot of fun to be had.
[01:02:35.720 --> 01:02:45.800] Brian Wecht, Andrea Jones-Rooy, George Hrab, and the entire SGU will be there providing entertainment for the two days, 2.2 days, like I said, Kara.
[01:02:45.800 --> 01:02:48.280] And that's a good way to support the SGU.
[01:02:48.280 --> 01:02:56.520] So please go to notaconcon.com or you can go to theskepticsguide.org and you can find all the information that you need.
[01:02:56.520 --> 01:02:57.640] Thank you, Jay.
[01:02:57.640 --> 01:03:02.680] Well, everyone, we're going to take a quick break from our show to talk about one of our sponsors this week, Mint Mobile.
[01:03:02.680 --> 01:03:06.200] With Mint Mobile, there's no hoops and there's no BS.
[01:03:06.200 --> 01:03:09.640] It's $15 a month with the purchase of a three-month plan.
[01:03:09.640 --> 01:03:13.080] It really is that easy to get wireless for $15 a month.
[01:03:13.080 --> 01:03:17.840] The longest part of this process is the time that you'll spend breaking up with your old provider.
[01:03:17.840 --> 01:03:23.840] All plans come with high-speed data and unlimited talk and text delivered on the nation's largest 5G network.
[01:03:23.840 --> 01:03:29.920] You can use your own phone with any Mint Mobile plan and bring your own phone number along with all your existing contacts.
[01:03:29.920 --> 01:03:37.040] Find out how easy it is to switch to Mint Mobile and get three months of premium wireless service for $15 a month.
[01:03:37.040 --> 01:03:44.240] To get this new customer offer and your new three-month premium wireless plan for just $15 a month, go to mintmobile.com/SGU.
[01:03:44.240 --> 01:03:46.800] That's mintmobile.com/SGU.
[01:03:46.800 --> 01:03:51.040] Cut your wireless bill to $15 a month at mintmobile.com/SGU.
[01:03:51.040 --> 01:03:54.480] $45 upfront payment required equivalent to $15 a month.
[01:03:54.480 --> 01:03:56.640] New customers on first three-month plan only.
[01:03:56.640 --> 01:03:59.120] Speeds slower above 40 GB on unlimited plan.
[01:03:59.120 --> 01:04:00.800] Additional taxes, fees, and restrictions apply.
[01:04:00.800 --> 01:04:02.320] See Mint Mobile for details.
[01:04:02.320 --> 01:04:04.640] All right, guys, let's get back to the show.
[01:04:04.640 --> 01:04:08.160] Got a couple of emails, a couple of interesting corrections for this week.
[01:04:08.160 --> 01:04:16.880] The first one comes from Doug Herr, who writes, Has Steve realized that the Fisher, not Fisher Cat, no such thing, does not scream?
[01:04:17.120 --> 01:04:22.880] I hope to find that Steve has realized that all of the Fisher screaming recordings are actually recordings of foxes.
[01:04:22.880 --> 01:04:27.120] As evidence, see the link in the Wikipedia reference, and it gives a link to that.
[01:04:27.120 --> 01:04:33.200] It shows how the person running that site realized that all the content on that site was people hearing and not seeing.
[01:04:33.200 --> 01:04:37.920] It also shows that an expert with the animal has never heard much more than a hiss.
[01:04:37.920 --> 01:04:40.320] It is time to be skeptical.
[01:04:40.960 --> 01:04:46.080] I appreciate the correction, Doug, but you don't have to be so hurtful about it.
[01:04:46.080 --> 01:04:46.640] Yeah.
[01:04:47.280 --> 01:04:51.200] So, yes, I did as much as I could with the time that I had.
[01:04:51.200 --> 01:04:58.080] I tried to find a video of a Fisher or Fisher cat, which is the common name.
[01:04:58.080 --> 01:05:03.560] And apparently, fisher cat is a regional name particular to New England.
[01:05:03.880 --> 01:05:08.520] So, by disrespecting that, you are disrespecting my culture and my family.
[01:05:09.800 --> 01:05:10.840] We will accept apologies.
[01:05:11.000 --> 01:05:13.160] We call them fisher cats in this family.
[01:05:13.480 --> 01:05:15.960] So, fisher is the technical name.
[01:05:15.960 --> 01:05:19.960] They do, however, have many common names, one of which is fisher cat.
[01:05:19.960 --> 01:05:23.160] If enough people use it, it becomes at least a common name.
[01:05:23.160 --> 01:05:25.160] The name fisher, by the way: we said, well, it's not a cat.
[01:05:25.160 --> 01:05:26.040] It's also not a fisher.
[01:05:26.040 --> 01:05:27.880] It doesn't eat fish, doesn't fish.
[01:05:28.200 --> 01:05:30.120] That's a misnomer, too.
[01:05:30.120 --> 01:05:39.240] It probably derives from the fact that American settlers thought that it looked like a European polecat.
[01:05:39.240 --> 01:05:42.840] And in French, that's like fichette or whatever.
[01:05:43.080 --> 01:05:47.000] So, like, it got bastardized to fisher.
[01:05:47.000 --> 01:05:48.200] Oh, we're good at doing that.
[01:05:48.440 --> 01:05:55.480] Not sure why it became fisher cat, but it does kind of look like a cat, so it's easy to say, well, some people thought it looked like a cat, so they call it a fisher cat, whatever.
[01:05:55.480 --> 01:05:58.040] Um, it is a member of the weasel family.
[01:05:58.040 --> 01:05:58.840] Pop.
[01:05:58.840 --> 01:05:59.320] Yep.
[01:05:59.320 --> 01:06:00.440] They're actually kind of cute.
[01:06:00.760 --> 01:06:02.920] They're pretty good predators.
[01:06:02.920 --> 01:06:11.480] And when we talked about the scream on the show, I mentioned the fact that somebody from Europe said, no, that's a fox.
[01:06:11.560 --> 01:06:19.960] Like, well, yeah, foxes have that same scream, and there's actually lots of websites that have how to tell the difference between a fisher scream and a red fox scream.
[01:06:19.960 --> 01:06:25.320] But now he's saying that it's all foxes, like there's no fisher scream.
[01:06:25.320 --> 01:06:27.320] And he, I think that may be correct.
[01:06:27.320 --> 01:06:31.560] I have not been able to find a video of a fisher screaming.
[01:06:32.040 --> 01:06:35.400] There's plenty of videos of fishers, but they're not screaming.
[01:06:35.400 --> 01:06:43.720] And there's video of like screaming coming from the woods, but you're not seeing the animal that's making that noise.
[01:06:43.720 --> 01:06:48.800] I have not been able to find any actual video of a fisher screaming.
[01:06:44.760 --> 01:06:51.920] So it's possible that it doesn't, in fact, happen.
[01:06:52.240 --> 01:06:56.960] And just from my reading, what experts say is they probably don't.
[01:06:57.280 --> 01:06:59.360] They mainly grunt and hiss.
[01:06:59.360 --> 01:07:02.560] They're predators and they're pretty quiet creatures.
[01:07:02.560 --> 01:07:08.800] But under stress, under distress, under rare circumstances, they may scream.
[01:07:08.800 --> 01:07:10.880] They can't rule that out.
[01:07:10.880 --> 01:07:14.160] But it turns out it's not well documented.
[01:07:14.160 --> 01:07:19.840] A lot of people assume that they do, and there's certainly a widespread belief that they do.
[01:07:19.840 --> 01:07:23.200] But there's no, it turns out there's no hard evidence.
[01:07:23.200 --> 01:07:24.160] I'll keep looking.
[01:07:24.160 --> 01:07:26.320] We could crowdsource this as well.
[01:07:26.320 --> 01:07:31.680] I don't know if there's any actual documented cases of a fisher screaming.
[01:07:31.680 --> 01:07:36.000] I could not find any, and it seems like the consensus is they probably don't, which is interesting.
[01:07:36.000 --> 01:07:40.960] All right, next email comes from Ron, who writes, Hello, fellow skeptics and fellow Connecticut residents.
[01:07:40.960 --> 01:07:46.000] There has been some buzz lately over the sighting of a UFO by a police officer in Connecticut.
[01:07:46.000 --> 01:07:48.880] He even filmed some of it on his cell phone.
[01:07:48.880 --> 01:07:51.280] You can Google it or go to the link in the news station.
[01:07:51.280 --> 01:07:53.440] I would love to see Steve comment on the story.
[01:07:53.440 --> 01:07:55.680] I think it's worth examining some of the logic in the story.
[01:07:55.680 --> 01:07:57.840] At any rate, I'm a huge fan of the SGU.
[01:07:57.840 --> 01:07:58.960] Keep up the good work.
[01:07:58.960 --> 01:08:00.160] And then he provides a link.
[01:08:00.160 --> 01:08:01.360] So, yes.
[01:08:01.840 --> 01:08:07.520] As Bob says, so this was a, yeah, this is in Fairfield County, like where we live.
[01:08:07.520 --> 01:08:13.600] And it was a police officer, Robert Klein, who had this story, right?
[01:08:13.600 --> 01:08:17.440] The story that he tells is it was like two in the morning, three in the morning.
[01:08:17.440 --> 01:08:17.840] Oh, is it?
[01:08:18.000 --> 01:08:19.360] He's pulling an overnight shift.
[01:08:19.360 --> 01:08:23.120] He's riding on a lonely country road with nobody around.
[01:08:23.120 --> 01:08:32.840] And then suddenly he's blinded by this bright light shining into the cabin of his police car, to the point where he couldn't really see.
[01:08:33.160 --> 01:08:39.560] And then he describes this glowing orange-red orb that was beaming the light.
[01:08:39.560 --> 01:08:43.560] And then it flew across the lake, and then you could see it sort of moving in the distance.
[01:08:43.560 --> 01:08:57.000] And then he sort of overcame the shock of the whole situation, took out his phone, and then taped, you know, filmed on his phone the UFO, now more in the distance.
[01:08:57.000 --> 01:08:58.440] That's the story, right?
[01:08:59.880 --> 01:09:03.720] I know you guys have all had an opportunity to see the news reports and the video.
[01:09:03.720 --> 01:09:12.280] So, of course, the local news does a total unskeptical hatchet job of reporting this because they're interested only in the sensation.
[01:09:12.280 --> 01:09:21.800] And they pull for an expert, this guy, Ross Coulthard, who is an investigative journalist, but he's a UFO nut, right?
[01:09:21.800 --> 01:09:24.440] The bottom line is he's not really an expert in anything.
[01:09:24.440 --> 01:09:26.120] He's just a true believer.
[01:09:26.120 --> 01:09:39.480] And this guy proceeds to trot out every unskeptical, uncritical, you know, pro-UFO trope in his interview on the topic.
[01:09:39.480 --> 01:09:40.920] It's just embarrassing.
[01:09:41.160 --> 01:09:43.320] His attitude towards skeptics was galling.
[01:09:43.320 --> 01:09:47.400] I mean, I just wanted to eviscerate all the straw men that he was throwing up.
[01:09:47.400 --> 01:09:55.960] He actually described skeptics, described us, as experiencing denialism in the face of overwhelming evidence.
[01:09:55.960 --> 01:09:58.680] Sure, that's what skeptics are all about, right?
[01:09:59.000 --> 01:10:03.800] But then later, like a minute later or two, he says, it's important to be skeptical.
[01:10:03.800 --> 01:10:05.320] Like, dude, make up your mind, right?
[01:10:05.320 --> 01:10:06.360] What are you talking about?
[01:10:06.360 --> 01:10:06.840] It's folks.
[01:10:07.080 --> 01:10:09.160] I just got so mad, so mad at him.
[01:10:09.560 --> 01:10:12.360] Yeah, he's emphasizing this guy's a 25-year veteran.
[01:10:12.360 --> 01:10:14.600] He's like, you know, so what?
[01:10:14.600 --> 01:10:15.760] That is irrelevant.
[01:10:14.840 --> 01:10:19.200] That's the credible witness kind of fallacy.
[01:10:20.080 --> 01:10:23.920] It doesn't change the fact that it was late, early in the morning.
[01:10:23.920 --> 01:10:24.800] He's by himself.
[01:10:24.800 --> 01:10:26.800] The guy could have been half asleep as far as we know.
[01:10:26.960 --> 01:10:28.080] The witching hour, 3 a.m.
[01:10:28.240 --> 01:10:28.880] That's when most of the time.
[01:10:29.040 --> 01:10:30.240] No, he made a point.
[01:10:30.240 --> 01:10:30.960] They made a point.
[01:10:30.960 --> 01:10:34.400] He made a point of saying it was early in his overnight shift.
[01:10:34.400 --> 01:10:34.640] So?
[01:10:35.120 --> 01:10:36.800] So he wasn't up long, meaning he wasn't up all night.
[01:10:37.520 --> 01:10:39.120] But we don't know what his state of mind was.
[01:10:39.120 --> 01:10:43.520] Even shift workers are sleep deprived, period.
[01:10:43.840 --> 01:10:45.120] And so whatever.
[01:10:45.120 --> 01:10:46.320] But I don't know.
[01:10:46.320 --> 01:10:48.560] I don't have any first-hand knowledge of what his actual state was.
[01:10:48.560 --> 01:10:50.240] But it certainly is plausible.
[01:10:50.240 --> 01:11:03.600] But even without that, even if he was wide awake, it doesn't matter because he's assuming that he's not subject to optical perceptual illusions, right?
[01:11:04.000 --> 01:11:09.600] It doesn't matter how long you've been a police officer, you have the same susceptibility to visual illusions as everybody else.
[01:11:09.920 --> 01:11:11.520] If you're human, you're susceptible.
[01:11:11.520 --> 01:11:12.400] Absolutely.
[01:11:12.400 --> 01:11:15.360] So what's the simplest explanation for the evidence we have?
[01:11:15.360 --> 01:11:16.320] We basically have two things.
[01:11:16.320 --> 01:11:20.960] We have his story and we have the video, right?
[01:11:21.360 --> 01:11:22.720] I didn't see his video, though.
[01:11:22.720 --> 01:11:23.360] I just saw the research.
[01:11:24.160 --> 01:11:28.000] Well, yeah, the news did a recreation, which is utterly worthless.
[01:11:28.000 --> 01:11:28.640] It's worthless.
[01:11:29.360 --> 01:11:30.720] Oh, gosh, it was right out of it.
[01:11:30.800 --> 01:11:31.360] It was misleading.
[01:11:31.520 --> 01:11:31.760] I know.
[01:11:31.760 --> 01:11:32.480] It's like, what is it?
[01:11:32.480 --> 01:11:33.120] It's worthless.
[01:11:33.120 --> 01:11:33.600] What are you making?
[01:11:33.760 --> 01:11:34.880] Yeah, you're making a digital.
[01:11:35.120 --> 01:11:36.960] Is this Fire in the Sky, the movie?
[01:11:37.360 --> 01:11:38.240] It's terribly.
[01:11:38.560 --> 01:11:43.440] No, if you keep watching, Bob, you keep watching the news report, they then start to show his actual video.
[01:11:43.440 --> 01:11:43.920] Right.
[01:11:43.920 --> 01:11:45.360] And you know what it shows?
[01:11:45.360 --> 01:11:46.160] An airplane.
[01:11:46.160 --> 01:11:47.360] A light in the distance.
[01:11:47.360 --> 01:11:49.200] I guarantee you it's a drone.
[01:11:49.760 --> 01:11:50.080] Or something.
[01:11:50.320 --> 01:11:54.640] It's just a normal light not moving in any way that's unusual.
[01:11:54.640 --> 01:12:00.520] It's just, it's, I mean, completely and easily explained by.
[01:12:01.960 --> 01:12:02.840] That is so.
[01:12:02.840 --> 01:12:06.680] I mean, think about how he exaggerated that story then.
[01:12:06.680 --> 01:12:09.560] And he said, Jay, I can guarantee you it wasn't a drone.
[01:12:09.560 --> 01:12:10.440] Can you really?
[01:12:10.440 --> 01:12:11.400] Can you really guarantee?
[01:12:11.800 --> 01:12:16.280] The only hard evidence we have is 100% consistent with a drone.
[01:12:16.280 --> 01:12:20.920] And then he's saying, first of all, he could have underestimated how close it was, right?
[01:12:20.920 --> 01:12:24.680] He thought it was moving very fast when actually it was just super close.
[01:12:24.680 --> 01:12:37.720] You know, a drone with a light, if somebody was buzzing him with the drone because he was on the side of the road or whatever, and then he's interpreting the halo of light around, you know what I mean?
[01:12:37.720 --> 01:12:55.480] Like, you see, if you're out at night and your eyes are dark adapted and then there's a bright light, and you could see it on the video, this sort of glow around the light source, then he's interpreting that as the thing itself, not just a visual, you know, aura around a bright light source.
[01:12:55.480 --> 01:12:57.240] So yeah, he just got surprised by that.
[01:12:57.240 --> 01:12:58.600] He misinterpreted what he saw.
[01:12:58.600 --> 01:13:03.080] He thought something was moving fast because it was probably closer than he thought, and then it flew off into the distance.
[01:13:03.080 --> 01:13:07.560] He gets out his phone and he videotapes a freaking drone.
[01:13:07.560 --> 01:13:09.720] So this is, again, it's a nothing burger.
[01:13:09.720 --> 01:13:11.000] It's much ado about nothing.
[01:13:11.000 --> 01:13:13.400] Again, the hard evidence is nothing.
[01:13:13.400 --> 01:13:17.160] It's just nothing out of the ordinary at all.
[01:13:17.560 --> 01:13:23.240] And even if, I mean, my first thought, I'm looking at the video now, like, yeah, that clearly could be a drone for sure.
[01:13:23.320 --> 01:13:24.920] So it's much, I'll say this.
[01:13:24.920 --> 01:13:29.000] It's much more likely than an alien, an extraterrestrial craft.
[01:13:29.000 --> 01:13:29.320] Right.
[01:13:29.560 --> 01:13:31.800] But you know what else would be more plausible than that?
[01:13:31.800 --> 01:13:32.920] How about ball lightning?
[01:13:33.240 --> 01:13:35.560] That's basically a thing at this point.
[01:13:35.800 --> 01:13:36.120] Right?
[01:13:36.120 --> 01:13:38.360] I mean, that could have potentially have been ball lightning.
[01:13:38.520 --> 01:13:39.640] In this video, it's a drone.
[01:13:39.640 --> 01:13:40.280] I mean, come on.
[01:13:40.520 --> 01:13:40.920] Yeah.
[01:13:40.920 --> 01:13:43.400] Looking at the video, I would definitely say it's a drone.
[01:13:43.400 --> 01:13:46.880] But, Bob, you're denying this overwhelming evidence, yeah, Bob.
[01:13:46.880 --> 01:13:48.400] Yeah, that's what I am.
[01:13:48.560 --> 01:13:49.360] Yeah, that's what I am.
[01:13:49.360 --> 01:13:50.000] I'm a skeptic.
[01:13:50.320 --> 01:13:50.960] I'm a denier.
[01:13:44.840 --> 01:13:51.760] You're a denier.
[01:13:52.640 --> 01:13:54.400] Yeah, the evidence is overwhelming.
[01:13:54.400 --> 01:13:54.960] Overwhelming.
[01:13:55.280 --> 01:13:56.080] Gotta be a UFO.
[01:13:56.160 --> 01:13:57.440] Gotta be an alien craft.
[01:13:57.440 --> 01:13:59.600] May I add another layer to this, please?
[01:13:59.920 --> 01:14:01.120] The sighting took place.
[01:14:01.200 --> 01:14:02.000] So, two things.
[01:14:02.000 --> 01:14:03.920] The sighting took place in 2022.
[01:14:03.920 --> 01:14:04.400] Yes.
[01:14:04.400 --> 01:14:09.120] And the police officer only came out recently and talked about it for the first time.
[01:14:09.120 --> 01:14:11.040] So there's a three-year gap here, folks.
[01:14:11.040 --> 01:14:11.600] Two years.
[01:14:11.600 --> 01:14:11.840] Right.
[01:14:11.840 --> 01:14:12.240] Of time.
[01:14:12.240 --> 01:14:13.520] A three-and-a-half year.
[01:14:13.520 --> 01:14:15.760] I'm sorry, two and a half-year gap of time.
[01:14:15.760 --> 01:14:18.080] So you have to take that into consideration as well.
[01:14:18.080 --> 01:14:26.560] I went back and I looked up some news items back in April of 2022 when this supposed incident took place in Connecticut.
[01:14:26.560 --> 01:14:27.920] Let's see, what do we have here?
[01:14:27.920 --> 01:14:29.200] Oh, Patch, Connecticut.
[01:14:29.600 --> 01:14:34.480] The Lyrid meteor shower of 2022 in April.
[01:14:34.480 --> 01:14:44.880] The first of the spring meteor showers, producing about 15 to 20 shooting stars an hour, appearing as fireballs in the night sky.
[01:14:44.880 --> 01:14:46.000] Yeah, but this is not that.
[01:14:47.600 --> 01:14:51.760] I will propose that there are perhaps two separate incidents here.
[01:14:52.640 --> 01:14:55.520] Because he describes it as changing color.
[01:14:55.840 --> 01:15:00.800] And if you've seen these things fall from the sky, and I've seen them before, they do change color.
[01:15:01.120 --> 01:15:01.360] Right?
[01:15:01.360 --> 01:15:03.120] They'll start, you know, as hot white.
[01:15:03.120 --> 01:15:06.880] It'll go to green and then, you know, some shade of red or something.
[01:15:06.880 --> 01:15:08.320] So they will tend to change color.
[01:15:08.320 --> 01:15:14.000] And they do light up like a flare in the sky, which is kind of what he was describing.
[01:15:14.000 --> 01:15:24.880] I would say by the time he finally oriented himself, got out of his car, he focused on something else entirely: the drone or whatever it was over across the lake.
[01:15:24.880 --> 01:15:26.960] I think he might be conflating two separate incidents.
[01:15:27.040 --> 01:15:27.840] That's possible.
[01:15:27.840 --> 01:15:28.480] Possibly.
[01:15:28.480 --> 01:15:29.040] That's possible.
[01:15:30.040 --> 01:15:43.960] Just saying, you know, a little research online, you know, if you look up what was happening locally at the time, this is right here, and you know, these meteor showers can produce those effects.
[01:15:43.960 --> 01:15:45.800] I don't care what the guy's testimony is.
[01:15:45.800 --> 01:15:46.840] It's irrelevant.
[01:15:46.840 --> 01:15:48.440] It doesn't matter.
[01:15:48.760 --> 01:15:49.960] We have hard evidence.
[01:15:49.960 --> 01:15:52.440] The hard evidence is completely unimpressive.
[01:15:52.440 --> 01:15:59.160] It's a light in the distance consistent with probably, again, this day and age, probably a drone.
[01:15:59.160 --> 01:16:06.280] The other thing that Coulthart goes on and on about is how much these sightings are increasing recently.
[01:16:06.280 --> 01:16:08.120] Gee, I wonder why that is.
[01:16:08.520 --> 01:16:13.880] Why would there be an increase in sightings of drone-like objects in the last few years?
[01:16:13.880 --> 01:16:16.840] What could possibly be the explanation for that?
[01:16:16.840 --> 01:16:17.640] I can't imagine.
[01:16:18.360 --> 01:16:19.560] Drone-like you say?
[01:16:19.560 --> 01:16:20.760] I don't know.
[01:16:21.080 --> 01:16:22.120] Unbelievable.
[01:16:22.120 --> 01:16:22.760] All right.
[01:16:22.760 --> 01:16:23.880] Terrible reporting.
[01:16:23.880 --> 01:16:25.640] Lack of skepticism.
[01:16:25.960 --> 01:16:29.720] Let's go on with science or fiction.
[01:16:31.960 --> 01:16:37.080] It's time for science or fiction.
[01:16:41.480 --> 01:16:50.600] Each week, I come up with three science news items or facts: two real and one fake, and then I challenge my panel of skeptics to tell me which one is the fake.
[01:16:50.920 --> 01:16:57.000] We have a theme this week, so these are just three facts that have to do with surprising statistics.
[01:16:57.000 --> 01:16:57.560] Surprise.
[01:16:57.720 --> 01:16:58.440] Here we go.
[01:16:58.440 --> 01:17:06.280] Item number one: koalas are the sleepiest animals, sleeping for 20 to 22 hours per day.
[01:17:06.280 --> 01:17:14.520] Item number two: about 10% of people have an atypical number of presacral spinal vertebra.
[01:17:14.520 --> 01:17:21.920] And item number three, of the human remains found at Machu Picchu, 80% are female.
[01:17:21.920 --> 01:17:22.880] Jay, go first.
[01:17:22.880 --> 01:17:26.640] All right, the first one here, koalas are the sleepiest animals.
[01:17:26.640 --> 01:17:29.440] I mean, I agree with that.
[01:17:29.440 --> 01:17:31.760] They have a hell of a life, those animals.
[01:17:32.080 --> 01:17:34.080] Yeah, I think that one is science.
[01:17:34.080 --> 01:17:39.600] The second one here, it says about 10% of people have an atypical number of pre-sacral.
[01:17:39.680 --> 01:17:40.400] Pre-sacral.
[01:17:40.400 --> 01:17:40.960] Sacral.
[01:17:40.960 --> 01:17:41.440] Sacral.
[01:17:41.760 --> 01:17:47.360] So what the spine does tell you, the spine number, there's the cervical, thoracic, lumbar, and sacral spine.
[01:17:47.360 --> 01:17:49.440] So not including the sacrum.
[01:17:49.440 --> 01:17:51.440] That's basically your tailbone, right?
[01:17:51.440 --> 01:17:54.800] For the cervical, thoracic, and lumbar part of the spine.
[01:17:54.800 --> 01:17:59.040] 10% of people have an atypical number of vertebra.
[01:17:59.040 --> 01:17:59.680] Oh, wow.
[01:18:00.000 --> 01:18:06.160] And then the last one here of the human remains found at Machu Picchu, 80% are female.
[01:18:06.160 --> 01:18:08.240] Okay, so this is what I think about that.
[01:18:08.240 --> 01:18:15.920] They did an awful lot of sacrificing, and I could come up with a reason why they mostly sacrificed women.
[01:18:15.920 --> 01:18:20.000] I just don't know enough about their culture to know if that was what they did.
[01:18:20.000 --> 01:18:25.280] You know, and then the second one, you said, you know, it's 10% of people have this atypical number.
[01:18:25.600 --> 01:18:28.240] That is a low, relatively low percentage number.
[01:18:28.240 --> 01:18:31.200] You know, it's like, okay, so, you know, 10% of the people have like...
[01:18:31.200 --> 01:18:34.080] They either have extra vertebra or they have too few vertebra.
[01:18:34.080 --> 01:18:34.720] Okay.
[01:18:34.720 --> 01:18:39.040] You know, I think I'm going to go with that one is the fake because I don't know.
[01:18:39.040 --> 01:18:43.760] Something is rubbing me the wrong way about people's spines having not the same number.
[01:18:44.560 --> 01:18:46.720] You don't have the same number of bones as everybody else.
[01:18:46.720 --> 01:18:47.920] I don't think so.
[01:18:47.920 --> 01:18:48.800] Okay, Evan.
[01:18:49.280 --> 01:18:54.320] Koala is the sleepiest animal, sleeping for 20 to 22 hours per day.
[01:18:54.320 --> 01:18:57.840] Oh, those eucalyptus leaves will do that to you, won't they?
[01:18:57.840 --> 01:18:58.400] Yes.
[01:18:58.880 --> 01:19:13.320] This could perhaps have been something we learned about on our first trip to Australia, and I seem to recall this being in sync with that excursion we did to the, what was it, the Sydney Zoo, I believe.
[01:19:14.440 --> 01:19:15.960] So I think that one's right.
[01:19:15.960 --> 01:19:23.080] And then the second one about 10% of people having an atypical number of presacral spinal vertebra.
[01:19:23.400 --> 01:19:26.600] Well, yeah, I have no idea.
[01:19:26.840 --> 01:19:32.680] Never, you know, was invited to anywhere to talk about this by any experts, so I don't know.
[01:19:33.080 --> 01:19:39.000] 10% of people, I guess that could be variations in.
[01:19:40.040 --> 01:19:47.560] I wonder if there are other parts of the anatomy that also have these types of atypical numbers, right?
[01:19:48.200 --> 01:19:50.840] Or, you know, other measurements that also equal that.
[01:19:51.000 --> 01:19:55.960] The last one about Machu Picchu, 80% are female.
[01:19:55.960 --> 01:20:02.120] So this one, I think, of the three is the one that I think I'm going to say is the fiction.
[01:20:02.440 --> 01:20:10.360] They would, gosh, unfortunately, I can only go based on kind of what I've seen on television shows and movies and things.
[01:20:10.360 --> 01:20:13.320] And who knows what that's all about.
[01:20:13.320 --> 01:20:15.800] And it's, I have no record.
[01:20:15.960 --> 01:20:21.880] I've seen those scenes in which they did sacrifice, have done sacrifices of people.
[01:20:21.880 --> 01:20:24.440] They all seem to be male characters.
[01:20:24.440 --> 01:20:31.520] That doesn't necessarily mean anything in the real world, but they were way skewed towards.
[01:20:32.040 --> 01:20:35.480] I don't think I ever saw a woman be sacrificed in any of those movies.
[01:20:36.040 --> 01:20:42.840] And 80% allows for a lot more range, I think, as far as all these options go.
[01:20:42.840 --> 01:20:46.720] So I'll put my nickel down there and say the Machu Picchu one is the fiction.
[01:20:46.720 --> 01:20:47.920] Okay, Kara.
[01:20:47.920 --> 01:20:51.200] Okay, so koalas are sleepy.
[01:20:44.920 --> 01:20:51.680] This I know.
[01:20:51.840 --> 01:20:53.360] I don't know if they're the sleepiest.
[01:20:53.360 --> 01:20:54.960] I know that we have a koala.
[01:20:54.960 --> 01:21:02.800] You guys remember the story at the LA Zoo who didn't like to go up in the trees at night, and one time a keeper let it stay out and it slept on the ground.
[01:21:02.800 --> 01:21:08.720] And our resident Puma, who has since died, P22, snuck into the zoo and ate it.
[01:21:12.240 --> 01:21:13.840] That's what they get for sleeping so much.
[01:21:13.840 --> 01:21:17.840] I have no idea if they're the sleepiest, but yes, I think they are well known to be quite sleepy.
[01:21:17.840 --> 01:21:20.240] I don't think the eucalyptus is like a drug.
[01:21:20.240 --> 01:21:22.320] I think it just doesn't provide a whole lot of energy.
[01:21:22.320 --> 01:21:26.800] 10% of people have an atypical number of presacral spinal vertebra.
[01:21:26.800 --> 01:21:36.080] I mean, I feel like if you took 10%, if you took a population of people, 10% of them would have something anomalous everywhere.
[01:21:36.240 --> 01:21:36.960] You know what I mean?
[01:21:36.960 --> 01:21:39.360] Like maybe not something big like polydactyly.
[01:21:39.360 --> 01:21:40.480] That might not be 10%.
[01:21:40.480 --> 01:21:50.560] But I bet you if you looked at like a hundred hearts, 10% of them would have like a weird little, like it would be bending in a different direction or like there'd be a weird little extra something here.
[01:21:50.560 --> 01:21:51.360] Like who knows?
[01:21:51.360 --> 01:21:54.160] But that doesn't seem unreasonable, 10%.
[01:21:54.160 --> 01:22:03.040] I don't know why both of you think that this human remains found in Machu Picchu are like sacrificial remains.
[01:22:03.040 --> 01:22:04.560] That feels like a leap to me.
[01:22:04.560 --> 01:22:07.440] Wasn't Machu Picchu just in cities?
[01:22:07.440 --> 01:22:09.520] Yeah, I feel like, wasn't this just a city?
[01:22:09.520 --> 01:22:11.280] And people died in cities.
[01:22:11.280 --> 01:22:12.160] And then they got buried.
[01:22:12.160 --> 01:22:14.240] They're getting beheaded by the priests.
[01:22:14.880 --> 01:22:15.680] I don't know.
[01:22:15.680 --> 01:22:20.880] I think there were just dead people at this ancient Inca city.
[01:22:21.360 --> 01:22:24.560] So I don't know why that would be 80% female.
[01:22:24.720 --> 01:22:25.480] Right, why would that be?
[01:22:25.560 --> 01:22:26.960] That city was 80% female?
[01:22:27.120 --> 01:22:29.440] It should be more like maybe 55% or something.
[01:22:29.560 --> 01:22:30.360] That's what I'm thinking.
[01:22:29.760 --> 01:22:35.160] Yeah, I'm thinking maybe that one's not science just because it should be half.
[01:22:35.480 --> 01:22:39.240] So I'm going to put my I'm going to lightly put my nickel on that.
[01:22:39.240 --> 01:22:40.360] Woo-hoo, two nickels.
[01:22:40.360 --> 01:22:41.160] Okay, and Bob.
[01:22:41.640 --> 01:22:43.720] All right, so koalas, yeah, that makes sense.
[01:22:43.880 --> 01:22:53.240] I've heard as well that they're very sleepy, and part of me is thinking Steve knows he remembers that day when we all learned that, and now they found out that they're not.
[01:22:53.960 --> 01:22:56.840] Oh, if that's the case.
[01:22:58.440 --> 01:22:59.560] I will not be a happy person.
[01:22:59.960 --> 01:23:01.720] It's a distinct possibility.
[01:23:02.520 --> 01:23:02.920] Don't even.
[01:23:03.160 --> 01:23:03.640] Let's see.
[01:23:03.800 --> 01:23:08.520] The variation in the vertebra, it seems a little high, but I can see that.
[01:23:08.520 --> 01:23:11.240] I mean, we got some, there's some crazy variations out there.
[01:23:11.720 --> 01:23:18.280] Even with types of muscles, some people have an extra muscle in their forearm that most people don't have.
[01:23:18.280 --> 01:23:19.560] It's crazy stuff.
[01:23:19.560 --> 01:23:22.200] So variation like that, I can kind of see.
[01:23:22.200 --> 01:23:27.320] But the remains found in Machu Picchu, I don't know anything about Machu Picchu at all, hardly.
[01:23:27.320 --> 01:23:28.920] And just mainly based on that.
[01:23:28.920 --> 01:23:32.920] The other ones I'm more familiar with, I'll just say that one is fiction.
[01:23:32.920 --> 01:23:33.640] All right.
[01:23:33.640 --> 01:23:35.880] So you guys all agree on the first one?
[01:23:35.880 --> 01:23:36.840] So we'll start there.
[01:23:36.840 --> 01:23:42.120] Koalas are the sleepiest animals sleeping for 20 to 22 hours per day.
[01:23:42.120 --> 01:23:44.440] You guys all think this one is science.
[01:23:44.440 --> 01:23:45.080] Better be science.
[01:23:45.480 --> 01:23:48.200] This one is science.
[01:23:48.200 --> 01:23:49.080] Is it science?
[01:23:49.960 --> 01:23:51.320] Yeah, we would have killed you.
[01:23:52.840 --> 01:23:59.800] There are other, you know, almost as sleepy animals, but yeah, the koala is still considered to be the most sleepiest of the animals.
[01:24:00.200 --> 01:24:01.080] Most sleepy.
[01:24:01.080 --> 01:24:01.720] And yeah, 20.
[01:24:01.880 --> 01:24:03.720] You're just sleeping 22 hours a day.
[01:24:03.880 --> 01:24:04.840] Yeah, Kara, I think you're right.
[01:24:04.840 --> 01:24:07.880] I think it's just because they don't get a lot of energy out of those eucalyptus leaves.
[01:24:08.440 --> 01:24:10.440] Oh, yeah, they're bereft of nutrition.
[01:24:10.440 --> 01:24:13.640] They're just like hypoglycemic all the time.
[01:24:14.040 --> 01:24:18.240] Well, sleeping is an adaptation to energy efficiency, right?
[01:24:18.240 --> 01:24:21.280] You don't want to expend any more energy than you absolutely have to.
[01:24:21.280 --> 01:24:22.720] All right, I guess we'll take these in order.
[01:24:22.720 --> 01:24:27.120] About 10% of people have an atypical number of pre-sacral spinal vertebra.
[01:24:27.120 --> 01:24:28.960] Jay, you think this one is the fiction.
[01:24:28.960 --> 01:24:31.280] Everyone else thinks this one is science.
[01:24:31.280 --> 01:24:36.080] And for this one, the question is: is it 1%?
[01:24:36.720 --> 01:24:38.480] Is it 20%?
[01:24:38.480 --> 01:24:39.760] Is it 5%?
[01:24:39.760 --> 01:24:41.360] Is it 0.1%?
[01:24:42.160 --> 01:24:43.360] 90%.
[01:24:43.360 --> 01:24:46.080] This one is science.
[01:24:46.080 --> 01:24:47.360] This is science.
[01:24:47.360 --> 01:24:53.840] Now, that number is, there are different estimates, right, depending on the method that is used.
[01:24:53.840 --> 01:25:03.280] But the best studies, ones where they do like total spine MRI scans, so you could count every single vertebra, come in at around 10%.
[01:25:03.280 --> 01:25:04.720] Which is actually kind of high.
[01:25:04.720 --> 01:25:06.080] But you're, I mean, Kara, you're right.
[01:25:06.080 --> 01:25:12.240] The thing is, what we learn as normal anatomy only applies to like 80-90% of people.
[01:25:12.480 --> 01:25:12.960] Guideline.
[01:25:13.120 --> 01:25:15.120] Yeah, that's sure.
[01:25:15.120 --> 01:25:18.640] Like maybe sometimes it's 60%, sometimes it's 90%.
[01:25:18.640 --> 01:25:19.440] Sometimes it's higher.
[01:25:19.440 --> 01:25:22.400] Sometimes, depending on how absolutely critical it is.
[01:25:22.400 --> 01:25:29.120] For the cervical vertebra, it's pretty rare to have an abnormal number of neck vertebra.
[01:25:29.120 --> 01:25:34.800] But for thoracic and lumbar, variations are pretty common, you know, adding up to about 10%.
[01:25:34.800 --> 01:25:41.920] People have like six lumbar vertebra or five lumbar vertebra or they have 11 or 13 thoracic vertebra.
[01:25:41.920 --> 01:25:49.360] Interestingly, this can result in surgeons operating on the wrong vertebra.
[01:25:49.600 --> 01:25:50.240] Oh, crap.
[01:25:50.240 --> 01:25:51.520] You better count them all.
[01:25:51.520 --> 01:25:51.880] Yeah, exactly.
[01:25:52.040 --> 01:25:59.960] And in fact, they think that abnormal vertebral numbers are responsible for 40% of the times when that happens.
[01:25:59.600 --> 01:26:01.640] Holy almost all of them.
[01:26:02.760 --> 01:26:04.440] Don't they do a full x-ray before?
[01:26:04.920 --> 01:26:10.360] Well, they just x-ray, but they might do a thoracic MRI scan and say, oh, yeah, that's the vertebra that we need to go for.
[01:26:10.840 --> 01:26:12.040] They're going to do the whole back.
[01:26:12.360 --> 01:26:20.840] Well, that's the paper that I was reading is basically saying, yes, we should do that because people were underestimating how frequent this is.
[01:26:21.320 --> 01:26:27.000] And it actually can affect trying to figure out which level is causing the pinched nerve.
[01:26:27.000 --> 01:26:37.880] Like, you have an L2 pinched nerve, and you count out the L2 vertebra, but it's like, no, but it's actually the L1 because there's an extra vertebra in there or whatever.
[01:26:38.760 --> 01:26:40.760] It can cause the surgeons to make a mistake.
[01:26:40.760 --> 01:26:45.480] So we have to consider variability in anatomy.
[01:26:45.480 --> 01:26:47.480] It's more common than we think.
[01:26:47.720 --> 01:26:51.800] Doing nerve conduction studies for the last 20 years, this comes up a lot there too.
[01:26:52.760 --> 01:26:58.280] You see the diagram of the nerve innervation, both of the skin and of the muscles and everything.
[01:26:58.280 --> 01:27:02.520] It's like, yeah, that's like, yeah, it's 60%, 70%, 80%, 90% of the time.
[01:27:02.520 --> 01:27:08.200] But depending on what nerve you're looking at, especially the farther you get downstream, you know what I mean?
[01:27:08.200 --> 01:27:13.480] Like, yeah, everyone has an aorta, but the smaller you get, the more variability there is.
[01:27:13.480 --> 01:27:21.080] And people have a lot of variability in their, like the nerve endings, even when you get to second, third degree, you know, branchings.
[01:27:21.400 --> 01:27:22.760] So yeah, a lot of variability.
[01:27:22.760 --> 01:27:28.200] Okay, that means that of the human remains found in Machu Picchu, 80% are female is the fiction.
[01:27:28.200 --> 01:27:30.520] However, I didn't make this up.
[01:27:31.000 --> 01:27:34.440] There was a study which showed that this was the case.
[01:27:34.440 --> 01:27:41.320] An archaeologist surveying the bones found that was their estimate.
[01:27:41.560 --> 01:27:44.480] 80% of the remains that they found in Machu Picchu were female.
[01:27:44.120 --> 01:27:49.040] But a later re-examination found out, no, they're 50-50.
[01:27:49.840 --> 01:27:51.520] Why did they think that so many were female?
[01:27:51.680 --> 01:27:56.400] I think they were using pelvic examination and then they later did DNA analysis.
[01:27:56.960 --> 01:27:57.840] Yeah, but I wonder why.
[01:27:57.840 --> 01:28:00.640] Oh, well, weren't they very, very small people?
[01:28:01.360 --> 01:28:03.920] Well, they're smaller than modern humans.
[01:28:04.560 --> 01:28:05.440] But even modernization.
[01:28:05.520 --> 01:28:14.160] But it could be either sampling error, or they just didn't, you know, if you're just going by how the pelvis looks, maybe they didn't do a good job.
[01:28:14.160 --> 01:28:19.520] Or there was, you know, again, some kind of sampling issue in terms of what they were looking for.
[01:28:19.840 --> 01:28:23.360] But a more thorough later analysis found that it was basically 50-50.
[01:28:23.360 --> 01:28:25.360] Now, what was Machu Picchu used for?
[01:28:25.360 --> 01:28:25.600] Right?
[01:28:25.600 --> 01:28:26.000] This has come up.
[01:28:26.000 --> 01:28:33.040] So Machu Picchu is like the crown jewel of the Incan Empire in terms of our lost city of the Incas.
[01:28:33.040 --> 01:28:38.560] It is considered to be, you know, it's the most famous remains of an Incan city.
[01:28:38.560 --> 01:28:39.280] It's beautiful.
[01:28:39.280 --> 01:28:41.520] I don't know if you guys, I've never been there, but I've seen pictures of it.
[01:28:41.520 --> 01:28:42.160] Oh, yeah, the pictures.
[01:28:42.240 --> 01:28:43.360] It looks amazing.
[01:28:44.240 --> 01:28:44.800] And we're not really sure.
[01:28:44.880 --> 01:28:45.760] It looks like a face.
[01:28:45.760 --> 01:28:47.680] Like the mountain looks like a face on its side.
[01:28:47.680 --> 01:28:48.000] Yeah, yeah.
[01:28:49.440 --> 01:28:53.280] But you know, like modern Peruvians are the shortest people in the world.
[01:28:53.280 --> 01:28:53.920] Yeah.
[01:28:53.920 --> 01:28:54.880] So that's interesting.
[01:28:54.880 --> 01:28:56.720] Like, I was just looking it up.
[01:28:56.720 --> 01:29:02.960] An average Peruvian woman is five feet, and an average Peruvian man is five feet four.
[01:29:02.960 --> 01:29:04.160] Wow, that is short.
[01:29:04.160 --> 01:29:06.560] So it might be hard just based on.
[01:29:06.720 --> 01:29:08.640] It might be hard if they're more petite, you know.
[01:29:08.640 --> 01:29:14.800] Yeah, to tell the difference of ancient remains if you're only looking at the way they look.
[01:29:14.800 --> 01:29:17.680] So there were human sacrifices in the area.
[01:29:19.040 --> 01:29:25.360] There are temples at Machu Picchu, so it does have some religious significance, but they're not exactly sure.
[01:29:25.360 --> 01:29:27.680] It probably wasn't just a regular city.
[01:29:27.680 --> 01:29:36.520] They think it might have been like the royal city or just like where the you know, the rich people lived, or it could have been mostly of religious significance.
[01:29:36.840 --> 01:29:39.080] It wasn't just a place to sacrifice people.
[01:29:39.080 --> 01:29:42.200] You know, again, that wasn't the main thing that was happening.
[01:29:42.280 --> 01:29:43.080] It was like a whole city.
[01:29:43.080 --> 01:29:44.040] It was a city, yeah.
[01:29:44.040 --> 01:29:51.480] But there were, they, there were, they said nearby, like there was, you know, places where they did human sacrifices.
[01:29:51.720 --> 01:29:53.400] But that wasn't, yeah, but Kara is right.
[01:29:53.400 --> 01:29:58.760] That's not what these bodies were; they were not just looking at the bodies of sacrificed individuals, these were the people in the city.
[01:29:58.760 --> 01:30:02.440] Yeah, they didn't just throw all the people into it like somewhere else.
[01:30:03.240 --> 01:30:05.880] Well, there are sites, though, Kara.
[01:30:05.880 --> 01:30:08.920] There are sites where there are just pits of sacrificed people.
[01:30:09.800 --> 01:30:10.760] Yeah, I'm not surprised.
[01:30:10.760 --> 01:30:11.880] This is not one of them, though.
[01:30:14.120 --> 01:30:20.200] And then they're all like have caved in skulls and stuff, like signs of extreme trauma.
[01:30:20.200 --> 01:30:20.920] Oh, gosh.
[01:30:21.400 --> 01:30:22.440] Nasty, what happened?
[01:30:23.160 --> 01:30:23.560] Yikes.
[01:30:23.720 --> 01:30:24.600] People are gross.
[01:30:24.600 --> 01:30:25.400] Brutal.
[01:30:25.400 --> 01:30:25.960] All right.
[01:30:25.960 --> 01:30:27.320] Well, good job, guys.
[01:30:27.320 --> 01:30:28.200] You got it.
[01:30:28.200 --> 01:30:30.440] Evan, give us a quote.
[01:30:30.760 --> 01:30:34.600] This week's quote was suggested by listener Pat from Michigan.
[01:30:34.600 --> 01:30:35.560] Thank you, Pat.
[01:30:35.560 --> 01:30:47.800] If an outsider perceives something wrong with a core scientific model, the humble and justified response of that curious outsider should be to ask, what mistake am I making?
[01:30:47.800 --> 01:30:51.800] Before assuming 100% of the experts are wrong.
[01:30:51.800 --> 01:30:52.360] Yeah.
[01:30:52.680 --> 01:30:54.120] David Brin.
[01:30:54.440 --> 01:30:55.000] Great author.
[01:30:55.080 --> 01:30:55.400] Brin.
[01:30:55.560 --> 01:30:56.040] David Brin.
[01:30:56.280 --> 01:30:56.840] David Brin.
[01:30:56.840 --> 01:30:57.160] Oh, my God.
[01:30:57.320 --> 01:30:58.280] The Uplift Wars.
[01:30:58.280 --> 01:30:58.760] Yeah, very good.
[01:30:59.480 --> 01:31:00.040] I still remember.
[01:31:00.120 --> 01:31:01.240] It's been like, what, 30 years?
[01:31:01.240 --> 01:31:02.400] Ev, I still remember those stories.
[01:31:02.760 --> 01:31:03.400] Great series.
[01:31:03.400 --> 01:31:04.680] Great science fiction series.
[01:31:04.680 --> 01:31:05.160] Yeah.
[01:31:05.480 --> 01:31:10.600] Yeah, really, the first science fiction series I read with actually alien aliens.
[01:31:10.600 --> 01:31:11.240] You know what I mean?
[01:31:11.240 --> 01:31:13.760] Not just humanoid aliens, but right, not Star Trek.
[01:31:13.920 --> 01:31:15.120] Completely alien aliens.
[01:31:16.640 --> 01:31:17.760] Very good.
[01:31:17.760 --> 01:31:18.320] All right.
[01:31:18.640 --> 01:31:19.680] Thanks, Evan.
[01:31:14.920 --> 01:31:20.000] Thanks.
[01:31:20.320 --> 01:31:22.880] Well, thank you all for joining me this week.
[01:31:22.880 --> 01:31:23.520] Sure, man.
[01:31:23.520 --> 01:31:24.480] You got it, brother.
[01:31:24.480 --> 01:31:29.200] And until next week, this is your Skeptics Guide to the Universe.
[01:31:31.760 --> 01:31:38.480] Skeptics Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking.
[01:31:38.480 --> 01:31:43.120] For more information, visit us at theskepticsguide.org.
[01:31:43.120 --> 01:31:47.040] Send your questions to info at the skepticsguide.org.
[01:31:47.040 --> 01:31:57.680] And if you would like to support the show and all the work that we do, go to patreon.com/skepticsguide and consider becoming a patron and becoming part of the SGU community.
[01:31:57.680 --> 01:32:01.360] Our listeners and supporters are what make SGU possible.
Prompt 6: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
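(Illustrative only, not part of the prompts sent to the model: a minimal Python sketch of how a reply to Prompt 6 could be validated against the JSON contract above. The function name parse_key_takeaways and the 3-item cap are assumptions drawn from the prompt text, not from the actual pipeline.)
import json

def parse_key_takeaways(raw_reply: str, max_items: int = 3) -> list[str]:
    # Parse the model reply and enforce the {"key_takeaways": [...]} contract from Prompt 6.
    data = json.loads(raw_reply)  # raises json.JSONDecodeError on malformed JSON (trailing commas, unbalanced braces)
    takeaways = data.get("key_takeaways")
    if not isinstance(takeaways, list) or not all(isinstance(item, str) and item.strip() for item in takeaways):
        raise ValueError('expected "key_takeaways" to be a list of non-empty strings')
    return takeaways[:max_items]  # keep at most 3 takeaways, as the prompt requires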
Prompt 7: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
  "segments": [
    {
      "segment_title": "Topic Discussion",
      "timestamp": "01:15:30",
      "key_takeaway": "main point from this segment",
      "segment_summary": "brief description of what was discussed"
    }
  ]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
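(Illustrative only: a small sketch, under assumed names, of the HH:MM:SS sanity check described above; a segment START timestamp must parse as HH:MM:SS and must not exceed the total audio length. The example duration comes from the transcript below, whose final caption ends around 01:32:01.)
import re

TIMESTAMP_RE = re.compile(r"^(\d{2}):([0-5]\d):([0-5]\d)$")  # HH:MM:SS, e.g. 00:05:30 or 01:22:45

def timestamp_to_seconds(ts: str) -> int:
    # Convert an HH:MM:SS segment timestamp to seconds, rejecting malformed values.
    match = TIMESTAMP_RE.match(ts)
    if not match:
        raise ValueError(f"bad timestamp format: {ts!r}")
    hours, minutes, seconds = (int(part) for part in match.groups())
    return hours * 3600 + minutes * 60 + seconds

def segment_start_is_valid(ts: str, audio_length_seconds: float) -> bool:
    # A segment START timestamp must never be greater than the total audio length.
    return timestamp_to_seconds(ts) <= audio_length_seconds

# Example: the final caption ends at 01:32:01, so the episode runs roughly 5521 seconds.
# segment_start_is_valid("01:15:30", 5521.0) -> True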
Prompt 8: Media Mentions
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
  "media_mentions": [
    {
      "title": "Exact Title as Mentioned",
      "category": "Book",
      "author_artist": "N/A",
      "context": "Brief context of why it was mentioned",
      "context_phrase": "The exact sentence or phrase where it was mentioned",
      "timestamp": "estimated time like 01:15:30"
    }
  ]
}
If no media is mentioned, return: {"media_mentions": []}
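(Illustrative only: a minimal sketch of the "use the EARLIER of the 2 timestamps" rule above, applied to the cue-range format used throughout this transcript. The function name and regex are assumptions, not part of the pipeline.)
import re

CUE_RE = re.compile(r"(\d{2}:\d{2}:\d{2})\.\d{3}\s*-->\s*(\d{2}:\d{2}:\d{2})\.\d{3}")

def mention_timestamp(cue: str) -> str:
    # Return the earlier timestamp of a VTT cue range, truncated to HH:MM:SS.
    match = CUE_RE.search(cue)
    if not match:
        raise ValueError(f"no cue range found in {cue!r}")
    start, _end = match.groups()
    return start  # the earlier of the two timestamps, as the prompt requires

# Example with the cue range cited above:
# mention_timestamp("01:13:42.520 --> 01:13:46.720") -> "01:13:42"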
Full Transcript
[00:00:00.320 --> 00:00:04.640] The Skeptic's Guide to the Universe is brought to you by Progressive Insurance.
[00:00:04.640 --> 00:00:08.960] Do you ever think about switching insurance companies to see if you could save some cash?
[00:00:08.960 --> 00:00:14.480] Progressive makes it easy to see if you could save when you bundle your home and auto policies.
[00:00:14.480 --> 00:00:17.120] Try it at progressive.com.
[00:00:17.120 --> 00:00:19.920] Progressive casualty insurance company and affiliates.
[00:00:19.920 --> 00:00:21.440] Potential savings will vary.
[00:00:21.440 --> 00:00:23.600] Not available in all states.
[00:00:27.120 --> 00:00:30.320] You're listening to the skeptic's guide to the universe.
[00:00:30.320 --> 00:00:33.200] Your escape to reality.
[00:00:33.840 --> 00:00:36.800] Hello and welcome to the Skeptics Guide to the Universe.
[00:00:36.800 --> 00:00:41.520] Today is Tuesday, December 3rd, 2024, and this is your host, Stephen Novella.
[00:00:41.520 --> 00:00:43.280] Joining me this week are Bob Novella.
[00:00:43.280 --> 00:00:43.840] Hey, everybody.
[00:00:43.840 --> 00:00:45.200] Kara Santa Maria.
[00:00:45.200 --> 00:00:45.600] Howdy.
[00:00:45.680 --> 00:00:46.720] Jay Novella.
[00:00:46.720 --> 00:00:47.360] Hey, guys.
[00:00:47.360 --> 00:00:49.120] And Evan Bernstein.
[00:00:49.120 --> 00:00:50.080] Good evening, everyone.
[00:00:50.080 --> 00:00:52.080] So, guys, we're going to DC this weekend.
[00:00:52.080 --> 00:00:59.120] When the show comes out, we will be basically in the middle of our private show in D.C., and then we have the extravaganza.
[00:00:59.120 --> 00:01:06.240] And it's going to be a little tricky because, as some of you already know, I fell and cracked a rib last week.
[00:01:06.240 --> 00:01:08.240] Oh, Steve.
[00:01:08.720 --> 00:01:09.920] Don't do that, by the way.
[00:01:09.920 --> 00:01:13.360] It's like I recommend you do not fracture a rib.
[00:01:13.360 --> 00:01:14.800] It's not what happened.
[00:01:14.800 --> 00:01:15.360] I hear that hurt.
[00:01:16.320 --> 00:01:17.120] It was nothing.
[00:01:17.360 --> 00:01:18.480] It was completely stupid.
[00:01:18.480 --> 00:01:29.040] I was, you know, outside at night at my mother-in-law's place, and I tripped on a stone patio thing, fell on my right side, and that's it.
[00:01:29.520 --> 00:01:30.240] You were drunk.
[00:01:30.240 --> 00:01:30.880] Come on.
[00:01:33.680 --> 00:01:37.520] So, yeah, unfortunately, one of my ribs took the majority of the impact.
[00:01:37.520 --> 00:01:38.080] Oof.
[00:01:38.400 --> 00:01:41.520] So, how bad does it hurt for those of us who have not broken a rib?
[00:01:41.520 --> 00:01:42.960] So, it's not bad.
[00:01:42.960 --> 00:01:46.560] It's a haven't been x-rayed or anything.
[00:01:46.560 --> 00:01:51.680] I'm just diagnosing myself based upon symptoms, but it's probably a non-displaced fracture.
[00:01:51.680 --> 00:01:52.720] I could feel the rib.
[00:01:52.720 --> 00:01:54.000] Everything feels good.
[00:01:54.000 --> 00:01:58.360] Yes, I think I could take a deep breath without any problems.
[00:01:58.840 --> 00:01:59.880] That's the big danger anyway.
[00:02:00.040 --> 00:02:00.680] Yeah, the lungs.
[00:02:01.160 --> 00:02:01.640] That's huge.
[00:01:59.200 --> 00:02:03.240] Yeah, well, it's been a week now.
[00:02:05.480 --> 00:02:09.240] But it's just I am very restricted in my movements.
[00:02:09.240 --> 00:02:09.880] You know what I mean?
[00:02:09.880 --> 00:02:13.080] I can't bear any weight with my right arm, which is very limiting.
[00:02:13.080 --> 00:02:15.320] So you guys are going to have to do all my heavy lifting this weekend.
[00:02:15.800 --> 00:02:17.160] It's basically what I'm telling you.
[00:02:17.160 --> 00:02:18.920] No interpretive dance at all.
[00:02:18.920 --> 00:02:23.400] Yeah, like, so remember the last extravaganza where I did that Russian goose-stepping dance, whatever?
[00:02:23.720 --> 00:02:24.200] That ain't happening.
[00:02:24.280 --> 00:02:26.280] That's not happening this time.
[00:02:26.840 --> 00:02:28.680] No physicality for me.
[00:02:28.680 --> 00:02:29.080] Oh, no.
[00:02:29.320 --> 00:02:31.880] Yeah, you just gotta be.
[00:02:32.200 --> 00:02:33.880] Well, you'll be guessing the freeze frames.
[00:02:34.360 --> 00:02:37.560] Or, I mean, I could, as long as I'm vertical, I'm fine.
[00:02:37.560 --> 00:02:38.760] Like, I'm not going on the floor.
[00:02:38.760 --> 00:02:41.480] I'm not doing anything ambitious.
[00:02:41.640 --> 00:02:43.880] It really, just, it's very, it's just miserable.
[00:02:43.880 --> 00:02:44.600] It's, you know what I mean?
[00:02:44.600 --> 00:02:47.240] Because I have to get into very specific positions.
[00:02:47.240 --> 00:02:50.600] Like, there can't be any tension on my ribs, you know?
[00:02:50.840 --> 00:02:56.920] It's one of those core muscles, you know, one of those positional things where it's like try to put your back completely at rest.
[00:02:56.920 --> 00:02:59.000] Like, it's very, very difficult to do it.
[00:02:59.240 --> 00:03:00.440] Sleeping is terrible.
[00:03:01.560 --> 00:03:05.640] I guess you're going to try really hard not to get into a fist fight with someone at the show, right?
[00:03:05.720 --> 00:03:05.800] Yeah.
[00:03:05.880 --> 00:03:06.360] Some audience members.
[00:03:06.600 --> 00:03:07.400] Very, very hard.
[00:03:07.400 --> 00:03:10.920] I know usually I get into two or three fist fights with audience members.
[00:03:11.560 --> 00:03:12.520] I know, yeah.
[00:03:12.520 --> 00:03:14.680] Well, what the hell go for that right side?
[00:03:14.920 --> 00:03:17.640] And I can't really give like a full cough.
[00:03:17.640 --> 00:03:19.720] So I'm always doing these half coughs.
[00:03:20.200 --> 00:03:21.400] Oh, I hate those.
[00:03:23.960 --> 00:03:24.280] I know.
[00:03:24.280 --> 00:03:29.240] I got to, it's like just enough to because you're constantly clearing gunk out of your lungs.
[00:03:29.640 --> 00:03:30.040] I know.
[00:03:30.040 --> 00:03:31.400] Humans are disgusting.
[00:03:31.400 --> 00:03:38.600] That's part of the problem with anything that impairs your ability to cough is that you run the risk of the gunk getting backed up.
[00:03:38.840 --> 00:03:43.000] Well, Steve, you know, the trick that I was taught before.
[00:03:43.480 --> 00:03:45.520] Yeah, is to push into a pillow.
[00:03:45.520 --> 00:03:47.120] So after my hysterectomy, I had to.
[00:03:47.120 --> 00:03:48.640] I mean, you still have to cough sometimes.
[00:03:48.640 --> 00:03:49.280] It's brutal.
[00:03:45.000 --> 00:03:51.600] Yeah, but this isn't an abdominal problem.
[00:03:51.840 --> 00:03:53.360] So that helps with the abdominal.
[00:03:53.360 --> 00:03:54.160] It's higher.
[00:03:54.480 --> 00:03:57.040] It's and on the side.
[00:03:57.040 --> 00:04:01.920] And it's like try breathing or coughing without moving your ribs, you know.
[00:04:01.920 --> 00:04:04.720] Yeah, but I mean, you can't breathe or cough without moving your abs either.
[00:04:04.720 --> 00:04:05.120] I know.
[00:04:05.120 --> 00:04:05.520] I'm actually trying to get a lot of things.
[00:04:05.680 --> 00:04:06.480] You know, they're all applied.
[00:04:06.800 --> 00:04:11.040] I'm trying to do abdominal breathing rather than rib breathing.
[00:04:11.120 --> 00:04:11.680] That's good for you.
[00:04:12.000 --> 00:04:14.960] I teach my patients how to do that, like diaphragmatic breathing all the time.
[00:04:14.960 --> 00:04:15.520] Right, exactly.
[00:04:15.520 --> 00:04:16.640] It's good for your blood pressure.
[00:04:16.640 --> 00:04:17.600] Oh, is that right?
[00:04:20.080 --> 00:04:22.000] Well, I got some interesting news, guys.
[00:04:22.000 --> 00:04:22.480] Oh, yeah?
[00:04:22.800 --> 00:04:24.160] What'd you break, Jay?
[00:04:24.160 --> 00:04:26.400] We now have SGU dice.
[00:04:26.400 --> 00:04:26.960] Ooh.
[00:04:27.600 --> 00:04:27.840] Yeah.
[00:04:28.000 --> 00:04:28.560] Can't wait to see.
[00:04:28.720 --> 00:04:30.080] Not what I was expecting you to say.
[00:04:30.080 --> 00:04:40.640] Yeah, so what happened was Brian Wecht, you know, he and I talk all the time, and he's constantly, like, he's such a good friend that he is constantly throwing me ideas, whatever.
[00:04:40.640 --> 00:04:43.040] It could be anything from show ideas to whatever.
[00:04:43.040 --> 00:04:45.840] One day he said, hey, you guys ever sell dice?
[00:04:45.840 --> 00:04:46.560] I'm like, no.
[00:04:46.560 --> 00:04:47.440] And I'm like, why?
[00:04:47.440 --> 00:04:51.520] I'm like, you think people that listen to a skeptical podcast would want to buy dice?
[00:04:51.520 --> 00:04:56.720] And he goes, he goes, look, I'm in an 80s rock band, and we sold a ton of dice.
[00:04:56.720 --> 00:04:57.440] You should try to do it.
[00:04:57.600 --> 00:05:00.000] I think he said they were his best-selling swag item.
[00:05:00.000 --> 00:05:00.400] Yeah.
[00:05:00.400 --> 00:05:00.720] He did.
[00:05:01.120 --> 00:05:02.480] Yeah, so then I'm like, all right.
[00:05:02.480 --> 00:05:06.880] I mean, look, because I always like it when I like the swag that we buy.
[00:05:07.280 --> 00:05:14.320] Typically, I like everything because I only buy things that are high quality and like something that I would want or buy because that's a good marker for me.
[00:05:14.320 --> 00:05:15.120] A J test.
[00:05:15.120 --> 00:05:19.760] So, you know, I scrubbed the internet and I found a manufacturer that I liked.
[00:05:19.760 --> 00:05:22.640] I mean, I really dig this company that we're buying dice from.
[00:05:22.640 --> 00:05:23.120] And that's it.
[00:05:23.120 --> 00:05:23.400] Steve.
[00:05:23.400 --> 00:05:25.840] And I picked out a couple of styles and everything.
[00:05:25.840 --> 00:05:30.440] And so they're going to be at the DC show with us, which is, you know, people can buy those there if they want them.
[00:05:30.440 --> 00:05:32.040] They'll have our logo on them, obviously.
[00:05:29.680 --> 00:05:35.800] Yeah, they have the 20, the D20 on the 20 has an SGU logo.
[00:05:35.960 --> 00:05:36.920] Woohoo!
[00:05:36.920 --> 00:05:38.280] Yeah, we got a bunch of new swag for this.
[00:05:38.520 --> 00:05:40.680] Yeah, yeah, we just reloaded our swag.
[00:05:40.680 --> 00:05:44.120] It was a good time to buy it because lots of things were on sale because of the holiday.
[00:05:44.280 --> 00:05:46.200] That was basically the impetus right there.
[00:05:46.200 --> 00:05:49.480] Kara, have you ever rolled a 20-sided die before?
[00:05:50.040 --> 00:05:51.240] Like an icosahedron?
[00:05:51.560 --> 00:05:54.840] How many sides does the die from Scattergories have?
[00:05:54.840 --> 00:05:56.440] It doesn't have 20 letters on it.
[00:05:56.440 --> 00:05:57.880] I'm about to find out.
[00:05:58.120 --> 00:06:00.040] Scattergories die is a 20-sided die.
[00:06:00.040 --> 00:06:00.520] Then, yes, I have to do it.
[00:06:00.680 --> 00:06:02.520] So you have rolled shot.
[00:06:03.000 --> 00:06:04.920] Well, anybody who's played Scattergories has.
[00:06:04.920 --> 00:06:05.480] They're fun.
[00:06:05.480 --> 00:06:06.280] Yeah, I think I have.
[00:06:06.280 --> 00:06:06.760] I've played it.
[00:06:06.760 --> 00:06:08.040] It's been a while, I guess.
[00:06:08.040 --> 00:06:11.080] You'll never forget your first D20, I promise you.
[00:06:11.400 --> 00:06:12.120] Totally forgot.
[00:06:12.120 --> 00:06:12.520] That's why I have a lot of fun.
[00:06:12.760 --> 00:06:13.800] I still have mine.
[00:06:13.800 --> 00:06:14.840] I was 10 years old.
[00:06:14.840 --> 00:06:15.320] I still have mine.
[00:06:15.640 --> 00:06:18.280] Kara, now you can have an SGU D20.
[00:06:18.280 --> 00:06:18.680] Okay.
[00:06:19.400 --> 00:06:20.520] That's a keepsake.
[00:06:20.520 --> 00:06:21.000] Yeah.
[00:06:21.000 --> 00:06:24.280] All right, let's get on with some skeptical content here, Bob.
[00:06:24.440 --> 00:06:26.440] You're going to start us off with a quickie.
[00:06:26.440 --> 00:06:27.320] Thank you, Steve.
[00:06:27.320 --> 00:06:28.760] This is your quickie with Bob.
[00:06:28.760 --> 00:06:30.040] All right, how cool is this, guys?
[00:06:30.040 --> 00:06:37.160] Imagine putting something under a microscope and being able to easily manipulate its surface with atomic precision.
[00:06:37.160 --> 00:06:41.880] This is exactly what researchers claim in a study published in Applied Surface Science.
[00:06:41.880 --> 00:06:43.240] The lead researcher, Dr.
[00:06:43.560 --> 00:06:51.880] Motaba Moshkani, said, Our laser method provides atomic level control over diamond surfaces in a standard air environment.
[00:06:51.880 --> 00:06:56.280] This level of precision is typically only possible with large, complex vacuum equipment.
[00:06:56.280 --> 00:07:00.200] The ability to achieve it with a simple laser process is truly remarkable.
[00:07:00.200 --> 00:07:01.240] Remarkable?
[00:07:01.240 --> 00:07:03.320] So, yeah, this is really slick.
[00:07:03.320 --> 00:07:09.640] To do this, they use a deep UV light laser to precisely deliver pulses of light onto the diamond surface.
[00:07:09.640 --> 00:07:16.240] And these pulses trigger chemical reactions that can precisely remove carbon atoms from this top atomic layer.
[00:07:16.560 --> 00:07:27.680] Now, this could have tremendous impact on electronics, quantum computers, advanced manufacturing, specifically in materials where minor changes to the surface can have dramatic performance improvements.
[00:07:27.680 --> 00:07:36.640] For example, they showed, and MIT Lincoln Laboratory confirmed that this technique was able to increase diamond surface conductivity by a factor of seven.
[00:07:36.640 --> 00:07:40.560] Professor Richard Mildren, the team lead, says:
[00:07:40.560 --> 00:07:46.480] We were amazed that such a minor adjustment to the surface could yield such a substantial boost in conductivity.
[00:07:46.640 --> 00:07:50.080] Now, you may be thinking, yeah, but doing this atom by atom, right?
[00:07:50.080 --> 00:07:51.200] Isn't that how you would do it?
[00:07:51.200 --> 00:07:52.800] It would just take forever.
[00:07:52.800 --> 00:07:56.720] But no, this technique is precise and fast at the same time.
[00:07:57.040 --> 00:08:04.160] They said that they were able to remove 1% of a surface monolayer in 0.2 milliseconds.
[00:08:04.160 --> 00:08:10.160] Now, I don't know how big this surface monolayer was, but speed is not an issue with this technique, apparently.
[00:08:10.400 --> 00:08:10.640] Dr.
[00:08:10.640 --> 00:08:14.640] Mushkani said: We've shown that the process is both rapid and scalable.
[00:08:14.640 --> 00:08:19.920] It's a compelling option for industries requiring advanced material processing.
[00:08:19.920 --> 00:08:22.160] Professor Mildren said, This is just the beginning.
[00:08:22.160 --> 00:08:31.120] We're excited to explore how this technique can be optimized further to unlock the full potential of diamonds in electronics, quantum technologies, and beyond.
[00:08:31.120 --> 00:08:34.720] So, this is yet another technology that I will be following closely.
[00:08:34.720 --> 00:08:38.000] So, this has been your atomically precise quickie with Bob.
[00:08:38.000 --> 00:08:39.120] Back to you, Steve.
[00:08:39.120 --> 00:08:40.480] Thank you, Bob.
[00:08:40.480 --> 00:08:46.640] All right, Kara, this one's intriguing, and this is different than this cuts against what I would have thought.
[00:08:46.640 --> 00:08:51.120] So, tell me about whether or not humans have innate morality.
[00:08:51.120 --> 00:08:58.480] Well, this cuts against what the researchers thought as well, which is why this is kind of a really interesting story.
[00:08:58.480 --> 00:09:04.920] And in a way, it relates back to: I've got to ask, do you guys remember when we were at SaiCon?
[00:09:04.920 --> 00:09:10.600] I know it feels like a long time ago to us now, but to our listeners, I think it doesn't because didn't that episode just air last year?
[00:09:11.080 --> 00:09:11.720] That's perfect.
[00:09:13.000 --> 00:09:13.800] Time travel.
[00:09:13.800 --> 00:09:14.680] Yeah, time travel.
[00:09:14.680 --> 00:09:32.200] But the story that I covered was about how to sort of keep researchers honest and how to ensure that what we are doing in the publication world is appropriately vetted.
[00:09:32.200 --> 00:09:40.680] And we talked about pre-registering studies and we talked about open access and we talked about having raw data available for other researchers to look at.
[00:09:40.680 --> 00:09:45.720] And I think in some ways, this study is a bit of a case study in being able to do that.
[00:09:45.720 --> 00:10:01.880] So, published in Developmental Science on November 26th, 2024 is a paper called Infant Social Evaluation of Helpers and Hinderers, a large-scale multi-lab coordinated replication study.
[00:10:01.880 --> 00:10:11.800] Researchers in this study looked at a seminal study that was published by Hamlin and colleagues in 2007.
[00:10:11.800 --> 00:10:40.040] And in that study, that a lot of people have cited, researchers found that forming social evaluations, as they put it, based on third-party interactions, that's like a very wonky way to say, observing a social engagement with another either human or toy or character, that within the first year of life, infants would choose a helper over a hinderer.
[00:10:40.040 --> 00:10:52.400] And they used a paradigm in which an object with eyes was trying to climb a hill, and another object with eyes either gave it a boost or prevented it from going up.
[00:10:52.400 --> 00:11:10.000] And using that paradigm, in this study from 2007, researchers showed, hey, look, even within the first year of life, based on their eye gazes, these infants are spending more time looking at the helpers, less time looking at the hinderers, therefore they're preferring the helpers over the hinderers.
[00:11:10.000 --> 00:11:15.280] Therefore, there must be some sort of innate decision-making around helping.
[00:11:15.280 --> 00:11:17.200] If we want to call that morality, we can.
[00:11:17.200 --> 00:11:18.480] The researchers don't.
[00:11:18.480 --> 00:11:20.400] Just all the write-ups do.
[00:11:21.040 --> 00:11:23.120] So, which is not uncommon, right?
[00:11:23.120 --> 00:11:29.680] So, this study said, okay, this is interesting, because since that 2007 study, a bunch of studies said, hey, look, we found the same thing.
[00:11:29.680 --> 00:11:31.600] But a bunch of other studies were like, not so fast.
[00:11:31.600 --> 00:11:34.000] I don't think we found the same thing, and I'm confused.
[00:11:34.000 --> 00:11:35.760] So they did something interesting.
[00:11:35.760 --> 00:11:41.360] They said, okay, we want to do a different kind of experiment altogether.
[00:11:41.360 --> 00:11:49.520] And in this study, they looked at a project across multiple labs in a really large scale.
[00:11:49.520 --> 00:11:59.840] Ultimately, they ended up seeing over a thousand infants across 40 different developmental psychology teams.
[00:11:59.840 --> 00:12:03.360] They ended up using about half of that data for good reason.
[00:12:03.360 --> 00:12:05.120] We can talk about that in a bit.
[00:12:05.120 --> 00:12:18.960] But they looked at over a thousand children between the ages of five and a half and ten and a half months across multiple labs, and they developed a paradigm that was based on that Hill paradigm that I described before.
[00:12:18.960 --> 00:12:23.360] So there are four different conditions in the helping and hindering paradigm.
[00:12:23.360 --> 00:12:26.240] There are social conditions and non-social conditions.
[00:12:26.240 --> 00:12:28.080] Two of them are social, two are non-social.
[00:12:28.080 --> 00:12:32.840] And they're helping or up conditions and hindering or down conditions.
[00:12:32.840 --> 00:12:35.160] Two of them are helping, two of them are hindering.
[00:12:29.920 --> 00:12:37.160] So we'll start with the helping up condition.
[00:12:37.480 --> 00:12:45.160] Imagine a little yellow triangle with eyes and a little red circle about the same size with eyes.
[00:12:45.160 --> 00:12:49.640] The helping condition, the yellow triangle, helps the red circle get up the hill.
[00:12:49.640 --> 00:12:51.160] The red circle can't go on its own.
[00:12:51.160 --> 00:12:55.720] The yellow triangle comes up and gives it a boost from underneath and then it makes it up the hill.
[00:12:55.720 --> 00:13:03.320] That same paradigm is shown in a non-social condition where the little red character doesn't have eyes.
[00:13:03.320 --> 00:13:05.800] So now it's just a ball rolling up a hill.
[00:13:05.800 --> 00:13:09.960] And the helper, the yellow triangle with eyes, moves that red ball up the hill.
[00:13:09.960 --> 00:13:11.320] So it's non-social, right?
[00:13:11.320 --> 00:13:13.960] It's moving an object, not another character.
[00:13:13.960 --> 00:13:24.520] In the other condition, the hindering condition, the red ball is trying to go up the hill, but a blue square comes from above it and pushes it down, preventing it from going up the hill.
[00:13:24.520 --> 00:13:32.600] So the helper helps it up from behind, the yellow triangle, the hinderer, the blue square, pushes it down from above.
[00:13:32.600 --> 00:13:37.960] And there's also the non-social version of that, where the red character doesn't have eyes anymore, and now it's just a red ball.
[00:13:37.960 --> 00:13:41.560] So the googly eyes make it social, if that makes sense.
[00:13:41.560 --> 00:13:43.160] Now these are infants, remember.
[00:13:43.160 --> 00:13:47.080] So they're bright-colored, very simple displays.
[00:13:47.080 --> 00:13:50.360] And they showed these different conditions to the infants.
[00:13:50.360 --> 00:13:56.360] Now, in the original study, they just looked at eye gazes, but in this replication study, they did something even more interesting.
[00:13:56.360 --> 00:14:09.640] They added a behavioral component where after the infants looked at these different conditions in randomized order, they then were shown a board with cutouts of the characters.
[00:14:09.640 --> 00:14:11.880] And they were asked to pick which character they liked.
[00:14:11.880 --> 00:14:17.280] They were looking for, researchers were, and this is based on a lot of research about working with infants.
[00:14:17.600 --> 00:14:24.480] They were looking to see which of the characters the infants grabbed for and also looked at at the same time.
[00:14:24.480 --> 00:14:25.360] Does that make sense?
[00:14:25.600 --> 00:14:25.920] Yeah.
[00:14:25.920 --> 00:14:27.040] So they needed to look at it.
[00:14:27.040 --> 00:14:28.640] They also needed to grab for it.
[00:14:28.640 --> 00:14:33.040] Now, a lot of them were thrown out because some infants just didn't pick one.
[00:14:33.040 --> 00:14:36.560] They either went back and forth between the two or they looked at one and grabbed the other.
[00:14:36.560 --> 00:14:40.400] And it was, you know, inconsistent which they were showing a preference for.
[00:14:40.400 --> 00:14:41.680] So they couldn't use that data.
[00:14:41.680 --> 00:14:50.240] In the situations where it was obvious which character the infants went for, what do you think they found?
[00:14:50.960 --> 00:14:51.680] The same.
[00:14:52.320 --> 00:14:52.800] It was the same.
[00:14:52.880 --> 00:14:53.600] Yeah, same.
[00:14:53.600 --> 00:14:54.480] Fine.
[00:14:54.480 --> 00:14:55.040] It was the same.
[00:14:55.840 --> 00:15:00.000] So overall, 49.34% of the infants chose the helpers.
[00:15:00.000 --> 00:15:00.320] Yeah.
[00:15:00.320 --> 00:15:02.240] Oh, and that was in the social condition.
[00:15:02.240 --> 00:15:09.440] 55.85% chose the helpers in the non-social condition.
[00:15:09.440 --> 00:15:12.560] And when they compared the two, there was no significant difference.
[00:15:12.560 --> 00:15:14.000] Helpers or hinderers.
[00:15:14.000 --> 00:15:16.960] It was no different from each other, no different from chance alone.
[00:15:16.960 --> 00:15:17.680] Coin flip.
[00:15:17.680 --> 00:15:21.440] And so the researchers were like, whoa, because that was not their hypothesis.
[00:15:22.080 --> 00:15:23.840] They were like, wait, we did this before.
[00:15:23.840 --> 00:15:27.520] We showed that there was this pro-social thing that appears to be innate.
[00:15:27.520 --> 00:15:34.000] Not only do infants go for the helpers, but we're also hoping that they go for the social group a little bit more too.
[00:15:34.000 --> 00:15:43.280] They did find, I think, a slight preference for the social group, but they didn't find any significant difference between the helpers and the hinderers.
[00:15:43.280 --> 00:15:46.160] So they then talk about why this might be the case.
[00:15:46.160 --> 00:15:51.360] They have some different ideas, but of course, this is in the discussion section of a study.
[00:15:51.360 --> 00:15:52.720] We just don't know.
[00:15:52.720 --> 00:15:54.960] We are speculating.
[00:15:54.960 --> 00:16:02.440] They are wondering if maybe this skill or this preference is present, just not this early.
[00:15:59.840 --> 00:16:04.920] Maybe it's going to develop after 10 and a half months.
[00:16:05.240 --> 00:16:10.040] Or maybe there is no innate preference at all.
[00:16:10.040 --> 00:16:17.160] Or maybe this study isn't tapping in to the best way to look at an innate preference.
[00:16:17.160 --> 00:16:30.200] Maybe they don't see a significant difference between the groups using the Hill paradigm, but maybe there are other sort of more naturalistic studies where they would see that pro-social or that quote moral behavior.
[00:16:30.200 --> 00:16:36.760] But what they like is that this was proof of concept for using these active measures so that the infants had a choice.
[00:16:36.760 --> 00:16:48.520] They got to choose the character at the end instead of the classic way that researchers study where they just kind of count how long an infant gazes at an object and then they infer that that means they're more interested in that object.
[00:16:48.520 --> 00:16:59.880] They're calling this a first of its kind study that shows proof of concept for using an active behavioral measure that is a manual choice in a large-scale multi-lab project studying infants.
[00:16:59.880 --> 00:17:08.440] Oh, and one thing I didn't mention that I should have is that every aspect of this study was pre-registered and all of the data is available open access.
[00:17:08.440 --> 00:17:09.000] That's good.
[00:17:09.800 --> 00:17:15.320] So they really did check a lot of those boxes because they were worried about the replication crisis.
[00:17:15.320 --> 00:17:30.760] They were worried that this original study, which has been cited so many times that so many people were taking as truth, was kind of showing, I guess we could say, ambiguous or maybe ambivalent, you know, like, sometimes it's supported, sometimes it's not.
[00:17:30.760 --> 00:17:31.800] It's kind of wiggly.
[00:17:31.800 --> 00:17:36.760] Okay, well, let's just look at it once and for all across multiple labs, across multiple countries.
[00:17:36.760 --> 00:17:38.840] We'll all do the exact same thing.
[00:17:38.840 --> 00:17:42.440] We'll all collaborate on this study, and we'll see if we find the same thing.
[00:17:42.440 --> 00:17:53.840] So, you know, they talk quite a bit in the paper about the difference between doing a meta-analysis and doing a large-scale multi-lab project and how there are different ways to approach it statistically.
[00:17:53.840 --> 00:17:59.440] And, you know, there are strengths and weaknesses of both, but it's a pretty robust paradigm.
[00:17:59.440 --> 00:18:07.840] And the researchers were surprised that they failed to replicate that study.
[00:18:07.840 --> 00:18:17.440] And I think it's also a great example of publishing negative results because just because the results are quote negative doesn't mean there's not a lot of really interesting stuff here.
[00:18:17.440 --> 00:18:19.120] Yeah, negative data is data.
[00:18:19.120 --> 00:18:19.440] Yeah.
[00:18:20.080 --> 00:18:20.880] It tells us a lot.
[00:18:20.880 --> 00:18:21.760] Yeah, this is interesting.
[00:18:21.760 --> 00:18:27.760] I mean, one interpretation of this is that, well, maybe the effect went away because it was all p-hacking to begin with.
[00:18:27.760 --> 00:18:31.440] And then when they control for p-hacking, it goes away.
[00:18:31.440 --> 00:18:33.120] That's one interpretation.
[00:18:33.280 --> 00:18:36.960] The other one is that it's just not a good paradigm for what they're looking for.
[00:18:37.360 --> 00:18:39.280] As we say, everything's a construct, right?
[00:18:39.280 --> 00:18:42.960] You can't, we don't know that this is a marker of morality in kids.
[00:18:42.960 --> 00:18:46.320] It's just one way that they're choosing to look at it.
[00:18:46.320 --> 00:18:48.320] And it could indicate that.
[00:18:48.320 --> 00:18:52.400] The thing is, it's like one of those things where you could rationalize it either way.
[00:18:52.400 --> 00:19:01.920] Are kids more interested in the pro-social character because they're drawn to it, or are they more interested in the anti-social character because that's more fascinating?
[00:19:01.920 --> 00:19:03.760] And it's almost like a morbid curiosity.
[00:19:03.760 --> 00:19:06.080] It's like, there's something wrong with this character.
[00:19:06.080 --> 00:19:12.640] What's going on with this thing, you know? So we can't tell what's going on in the minds of these infants.
[00:19:12.640 --> 00:19:14.640] Obviously, they can't talk.
[00:19:14.960 --> 00:19:21.840] And then the other thing I was thinking when I was reading this was like, well, you can't make statements like there is no evidence for innate morality.
[00:19:21.840 --> 00:19:24.240] I mean, that I think is massively overcalling this.
[00:19:24.280 --> 00:19:29.760] But as you say, you know, it could just be, well, maybe it doesn't really manifest until two years old, you know?
[00:19:29.880 --> 00:19:31.240] Yeah, and they talk about that too.
[00:19:29.760 --> 00:19:31.720] Yeah.
[00:19:29.840 --> 00:19:33.960] The brain is still maturing at this point.
[00:19:34.840 --> 00:19:39.720] That could be one of those sophisticated frontal lobe things that doesn't kick in for a while, you know?
[00:19:40.040 --> 00:19:40.680] Absolutely.
[00:19:40.680 --> 00:19:49.160] They say, you know, that their first interpretation is that infants between five and ten months of age don't prefer pro-social characters over antisocial characters after all.
[00:19:49.160 --> 00:19:54.760] And then they talk about why those results could have been different, as you mentioned, p-hacking being one of them.
[00:19:54.760 --> 00:20:02.360] A second interpretation they say is that infants don't prefer helpers in the Hill paradigm, but maybe would prefer them in some other context.
[00:20:02.680 --> 00:20:14.280] And they talk about a lot of the studies that have been published, including a meta-analysis, that did find a preference for agents performing different kinds of pro-social actions.
[00:20:14.280 --> 00:20:35.960] And so they say for those reasons, it seems plausible that even if infants don't prefer helpers over hinderers in the Hill paradigm by 10 and a half months, they might nonetheless prefer, I'm quoting them directly, helpers or pro-social agents more generally in other scenarios, perhaps when the intentions are overt, or perhaps it's going to happen, you know, later in development.
[00:20:35.960 --> 00:20:49.160] But to sit on the assumption that infants between the ages of five and a half and ten and a half months are intrinsically moral, versus the assumption that they are intrinsically not moral,
[00:20:49.160 --> 00:20:52.680] neither of those things is really borne out by the literature.
[00:20:52.680 --> 00:21:00.120] And so we have to be really careful when we interpret these studies to interpret them based on what they actually tell us.
[00:21:00.120 --> 00:21:05.400] And I think that's a big difficulty in psychology because we're always working with constructs.
[00:21:05.400 --> 00:21:08.280] There's also different types of morality.
[00:21:08.280 --> 00:21:11.320] I mean, there's justice and fairness, right?
[00:21:11.320 --> 00:21:14.520] This is more pro-social versus antisocial behavior.
[00:21:14.520 --> 00:21:16.960] So, again, you can't make sweeping statements about it.
[00:21:17.040 --> 00:21:18.560] Yeah, and they, you know, they're very careful.
[00:21:18.560 --> 00:21:24.000] They do talk about morality in the discussion, but they don't talk about it very often when they're talking through the study.
[00:21:24.000 --> 00:21:29.760] They're saying helpers, hinderers, helpers, social, pro-social, non-social.
[00:21:29.760 --> 00:21:30.720] Internet.
[00:21:30.720 --> 00:21:30.960] All right.
[00:21:30.960 --> 00:21:32.080] Thanks, Kara.
[00:21:32.080 --> 00:21:34.320] Jay, how's our space station doing?
[00:21:34.320 --> 00:21:38.160] Well, on a scale of one to ten, ten is brand new.
[00:21:38.160 --> 00:21:39.520] It's probably at a two.
[00:21:39.760 --> 00:21:43.920] Yeah, there's some significant issues with it.
[00:21:43.920 --> 00:21:47.440] I mean, let's face it, it's lasted longer than they expected it to.
[00:21:47.440 --> 00:21:48.720] But let me give you the update.
[00:21:48.720 --> 00:21:57.840] So, first of all, you know, the International Space Station, it's been around since 1998, which means that they started building the components for it, you know, many, many years before that.
[00:21:57.840 --> 00:22:04.560] So it's been in low Earth orbit for 26 years, and, you know, unfortunately, it's really showing its age now.
[00:22:04.880 --> 00:22:08.480] And when they initially launched it, it started with just one module.
[00:22:08.480 --> 00:22:09.920] Do you guys know what the name of that was?
[00:22:09.920 --> 00:22:10.720] Freedom.
[00:22:10.720 --> 00:22:12.240] It's a Russian module.
[00:22:12.240 --> 00:22:12.800] Oh, wow.
[00:22:12.800 --> 00:22:13.280] Oh.
[00:22:14.080 --> 00:22:15.360] It was a Zarya.
[00:22:15.360 --> 00:22:16.880] Z-A-R-Y-A.
[00:22:16.880 --> 00:22:17.920] Zarya.
[00:22:17.920 --> 00:22:21.600] Yeah, the ISS today has 43 module facilities.
[00:22:21.600 --> 00:22:22.560] And there's a lot of different.
[00:22:22.880 --> 00:22:23.760] I mean, it's really cool.
[00:22:23.760 --> 00:22:26.640] They have so many different sections that do different things.
[00:22:26.640 --> 00:22:34.560] They have airlocks, robotic arms, power, life support, communication, habitation modules, research facilities.
[00:22:34.880 --> 00:22:37.440] There's a lot of different modules up there.
[00:22:37.440 --> 00:22:40.080] Very complicated and unbelievably expensive.
[00:22:40.080 --> 00:22:45.360] There's been an unprecedented and I think very noteworthy collaboration between nations.
[00:22:45.360 --> 00:22:55.360] We have the United States, Russia, European nations, Japan, Canada, Italy, Brazil, all being major contributors to the space station.
[00:22:55.360 --> 00:23:06.600] But, you know, like I said, it's on limited time now, and most of the partner countries I listed plan to retire the station on or around 2030, which is coming up pretty quick.
[00:23:06.920 --> 00:23:13.640] And Russia is also now saying that it might even go earlier for them and they might want to withdraw by 2028.
[00:23:13.640 --> 00:23:16.120] Yeah, they've been talking about that for many years now.
[00:23:16.120 --> 00:23:19.800] Yeah, it's interesting to try to work out the details of what that actually means.
[00:23:19.800 --> 00:23:21.880] If they pull out, like, what does that actually mean?
[00:23:21.880 --> 00:23:25.000] You know, is it like they're not going to give any financial support or what?
[00:23:25.000 --> 00:23:32.520] I think they'll no longer be responsible for the upgrades or the maintenance of their portion of the station.
[00:23:32.520 --> 00:23:33.880] Yeah, okay, that makes sense.
[00:23:33.880 --> 00:23:43.960] So NASA has already instructed SpaceX to design a deorbiting spacecraft that will guide the ISS safely into Earth's atmosphere.
[00:23:43.960 --> 00:23:45.880] You know, like it's a big deal.
[00:23:45.880 --> 00:23:50.280] This is a really big deal to deorbit that thing and to have it land in the ocean.
[00:23:50.280 --> 00:23:52.600] You know, they want a controlled descent.
[00:23:52.600 --> 00:23:54.760] And of course, this thing can't land in a city.
[00:23:54.760 --> 00:23:57.000] It would do an amazing amount of damage.
[00:23:57.000 --> 00:24:00.040] You know, it could be very dangerous.
[00:24:00.520 --> 00:24:03.400] I think that they are totally up to the task.
[00:24:03.400 --> 00:24:08.040] And I think, you know, NASA and SpaceX know exactly what to do.
[00:24:08.040 --> 00:24:09.560] But it's going to be expensive.
[00:24:09.560 --> 00:24:11.160] But what kind of a show that will be, guys?
[00:24:11.480 --> 00:24:15.080] I mean, if it happens at night or during the day, you're going to see it.
[00:24:17.160 --> 00:24:22.280] If you're in line of sight, you're going to see a huge, huge show.
[00:24:22.920 --> 00:24:25.640] That's going to burn up and really be visible.
[00:24:25.640 --> 00:24:33.080] And even worse, guys, there's issues that might accelerate this 2028 or 2030 timeline.
[00:24:33.080 --> 00:24:38.440] There's been these persistent air leaks in the Russian Zvezda module.
[00:24:38.680 --> 00:24:43.720] It's really got some serious air leaks, and they were first discovered, you know, back in 2019.
[00:24:43.720 --> 00:24:45.360] That seems like a long time ago.
[00:24:44.680 --> 00:24:49.840] So they've escalated this situation to a very critical level.
[00:24:50.160 --> 00:24:59.120] The leaks are now rated five out of five on NASA's danger scale, and it is the number one top safety concern that's going on, and it is significant.
[00:24:59.120 --> 00:25:13.280] The ISS is losing about 3.7 pounds of air per day, and that costs NASA about $4,000 a day, about $1.6 million a year to actually replace the atmosphere because it has to be brought up there.
[00:25:13.280 --> 00:25:13.600] Really?
[00:25:14.240 --> 00:25:23.120] And these figures, they're estimates, but they show clearly that there's a growing operational burden here of maintaining the station.
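For what it's worth, the annualized figures follow from straight multiplication of the per-day estimates quoted above (the episode's ballpark numbers, not official NASA figures):

```python
# Rough annualization of the air-leak estimates quoted above.
air_loss_lb_per_day = 3.7
cost_per_day = 4_000          # dollars per day to replace lost atmosphere
days_per_year = 365

print(f"Air lost per year: ~{air_loss_lb_per_day * days_per_year:,.0f} lb")   # ~1,351 lb
print(f"Replacement cost per year: ~${cost_per_day * days_per_year:,.0f}")    # ~$1,460,000
# That lands in the same ballpark as the "about $1.6 million a year" quoted above.
```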
[00:25:23.120 --> 00:25:25.200] It's not the only problem that they have.
[00:25:25.200 --> 00:25:31.600] It is serious enough where the astronauts were told by NASA to spend more time in the American section of the station.
[00:25:31.600 --> 00:25:33.280] Probably having fewer problems, yeah.
[00:25:33.280 --> 00:25:33.840] Yeah, right.
[00:25:33.840 --> 00:25:40.400] So to prepare for a worst-case scenario, SpaceX is developing an emergency evacuation plan for the crew.
[00:25:40.400 --> 00:25:47.520] I would imagine that that is some type of craft that has, you know, that'll be attached and will be ready to go at a moment's notice.
[00:25:47.520 --> 00:25:52.560] The structural issues on the station have made it a top priority to replace the ISS.
[00:25:52.560 --> 00:25:58.560] NASA already turned to the private sector, which is a really good idea because there's a ton of money in the private sector.
[00:25:58.960 --> 00:26:04.000] They've funded multiple initiatives to help keep a human presence in low Earth orbit.
[00:26:04.000 --> 00:26:05.120] I'm totally for this.
[00:26:05.120 --> 00:26:09.200] I'm sure most of us science enthusiasts, we want people up there.
[00:26:09.200 --> 00:26:12.320] There's a lot of significant science that's being done there.
[00:26:12.320 --> 00:26:14.320] It also is just human nature.
[00:26:14.320 --> 00:26:20.720] I think we have to respect the fact that we're explorers, and you know, it is the final frontier.
[00:26:20.720 --> 00:26:25.440] There are four major projects, and I think you'll find these interesting if you haven't heard of them.
[00:26:25.440 --> 00:26:35.000] One of them is Vast Space, and they want to build their own independent station to go in low-earth orbit, and it's going to focus on modular and scalable designs.
[00:26:35.320 --> 00:26:37.880] I'm sure most of you have heard of Axiom Space.
[00:26:37.880 --> 00:26:45.720] They plan to attach their modular station to the ISS initially, and then over time, it'll detach and function as a standalone station.
[00:26:45.720 --> 00:26:52.760] So, it might pull some of the modules with it, and then they'll slowly get rid of them as they bring up their own new modules.
[00:26:52.760 --> 00:27:00.920] There's a coalition led by Blue Origin, Sierra Space, Boeing, and Redwire, and they're developing something called an orbital reef.
[00:27:00.920 --> 00:27:05.720] This is a commercial station marketed as a mixed-use business park in space.
[00:27:05.720 --> 00:27:07.640] I think that one is really interesting.
[00:27:07.640 --> 00:27:09.640] And then the last one is Voyager Space.
[00:27:09.640 --> 00:27:16.760] They're partnering with Airbus, Northrop Grumman, and Hilton Hotels, which might sound odd, but yeah, they're in the game.
[00:27:16.760 --> 00:27:22.680] And they're working on Starlab, which is a futuristic station with scientific and commercial applications, right?
[00:27:23.000 --> 00:27:25.960] I think it would be wonderful if more than one of these made it.
[00:27:25.960 --> 00:27:30.120] As much as this is a Herculean effort, it's costing an incredible amount of money.
[00:27:30.120 --> 00:27:33.080] You know, NASA is trying to help these four projects.
[00:27:33.080 --> 00:27:37.400] I think it's possible that we could have more than one, you know, within the next 10 years.
[00:27:37.400 --> 00:27:44.760] The experts have outlined that they have significant doubts about whether any of the replacements will be operational by the 2030 deadline.
[00:27:44.760 --> 00:27:51.160] That might be the case, but that doesn't mean that they're not going to end up getting up there eventually.
[00:27:51.160 --> 00:27:55.240] Axiom Space, they faced financial issues.
[00:27:55.240 --> 00:28:00.040] They had to delay payments to suppliers, and they were struggling to pay their employees.
[00:28:00.200 --> 00:28:05.720] That's, you know, a situation that could kill a project very easily.
[00:28:05.720 --> 00:28:09.720] And NASA, you know, like I said, NASA is supporting them as best they can.
[00:28:09.720 --> 00:29:46.120] They want them to keep going, of course, so NASA's going to give them as much money as they can to keep them going. NASA also plans to award additional development contracts in 2026, and they've already invested over $500 million to help companies refine their designs, build their prototypes, and just keep those projects breathing air. So in the short term, NASA is probably going to need to implement some temporary fixes. I think that's pretty obvious. Those could involve sealing off the Zvezda transfer tunnel, like just closing it down and making it so no one can go into that module anymore, because that is where the leaks are the most severe. Abandoning that would be a difficult thing, it's a needed module, but what else can they do? These aren't ideal situations, and they don't have ideal solutions, but they could extend the station's lifespan long enough to bridge the gap, because we want there to be a continuous space station for a lot of reasons. Like, it would be great if they could move a module over to one of the new space stations and then move all the gear and all the expensive stuff. That's a good way to transport stuff from one station to the other. We don't want to have to pull the entire space station down and everything in it. That would be a really unfortunate reality if that happens. I think just from a science perspective it's worth it, because we can definitely run tests and do things in outer space. You know, NASA does pass on a lot of its technology to the private sector for free in the United States.
[00:29:46.120 --> 00:29:48.360] It'll just hand companies, you know, here you go.
[00:29:48.360 --> 00:29:55.240] This is, you know, a product line that you could sell, you know, things like Velcro, right?
[00:29:55.240 --> 00:30:01.240] Like, you know, things they created, duct tape, things you wouldn't imagine, but very useful stuff.
[00:30:01.240 --> 00:30:03.000] And, you know, it does boost the U.S.
[00:30:03.000 --> 00:30:03.480] economy.
[00:30:03.480 --> 00:30:04.600] It's not insignificant.
[00:30:04.600 --> 00:30:08.520] So I do think it's definitely worth every penny we spend on it.
[00:30:08.520 --> 00:30:12.280] But the future, guys, is all private investors.
[00:30:12.280 --> 00:30:21.160] And, you know, there has to be a commercial component now to any of these endeavors going on in the future because, you know, single governments can't afford to do this.
[00:30:21.160 --> 00:30:24.440] So there has to be some type of private industry involved.
[00:30:24.440 --> 00:30:27.240] And I think that's great, you know, because there's a ton of money out there.
[00:30:27.240 --> 00:30:29.560] There's a lot of companies, you know, look at like Hilton Hotels.
[00:30:29.560 --> 00:30:31.720] They have a ton of money and they want to get in it.
[00:30:31.720 --> 00:30:32.120] Great.
[00:30:32.440 --> 00:30:33.720] Why wouldn't we encourage that?
[00:30:33.720 --> 00:30:39.960] I think it's an awesome collaboration between private companies and NASA to do this.
[00:30:39.960 --> 00:30:43.880] Yeah, I'm interested in the Voyager space station or Voyager station.
[00:30:43.880 --> 00:30:47.400] This is the first one that is in the works.
[00:30:47.400 --> 00:30:52.840] It's planned to have a rotating wheel design that would have artificial gravity.
[00:30:52.840 --> 00:30:53.800] Ooh, hello.
[00:30:53.800 --> 00:30:58.040] But they said, hey, we're going to start construction in 2026, but they don't have funding yet.
[00:30:59.080 --> 00:31:02.840] Until they have funding, or at least they haven't announced any funding.
[00:31:02.840 --> 00:31:05.800] So until that happens, naming a date is worthless.
[00:31:05.800 --> 00:31:06.520] You need a kickstart.
[00:31:06.600 --> 00:31:10.680] I'm skeptical that they're going to try to incorporate artificial gravity in that way.
[00:31:11.800 --> 00:31:14.280] That's the design for the Voyager station.
[00:31:14.280 --> 00:31:15.400] That's the whole point.
[00:31:15.400 --> 00:31:18.840] They want it to be a space hotel, basically.
[00:31:18.840 --> 00:31:19.880] Yeah, a little 2001.
[00:31:20.200 --> 00:31:24.680] And Ev, if you look at the pictures, man, it does kind of look like that.
[00:31:25.480 --> 00:31:29.640] These are renderings, and I'm sure that there are real plans out there.
[00:31:29.640 --> 00:31:34.760] I mean, it's not like they're just sitting around a room and they're like looking at science fiction drawings.
[00:31:35.160 --> 00:31:36.840] They're probably thinking, oh, that looks cool.
[00:31:36.840 --> 00:31:37.880] They know what they want.
[00:31:37.880 --> 00:31:39.000] They know what it's going to be.
[00:31:39.000 --> 00:31:42.600] But all of the artwork that I've seen, it does have that vibe.
[00:31:42.600 --> 00:31:47.520] And man, could you imagine if they're shuttling people up to a space hotel and that becomes commonplace?
[00:31:47.520 --> 00:31:48.640] That's amazing.
[00:31:48.640 --> 00:31:50.240] I still wouldn't go.
[00:31:44.760 --> 00:31:52.000] Not in a million years.
[00:31:52.320 --> 00:31:56.240] No, it'll happen, but it'll be long dead.
[00:31:56.240 --> 00:31:56.960] Yep.
[00:31:56.960 --> 00:31:57.520] Oh.
[00:31:57.840 --> 00:31:58.080] All right.
[00:31:58.080 --> 00:31:59.600] Well, Bob's a freshman.
[00:31:59.760 --> 00:32:01.440] That's pretty messy.
[00:32:02.400 --> 00:32:03.120] Bob.
[00:32:03.440 --> 00:32:05.680] This shit never pans out the way we want it to pan out.
[00:32:05.760 --> 00:32:06.160] It always doesn't.
[00:32:07.440 --> 00:32:08.560] Not in your life.
[00:32:08.720 --> 00:32:09.040] Forever.
[00:32:10.800 --> 00:32:11.200] All right.
[00:32:11.200 --> 00:32:11.760] Thanks, Jay.
[00:32:12.240 --> 00:32:12.800] Yep.
[00:32:13.040 --> 00:32:15.840] But let's switch topics before Bob gets too mad.
[00:32:15.840 --> 00:32:16.320] Yeah.
[00:32:16.880 --> 00:32:17.520] All right, Bob.
[00:32:17.520 --> 00:32:18.880] I'm going to make you feel better.
[00:32:18.880 --> 00:32:21.040] Global warming is even worse than we thought it was going to be.
[00:32:21.280 --> 00:32:21.680] Great.
[00:32:21.680 --> 00:32:22.080] There you go.
[00:32:22.240 --> 00:32:23.520] No surprise there.
[00:32:24.480 --> 00:32:28.480] There's a recent study looking at climate hotspots.
[00:32:29.120 --> 00:32:45.840] And so the idea here is they looked at a lot of data, obviously climate data, but specifically regional temperature data, to try to identify regions that are more likely to have heat waves.
[00:32:45.840 --> 00:32:55.600] And they identified, they basically made a map of the world, identifying those regions, Northern Africa, Europe, Northwestern U.S.
[00:32:55.680 --> 00:33:04.080] and Canada, where there have been statistically outlier heat waves in the last 20 years.
[00:33:04.320 --> 00:33:07.040] Their conclusions are a couple of things.
[00:33:07.040 --> 00:33:17.200] Well, first of all, statistically speaking, these are multiple standard deviations away from average temperatures, like what the models would predict.
[00:33:17.440 --> 00:33:30.760] You may remember, like, you know, in northwestern Canada, they had temperatures that were literally 50 degrees above the average temperature for that location in that time of year.
[00:33:29.760 --> 00:33:36.360] Just massively outside of any statistical distribution.
[00:33:36.680 --> 00:33:42.680] So that's one observation that they made, that there's these hot spots where we're seeing extreme heat waves.
[00:33:42.680 --> 00:33:46.920] And the second thing is that the models don't really predict this.
[00:33:46.920 --> 00:33:50.440] But it's not that they contradict the models.
[00:33:50.440 --> 00:33:53.400] The models don't drill down to this level of detail.
[00:33:53.400 --> 00:33:59.640] The models are unable to tell us like what this regional, you know, these regional variations will be.
[00:33:59.640 --> 00:34:06.120] And so in other words, we can't predict these extreme weather events, and yet they're happening.
[00:34:06.120 --> 00:34:14.040] They're saying that basically what's happening is that as the temperature increases, the variability is also increasing.
[00:34:14.200 --> 00:34:16.920] So you're getting more extreme events.
[00:34:16.920 --> 00:34:20.440] That bell curve, if you will, is spreading out.
[00:34:20.440 --> 00:34:24.520] And this includes both hot and cold temperatures.
[00:34:24.520 --> 00:34:32.360] But of course, as the average temperature increases, it's worse at the hot end of that spectrum.
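A toy calculation illustrates why a slightly warmer mean combined with a wider spread hits the hot tail so hard. All of the numbers below are invented for illustration; none of them come from the study.

```python
# Toy illustration: shift the mean up a little and widen the distribution,
# and the probability of exceeding a fixed heat threshold grows disproportionately.
from scipy.stats import norm

threshold = 38.0                      # an arbitrary "extreme heat" threshold, deg C
baseline = norm(loc=25.0, scale=4.0)  # made-up historical summer temperatures
warmed   = norm(loc=27.0, scale=5.5)  # warmer mean AND fatter tails

p_then = baseline.sf(threshold)       # P(T > threshold) before
p_now = warmed.sf(threshold)          # P(T > threshold) after
print(f"P(exceed {threshold} C): then {p_then:.5f}, now {p_now:.5f}, ratio ~{p_now / p_then:.0f}x")
```

With these made-up numbers the exceedance probability goes up by roughly a factor of 40, which is the qualitative point: small changes in the middle of the distribution translate into large changes in the extremes.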
[00:34:32.360 --> 00:35:03.560] And what this could mean is that earlier than the models would have predicted, we could be seeing these hotspots around the world with increasing frequency of increasing heat waves, like heat waves that are significantly above the average temperatures and that are obviously spiking deaths due to heat waves and are making these regions, at least for the duration of the heat wave, they could be, like, if you don't have air conditioning, they're basically unlivable.
[00:35:03.560 --> 00:35:06.680] So, these are parts of the world where people don't traditionally have air conditioning, right?
[00:35:06.680 --> 00:35:08.920] Because it doesn't get hot enough to require air conditioning.
[00:35:08.920 --> 00:35:17.040] And now they're having heat waves, like 110, 120 degree temperatures, and they don't have air conditioning, you know, because they've never seen temperatures like this.
[00:35:14.920 --> 00:35:21.200] These are tens of degrees higher than anything they've ever seen before.
[00:35:21.520 --> 00:35:26.160] Obviously, the solution to this is to try to mitigate climate change as much as possible.
[00:35:26.160 --> 00:35:27.120] Oh, yeah, that's a good idea.
[00:35:27.120 --> 00:35:27.840] Yeah, let's try that.
[00:35:27.840 --> 00:35:31.520] But, you know, what it means, so the authors are recommending two things.
[00:35:31.760 --> 00:35:39.760] First, we need to figure out how to predict these regional changes, even though the models are not designed to do that.
[00:35:39.760 --> 00:35:40.800] We need new models, right?
[00:35:40.800 --> 00:35:43.200] We need new ways of trying to do this.
[00:35:43.200 --> 00:35:50.640] Now, here we're just looking back at what's happened, which is not necessarily the same thing as predicting what's going to happen in the future.
[00:35:50.800 --> 00:35:54.080] And that's because this is the difference between climate and weather, right?
[00:35:54.080 --> 00:35:57.120] We're starting to blur the lines here between climate and weather.
[00:35:57.120 --> 00:36:01.280] You know, the weather is much more difficult to predict.
[00:36:01.280 --> 00:36:19.120] We're not just looking at average global temperatures, which the models have been very good at predicting, but there are lots of complicated changes to the way weather is behaving on the world with increasing temperature, like increasing energy in the system.
[00:36:19.120 --> 00:36:29.360] I know we talked on the show that we aired last week about the cold blob and the shutting down of the conveyor belt, the ocean currents.
[00:36:29.840 --> 00:36:35.360] These things just become harder and harder to predict because we're dealing with a very complicated system.
[00:36:35.360 --> 00:36:50.240] So this could mean that we will be seeing extreme heat wave events earlier than the models predicted because regional variability is increasing.
[00:36:50.240 --> 00:36:57.040] It's like this is a separate phenomenon they're observing, this increasing variability; the tails are spreading out, basically.
[00:36:57.040 --> 00:36:59.800] And that wasn't something that the models predicted.
[00:36:59.800 --> 00:37:00.600] But we're seeing it.
[00:37:00.600 --> 00:37:01.240] It's happening.
[00:36:59.360 --> 00:37:02.440] So that's not a good thing.
[00:37:02.760 --> 00:37:08.840] Well, imagine seeing such variation in a place that already occasionally sees like 120.
[00:37:08.840 --> 00:37:09.240] Yeah.
[00:37:09.240 --> 00:37:13.240] You know, imagine like, oh, oh, yeah, tomorrow it's going to be 140.
[00:37:13.240 --> 00:37:15.320] I wonder if it could get quite that extreme.
[00:37:15.560 --> 00:37:16.280] Yeah, yeah.
[00:37:16.280 --> 00:37:16.920] Oh, my God.
[00:37:16.920 --> 00:37:17.320] Wow.
[00:37:18.200 --> 00:37:28.360] We talked about this with Michael Mann, you know, at CSICon in the show that aired a few weeks ago: that the effects of climate change are not evenly distributed, right?
[00:37:28.360 --> 00:37:34.040] And the effects of our attempts at mitigating it won't necessarily be evenly distributed either.
[00:37:35.000 --> 00:37:40.280] There's going to be, you know, random winners and losers, you know, in this kind of scenario.
[00:37:40.280 --> 00:37:44.280] I think we're all losers, but more losers and less losers, I guess.
[00:37:44.280 --> 00:37:47.640] But no one is safe, I guess, is the other idea here.
[00:37:47.640 --> 00:37:51.640] Like, Canada, you know, had these ridiculous heat waves.
[00:37:51.640 --> 00:37:59.240] North Carolina, deep in the mountains, had flooding, like a place where there's no flooding infrastructure.
[00:37:59.240 --> 00:38:03.320] They don't have floods in this region of the country.
[00:38:03.320 --> 00:38:13.640] But because of, you know, the increased moisture in the air, increased the rainfall due to Hurricane Helene, they had a massive flood that they weren't prepared for.
[00:38:13.640 --> 00:38:20.440] So it's not just that, oh, yeah, the hottest places on the earth are going to get even hotter and they're going to be feeling it first.
[00:38:20.440 --> 00:38:23.880] Like, yeah, that is true, but it's not the only thing that's true.
[00:38:23.880 --> 00:38:29.800] It's also going to be bringing extreme weather events to pretty much any part of the world.
[00:38:29.800 --> 00:38:32.680] And it's kind of unpredictable how that will happen.
[00:38:32.840 --> 00:38:36.040] It's not a simple extrapolation from what you're currently seeing.
[00:38:36.040 --> 00:38:37.720] Right, exactly.
[00:38:38.040 --> 00:38:43.000] Well, everyone, we're going to take a quick break from our show to talk about one of our sponsors this week, Aura Frames.
[00:38:43.000 --> 00:38:48.640] This year, guys, one of the best gifts that you can give someone is the Aura Digital Picture Frame.
[00:38:48.640 --> 00:38:52.400] Aura Frames are the number one digital photo frame by Wirecutter.
[00:38:52.400 --> 00:38:53.920] This is a really awesome platform.
[00:38:54.160 --> 00:38:55.360] It's incredibly smart.
[00:38:55.360 --> 00:38:56.560] It's easy to use.
[00:38:56.560 --> 00:39:00.960] You can upload an unlimited number of photos and videos from your phone to the frame.
[00:39:00.960 --> 00:39:09.920] Plus, you can order the frame online and you can preload it with the photos and videos that you want to share using the Aura app so it's ready to go right out of the box.
[00:39:09.920 --> 00:39:16.400] Yeah, I helped set up my personal Aura frame and my mom's, and it was just so ridiculously easy.
[00:39:16.400 --> 00:39:21.040] And it's really one of the best presents that I've ever given or received.
[00:39:21.040 --> 00:39:21.760] It truly is.
[00:39:21.760 --> 00:39:25.440] With the Aura Frame, you can upload your favorite photos and they are there.
[00:39:25.440 --> 00:39:28.480] It's really convenient, fantastic, and a lot of fun.
[00:39:28.720 --> 00:39:31.840] You'll be looking at it many times during the day, guaranteed.
[00:39:31.840 --> 00:39:43.440] So save on the perfect gift by visiting auraframes.com to get $35 off Aura's best-selling Carver Mat Frames by using promo code skeptics at checkout.
[00:39:43.440 --> 00:39:48.560] That's A-U-R-AFrames.com, promo code skeptics.
[00:39:48.560 --> 00:39:53.040] This deal is exclusive to listeners, so get yours now in time for the holidays.
[00:39:53.040 --> 00:39:54.800] Terms and conditions apply.
[00:39:54.800 --> 00:39:56.960] All right, guys, let's get back to the show.
[00:39:57.280 --> 00:40:02.240] Okay, Evan, tell us about Orcas wearing hats.
[00:40:02.480 --> 00:40:03.600] Yes.
[00:40:05.760 --> 00:40:08.000] You don't run across that headline every day.
[00:40:08.480 --> 00:40:11.760] Have you guys ever seen that nature documentary film?
[00:40:11.760 --> 00:40:14.160] It's called Star Trek 4: The Voyage Home?
[00:40:15.120 --> 00:40:22.240] Where Captain of Engineering Montgomery Scott famously exclaimed, Admiral, there be whales here.
[00:40:22.880 --> 00:40:24.160] Can you imagine if you said that?
[00:40:24.480 --> 00:40:27.360] And they're wearing salmons as hats.
[00:40:28.480 --> 00:40:31.480] Scotty, you've been hitting the brandy again, I think.
[00:40:31.480 --> 00:40:32.120] Sorry, I'm not sure.
[00:40:33.800 --> 00:40:36.120] That movie came out in 1986.
[00:40:36.440 --> 00:40:37.000] All right.
[00:40:37.000 --> 00:40:48.040] But in 1987, researchers, for the first time, had noticed that a population of orcas had begun swimming around with dead fish on their heads.
[00:40:48.040 --> 00:40:49.320] Salmons on their heads.
[00:40:49.320 --> 00:40:52.520] This was the first recorded observation of this behavior.
[00:40:52.920 --> 00:40:57.960] I have to set it up like this because I need to talk for just a moment about groups of whales, family groups.
[00:40:57.960 --> 00:40:59.960] They're called pods.
[00:41:00.280 --> 00:41:04.760] So a pod of whales is led by the oldest female, the matriarch.
[00:41:04.760 --> 00:41:11.080] And the matriarch passes down knowledge about hunting, migration routes, social interactions.
[00:41:11.080 --> 00:41:13.960] You know, they run the family, essentially.
[00:41:13.960 --> 00:41:20.760] And the offspring, both the male and female offspring, remain with their mothers for life, pretty much.
[00:41:20.760 --> 00:41:23.960] And that's a tight-knit family, really, really tight bond.
[00:41:23.960 --> 00:41:32.200] But pods can range from a small group of a few individuals to a large group of dozens or even in some cases, hundreds of whales.
[00:41:32.200 --> 00:41:38.600] And scientists will categorize these pods by letter, you know, A pod, B pod, and so forth.
[00:41:38.600 --> 00:41:40.040] And this is in 1987.
[00:41:40.040 --> 00:41:47.880] It was observed that there was a female from K pod who started wearing a dead salmon on her head.
[00:41:47.880 --> 00:41:48.440] Wow.
[00:41:48.440 --> 00:42:01.960] And then they noticed within a few weeks other pods, other individuals in other pods, pods J and pods L, they also started to wear fish hats as well, fish on their head.
[00:42:01.960 --> 00:42:13.480] And it was apparently something of a fad because that behavior soon afterwards, I guess, had stopped or had not been seen again until last month, November 2024.
[00:42:13.480 --> 00:42:14.040] Yep.
[00:42:14.040 --> 00:42:25.680] When one of the whales from J-Pod, the same pod, the same family, now this was a 32-year-old male, so that particular whale was not yet born in 1987, but regardless, it's from the same family.
[00:42:25.680 --> 00:42:33.920] So J27 Blackberry, 32-year-old male, was photographed exhibiting the same behavior of wearing the dead salmon on its head.
[00:42:33.920 --> 00:42:39.360] And this was at Point No Point, Washington, off Whidbey Island in Puget Sound.
[00:42:39.360 --> 00:42:44.400] And this took the observers once again, scientists, by surprise.
[00:42:44.400 --> 00:42:44.960] Yeah.
[00:42:45.600 --> 00:43:01.040] Andrew Foote, who is an evolutionary ecologist at the University of Oslo in Norway, said it does seem possible that some individuals that experienced the behavior for the first time or their family members may have started it again.
[00:43:01.360 --> 00:43:08.640] And then there's Deborah Giles, who's a science researcher and a director at the nonprofit Wild Orca.
[00:43:08.640 --> 00:43:19.200] She says, we've seen mammal-eating killer whales carry large chunks of food before under their pectoral fin, kind of tucked into their bodies.
[00:43:19.520 --> 00:43:26.480] But the phenomenon by which they will also wear it on their head could just be another form of this.
[00:43:26.480 --> 00:43:29.280] Like there's like storing food.
[00:43:29.280 --> 00:43:37.360] They have extra food, so they're, you know, just hanging on to it in a way until they're ready to eat it at some point later.
[00:43:37.680 --> 00:43:48.800] And this can happen because there will be times during the course of the year when there is an abundance of food for some of these pods, more than they can eat on a regular basis.
[00:43:48.800 --> 00:43:53.200] So they'll, you know, start storing it somewhere in and amongst their body.
[00:43:53.200 --> 00:44:02.680] So this could be, although they're not 100% sure, this could be that behavior, just another form of them kind of storing this food for a little while.
[00:44:03.080 --> 00:44:10.760] But it's fascinating, though, that it's kind of this cultural almost trend in a way that they've described it as.
[00:44:11.480 --> 00:44:25.880] And again, one pod of whales will exhibit a certain behavior, and then other families will also learn one way or another from that pod, and they will also start exhibiting this behavior.
[00:44:25.880 --> 00:44:31.960] And they think that this happens when orcas will go after boats, right?
[00:44:31.960 --> 00:44:35.320] And attack or throw their bodies onto boats.
[00:44:35.320 --> 00:44:43.800] That one group will do it, another pod will see that, and they'll imitate or they'll go ahead and exhibit that behavior as well.
[00:44:44.920 --> 00:44:46.040] It's very fascinating.
[00:44:46.040 --> 00:44:48.840] It's very nuanced, and it's quite complex.
[00:44:48.840 --> 00:44:59.960] I mean, these are mammals that are just, you know, have been studied so long, and they have very complex social networks and collaborative behaviors, emotional and social bonds.
[00:44:59.960 --> 00:45:04.280] I mean, you know, not to be underestimated.
[00:45:04.280 --> 00:45:09.400] But, I mean, anytime you see a headline where whales wearing fish on their heads, I mean, forget it.
[00:45:09.400 --> 00:45:10.760] And the internet goes crazy.
[00:45:10.760 --> 00:45:16.280] And there's memes and there's songs and there's videos and a million people are talking about it all over the place.
[00:45:16.280 --> 00:45:24.200] You know, like, I know that this must be the most common question, but how the hell are they swimming and they're keeping a dead fish on their forehead?
[00:45:24.200 --> 00:45:36.120] Well, I guess if all you, if that's the medium you live in, things like that become easier than perhaps it would seem to people who don't live their existence in the water like that.
[00:45:36.280 --> 00:45:39.800] Just from the physics of it, though, I mean, you know, their skin is smooth.
[00:45:39.800 --> 00:45:42.360] You'd think that it would just slip right off their head, you know?
[00:45:42.600 --> 00:45:42.960] Yeah, you see.
[00:45:43.200 --> 00:45:45.440] If you swim against it, then you swim against it.
[00:45:44.520 --> 00:45:46.800] They'd be able to battle.
[00:45:47.120 --> 00:45:53.120] They're able to have an equilibrium, a balance of some sort in which they're able to.
[00:45:53.280 --> 00:45:55.440] Yeah, they don't have pockets, so they got to wear it.
[00:45:55.520 --> 00:45:56.800] They've got to put it somewhere.
[00:45:56.800 --> 00:46:00.240] Could this be the orcas playing with their food again?
[00:46:00.720 --> 00:46:10.800] You've seen the videos of orcas tossing seals, like you know, ragdolls up 30, 40 feet in the air and down.
[00:46:10.800 --> 00:46:13.040] Could this just be them like, ah, look what I killed.
[00:46:13.040 --> 00:46:14.720] I'm going to keep it here for a while.
[00:46:14.720 --> 00:46:16.560] Could that be just food play?
[00:46:16.800 --> 00:46:19.360] I mean, I suppose it could be.
[00:46:20.000 --> 00:46:21.600] They're looking into it more, Bob.
[00:46:22.400 --> 00:46:28.080] They don't seem to have, you know, there's not a lot of data on this particular behavior.
[00:46:28.080 --> 00:46:36.640] Obviously, you know, they saw it happen in 1987, and now it's 2024, and they're only seeing it again now, you know, observing it.
[00:46:36.640 --> 00:46:40.480] So they are still really trying to figure it out.
[00:46:40.480 --> 00:46:44.160] But yeah, it could be playfulness, I suppose.
[00:46:44.160 --> 00:46:59.200] But again, you know, I think perhaps Deborah Giles might be onto something when she observes that the whales will tuck dead fish into other parts of their bodies as well, and this could just be one more thing.
[00:46:59.280 --> 00:47:00.560] Well, they eventually eat it.
[00:47:00.880 --> 00:47:02.160] Yeah, eventually they will eat it.
[00:47:02.320 --> 00:47:02.480] Yeah.
[00:47:02.800 --> 00:47:03.120] Whatever.
[00:47:04.480 --> 00:47:06.080] Bob, you know what it reminds me of?
[00:47:06.640 --> 00:47:11.680] Look how I take this dead rat and turn it into a delightful hat.
[00:47:12.320 --> 00:47:13.360] Oh my God, is that Dr.
[00:47:13.440 --> 00:47:13.840] See?
[00:47:14.320 --> 00:47:16.000] That's the nightmare before Christmas.
[00:47:16.480 --> 00:47:17.680] Oh, gosh.
[00:47:17.680 --> 00:47:27.360] Do you guys remember one of the first, I think in the first year of our podcast, we talked about a story where dolphins were observed wearing sponges on their nose.
[00:47:27.360 --> 00:47:38.040] And they do that so that when they forage for fish who are on the ground, they protect their nose from coral and rocks or whatever on the sea floor.
[00:47:38.440 --> 00:47:40.120] Yeah, as they're foraging for fish.
[00:47:40.600 --> 00:47:42.760] Again, cetacean wearing other.
[00:47:43.080 --> 00:47:46.920] This is using it more as a tool, not just carrying it with them to eat later.
[00:47:46.920 --> 00:47:51.240] This is actually even more sophisticated than what the orcas are doing, arguably.
[00:47:51.880 --> 00:47:56.280] But yeah, not unusual behavior for these types of creatures.
[00:47:56.280 --> 00:47:58.600] They're very intelligent, obviously.
[00:47:58.600 --> 00:47:59.640] All right, Bob.
[00:47:59.640 --> 00:48:03.160] Bob, has there ever been water on the surface of Venus?
[00:48:03.160 --> 00:48:05.240] Ah, listen and find out.
[00:48:05.240 --> 00:48:10.680] Earth's twin planet Venus is a hellish place now, but has it always been inhospitable?
[00:48:10.680 --> 00:48:19.960] Some models say yes, some say no, but a new theory using clues from Venus's atmosphere points to a world that has always been evil and inhospitable.
[00:48:19.960 --> 00:48:22.440] This is from researchers at the University of Cambridge.
[00:48:22.440 --> 00:48:28.680] They published in the journal Nature Astronomy a paper called A Dry Venusian Interior Constrained by Atmospheric Chemistry.
[00:48:28.680 --> 00:48:29.880] Okay, what does that mean?
[00:48:29.880 --> 00:48:39.960] All right, so we know Venus is a nasty place, but it's so similar to Earth that it's often referred to as our twin or sister planet.
[00:48:39.960 --> 00:48:41.400] And there's lots of good reasons for that.
[00:48:41.880 --> 00:48:42.440] Same size.
[00:48:42.520 --> 00:48:47.640] Same size, like Venus is 95% of Earth's diameter, similar mass, similar density.
[00:48:47.640 --> 00:48:49.960] The internal structures are similar as well.
[00:48:49.960 --> 00:48:53.480] They both have a rocky mantle surrounding an iron core.
[00:48:53.480 --> 00:49:00.440] They formed at roughly the same time in the history of the solar system from similar materials in the inner solar system.
[00:49:00.760 --> 00:49:03.880] They both have volcanic activity now and in the past.
[00:49:03.880 --> 00:49:06.200] But Venus is many other things.
[00:49:06.200 --> 00:49:07.960] It's ridiculously hot.
[00:49:07.960 --> 00:49:16.800] Venus is the hottest planet in the solar system until, of course, you know, at some point Earth will beat those records the way we're going.
[00:49:17.600 --> 00:49:23.360] But it's hotter than Mercury at 477 C or 860 Fahrenheit.
[00:49:23.360 --> 00:49:25.680] It would melt lead on the surface of Venus.
[00:49:26.240 --> 00:49:28.640] And the Soviets launched a Venera.
[00:49:30.160 --> 00:49:31.040] Is that the one?
[00:49:31.040 --> 00:49:33.600] It actually landed on the surface.
[00:49:33.600 --> 00:49:35.680] But what, like moments later?
[00:49:35.840 --> 00:49:36.960] It survived for minutes, I think.
[00:49:37.280 --> 00:49:37.680] Yeah, it was.
[00:49:37.840 --> 00:49:44.800] Yeah, you can get longer than minutes, but yeah, anything we land on there will not be lasting very long.
[00:49:45.040 --> 00:49:47.680] I heard, I saw one figure of maybe hours.
[00:49:47.680 --> 00:49:51.920] I'm not sure about that one, but yeah, you're not going to have anything that's going to last there.
[00:49:51.920 --> 00:49:55.840] You said Venera 13 lasted for 127 minutes.
[00:49:55.840 --> 00:49:56.720] Yeah, okay.
[00:49:56.720 --> 00:49:57.280] Wow.
[00:49:57.280 --> 00:49:58.400] So yeah, it's two hours.
[00:49:59.360 --> 00:50:02.560] But then it's not just the heat, it's the atmospheric pressure.
[00:50:02.560 --> 00:50:05.520] The atmosphere is notoriously thick on Venus.
[00:50:05.840 --> 00:50:10.320] So if Earth's atmospheric pressure at sea level is what, what's that called?
[00:50:10.320 --> 00:50:11.360] One bar, right?
[00:50:11.360 --> 00:50:15.360] That translates to 14.7 pounds per square inch.
[00:50:15.360 --> 00:50:18.080] Venus is not one bar, but 92 bar.
[00:50:18.320 --> 00:50:24.080] So at its surface, instead of 14.7 pounds per square inch, it's 1,350 pounds per square inch.
[00:50:25.360 --> 00:50:28.000] It's like a small car on every square inch of your body.
[00:50:28.640 --> 00:50:32.000] It's like going down a kilometer below the surface of the ocean.
[00:50:32.000 --> 00:50:35.280] That's the kind of pressure we're talking about on the surface of Venus.
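Those comparisons hang together arithmetically: 92 bar times 14.7 psi per bar is about 1,350 psi, and since seawater adds roughly one bar per 10 meters of depth (a rough rule of thumb, not a figure from the episode), 92 bar is also about the pressure some 900 meters down, close to the kilometer comparison above.

```python
# Quick check of the Venus surface-pressure comparisons quoted above.
psi_per_bar = 14.7            # Earth sea-level pressure in psi (about 1 bar)
venus_bar = 92                # approximate Venus surface pressure

print(f"Surface pressure: ~{venus_bar * psi_per_bar:,.0f} psi")  # ~1,352 psi
print(f"Equivalent ocean depth: ~{venus_bar * 10} m")            # seawater adds ~1 bar per 10 m
```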
[00:50:35.280 --> 00:50:36.480] And it's also toxic.
[00:50:36.480 --> 00:50:37.840] Let's just throw that in there.
[00:50:37.840 --> 00:50:44.240] The atmosphere is mostly carbon dioxide, and there's thick clouds of sulfuric acid that shroud the entire planet.
[00:50:44.720 --> 00:50:48.400] The only saving grace is that the acid rain never even reaches the surface.
[00:50:48.400 --> 00:50:52.720] But then again, it doesn't reach the surface because it evaporates from all the intense heat.
[00:50:52.720 --> 00:50:55.200] It's so hot that it can't even get down to the surface.
[00:50:55.200 --> 00:50:56.720] It just evaporates away.
[00:50:56.720 --> 00:50:58.480] And even the winds are ridiculous.
[00:50:58.480 --> 00:51:04.040] 100 meters per second at some altitudes, and that's 60 times the planet's speed of rotation.
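That 60x comparison roughly checks out against textbook values for Venus, a mean radius of about 6,052 km and a sidereal rotation period of about 243 Earth days; those two inputs are standard reference numbers, not figures given in the episode.

```python
# Back-of-envelope check of the "winds are ~60 times the rotation speed" comparison.
import math

radius_m = 6_052_000        # Venus's mean radius, ~6,052 km (standard reference value)
rotation_s = 243 * 86_400   # sidereal rotation period, ~243 Earth days, in seconds
wind_speed = 100            # m/s, the upper-atmosphere wind speed quoted above

surface_speed = 2 * math.pi * radius_m / rotation_s
print(f"Equatorial rotation speed: ~{surface_speed:.1f} m/s")         # ~1.8 m/s
print(f"Wind / rotation ratio: ~{wind_speed / surface_speed:.0f}x")   # ~55x, roughly the quoted 60x
```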
[00:50:59.680 --> 00:51:05.880] So, yeah, there's no Starbucks.
[00:51:06.440 --> 00:51:09.560] Yeah, hellish is like the perfect word for Venus.
[00:51:09.560 --> 00:51:10.840] It's nasty.
[00:51:10.840 --> 00:51:15.880] So, how could such initially similar twins diverge so dramatically?
[00:51:15.880 --> 00:51:22.200] To answer this, scientists have used climate modeling to tell us what Venus was like when it was younger.
[00:51:22.200 --> 00:51:24.920] So, two models have been duking it out for a long time.
[00:51:24.920 --> 00:51:36.920] One model contends that Venus was more temperate and Earth-like in the past, enough to even have liquid water on its surface and probably even shallow seas.
[00:51:36.920 --> 00:51:51.480] But planet-wide volcanic activity eventually, after maybe even a couple of billion years, eventually spewed enough carbon dioxide to cause the famous runaway greenhouse effect that made the planet hotter and hotter, eventually becoming what we see today.
[00:51:51.480 --> 00:51:52.680] So, that's one model.
[00:51:52.680 --> 00:51:56.440] The other model describes a dry Venus scenario.
[00:51:56.440 --> 00:52:11.800] In this one, Venus is close enough to the sun that its initial magma ocean, you know, when a planet has just formed and it's basically magma everywhere, takes a long time to cool, far longer than Earth's, and that desiccates the planet.
[00:52:11.800 --> 00:52:17.560] It removes most of the water, which was then never able to collect as liquid water on its surface.
[00:52:17.560 --> 00:52:19.320] So, those are the two models.
[00:52:19.320 --> 00:52:32.680] First author Tereza Constantinou, now a PhD student at Cambridge's Institute of Astronomy, said both of those theories are based on climate models, but we want to take a different approach based on observations of Venus's current atmospheric chemistry.
[00:52:32.680 --> 00:52:35.000] So, that's what their paper is about.
[00:52:35.000 --> 00:52:43.160] And the goal was to study Venus's atmosphere to see how it might interact with the planet's interior, what that loop is like.
[00:52:43.160 --> 00:52:47.360] So, to do that, they looked at the chemicals in the atmosphere that were being destroyed.
[00:52:44.840 --> 00:52:51.920] And a couple of those, carbon dioxide and water, were two right off the bat.
[00:52:52.240 --> 00:53:01.120] So for Venus to have a stable atmosphere, it's going to have to replace the water and carbon that were getting destroyed by various atmospheric processes.
[00:53:01.120 --> 00:53:02.480] So those chemicals need to be replaced.
[00:53:02.480 --> 00:53:09.840] So they looked at volcanic outgassing on Venus, because that's something that could replace some of these lost chemicals in the atmosphere.
[00:53:09.840 --> 00:53:16.320] And the critical finding here was that the volcanic gases on Venus were only 6% water.
[00:53:16.640 --> 00:53:21.120] And that was enough to replace what was being lost in the atmosphere.
[00:53:21.120 --> 00:53:22.800] But only 6% water.
[00:53:23.120 --> 00:53:29.360] Now remember, when magma rises from the deep within a planet, it brings the chemicals and gases that are down there to the surface.
[00:53:29.920 --> 00:53:33.280] So whatever is outgassed is very indicative of what's down there.
[00:53:33.280 --> 00:53:42.080] On Earth, volcanic gases are mostly steam, which is one way to show that, or to prove that the interior of the Earth is water-rich.
[00:53:42.080 --> 00:53:49.600] If Venus is outgassing only 6% water, its interior is likely to be just as dry as the surface.
[00:53:49.680 --> 00:53:57.280] And they say in the paper, the volcanic resupply to Venus's atmosphere, therefore, indicates that the planet has never been liquid water habitable.
[00:53:58.080 --> 00:54:24.080] So if you were dozing there for the past few minutes, the TLDR, or perhaps TLDL, too long didn't listen: this study shows that Venus was likely close enough to the Sun that its early magma-ocean epoch took a long time to end, giving it a lot of time to lose its water, meaning that it never had a temperate Earth-like history and it probably never had water on its surface.
[00:54:24.080 --> 00:54:34.840] Now, luckily, the Earth was farther from the Sun, causing our magma ocean to solidify on the surface earlier than Venus did, and that cap held in all the water that we had at that time.
[00:54:34.840 --> 00:54:37.960] And that might be the key difference between Earth and Venus right there.
[00:54:37.960 --> 00:54:39.080] And it was the distance.
[00:54:39.080 --> 00:54:42.360] We knew that Venus was on the edge of habitability, right?
[00:54:42.360 --> 00:54:46.280] We were not sure if it was over that line or not.
[00:54:46.280 --> 00:54:58.520] And this is one bit of evidence showing that, yeah, Venus is a little bit out of the habitable zone because, and primarily because the magma ocean is going to just last too long and all the water is going to go away, basically.
[00:54:58.520 --> 00:55:04.680] Now, confirming this, though, to a higher degree of certainty, it's going to require waiting for future orbiters and even landers.
[00:55:04.680 --> 00:55:07.400] And there's one actually planned for the end of this decade.
[00:55:07.400 --> 00:55:09.480] DAVINCI? I hadn't heard about that.
[00:55:09.960 --> 00:55:14.920] So, the DAVINCI mission will have orbiters and a lander.
[00:55:14.920 --> 00:55:21.080] We'll see how long that one lasts, and we'll maybe even be able to confirm some of these theories that these people are proposing.
[00:55:21.080 --> 00:55:30.680] Then, so a reasonable takeaway now from this is that when we look for exoplanets that may have had or have life, we should not look at Venus-like planets.
[00:55:30.680 --> 00:55:32.040] We should narrow our focus.
[00:55:32.040 --> 00:55:43.080] These scientists contend we should narrow our focus to exoplanets that are more similar to Earth, if what you're interested in is looking for worlds, exoplanets, that have life.
[00:55:43.080 --> 00:55:44.760] I'll finish with a quote here from the paper.
[00:55:45.000 --> 00:56:00.520] We would have loved to have found that Venus was once a planet much closer to our own, so it's kind of sad in a way to find out that it wasn't, but ultimately it's more useful to focus the search on planets that are most likely to be able to support life, at least life as we know it, which I love when scientists throw that in there at the end.
[00:56:00.520 --> 00:56:05.320] Life as we know it, yes, we've got one data point, so that's really a good way to say that.
[00:56:05.320 --> 00:56:06.760] So, yeah, interesting stuff.
[00:56:06.760 --> 00:56:08.760] Yeah, Venus is very interesting.
[00:56:08.760 --> 00:56:14.880] And the idea that there may be life in the upper atmosphere isn't faring so well.
[00:56:14.600 --> 00:56:17.280] You haven't heard too much good stuff about that either.
[00:56:18.480 --> 00:56:22.560] But if it is up there, it's not relying on water.
[00:56:22.880 --> 00:56:23.360] Yeah, probably.
[00:56:24.400 --> 00:56:24.640] All right.
[00:56:24.640 --> 00:56:25.440] Thanks, Bob.
[00:56:25.440 --> 00:56:26.160] All right, Jay.
[00:56:26.160 --> 00:56:27.760] It's who's that noisy time?
[00:56:27.760 --> 00:56:30.080] Okay, guys, last week I played This Noisy.
[00:56:38.160 --> 00:56:39.600] You guys have any guesses?
[00:56:39.600 --> 00:56:40.720] Cetacean.
[00:56:41.040 --> 00:56:42.560] Okay, well, we got.
[00:56:43.520 --> 00:56:47.280] I'll say right out of the gate, a ton of listeners knew exactly what this was.
[00:56:47.280 --> 00:56:51.520] So a listener named Joe Lanardria.
[00:56:53.360 --> 00:56:57.760] It's as if this guy's last name was created just so I couldn't pronounce it.
[00:56:58.400 --> 00:57:05.840] He said, This week's Noisy immediately brought to mind images of crested dinosaurs wading around in some prehistoric swamp.
[00:57:05.840 --> 00:57:12.000] So I'm going to guess this sound was from a computer's reproduction of what a parasolophilic.
[00:57:12.080 --> 00:57:13.200] Paraserolephus.
[00:57:13.280 --> 00:57:14.400] Parasiurolephus.
[00:57:14.400 --> 00:57:15.600] Paraserolephus.
[00:57:15.920 --> 00:57:24.640] It's like, you know, you get to like the second, as soon as I hit the second syllable, it just, like, my brain just goes, ha ha, you know.
[00:57:24.960 --> 00:57:33.600] All right, so the dinosaur or related species would have sounded like based on a 3D model of fossilized skull, Paris Cierolephus.
[00:57:33.600 --> 00:57:34.080] I got it.
[00:57:34.080 --> 00:57:34.320] Okay.
[00:57:34.320 --> 00:57:35.280] Thank you, Kara.
[00:57:35.280 --> 00:57:35.760] Yep.
[00:57:35.760 --> 00:57:39.440] I wish I just had you in my head where I could just hear you say it and then I could say it.
[00:57:39.440 --> 00:57:41.920] Listener named Amanda Lee wrote in and said, Hi, Jay.
[00:57:41.920 --> 00:57:46.720] I am absolutely convinced that this week's noisy is a cetacean, but which one?
[00:57:46.880 --> 00:57:49.920] My guess is a toothed whale, maybe an orca.
[00:57:49.920 --> 00:57:55.440] Okay, so that is not correct, but we did mention orcas in this show, so I give you two points.
[00:57:55.760 --> 00:57:59.120] Leah Zich said, Hey, all, new listener here.
[00:57:59.120 --> 00:58:04.680] I found you a few weeks ago, but I've gone back into the archives and have listened to hours and hours of the show.
[00:57:59.920 --> 00:58:06.280] I truly enjoy every episode.
[00:58:06.520 --> 00:58:10.280] I'm excited that I think I finally know what this noise is.
[00:58:10.280 --> 00:58:12.760] I'm quite confident that it's a zebra.
[00:58:12.760 --> 00:58:17.160] She goes on, If you ever take NOTACON internationally, I'll be the first in line for tickets.
[00:58:17.160 --> 00:58:18.200] I'm in Italy.
[00:58:18.200 --> 00:58:20.760] That's awesome, and I would love to have it in Italy.
[00:58:20.760 --> 00:58:29.720] And you should contact me to let me know if you think that there are enough English-speaking skeptics out there that actually would come to the conference.
[00:58:29.720 --> 00:58:33.160] You can email us at info at theskepticsguide.org.
[00:58:33.160 --> 00:58:34.040] I appreciate it.
[00:58:34.040 --> 00:58:35.560] And send me some good meatballs.
[00:58:35.560 --> 00:58:38.200] Okay, moving on to a listener named Aiden.
[00:58:38.200 --> 00:58:42.200] He goes, Jay, yes, he wrote my name with a crazy number of A's in it.
[00:58:42.200 --> 00:58:46.040] That sounded like a deer, probably a reindeer, considering the holiday season.
[00:58:46.040 --> 00:58:48.680] Not a bad guess at all, but not correct.
[00:58:48.680 --> 00:58:50.520] Let's get right to the answer here.
[00:58:50.520 --> 00:58:54.440] Crystal Haka was the first one to answer correctly.
[00:58:54.440 --> 00:58:57.720] And we also had another correct guess by Shane Hillier.
[00:58:57.720 --> 00:59:01.800] These two people basically guessed within seconds of each other, which I think is always funny.
[00:59:01.800 --> 00:59:09.480] But Crystal writes, Hey, this week's noisy definitely sounds like a bull elk in rut, maybe two about to fight, right?
[00:59:09.480 --> 00:59:14.520] So, you know, this is an elk, and that sound is called bugling, right?
[00:59:14.520 --> 00:59:18.360] And when they're in rut, it's like mating season, basically.
[00:59:18.360 --> 00:59:19.880] And they're trying to find each other.
[00:59:19.880 --> 00:59:21.080] They're trying to fight each other.
[00:59:21.080 --> 00:59:22.200] They're looking for alcohol.
[00:59:22.200 --> 00:59:23.800] It gets really messy out there.
[00:59:23.800 --> 00:59:26.600] I kept thinking you were going to say something different.
[00:59:26.600 --> 00:59:27.080] Yeah.
[00:59:27.400 --> 00:59:29.640] But the bottom line is, that is an elk.
[00:59:29.640 --> 00:59:40.280] Lots of listeners apparently lived near elk or knew someone that had elk or had some type of interaction with an elk and their vehicle.
[00:59:40.280 --> 00:59:41.800] That's all I'm going to say.
[00:59:42.120 --> 00:59:43.720] So, thank you all for writing in.
[00:59:43.720 --> 00:59:48.720] I really enjoyed your guesses this week, and I'm going to move right in to next week's Noisy.
[00:59:49.040 --> 00:59:53.280] Guys, this noisy was sent in by a listener named Bobby Duke.
[01:00:00.560 --> 01:00:10.320] Now, lots of people might have some initial guesses, but a really good guess would be that that's the noise Bob makes when he's putting away his Halloween decorations, right, Bob?
[01:00:10.640 --> 01:00:15.120] Yeah, I thought he was sobbing in between those cries, and yes.
[01:00:15.440 --> 01:00:23.440] All right, so guys, if you think you know what this week's noisy is or you heard something cool, you can email us at WTN at the skepticsguide.org.
[01:00:23.440 --> 01:00:25.120] Steve, hold on a second.
[01:00:25.120 --> 01:00:25.600] Yes.
[01:00:25.600 --> 01:00:28.800] All right, so first of all, it's the end of the year, right?
[01:00:29.440 --> 01:00:30.480] It's Christmas time.
[01:00:30.480 --> 01:00:31.520] Well, Christmas is coming.
[01:00:32.000 --> 01:00:45.840] If you guys know someone that listens to the show, you know, a sibling, a parent, a friend, a neighbor, maybe, you could gift them an SGU patron membership.
[01:00:45.840 --> 01:00:47.280] It's very easy to do.
[01:00:47.280 --> 01:00:56.080] So just go to patreon.com forward slash skepticsguide, and you will see that there is an option to gift someone a membership to the SGU.
[01:00:56.080 --> 01:01:00.400] That would be a great way to give someone a nice gift and also support your favorite podcast.
[01:01:00.400 --> 01:01:01.920] We would really appreciate it.
[01:01:01.920 --> 01:01:04.080] Or you could become a patron on your own.
[01:01:04.080 --> 01:01:09.360] You could also buy either the Skeptics Guide to the Universe or the Skeptics Guide to the Future books.
[01:01:09.360 --> 01:01:10.720] We have two books out there.
[01:01:10.720 --> 01:01:11.840] They make great gifts.
[01:01:11.840 --> 01:01:21.680] If you're giving somebody a gift Patreon membership, you could represent that with a physical item: one or both of those books.
[01:01:21.680 --> 01:01:24.720] And if you bring them to a live event, we will sign them for you.
[01:01:24.720 --> 01:01:25.280] Yes.
[01:01:25.280 --> 01:01:26.640] And I'll sign your forehead.
[01:01:26.640 --> 01:01:27.680] Whatever you want, we'll do it.
[01:01:27.840 --> 01:01:30.280] So, anyway, you could join our mailing list.
[01:01:29.760 --> 01:01:40.360] You can go to the skepticsguide.org and become a mailing list member, which basically means that every week we will send you an email that has all the details about everything that we've done the previous week.
[01:01:40.360 --> 01:01:47.960] Also, if you find a way to give our show a rating, we encourage you to do it because that helps new people find our podcast.
[01:01:47.960 --> 01:01:51.800] As you hear this show, it'll be Saturday the 7th.
[01:01:51.800 --> 01:01:55.560] We will be doing two live shows, so it's probably too late to buy tickets.
[01:01:55.560 --> 01:02:03.160] But if for some reason it isn't, you can buy tickets to our private show and the extravaganza, which is at night.
[01:02:03.160 --> 01:02:06.840] You can go to theskepticsguide.org and buy tickets there.
[01:02:06.840 --> 01:02:12.120] And the big one, which I'm putting a lot of time into, is NOTACON 2025.
[01:02:12.920 --> 01:02:16.440] This is going to be the weekend of May 15th.
[01:02:16.440 --> 01:02:18.920] And, you know, you've heard me talk about this.
[01:02:18.920 --> 01:02:22.760] It's a wonderful conference, tons of socializing.
[01:02:22.760 --> 01:02:26.360] If you come there and you want to make friends, you will make friends.
[01:02:26.360 --> 01:02:34.120] There's an extraordinary group of people that are, you know, the core patrons of the SGU that are going to attend and they're awesome.
[01:02:34.120 --> 01:02:35.720] And there's a lot of fun to be had.
[01:02:35.720 --> 01:02:45.800] Brian Wecht, Andrea Jones-Rooy, George Hrab, and the entire SGU will be there providing entertainment for the two days (2.2 days, like I said, Kara).
[01:02:45.800 --> 01:02:48.280] And that's a good way to support the SGU.
[01:02:48.280 --> 01:02:56.520] So please go to notaconcon.com or you can go to the skepticsguide.org and you can find all the information that you need.
[01:02:56.520 --> 01:02:57.640] Thank you, Jay.
[01:02:57.640 --> 01:03:02.680] Well, everyone, we're going to take a quick break from our show to talk about one of our sponsors this week, Mint Mobile.
[01:03:02.680 --> 01:03:06.200] With Mint Mobile, there's no hoops and there's no BS.
[01:03:06.200 --> 01:03:09.640] It's $15 a month with the purchase of a three-month plan.
[01:03:09.640 --> 01:03:13.080] It really is that easy to get wireless for $15 a month.
[01:03:13.080 --> 01:03:17.840] The longest part of this process is the time that you'll spend breaking up with your old provider.
[01:03:17.840 --> 01:03:23.840] All plans come with high-speed data and unlimited talk and text delivered on the nation's largest 5G network.
[01:03:23.840 --> 01:03:29.920] You can use your own phone with any Mint Mobile plan and bring your own phone number along with all your existing contacts.
[01:03:29.920 --> 01:03:37.040] Find out how easy it is to switch to Mint Mobile and get three months of premium wireless service for $15 a month.
[01:03:37.040 --> 01:03:44.240] To get this new customer offer and your new three-month premium wireless plan for just $15 a month, go to mintmobile.com/SGU.
[01:03:44.240 --> 01:03:46.800] That's mintmobile.com/SGU.
[01:03:46.800 --> 01:03:51.040] Cut your wireless bill to $15 a month at mintmobile.com/SGU.
[01:03:51.040 --> 01:03:54.480] $45 upfront payment required equivalent to $15 a month.
[01:03:54.480 --> 01:03:56.640] New customers on first three-month plan only.
[01:03:56.640 --> 01:03:59.120] Speeds slower above 40 GB on unlimited plan.
[01:03:59.120 --> 01:04:00.800] Additional taxes, fees, and restrictions apply.
[01:04:00.800 --> 01:04:02.320] See Mint Mobile for details.
[01:04:02.320 --> 01:04:04.640] All right, guys, let's get back to the show.
[01:04:04.640 --> 01:04:08.160] Got a couple of emails, a couple of interesting corrections for this week.
[01:04:08.160 --> 01:04:16.880] The first one comes from Doug Herr, who writes, Has Steve realized that the Fisher, not Fisher Cat, no such thing, does not scream?
[01:04:17.120 --> 01:04:22.880] I hope to find that Steve has realized that all of the Fisher screaming recordings are actually recordings of foxes.
[01:04:22.880 --> 01:04:27.120] As evidence, see the link in the Wikipedia reference, and it gives a link to that.
[01:04:27.120 --> 01:04:33.200] It shows how the person running that site realized that all the content on that site was from people hearing the sound but not seeing the animal.
[01:04:33.200 --> 01:04:37.920] It also shows that an expert with the animal has never heard much more than a hiss.
[01:04:37.920 --> 01:04:40.320] It is time to be skeptical.
[01:04:40.960 --> 01:04:46.080] I appreciate the correction, Doug, but you don't have to be so hurtful about it.
[01:04:46.080 --> 01:04:46.640] Yeah.
[01:04:47.280 --> 01:04:51.200] So, yes, I did as much as I could with the time that I had.
[01:04:51.200 --> 01:04:58.080] I tried to find a video of a Fisher or Fisher cat, which is the common name.
[01:04:58.080 --> 01:05:03.560] And apparently, fisher cat is a regional name particular to New England.
[01:05:03.880 --> 01:05:08.520] So, by disrespecting that, you are disrespecting my culture and my family.
[01:05:09.800 --> 01:05:10.840] We will accept apologies.
[01:05:11.000 --> 01:05:13.160] We call them fisher cats in this family.
[01:05:13.480 --> 01:05:15.960] So, fisher is the technical name.
[01:05:15.960 --> 01:05:19.960] They do, however, have many common names, one of which is fisher cat.
[01:05:19.960 --> 01:05:23.160] If enough people use it, it becomes at least a common name.
[01:05:23.160 --> 01:05:25.160] And the name fisher, by the way: okay, it's not a cat.
[01:05:25.160 --> 01:05:26.040] It's also not a fisher.
[01:05:26.040 --> 01:05:27.880] It doesn't eat fish, doesn't fish.
[01:05:28.200 --> 01:05:30.120] That's a misnomer, too.
[01:05:30.120 --> 01:05:39.240] It probably derives from the fact that American settlers thought that it looked like a European pole cat.
[01:05:39.240 --> 01:05:42.840] And in French, that's like fichette or whatever.
[01:05:43.080 --> 01:05:47.000] So, like, it got bastardized to fisher.
[01:05:47.000 --> 01:05:48.200] Oh, we're good at doing that.
[01:05:48.440 --> 01:05:55.480] Not sure why it became fisher cat, but it does kind of look like a cat, so it's easy to say, well, some people thought it looked like a cat, so they call it a fisher cat, whatever.
[01:05:55.480 --> 01:05:58.040] Um, it is a member of the weasel family.
[01:05:58.040 --> 01:05:58.840] Pop.
[01:05:58.840 --> 01:05:59.320] Yep.
[01:05:59.320 --> 01:06:00.440] They're actually kind of cute.
[01:06:00.760 --> 01:06:02.920] They're pretty good predators.
[01:06:02.920 --> 01:06:11.480] And when we talked about the scream on the show, I mentioned the fact that somebody from Europe said, no, that's a fox.
[01:06:11.560 --> 01:06:19.960] Like, well, yeah, foxes have that same scream, and there's actually lots of websites that have how to tell the difference between a fisher scream and a red fox scream.
[01:06:19.960 --> 01:06:25.320] But now he's saying that it's all foxes, like there's no fisher scream.
[01:06:25.320 --> 01:06:27.320] And he, I think that may be correct.
[01:06:27.320 --> 01:06:31.560] I have not been able to find a video of a fisher screaming.
[01:06:32.040 --> 01:06:35.400] There's plenty of videos of fishers, but they're not screaming.
[01:06:35.400 --> 01:06:43.720] And there's video of like screaming coming from the woods, but you're not seeing the animal that's making that noise.
[01:06:43.720 --> 01:06:48.800] I have not been able to find any actual video of a fisher screaming.
[01:06:44.760 --> 01:06:51.920] So it's possible that it doesn't, in fact, happen.
[01:06:52.240 --> 01:06:56.960] And just from my reading, what experts say is they probably don't.
[01:06:57.280 --> 01:06:59.360] They mainly grunt and hiss.
[01:06:59.360 --> 01:07:02.560] They're predators and they're pretty quiet creatures.
[01:07:02.560 --> 01:07:08.800] But under stress, under distress, under rare circumstances, they may scream.
[01:07:08.800 --> 01:07:10.880] They can't rule that out.
[01:07:10.880 --> 01:07:14.160] But we don't really have, it turns out it's not well documented.
[01:07:14.160 --> 01:07:19.840] A lot of people assume that they do; there's certainly a widespread belief that they do.
[01:07:19.840 --> 01:07:23.200] But it turns out there's no hard evidence.
[01:07:23.200 --> 01:07:24.160] I'll keep looking.
[01:07:24.160 --> 01:07:26.320] We could crowd towards this as well.
[01:07:26.320 --> 01:07:31.680] I don't know if there's any actual documented cases of a fisher screaming.
[01:07:31.680 --> 01:07:36.000] I could not find any, and it seems like the consensus is they probably don't, which is interesting.
[01:07:36.000 --> 01:07:40.960] All right, next email comes from Ron, who writes, Hello, fellow skeptics and fellow Connecticut residents.
[01:07:40.960 --> 01:07:46.000] There has been some buzz lately over the sighting of a UFO by a police officer in Connecticut.
[01:07:46.000 --> 01:07:48.880] He even filmed some of it on his cell phone.
[01:07:48.880 --> 01:07:51.280] You can Google it or go to the link in the news station.
[01:07:51.280 --> 01:07:53.440] I would love to see Steve comment on the story.
[01:07:53.440 --> 01:07:55.680] I think it's worth examining some of the logic in the story.
[01:07:55.680 --> 01:07:57.840] At any rate, I'm a huge fan of the SGU.
[01:07:57.840 --> 01:07:58.960] Keep up the good work.
[01:07:58.960 --> 01:08:00.160] And then he provides a link.
[01:08:00.160 --> 01:08:01.360] So, yes.
[01:08:01.840 --> 01:08:07.520] As Bob says, so this was a, yeah, this is in Fairfield County, like where we live.
[01:08:07.520 --> 01:08:13.600] And it was a police officer, Robert Klein, and here's how the story goes, right?
[01:08:13.600 --> 01:08:17.440] The story that he tells is it was like two in the morning, three in the morning.
[01:08:17.440 --> 01:08:17.840] Oh, is it?
[01:08:18.000 --> 01:08:19.360] He's pulling an overnight shift.
[01:08:19.360 --> 01:08:23.120] He's riding on a lonely country road with nobody around.
[01:08:23.120 --> 01:08:32.840] And then suddenly he's blinded by this bright light shining into the cabin of his police car, to the point where he couldn't really see.
[01:08:33.160 --> 01:08:39.560] And then he describes this glowing orange-red orb that was beaming the light.
[01:08:39.560 --> 01:08:43.560] And then it flew across the lake, and then you could see it sort of moving in the distance.
[01:08:43.560 --> 01:08:57.000] And then he sort of overcame the shock of the whole situation, took out his phone, and filmed the UFO, now more in the distance.
[01:08:57.000 --> 01:08:58.440] That's the story, right?
[01:08:59.880 --> 01:09:03.720] I know you guys have all had an opportunity to see the news reports and the video.
[01:09:03.720 --> 01:09:12.280] So, of course, the local news does a total unskeptical hatchet job of reporting this because they're interested only in the sensation.
[01:09:12.280 --> 01:09:21.800] And they pull in as an expert this guy, Ross Coulthart, who is an investigative journalist, but he's a UFO nut, right?
[01:09:21.800 --> 01:09:24.440] The bottom line is he's not really an expert in anything.
[01:09:24.440 --> 01:09:26.120] He's just a true believer.
[01:09:26.120 --> 01:09:39.480] And this guy proceeds to trot out every unskeptical, uncritical pro-UFO trope in his interview on the topic.
[01:09:39.480 --> 01:09:40.920] It's just embarrassing.
[01:09:41.160 --> 01:09:43.320] His attitude towards skeptics was galling.
[01:09:43.320 --> 01:09:47.400] I mean, I just wanted to eviscerate all the straw men that he was throwing up.
[01:09:47.400 --> 01:09:55.960] Describing skeptics, he actually said that we are experiencing denialism in the face of overwhelming evidence.
[01:09:55.960 --> 01:09:58.680] Sure, that's what skeptics are all about, right?
[01:09:59.000 --> 01:10:03.800] But then later, like a minute later or two, he says, it's important to be skeptical.
[01:10:03.800 --> 01:10:05.320] Like, dude, make up your mind, right?
[01:10:05.320 --> 01:10:06.360] What are you talking about?
[01:10:06.360 --> 01:10:06.840] It's folks.
[01:10:07.080 --> 01:10:09.160] I just got so mad, so mad at him.
[01:10:09.560 --> 01:10:12.360] Yeah, he's emphasizing this guy's a 25-year veteran.
[01:10:12.360 --> 01:10:14.600] He's like, you know, so what?
[01:10:14.600 --> 01:10:15.760] That is irrelevant.
[01:10:14.840 --> 01:10:19.200] That's the credible witness kind of fallacy.
[01:10:20.080 --> 01:10:23.920] It doesn't change the fact that it was late, early in the morning.
[01:10:23.920 --> 01:10:24.800] He's by himself.
[01:10:24.800 --> 01:10:26.800] The guy could have been half asleep as far as we know.
[01:10:26.960 --> 01:10:28.080] The witching hour, 3 a.m.
[01:10:28.240 --> 01:10:28.880] That's when most of the time.
[01:10:29.040 --> 01:10:30.240] No, he made a point.
[01:10:30.240 --> 01:10:30.960] They made a point.
[01:10:30.960 --> 01:10:34.400] He made a point of saying it was early in his overnight shift.
[01:10:34.400 --> 01:10:34.640] So?
[01:10:35.120 --> 01:10:36.800] So he wasn't up all night yet, meaning he wasn't sleep deprived.
[01:10:37.520 --> 01:10:39.120] But we don't know what his state of mind was.
[01:10:39.120 --> 01:10:43.520] Even shift workers are sleep deprived, period.
[01:10:43.840 --> 01:10:45.120] And so whatever.
[01:10:45.120 --> 01:10:46.320] But I don't know.
[01:10:46.320 --> 01:10:48.560] I don't have any first-hand knowledge of what his actual state was.
[01:10:48.560 --> 01:10:50.240] But it certainly is plausible.
[01:10:50.240 --> 01:11:03.600] But even without that, even if he was wide awake, it doesn't matter because he's assuming that he's not subject to optical perceptual illusions, right?
[01:11:04.000 --> 01:11:09.600] It doesn't matter how long you've been a police officer, you have the same susceptibility to visual illusions as everybody else.
[01:11:09.920 --> 01:11:11.520] If you're human, you're susceptible.
[01:11:11.520 --> 01:11:12.400] Absolutely.
[01:11:12.400 --> 01:11:15.360] So what's the simplest explanation for the evidence we have?
[01:11:15.360 --> 01:11:16.320] We basically have two things.
[01:11:16.320 --> 01:11:20.960] We have his story and we have the video, right?
[01:11:21.360 --> 01:11:22.720] I didn't see his video, though.
[01:11:22.720 --> 01:11:23.360] I just saw the recreation.
[01:11:24.160 --> 01:11:28.000] Well, yeah, the news did a recreation, which is utterly worthless.
[01:11:28.000 --> 01:11:28.640] It's worthless.
[01:11:29.360 --> 01:11:30.720] Oh, gosh, it was right out of it.
[01:11:30.800 --> 01:11:31.360] It was misleading.
[01:11:31.520 --> 01:11:31.760] I know.
[01:11:31.760 --> 01:11:32.480] It's like, what is it?
[01:11:32.480 --> 01:11:33.120] It's worthless.
[01:11:33.120 --> 01:11:33.600] What are you making?
[01:11:33.760 --> 01:11:34.880] Yeah, you're making a digital recreation.
[01:11:35.120 --> 01:11:36.960] Is this fire in the sky, the movie?
[01:11:37.360 --> 01:11:38.240] It's terrible.
[01:11:38.560 --> 01:11:43.440] No, if you keep watching, Bob, you keep watching the news report, they then start to show his actual video.
[01:11:43.440 --> 01:11:43.920] Right.
[01:11:43.920 --> 01:11:45.360] And you know what it shows?
[01:11:45.360 --> 01:11:46.160] An airplane.
[01:11:46.160 --> 01:11:47.360] A light in the distance.
[01:11:47.360 --> 01:11:49.200] I guarantee you it's a drone.
[01:11:49.760 --> 01:11:50.080] Or something.
[01:11:50.320 --> 01:11:54.640] It's just a normal light not moving in any way that's unusual.
[01:11:54.640 --> 01:12:00.520] It's just, it's, I mean, completely and easily explained by.
[01:12:01.960 --> 01:12:02.840] That is so.
[01:12:02.840 --> 01:12:06.680] I mean, think about how he exaggerated that story then.
[01:12:06.680 --> 01:12:09.560] And he said, Jay, I can guarantee you it wasn't a drone.
[01:12:09.560 --> 01:12:10.440] Can you really?
[01:12:10.440 --> 01:12:11.400] Can you really guarantee?
[01:12:11.800 --> 01:12:16.280] The only hard evidence we have is 100% consistent with a drone.
[01:12:16.280 --> 01:12:20.920] And then he's saying, first of all, he could have underestimated how close it was, right?
[01:12:20.920 --> 01:12:24.680] He thought it was moving very fast when actually it was just super close.
[01:12:24.680 --> 01:12:37.720] You know, a drone with a light, if somebody was buzzing him with the drone because he was on the side of the road or whatever, and then he's interpreting the halo of light around, you know what I mean?
[01:12:37.720 --> 01:12:55.480] Like, if you're out at night and your eyes are dark adapted and then there's a bright light, you get this sort of glow around the light source, and you can see it on the video. He's interpreting that glow as the thing itself, not just a visual aura around a bright light source.
[01:12:55.480 --> 01:12:57.240] So yeah, he just got surprised by that.
[01:12:57.240 --> 01:12:58.600] He misinterpreted what he saw.
[01:12:58.600 --> 01:13:03.080] He thought something was moving fast because it was probably closer than he thought, and then it flew off into the distance.
[01:13:03.080 --> 01:13:07.560] He gets out his phone and he videotapes a freaking drone.
[01:13:07.560 --> 01:13:09.720] So this is, again, it's a nothing burger.
[01:13:09.720 --> 01:13:11.000] It's much ado about nothing.
[01:13:11.000 --> 01:13:13.400] Again, the hard evidence is nothing.
[01:13:13.400 --> 01:13:17.160] It's just nothing out of the ordinary at all.
[01:13:17.560 --> 01:13:23.240] And even if, I mean, my first thought, I'm looking at the video now, like, yeah, that clearly could be a drone for sure.
[01:13:23.320 --> 01:13:24.920] So it's much, I'll say this.
[01:13:24.920 --> 01:13:29.000] It's much more likely than an alien, an extraterrestrial craft.
[01:13:29.000 --> 01:13:29.320] Right.
[01:13:29.560 --> 01:13:31.800] But you know what else would be more plausible than that?
[01:13:31.800 --> 01:13:32.920] How about ball lightning?
[01:13:33.240 --> 01:13:35.560] That's basically a thing at this point.
[01:13:35.800 --> 01:13:36.120] Right?
[01:13:36.120 --> 01:13:38.360] I mean, that could potentially have been ball lightning.
[01:13:38.520 --> 01:13:39.640] In this video, it's a drone.
[01:13:39.640 --> 01:13:40.280] I mean, come on.
[01:13:40.520 --> 01:13:40.920] Yeah.
[01:13:40.920 --> 01:13:43.400] Looking at the video, I would definitely say it's a drone.
[01:13:43.400 --> 01:13:46.880] But, Bob, you're denying this overwhelming evidence, yeah, Bob.
[01:13:46.880 --> 01:13:48.400] Yeah, that's what I am.
[01:13:48.560 --> 01:13:49.360] Yeah, that's what I am.
[01:13:49.360 --> 01:13:50.000] I'm a skeptical.
[01:13:50.320 --> 01:13:50.960] I'm a denier.
[01:13:44.840 --> 01:13:51.760] You're a denier.
[01:13:52.640 --> 01:13:54.400] Yeah, the evidence is overwhelming.
[01:13:54.400 --> 01:13:54.960] Overwhelming.
[01:13:55.280 --> 01:13:56.080] Gotta be a UFO.
[01:13:56.160 --> 01:13:57.440] Gotta be an alien craft.
[01:13:57.440 --> 01:13:59.600] May I add another layer to this, please?
[01:13:59.920 --> 01:14:01.120] The sighting took place.
[01:14:01.200 --> 01:14:02.000] So, two things.
[01:14:02.000 --> 01:14:03.920] The sighting took place in 2022.
[01:14:03.920 --> 01:14:04.400] Yes.
[01:14:04.400 --> 01:14:09.120] And the police officer only came out recently and talked about it for the first time.
[01:14:09.120 --> 01:14:11.040] So there's a three-year gap here, folks.
[01:14:11.040 --> 01:14:11.600] Two years.
[01:14:11.600 --> 01:14:11.840] Right.
[01:14:11.840 --> 01:14:12.240] Of time.
[01:14:12.240 --> 01:14:13.520] A three-and-a-half year.
[01:14:13.520 --> 01:14:15.760] I'm sorry, two and a half-year gap of time.
[01:14:15.760 --> 01:14:18.080] So you have to take that into consideration as well.
[01:14:18.080 --> 01:14:26.560] I went back and I looked up some news items back in April of 2022 when this supposed incident took place in Connecticut.
[01:14:26.560 --> 01:14:27.920] Let's see, what do we have here?
[01:14:27.920 --> 01:14:29.200] Oh, Patch, Connecticut.
[01:14:29.600 --> 01:14:34.480] The Lyrid meteor shower of 2022 in April.
[01:14:34.480 --> 01:14:44.880] The first of the spring meteor showers, producing about 15 to 20 shooting stars an hour, appearing as fireballs in the night sky.
[01:14:44.880 --> 01:14:46.000] Yeah, but this is not that.
[01:14:47.600 --> 01:14:51.760] I will propose that there are perhaps two separate incidents here.
[01:14:52.640 --> 01:14:55.520] Because he describes it as changing color.
[01:14:55.840 --> 01:15:00.800] And if you've seen these things fall from the sky, and I've seen them before, they do change color.
[01:15:01.120 --> 01:15:01.360] Right?
[01:15:01.360 --> 01:15:03.120] They'll start, you know, as hot white.
[01:15:03.120 --> 01:15:06.880] It'll go to green and then, you know, some shade of red or something.
[01:15:06.880 --> 01:15:08.320] So they will tend to change color.
[01:15:08.320 --> 01:15:14.000] And they do light up like a flare in the sky, which is kind of what he was describing.
[01:15:14.000 --> 01:15:24.880] I would say by the time he finally oriented himself and got out of his car, he focused on something else entirely: the drone, or whatever it was, across the lake.
[01:15:24.880 --> 01:15:26.960] I think he might be conflating two separate incidents.
[01:15:27.040 --> 01:15:27.840] That's possible.
[01:15:27.840 --> 01:15:28.480] Possibly.
[01:15:28.480 --> 01:15:29.040] That's possible.
[01:15:30.040 --> 01:15:43.960] Just saying, you know, a little research online, you know, if you look up what was happening locally at the time, this is right here, and you know, these meteor showers can produce those effects.
[01:15:43.960 --> 01:15:45.800] I don't care what the guy's testimony is.
[01:15:45.800 --> 01:15:46.840] It's irrelevant.
[01:15:46.840 --> 01:15:48.440] It doesn't matter.
[01:15:48.760 --> 01:15:49.960] We have hard evidence.
[01:15:49.960 --> 01:15:52.440] The hard evidence is completely unimpressive.
[01:15:52.440 --> 01:15:59.160] It's a light in the distance consistent with, again, in this day and age, probably a drone.
[01:15:59.160 --> 01:16:06.280] The other thing that Coulthart goes on and on about is how much these sightings are increasing recently.
[01:16:06.280 --> 01:16:08.120] Gee, I wonder why that is.
[01:16:08.520 --> 01:16:13.880] Why would there be an increase in sightings of drone-like objects in the last few years?
[01:16:13.880 --> 01:16:16.840] What could possibly be the explanation for that?
[01:16:16.840 --> 01:16:17.640] I can't imagine.
[01:16:18.360 --> 01:16:19.560] Drone-like you say?
[01:16:19.560 --> 01:16:20.760] I don't know.
[01:16:21.080 --> 01:16:22.120] Unbelievable.
[01:16:22.120 --> 01:16:22.760] All right.
[01:16:22.760 --> 01:16:23.880] Terrible reporting.
[01:16:23.880 --> 01:16:25.640] Lack of skepticism.
[01:16:25.960 --> 01:16:29.720] Let's go on with science or fiction.
[01:16:31.960 --> 01:16:37.080] It's time for science or fiction.
[01:16:41.480 --> 01:16:50.600] Each week, I come up with three science news items or facts: two real and one fake, and then I challenge my panel of skeptics to tell me which one is the fake.
[01:16:50.920 --> 01:16:57.000] We have a theme this week, so these are just three facts that have to do with surprising statistics.
[01:16:57.000 --> 01:16:57.560] Surprise.
[01:16:57.720 --> 01:16:58.440] Here we go.
[01:16:58.440 --> 01:17:06.280] Item number one: koalas are the sleepiest animals, sleeping for 20 to 22 hours per day.
[01:17:06.280 --> 01:17:14.520] Item number two: about 10% of people have an atypical number of presacral spinal vertebra.
[01:17:14.520 --> 01:17:21.920] And item number three, of the human remains found at Machu Picchu, 80% are female.
[01:17:21.920 --> 01:17:22.880] Jay, go first.
[01:17:22.880 --> 01:17:26.640] All right, the first one here, koalas are the sleepiest animals.
[01:17:26.640 --> 01:17:29.440] I mean, I agree with that.
[01:17:29.440 --> 01:17:31.760] They have a hell of a life, those animals.
[01:17:32.080 --> 01:17:34.080] Yeah, I think that one is science.
[01:17:34.080 --> 01:17:39.600] The second one here, it says about 10% of people have an atypical number of pre-sacral.
[01:17:39.680 --> 01:17:40.400] Pre-sacral.
[01:17:40.400 --> 01:17:40.960] Sacral.
[01:17:40.960 --> 01:17:41.440] Sacral.
[01:17:41.760 --> 01:17:47.360] So, about the spinal numbers: there's the cervical, thoracic, lumbar, and sacral spine.
[01:17:47.360 --> 01:17:49.440] So not including the sacrum.
[01:17:49.440 --> 01:17:51.440] That's basically your tailbone, right?
[01:17:51.440 --> 01:17:54.800] For the cervical, thoracic, and lumbar part of the spine.
[01:17:54.800 --> 01:17:59.040] 10% of people have an atypical number of vertebra.
[01:17:59.040 --> 01:17:59.680] Oh, wow.
[01:18:00.000 --> 01:18:06.160] And then the last one here of the human remains found at Machu Picchu, 80% are female.
[01:18:06.160 --> 01:18:08.240] Okay, so this is what I think about that.
[01:18:08.240 --> 01:18:15.920] They did an awful lot of sacrificing, and I could come up with a reason why they mostly sacrificed women.
[01:18:15.920 --> 01:18:20.000] I just don't know enough about their culture to know if that was what they did.
[01:18:20.000 --> 01:18:25.280] You know, and then the second one, you said, you know, it's 10% of people have this atypical number.
[01:18:25.600 --> 01:18:28.240] That is a low, relatively low percentage number.
[01:18:28.240 --> 01:18:31.200] You know, it's like, okay, so, you know, 10% of the people have like...
[01:18:31.200 --> 01:18:34.080] They either have extra vertebra or they have too few vertebra.
[01:18:34.080 --> 01:18:34.720] Okay.
[01:18:34.720 --> 01:18:39.040] You know, I think I'm going to go with that one is the fake because I don't know.
[01:18:39.040 --> 01:18:43.760] Something is rubbing me the wrong way about people's spines having not the same number.
[01:18:44.560 --> 01:18:46.720] You don't have the same number of bones as everybody else.
[01:18:46.720 --> 01:18:47.920] I don't think so.
[01:18:47.920 --> 01:18:48.800] Okay, Evan.
[01:18:49.280 --> 01:18:54.320] Koala is the sleepiest animal, sleeping for 20 to 22 hours per day.
[01:18:54.320 --> 01:18:57.840] Oh, those eucalyptus leaves will do that to you, won't they?
[01:18:57.840 --> 01:18:58.400] Yes.
[01:18:58.880 --> 01:19:13.320] This could perhaps have been something we learned about on our first trip to Australia, and I seem to recall this being in sync with that excursion we did to the, what was it, the Sydney Zoo, I believe.
[01:19:14.440 --> 01:19:15.960] So I think that one's right.
[01:19:15.960 --> 01:19:23.080] And then the second one about 10% of people having an atypical number of presacral spinal vertebra.
[01:19:23.400 --> 01:19:26.600] Well, yeah, I have no idea.
[01:19:26.840 --> 01:19:32.680] Never, you know, was invited to anywhere to talk about this by any experts, so I don't know.
[01:19:33.080 --> 01:19:39.000] 10% of people, I guess that could be variations in.
[01:19:40.040 --> 01:19:47.560] I wonder if there are other parts of the anatomy that also have these types of atypical numbers, right?
[01:19:48.200 --> 01:19:50.840] Or, you know, other measurements that also equal that.
[01:19:51.000 --> 01:19:55.960] The last one about Machu Picchu, 80% are female.
[01:19:55.960 --> 01:20:02.120] So this one, I think, of the three is the one that I think I'm going to say is the fiction.
[01:20:02.440 --> 01:20:10.360] They would, gosh, unfortunately, I can only go based on kind of what I've seen on television shows and movies and things.
[01:20:10.360 --> 01:20:13.320] And who knows what that's all about.
[01:20:13.320 --> 01:20:15.800] And it's, I have no record.
[01:20:15.960 --> 01:20:21.880] I've seen those scenes in which they did sacrifice, have done sacrifices of people.
[01:20:21.880 --> 01:20:24.440] They all seem to be male characters.
[01:20:24.440 --> 01:20:31.520] That doesn't necessarily mean anything in the real world, but they were way skewed toward males.
[01:20:32.040 --> 01:20:35.480] I don't think I ever saw a woman be sacrificed in any of those movies.
[01:20:36.040 --> 01:20:42.840] And 80% allows for a lot more range, I think, as far as all these options go.
[01:20:42.840 --> 01:20:46.720] So I'll put my nickel down there and say the Machu Picchu one is the fiction.
[01:20:46.720 --> 01:20:47.920] Okay, Kara.
[01:20:47.920 --> 01:20:51.200] Okay, so koalas are sleepy.
[01:20:44.920 --> 01:20:51.680] This I know.
[01:20:51.840 --> 01:20:53.360] I don't know if they're the sleepiest.
[01:20:53.360 --> 01:20:54.960] I know that we have a koala.
[01:20:54.960 --> 01:21:02.800] You guys remember the story: the one at the LA Zoo who didn't like to go up in the trees at night, and one time a keeper let it stay out and it slept on the ground.
[01:21:02.800 --> 01:21:08.720] And our resident Puma, who has since died, P22, snuck into the zoo and ate it.
[01:21:12.240 --> 01:21:13.840] That's what they get for sleeping so much.
[01:21:13.840 --> 01:21:17.840] I have no idea if they're the sleepiest, but yes, I think they are well known to be quite sleepy.
[01:21:17.840 --> 01:21:20.240] I don't think the eucalyptus is like a drug.
[01:21:20.240 --> 01:21:22.320] I think it just doesn't provide a whole lot of energy.
[01:21:22.320 --> 01:21:26.800] 10% of people have an atypical number of presacral spinal vertebra.
[01:21:26.800 --> 01:21:36.080] I mean, I feel like if you took 10%, if you took a population of people, 10% of them would have something anomalous somewhere.
[01:21:36.240 --> 01:21:36.960] You know what I mean?
[01:21:36.960 --> 01:21:39.360] Like maybe not something big like polydactyly.
[01:21:39.360 --> 01:21:40.480] That might not be 10%.
[01:21:40.480 --> 01:21:50.560] But I bet you if you looked at like a hundred hearts, 10% of them would have like a weird little, like it would be bending in a different direction or like there'd be a weird little extra something here.
[01:21:50.560 --> 01:21:51.360] Like who knows?
[01:21:51.360 --> 01:21:54.160] But that doesn't seem unreasonable, 10%.
[01:21:54.160 --> 01:22:03.040] I don't know why both of you think that the human remains found in Machu Picchu are, like, sacrificial remains.
[01:22:03.040 --> 01:22:04.560] That feels like a leap to me.
[01:22:04.560 --> 01:22:07.440] Wasn't Machu Picchu just a city?
[01:22:07.440 --> 01:22:09.520] Yeah, I feel like, wasn't this just a city?
[01:22:09.520 --> 01:22:11.280] And people died in cities.
[01:22:11.280 --> 01:22:12.160] And then they were buried.
[01:22:12.160 --> 01:22:14.240] They're getting beheaded by the priests.
[01:22:14.880 --> 01:22:15.680] I don't know.
[01:22:15.680 --> 01:22:20.880] I think there were just dead people at this ancient Inca city.
[01:22:21.360 --> 01:22:24.560] So I don't know why that would be 80% female.
[01:22:24.720 --> 01:22:25.480] Right, why would that be?
[01:22:25.560 --> 01:22:26.960] That city was 80% female?
[01:22:27.120 --> 01:22:29.440] It should be more like maybe 55% or something.
[01:22:29.560 --> 01:22:30.360] That's what I'm thinking.
[01:22:29.760 --> 01:22:35.160] Yeah, I'm thinking maybe that one's not science just because it should be half.
[01:22:35.480 --> 01:22:39.240] So I'm going to put my I'm going to lightly put my nickel on that.
[01:22:39.240 --> 01:22:40.360] Woo-hoo, two nickels.
[01:22:40.360 --> 01:22:41.160] Okay, and Bob.
[01:22:41.640 --> 01:22:43.720] All right, so koalas, yeah, that makes sense.
[01:22:43.880 --> 01:22:53.240] I've heard as well that they're very sleepy, and part of me is thinking Steve knows he remembers that day when we all learned that, and now they found out that they're not.
[01:22:53.960 --> 01:22:56.840] Oh, if that's the case.
[01:22:58.440 --> 01:22:59.560] I will not be a happy person.
[01:22:59.960 --> 01:23:01.720] It's a distinct possibility.
[01:23:02.520 --> 01:23:02.920] Don't even.
[01:23:03.160 --> 01:23:03.640] Let's see.
[01:23:03.800 --> 01:23:08.520] The variation in the vertebra, it seems a little high, but I can see that.
[01:23:08.520 --> 01:23:11.240] I mean, we got some, there's some crazy variations out there.
[01:23:11.720 --> 01:23:18.280] Even with types of muscles, some people have an extra muscle in their forearm that most people don't have.
[01:23:18.280 --> 01:23:19.560] It's crazy stuff.
[01:23:19.560 --> 01:23:22.200] So variation like that, I can kind of see.
[01:23:22.200 --> 01:23:27.320] But the remains found in Machu Picchu, I don't know anything about Machu Picchu at all, hardly.
[01:23:27.320 --> 01:23:28.920] And just mainly based on that.
[01:23:28.920 --> 01:23:32.920] The other ones I'm more familiar with, I'll just say that one is fiction.
[01:23:32.920 --> 01:23:33.640] All right.
[01:23:33.640 --> 01:23:35.880] So you guys all agree on the first one?
[01:23:35.880 --> 01:23:36.840] So we'll start there.
[01:23:36.840 --> 01:23:42.120] Koalas are the sleepiest animals sleeping for 20 to 22 hours per day.
[01:23:42.120 --> 01:23:44.440] You guys all think this one is science.
[01:23:44.440 --> 01:23:45.080] Better be science.
[01:23:45.480 --> 01:23:48.200] This one is science.
[01:23:48.200 --> 01:23:49.080] Is it science?
[01:23:49.960 --> 01:23:51.320] Yeah, we would have killed you.
[01:23:52.840 --> 01:23:59.800] There are other, you know, almost as sleepy animals, but yeah, the koala is still considered to be the most sleepiest of the animals.
[01:24:00.200 --> 01:24:01.080] Most sleepy.
[01:24:01.080 --> 01:24:01.720] And yeah, 20.
[01:24:01.880 --> 01:24:03.720] You're just sleeping 22 hours a day.
[01:24:03.880 --> 01:24:04.840] Yeah, Kara, I think you're right.
[01:24:04.840 --> 01:24:07.880] I think it's just because they don't get a lot of energy out of those eucalyptus leaves.
[01:24:08.440 --> 01:24:10.440] Oh, yeah, they're bereft of nutrition.
[01:24:10.440 --> 01:24:13.640] They're just like hypoglycemic all the time.
[01:24:14.040 --> 01:24:18.240] Well, sleeping is an adaptation to energy efficiency, right?
[01:24:18.240 --> 01:24:21.280] You don't want to expend any more energy than you absolutely have to.
[01:24:21.280 --> 01:24:22.720] All right, I guess we'll take these in order.
[01:24:22.720 --> 01:24:27.120] About 10% of people have an atypical number of pre-sacral spinal vertebra.
[01:24:27.120 --> 01:24:28.960] Jay, you think this one is the fiction.
[01:24:28.960 --> 01:24:31.280] Everyone else thinks this one is science.
[01:24:31.280 --> 01:24:36.080] And for this one, the question is: is it 1%?
[01:24:36.720 --> 01:24:38.480] Is it 20%?
[01:24:38.480 --> 01:24:39.760] Is it 5%?
[01:24:39.760 --> 01:24:41.360] Is it 0.1%?
[01:24:42.160 --> 01:24:43.360] 90%.
[01:24:43.360 --> 01:24:46.080] This one is science.
[01:24:46.080 --> 01:24:47.360] This is science.
[01:24:47.360 --> 01:24:53.840] Now, that number is, there are different estimates, right, depending on the method that is used.
[01:24:53.840 --> 01:25:03.280] But the best studies, ones where they do like total spine MRI scans, so you could count every single vertebra, come in at around 10%.
[01:25:03.280 --> 01:25:04.720] Which is actually kind of high.
[01:25:04.720 --> 01:25:06.080] But you're, I mean, Kara, you're right.
[01:25:06.080 --> 01:25:12.240] What we learn of as normal anatomy is really only like 80 to 90%.
[01:25:12.480 --> 01:25:12.960] Guideline.
[01:25:13.120 --> 01:25:15.120] Yeah, that's sure.
[01:25:15.120 --> 01:25:18.640] Like maybe sometimes it's 60%, sometimes it's 90%.
[01:25:18.640 --> 01:25:19.440] Sometimes it's higher.
[01:25:19.440 --> 01:25:22.400] Sometimes, depending on how absolutely critical it is.
[01:25:22.400 --> 01:25:29.120] For the cervical vertebra, it's very rare to have an abnormal number of neck vertebra.
[01:25:29.120 --> 01:25:34.800] But thoracic and lumbar variations are pretty common, you know, adding up to about 10%.
[01:25:34.800 --> 01:25:41.920] People have, like, six or four lumbar vertebra, or they have 11 or 13 thoracic vertebra.
[01:25:41.920 --> 01:25:49.360] Interestingly, this can result in surgeons operating on the wrong vertebra.
[01:25:49.600 --> 01:25:50.240] Oh, crap.
[01:25:50.240 --> 01:25:51.520] You better count them all.
[01:25:51.520 --> 01:25:51.880] Yeah, exactly.
[01:25:52.040 --> 01:25:59.960] And in fact, they think that abnormal vertebral numbers are responsible for 40% of the times when that happens.
[01:25:59.600 --> 01:26:01.640] Holy crap, that's almost half of them.
[01:26:02.760 --> 01:26:04.440] Don't they do a full x-ray before?
[01:26:04.920 --> 01:26:10.360] Well, they just x-ray, but they might do a thoracic MRI scan and say, oh, yeah, that's the vertebra that we need to go for.
[01:26:10.840 --> 01:26:12.040] They're going to do the whole back.
[01:26:12.360 --> 01:26:20.840] Well, that's the paper that I was reading is basically saying, yes, we should do that because people were underestimating how frequent this is.
[01:26:21.320 --> 01:26:27.000] And it actually can affect trying to figure out which level is causing the pinched nerve.
[01:26:27.000 --> 01:26:37.880] Like, you have an L2 pinched nerve, and you count out the L2 vertebra, but it's like, no, but it's actually the L1 because there's an extra vertebra in there or whatever.
[01:26:38.760 --> 01:26:40.760] It can cause the surgeons to make a mistake.
[01:26:40.760 --> 01:26:45.480] So we have to consider variability in anatomy.
[01:26:45.480 --> 01:26:47.480] It's more common than we think.
[01:26:47.720 --> 01:26:51.800] Doing nerve conduction studies for the last 20 years, this comes up a lot there too.
[01:26:52.760 --> 01:26:58.280] You see the diagram of the nerve innervation, both of the skin and of the muscles and everything.
[01:26:58.280 --> 01:27:02.520] It's like, yeah, that's like, yeah, it's 60%, 70%, 80%, 90% of the time.
[01:27:02.520 --> 01:27:08.200] But depending on what nerve you're looking at, especially the farther you get downstream, you know what I mean?
[01:27:08.200 --> 01:27:13.480] Like, yeah, everyone has an aorta, but the smaller you get, the more variability there is.
[01:27:13.480 --> 01:27:21.080] And people have a lot of variability in their, like the nerve endings, even when you get to second, third degree, you know, branchings.
[01:27:21.400 --> 01:27:22.760] So yeah, a lot of variability.
[01:27:22.760 --> 01:27:28.200] Okay, that means that of the human remains found in Machu Picchu, 80% are female is the fiction.
[01:27:28.200 --> 01:27:30.520] However, I didn't make this up.
[01:27:31.000 --> 01:27:34.440] There was a study which showed that this was the case.
[01:27:34.440 --> 01:27:41.320] An archaeologist surveying the bones found that was their estimate.
[01:27:41.560 --> 01:27:44.480] 80% of the remains that they found in Machu Picchu were female.
[01:27:44.120 --> 01:27:49.040] But a later re-examination found out, no, they're 50-50.
[01:27:49.840 --> 01:27:51.520] Why did they think that so many were female?
[01:27:51.680 --> 01:27:56.400] I think they were using pelvic examination and then they later did DNA analysis.
[01:27:56.960 --> 01:27:57.840] Yeah, but I wonder why.
[01:27:57.840 --> 01:28:00.640] Oh, well, weren't they very, very small people?
[01:28:01.360 --> 01:28:03.920] Well, they're smaller than modern humans.
[01:28:04.560 --> 01:28:05.440] But even modernization.
[01:28:05.520 --> 01:28:14.160] But it could be sampling error, or, you know, if you're just going by how the pelvis looks, maybe they just didn't do a good job.
[01:28:14.160 --> 01:28:19.520] Or there was, you know, again, some kind of sampling issue in terms of what they were looking for.
[01:28:19.840 --> 01:28:23.360] But a more thorough later analysis found that it was basically 50-50.
[01:28:23.360 --> 01:28:25.360] Now, what was Machu Picchu used for?
[01:28:25.360 --> 01:28:25.600] Right?
[01:28:25.600 --> 01:28:26.000] This has come up.
[01:28:26.000 --> 01:28:33.040] So Machu Picchu is like the crown jewel of the Incan Empire, the lost city of the Incas.
[01:28:33.040 --> 01:28:38.560] It is considered to be, you know, it's the most famous remains of an Incan city.
[01:28:38.560 --> 01:28:39.280] It's beautiful.
[01:28:39.280 --> 01:28:41.520] I don't know if you guys, I've never been there, but I've seen pictures of it.
[01:28:41.520 --> 01:28:42.160] Oh, yeah, the pictures.
[01:28:42.240 --> 01:28:43.360] It looks amazing.
[01:28:44.240 --> 01:28:44.800] And we're not really sure.
[01:28:44.880 --> 01:28:45.760] It looks like a face.
[01:28:45.760 --> 01:28:47.680] Like the mountain looks like a face on its side.
[01:28:47.680 --> 01:28:48.000] Yeah, yeah.
[01:28:49.440 --> 01:28:53.280] But you know, like modern Peruvians are the shortest people in the world.
[01:28:53.280 --> 01:28:53.920] Yeah.
[01:28:53.920 --> 01:28:54.880] So that's interesting.
[01:28:54.880 --> 01:28:56.720] Like, I was just looking it up.
[01:28:56.720 --> 01:29:02.960] An average Peruvian woman is five feet, and an average Peruvian man is five feet four.
[01:29:02.960 --> 01:29:04.160] Wow, that is short.
[01:29:04.160 --> 01:29:06.560] So it might be hard just based on.
[01:29:06.720 --> 01:29:08.640] It might be hard if they're more petite, you know.
[01:29:08.640 --> 01:29:14.800] Yeah, to tell the difference of ancient remains if you're only looking at the way they look.
[01:29:14.800 --> 01:29:17.680] So there were human sacrifices in the area.
[01:29:19.040 --> 01:29:25.360] There are temples at Machu Picchu, so it does have some religious significance, but they're not exactly sure.
[01:29:25.360 --> 01:29:27.680] It probably wasn't just a regular city.
[01:29:27.680 --> 01:29:36.520] They think it might have been like the royal city or just like where the you know, the rich people lived, or it could have been mostly of religious significance.
[01:29:36.840 --> 01:29:39.080] It wasn't just a place to sacrifice people.
[01:29:39.080 --> 01:29:42.200] You know, again, that wasn't the main thing that was happening.
[01:29:42.280 --> 01:29:43.080] It was like a whole city.
[01:29:43.080 --> 01:29:44.040] It was a city, yeah.
[01:29:44.040 --> 01:29:51.480] But there were, they said, nearby places where they did human sacrifices.
[01:29:51.720 --> 01:29:53.400] But that wasn't it, yeah; Kara is right.
[01:29:53.400 --> 01:29:58.760] Those aren't the bodies they were looking at; they weren't just looking at sacrificed individuals, these were the people in the city.
[01:29:58.760 --> 01:30:02.440] Yeah, they didn't just throw all the people into a pit somewhere else.
[01:30:03.240 --> 01:30:05.880] Well, there are sites, though, CARA.
[01:30:05.880 --> 01:30:08.920] There are sites where there are just pits of sacrificed people.
[01:30:09.800 --> 01:30:10.760] Yeah, I'm not surprised.
[01:30:10.760 --> 01:30:11.880] This is not one of them, though.
[01:30:14.120 --> 01:30:20.200] And then they're all like have caved in skulls and stuff, like signs of extreme trauma.
[01:30:20.200 --> 01:30:20.920] Oh, gosh.
[01:30:21.400 --> 01:30:22.440] Nasty, what happened?
[01:30:23.160 --> 01:30:23.560] Yikes.
[01:30:23.720 --> 01:30:24.600] People are gross.
[01:30:24.600 --> 01:30:25.400] Brutal.
[01:30:25.400 --> 01:30:25.960] All right.
[01:30:25.960 --> 01:30:27.320] Well, good job, guys.
[01:30:27.320 --> 01:30:28.200] You got it.
[01:30:28.200 --> 01:30:30.440] Evan, give us a quote.
[01:30:30.760 --> 01:30:34.600] This week's quote was suggested by listener Pat from Michigan.
[01:30:34.600 --> 01:30:35.560] Thank you, Pat.
[01:30:35.560 --> 01:30:47.800] If an outsider perceives something wrong with a core scientific model, the humble and justified response of that curious outsider should be to ask, what mistake am I making?
[01:30:47.800 --> 01:30:51.800] before assuming 100% of the experts are wrong.
[01:30:51.800 --> 01:30:52.360] Yeah.
[01:30:52.680 --> 01:30:54.120] David Brin.
[01:30:54.440 --> 01:30:55.000] Great author.
[01:30:55.080 --> 01:30:55.400] Brin.
[01:30:55.560 --> 01:30:56.040] David Brin.
[01:30:56.280 --> 01:30:56.840] David Brin.
[01:30:56.840 --> 01:30:57.160] Oh, my God.
[01:30:57.320 --> 01:30:58.280] The Uplift Wars.
[01:30:58.280 --> 01:30:58.760] Yeah, very good.
[01:30:59.480 --> 01:31:00.040] I still remember.
[01:31:00.120 --> 01:31:01.240] It's been like, what, 30 years?
[01:31:01.240 --> 01:31:02.400] Ev, I still remember those stories.
[01:31:02.760 --> 01:31:03.400] Great series.
[01:31:03.400 --> 01:31:04.680] Great science fiction series.
[01:31:04.680 --> 01:31:05.160] Yeah.
[01:31:05.480 --> 01:31:10.600] Yeah, really, the first science fiction series I read with actually alien aliens.
[01:31:10.600 --> 01:31:11.240] You know what I mean?
[01:31:11.240 --> 01:31:13.760] Not just humanoid aliens, but right, not Star Trek.
[01:31:13.920 --> 01:31:15.120] Completely alien aliens.
[01:31:16.640 --> 01:31:17.760] Very good.
[01:31:17.760 --> 01:31:18.320] All right.
[01:31:18.640 --> 01:31:19.680] Thanks, Evan.
[01:31:14.920 --> 01:31:20.000] Thanks.
[01:31:20.320 --> 01:31:22.880] Well, thank you all for joining me this week.
[01:31:22.880 --> 01:31:23.520] Sure, man.
[01:31:23.520 --> 01:31:24.480] You got it, brother.
[01:31:24.480 --> 01:31:29.200] And until next week, this is your Skeptics Guide to the Universe.
[01:31:31.760 --> 01:31:38.480] Skeptics Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking.
[01:31:38.480 --> 01:31:43.120] For more information, visit us at theskepticsguide.org.
[01:31:43.120 --> 01:31:47.040] Send your questions to info at the skepticsguide.org.
[01:31:47.040 --> 01:31:57.680] And if you would like to support the show and all the work that we do, go to patreon.com/skepticsguide and consider becoming a patron and becoming part of the SGU community.
[01:31:57.680 --> 01:32:01.360] Our listeners and supporters are what make SGU possible.