Debug Information
Processing Details
- VTT File: skepticast2024-07-27.vtt
- Processing Time: September 11, 2025 at 03:46 PM
- Total Chunks: 3
- Transcript Length: 167,268 characters
- Caption Count: 1,856 captions
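Stats like the ones above can be derived directly from the VTT text. A minimal sketch, assuming captions are identified by their `HH:MM:SS.mmm --> HH:MM:SS.mmm` cue-timing lines and chunks are simple fixed-size character windows (the 60,000-character threshold here is an illustrative assumption, not the pipeline's actual setting; it happens to yield 3 chunks for a 167,268-character transcript):

```python
import re

def vtt_stats(vtt_text: str, max_chunk_chars: int = 60_000):
    """Return (caption_count, transcript_length, total_chunks) for a VTT string."""
    # A caption cue is identified by its timing line, e.g.
    # "00:00:00.080 --> 00:00:03.040"
    cue_re = re.compile(r"^\d{2}:\d{2}:\d{2}\.\d{3} --> \d{2}:\d{2}:\d{2}\.\d{3}")
    caption_count = sum(1 for line in vtt_text.splitlines() if cue_re.match(line))
    transcript_length = len(vtt_text)
    # Ceiling division: number of fixed-size chunks needed to cover the text
    total_chunks = -(-transcript_length // max_chunk_chars)
    return caption_count, transcript_length, total_chunks
```

A real chunker would likely split on caption boundaries rather than raw character offsets, but the counts reported above are consistent with this simpler model.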
Prompts Used
Prompt 1: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 1 of 3 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
[00:00:00.080 --> 00:00:03.040] Mint is still $15 a month for premium wireless.
[00:00:03.040 --> 00:00:07.440] And if you haven't made the switch yet, here are 15 reasons why you should.
[00:00:07.440 --> 00:00:09.520] One, it's $15 a month.
[00:00:09.520 --> 00:00:12.480] Two, seriously, it's $15 a month.
[00:00:12.480 --> 00:00:14.320] Three, no big contracts.
[00:00:14.320 --> 00:00:15.280] Four, I use it.
[00:00:15.280 --> 00:00:16.480] Five, my mom uses it.
[00:00:16.480 --> 00:00:17.760] Are you- Are you playing me off?
[00:00:17.840 --> 00:00:19.200] That's what's happening, right?
[00:00:19.200 --> 00:00:22.640] Okay, give it a try at mintmobile.com slash switch.
[00:00:22.640 --> 00:00:25.840] Upfront payment of $45 for three-month plan, $15 per month equivalent required.
[00:00:25.840 --> 00:00:27.040] New customer offer first three months only.
[00:00:27.040 --> 00:00:28.240] Then full price plan options available.
[00:00:28.240 --> 00:00:29.120] Taxes and fees extra.
[00:00:29.120 --> 00:00:30.720] See Mintmobile.com.
[00:00:33.280 --> 00:00:36.560] You're listening to the Skeptic's Guide to the Universe.
[00:00:36.560 --> 00:00:39.440] Your escape to reality.
[00:00:40.080 --> 00:00:42.720] Hello and welcome to The Skeptic's Guide to the Universe.
[00:00:42.720 --> 00:00:47.600] Today is Wednesday, July 24th, 2024, and this is your host, Stephen Novella.
[00:00:47.600 --> 00:00:49.520] Joining me this week are Bob Novella.
[00:00:49.520 --> 00:00:50.080] Hey, everybody.
[00:00:50.080 --> 00:00:51.440] Kara Santa Maria.
[00:00:51.440 --> 00:00:51.920] Howdy.
[00:00:51.920 --> 00:00:52.880] Jay Novella.
[00:00:52.880 --> 00:00:53.600] Hey, guys.
[00:00:53.600 --> 00:00:55.200] And Evan Bernstein.
[00:00:55.200 --> 00:00:56.320] Good evening, everyone.
[00:00:56.320 --> 00:01:03.760] So I want to start off by thanking the millions of people who emailed me about this past New York Times Sunday crossword puzzle.
[00:01:03.760 --> 00:01:04.640] Oh, boy.
[00:01:04.960 --> 00:01:06.640] Which I've been doing for a while.
[00:01:06.640 --> 00:01:11.680] The, you know, the New York Times crossword puzzle, my daily sort of bathroom game routine.
[00:01:11.680 --> 00:01:13.600] And the Sunday ones are tough.
[00:01:13.600 --> 00:01:14.640] Well, they're just big.
[00:01:14.640 --> 00:01:16.880] Yeah, they're a little, they're definitely the clues are more difficult.
[00:01:16.880 --> 00:01:17.600] They're a little bit more difficult.
[00:01:17.840 --> 00:01:19.200] Well, Sunday is supposed to be...
[00:01:19.200 --> 00:01:20.400] I've done a lot of research on this.
[00:01:20.400 --> 00:01:22.640] Sunday is supposed to be the same difficulty as Wednesday.
[00:01:22.640 --> 00:01:23.360] Is that right?
[00:01:23.360 --> 00:01:25.040] So it starts Monday, which is the easiest.
[00:01:25.200 --> 00:01:27.840] On Monday, yeah, I guess I always see the contrast between Sunday and Monday.
[00:01:27.840 --> 00:01:28.400] Monday's.
[00:01:28.480 --> 00:01:29.440] That's why, yeah.
[00:01:29.680 --> 00:01:31.680] The hardest puzzle is Saturday.
[00:01:31.680 --> 00:01:38.000] And then it's, I think it dates back to when people would only get the Sunday puzzle because they would only get the paper on Sunday.
[00:01:38.000 --> 00:01:41.360] So they tried to make that one right in the middle, but it's much larger.
[00:01:41.360 --> 00:01:42.000] Right.
[00:01:42.240 --> 00:01:43.280] So it takes a lot longer to do.
[00:01:43.440 --> 00:01:45.920] There was a theme this past Sunday.
[00:01:45.920 --> 00:01:46.720] And the theme was.
[00:01:46.960 --> 00:01:49.120] The theme was logical fallacies.
[00:01:49.120 --> 00:01:51.440] Yeah, it was so exciting when I was doing it.
[00:01:52.400 --> 00:01:53.200] I know the answer.
[00:01:53.440 --> 00:01:54.800] Wait, formal or informal?
[00:01:55.120 --> 00:01:55.600] Informal.
[00:01:55.600 --> 00:01:56.400] Informal, logical fallacy.
[00:01:56.480 --> 00:01:57.760] Those are the best ones.
[00:01:57.760 --> 00:02:05.400] It was great because you have like these five, whatever, six clues that span the whole puzzle because they're long answers that were really easy for us to get.
[00:02:06.280 --> 00:02:09.080] And it just sort of sets up the whole crossword puzzle.
[00:02:09.080 --> 00:02:11.080] So it was, yeah, it was sweet.
[00:02:11.080 --> 00:02:11.640] Yeah.
[00:02:13.560 --> 00:02:15.880] You know, like one of them was no true Scotsman.
[00:02:15.880 --> 00:02:16.920] Like that was the answer.
[00:02:16.920 --> 00:02:18.680] And for us, it was like, oh, no true Scotsman.
[00:02:18.680 --> 00:02:18.760] Yeah.
[00:02:18.920 --> 00:02:20.360] Post hoc ergo propter hoc.
[00:02:20.920 --> 00:02:21.400] Yeah.
[00:02:21.400 --> 00:02:24.760] It took a second, though, because the clues were so weird.
[00:02:24.760 --> 00:02:25.320] Well, they didn't.
[00:02:25.320 --> 00:02:27.960] Like, they just would be like, this is the best clue.
[00:02:27.960 --> 00:02:30.600] If you don't like this clue, you don't like crossword puzzles.
[00:02:30.760 --> 00:02:33.960] Yeah, they just gave like examples all self-referential.
[00:02:33.960 --> 00:02:34.360] Yeah.
[00:02:34.840 --> 00:02:37.640] But then once you got it, you're like, oh my God, this is so obvious.
[00:02:37.640 --> 00:02:41.480] And they even threw in a Lord of the Rings reference in one of the clues, too.
[00:02:41.480 --> 00:02:42.040] Really?
[00:02:42.040 --> 00:02:42.440] Yeah.
[00:02:42.440 --> 00:02:44.120] Yeah, those always screw me.
[00:02:45.560 --> 00:02:47.960] That was just a bonus easy one for me.
[00:02:47.960 --> 00:02:53.800] You can always text us, Kara, if a Star Wars or a Star Trek or Lord of the Rings comes.
[00:02:54.040 --> 00:03:03.720] Well, okay, so this is very nerdy, but one of my best friends who does not live in the same town as me, she and I do the crossword together every night.
[00:03:04.040 --> 00:03:05.720] It's like our bonding time.
[00:03:05.720 --> 00:03:07.480] And she does it digitally.
[00:03:07.480 --> 00:03:11.800] So she does it on her phone, and I do it auditorily.
[00:03:11.800 --> 00:03:13.480] Like I'm never looking at it.
[00:03:13.960 --> 00:03:17.240] So she just reads me the clues and then tells me how many letters.
[00:03:17.240 --> 00:03:19.720] And then collectively we always get them done.
[00:03:20.120 --> 00:03:20.600] Wait a minute.
[00:03:20.920 --> 00:03:22.280] That's challenging, Kara.
[00:03:22.440 --> 00:03:22.680] It is.
[00:03:23.160 --> 00:03:27.720] Because you don't have the benefit of seeing certain letters in certain positions that have already been established.
[00:03:27.720 --> 00:03:29.960] Yeah, it's difficult, but it's fun.
[00:03:30.280 --> 00:03:35.480] And because we do it together, we always finish because it's like, you know, obviously two heads are better than one on a crossword.
[00:03:35.640 --> 00:03:37.400] Yeah, often it's my wife and I do it together.
[00:03:37.400 --> 00:03:43.480] If we're driving on a long trip, I'll read the puzzle while she's driving because it keeps her engaged and everything.
[00:03:43.480 --> 00:03:44.720] It is telling you you have to describe it.
[00:03:44.720 --> 00:03:47.440] It's the seven letters, the third letter is, oh, you know.
[00:03:47.440 --> 00:03:47.920] Yeah.
[00:03:44.520 --> 00:03:49.280] But it isn't the same thing.
[00:03:49.440 --> 00:03:50.240] It's really challenging, right?
[00:03:51.360 --> 00:03:54.080] And we have different skills, you know, my friend and I.
[00:03:54.080 --> 00:03:56.320] And so she'll be like, this one's one for you, you know.
[00:03:56.320 --> 00:03:57.360] And then we have this joke.
[00:03:57.360 --> 00:04:10.560] I can't believe I'm saying this on air, but every time we can't, every time we don't know the answer, especially if it's like a, I don't know, a reference to some obscure geographical thing or if it's like a Shakespeare reference or something, we're like, uncultured swine.
[00:04:13.120 --> 00:04:14.960] I don't know.
[00:04:16.240 --> 00:04:19.200] How dare you not know your Tolstoy by heart?
[00:04:19.200 --> 00:04:20.000] I know it.
[00:04:20.000 --> 00:04:21.920] So here's a life hack for you guys.
[00:04:21.920 --> 00:04:22.400] Okay.
[00:04:22.480 --> 00:04:23.200] Thank you.
[00:04:23.200 --> 00:04:30.320] If ever you're, you know, out of town, let's say in a remote area, make sure you have a set of backup keys for your car.
[00:04:31.280 --> 00:04:35.600] You mean like if you were, say, in a theoretical place like New Hampshire?
[00:04:35.600 --> 00:04:41.040] In New Hampshire, like northern New Hampshire, by a pond that was really remote.
[00:04:41.040 --> 00:04:43.360] Oh, so you don't have like limited cell phones.
[00:04:43.520 --> 00:04:44.720] You don't have cell reception.
[00:04:44.720 --> 00:04:45.440] Oh, no.
[00:04:45.440 --> 00:04:50.880] Well, we have Starlink there, so we're plugged into the world that way, as long as we have power.
[00:04:51.120 --> 00:04:55.360] But yeah, cell phones are useless without that in terms of connecting to anything.
[00:04:55.360 --> 00:05:00.800] So we were up there this past weekend trying to get it ready, you know, because there's a lot of work to do on it.
[00:05:00.800 --> 00:05:04.800] And my wife needed to borrow the keys for some reason to get into the car to get the whatever.
[00:05:04.800 --> 00:05:09.920] And then like basically, it's getting like three, four o'clock in the afternoon on Sunday.
[00:05:09.920 --> 00:05:11.920] We're just about ready to come home.
[00:05:12.320 --> 00:05:13.680] Cannot find the keys.
[00:05:13.680 --> 00:05:14.160] Yeah.
[00:05:15.120 --> 00:05:16.160] And we have one set.
[00:05:16.120 --> 00:05:16.960] It's like it's a fob.
[00:05:17.040 --> 00:05:19.760] You know, it's a very keyless, you know, it's like your push button.
[00:05:20.080 --> 00:05:20.720] Fob.
[00:05:22.400 --> 00:05:23.680] Don't give it away, Bob.
[00:05:24.320 --> 00:05:32.440] And the upshot is: you know, my wife had the key on her person and must have dropped it at some point during the day.
[00:05:29.840 --> 00:05:36.440] But it could have been any time over the last several hours.
[00:05:36.760 --> 00:05:39.720] Much of that time she spent on the pond.
[00:05:39.720 --> 00:05:43.480] So yeah, we figured 99% it's in the water.
[00:05:43.480 --> 00:05:44.840] Like that thing is gone.
[00:05:44.840 --> 00:05:47.080] Yeah, that we could not turn it up anywhere.
[00:05:47.080 --> 00:05:47.960] Okay, so what do you do?
[00:05:47.960 --> 00:05:48.120] Right?
[00:05:48.120 --> 00:05:49.640] What are the options at this point in time?
[00:05:49.640 --> 00:05:50.360] You know, I got to get home.
[00:05:50.360 --> 00:05:51.400] I've got to get to work on Monday.
[00:05:51.800 --> 00:05:53.160] Can it be remote started?
[00:05:53.160 --> 00:05:57.320] So theoretically, there are cars that have that feature.
[00:05:57.320 --> 00:05:59.640] This has the Acura link, right?
[00:05:59.640 --> 00:06:01.000] I check into that.
[00:06:01.080 --> 00:06:09.480] Turns out that the 3G on which it is based was discontinued and it's not available at all by hook or by crook.
[00:06:09.480 --> 00:06:11.160] Like the 3G is not available.
[00:06:11.560 --> 00:06:13.160] Like the bandwidth doesn't exist anymore.
[00:06:13.160 --> 00:06:15.960] Like Acura's like, well, I guess that service is over.
[00:06:17.960 --> 00:06:19.880] I hope you're not paying for that service anymore.
[00:06:19.880 --> 00:06:20.600] Well, I wasn't.
[00:06:20.600 --> 00:06:21.960] I was going to do whatever.
[00:06:21.960 --> 00:06:22.680] Just start it up.
[00:06:22.680 --> 00:06:23.960] I was like, how do I sign up for this?
[00:06:24.600 --> 00:06:30.040] And basically, the end of my hour of trying to solve this problem was it doesn't exist anymore.
[00:06:30.040 --> 00:06:34.440] So then, yeah, so then we're just like, all right, we've got to get an emergency locksmith, right?
[00:06:34.520 --> 00:06:37.800] We're doing this all, obviously, you know, at the same time.
[00:06:37.800 --> 00:06:42.280] And just after about a dozen connections, we have AAA, you know, everything.
[00:06:42.280 --> 00:06:43.720] We did everything we could.
[00:06:43.720 --> 00:06:45.960] There basically was nobody available.
[00:06:45.960 --> 00:06:46.520] No.
[00:06:46.520 --> 00:06:49.800] Yeah, so okay, well, we've got to rent a car to get back home.
[00:06:49.800 --> 00:06:52.120] We'll have to come back another weekend to pick up the car.
[00:06:52.120 --> 00:06:55.640] There was no rent-a-car available in northern New Hampshire.
[00:06:55.640 --> 00:06:58.440] Like, we could not get a rent-a-car.
[00:06:58.440 --> 00:06:59.480] So we were stuck.
[00:06:59.480 --> 00:07:03.640] We were literally stuck for want of this little, stupid little fob.
[00:07:03.640 --> 00:07:05.560] And there was no way to get that car started.
[00:07:05.560 --> 00:07:08.120] I mean, I wasn't going to hotwire it, you know what I mean?
[00:07:08.120 --> 00:07:11.240] But short of that, there was basically no way to do it.
[00:07:11.880 --> 00:07:13.320] I had to take the next day off from work.
[00:07:13.320 --> 00:07:14.880] We had to come home the next day.
[00:07:14.040 --> 00:07:16.480] So, you rented a car the next day?
[00:07:16.480 --> 00:07:18.800] No, there still wasn't a car available.
[00:07:14.440 --> 00:07:20.880] So, we had to rent a Bob.
[00:07:21.200 --> 00:07:21.840] A what?
[00:07:22.400 --> 00:07:23.280] Bob doesn't come cheap.
[00:07:23.920 --> 00:07:28.480] Bob had to drive up four hours each way.
[00:07:28.480 --> 00:07:30.480] Is that like a Bobby cab like they had on Mars?
[00:07:30.480 --> 00:07:31.440] Oh, that was a Johnny cab.
[00:07:31.520 --> 00:07:32.240] Johnny cab, yeah.
[00:07:33.040 --> 00:07:37.040] No, Bob had Bob, God bless him, had to come all the way up there and pick us up and bring us away.
[00:07:37.760 --> 00:07:38.480] Bob, that's rough.
[00:07:38.480 --> 00:07:41.760] I drove four hours this weekend to go camping, but I got to go camping.
[00:07:41.760 --> 00:07:43.360] Well, we did buy Bob a hamburger.
[00:07:43.360 --> 00:07:44.080] That's nice.
[00:07:44.320 --> 00:07:44.720] Two.
[00:07:44.720 --> 00:07:45.120] Two.
[00:07:45.120 --> 00:07:45.760] Two hamburgers.
[00:07:45.920 --> 00:07:46.720] Two of them, actually.
[00:07:46.720 --> 00:07:50.880] Well, he called me the night before, and I knew that they, you know, what was going on.
[00:07:50.880 --> 00:07:52.800] So he called at like six or seven.
[00:07:52.800 --> 00:08:01.040] And my first thought was, oh, crap, they're going to ask me to come right now, which means I probably wouldn't get back till like 3 a.m.
[00:08:01.200 --> 00:08:03.840] And I was like, please don't ask me to pick you up now.
[00:08:03.840 --> 00:08:06.000] And they're like, no, but tomorrow morning?
[00:08:06.000 --> 00:08:08.000] And like, all right, all right.
[00:08:08.320 --> 00:08:11.120] So no spare key to bring to you?
[00:08:11.760 --> 00:08:14.240] So listen, the car's 10 years old, right?
[00:08:14.480 --> 00:08:17.760] We lost that second key at some point along the line.
[00:08:18.160 --> 00:08:18.960] We didn't lose it.
[00:08:18.960 --> 00:08:20.480] We just sort of misplaced it.
[00:08:20.480 --> 00:08:22.400] It's somewhere in the house.
[00:08:22.400 --> 00:08:28.720] We never, I said, yeah, we've got to track that down at some point, but we didn't have it on us, is the bottom line.
[00:08:28.720 --> 00:08:31.760] And now we really, you know, we don't know where that second key is.
[00:08:31.760 --> 00:08:33.680] And, you know, this was definitely our bad.
[00:08:33.920 --> 00:08:34.560] That's what I'm saying.
[00:08:34.560 --> 00:08:35.760] Like, we just wasn't in.
[00:08:36.000 --> 00:08:40.080] I had no idea how difficult it was going to be to replace this key.
[00:08:40.080 --> 00:08:45.840] The other thing is, like, if you don't have one of the keys, you can't program a new key.
[00:08:45.840 --> 00:10:32.400] The only way to program this key is you need to either have it towed to a dealer, or you need to find an auto locksmith who has both a blank, right, one of these ancient 10-year-old fobs, and who also has the computer equipment and the software to do it. So we did eventually find somebody, although they didn't get there till like Monday night, when we were already back in Connecticut. The end of the story is they were able to program a key, but it was a saga for them to do. Was it expensive? 500 bucks. Oh, God. Yeah, so that's an expensive "I misplaced my key." So now I'm gonna make like two backups, so I always have an extra one on me. Never gonna be in that situation again. Plus, Steve, you might not be able to find another fob for that car. Oh, yeah, we'll be able to get it. Now that you have the car and you have a key, you could get them on Amazon or whatever. I could also just bring it to the dealer and they'll make up keys for me, so that's what I'm gonna do. I can't exist with one key anymore, you know, now that I know that all the other options are shut down. I thought, oh, the AcuraLink or whatever, AAA will figure it out, whatever, but there was nothing. Nothing. Yikes. Yeah, it was terrible, that sense when the realization settled: we are stuck here, we are not getting home tonight. I hate that feeling. Yep. Was anybody affected by the global Microsoft outage? Not me personally. Are you guys affected by that? No. At work, we were affected. Yeah, there were certain systems we could not access till about 11 in the morning. We had like 100, 140 computers go down.
[00:10:32.560 --> 00:10:46.720] I was just talking about that today with Ian. We were talking a little bit on the Wednesday TikTok live stream about the fact that cybersecurity and everything to do with the internet and whatever is such a huge part of our lives now.
[00:10:47.040 --> 00:10:49.760] And we're so dependent and vulnerable.
[00:10:50.000 --> 00:10:51.680] Oh, vulnerable in every way.
[00:10:52.400 --> 00:10:57.760] We need to be putting way more resources into locking this down than we are.
[00:10:57.760 --> 00:11:03.280] I do think this is like this should be a cabinet-level position or equivalent.
[00:11:03.280 --> 00:11:07.280] You know, something like the FDA or the EPA or whatever.
[00:11:07.280 --> 00:11:24.320] A massive organization dedicated to figuring out how to, you know, how to keep us secure, keep this kind of vulnerability from happening, building in resilience and redundancy before civilization collapses because of a rogue update from one company.
[00:11:24.320 --> 00:11:25.360] You know what I mean?
[00:11:27.200 --> 00:11:28.160] And where's their?
[00:11:28.160 --> 00:11:28.960] Yeah, I haven't heard that.
[00:11:29.120 --> 00:11:30.960] This type of error, though, would be tough to.
[00:11:31.280 --> 00:11:37.440] I mean, that's just, you know, this was like one bit of testing software that they use.
[00:11:37.440 --> 00:11:42.560] I mean, so would you have government controlling that granularly?
[00:11:42.560 --> 00:11:43.280] I don't know, man.
[00:11:43.920 --> 00:11:46.080] Just setting standards, doing reviews.
[00:11:46.080 --> 00:11:46.320] Yeah.
[00:11:46.560 --> 00:11:53.920] You know, are you going to say, are you going to have the FDA controlling what drugs we can take that granularly?
[00:11:53.920 --> 00:11:54.560] Yes.
[00:11:54.560 --> 00:11:55.520] Yes, we are.
[00:11:55.520 --> 00:11:56.720] That's how it works.
[00:11:56.720 --> 00:12:02.960] You have a system in place where you have to prove that what you're doing is safe and effective, you know?
[00:12:03.280 --> 00:12:09.600] I heard the patch required a correction at each computer.
[00:12:09.600 --> 00:12:10.400] Yeah, it was manual.
[00:12:10.880 --> 00:12:12.880] You had to delete a system file.
[00:12:13.360 --> 00:12:19.840] You had to log into your computer in safe mode, navigate around, and then and then delete multiple system files.
[00:12:19.840 --> 00:12:20.640] Oh, what a pain in the ass.
[00:12:20.720 --> 00:12:23.600] So, this is what, and right, and there were millions of these computers or something.
[00:12:23.840 --> 00:12:32.520] Yeah, I know that at the hospital where I work, the emails were like: if your computer's been affected, bring it to this place at this time so that we can fix it.
[00:12:29.840 --> 00:12:34.360] So, it's yeah, individual computer level.
[00:12:34.680 --> 00:12:40.600] And I hear the airlines have not yet caught up from the backlog that they suffered last Friday.
[00:12:40.760 --> 00:12:42.200] What a nightmare.
[00:12:42.200 --> 00:12:46.120] Airlines are very vulnerable because that's like you're keeping 100 plates spinning, right?
[00:12:46.120 --> 00:12:50.520] That's like that is an industry that is very intolerant to hiccups.
[00:12:50.520 --> 00:12:55.640] If that conveyor belt locks up, like Lucy with the chocolates, that's it.
[00:12:55.640 --> 00:12:58.680] You're going to have a mountain of chocolate that you have to deal with.
[00:12:58.920 --> 00:13:02.600] One more thing to chat about before we go on to the formal section.
[00:13:02.600 --> 00:13:09.720] So, we, according to scientists, those pesky scientists, we just experienced the hottest day on record.
[00:13:09.720 --> 00:13:10.200] Yep.
[00:13:10.200 --> 00:13:11.400] Yep, Sunday.
[00:13:11.400 --> 00:13:11.880] Yep.
[00:13:11.880 --> 00:13:13.640] Collectively or globally.
[00:13:13.640 --> 00:13:14.120] Globally.
[00:13:14.120 --> 00:13:18.760] Like, the global average temperature was higher than anything previously recorded.
[00:13:18.760 --> 00:13:19.480] Recorded.
[00:13:19.480 --> 00:13:19.880] Let's see.
[00:13:19.880 --> 00:13:24.520] The new record high is 17.15 degrees Celsius.
[00:13:24.520 --> 00:13:27.480] And governments are scrambling to fix it.
[00:13:27.480 --> 00:13:30.120] I bet we break that record again at some point this year.
[00:13:30.120 --> 00:13:30.760] Yeah.
[00:13:31.080 --> 00:13:33.160] Maybe, yeah, the year, yeah, the year is not done.
[00:13:33.160 --> 00:13:36.280] But what I didn't realize is that this only goes back to 1940.
[00:13:37.400 --> 00:13:40.600] It's the Copernicus Climate Change Service.
[00:13:40.600 --> 00:13:41.000] Yeah.
[00:13:41.320 --> 00:13:43.720] And their record books go back to 1940.
[00:13:43.800 --> 00:13:44.200] Yeah, yeah.
[00:13:44.360 --> 00:13:48.280] It's the hottest in 80 years, but it's not really the hottest in all history.
[00:13:48.840 --> 00:13:49.960] Oh, sure, of course not.
[00:13:50.280 --> 00:13:50.600] I mean, the potential of the history of the history.
[00:13:50.680 --> 00:13:54.200] But when you look at all the headlines, it's like hottest day ever recorded.
[00:13:54.280 --> 00:13:55.400] And it's like, yeah, that implies.
[00:13:55.960 --> 00:14:00.760] I know, but that implies like a gut reaction that is like the word matters.
[00:14:00.760 --> 00:14:06.080] I think most people think maybe turn of the century, 1900 is roughly about where those things start.
[00:14:06.080 --> 00:14:08.520] Even though they were measuring temperatures in the 1800s.
[00:14:08.520 --> 00:14:15.920] Yeah, I'm assuming people assume that it was from when we first started measuring temperature, but we didn't have global averages then.
[00:14:14.840 --> 00:14:21.440] Yeah, and these top 10 hottest days on record are all in the last 10 years.
[00:14:23.040 --> 00:14:25.120] Wait, that's just a coincidence, isn't it?
[00:14:25.280 --> 00:14:32.160] And this year and last year, like the last two were peaking way up above the line, you know what I mean?
[00:14:32.160 --> 00:14:33.120] Yep.
[00:14:33.120 --> 00:14:40.800] So, I mean, yeah, it's just, you know, at this point, it's undeniable, even though people deny it, that the Earth is in fact warming.
[00:14:40.800 --> 00:14:44.560] It's not even that trendy anymore to deny it.
[00:14:45.120 --> 00:14:53.920] Like, there are still climate change deniers, no doubt, but, you know, let's say the party of denial is no longer defined by that.
[00:14:53.920 --> 00:14:56.800] Well, they do the motte-and-bailey defense, right?
[00:14:56.800 --> 00:15:05.280] Which is when they'll say, oh, yeah, I mean, yes, the Earth is warming and, you know, man-made activity may be contributing to it, but there's nothing we could do about it.
[00:15:05.280 --> 00:15:07.520] And who's to say it's going to be a bad thing, right?
[00:15:07.840 --> 00:15:17.520] Or, but then when they think they can get away with it, you know, like if there's an anomalous cold day or something, they'll say, see, the Earth isn't really warming.
[00:15:17.520 --> 00:15:24.160] But then they'll storm forward, you know, again, if they, when they're very opportunistic, right?
[00:15:24.160 --> 00:15:27.840] So if they think they can get away with it, they'll deny every aspect of global warming.
[00:15:27.840 --> 00:15:32.960] But when push comes to shove, they'll retreat to the safest position, which is whatever.
[00:15:32.960 --> 00:15:38.080] Well, we can't prove it's all man-made, or there's nothing we can do about it, or, you know, you don't know what's going to be.
[00:15:38.160 --> 00:15:38.560] Yeah.
[00:15:38.560 --> 00:15:39.840] Because it's undeniable.
[00:15:39.840 --> 00:15:40.160] Right.
[00:15:40.160 --> 00:15:42.800] When they're in a situation where they can't deny it.
[00:15:42.800 --> 00:15:44.800] When they're in a situation where they can deny it, they will.
[00:15:44.800 --> 00:15:46.240] So they often will do that.
[00:15:46.240 --> 00:15:46.880] All right.
[00:15:46.880 --> 00:15:49.040] Let's go on with the actual show.
[00:15:49.040 --> 00:15:50.160] Kara.
[00:15:50.160 --> 00:15:51.920] You're going to do a what's the word?
[00:15:51.920 --> 00:15:52.400] I am.
[00:15:52.400 --> 00:15:55.520] I was brainstorming words the other day.
[00:15:55.760 --> 00:16:00.600] Actually, while I was doing the crossword, I have to give a shout out to my friend Sarah because this was her recommendation.
[00:16:00.600 --> 00:16:01.560] It was so good.
[00:16:01.560 --> 00:16:03.000] That's my crossword friend.
[00:15:59.920 --> 00:16:05.400] The word is calculus.
[00:16:05.960 --> 00:16:12.040] I love this word because its etymology is fascinating to me.
[00:16:12.040 --> 00:16:18.120] So, calculus, we all know, is a field or a method within mathematics.
[00:16:18.120 --> 00:16:23.080] It's a method of computation or calculation that uses special notation.
[00:16:23.080 --> 00:16:32.920] When we talk about calculus as a field of mathematics, what we're often talking about is continuingly changing values, right?
[00:16:32.920 --> 00:16:34.920] Vectors, things of that nature.
[00:16:35.480 --> 00:16:41.560] We'll also use that same word, calculus, in, I guess, maybe a more literary or practical sense.
[00:16:41.560 --> 00:16:42.760] Like, I very often use it.
[00:16:42.760 --> 00:16:50.360] I'll say, you know, the calculus here is very difficult, you know, the way that I'm kind of, the algorithm I'm using to judge this situation.
[00:16:50.600 --> 00:16:53.400] But it also has medical definitions.
[00:16:53.400 --> 00:16:56.440] Steve, do you often talk about calculi?
[00:16:56.520 --> 00:16:56.920] Calculi.
[00:16:57.240 --> 00:16:59.560] Because you focus on the brain sometimes.
[00:17:00.040 --> 00:17:00.280] We do.
[00:17:00.280 --> 00:17:02.600] We do talk about calculi, calculus, yeah.
[00:17:02.600 --> 00:17:05.000] In terms of calcification, yeah.
[00:17:05.000 --> 00:17:05.400] Exactly.
[00:17:05.720 --> 00:17:11.320] It's a mineral substance that forms, you know, a hard kind of calculus of things.
[00:17:11.560 --> 00:17:16.760] And also, if you work in the dental field, apparently that's also used as another word for tartar.
[00:17:16.760 --> 00:17:20.520] So your calcified tartar on your teeth becomes calculus.
[00:17:20.840 --> 00:17:24.360] So what do you think came first?
[00:17:24.360 --> 00:17:29.400] When we think about this word calculus, where does it come from?
[00:17:29.400 --> 00:17:32.120] Any idea as to the root of the word?
[00:17:32.120 --> 00:17:36.680] And do you think it was first a math term or first a medical term?
[00:17:36.680 --> 00:17:38.600] I'm going to say medical.
[00:17:38.840 --> 00:17:39.960] Math, I will say.
[00:17:39.960 --> 00:17:40.840] I'll say math.
[00:17:41.160 --> 00:17:43.640] Okay, so three to one, math to medical.
[00:17:43.640 --> 00:17:46.320] And the answer is it's complicated.
[00:17:46.320 --> 00:17:47.040] Oh, boy.
[00:17:44.920 --> 00:17:48.960] So we're all right and wrong.
[00:17:49.280 --> 00:17:59.440] Because even though it was first used in the 1600s for math, it comes from the root that means pebble.
[00:17:59.440 --> 00:18:04.640] Because early on, calculations were performed using abaci.
[00:18:05.440 --> 00:18:16.160] And the stones on the abacus appear to be the root of the utilization of that term for mathematics, which then later was utilized in medicine.
[00:18:16.160 --> 00:18:17.920] Right, so it means stone.
[00:18:17.920 --> 00:18:24.000] It came to be used in math because of the abacus, and in medicine when you have a stone-like thing.
[00:18:24.160 --> 00:18:27.440] Do they still teach children what an abacus is and how to use one?
[00:18:27.440 --> 00:18:28.800] I learned when I was young.
[00:18:28.800 --> 00:18:36.960] I never learned, but I will tell you, Evan, one of my favorite weird YouTube rabbit holes is watching mental abacus competitions.
[00:18:36.960 --> 00:18:39.520] Oh, it's amazing.
[00:18:39.520 --> 00:18:41.200] Oh, yeah, they're moving their fingers in the air.
[00:18:41.440 --> 00:18:44.960] They move their fingers in the air and then do complex calculations.
[00:18:45.520 --> 00:18:46.320] Yeah, it's fascinating.
[00:18:46.640 --> 00:18:48.960] Kara, you watch that like you watch a sport?
[00:18:49.280 --> 00:18:56.000] I just sometimes on YouTube, if I start watching it, I'll click through a lot of different videos because they're fascinating.
[00:18:56.000 --> 00:18:57.200] It's very cool.
[00:18:57.200 --> 00:18:59.440] But no, I don't actually know how to use an abacus.
[00:18:59.440 --> 00:19:01.120] I had one as a kid, but I don't think I've learned how to use it.
[00:19:01.200 --> 00:19:02.000] They're very reliable.
[00:19:02.000 --> 00:19:03.680] You can always count on them.
[00:19:04.880 --> 00:19:06.320] Calculus.
[00:19:06.320 --> 00:19:07.760] Here come the emails.
[00:19:07.760 --> 00:19:08.720] There it is.
[00:19:09.360 --> 00:19:16.400] Now, why do I sometimes hear the calculus in reference to the mathematical discipline of calculus?
[00:19:16.480 --> 00:19:16.880] I don't know.
[00:19:16.920 --> 00:19:17.760] That's just archaic?
[00:19:17.760 --> 00:19:18.800] That's just quaint.
[00:19:19.120 --> 00:19:22.560] Or it may also be regional.
[00:19:22.560 --> 00:19:26.160] You know, when I just did a recording on Talk Nerdy, it's not out yet.
[00:19:26.160 --> 00:19:32.440] It'll be out in a couple of weeks with an Australian woman, a professor, who wrote a book about vectors.
[00:19:29.840 --> 00:19:34.280] And we talked quite a lot about calculus.
[00:19:34.600 --> 00:19:38.040] She didn't say the calculus, but of course, she said maths.
[00:19:38.040 --> 00:19:40.920] By the way, then they also say maths, right?
[00:19:40.920 --> 00:19:46.520] That the shorthand for mathematics for them is maths, and for us is math.
[00:19:46.680 --> 00:19:47.960] It means the same thing, obviously.
[00:19:47.960 --> 00:19:48.360] Yeah.
[00:19:48.360 --> 00:19:50.680] It always throws me for a loop when I hear maths.
[00:19:51.000 --> 00:19:52.120] It sounds so wrong.
[00:19:52.120 --> 00:19:52.760] Yeah, right.
[00:19:52.760 --> 00:19:53.880] I know it's objective.
[00:19:53.880 --> 00:19:54.600] It's subjective.
[00:19:55.240 --> 00:19:55.480] Totally.
[00:19:55.480 --> 00:19:56.040] I mean, it's regional.
[00:19:56.040 --> 00:19:56.840] It's all fine, yeah.
[00:19:56.840 --> 00:19:59.240] But it's just your ears get so attuned to something.
[00:19:59.640 --> 00:20:02.200] In your brain, there is one way that is right.
[00:20:02.440 --> 00:20:04.440] And everything else is like chocolate.
[00:20:04.840 --> 00:20:05.160] Oh, yeah.
[00:20:05.160 --> 00:20:05.720] I used to watch.
[00:20:05.800 --> 00:20:08.120] Did you guys help me remember the name of it?
[00:20:08.120 --> 00:20:10.360] There was an internet series.
[00:20:10.360 --> 00:20:12.920] It was one of Simon Pegg's early series.
[00:20:13.320 --> 00:20:14.200] Look Around You.
[00:20:14.200 --> 00:20:15.560] Oh, no, not Spaced.
[00:20:15.880 --> 00:20:17.880] I loved Spaced, but that was a TV show.
[00:20:17.880 --> 00:20:18.520] Look Around You.
[00:20:18.520 --> 00:20:22.200] It's like these series of educational videos, but they're like, you know, parodies.
[00:20:22.200 --> 00:20:25.000] And in one of them, they say maths over and over.
[00:20:25.000 --> 00:20:26.920] And it was my first time I was exposed to that.
[00:20:26.920 --> 00:20:28.520] And I thought that was part of the joke.
[00:20:29.240 --> 00:20:31.640] But then I realized that's just how they say math.
[00:20:32.600 --> 00:20:32.920] All right.
[00:20:32.920 --> 00:20:34.120] Thanks, Kara.
[00:20:34.120 --> 00:20:37.800] Jay, tell us about harvesting water from the air.
[00:20:37.800 --> 00:20:42.360] This is a really interesting thing that these researchers came up with.
[00:20:42.680 --> 00:20:46.120] I'm still a little blown away by the reporting on this.
[00:20:46.120 --> 00:20:57.160] So there are researchers at the University of Utah, and they developed an atmospheric water harvesting, or AWH, device. Obviously we need this, right?
[00:20:57.160 --> 00:21:00.520] Like, you know, can we get water from the air?
[00:21:00.520 --> 00:21:02.280] You know, you'd think there's not enough.
[00:21:02.280 --> 00:21:07.480] And, like, you know, I've seen all these ways that people collect water when they're camping and stuff like that.
[00:21:07.480 --> 00:21:08.280] But it's not a lot.
[00:21:08.280 --> 00:21:10.280] It's never a lot of water, you know?
[00:21:10.280 --> 00:21:21.840] But they created this device that remarkably collects a ton of water from the air, even in places where there is not a lot of water vapor floating around.
[00:21:22.160 --> 00:21:35.040] So they use something called a metal-organic framework, or MOF, and they use this thing to capture water vapor from the air, and then they can convert it into liquid water pretty efficiently.
[00:21:35.040 --> 00:21:41.520] So the metal-organic framework: in this particular device, they're made out of aluminum fumarate.
[00:21:41.520 --> 00:21:44.080] And I want you to visualize this.
[00:21:44.080 --> 00:21:48.560] It's like a highly porous lattice-like ball, right?
[00:21:48.560 --> 00:21:52.800] There is a ton of surface on this thing, inside, outside.
[00:21:52.800 --> 00:21:55.600] You know, it's like, you know, it's mostly a lattice work, right?
[00:21:55.600 --> 00:21:58.160] So it isn't like a solid thing at all.
[00:21:58.160 --> 00:22:00.800] It has tons of nooks and crannies all through this.
[00:22:00.800 --> 00:22:03.520] And these are nano-scale pores.
[00:22:03.520 --> 00:22:11.680] And these pores are shaped specifically to capture water molecules as the air passes through them.
[00:22:11.680 --> 00:22:12.560] Oh, cool.
[00:22:12.560 --> 00:22:20.160] Here's the mind-blowing part: a single gram of this material has the surface area equivalent to.
[00:22:20.160 --> 00:22:23.280] Now, I want to tell you, but I want to see what you guys think.
[00:22:23.280 --> 00:22:30.560] You know, what size area would you think a gram of this material has, surface-area-wise?
[00:22:30.960 --> 00:22:32.560] The surface of the earth.
[00:22:32.880 --> 00:22:33.920] Oh, my God, Steve.
[00:22:35.280 --> 00:22:36.640] I can't follow that one.
[00:22:36.640 --> 00:22:38.080] Did I overshoot?
[00:22:40.000 --> 00:22:41.200] The moon?
[00:22:42.400 --> 00:22:43.040] Just shut up.
[00:22:43.040 --> 00:22:46.160] It has a surface area of two football fields.
[00:22:46.160 --> 00:22:46.800] Wow.
[00:22:47.440 --> 00:22:47.960] Wow.
[00:22:47.760 --> 00:22:48.520] American.
[00:22:49.200 --> 00:22:50.240] American football field.
[00:22:50.400 --> 00:22:51.280] Shut up, Kara.
[00:22:51.280 --> 00:22:53.920] But to bring it back to reality, though, it's a gram.
[00:22:53.920 --> 00:22:56.400] It's a single gram of this material.
[00:22:56.600 --> 00:22:56.760] Right?
[00:22:56.800 --> 00:22:57.560] So it's like a ball.
[00:22:57.360 --> 00:22:58.600] It's just unfolding it.
[00:22:58.000 --> 00:22:58.800] Unfolding it.
[00:23:00.040 --> 00:23:00.200] Yeah.
[00:22:58.960 --> 00:23:01.160] Two football fields.
[00:22:59.680 --> 00:23:06.600] So as the air flows through this thing, it traps water molecules.
[00:23:06.760 --> 00:23:12.280] Now, these water molecules can be released into liquid form simply by applying heat.
[00:23:12.280 --> 00:23:25.560] And I tried to find out exactly what is happening when the heat is increased. I guess the water molecules vibrate more and they bump into each other and they gather together, as water does, and then it comes out of the machine.
[00:23:25.560 --> 00:23:33.480] So they have an existing prototype, and it can produce five liters of water per day per kilogram of this absorbent material.
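The prototype's quoted yield can be turned into a rough sizing estimate. A minimal sketch, with one loud assumption: the ~20 liters per day per person figure for basic needs is not from the episode.

```python
# Rough sizing from the figure quoted above: the Utah prototype yields
# 5 liters of water per day per kilogram of MOF sorbent.
# ASSUMPTION (not from the episode): ~20 L/day covers one person's basic
# drinking, cooking, and hygiene needs.

yield_l_per_kg_day = 5.0   # quoted prototype yield (L per kg per day)
daily_need_l = 20.0        # assumed per-person daily requirement (L)

kg_needed = daily_need_l / yield_l_per_kg_day
print(f"~{kg_needed:.0f} kg of sorbent per person")  # ~4 kg
```

So on the quoted numbers, a few kilograms of sorbent would cover one person, which fits the portable, soldier-carried use case described here.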
[00:23:33.480 --> 00:23:35.000] All right, but Jay, can I ask you a question?
[00:23:35.000 --> 00:23:35.400] Sure.
[00:23:35.400 --> 00:23:38.040] How many droids does it take to operate this machine?
[00:23:38.040 --> 00:23:38.760] That's the problem.
[00:23:38.760 --> 00:23:41.400] You need four droids, and one of them has to speak Bocce.
[00:23:41.400 --> 00:23:42.200] I see.
[00:23:45.000 --> 00:23:51.000] So the next question you might have, which is the one that my brain was creeping to, is how do you power this thing?
[00:23:51.000 --> 00:23:52.520] Like, is it super power hungry?
[00:23:52.520 --> 00:23:53.720] What's the deal?
[00:23:53.720 --> 00:24:00.520] So in this particular one, right, because this isn't the only device that's been created like this, and I'll get into that a little bit later.
[00:24:00.520 --> 00:24:06.280] But this one, they use an energy-dense fuel, and they did this very specifically.
[00:24:06.280 --> 00:24:14.120] So in their prototype, they're using white gasoline, and this is the fuel that you would commonly find in a camping stove.
[00:24:14.120 --> 00:24:19.640] It's just very energy-dense, you know, and it's you don't want to carry around a fuel that's going to be super heavy.
[00:24:19.640 --> 00:24:21.160] You want something that's portable.
[00:24:21.160 --> 00:24:29.480] So they decided not to use solar power for this machine because they wanted it to be able to run 24 hours a day for a very specific reason.
[00:24:29.480 --> 00:24:34.760] Solar panels could be used, but they complicate deploying this machine.
[00:24:34.760 --> 00:24:36.920] They're bigger, they're heavier, they need batteries.
[00:24:36.920 --> 00:24:38.920] You know, that complicates the whole thing.
[00:24:38.920 --> 00:24:44.280] So, they just decided we're going to have a little energy pack here of this white gasoline.
[00:24:44.280 --> 00:24:52.480] So, this device was initially designed to provide hydration solutions for, I'm sure you can guess what I'm about to say, for the military, for soldiers.
[00:24:52.480 --> 00:25:01.840] These are people who are in remote areas, they have limited water resources, and this is where, you know, obviously, solar panels are not an optimal power source for this thing.
[00:25:01.840 --> 00:25:06.640] But immediately, they're like, well, let's ponder the civilian applications.
[00:25:06.640 --> 00:25:07.840] And it's huge.
[00:25:08.400 --> 00:25:12.720] The need for something like this right now is extraordinary.
[00:25:12.720 --> 00:25:16.640] We need sources of clean water in so many places around the world.
[00:25:16.640 --> 00:25:23.200] And one of these devices could easily provide daily drinking water and regular household water needs.
[00:25:23.200 --> 00:25:24.800] And I just think that's incredible.
[00:25:24.800 --> 00:25:28.080] Like, you know, it's not a big machine.
[00:25:28.080 --> 00:25:38.080] It's not going to be super expensive, because it's not using any materials that are hard to get or super expensive or anything like that.
[00:25:38.080 --> 00:25:39.600] It's just, it really works.
[00:25:39.600 --> 00:25:42.960] Like, the physics of this machine are amazing.
[00:25:42.960 --> 00:25:56.640] So the device is compact, it's highly efficient, and it solves all the downsides of existing technologies that harvest water because it's small, it doesn't cost a lot, and it's wicked efficient.
[00:25:56.640 --> 00:26:01.520] And everything else that came before it, you know, has problems with size, cost, and efficiency.
[00:26:01.520 --> 00:26:09.760] It can operate effectively in really low humidity conditions, which I find to be that's a bonus, right?
[00:26:09.760 --> 00:26:15.040] You know, you're in some of the most arid places on the planet, and this thing still functions.
[00:26:15.040 --> 00:26:16.320] How did they say how low it could go?
[00:26:16.280 --> 00:26:18.240] Like, could you use this in the Sahara?
[00:26:18.240 --> 00:26:20.720] They said that you could go to Death Valley and it'll work.
[00:26:20.720 --> 00:26:21.120] Really?
[00:26:21.120 --> 00:26:21.520] Yep.
[00:26:21.520 --> 00:26:29.440] These previous technologies that I was mentioning require, you know, pretty, pretty significant humidity levels to function properly.
[00:26:29.440 --> 00:26:36.360] This thing is designed to function in low humidity environments, which is remarkable.
[00:26:36.600 --> 00:26:41.640] I saw a rendering of what the MOF looks like, right?
[00:26:41.960 --> 00:26:50.200] And, you know, it just looks like somebody took a ball of silver metal and kind of melted it with alien acid.
[00:26:50.200 --> 00:26:55.240] You know, it has all these little divots and pivots and all these little things in it.
[00:26:55.240 --> 00:26:58.040] But when you look closely, like it's organized.
[00:26:58.040 --> 00:27:01.560] It really is like a lattice inside that thing.
[00:27:01.960 --> 00:27:03.480] And here's another one that I found.
[00:27:03.480 --> 00:27:08.440] There's another version of this machine that UC Berkeley came up with.
[00:27:08.440 --> 00:27:11.240] They call it the MOF-Powered Water Harvester.
[00:27:11.240 --> 00:27:12.920] And theirs works well too.
[00:27:12.920 --> 00:27:15.240] It's very, very similar in concept here.
[00:27:15.240 --> 00:27:17.000] The device is handheld.
[00:27:17.000 --> 00:27:20.280] It uses ambient sunlight to extract water from the air.
[00:27:20.280 --> 00:27:21.320] I guess there's no battery.
[00:27:21.320 --> 00:27:27.240] It just needs sunlight and it just converts it right into the energy it needs to heat up the machine.
[00:27:27.240 --> 00:27:32.440] And its key points include it can operate in extremely dry conditions.
[00:27:32.440 --> 00:27:35.880] It provides clean water and it only uses sunlight.
[00:27:35.880 --> 00:27:42.520] It can harvest up to 285 grams of water per kilogram of the MOF material, right, in one day.
[00:27:42.680 --> 00:27:47.480] It's 85 to 90% efficient in releasing captured water as drinkable water.
[00:27:47.480 --> 00:27:51.480] And this particular device is environmentally friendly.
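The two quoted yields can be put side by side. A quick sketch, using only the per-kilogram figures given above and the approximation that a liter of water weighs about a kilogram:

```python
# Side-by-side of the two quoted yields, per kilogram of MOF per day.
# Approximation: 1 liter of water weighs about 1 kilogram (1000 g).

utah_g_per_kg_day = 5.0 * 1000   # Utah prototype: 5 L/day ≈ 5000 g/day
berkeley_g_per_kg_day = 285.0    # Berkeley sunlight-only harvester: 285 g/day

ratio = utah_g_per_kg_day / berkeley_g_per_kg_day
print(f"Utah prototype yields ~{ratio:.0f}x more water per kg of MOF")  # ~18x
```

The per-kilogram numbers alone don't capture the tradeoff, though: the Berkeley device runs on ambient sunlight only, while the Utah prototype burns white gasoline to get that higher yield.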
[00:27:51.480 --> 00:27:55.400] So this is the kind of technology that the modern world needs.
[00:27:55.400 --> 00:28:03.080] We're going to have places where there's an absolute abundance of water because of weather, you know, whatever the new weather patterns are going to be.
[00:28:03.080 --> 00:28:08.920] And then there's going to be lots of places where there is an incredible and dangerous lack of water.
[00:28:08.920 --> 00:28:13.080] So, anyway, I just find this type of technology super interesting.
[00:28:13.080 --> 00:28:22.160] You know, the shapes that are found inside this MOF material, it's specifically designed to only interact with water molecules.
[00:28:22.160 --> 00:28:24.160] It doesn't collect anything else.
[00:28:24.160 --> 00:28:27.840] So, is the resulting water basically like distilled water?
[00:28:28.000 --> 00:28:29.840] I tried to confirm that, Steve.
[00:28:29.840 --> 00:28:32.880] I mean, they're saying that it's drinkable, clean water.
[00:28:32.880 --> 00:28:35.120] I guess everything else just passes through.
[00:28:35.120 --> 00:28:44.880] But I would imagine, you know, for safety, unless this thing really only ever attracts water under every circumstance, they could just put a filter in there, I guess.
[00:28:45.040 --> 00:28:54.400] What I'm saying is, you know, is the resulting water like distilled water, meaning there's no minerals in it, there's no electrolytes, it's just H2O.
[00:28:54.400 --> 00:28:55.440] No, I don't think so.
[00:28:56.080 --> 00:29:00.080] I think that that stuff is carried along with the water because they're saying it's drinkable.
[00:29:00.080 --> 00:29:00.880] It's drinkable as is.
[00:29:00.880 --> 00:29:01.440] You don't have to treat it.
[00:29:02.320 --> 00:29:02.960] Put tablets in it.
[00:29:03.120 --> 00:29:04.480] Yeah, because that would kill you, right?
[00:29:04.480 --> 00:29:04.960] That's not.
[00:29:05.120 --> 00:29:06.400] Well, I mean, you'd have to drink a lot of it.
[00:29:06.560 --> 00:29:08.480] It wouldn't be great for your electrolytes.
[00:29:08.480 --> 00:29:12.320] Yeah, or, you know, you'd have to take salt to go along with it.
[00:29:12.320 --> 00:29:13.680] Very cool stuff, though, guys.
[00:29:13.680 --> 00:29:18.000] You know, I'm seeing interesting pieces of technology come out now.
[00:29:18.000 --> 00:29:19.520] NASA's doing a lot of cool things.
[00:29:19.520 --> 00:29:25.360] Like, you know, I love the MOXIE machine that NASA came out with that creates oxygen from CO2.
[00:29:25.360 --> 00:29:34.480] And this thing, you know, if it is what they're saying it is, and they're going to put this, you know, make this available and it's not going to be that expensive.
[00:29:34.480 --> 00:29:36.960] You know, I could see people using it who are camping.
[00:29:36.960 --> 00:29:40.000] And, you know, like, it's just a really, really usable thing.
[00:29:40.000 --> 00:29:42.000] And it probably will save lives.
[00:29:42.000 --> 00:29:46.000] You know what percentage of people lack access to clean drinking water?
[00:29:46.000 --> 00:29:47.200] That's a good statistic, Steve.
[00:29:47.200 --> 00:29:47.840] What is it?
[00:29:47.840 --> 00:29:48.480] Two-thirds.
[00:29:48.480 --> 00:29:49.600] No, it's 25%.
[00:29:49.600 --> 00:29:50.560] So 2 billion people.
[00:29:51.040 --> 00:29:51.520] That's a lot.
[00:29:51.760 --> 00:29:54.480] 2 billion people who don't have access to clean drinking water.
[00:29:54.480 --> 00:29:54.880] Yeah.
[00:29:54.880 --> 00:29:55.680] That's crazy.
[00:29:55.680 --> 00:29:57.040] How do they even survive?
[00:29:57.040 --> 00:29:57.840] Yeah, that's a good question.
[00:29:58.080 --> 00:30:00.840] They drink not clean water, right?
[00:30:03.000 --> 00:30:04.840] They get dysentery and everything else.
[00:29:59.920 --> 00:30:14.600] Okay, so the ocean contains about 1.3 billion cubic kilometers of water, and the atmosphere contains about 12,900 cubic kilometers of water.
[00:30:14.920 --> 00:30:19.480] It's not even close, but the atmosphere has a ton of water in it.
[00:30:19.480 --> 00:30:19.880] Oh, yeah.
[00:30:19.880 --> 00:30:22.920] Yeah, I've watched survival shows on how to extract water from the air.
[00:30:22.920 --> 00:30:31.320] But like you said, Jay, it always yields a piddly, tiny amount, nothing significant enough to keep more than maybe one person alive.
[00:30:31.320 --> 00:30:34.440] Yeah, 97% of the water on Earth is in the oceans.
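The volume and population figures quoted in this segment hold together under simple arithmetic. One labeled assumption below: the ~8 billion world population used for the 25% check is not stated in the episode.

```python
# Sanity-checking the figures quoted in this segment.

ocean_km3 = 1.3e9        # quoted ocean water volume (cubic kilometers)
atmosphere_km3 = 12_900  # quoted atmospheric water content (cubic kilometers)

ratio = ocean_km3 / atmosphere_km3
print(f"The oceans hold ~{ratio:,.0f} times the water in the atmosphere")

# The "25% lack clean water" and "2 billion people" figures are consistent,
# ASSUMING a world population of ~8 billion (not stated in the episode).
share = 2e9 / 8e9
print(f"{share:.0%} of the world population")  # 25%
```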
[00:30:34.440 --> 00:30:35.240] Very cool.
[00:30:35.480 --> 00:30:37.880] I'm excited about stuff like this.
[00:30:37.880 --> 00:30:45.000] But I'm always fascinated by the fact that Europa has more water on it than the Earth's oceans.
[00:30:45.000 --> 00:30:45.400] Yeah.
[00:30:46.280 --> 00:30:47.160] Twice as much water.
[00:30:47.160 --> 00:30:49.880] Yeah, there's twice as much on Europa as there is in the Earth's ocean.
[00:30:49.880 --> 00:30:50.360] That's nuts.
[00:30:50.520 --> 00:30:51.080] Always fascinating.
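The "twice as much water as Earth's oceans" claim is consistent with published estimates. A quick check, where the ~3 billion km³ figure for Europa's subsurface ocean is a commonly cited estimate, not a number from the episode:

```python
# Checking "twice as much water as Earth's oceans" for Europa.
# ASSUMPTION (not from the episode): Europa's subsurface ocean is commonly
# estimated at roughly 3 billion cubic kilometers of liquid water.

europa_km3 = 3.0e9
earth_ocean_km3 = 1.3e9  # quoted earlier in this segment

europa_ratio = europa_km3 / earth_ocean_km3
print(f"~{europa_ratio:.1f}x the water in Earth's oceans")  # roughly 2x
```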
[00:30:51.480 --> 00:30:52.120] All right, thanks, Jay.
[00:30:52.280 --> 00:30:53.320] Europa is a moon.
[00:30:53.320 --> 00:30:54.120] It's a moon.
[00:30:54.120 --> 00:30:56.760] All right, do you guys know what dark oxygen is?
[00:30:56.760 --> 00:30:57.640] I had never heard of it.
[00:30:57.800 --> 00:30:58.840] Dark oxygen.
[00:30:58.840 --> 00:31:00.280] Low albedo oxygen.
[00:31:00.440 --> 00:31:01.400] What do you think it is?
[00:31:01.960 --> 00:31:03.080] Hidden oxygen?
[00:31:03.080 --> 00:31:05.000] Oxygen with an isotope of oxygen?
[00:31:05.320 --> 00:31:07.400] Where does oxygen on the Earth usually come from?
[00:31:07.800 --> 00:31:08.840] It's hidden from the atmosphere.
[00:31:08.840 --> 00:31:09.960] So it's like underground.
[00:31:09.960 --> 00:31:10.280] Yeah.
[00:31:10.440 --> 00:31:11.320] Where does it come from?
[00:31:11.320 --> 00:31:12.760] Where does the oxygen come from?
[00:31:12.760 --> 00:31:13.080] Plants.
[00:31:13.240 --> 00:31:15.080] Plants, therefore, what process?
[00:31:15.320 --> 00:31:16.200] Photosynthesis.
[00:31:16.200 --> 00:31:17.960] Right, which involves the sun.
[00:31:17.960 --> 00:31:18.600] The sun.
[00:31:18.920 --> 00:31:23.000] So this is oxygen that comes from a process that doesn't involve the sun.
[00:31:23.000 --> 00:31:25.960] Therefore, it's chemosynthesis.
[00:31:27.000 --> 00:31:27.880] Is it chemosynthetic?
[00:31:28.040 --> 00:31:30.440] I did not hear that specific term used, Bob.
[00:31:30.440 --> 00:31:35.880] It could be, but it is from a chemical reaction that doesn't involve light.
[00:31:35.880 --> 00:31:38.760] I think then by definition it would be chemosynthetic.
[00:31:38.760 --> 00:31:39.320] All right.
[00:31:39.320 --> 00:31:39.720] Okay.
[00:31:39.720 --> 00:31:44.560] So this was actually discovered not by accident, but it was a surprise.
[00:31:44.120 --> 00:31:50.080] Researchers were trying to measure oxygen levels in the deep ocean.
[00:31:50.400 --> 00:32:00.400] When they put the detectors down at the sea floor, they got back this huge result, like way more oxygen than there should have been down there.
[00:32:00.400 --> 00:32:04.560] It was so surprising that they thought, well, these detectors are not calibrated properly.
[00:32:04.560 --> 00:32:06.160] They're clearly not functioning.
[00:32:06.160 --> 00:32:09.840] They sent them all back to the manufacturer saying that these were broken.
[00:32:09.840 --> 00:32:13.840] The manufacturer tested them all and said, nope, these are calibrated and working.
[00:32:13.840 --> 00:32:14.880] So they did it again.
[00:32:14.880 --> 00:32:16.560] They're like, this can't be right, though.
[00:32:16.560 --> 00:32:19.680] There's way more oxygen down there than there's supposed to be.
[00:32:19.680 --> 00:32:23.040] So here's the other thing: where were they doing this measurement?
[00:32:23.040 --> 00:32:26.480] They were doing it in the Clarion Clipperton zone.
[00:32:26.480 --> 00:32:28.640] Does that ring any bells for you guys?
[00:32:28.640 --> 00:32:30.720] Clarion Clipperton zone.
[00:32:30.720 --> 00:32:35.120] So there is something strewn across the floor, the ocean floor, in this area.
[00:32:35.600 --> 00:32:36.240] The Mariana.
[00:32:36.320 --> 00:32:36.560] No, no, no.
[00:32:36.800 --> 00:32:37.760] Is it the Titanic?
[00:32:37.760 --> 00:32:38.240] No.
[00:32:38.560 --> 00:32:40.000] But it is metal.
[00:32:40.000 --> 00:32:41.600] Polymetallic nodules.
[00:32:41.600 --> 00:32:42.240] Remember that?
[00:32:42.960 --> 00:32:45.040] Polymetallic nodules.
[00:32:45.360 --> 00:32:46.480] Lithium and stuff?
[00:32:46.640 --> 00:32:47.040] Is it lithium?
[00:32:47.120 --> 00:32:50.080] Yeah, manganese and cobalt.
[00:32:50.800 --> 00:32:52.800] All the stuff that people have talked about mining.
[00:32:52.800 --> 00:32:53.520] Yeah, this is so.
[00:32:53.840 --> 00:32:55.840] We talked about this once or twice before.
[00:32:55.840 --> 00:33:09.040] These are the batteries, you know, like all the materials we need to make batteries are in these little potato-sized nodules that form slowly over millions of years, strewn across the ocean floor in certain regions.
[00:33:09.040 --> 00:33:15.360] And one of the biggest collections is in this Clarion Clipperton zone in the Pacific Ocean, right?
[00:33:15.360 --> 00:33:22.640] And there are plans, there's companies that are planning on mining these polymetallic nodules, right?
[00:33:22.640 --> 00:33:37.480] We talked about it as kind of a good thing, you know, like, okay, this could be great if it could produce the raw material to make enough batteries to make all of our cars electric, you know what I mean?
[00:33:37.480 --> 00:33:38.440] That kind of thing.
[00:33:38.440 --> 00:33:46.360] And at the time, we mentioned that, yeah, they just got to do some environmental studies, but you know, like, don't let that take too long.
[00:33:46.360 --> 00:33:53.080] Get that done so we can get mining these nodules and we could start, you know, push forward with our battery revolution.
[00:33:53.080 --> 00:34:01.720] But the more I follow this story, the more these pesky environmental issues are a major problem.
[00:34:01.720 --> 00:34:08.200] And it may actually prevent any significant mining of these nodules or should, right?
[00:34:08.520 --> 00:34:18.440] So this just adds to that because it's probably true that this dark oxygen is being made on these nodules, right?
[00:34:18.440 --> 00:34:23.640] That it's the metals in these nodules that is creating the oxygen.
[00:34:23.640 --> 00:34:24.280] How?
[00:34:24.280 --> 00:34:26.200] Well, they do not know.
[00:34:26.520 --> 00:34:35.240] And so that's why, you know, when Bob asked if it's chemosynthetic, like, well, probably, but they really haven't figured out what the actual process is.
[00:34:36.200 --> 00:34:47.560] But just the fact that they are making so much oxygen means that they may be, they probably are, a critical component to the ecosystem here.
[00:34:47.880 --> 00:34:59.640] And there are other locations where there have been mining and disruptions to the ocean floor that even after 20, 30 years, they haven't recovered.
[00:34:59.640 --> 00:35:04.120] Like they're dead zones, you know, 40 years later.
[00:35:04.120 --> 00:35:08.120] And the thinking is, well, maybe that this is why.
[00:35:08.120 --> 00:35:16.400] That it's because we basically removed the oxygen supply from this, you know, this deep ocean.
[00:35:16.400 --> 00:35:20.160] And because normally there's not a lot of oxygen because there's not a lot of light down there, right?
[00:35:14.840 --> 00:35:22.080] And therefore, not a lot of photosynthesis.
[00:35:22.320 --> 00:35:27.600] They're getting mostly whatever oxygen is being made close to the surface and it's just diffusing through the water.
[00:35:28.000 --> 00:35:31.920] And then, of course, a lot of nutrients get carried down by life on the surface.
[00:35:31.920 --> 00:35:34.560] But there's this vibrant ecosystem down there.
[00:35:34.560 --> 00:35:41.600] And this may explain why that's the case, you know, why there is so much biodiversity and so much life in this deep zone.
[00:35:41.600 --> 00:35:47.120] Because, yeah, because there's like dark oxygen being produced by these polymetallic nodules.
[00:35:47.120 --> 00:35:54.720] Therefore, if we do any significant mining of them, we may create a dead zone down there, you know, where there is now a vibrant ecosystem.
[00:35:54.720 --> 00:35:58.560] So it's not like it'll bounce back, like, "it'll be disruptive, but it'll be fine in 10 years."
[00:35:58.800 --> 00:36:00.800] It may not be, you know.
[00:36:01.360 --> 00:36:08.800] Again, biologists looking at areas that were heavily mined 40 years ago found that essentially there's still no life there.
[00:36:08.800 --> 00:36:12.560] So that is a huge problem, unfortunately.
[00:36:12.720 --> 00:36:21.760] We definitely need to do more research and figure out how this oxygen is being made, confirm that it's coming from the polymetallic nodules, figure out what that means to the ecosystem.
[00:36:21.760 --> 00:36:25.440] We have to really review any proposed method.
[00:36:25.440 --> 00:36:41.680] Right now, one of the companies is planning on basically just sucking them, like vacuuming them off the ocean floor, and that would send up a plume of debris, as well as depriving that ecosystem of these nodules.
[00:36:41.680 --> 00:36:45.920] So it would probably be devastating to the ecosystem, unfortunately.
[00:36:45.920 --> 00:36:50.800] So this is something that, you know, a year ago, two years ago, I was like very, very hopeful about this.
[00:36:50.800 --> 00:36:56.080] And now I've had to modify my opinions because of all the environmental information that's coming out.
[00:36:56.080 --> 00:36:58.720] Unfortunately, I was hoping the answer was going to be like, yeah, you're fine.
[00:36:58.720 --> 00:36:59.280] Go ahead.
[00:36:59.280 --> 00:37:01.400] Vacuum them up, make batteries out of them.
[00:37:01.400 --> 00:37:03.000] But that doesn't seem to be the case.
[00:37:03.000 --> 00:37:12.760] And this one may be the death blow, you know, to any environmentally responsible mining, you know, of these polymetallic nodules, at least in this part of the ocean here.
[00:37:12.760 --> 00:37:16.040] So for the moment, until a new technology maybe can come along.
[00:37:16.040 --> 00:37:23.800] Yeah, maybe we need to just like when you like, you know, you go into a forest and you cut down every third tree, you know, or something and you leave the rest undisturbed.
[00:37:23.800 --> 00:37:26.760] Like maybe there might be some limited, careful mining.
[00:37:26.760 --> 00:37:33.000] Not like this like strip mining, like the equivalent of like just vacuuming up the ocean floor.
[00:37:33.000 --> 00:37:33.560] Right, bolts.
[00:37:33.800 --> 00:37:34.120] Yeah, right.
[00:37:34.120 --> 00:37:35.640] But maybe we might need to do.
[00:37:35.800 --> 00:37:38.680] But then, of course, that gets to the cost-effectiveness of the operation.
[00:37:38.680 --> 00:37:39.080] Sure.
[00:37:39.080 --> 00:37:40.840] And do they know how they form?
[00:37:40.840 --> 00:37:42.200] Like how long it takes for them to form?
[00:37:42.360 --> 00:37:43.000] Millions of years.
[00:37:43.000 --> 00:37:43.640] It takes millions.
[00:37:43.800 --> 00:37:44.680] Well, yeah, so I don't know.
[00:37:44.920 --> 00:37:45.800] So there's a limited amount of time.
[00:37:46.280 --> 00:37:47.000] Slow accretion.
[00:37:47.160 --> 00:37:48.760] Yeah, it's like millions and millions of years.
[00:37:49.160 --> 00:37:50.680] But then it's not a forest.
[00:37:50.680 --> 00:37:50.920] Right.
[00:37:50.920 --> 00:37:51.960] And we can't think of it like a forest.
[00:37:52.280 --> 00:37:53.400] They will not bounce right back.
[00:37:53.400 --> 00:37:53.640] Nope.
[00:37:54.360 --> 00:37:56.680] Yeah, it's basically geological time periods.
[00:37:56.840 --> 00:37:58.360] You got to think of it like fossil fuels.
[00:37:58.360 --> 00:37:58.600] Exactly.
[00:37:58.760 --> 00:37:59.480] Once they're gone.
[00:37:59.480 --> 00:37:59.880] Yeah.
[00:37:59.880 --> 00:38:03.800] Yeah, they don't come back except on geological time scales.
[00:38:04.120 --> 00:38:05.240] Yeah, that's a downer.
[00:38:05.320 --> 00:38:06.680] It's very, very unfortunate.
[00:38:06.760 --> 00:38:13.240] I mean, this is interesting finding, and it's exciting from that reason, but it does say, yeah, it's not looking good.
[00:38:13.240 --> 00:38:15.400] Not looking good for the polymetallic nodules.
[00:38:15.400 --> 00:38:17.080] We have to figure something else out.
[00:38:17.080 --> 00:38:18.760] And again, it's not like this was our one hope.
[00:38:19.160 --> 00:38:24.200] There's other options in terms of sourcing manganese, nickel, and cobalt.
[00:38:24.200 --> 00:38:30.360] And there's different battery designs that use different materials so that we're not as dependent on those three things.
[00:38:30.600 --> 00:38:31.880] Those are the big ones.
[00:38:31.880 --> 00:38:46.160] In addition to lithium, like lithium, manganese, nickel, and cobalt, those are really the four limiting supply-line factors for the current most energy-dense lithium-ion batteries that we have.
[00:38:44.760 --> 00:38:52.400] You can make less energy-dense batteries by using a chemistry that does not involve the nickel and the cobalt.
[00:38:52.720 --> 00:38:55.840] But again, you're sacrificing a little bit of energy density.
[00:38:55.840 --> 00:39:07.600] But there are some entirely new battery designs, you know, that use salt or iron, like the really abundant stuff that's not limited to these rare earths or to these metals.
[00:39:07.600 --> 00:39:13.280] So that's, you know, hopefully those will come online before too long, you know, in a significant way.
[00:39:13.280 --> 00:39:17.440] Because we're not going to be, you know, sucking up batteries from the ocean floor.
[00:39:17.440 --> 00:39:21.680] That doesn't seem to be like a viable option, unfortunately.
[00:39:21.680 --> 00:39:26.640] All right, Kara, tell us about the speed of chimp conversation.
[00:39:26.640 --> 00:39:30.160] Yeah, chimpanzees have conversations.
[00:39:30.160 --> 00:39:34.800] This is something we've known for a while, but what are their conversations made of?
[00:39:34.800 --> 00:39:37.440] I mean, they don't have words, right?
[00:39:37.440 --> 00:39:38.960] They don't speak.
[00:39:38.960 --> 00:39:40.640] How do they communicate with one another?
[00:39:40.640 --> 00:39:41.680] Sign language.
[00:39:41.680 --> 00:39:43.120] And hoots and hollers.
[00:39:43.440 --> 00:39:44.720] If we teach them sign language.
[00:39:46.800 --> 00:39:47.920] In the wild or in the land.
[00:39:48.000 --> 00:39:48.640] They have their own signs.
[00:39:49.040 --> 00:39:50.320] They have their own sign.
[00:39:51.120 --> 00:39:54.160] I would say that they, well, I know that they make a lot of noises.
[00:39:54.160 --> 00:39:54.960] They do make a lot of noises.
[00:39:55.120 --> 00:39:58.000] And they probably do facial expressions and gesturing.
[00:39:58.000 --> 00:40:00.320] And they hold up different numbers of bananas.
[00:40:00.640 --> 00:40:02.240] Do they secrete an odor?
[00:40:03.120 --> 00:40:11.840] So, in this study, entitled, this will kind of give it away: chimpanzee gestural exchanges share temporal structure with human language.
[00:40:11.840 --> 00:40:16.880] This was published in Current Biology.
[00:40:17.200 --> 00:40:30.680] And this is a correspondence from multiple authors who decided to observe chimpanzees in the wild and to record a lot of data on how they interact with one another.
[00:40:31.000 --> 00:40:38.520] They, over the course of their data collection, studied five wild communities of chimpanzees in East Africa.
[00:40:38.520 --> 00:40:47.480] They collected data on more than 8,500 gestures across 252 individuals.
[00:40:47.800 --> 00:41:08.360] And when they were looking at those gestures, they found that the vast majority of these kinds of communications or conversations, upwards of 85, 86% of them, consisted of a single gesture made by one individual, and then the other individual would engage in a behavior based on that gesture.
[00:41:08.360 --> 00:41:13.960] So for example, one individual gestures "come here," and the other one comes here.
[00:41:13.960 --> 00:41:21.240] They were looking specifically at the 14% of interactions where there was a gestural exchange.
[00:41:21.560 --> 00:41:23.880] I gesture to you, you gesture back to me.
[00:41:23.880 --> 00:41:26.120] I gesture to you, you gesture back to me.
[00:41:26.120 --> 00:41:31.080] So this was less about simple commands and behavioral responses.
[00:41:31.080 --> 00:41:39.160] And in the researchers' view, they talk about these almost like negotiations because they would often occur around something like food or grooming.
[00:41:39.160 --> 00:41:43.960] They were less likely to occur around simple commands or requests.
[00:41:43.960 --> 00:41:56.600] So across 14% of those observations, they found that there was an exchange of gestures between two individuals that were at least a two-part exchange, but sometimes they would go up to seven back and forths.
[00:41:56.600 --> 00:41:58.760] And they found something kind of interesting.
[00:41:58.760 --> 00:42:17.200] When we speak to one another in normal face-to-face conversation, and when I say we, I mean human beings, the response time, whether I say something and you say something, or I move my hands and you move yours in response, is about 200 milliseconds on average.
[00:42:17.520 --> 00:42:26.240] They found that in the chimpanzees, the average, of course, this is based on a much smaller sample size, was about 120 milliseconds.
[00:42:26.240 --> 00:42:30.080] They're not saying it's faster, they're saying it's within the same range.
[00:42:30.400 --> 00:42:36.640] The behavioral responses were significantly longer, I think, closer to like 1500 milliseconds.
[00:42:36.640 --> 00:42:52.240] But the gestural responses were fast, so fast, in fact, that sometimes they would observe one chimpanzee gesturing to another and the other gesturing back before the first one had even finished their gesture, which is such a similar thing that we human beings do.
[00:42:52.240 --> 00:43:05.760] We interrupt each other; we might have a facial expression or a hand motion that's reactive to something that's said, even before sitting, processing, and then responding.
[00:43:05.760 --> 00:43:44.000] And so the researchers posit that this kind of shared behavior, this temporality, which is very, very quick, is likely either indicative of a previous evolutionary ancestor, so that this is something that was conserved prior to the speciation of both chimpanzees and human beings, or that this might be a case of convergent evolution, that we developed it and they developed it independently, possibly because we had similar building blocks there.
[00:43:44.000 --> 00:44:04.520] But either way, this study, I believe, is important, and many of the writers who've done write-ups of it, and even the authors themselves, are saying this is important for understanding the evolutionary roots of human language, because these gestural exchanges are a form of communication.
[00:44:04.680 --> 00:44:13.480] And even though words are not being uttered, we're seeing conversations happening as opposed to just command and follow.
[00:44:13.480 --> 00:44:14.120] Wow.
[00:44:14.120 --> 00:44:15.320] It's pretty interesting.
[00:44:15.320 --> 00:44:19.560] And at the same rate, you know, it's not slower.
[00:44:19.560 --> 00:44:24.520] There's something going on there; the cognitive capability is very similar.
[00:44:24.760 --> 00:44:26.760] Which, yeah, is not surprising.
[00:44:26.920 --> 00:44:29.720] Do we have any idea what they're saying to each other?
[00:44:29.960 --> 00:44:32.760] I bet you a lot of these behaviorists do.
[00:44:32.760 --> 00:44:44.200] You know, these individuals who spend time in the field just watching and watching and observing, you know, through context clues, through behaviors that follow the gestures and the exchanges.
[00:44:44.200 --> 00:44:45.800] You know, how often do they do this?
[00:44:45.800 --> 00:44:49.000] And then the food is given or the food is taken away or the food is whatever.
[00:44:49.000 --> 00:44:51.720] I bet you they can kind of speak chimpanzee.
[00:44:51.720 --> 00:44:59.160] And I remember years ago having a woman on my show, gosh, I even remember her name, Rebecca Atencia.
[00:44:59.160 --> 00:45:07.080] She was the chief veterinarian at the Tchimpounga Reserve in the DRC.
[00:45:07.400 --> 00:45:12.760] And she told me, like, her kids grew up on the reserve with her as she was working with these chimps.
[00:45:12.760 --> 00:45:14.760] It was a Jane Goodall Foundation reserve.
[00:45:14.760 --> 00:45:17.320] She was like, I feel like my kids speak chimp.
[00:45:17.640 --> 00:45:20.280] Like, they know what the different calls mean.
[00:45:20.280 --> 00:45:25.400] They're so used to it because they just grew up hearing it all the time and seeing it all the time.
[00:45:25.400 --> 00:45:27.640] That in some ways it's like a language that they speak too.
[00:45:27.640 --> 00:45:28.040] Yeah, yeah.
[00:45:28.840 --> 00:45:29.960] It's fascinating.
[00:45:30.280 --> 00:45:33.000] Did you guys watch Kingdom of the Planet of the Apes, the most recent one?
[00:45:34.120 --> 00:45:35.000] No, I didn't see the most recent one.
[00:45:35.720 --> 00:45:36.280] It was okay.
[00:45:36.280 --> 00:45:36.920] I enjoyed it.
[00:45:36.920 --> 00:45:37.720] It was fine.
[00:45:37.720 --> 00:45:50.000] But one thing I did not like about it was that the way they represented the fact that the chimpanzees had language that was not as fully developed as human language.
[00:45:50.320 --> 00:45:53.440] They had them speak in a slow and halting fashion.
[00:45:53.440 --> 00:45:53.840] Right.
[00:45:53.840 --> 00:45:54.720] Which is not appropriate.
[00:45:54.880 --> 00:45:58.560] Yeah, I don't, well, even before, I'm like, that just doesn't, that's not right.
[00:45:58.880 --> 00:46:06.160] It is, that's like a Hollywood trope, you know, like, like, that's just the only way they know to represent, you know, that language is limited.
[00:46:06.320 --> 00:46:10.160] It just didn't, you know, ring true to me.
[00:46:10.160 --> 00:46:17.360] I thought they, if you're trying to represent that, they should have a completely fluent, just simplified language structure.
[00:46:17.360 --> 00:46:20.320] Yes, just fewer, yeah, fewer modifiers, simple.
[00:46:20.480 --> 00:46:22.400] Smaller vocabulary, whatever.
[00:46:22.640 --> 00:46:25.120] But it should still be completely fluid for them.
[00:46:25.120 --> 00:46:25.760] You know what I mean?
[00:46:26.480 --> 00:46:27.840] It's not like they're children learning.
[00:46:27.920 --> 00:46:28.080] Yeah.
[00:46:29.120 --> 00:46:31.440] Yeah, it's like trying to speak to a different speech.
[00:46:31.520 --> 00:46:36.080] Right, or they're developing speech again during speech therapy after a head injury or something.
[00:46:36.080 --> 00:46:39.600] Or like it's a second language, but halting, like broken language.
[00:46:39.600 --> 00:46:41.120] Like, no, this is their language.
[00:46:41.120 --> 00:46:42.960] This is what they grew up speaking.
[00:46:42.960 --> 00:46:44.880] Right, yeah, it would be more fluid than that.
[00:46:44.880 --> 00:46:45.840] But maybe you're right.
[00:46:45.840 --> 00:46:50.960] More kind of lexically, I guess you could say, simple.
[00:46:51.600 --> 00:46:57.680] Hi, I'm Chris Gethard, and I'm very excited to tell you about Beautiful Anonymous, a podcast where I talk to random people on the phone.
[00:46:57.680 --> 00:47:00.320] I tweet out a phone number, thousands of people try to call.
[00:47:00.320 --> 00:47:02.320] I talk to one of them, they stay anonymous.
[00:47:02.320 --> 00:47:03.200] I can't hang up.
[00:47:03.200 --> 00:47:04.240] That's all the rules.
[00:47:04.240 --> 00:47:05.760] I never know what's going to happen.
[00:47:05.760 --> 00:47:07.120] We get serious ones.
[00:47:07.120 --> 00:47:09.120] I've talked with meth dealers on their way to prison.
[00:47:09.120 --> 00:47:11.280] I've talked to people who survived mass shootings.
[00:47:11.280 --> 00:47:12.400] Crazy, funny ones.
[00:47:12.400 --> 00:47:16.320] I talked to a guy with a goose laugh, somebody who dresses up as a pirate on the weekends.
[00:47:16.320 --> 00:47:17.680] I never know what's going to happen.
[00:47:17.680 --> 00:47:18.880] It's a great show.
[00:47:18.880 --> 00:47:21.440] Subscribe today, Beautiful Anonymous.
[00:47:21.600 --> 00:47:24.080] All right, Bob, tell us about this new nuclear clock.
[00:47:24.080 --> 00:47:26.960] Yeah, researchers have taken an important step.
[00:47:26.960 --> 00:47:36.760] I think it's important in creating the first nuclear clock, which in many ways would be superior to all those old-school atomic clocks that you've heard so much about.
[00:47:37.640 --> 00:47:39.320] That's a big statement, Bob.
[00:47:39.320 --> 00:47:39.800] Oh, yeah.
[00:47:39.800 --> 00:47:41.720] Well, let's see what the science says here.
[00:47:41.720 --> 00:47:50.200] So, this is the result of a collaboration of physicists from the Physikalisch-Technische Bundesanstalt (Germany's national metrology institute) in Braunschweig and the Vienna University of Technology in Austria.
[00:47:50.200 --> 00:47:50.760] Okay?
[00:47:50.760 --> 00:47:53.240] Published in Physical Review Letters.
[00:47:53.640 --> 00:47:59.080] The title of the study is Laser Excitation of the Thorium-229 Nucleus.
[00:47:59.080 --> 00:48:08.920] Okay, so now to understand the future nuclear clocks that we will maybe see, it'd be helpful to know how plain old atomic clocks work.
[00:48:08.920 --> 00:48:11.880] And we've all, you think everyone's heard of atomic clocks, right?
[00:48:11.880 --> 00:48:14.120] At least the term.
[00:48:14.120 --> 00:48:24.840] And I think a lot of those people's takeaway is that, yeah, atomic clocks use atoms and they're ridiculously accurate, like losing a second only after like millions of years or something.
[00:48:24.840 --> 00:48:28.600] So I think most people would agree with, would know about that.
[00:48:28.600 --> 00:48:30.680] And that's essentially correct.
[00:48:30.840 --> 00:48:37.000] The part of the atom that's most important, though, for atomic clocks are the electrons in their orbital shells around the atoms.
[00:48:37.000 --> 00:48:38.680] That's what's really critical.
[00:48:38.680 --> 00:48:41.880] That's where all the hard work is happening.
[00:48:41.880 --> 00:48:56.760] Now, by exciting the electrons in those orbital shells with a precise amount of radiation, the electrons gain energy, right, rising to another orbital where they hang out for a very tiny amount of time.
[00:48:56.760 --> 00:49:02.120] Then they release a specific amount of energy before going back down, right?
[00:49:02.280 --> 00:49:15.280] Now, so that up and down transition of the electrons between energy levels, it's very predictable, it's very stable, and they're essentially ticks of a clock like a swinging pendulum or a vibrating crystal to tell time.
[00:49:14.760 --> 00:49:18.560] So, that's kind of my overview of atomic clocks.
[00:49:18.640 --> 00:49:29.840] Now, our current standard today uses microwaves to excite cesium atom electrons up and down, 9,192,631,770 times per second.
[00:49:29.840 --> 00:49:39.840] In fact, that's exactly how our second is now defined by 9.19 billion oscillations of electrons in the cesium atom.
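The arithmetic behind that definition can be checked directly. This is a quick sketch, nothing more; the only fact it relies on is the SI value quoted above:

```python
# The SI second is defined as 9,192,631,770 periods of the radiation from
# the cesium-133 hyperfine ground-state transition.
CS_FREQ_HZ = 9_192_631_770  # oscillations per second, exact by definition

# One "tick" of the cesium pendulum is one period of that radiation:
period_s = 1 / CS_FREQ_HZ
print(f"one oscillation lasts {period_s:.3e} s")  # 1.088e-10 s

# Counting that many oscillations recovers one second:
print(CS_FREQ_HZ * period_s)  # ~1.0
```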
[00:49:39.840 --> 00:49:46.080] Now, the time standard the world depends on is called International Atomic Time, which I always love that.
[00:49:46.160 --> 00:49:47.040] Sounds pretty cool.
[00:49:47.040 --> 00:49:48.560] International Atomic Time.
[00:49:48.560 --> 00:49:58.880] And it uses not just one of these cesium microwave atomic clocks, but about 450 of them, spread across some 80 labs throughout the world.
[00:49:58.880 --> 00:50:08.160] And that average, that average of those 450 atomic clocks, that's the foundation, or I'll call it the raw time that the world uses.
[00:50:08.160 --> 00:50:12.880] Now, I say raw because this atomic time does not add leap seconds.
[00:50:13.680 --> 00:50:21.040] It's just that raw time, just this is how many seconds have passed, and it doesn't, there's no leap seconds added.
[00:50:21.040 --> 00:50:28.320] Now, leap seconds are added to this atomic time to make coordinated universal time, UTC.
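The TAI-to-UTC relationship described here is just a subtraction of the accumulated leap seconds. A minimal sketch, assuming the current published offset of 37 seconds (10 s from the 1972 UTC definition plus 27 leap seconds added since, in effect since January 2017):

```python
# UTC is derived from International Atomic Time (TAI) by subtracting
# accumulated leap seconds. The offset has been 37 s since January 2017.
TAI_MINUS_UTC = 37  # seconds; updated whenever IERS announces a leap second

def tai_to_utc(tai_seconds: float) -> float:
    """Convert a TAI timestamp (in seconds) to UTC by removing leap seconds."""
    return tai_seconds - TAI_MINUS_UTC

print(tai_to_utc(1_000_000_037))  # -> 1000000000
```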
[00:50:28.320 --> 00:50:32.320] And I bet a lot of people have heard about UTC.
[00:50:33.360 --> 00:50:40.720] So, this is like the foundation of our timekeeping, our civil timekeeping in a lot of the industrialized world.
[00:50:40.720 --> 00:50:42.800] And so, it's obviously pretty damn critical.
[00:50:42.800 --> 00:50:44.880] Yeah, so no need to improve on it, right?
[00:50:46.560 --> 00:50:49.120] Actually, it's getting old and crusty, man.
[00:50:49.120 --> 00:50:50.160] Old and crusty.
[00:50:50.160 --> 00:50:54.240] As great as microwave atomic clocks are, there's another that's better.
[00:50:54.240 --> 00:50:58.240] And it's still an atomic clock, but it's an optical atomic clock.
[00:50:58.240 --> 00:51:03.000] And it's actually been in the news like this past week for big advances.
[00:50:59.840 --> 00:51:07.240] It's really gotten amazingly good, amazingly precise, fascinating stuff.
[00:51:07.560 --> 00:51:09.960] Now, these are complicated as hell.
[00:51:09.960 --> 00:51:11.560] I was trying to understand how they're working.
[00:51:11.560 --> 00:51:22.280] It's really a lot of complication in there, but it's basically very similar to these microwave atomic clocks, but they don't use microwaves to make the electron transitions.
[00:51:22.280 --> 00:51:24.120] They use optical wavelengths.
[00:51:24.200 --> 00:51:27.480] Obviously, that's why they call it optical atomic clocks.
[00:51:27.480 --> 00:51:31.400] So they're using the higher frequency optical wavelengths of light.
[00:51:31.400 --> 00:51:35.080] And that higher frequency means that there's even more ticks of the clock per second.
[00:51:35.080 --> 00:51:42.920] They're perhaps 100 times more accurate than microwave atomic clocks, losing a second only once every 30 billion years.
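"Losing a second every 30 billion years" translates into a fractional accuracy, and the rough arithmetic is easy to reproduce (a back-of-envelope sketch of the scale, not a spec for any particular clock):

```python
# Convert "one second lost per 30 billion years" into a fractional error.
SECONDS_PER_YEAR = 365.25 * 24 * 3600      # ~3.156e7 s

years = 30e9
total_seconds = years * SECONDS_PER_YEAR   # ~9.5e17 s elapsed
fractional_error = 1 / total_seconds       # one second wrong out of all that

print(f"{fractional_error:.1e}")  # 1.1e-18, the 1e-18 scale quoted for optical clocks
```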
[00:51:42.920 --> 00:51:47.400] So these are donkulous, these new optical clocks.
[00:51:47.400 --> 00:51:56.440] And many think that these optical atomic clocks will supplant the microwave clocks for our standard for this, our international atomic time.
[00:51:56.680 --> 00:51:58.120] And that might very well happen.
[00:51:58.120 --> 00:52:02.440] But before that does happen, there's a new game in town.
[00:52:03.240 --> 00:52:08.600] These nuclear clocks may be around in the near future at some point.
[00:52:08.920 --> 00:52:16.120] Now, as you may have guessed, nuclear clocks don't work off the tick of electrons transitioning between energy states.
[00:52:16.120 --> 00:52:20.440] Nuclear clocks would be based on the energy transitions of the nucleus, right?
[00:52:20.440 --> 00:52:27.720] So as the neutrons and protons themselves enter into a higher energy state and go back down to their ground state.
[00:52:27.720 --> 00:52:29.320] So that's what's oscillating.
[00:52:29.320 --> 00:52:33.560] It's the nucleus and not the electrons in their distant
Prompt 2: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 3: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
Prompt 4: Media Mentions
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
Prompt 5: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 2 of 3 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
n amazingly good, amazingly precise, fascinating stuff.
[00:51:07.560 --> 00:51:09.960] Now, these are complicated as hell.
[00:51:09.960 --> 00:51:11.560] I was trying to understand how they're working.
[00:51:11.560 --> 00:51:22.280] It's really a lot of complication in there, but it's basically very similar to these microwave atomic clocks, but they don't use microwaves to make the electron transitions.
[00:51:22.280 --> 00:51:24.120] They use optical wavelengths.
[00:51:24.200 --> 00:51:27.480] Obviously, that's why they call it optical atomic clocks.
[00:51:27.480 --> 00:51:31.400] So they're using the higher frequency optical wavelengths of light.
[00:51:31.400 --> 00:51:35.080] And that higher frequency means that there's even more ticks of the clock per second.
[00:51:35.080 --> 00:51:42.920] They're perhaps 100 times more accurate than microwave atomic clocks, losing a second only once every 30 billion years.
[00:51:42.920 --> 00:51:47.400] So these are donkulous, these new optical clocks.
[00:51:47.400 --> 00:51:56.440] And many think that these optical atomic clocks will supplant the microwave clocks for our standard for this, our international atomic time.
[00:51:56.680 --> 00:51:58.120] And that might very well happen.
[00:51:58.120 --> 00:52:02.440] But before that does happen, there's a new game in town.
[00:52:03.240 --> 00:52:08.600] These nuclear clocks may be around in the near future at some point.
[00:52:08.920 --> 00:52:16.120] Now, as you may have guessed, nuclear clocks don't work off the tick of electrons transitioning between energy states.
[00:52:16.120 --> 00:52:20.440] Nuclear clocks would be based on the energy transitions of the nucleus, right?
[00:52:20.440 --> 00:52:27.720] So as the neutrons and protons themselves enter into a higher energy state and go back down to their ground state.
[00:52:27.720 --> 00:52:29.320] So that's what's oscillating.
[00:52:29.320 --> 00:52:33.560] It's the nucleus and not the electrons in their distant orbitals.
[00:52:33.560 --> 00:52:34.440] That's not easy, though.
[00:52:34.440 --> 00:52:35.400] It's not easy to do.
[00:52:35.400 --> 00:52:37.800] And they've been thinking about this for many decades.
[00:52:38.520 --> 00:52:44.480] For old school atomic clocks, we can use just like lab-grade lasers to force electrons to transition.
[00:52:44.200 --> 00:52:46.480] It's not that difficult.
[00:52:46.800 --> 00:52:57.600] But if you want to do that to a nucleus, if you want to cause that nucleus to enter into a greater energy state, you would need lasers that are at least a thousand times more powerful than what's common today.
[00:52:57.600 --> 00:52:59.280] So that's just not happening.
[00:52:59.600 --> 00:53:00.640] Not in the near future.
[00:53:01.120 --> 00:53:04.400] We just don't have the lasers that would be required to do that.
[00:53:04.400 --> 00:53:21.120] I mean, but even if we did, even if we did have a Star Trek laser to energize a nucleus properly, we would also need to know very precisely what the energy gap is between the ground state of the nucleus and the higher energy state.
[00:53:21.120 --> 00:53:26.400] We don't know what that precise frequency is, and it's not easy to find that out.
[00:53:26.400 --> 00:53:30.080] A key to the researchers' breakthrough was a very special atom.
[00:53:30.080 --> 00:53:32.320] This is the thorium-229 atom.
[00:53:32.320 --> 00:53:35.200] This is special because thorium is kind of weird.
[00:53:35.440 --> 00:53:40.640] Its energized state is very, very close to the lowest energy state or the ground state.
[00:53:41.040 --> 00:53:47.360] The gap is tiny-ish; the next level up is really, really close to the ground state.
[00:53:47.360 --> 00:53:51.360] And there's no other atom that has such a small transition energy.
[00:53:51.360 --> 00:53:56.240] And that's why we don't need a Star Trek laser to manipulate its nucleus.
[00:53:56.240 --> 00:54:02.080] And that's why they were able to do this, because thorium is very, very special in that regard.
[00:54:02.080 --> 00:54:05.680] You don't need a super powerful laser to energize a nucleus.
[00:54:05.680 --> 00:54:09.040] You can use something like an ultraviolet laser, which is what they did.
[00:54:09.040 --> 00:54:14.800] Okay, so the researchers used this vacuum ultraviolet laser, which they made themselves.
[00:54:14.800 --> 00:54:15.680] How cool is that?
[00:54:15.680 --> 00:54:17.120] They just like, yep, this is what we need.
[00:54:17.120 --> 00:54:19.120] We're just going to make this ultraviolet laser.
[00:54:19.120 --> 00:54:24.000] And they were able, for the first time ever, no one's been ever been able to do this before.
[00:54:24.000 --> 00:54:27.280] They excited the nucleus of a thorium-229.
[00:54:27.280 --> 00:54:40.280] And not only that, they were able to estimate that ideal frequency, like you know, the perfect frequency, they were able to determine what that frequency needs to be with one-thousandth the previous uncertainty.
[00:54:40.280 --> 00:54:48.120] So they took the standard uncertainty for that frequency that would be the best frequency to stimulate it, and they chopped that into a thousand pieces.
[00:54:48.120 --> 00:54:54.360] Like, it's a thousand times less uncertain, I guess is one way to say it.
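The reason a vacuum-ultraviolet laser works here follows from the transition energy. Published measurements put the thorium-229 isomer transition near 8.4 eV (treat the exact figure as an assumption; pinning it down is precisely what this experiment improved), and the photon wavelength follows from lambda = h*c / E:

```python
# Wavelength of a photon matching a transition energy E: lambda = h*c / E.
HC_EV_NM = 1239.84  # h*c expressed in eV * nm

energy_ev = 8.4     # approximate thorium-229 isomer transition energy (assumed)
wavelength_nm = HC_EV_NM / energy_ev

print(f"{wavelength_nm:.0f} nm")  # ~148 nm: vacuum ultraviolet, hence the custom VUV laser
```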
[00:54:54.360 --> 00:54:56.520] Now, of course, we don't have nuclear clocks yet.
[00:54:56.520 --> 00:55:01.800] This is just kind of like setting the stage, I think, for making the first prototype.
[00:55:01.800 --> 00:55:08.040] And it could still take an amount of years before this is happening, but this was big.
[00:55:08.760 --> 00:55:09.880] This one will go down.
[00:55:09.880 --> 00:55:11.800] So, what is the future of timekeeping?
[00:55:11.800 --> 00:55:20.600] So, I've done a lot of thinking about what are we going to see in the near future in terms of timekeeping, especially with atomic clocks and nuclear clocks.
[00:55:20.600 --> 00:55:27.880] So, I think the current international atomic time standard that uses microwave atomic clocks, right, I think they're going to go away.
[00:55:28.520 --> 00:55:32.200] They're just not as precise as what we have now.
[00:55:32.520 --> 00:55:33.640] They're just not as good.
[00:55:33.640 --> 00:55:41.880] The electrons for these cesium atoms oscillate only a billion times per second, which is really limiting their precision at this point.
[00:55:42.040 --> 00:55:49.160] I think in the near future, and even now, optical atomic clocks are really kicking butt, especially this latest news item.
[00:55:49.720 --> 00:55:51.480] They really improve the hell out of it.
[00:55:51.480 --> 00:55:53.480] They have amazing precision because why?
[00:55:53.480 --> 00:55:55.240] They're using optical wavelengths, right?
[00:55:55.400 --> 00:56:03.720] The higher frequency means that there's more ticks to that clock, and so they can subdivide and be much more precise with their timing.
[00:56:03.720 --> 00:56:07.000] So, the clock tick rate is many orders of magnitude higher.
[00:56:07.000 --> 00:56:10.520] The level of precision would probably make this a standard for timekeeping.
[00:56:11.560 --> 00:56:18.800] And maybe in five or six years, I think they may abandon the microwave cesium atomic clocks and go for the opticals.
[00:56:18.960 --> 00:56:24.240] Okay, but nuclear clocks though, they don't have, I was very disappointed of this actually.
[00:56:24.400 --> 00:56:28.800] The nuclear clocks won't have the precision of the optical atomic clocks.
[00:56:28.800 --> 00:56:32.400] And that's basically because of the frequency of UV laser light, right?
[00:56:32.400 --> 00:56:36.000] I mean, the frequency of UV laser light is not what optical is.
[00:56:36.000 --> 00:56:41.840] It's just not as good in terms of being precise and having so many ticks of that clock, as I've been saying.
[00:56:41.840 --> 00:56:46.400] But the nuclear clocks will, when we have them, they will have some very interesting advantages.
[00:56:46.400 --> 00:56:48.320] And one of these is stability.
[00:56:48.320 --> 00:56:49.600] One is stability.
[00:56:49.600 --> 00:56:58.560] These clocks will be far more stable because an atomic nucleus, think about it, it's much more isolated from the environment than an electron cloud is.
[00:56:58.560 --> 00:57:05.200] So think: the electron cloud that the atomic clocks depend on is huge compared to the nucleus.
[00:57:05.200 --> 00:57:09.760] A nucleus is five orders of magnitude smaller than the electron cloud.
[00:57:09.760 --> 00:57:18.320] So that electron cloud is out there and it can be influenced by ambient electromagnetic fields and other things, and that's what hurts the stability of the atomic clocks.
[00:57:18.320 --> 00:57:23.120] Nuclear clocks, on the other hand, with their tiny nucleus, it's really isolated.
[00:57:23.280 --> 00:57:29.600] So it's much more able to just like ignore things that could interfere with the atomic clock.
[00:57:29.600 --> 00:57:34.960] So that's a huge advantage with that stability, much, much better than the regular atomic clocks.
[00:57:34.960 --> 00:57:37.680] And this stability leads to another advantage.
[00:57:37.840 --> 00:57:45.520] You can pack the atoms that are involved in the timekeeping very, very close together, meaning that nuclear clocks won't need thorium gas.
[00:57:45.520 --> 00:57:47.600] You know, they won't need to be in a gas form.
[00:57:47.600 --> 00:57:55.760] It could be embedded in a solid material, meaning that these nuclear clocks can be solid state.
[00:57:55.760 --> 00:57:58.560] And you know, solid state is a huge advantage.
[00:57:58.560 --> 00:58:04.280] It's like none to very few moving parts at all, which means that it'd be much more portable.
[00:58:04.280 --> 00:58:04.840] And who knows?
[00:57:59.680 --> 00:58:07.080] I mean, it's just a huge advantage for these.
[00:58:07.320 --> 00:58:12.520] The stability of this, the potential nuclear clock, also has scientific advantages as well.
[00:58:12.520 --> 00:58:20.360] Nuclear clocks should have advantages testing theories of fundamental physics beyond the standard model, which I've been dying for for how many decades now?
[00:58:20.520 --> 00:58:27.000] So maybe it could actually contribute and find some tiny little hints of physics beyond the standard model.
[00:58:27.000 --> 00:58:46.520] We may find that fundamental constants are not as constant, and that's one of the things that people really are pinning their hopes on for nuclear clocks because if anything can look at the constants of physics close enough to see variation that we're not seeing now, these atomic nuclear clocks might be able to do that.
[00:58:46.520 --> 00:58:49.320] And they even might be able to find clues to dark matter.
[00:58:49.320 --> 00:58:53.560] So I'll end with one advantage for nuclear clocks that I read.
[00:58:53.560 --> 00:59:00.360] I couldn't confirm it, but I did read this on one website, and they said that the impact on GPS could be dramatic.
[00:59:00.360 --> 00:59:01.160] So what do you think?
[00:59:02.040 --> 00:59:04.440] What's the accuracy of conventional GPS today?
[00:59:04.440 --> 00:59:05.080] It's like, what?
[00:59:05.400 --> 00:59:08.280] It's like 4.9 meters, so 16 feet.
[00:59:08.280 --> 00:59:09.640] So that's the accuracy.
[00:59:09.640 --> 00:59:19.400] So this one website was saying that nuclear clocks could have an accuracy to improve the accuracy of GPS down to the millimeter scale, millimeters.
[00:59:19.400 --> 00:59:27.960] And so, Steve, if we had that and you had GPS on your keys, we could have said, yep, it's definitely in the lake, and you're never going to get it again.
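The GPS claim comes down to the fact that position error scales with clock error roughly as distance = c * dt. A sketch of the numbers behind it (the meter and millimeter figures are from the discussion above; the timing values are the implied back-calculation):

```python
# GPS range error from a clock error: distance = c * dt.
C = 299_792_458  # speed of light, m/s

def position_error_m(clock_error_s: float) -> float:
    """Range error (meters) implied by a given timing error (seconds)."""
    return C * clock_error_s

# Today's ~4.9 m accuracy corresponds to timing good to tens of nanoseconds:
print(position_error_m(16e-9))    # ~4.8 m
# Millimeter accuracy would demand picosecond-level timing:
print(position_error_m(3.3e-12))  # ~0.001 m
```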
[00:59:27.960 --> 00:59:28.760] Right.
[00:59:29.720 --> 00:59:30.760] Oh, nice tie-in.
[00:59:30.880 --> 00:59:31.640] Yeah, thank you.
[00:59:31.720 --> 00:59:32.040] I'm done.
[00:59:32.040 --> 00:59:33.960] Let's just end this damn segment.
[00:59:34.920 --> 00:59:36.120] All right, good job, Bobby.
[00:59:36.680 --> 00:59:41.400] All right, Evan, tell us about this updated poll on creationism.
[00:59:41.400 --> 00:59:45.840] Yeah, poll on creationism that came out just a couple of days ago.
[00:59:44.840 --> 00:59:48.000] Poll by the firm called Gallup.
[00:59:48.240 --> 00:59:59.760] I'm sure we are all familiar with that, but for those who are not, it is recognized as one of the leading polling organizations in the United States, and they take polls on all sorts of things all the time.
[00:59:59.760 --> 01:00:02.320] But this one, they've been tracking for a while.
[01:00:02.320 --> 01:00:06.800] The last time they took this particular poll was 2019.
[01:00:06.800 --> 01:00:10.560] So we're five years afterwards, and they did it again.
[01:00:10.560 --> 01:00:13.760] Here is exactly how it went down.
[01:00:14.000 --> 01:00:30.640] These were telephone interviews that were conducted between May 1st and May 23rd of this year with a random sample of 1,024 adults ages 18 or over living in all 50 United States and the District of Columbia.
[01:00:30.640 --> 01:00:39.600] The results based on the sample of national adults, the margin of sampling error was plus or minus four percentage points at the 95% confidence level.
[01:00:39.600 --> 01:00:40.800] So there you go.
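[Those numbers hang together: for a simple random sample, the worst-case 95% margin of error is z·sqrt(p(1−p)/n). A quick sketch of the arithmetic for n = 1,024; Gallup's published ±4 points is slightly larger than this textbook value, plausibly because they also account for design effects such as weighting:]

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of a 95% confidence interval for a sample proportion,
    assuming simple random sampling; p=0.5 gives the worst case."""
    return z * math.sqrt(p * (1 - p) / n)

moe = margin_of_error(1024)
print(f"+/-{moe * 100:.1f} percentage points")  # +/-3.1 for n=1024
```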
[01:00:41.280 --> 01:00:44.080] Here's the question that they asked.
[01:00:44.080 --> 01:00:47.840] I mean, they asked a lot of questions in this particular poll, but one in particular was this one.
[01:00:47.920 --> 01:00:49.280] This is what made the headlines.
[01:00:49.280 --> 01:00:55.040] Which of the following statements comes closest to your views on the origin and development of human beings?
[01:00:55.040 --> 01:00:56.960] And there's basically three choices.
[01:00:56.960 --> 01:01:05.840] Number one: human beings have developed over millions of years from less advanced forms of life, but God guided this process.
[01:01:06.160 --> 01:01:16.160] Number two, human beings have developed over millions of years from less advanced forms of life, but God had no part in this process.
[01:01:16.480 --> 01:01:25.920] And option three: God created human beings pretty much in their present form at one time within the last 10,000 years or so.
[01:01:25.920 --> 01:01:32.760] The fourth option was basically no opinion or no answer, which 5% of these people chose.
[01:01:29.520 --> 01:01:34.440] So here's what I'd like to do.
[01:01:35.240 --> 01:01:36.440] Ignore the 5%, right?
[01:01:36.440 --> 01:01:41.240] So we have 95% of these poll respondents that did answer one, two, or three.
[01:01:41.560 --> 01:01:44.280] How do you think this divided up?
[01:01:44.600 --> 01:01:49.640] And I want to know each of your opinions as to how you think this broke down.
[01:01:49.640 --> 01:01:55.640] Again, the first option was evolution, but God guided the process.
[01:01:55.640 --> 01:01:59.160] Number two, evolution, God had no part of it.
[01:01:59.160 --> 01:02:05.080] Number three, God creating human beings as they are today 10,000 years or more recently.
[01:02:05.080 --> 01:02:10.760] So, Steve, how do you think of those three, of the 95% left, how did that divide up, do you think?
[01:02:10.760 --> 01:02:13.480] I mean, I've been following this poll for 30 years, seven.
[01:02:13.720 --> 01:02:15.000] Bob, how do you think this?
[01:02:15.160 --> 01:02:17.640] And I know that I know I looked, I know all the recent numbers too.
[01:02:17.640 --> 01:02:18.600] I saw the SR.
[01:02:18.600 --> 01:02:20.040] So, yeah, I'll pass.
[01:02:20.040 --> 01:02:23.480] I'd say just God, 55%.
[01:02:23.480 --> 01:02:23.880] Okay.
[01:02:24.200 --> 01:02:24.840] What?
[01:02:24.840 --> 01:02:28.200] You mean like in the last 10,000 years?
[01:02:28.200 --> 01:02:28.520] Right.
[01:02:28.760 --> 01:02:30.040] No, I think that's like 20%.
[01:02:30.200 --> 01:02:30.520] What do you mean?
[01:02:30.680 --> 01:02:32.280] What does 10,000 years mean?
[01:02:32.280 --> 01:02:36.280] I thought just God meant like they don't believe in evolution.
[01:02:36.920 --> 01:02:38.680] Well, let me read to you the third option again.
[01:02:38.680 --> 01:02:43.160] God created human beings pretty much in their present form at one time within the last 10,000 years.
[01:02:43.160 --> 01:02:45.000] Oh, I missed that last 10,000 years.
[01:02:45.000 --> 01:02:45.640] Oh, what?
[01:02:45.640 --> 01:02:45.880] All right.
[01:02:45.880 --> 01:02:48.360] So put that one at 25%.
[01:02:48.360 --> 01:02:49.800] Okay, I would put it.
[01:02:49.800 --> 01:02:51.080] All right, Kara says 25.
[01:02:51.640 --> 01:02:53.480] Jay, you think, what do you think?
[01:02:53.480 --> 01:02:55.160] I'll say that's 20%.
[01:02:55.160 --> 01:02:55.640] 25%.
[01:02:55.800 --> 01:02:56.280] 20%.
[01:02:56.680 --> 01:02:57.080] 20%.
[01:02:57.320 --> 01:02:59.800] Now, for each of you, Bob, Kara, and Jay, I'll skip Steve.
[01:02:59.800 --> 01:03:04.840] So, Bob, what do you think about evolution and God had no part in the process?
[01:03:04.840 --> 01:03:08.040] Does that mean that God definitely exists, though?
[01:03:08.040 --> 01:03:08.680] Is that the implication?
[01:03:08.880 --> 01:03:09.720] Don't read into it, Bob.
[01:03:09.960 --> 01:03:10.680] All right, we'll read into it.
[01:03:10.920 --> 01:03:11.720] There's only three choices.
[01:03:12.040 --> 01:03:12.440] That's how it is.
[01:03:12.520 --> 01:03:14.360] I'll say 35% for that.
[01:03:14.360 --> 01:03:16.480] Okay, 35%.
[01:03:14.520 --> 01:03:19.600] That's 60, and that means 35% for the first category.
[01:03:19.760 --> 01:03:20.240] Got it?
[01:03:20.400 --> 01:03:23.600] Kara, God had no part in the process?
[01:03:23.600 --> 01:03:28.720] I think it's 40% scientific evolution and 30% religious evolution.
[01:03:28.720 --> 01:03:29.840] And Jay, what do you think?
[01:03:29.840 --> 01:03:32.160] God had no part in the process.
[01:03:32.480 --> 01:03:33.840] Very low percentage.
[01:03:33.840 --> 01:03:34.800] You think it's low?
[01:03:34.800 --> 01:03:37.200] Lower than 40% or 35%, like Bob is.
[01:03:37.600 --> 01:03:39.520] Yeah, I'd even go lower than 30%.
[01:03:39.520 --> 01:03:40.960] Should I put you in at a 30%?
[01:03:40.960 --> 01:03:41.680] Yeah, go ahead.
[01:03:41.680 --> 01:03:45.200] All right, so to be safe, that leaves 45% in the first category.
[01:03:45.200 --> 01:03:48.480] Okay, so here are the results.
[01:03:48.480 --> 01:03:48.880] Okay.
[01:03:49.200 --> 01:03:53.840] Man developed with God's guiding hand, 34%.
[01:03:53.840 --> 01:03:56.160] So, Bob, you were pretty much on the mark with that.
[01:03:56.160 --> 01:03:56.320] Yeah.
[01:03:56.320 --> 01:03:57.200] Good job, Bob.
[01:03:57.200 --> 01:03:57.920] Very well done.
[01:03:58.560 --> 01:04:01.600] Man developed, but God had no part in the process.
[01:04:01.600 --> 01:04:03.200] 24%.
[01:04:03.840 --> 01:04:06.720] You guys basically flipped evolution and creationism in the poll.
[01:04:06.720 --> 01:04:07.600] Oh, Jesus.
[01:04:07.920 --> 01:04:08.880] Has it always been that way?
[01:04:09.440 --> 01:04:10.320] Oh, is it getting worse?
[01:04:11.200 --> 01:04:12.160] It was much worse.
[01:04:12.160 --> 01:04:12.960] Yeah, it was worse.
[01:04:13.440 --> 01:04:17.040] I'll give you the historical worsts on these in a second.
[01:04:17.040 --> 01:04:22.000] And God created man in present form, 37%.
[01:04:22.240 --> 01:04:23.840] Jesus, that's so high.
[01:04:24.080 --> 01:04:27.760] That is higher than what all of you basically guessed.
[01:04:28.560 --> 01:04:32.400] And probably, I think I would have fallen in the same category as you guys.
[01:04:33.120 --> 01:04:37.360] Because we also are keeping up with the Gallup polls on the nones, right?
[01:04:37.360 --> 01:04:39.520] On actual religious belief.
[01:04:39.520 --> 01:04:43.040] And fewer and fewer people are identifying as religious.
[01:04:43.360 --> 01:04:44.560] Yeah, but weird that.
[01:04:45.520 --> 01:04:49.600] I think that has more to do with not being down with organized religion.
[01:04:49.600 --> 01:04:53.760] It doesn't mean they're not spiritual and don't hold magical beliefs.
[01:04:53.760 --> 01:04:54.400] True.
[01:04:54.400 --> 01:04:54.880] Yeah.
[01:04:55.440 --> 01:04:59.280] So they've been doing this poll since 1982.
[01:04:59.280 --> 01:05:01.720] That was the first time they asked this question.
[01:05:01.720 --> 01:05:02.520] Yeah, I know.
[01:04:59.760 --> 01:05:04.200] They've been tracking this a while now.
[01:05:04.520 --> 01:05:11.960] And back in 1982, 44% were in the category of God-created man in present form.
[01:05:11.960 --> 01:05:17.720] So it's gone down, which is good, from 44 to 37.
[01:05:17.720 --> 01:05:19.640] Doesn't sound like a lot, but it is significant.
[01:05:17.720 --> 01:05:23.720] And this time it's the lowest that's ever been polled.
[01:05:23.720 --> 01:05:31.800] So that's the point, and that's kind of why it made the headlines is because it's the lowest that they've recorded.
[01:05:31.800 --> 01:05:38.200] But here, back in 1982, man developed, but God had no part in the process.
[01:05:38.200 --> 01:05:40.520] 1982, 9%.
[01:05:41.160 --> 01:05:41.800] 9%.
[01:05:41.800 --> 01:05:42.280] That's about 20%.
[01:05:42.520 --> 01:05:43.000] 24%.
[01:05:43.800 --> 01:05:44.840] Yeah, but now it's 24%.
[01:05:45.400 --> 01:05:45.720] So that's good.
[01:05:45.960 --> 01:05:46.520] That's huge.
[01:05:46.920 --> 01:05:47.880] Yeah, that's a big jump.
[01:05:47.880 --> 01:05:50.440] And that's tracking with the nuns, I think, Kara.
[01:05:50.920 --> 01:05:51.480] Yeah.
[01:05:51.720 --> 01:05:52.440] N-O-N-E.
[01:05:52.760 --> 01:05:52.920] Yeah.
[01:05:52.920 --> 01:05:53.080] Yeah.
[01:05:53.560 --> 01:05:53.720] Right.
[01:05:54.440 --> 01:05:55.160] Not N-U-N.
[01:05:55.160 --> 01:05:55.480] Yeah.
[01:05:55.480 --> 01:05:55.640] Yeah.
[01:05:56.200 --> 01:05:57.080] Exactly.
[01:05:57.080 --> 01:06:05.960] So I guess the point is that it's a good trend, it's improving, but there's a lot of work to do here.
[01:06:06.120 --> 01:06:07.000] It's still horrible.
[01:06:07.000 --> 01:06:07.480] Yeah.
[01:06:07.480 --> 01:06:08.360] It is horrible.
[01:06:08.360 --> 01:06:18.120] I also wonder how many of that 5% would have been in that category if not for exactly the point that Bob raised, which is the wording of this poll is not good.
[01:06:18.120 --> 01:06:18.840] It's not great.
[01:06:19.240 --> 01:06:24.200] It sounds like it assumes that God exists the way that it's worded.
[01:06:24.200 --> 01:06:29.640] So somebody who is an atheist is like, God's hand wasn't involved.
[01:06:29.640 --> 01:06:31.000] Well, what if there's no God's hand?
[01:06:31.000 --> 01:06:31.880] How do I answer that question?
[01:06:32.040 --> 01:06:35.240] But it does say what's closest to your views.
[01:06:35.640 --> 01:06:36.680] But you're right.
[01:06:36.920 --> 01:06:40.360] The way you ask these questions has a big influence on it.
[01:06:40.680 --> 01:06:45.920] We talked about this with Genie Scott, you know, that the different surveys ask the question different ways.
[01:06:46.240 --> 01:06:57.840] And the criticism of the Gallup poll has always been that people may answer, you know, may give the creationism answer because they don't want to seem anti-God.
[01:06:58.160 --> 01:07:08.160] You know, even if they don't really believe that, you know, and you do get different results depending on how this is all God-focused, you know, the way that they're asking these questions.
[01:07:08.160 --> 01:07:16.000] If you ask it more of a, from a science-focused perspective, more people will endorse evolution if you don't have God in the question.
[01:07:16.000 --> 01:07:16.480] You know what I mean?
[01:07:16.720 --> 01:07:17.440] Absolutely.
[01:07:17.440 --> 01:07:24.960] I, you know, I did my dissertation on medical aid and dying, and Gallup, I think it was Gallup, asked the medical aid and dying question two different ways.
[01:07:24.960 --> 01:07:27.840] In one of them, they called it assisted suicide.
[01:07:27.840 --> 01:07:30.240] And in another, they'd never use the word suicide.
[01:07:30.240 --> 01:07:31.680] And the answers were wildly different.
[01:07:31.920 --> 01:07:32.880] Wildly different, yeah.
[01:07:32.880 --> 01:07:34.240] Yeah, because it's prejudicial.
[01:07:34.240 --> 01:07:36.880] There's certain words that bias people's responses.
[01:07:36.880 --> 01:07:45.440] I also didn't like that the secular or scientific evolution choice said that we evolved from simpler forms.
[01:07:45.440 --> 01:07:47.600] Yeah, even that was weird.
[01:07:47.600 --> 01:07:50.160] And none of the other ones use the same verbiage.
[01:07:50.160 --> 01:07:51.520] Like there should be continuity.
[01:07:51.920 --> 01:07:52.400] Yeah.
[01:07:52.400 --> 01:07:53.040] Yeah.
[01:07:53.040 --> 01:07:53.280] Right.
[01:07:53.280 --> 01:07:54.720] It wasn't even a good, yeah.
[01:07:54.720 --> 01:07:59.280] If you're somebody who, you know, knows a lot about evolution, you would not sit well with that.
[01:07:59.280 --> 01:08:04.640] But the advantage of this poll, as Evan said, is this is 42 years.
[01:08:04.960 --> 01:08:07.280] It's the same exact question over time.
[01:08:07.280 --> 01:08:08.880] That has utility.
[01:08:09.520 --> 01:08:10.160] It does.
[01:08:10.320 --> 01:08:12.400] And saying which one is the closest to you.
[01:08:12.400 --> 01:08:13.520] Like, at least there's that qualifying.
[01:08:13.760 --> 01:08:17.760] Yeah, so it tells you more about the change than about the absolute numbers.
[01:08:18.080 --> 01:08:24.720] Right, because the absolute numbers have more to do with the wording of the survey, whereas the change tells you more about, I think, change in society.
[01:08:24.720 --> 01:08:29.520] So at least these, you know, at least these numbers are going, you're heading in the right direction.
[01:08:29.560 --> 01:08:31.000] Yeah, slowly trending in the right direction.
[01:08:31.160 --> 01:08:32.040] Not all trends are bad.
[01:08:32.040 --> 01:08:36.440] I do feel like we do sound like we're doomsayers, like everything's horrible and getting worse and whatever.
[01:08:29.680 --> 01:08:37.000] We're really not.
[01:08:37.080 --> 01:08:39.800] We just follow the evidence whichever way it goes.
[01:08:39.800 --> 01:08:44.920] And like there are things, some surprisingly good trends, like scientific literacy is way up.
[01:08:44.920 --> 01:08:45.720] You know what I mean?
[01:08:46.360 --> 01:08:47.160] Even given everything.
[01:08:47.640 --> 01:08:49.160] Most things are trending.
[01:08:49.480 --> 01:08:50.520] Most things are actually trending.
[01:08:51.240 --> 01:08:52.920] Yeah, when you zoom out enough.
[01:08:52.920 --> 01:08:53.400] Yeah.
[01:08:53.640 --> 01:08:54.200] Yeah, yeah, yeah.
[01:08:54.280 --> 01:08:57.560] I mean, short-term trends can go anywhere.
[01:08:57.560 --> 01:09:01.880] But yeah, if you look at the decadal, you know, kind of trend, people are getting smarter.
[01:09:01.880 --> 01:09:03.160] They're getting more savvy.
[01:09:03.160 --> 01:09:04.760] They're getting more scientifically literate.
[01:09:05.480 --> 01:09:06.520] Crime is way down.
[01:09:06.760 --> 01:09:09.480] Yeah, there's a lot of good trends happening.
[01:09:09.480 --> 01:09:11.080] Not all the trends are bad.
[01:09:11.080 --> 01:09:14.360] Obviously, the bad trends are newsworthy, right?
[01:09:14.360 --> 01:09:16.520] Like if it bleeds, it leads kind of thing.
[01:09:16.840 --> 01:09:20.840] And they're often taken out of context, and they're often, the resolution is very low.
[01:09:20.840 --> 01:09:21.400] Right.
[01:09:21.720 --> 01:09:22.040] Right.
[01:09:22.040 --> 01:09:23.240] Ooh, more of this.
[01:09:23.240 --> 01:09:25.800] And it's like, yeah, over the last six months, okay.
[01:09:25.800 --> 01:09:26.600] Yeah, right.
[01:09:27.320 --> 01:09:30.120] It's why my financial planner tells me not to look at my investment.
[01:09:30.360 --> 01:09:31.320] Don't even look at it.
[01:09:31.880 --> 01:09:36.200] Until you get within a few years or a decade, whatever.
[01:09:36.200 --> 01:09:41.400] Yeah, when you're at your age, I made sure my daughter started investing already.
[01:09:41.800 --> 01:09:43.480] And obviously, we helped them get started.
[01:09:43.480 --> 01:09:43.800] Same.
[01:09:44.360 --> 01:09:46.840] A very young age, absolutely just get started.
[01:09:46.840 --> 01:09:48.280] And I'm like, don't even look at it.
[01:09:48.280 --> 01:09:52.520] Just don't even let it ride for 50 years, whatever.
[01:09:53.320 --> 01:09:55.160] That's your retirement account.
[01:09:55.160 --> 01:09:56.440] Just don't worry about it.
[01:09:56.440 --> 01:09:56.920] Yep.
[01:09:56.920 --> 01:09:57.480] Yeah.
[01:09:58.120 --> 01:09:58.520] All right.
[01:09:58.520 --> 01:09:59.320] Thanks, Evan.
[01:09:59.320 --> 01:10:01.240] Jay, it's who's that noisy time?
[01:10:01.240 --> 01:10:01.960] All right, guys.
[01:10:01.960 --> 01:10:04.120] Last week I played This Noisy.
[01:10:15.840 --> 01:10:18.320] That's an annoying sound, right?
[01:10:20.240 --> 01:10:21.760] I had a ton of people email me.
[01:10:21.760 --> 01:10:22.400] You know, I was away.
[01:10:22.720 --> 01:10:25.040] You know, I had two weeks of answers this week.
[01:10:25.040 --> 01:10:26.800] You're not interested in what we think?
[01:10:26.800 --> 01:10:27.600] I am.
[01:10:31.600 --> 01:10:33.360] You have an opinion, Steve?
[01:10:33.360 --> 01:10:38.560] I mean, it does sound like a bird to me, but then a lot of things sound like birds to me.
[01:10:38.880 --> 01:10:40.160] But it could be something electronic.
[01:10:40.160 --> 01:10:41.680] But a lot of birds sound electronic.
[01:10:41.760 --> 01:10:42.320] They sound electronic.
[01:10:42.400 --> 01:10:42.640] They do.
[01:10:42.640 --> 01:10:43.680] A lot of birds sound electronic.
[01:10:43.840 --> 01:10:45.840] Starlings sound so electronic.
[01:10:45.840 --> 01:10:48.320] Brown-headed cowbirds are always the crazy ones.
[01:10:48.640 --> 01:10:49.600] All right, anybody else?
[01:10:49.600 --> 01:10:52.320] I mean, you know, stop me on my tricks.
[01:10:52.960 --> 01:10:59.040] All right, well, a listener named Keely Hill wrote in, and Keely said, it sounds outside, but I don't hear wind.
[01:10:59.040 --> 01:11:05.840] I'll guess a malfunctioning fire alarm of a warehouse in the woods heard from a short distance outside.
[01:11:05.840 --> 01:11:07.120] I think this is a great guess.
[01:11:07.120 --> 01:11:12.480] It's not correct, but yeah, this thing, whatever it is, sounds like an alarm.
[01:11:12.480 --> 01:11:14.000] Visto Tutti wrote in.
[01:11:14.000 --> 01:11:18.080] He says, sounds like a bird, but I always think it's a bird, and I'm always wrong.
[01:11:18.080 --> 01:11:24.000] So I'm saying it's a windmill with squeaky bearings, and it really needs some lubrication.
[01:11:24.000 --> 01:11:26.960] Ladies and gentlemen, Visto Tutti.
[01:11:26.960 --> 01:11:28.640] You are wrong, sir.
[01:11:28.960 --> 01:11:30.640] But in all the right ways.
[01:11:30.640 --> 01:11:31.840] Do you know what I mean?
[01:11:32.800 --> 01:11:36.000] Michael Blaney wrote in, Hey, Jay, sounds like a bird to me.
[01:11:36.000 --> 01:11:38.400] So I'm guessing it's the northern mockingbird.
[01:11:38.400 --> 01:11:41.760] As I read, they can sound like a car alarm going off.
[01:11:42.400 --> 01:11:48.960] Michael, you're the first person that guessed it was an actual bird, which puts you in Steve's camp.
[01:11:48.960 --> 01:11:50.080] I will say no more.
[01:11:50.080 --> 01:11:52.480] You know how you know when you're listening to a mockingbird?
[01:11:52.480 --> 01:11:52.880] How?
[01:11:52.360 --> 01:11:52.760] No.
[01:11:53.040 --> 01:11:56.960] They cycle through three or four different bird calls, all one after another.
[01:11:57.120 --> 01:11:57.840] That's cool.
[01:11:57.960 --> 01:11:59.600] A listener named Rob Webb wrote in.
[01:11:59.600 --> 01:12:07.560] He said, My 11-year-old son, Socorso, would love to guess, finally, after listening to you all for many years.
[01:12:07.560 --> 01:12:10.920] He said it's a broken backing up car horn.
[01:12:10.920 --> 01:12:14.360] And I got to be honest with you guys, this is my favorite guess so far.
[01:12:14.840 --> 01:12:17.320] Because it does sound like that a lot.
[01:12:17.320 --> 01:12:22.280] And then Rob said that he thinks it's a parrot scratching its nails on a chalkboard.
[01:12:22.280 --> 01:12:28.200] If I was in a room with a parrot that was doing that, one of us would not make it out of the room alive.
[01:12:29.480 --> 01:12:31.880] So, okay, I have a few winners here.
[01:12:31.880 --> 01:12:40.120] The first person to write in the correct answer was Jennifer Narang, and she said, Evening, I think it is a bare-throated bellbird.
[01:12:40.120 --> 01:12:41.720] And she is correct.
[01:12:41.720 --> 01:12:43.960] Lydia Parsons wrote in, Hello, Jay and team.
[01:12:43.960 --> 01:12:47.080] My guess for this week's Who's That Noisy is the call of the Bellbird.
[01:12:47.400 --> 01:12:48.040] Bird.
[01:12:48.040 --> 01:12:49.880] And Lydia is correct.
[01:12:49.880 --> 01:12:55.240] And then Wilson Fazio Martins wrote in and said, Hi, Jay.
[01:12:55.240 --> 01:13:00.440] I listened to SGU since the beginning, but this is my first time on Who's That Noisy because I know exactly what the noisy is.
[01:13:00.440 --> 01:13:03.320] It's a Brazilian bird called Araponga.
[01:13:03.720 --> 01:13:05.640] And I can't pronounce the other words that he wrote here.
[01:13:05.640 --> 01:13:10.040] It sounds like an anvil, and some even call it a blacksmith bird.
[01:13:10.360 --> 01:13:13.560] Wikipedia translates the araponga to a bellbird.
[01:13:13.560 --> 01:13:15.640] So, yeah, so that's a bellbird.
[01:13:15.640 --> 01:13:20.600] Apparently, there's different kinds of bellbirds, but they all make this ridiculous noise.
[01:13:20.600 --> 01:13:28.920] I'm going to play that for you again.
[01:13:28.920 --> 01:13:31.240] Okay, so you have one of these in your backyard.
[01:13:31.880 --> 01:13:36.360] How long until you start thinking creatively about how to get rid of that bird?
[01:13:36.600 --> 01:13:38.840] Maybe it just fades into the background at some point.
[01:13:38.840 --> 01:13:39.560] I don't think so.
[01:13:39.560 --> 01:13:40.520] You don't think so?
[01:13:40.520 --> 01:13:40.920] Nope.
[01:13:40.920 --> 01:13:41.160] Sorry.
[01:13:41.320 --> 01:13:43.000] Does it acclimate to that sound?
[01:13:43.000 --> 01:13:43.800] Nope.
[01:13:44.440 --> 01:13:44.960] No.
[01:13:44.960 --> 01:13:48.000] Some noises I pick are highly irritating.
[01:13:48.000 --> 01:13:50.080] But my bird radar was on this week.
[01:13:50.080 --> 01:13:50.880] Yeah, you got it.
[01:13:44.600 --> 01:13:51.600] I mean, you know, it wasn't.
[01:13:52.640 --> 01:13:56.400] When you hear anything like that, you know, you're 50% right if you're saying it's a bird.
[01:13:56.400 --> 01:14:00.080] You know, it's some crazy klaxon or a bird.
[01:14:00.080 --> 01:14:00.560] Yeah.
[01:14:00.560 --> 01:14:02.880] All right, I got a new noisy for you guys this week.
[01:14:03.040 --> 01:14:06.480] It was sent in by a listener named Alex Freshie.
[01:14:06.480 --> 01:14:09.760] F-R-E-S-C-H-I.
[01:14:09.760 --> 01:14:11.120] And here it is.
[01:14:19.040 --> 01:14:21.040] Very interesting, right?
[01:14:21.040 --> 01:14:22.000] Hmm.
[01:14:22.000 --> 01:14:24.480] If you think you know what it is, Kara, you know what to do, right?
[01:14:24.480 --> 01:14:25.200] You email me.
[01:14:25.200 --> 01:14:27.920] Well, you could email me at my regular email address, Kara, if you want to.
[01:14:28.160 --> 01:14:30.000] I could, but I won't give that out.
[01:14:30.000 --> 01:14:34.000] No, but email me at WTN at theskepticsguide.org.
[01:14:34.000 --> 01:14:37.040] And if you heard something cool, please take the time.
[01:14:37.040 --> 01:14:38.480] It takes you seconds to do this.
[01:14:38.480 --> 01:14:39.440] It's just email.
[01:14:39.440 --> 01:14:40.560] Just email me.
[01:14:40.560 --> 01:14:41.040] It's good.
[01:14:41.040 --> 01:14:42.560] It's good for the show.
[01:14:42.560 --> 01:14:43.040] All right.
[01:14:43.040 --> 01:14:49.760] So, Steve, I think I am formally shutting down ticket sales for the 1000th show.
[01:14:49.920 --> 01:14:50.400] Wow.
[01:14:50.560 --> 01:14:51.440] We're full.
[01:14:51.440 --> 01:14:52.720] We are full.
[01:14:52.720 --> 01:14:54.560] Don't say we didn't warn you.
[01:14:54.560 --> 01:14:55.040] That's it.
[01:14:55.120 --> 01:14:56.480] We're capacity.
[01:14:57.360 --> 01:14:59.600] I might let one more ticket be sold.
[01:14:59.600 --> 01:15:00.320] A golden ticket.
[01:15:00.400 --> 01:15:01.040] A golden ticket.
[01:15:01.520 --> 01:15:02.160] I wish I could do that.
[01:15:02.160 --> 01:15:03.520] I wish I could flag it.
[01:15:03.760 --> 01:15:09.120] There's lots of things that the SGU would like you guys to know, and I'm going to tell you all of them as quickly as I can.
[01:15:09.120 --> 01:15:14.320] One thing is, we are coming up on our 1000th show, and this means a couple of things.
[01:15:14.320 --> 01:15:27.200] One, we've been doing this for 20 years, and if you find any value in the work that we do, if you think that the effort that we put into this show is good for humanity, then please consider becoming a patron of ours.
[01:15:27.200 --> 01:15:30.040] It really does help us do all the things that we want to do.
[01:15:30.040 --> 01:15:33.320] It allows us to put more time and energy into the SGU.
[01:15:29.600 --> 01:15:42.200] So you can go to patreon.com forward slash skepticsguide, that's one word, and you could read up on what it would mean to become a patron of the SGU.
[01:15:42.360 --> 01:15:44.120] Next thing, you can join our mailing list.
[01:15:44.120 --> 01:15:49.080] We send out a mailer every week that outlines everything that we've created the previous week.
[01:15:49.080 --> 01:15:50.760] And there's some fun stuff in there.
[01:15:50.760 --> 01:15:53.240] And, you know, we have a word of the week and all that stuff.
[01:15:53.240 --> 01:15:55.240] So if you're interested, please join us.
[01:15:55.640 --> 01:15:57.800] I will drop a little teaser here.
[01:15:57.800 --> 01:16:11.480] I am in the middle of it, it probably won't happen until after the Chicago show or 1000th show, but I'm in the middle of creating an SGU puzzle game that will be a weekly puzzle game.
[01:16:12.200 --> 01:16:13.720] And what version?
[01:16:15.160 --> 01:16:17.240] I'm testing out a bunch of different things.
[01:16:17.240 --> 01:16:22.680] I've been using ChatGPT to help me come up with what's possible.
[01:16:23.800 --> 01:16:25.000] It codes really well.
[01:16:25.240 --> 01:16:28.920] You can have ChatGPT make some simple mechanics and stuff.
[01:16:28.920 --> 01:16:34.120] And then I was going to ask a programmer that Ian and I know to probably help us actually do it for real.
[01:16:34.120 --> 01:16:37.160] But I'm just looking at different things right now.
[01:16:37.720 --> 01:16:42.360] I want to come up with something that's kind of like connections, like New York Times connections type of game.
[01:16:43.000 --> 01:16:51.240] But I'm not saying I was thinking after the logical fallacy themed puzzle, I was thinking, huh, we should do a skeptical SGU themed crossword puzzle.
[01:16:51.240 --> 01:16:52.840] We are on the same page, Steve.
[01:16:52.840 --> 01:16:53.320] Yeah.
[01:16:53.320 --> 01:16:55.240] But I'm actually getting the work done.
[01:16:55.240 --> 01:16:57.240] Yes, it's a good time.
[01:16:57.240 --> 01:16:58.360] This is your full-time job.
[01:16:59.560 --> 01:17:01.880] But how hard is it to make crossword puzzles?
[01:17:01.920 --> 01:17:03.160] I mean, it's got to be crazy hard.
[01:17:03.160 --> 01:17:06.760] But now I see that there are compilers that will kind of do all the, I guess, the hard work.
[01:17:06.760 --> 01:17:08.200] You're going to have the computer do it for you.
[01:17:08.680 --> 01:17:11.880] Yeah, I think it's hard to make unique crossword puzzles.
[01:17:11.880 --> 01:17:16.400] It's probably easier to use what has already been made to your advantage.
[01:17:16.720 --> 01:17:20.480] That's basically, yeah, like my father-in-law told me, I'm like, how do you do those crosswords?
[01:17:20.560 --> 01:17:23.600] He's like, Jay, you do it for 10 years and you know all the answers.
[01:17:23.600 --> 01:17:28.560] Yeah, I've noticed that when I'm doing the New York Times puzzles for a while, there's a lot of repeats.
[01:17:28.560 --> 01:17:29.840] There's a lot of big ads.
[01:17:30.160 --> 01:17:30.960] All right, to continue.
[01:17:31.760 --> 01:17:41.120] You can give our show a review, and you could also give the SGU podcast a rating on whatever podcast player you're using.
[01:17:41.120 --> 01:17:46.160] Do what you got to do to let other people find out about us if you think that people would be interested.
[01:17:46.160 --> 01:17:51.760] Now, we have tickets available for the Chicago Extravaganza, which is not sold out.
[01:17:51.760 --> 01:17:54.320] This is the one that's going to start at 2:30 p.m.
[01:17:54.320 --> 01:17:58.400] in the afternoon on Saturday, August 17th.
[01:17:58.400 --> 01:18:02.320] And hey, you could stay for the Democratic National Convention.
[01:18:02.320 --> 01:18:03.280] That's correct.
[01:18:03.840 --> 01:18:08.400] We are going to be trying out some new bits in that particular extravaganza that we've never done before.
[01:18:08.400 --> 01:18:12.160] So I guarantee you it's going to be a little weird and unhinged, which is great.
[01:18:12.480 --> 01:18:13.040] Cara loves it.
[01:18:13.200 --> 01:18:14.560] It's a feature, not a bug.
[01:18:14.560 --> 01:18:19.520] When we go flying directly off the rails on this show, Kara loves it.
[01:18:19.840 --> 01:18:20.800] Who wouldn't?
[01:18:20.800 --> 01:18:21.200] I know.
[01:18:21.200 --> 01:18:21.840] No, it's fun.
[01:18:21.840 --> 01:18:22.240] It's good.
[01:18:22.240 --> 01:18:23.120] We're just going to try some new stuff.
[01:18:23.280 --> 01:18:23.840] We have rails?
[01:18:24.320 --> 01:18:25.920] Oh, Stephen, I almost forgot.
[01:18:25.920 --> 01:18:29.440] I just scheduled us for an extravaganza in Washington, D.C.
[01:18:29.520 --> 01:18:30.880] on December 7th.
[01:18:30.880 --> 01:18:32.960] We're going to be going to the Miracle Theater.
[01:18:32.960 --> 01:18:38.320] And that weekend, we're also going to be having another SGU private show, of course.
[01:18:38.320 --> 01:18:40.800] But details on that will come out in a little bit.
[01:18:40.800 --> 01:18:44.640] But if you're interested in buying tickets for the extravaganza in D.C.
[01:18:44.640 --> 01:18:50.480] on December 7th, you can just go to the homepage, SGU homepage, and there's a button there for that, too.
[01:18:50.480 --> 01:18:51.840] More shows, man.
[01:18:51.840 --> 01:18:53.120] Thank you, brother.
[01:18:53.120 --> 01:18:58.640] Ford was built on the belief that the world doesn't get to decide what you're capable of.
[01:18:58.640 --> 01:18:59.600] You do.
[01:18:59.600 --> 01:19:03.160] So, ask yourself: can you or can't you?
[01:19:03.160 --> 01:19:07.960] Can you load up a Ford F-150 and build your dream with sweat and steel?
[01:19:07.960 --> 01:19:11.640] Can you chase thrills and conquer curves in a Mustang?
[01:19:11.640 --> 01:19:16.120] Can you take a Bronco to where the map ends and adventure begins?
[01:19:16.120 --> 01:19:20.200] Whether you think you can or think you can't, you're right.
[01:19:20.200 --> 01:19:21.080] Ready?
[01:19:21.080 --> 01:19:21.880] Set.
[01:19:21.880 --> 01:19:22.760] Ford.
[01:19:23.080 --> 01:19:26.280] We have a name that logical fallacy this week.
[01:19:26.680 --> 01:19:28.840] This comes from Tim.
[01:19:29.480 --> 01:19:30.600] Only something.
[01:19:30.680 --> 01:19:31.160] Tim?
[01:19:31.160 --> 01:19:33.000] Tim writes, Dr.
[01:19:33.000 --> 01:19:36.360] Steve, you've had email conversations in the past with a Tim Dowling.
[01:19:36.360 --> 01:19:37.080] That's my dad.
[01:19:37.400 --> 01:19:39.640] We have a second generation skeptic here.
[01:19:39.640 --> 01:19:40.360] Excellent.
[01:19:40.360 --> 01:19:43.400] He says, longtime listener, first-time writer, thanks for all you do.
[01:19:43.400 --> 01:19:46.600] You've been a big influence in my life and how I look at the world.
[01:19:46.600 --> 01:19:51.080] But to the point, I had a big religious conversation with my sister this evening.
[01:19:51.080 --> 01:19:56.680] Much of the discussion was about if God, evangelical Christian interpretation, is logical.
[01:19:56.680 --> 01:20:00.920] She was back and forth, but eventually settled on God being logical.
[01:20:00.920 --> 01:20:05.240] She then qualified that by saying, we just don't always understand it.
[01:20:05.240 --> 01:20:12.680] The example we were arguing was how God could be in control of everything, yet let things happen that are not of his will.
[01:20:12.680 --> 01:20:14.360] That sounds illogical to me.
[01:20:14.360 --> 01:20:20.520] You're either in control of everything, or everything that happens is your will, or you aren't, and it's not.
[01:20:20.520 --> 01:20:22.440] But that's a discussion for another day.
[01:20:22.440 --> 01:20:25.800] First question, but not the real one I'm asking.
[01:20:25.800 --> 01:20:27.560] What logical fallacy is this:
[01:20:27.560 --> 01:20:29.640] We just don't understand it?
[01:20:29.640 --> 01:20:31.560] Maybe moving the goalposts?
[01:20:31.560 --> 01:20:32.920] Now, for the real question.
[01:20:32.920 --> 01:20:37.720] It seems to me that she is really just conflating the words logic and reasons.
[01:20:37.720 --> 01:20:43.160] I'm 100% happy with the statement, we just don't understand the reason God does things.
[01:20:43.160 --> 01:20:47.440] But God is logical, we just don't always understand it, doesn't sit right.
[01:20:47.680 --> 01:20:50.320] Isn't the very nature of logic knowable?
[01:20:50.320 --> 01:20:55.920] If you don't follow someone's logic, it isn't because it's a mystery or unknowable.
[01:20:55.920 --> 01:20:58.160] You just need more info to understand it.
[01:20:58.160 --> 01:21:04.080] Granted, we can't ask God questions, but we can look at the Bible and get that knowledge to a degree at least.
[01:21:04.080 --> 01:21:09.600] I don't think my sister would say we can't know God's logic because there isn't enough information in the Bible.
[01:21:09.600 --> 01:21:12.400] Bottom line: what is the nature of logic?
[01:21:12.400 --> 01:21:14.160] Can it be unknowable?
[01:21:14.160 --> 01:21:18.080] I know there is probably a lot of unstated baggage with this question.
[01:21:18.080 --> 01:21:20.560] I'm not trying to ask a religious question.
[01:21:20.560 --> 01:21:27.280] For the record, I'm a deist at best, and I don't find the Christian God terribly logical, or if he is, it's not a logic I want any part of.
[01:21:27.280 --> 01:21:28.960] Thanks, Tim.
[01:21:28.960 --> 01:21:33.840] So, this is kind of a classic theology philosophy question.
[01:21:34.160 --> 01:21:34.640] I don't know.
[01:21:34.640 --> 01:21:37.600] Can you help me understand the actual question that was being asked?
[01:21:37.600 --> 01:21:38.320] Because there's a lot.
[01:21:39.280 --> 01:21:41.120] Yeah, it comes down to two things.
[01:21:41.120 --> 01:21:48.000] The first one is: what is the logical fallacy of God is logical, we just don't understand his logic?
[01:21:48.320 --> 01:21:51.440] And I don't know if you guys have any immediate thoughts about that.
[01:21:51.760 --> 01:21:56.560] I mean, he suggested moving the goalpost, and I can kind of see hints of that in there, but it's not the best one.
[01:21:56.800 --> 01:22:04.000] No, I think the core of it's probably an appeal to ignorance, but it's also special pleading, which is kind of a generic one.
[01:22:04.400 --> 01:22:05.440] It's always a good fallback one.
[01:22:05.600 --> 01:22:17.600] Yeah, it's also, I mean, I don't think it's a no-true Scotsman, but it's kind of like, you know, the special pleading of they always come out at night, except during the day.
[01:22:17.600 --> 01:22:19.200] You know, it's sad.
[01:22:19.520 --> 01:22:23.440] You know, it's like you just add ad hoc exceptions to the rule.
[01:22:23.440 --> 01:22:29.360] He's always logical, and if he appears not to be logical, it's because you don't understand the logic.
[01:22:29.680 --> 01:22:35.320] Well, and it's also, I think, at a kind of a larger extent, it's kind of an appeal to authority.
[01:22:29.760 --> 01:22:35.880] Yeah, that's true.
[01:22:36.120 --> 01:22:37.720] It's that God Himself is the authority.
[01:22:37.960 --> 01:22:38.920] So He must be right.
[01:22:38.920 --> 01:22:40.120] We'll start there.
[01:22:40.440 --> 01:22:44.360] And therefore, if it seems illogical, it must be because we don't understand the logic.
[01:22:44.600 --> 01:22:46.200] Because we don't understand, because He's the one.
[01:22:46.600 --> 01:22:47.480] Yeah, that's a good point.
[01:22:47.480 --> 01:22:50.760] Yeah, so it's kind of the assumption of God as the ultimate authority.
[01:22:50.760 --> 01:22:57.320] And so, therefore, then, if that's the case, it also becomes circular logic, circular reasoning.
[01:22:58.040 --> 01:22:58.840] God is logical.
[01:22:58.840 --> 01:22:59.480] How do you know that?
[01:22:59.480 --> 01:23:01.640] Because God is logical, because God's always logical.
[01:23:02.280 --> 01:23:02.840] Logic comes from God.
[01:23:03.160 --> 01:23:06.440] Yes, logic comes from God, therefore he must be logical because he's God.
[01:23:06.680 --> 01:23:07.640] It's a Star Trek episode.
[01:23:07.880 --> 01:23:08.360] Exactly.
[01:23:08.680 --> 01:23:14.840] Yeah, because if He seems illogical, maybe we don't understand it, but maybe, you know, He's not logical.
[01:23:14.840 --> 01:23:17.800] That's not the awful stealth out of that damn coin.
[01:23:18.120 --> 01:23:30.040] But it does remind me of sort of this classic theological debate: if God is all-powerful, all-knowing, and all-loving, why are there innocent babies with cancer?
[01:23:30.040 --> 01:23:30.440] Right?
[01:23:32.120 --> 01:23:41.800] How do awful things, through no real volition of anybody, just happen to innocent, good people?
[01:23:41.800 --> 01:23:42.360] Right?
[01:23:42.360 --> 01:23:44.520] Again, it's hard to square that circle.
[01:23:44.520 --> 01:23:45.640] But a lot of people do.
[01:23:45.880 --> 01:23:48.680] I mean, there are all sorts of answers that people have come up with to.
[01:23:49.000 --> 01:23:50.360] It's testing people.
[01:23:50.360 --> 01:23:52.600] Some babies were too good for this world.
[01:23:52.600 --> 01:23:53.640] Yeah, exactly.
[01:23:53.640 --> 01:23:53.880] Right.
[01:23:53.880 --> 01:23:56.760] It's all a lot of ad hoc bullshit reasoning, is what it is, right?
[01:23:56.760 --> 01:24:00.760] But it doesn't, none of it gets you away from the conundrum, though.
[01:24:01.080 --> 01:24:04.920] What some people say is, well, okay, he's not all three of those things, right?
[01:24:04.920 --> 01:24:06.760] Take your pick, but he's not all three.
[01:24:06.760 --> 01:24:08.760] He's only two of those things.
[01:24:09.560 --> 01:24:09.880] Right?
[01:24:09.880 --> 01:24:15.600] Like, and maybe, and I know of some theologians, like, well, all right, he's all-loving, he's all-knowing.
[01:24:14.920 --> 01:24:20.800] He's not exactly all-powerful because he can't interfere with his own creation, like some kind of logic like that.
[01:24:21.040 --> 01:24:29.360] You know, it's like he kind of set things in motion and in such a way that he doesn't interfere with it.
[01:24:29.360 --> 01:24:31.680] But of course, then that would mean all of this.
[01:24:31.680 --> 01:24:33.200] So there are no miracles then, huh?
[01:24:33.200 --> 01:24:36.160] There's no at no point does God intervene.
[01:24:36.160 --> 01:24:37.680] There's no miracles.
[01:24:37.680 --> 01:24:39.680] And if you say, well, well, there are miracles.
[01:24:39.680 --> 01:24:42.160] Okay, then why didn't he miracle that kid's cancer away?
[01:24:42.160 --> 01:24:43.760] He chose not to.
[01:24:44.080 --> 01:24:44.640] Right?
[01:24:44.960 --> 01:24:51.920] But so, yeah, then you have to say, well, God has some kind of motivation that's beyond us, he works mysteriously.
[01:24:52.000 --> 01:24:54.240] Right, but that interferes with the all-loving thing.
[01:24:54.240 --> 01:25:02.080] Because then you're saying there are things that are more important to him than doing for us what any parent would do for their child, right?
[01:25:02.080 --> 01:25:06.400] Or just any decent human being would do for any other human being.
[01:25:06.880 --> 01:25:09.440] There's a lot of the not all-loving shit in the Bible.
[01:25:09.440 --> 01:25:10.160] Yeah, oh, yeah.
[01:25:11.040 --> 01:25:12.480] That's like the first one to fall.
[01:25:12.480 --> 01:25:13.280] Yeah, yeah, yeah.
[01:25:13.600 --> 01:25:14.080] Right.
[01:25:14.080 --> 01:25:20.080] But again, but if you're trying to maintain that position, then you have a lot of explaining to do, basically.
[01:25:20.080 --> 01:25:25.840] All right, but the second half of this is just to say, is my sister conflating logic and reason?
[01:25:25.840 --> 01:25:27.840] Like, does God have a reason for it?
[01:25:27.920 --> 01:25:29.360] This kind of gets to what we're saying.
[01:25:29.360 --> 01:25:33.120] If God has a reason for it, that reason doesn't have to be logic.
[01:25:33.120 --> 01:25:37.680] You can't just assume that God is always logical.
[01:25:38.080 --> 01:25:42.320] And I agree with him that there is something objective about logic.
[01:25:42.320 --> 01:25:50.040] You know, saying that, well, we don't understand God's logic is kind of implying that God's logic is somehow subjective, right?
[01:25:50.080 --> 01:25:55.760] The only other way that you could really rescue that is to say, well, it's so complicated that we can't understand it.
[01:25:55.760 --> 01:26:08.920] But if you could, you would see that in, like, this 12-dimensional kind of way, it actually is logical, it's just operating on too deep a level for you to understand.
[01:26:08.920 --> 01:26:14.440] But that is, again, like an ultimate appeal to ignorance and appeal to authority.
[01:26:15.240 --> 01:26:17.960] It's like, yeah, we can't understand it.
[01:26:17.960 --> 01:26:18.760] And so.
[01:26:19.080 --> 01:26:20.280] But that is what she argues.
[01:26:20.840 --> 01:26:24.200] Yeah, but he's the ultimate authority, so just believe it.
[01:26:24.200 --> 01:26:24.520] Right.
[01:26:25.640 --> 01:26:26.840] Okay, so it's pure belief.
[01:26:28.040 --> 01:26:28.600] That is faith.
[01:26:29.160 --> 01:26:39.480] And this is my personal problem with religion: at the end of the day, you're being asked to suspend your logic and reason to just have faith because.
[01:26:39.480 --> 01:26:42.600] Because faith is a circular reasoning of, well, because faith is good.
[01:26:43.720 --> 01:26:44.120] How do you know?
[01:26:44.280 --> 01:26:44.440] Exactly.
[01:26:44.600 --> 01:26:45.320] How do you know faith is good?
[01:26:45.400 --> 01:26:47.240] Because my faith tells me it is, you know.
[01:26:47.480 --> 01:26:48.520] Because the Bible says so.
[01:26:48.520 --> 01:26:49.160] How is the Bible right?
[01:26:49.160 --> 01:26:50.120] Because I have faith in the Bible.
[01:26:50.120 --> 01:26:51.000] Why do you have faith in the Bible?
[01:26:51.000 --> 01:26:52.120] Because it tells me I should.
[01:26:52.680 --> 01:26:57.480] It's all circular reasoning at the end of the day, and then you just cut it completely circular.
[01:26:57.480 --> 01:27:01.240] Yeah, it is impenetrable to logic at that point.
[01:27:01.480 --> 01:27:03.080] So just don't call it logic.
[01:27:03.080 --> 01:27:04.920] Just call it something else.
[01:27:05.720 --> 01:27:10.840] But I also think it comes from a place of, well, we don't want to say anything that can be perceived as negative about God.
[01:27:11.080 --> 01:27:12.600] So you don't want to say God's illogical.
[01:27:12.600 --> 01:27:17.160] So he has every virtue always all the time, even when they're mutually incompatible.
[01:27:17.160 --> 01:27:20.200] And then we just don't worry about the mutual incompatibility.
[01:27:20.200 --> 01:27:21.000] All right.
[01:27:21.000 --> 01:27:25.000] We do have a really excellent email, too.
[01:27:25.000 --> 01:27:31.240] This one comes from Mauser, who is a nuclear engineer at Los Alamos National Laboratory.
[01:27:31.480 --> 01:27:32.440] Yeah, this is interesting.
[01:27:32.440 --> 01:27:43.880] And he writes, Hey, gang, I really enjoyed episode 993, but I wanted to make a small correction to something that was said regarding piling up regolith on top of lunar habitats to protect from space radiation.
[01:27:43.880 --> 01:27:54.960] If you made a, now this is in quotes, and I believe I said this: if you made a protective structure on a moon base with two or three feet of mooncrete on the outside, that would go a long way towards protecting you from radiation.
[01:27:54.960 --> 01:28:03.360] It turns out that when incoming radiation enters shielding around a habitat, it can react with atoms in the shielding and produce secondary radiation.
[01:28:03.760 --> 01:28:12.480] The counterintuitive thing is that the secondary radiation could actually be more penetrating and harmful to the occupants of the habitat than the original primary radiation.
[01:28:12.480 --> 01:28:30.480] So, in order to effectively shield a habitat, you don't just need sufficient shielding to stop incoming cosmic rays and whatnot, but you also need enough shielding to shield against the spallation neutrons and the other secondary nasties that the cosmic rays generate within the shielding.
[01:28:30.480 --> 01:28:49.280] Some folks working on NASA's in-situ resource utilization (ISRU) efforts estimated that the break-even point for piled-up lunar regolith, where the effective dose within the habitat was the same as if there was no habitat shielding at all, could be as high as seven to nine meters of regolith.
[01:28:49.840 --> 01:28:53.440] After that, your shielding starts to actually be effective.
[01:28:53.440 --> 01:28:56.960] So, he says, Go, lava tubes, thanks for everything you do.
[01:28:56.960 --> 01:28:58.720] So, smart guy.
[01:28:58.720 --> 01:29:00.320] I had a quick back and forth with him.
[01:29:00.320 --> 01:29:02.000] I said, because I looked that up.
[01:29:02.000 --> 01:29:05.760] I don't always just take, even though he's a nuclear engineer, I don't take it for granted.
[01:29:05.760 --> 01:29:07.600] I said, All right, what can I find over here?
[01:29:07.600 --> 01:29:22.800] And I found multiple resources with studies and with charts, you know, at the curves of radiation, where the break-even point is somewhere around three meters at the low end and five-something meters at the high end.
[01:29:22.800 --> 01:29:25.600] I didn't find anything in the seven to nine-meter range.
[01:29:25.600 --> 01:29:30.200] I emailed this back to Mauser, and he said, Yeah, I'll have to dig up that reference.
[01:29:29.840 --> 01:29:35.960] It was a YouTube video of a lecture, you know, a presentation, but he hasn't sent that to me yet.
[01:29:36.040 --> 01:29:42.200] He said, And you're right, when he looks around, he also is finding the lower numbers, the three to five meters.
[01:29:42.200 --> 01:29:44.760] Still, three to five meters is a lot.
[01:29:44.760 --> 01:29:47.640] And again, that's 16 to 17 feet.
[01:29:47.720 --> 01:29:57.160] No, the break-even point means at that point, you're getting the same amount of radiation as with no shielding, because basically the secondary radiation is now equal to the primary radiation.
[01:29:57.160 --> 01:29:59.880] So you have to have more than that.
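Editor's aside: the break-even idea can be sketched numerically. Below is a toy Python model; every constant in it is invented for illustration and does not come from the episode or from any of the NASA studies mentioned. It just shows how a fast-dying primary dose plus a slowly attenuating secondary dose can stay above the unshielded level until several meters of shielding are piled on.

```python
import math

# Toy model of why piling on shielding can make the dose worse before it
# gets better. ALL constants here are invented for illustration; they are
# not figures from the episode or from any NASA study.

def relative_dose(t_m):
    """Dose behind t_m meters of shielding, relative to no shielding (1.0)."""
    primary = math.exp(-t_m / 0.3)  # primary cosmic rays attenuate quickly
    # Secondary (spallation-neutron) radiation builds up as primaries are
    # absorbed, then attenuates much more slowly.
    secondary = 1.5 * (1 - math.exp(-t_m / 0.3)) * math.exp(-t_m / 10.0)
    return primary + secondary

# Break-even depth: the first thickness where the total dose drops back
# below the unshielded level.
break_even = next(t / 10 for t in range(1, 200) if relative_dose(t / 10) < 1.0)
print(f"break-even depth in this toy model: {break_even:.1f} m")
```

With these made-up constants the dose is above the unshielded level for every thickness up to about four meters, which is qualitatively the shape of the curves described in the discussion.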
[01:30:01.800 --> 01:30:05.160] But that assumes homogeneous shielding that's just all lunar regolith.
[01:30:05.400 --> 01:30:07.480] First, we assume it's a perfect sphere.
[01:30:07.480 --> 01:30:07.880] Yeah.
[01:30:08.200 --> 01:30:10.120] Right, like as physicists always do.
[01:30:10.120 --> 01:30:10.840] No, you're right.
[01:30:10.920 --> 01:30:17.720] But is there a way to layer the shielding so that it actually reduces that effect of secondary radiation?
[01:30:18.040 --> 01:30:19.000] Or put a trap in it.
[01:30:19.160 --> 01:30:20.280] Air gap, the radiation.
[01:30:21.560 --> 01:30:22.520] Yeah, like a bladder.
[01:30:22.760 --> 01:30:25.480] Let me tell you what I found in the time that I had to research this.
[01:30:25.480 --> 01:30:28.040] So, first of all, we knew about this effect.
[01:30:28.520 --> 01:30:29.160] With film.
[01:30:29.480 --> 01:30:30.280] Well, not just that.
[01:30:30.520 --> 01:30:32.200] Yeah, but with spaceships.
[01:30:32.200 --> 01:30:37.160] Like, we knew, like, oh, yeah, if you have inches of steel on a spaceship, it's not going to keep out cosmic rays.
[01:30:37.240 --> 01:30:43.400] Well, it'll keep out cosmic rays, but it'll just create more secondary radiation that's just going to bounce around inside.
[01:30:43.400 --> 01:30:45.400] And so it's actually a net negative.
[01:30:45.400 --> 01:30:49.400] But I didn't think that would apply to feet of regolith, but apparently it does.
[01:30:49.400 --> 01:30:50.360] Yeah, I didn't think so either.
[01:30:50.520 --> 01:31:00.040] Steve, if you remember, I think the first time we heard of it was when they were trying to safeguard really important film, and they got back and it was worse.
[01:31:00.360 --> 01:31:10.520] It was completely exposed because it had this secondary radiation; it was better to have one bit of radiation just fly through than have all these daughter particles fly around in the film.
[01:31:10.600 --> 01:31:14.360] Yeah, and why is the secondary radiation worse?
[01:31:14.360 --> 01:31:15.760] Why is it more penetrating?
[01:31:16.080 --> 01:31:17.600] Because it's concentrated.
[01:31:17.600 --> 01:31:22.240] No, because it's neutrons and neutrons are neutral.
[01:31:22.240 --> 01:31:28.720] They're not charged, whereas electrons are charged and so they interact more; they don't penetrate as deeply.
[01:31:28.720 --> 01:31:32.080] So that's why that secondary neutron-based radiation is worse.
[01:31:32.400 --> 01:31:33.920] It penetrates deeper.
[01:31:33.920 --> 01:31:35.600] It's harder to shield against.
[01:31:35.600 --> 01:31:36.320] Deeper.
[01:31:36.320 --> 01:31:39.920] And they're still high enough energy that they cause problems.
[01:31:39.920 --> 01:31:51.360] All right, but I did read a study that looked at different materials, and different materials have a different potential to kick off these secondary neutron radiation particles.
[01:31:51.360 --> 01:31:55.040] So regolith is bad for secondary particles.
[01:31:55.920 --> 01:31:57.440] It's just about the same as aluminum.
[01:31:57.440 --> 01:32:00.560] Aluminum and regolith are just about equivalent in terms of shielding.
[01:32:00.560 --> 01:32:04.080] They're effective, but they create a lot of secondary radiation.
[01:32:04.080 --> 01:32:13.200] There was three specifically, three materials specifically tested that have a very low potential to generate secondary radiation.
[01:32:13.200 --> 01:32:15.120] One is water, which is good.
[01:32:15.440 --> 01:32:17.440] Human waste and Krell metal.
[01:32:17.920 --> 01:32:24.080] One is carbon fiber, and the other is PE, polyethylene.
[01:32:24.080 --> 01:32:34.640] So you could make layers of carbon fiber, polyethylene, and water that would be effective shielding and would not generate a lot of daughter particles for secondary radiation.
[01:32:34.960 --> 01:32:35.920] How thick would it have to be?
[01:32:36.240 --> 01:32:44.080] It's very hard for me to find that out because most of the studies give it in weird units, from my perspective.
[01:32:44.080 --> 01:32:46.280] It's always like grams per centimeter squared, you know.
[01:32:46.480 --> 01:32:47.840] I was like, okay, but how thick is it?
[01:32:47.840 --> 01:32:49.120] And they don't say.
[01:32:49.120 --> 01:32:52.520] I guess it's a more precise way of doing it, but we want like the simplistic.
[01:32:52.520 --> 01:32:54.080] How thick does it have to be?
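Editor's aside: those areal-density units do convert to a physical thickness, by dividing by the material's volumetric density. A quick Python sketch; the densities are rough handbook values and the 50 g/cm² requirement is an arbitrary example, not a figure from any study discussed here.

```python
# Convert a shielding "areal density" (g/cm^2) into a physical thickness:
# thickness (cm) = areal density (g/cm^2) / volumetric density (g/cm^3).
# Densities are rough handbook values; 50 g/cm^2 is an arbitrary example.

def thickness_cm(areal_density_g_cm2: float, density_g_cm3: float) -> float:
    return areal_density_g_cm2 / density_g_cm3

materials = {
    "water": 1.0,           # g/cm^3
    "polyethylene": 0.94,   # g/cm^3
    "lunar regolith": 1.5,  # g/cm^3 (loosely piled; varies with compaction)
    "aluminum": 2.7,        # g/cm^3
}

for name, rho in materials.items():
    print(f"{name}: 50 g/cm^2 is about {thickness_cm(50.0, rho):.0f} cm")
```

This is also why the studies prefer g/cm²: the same areal density stops roughly the same primary flux regardless of material, while the physical thickness needed varies with density.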
[01:32:54.080 --> 01:33:06.920] But in any case, it would have to be thicker as far as primary shielding is concerned, but it would be more effective in that there would hardly be any secondary radiation.
[01:33:06.920 --> 01:33:13.960] So you might want to have that on the outside, you know, to block the cosmic rays and maybe have regolith on the inside.
[01:33:13.960 --> 01:33:16.040] And then maybe you can get away with less shielding.
[01:33:16.040 --> 01:33:18.920] But also, just being in lava tubes also fixes the problem.
[01:33:18.920 --> 01:33:24.600] You know, if you're going to be 10 or 20 meters underground, you're good.
[01:33:24.600 --> 01:33:27.400] This all applies on Mars too, by the way.
[01:33:27.640 --> 01:33:29.000] It's all the same.
[01:33:29.000 --> 01:33:32.120] So, yeah, being underground is probably going to be the best.
[01:33:32.120 --> 01:33:43.320] Unless we really can come up with some layered material where you can have a foot of shielding, you know, where you store your water and you have like three feet of water or something, or three yards of water.
[01:33:43.320 --> 01:33:45.640] That's where all the water is stored.
[01:33:45.640 --> 01:33:47.080] And that is effective shielding.
[01:33:47.320 --> 01:33:49.320] We may have something like that.
[01:33:49.640 --> 01:33:50.600] But yeah, but interesting.
[01:33:50.600 --> 01:33:52.760] So it's harder than even we thought.
[01:33:52.760 --> 01:34:01.800] Yeah, so if you're living underground on Mars or on the moon, and okay, let's say Earth explodes and you need to find some place to live.
[01:34:01.800 --> 01:34:05.880] But if it doesn't, hopefully you won't be on the moon long, if that's the case.
[01:34:05.880 --> 01:34:07.880] Yeah, that's true too, exactly.
[01:34:08.600 --> 01:34:09.720] I just don't get it.
[01:34:09.960 --> 01:34:11.800] Why would you want to live underground?
[01:34:11.800 --> 01:34:12.120] I wouldn't.
[01:34:12.280 --> 01:34:14.360] It's like you're going to the moon to be in prison.
[01:34:14.360 --> 01:34:15.160] I don't get it.
[01:34:15.160 --> 01:34:15.640] Yeah, not real.
[01:34:17.000 --> 01:34:22.680] I think there are going to be scientists who are doing a one- or two-year stint there, and, you know, things like that.
[01:34:23.000 --> 01:34:24.280] That I get like a space station.
[01:34:24.760 --> 01:34:26.280] Antarctica or Antarctica.
[01:34:26.280 --> 01:34:27.080] Yeah, that I told them.
[01:34:27.160 --> 01:34:35.240] Well, the other thing is, if you're in a lava tube, Kara, it could be big enough that it could be lighted, it could be pressurized, there could be plants.
[01:34:35.240 --> 01:34:38.200] It could be, you know, I could see, like, it's not going to be in a room.
[01:34:38.440 --> 01:34:40.520] It could have an outdoorsy feel to it.
[01:34:40.920 --> 01:34:41.240] Oh, for sure.
[01:34:41.560 --> 01:34:42.360] It would be big enough for it.
[01:34:42.440 --> 01:34:50.720] It could be a whole environment, but it would still always be like every dystopian Black Mirror post-Earth environment.
[01:34:51.120 --> 01:34:53.200] People born in that may like it.
[01:34:55.040 --> 01:34:56.960] I'm so glad I live here.
[01:34:57.920 --> 01:34:58.480] All right.
[01:34:59.040 --> 01:35:01.600] Let's go on with science or fiction.
[01:35:03.840 --> 01:35:08.880] It's time for science or fiction.
[01:35:14.240 --> 01:35:21.760] Each week I come up with three science news items or facts: two real and one fake, and then I challenge my panel of skeptics to sniff out the fake.
[01:35:22.080 --> 01:35:24.480] You guys can play along at home if you want to.
[01:35:24.480 --> 01:35:26.320] I have three regular news items.
[01:35:26.320 --> 01:35:27.600] Are you guys ready?
[01:35:27.600 --> 01:35:28.160] Yes.
[01:35:28.240 --> 01:35:29.440] I think these are fun.
[01:35:29.440 --> 01:35:30.800] Okay, here we go.
[01:35:30.800 --> 01:35:40.160] Item number one: a recent analysis finds that the teeth of Komodo dragons are coated with iron to help maintain their strength and cutting edge.
[01:35:40.160 --> 01:35:54.320] I number two: an extensive study finds that for about half of the sites analyzed, the cost per ton of carbon removal is lower when just letting the land naturally regenerate than planting trees.
[01:35:54.320 --> 01:36:04.240] And item number three: a recent study finds that the ability to recognize a previously heard piece of music significantly decreases with age in older adults.
[01:36:04.240 --> 01:36:05.360] Jay, go first.
[01:36:05.360 --> 01:36:13.600] All right, this first one about this recent analysis that says that Komodo teeth, the Komodo dragon, their teeth, they're coated with iron.
[01:36:13.600 --> 01:36:15.520] Why does that sound so crazy to me?
[01:36:15.520 --> 01:36:17.680] They're coated with iron.
[01:36:17.680 --> 01:36:19.280] How could that even happen?
[01:36:19.600 --> 01:36:22.160] I mean, it's badass, and it's cool.
[01:36:22.160 --> 01:36:24.560] I think I would have heard about this before.
[01:36:24.560 --> 01:36:26.240] All right, so I'm going to put that one on hold.
[01:36:26.240 --> 01:36:29.280] I think that one has a good chance of being the fake.
[01:36:29.600 --> 01:36:41.400] Next one here: this is an extensive study that finds that for about half of the sites analyzed, the cost per ton of carbon removal is lower with just letting the land naturally regenerate than planting trees.
[01:36:41.720 --> 01:36:46.120] Oh, I mean, I wouldn't think that one is that off the mark.
[01:36:46.120 --> 01:36:48.280] That one, to me, seems like science.
[01:36:48.280 --> 01:36:55.560] The third one, a recent study, finds that the ability to recognize a previously heard piece of music significantly decreases with age in older adults.
[01:36:55.560 --> 01:36:56.280] Damn.
[01:36:56.280 --> 01:36:57.800] Well, doesn't that suck?
[01:36:57.800 --> 01:36:59.400] You know, I think that one's the fiction.
[01:36:59.400 --> 01:37:00.520] The music one?
[01:37:00.840 --> 01:37:02.120] And I'll tell you why.
[01:37:02.120 --> 01:37:09.480] Because my kids, you know, they want to hear music in the car, and every single song that they play, I end up humming it and singing it as I'm bopping around.
[01:37:09.480 --> 01:37:11.640] I don't even like the songs.
[01:37:11.960 --> 01:37:12.840] You know?
[01:37:12.840 --> 01:37:15.320] I'll sing along, like, Taylor Swift, man.
[01:37:15.640 --> 01:37:19.960] It's like, I can't forget, you know, I hear the melody and I got it in my head forever.
[01:37:19.960 --> 01:37:21.320] So I think that one's a fiction.
[01:37:21.320 --> 01:37:22.200] Okay, Bob.
[01:37:22.200 --> 01:37:24.040] Yeah, iron teeth for a Komodo.
[01:37:25.400 --> 01:37:27.480] That sounds a little sketchy.
[01:37:28.200 --> 01:37:29.160] Damn, man.
[01:37:29.240 --> 01:37:32.600] Look like, what's that dude from Moonraker?
[01:37:32.600 --> 01:37:33.160] Jaws.
[01:37:33.160 --> 01:37:35.000] With the silver jaws.
[01:37:35.000 --> 01:37:38.200] The second one, carbon removal.
[01:37:38.200 --> 01:37:40.040] That sounds reasonable to me.
[01:37:40.280 --> 01:37:41.560] Yeah, let nature handle it.
[01:37:41.720 --> 01:37:42.600] Be cheaper.
[01:37:42.600 --> 01:37:49.400] I mean, also, it might be cheaper, but maybe, you know, nature takes a lot longer.
[01:37:49.400 --> 01:37:49.960] I don't know.
[01:37:49.960 --> 01:37:56.920] It's hard for me to disagree with the basic premise that getting old really sucks.
[01:37:58.280 --> 01:38:02.200] So it's going to be hard for me to make that one the fiction.
[01:38:02.200 --> 01:38:10.600] But yeah, but I may agree with Jay here that I haven't noticed that yet, at least for that specific deficit.
[01:38:10.600 --> 01:38:13.320] It hasn't really cropped, reared its ugly head.
[01:38:13.320 --> 01:38:15.280] So, yeah, I'll agree with Jay.
[01:38:15.280 --> 01:38:20.000] I'll go with Jay, number three, the music recognition one.
[01:38:14.760 --> 01:38:20.880] Okay, Evan.
[01:38:21.200 --> 01:38:24.240] Well, I think I'm going to agree on this one as well.
[01:38:24.720 --> 01:38:27.680] I don't have any hard data to back it up.
[01:38:27.680 --> 01:38:29.840] I only have my anecdotes.
[01:38:29.840 --> 01:38:37.680] And one of them is that Jennifer's, my wife's mother, died several years ago, suffering Alzheimer's.
[01:38:37.680 --> 01:38:48.080] And one of the ways towards the end of her life that they would be able to sort of communicate and function, you know, have a point of reference was the music.
[01:38:48.480 --> 01:38:54.320] In that, yes, she didn't know who practically anybody was, names and things towards the end.
[01:38:54.320 --> 01:39:02.960] But if a piece of music came on that she was familiar with, she seemed to be able to show signs of having recognition of it.
[01:39:02.960 --> 01:39:10.880] So, you know, I understand that's just one case in my own personal experience with that.
[01:39:10.880 --> 01:39:13.440] But it lends to this being the fiction.
[01:39:13.440 --> 01:39:16.560] So, yeah, for that reason, I'm just going to go with the guys on that.
[01:39:16.560 --> 01:39:18.240] Okay, and Kara.
[01:39:18.240 --> 01:39:22.000] Yeah, I mean, iron teeth, pretty weird, but I don't know.
[01:39:23.360 --> 01:39:31.920] The one about carbon removal with natural, I guess, reforestation is complicated.
[01:39:31.920 --> 01:39:33.360] So, I don't know.
[01:39:35.120 --> 01:39:43.360] And a recent study finding that the ability to recognize a previously heard piece of music significantly decreases with age and older adults.
[01:39:43.360 --> 01:39:48.400] The way this is worded could be interpreted so many different ways.
[01:39:48.400 --> 01:39:54.080] But the thing that sticks out to me, kind of like you mentioned, Evan, it's not uncommon to use music.
[01:39:54.320 --> 01:39:57.440] I don't know, we've all seen Awakenings, right?
[01:39:57.440 --> 01:40:04.760] We've seen these like stories of people with severe neurological issues kind of finding flow again with music.
[01:39:59.760 --> 01:40:05.080] I don't know.
[01:40:05.160 --> 01:40:13.800] The thing that comes up for me is that it is well established that the older you get, the more likely you are to listen to music from your past.
[01:40:14.120 --> 01:40:15.720] Like you don't want new music.
[01:40:15.720 --> 01:40:18.200] It's like kids today, their music sucks.
[01:40:18.200 --> 01:40:25.480] And I think that like every generation ever has always said that because we actually are really nostalgic and comforted by our old music.
[01:40:25.480 --> 01:40:28.040] And I think if anything, maybe it's the opposite.
[01:40:28.040 --> 01:40:32.600] That like the older we are, the more reinforced the music we've listened to our whole lives becomes.
[01:40:32.600 --> 01:40:35.240] And we're like better at recognizing it.
[01:40:35.240 --> 01:40:38.120] So yeah, I'm going to say that that one's also the fiction.
[01:40:38.120 --> 01:40:38.440] All right.
[01:40:38.440 --> 01:40:42.200] So you guys all seem to be most, I think, in agreement on number two.
[01:40:42.200 --> 01:40:42.920] So we'll start there.
[01:40:42.920 --> 01:40:54.040] An extensive study finds that for about half of the sites analyzed, the cost per ton of carbon removal is lower when just letting the land naturally regenerate than planting trees.
[01:40:54.040 --> 01:40:59.400] You all think this one is science and this one is science.
[01:40:59.400 --> 01:41:00.120] This one is science.
[01:41:00.440 --> 01:41:00.840] Science.
[01:41:00.840 --> 01:41:08.760] Yeah, interesting, you know, that you could have a big, expensive tree planting program and doing nothing was more effective, basically.
[01:41:09.000 --> 01:41:11.640] And they examined, I think, thousands of sites.
[01:41:11.640 --> 01:41:14.120] You know, it's a very pretty extensive study.
[01:41:14.120 --> 01:41:17.320] And they were looking at mainly it was cost effectiveness.
[01:41:17.320 --> 01:41:26.280] Again, like, what's the, how much does it cost to do the program and how much carbon gets absorbed for that piece of land?
[01:41:26.280 --> 01:41:30.600] And for the, you know, the natural recovery thing, they're not doing nothing exactly.
[01:41:30.600 --> 01:41:36.680] They are sort of protecting the land and they are having to do some things, but they're just not planting trees.
[01:41:36.680 --> 01:41:40.920] And I said, it is very specific to the land itself, but it's not the same.
[01:41:40.920 --> 01:41:46.560] So it was 46% for regeneration and 54% for plantations.
[01:41:44.840 --> 01:41:51.760] So you have to pick and choose which ones you're going to plant trees on.
[01:41:52.000 --> 01:42:04.400] But they also said that if you take a combined approach where you do some planting but also let it recover naturally, that was more effective in most areas.
[01:42:04.720 --> 01:42:11.760] So it was 44% more effective than natural regeneration alone and 39% more effective than plantations alone.
[01:42:11.760 --> 01:42:15.600] So this mixed approach seems to be the best approach.
[01:42:15.600 --> 01:42:29.120] And what they also found was that the natural regeneration, when you allowed the forest to regenerate naturally rather than just planting a bunch of trees, it was better in other ways as well.
[01:42:29.120 --> 01:42:34.240] It was more biodiversity and better water management.
[01:42:34.240 --> 01:42:35.680] And it still stored a lot of wood.
[01:42:35.680 --> 01:42:40.400] The big advantage of planting trees, the plantation, was you get more wood out of it, right?
[01:42:40.400 --> 01:42:43.440] So as a source of wood, that's the way to go.
[01:42:43.440 --> 01:42:49.120] So if you're planting a tree for a forest for later harvesting, then yes, you'll want to plant trees.
[01:42:49.120 --> 01:42:57.440] But if you just want to absorb a lot of carbon, the best thing to do would be to plant some trees strategically, but let it recover mostly on its own.
[01:42:57.440 --> 01:42:59.520] Okay, let's move on to number three.
[01:42:59.520 --> 01:43:06.320] A recent study finds that the ability to recognize a previously heard piece of music significantly decreases with age in older adults.
[01:43:06.320 --> 01:43:08.880] You guys all think this one is the fiction.
[01:43:08.880 --> 01:43:12.640] Kara, you said this could be interpreted a number of different ways.
[01:43:12.640 --> 01:43:13.280] Yeah.
[01:43:13.280 --> 01:43:15.440] So let me tell you what they did.
[01:43:15.440 --> 01:43:19.960] So they had subjects of different ages, right?
[01:43:20.280 --> 01:43:23.040] Exposed them to a specific piece of music.
[01:43:23.040 --> 01:43:27.200] So it's not something necessarily that they already knew or something from their childhood.
[01:43:27.200 --> 01:43:29.360] It's a new piece of music to them.
[01:43:29.360 --> 01:43:35.240] They allowed them to hear it and they listened to it at least three times, right?
[01:43:35.240 --> 01:43:42.520] And then at a later time, they were then hearing music, different music, and this is like played by an orchestra.
[01:43:42.600 --> 01:43:45.880] The orchestra is like playing a medley, like they're going through different
Prompt 6: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
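A response matching this contract can be checked mechanically before it enters the pipeline. A minimal validation sketch (the function name, error messages, and the 3-item cap mirror the prompt's wording; nothing here is from an actual implementation):

```python
import json

def parse_key_takeaways(raw: str, max_items: int = 3) -> list[str]:
    """Parse and validate a key-takeaways response per the prompt's contract."""
    # json.loads raises JSONDecodeError (a ValueError) on trailing commas,
    # unbalanced braces, or improperly quoted strings.
    data = json.loads(raw)
    takeaways = data["key_takeaways"]
    if not isinstance(takeaways, list) or not all(isinstance(t, str) for t in takeaways):
        raise ValueError("key_takeaways must be a list of strings")
    if len(takeaways) > max_items:
        raise ValueError(f"expected at most {max_items} takeaways, got {len(takeaways)}")
    return takeaways

print(parse_key_takeaways('{"key_takeaways": ["Recognition memory is preserved with age"]}'))
```

Rejecting a malformed response at parse time is cheaper than discovering bad data downstream, which is presumably why the prompt insists on "ONLY valid JSON".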
Prompt 7: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
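The two timestamp rules above (strict HH:MM:SS format, never past the end of the audio) can also be enforced in code. A minimal sketch, assuming the total episode length is known in seconds (this episode's ~01:50:16 runtime would be 6616 seconds; the helper names are illustrative):

```python
import re

def hhmmss_to_seconds(ts: str) -> int:
    """Convert an HH:MM:SS timestamp (e.g. '01:22:45') to total seconds."""
    m = re.fullmatch(r"(\d{2}):(\d{2}):(\d{2})", ts)
    if not m:
        raise ValueError(f"bad timestamp format: {ts!r}")
    h, mi, s = (int(g) for g in m.groups())
    if mi > 59 or s > 59:
        raise ValueError(f"bad timestamp value: {ts!r}")
    return h * 3600 + mi * 60 + s

def validate_segment_start(ts: str, audio_length_s: int) -> int:
    """Reject timestamps past the end of the audio, as the prompt requires."""
    sec = hhmmss_to_seconds(ts)
    if sec > audio_length_s:
        raise ValueError(f"{ts} exceeds audio length of {audio_length_s}s")
    return sec

print(validate_segment_start("01:15:30", 6616))  # → 4530
```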
Prompt 8: Media Mentions
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
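The "use the EARLIER of the 2 timestamps" rule maps directly onto the VTT cue-range format shown in the prompt. A small helper could do the conversion (the function name is illustrative, not from an actual pipeline):

```python
def earlier_timestamp(vtt_range: str) -> str:
    """From a VTT cue range like '01:13:42.520 --> 01:13:46.720',
    return the earlier (start) timestamp, truncated to HH:MM:SS."""
    start = vtt_range.split("-->")[0].strip()
    return start.split(".")[0]  # drop the millisecond component

print(earlier_timestamp("01:13:42.520 --> 01:13:46.720"))  # → 01:13:42
```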
Prompt 9: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 3 of 3 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
erate naturally rather than just planting a bunch of trees, it was better in other ways as well.
[01:42:29.120 --> 01:42:34.240] It was more biodiversity and better water management.
[01:42:34.240 --> 01:42:35.680] And it still stored a lot of wood.
[01:42:35.680 --> 01:42:40.400] The big advantage of planting trees, the plantation, was you get more wood out of it, right?
[01:42:40.400 --> 01:42:43.440] So as a source of wood, that's the way to go.
[01:42:43.440 --> 01:42:49.120] So if you're planting a tree for a forest for later harvesting, then yes, you'll want to plant trees.
[01:42:49.120 --> 01:42:57.440] But if you just want to absorb a lot of carbon, the best thing to do would be to plant some trees strategically, but let it recover mostly on its own.
[01:42:57.440 --> 01:42:59.520] Okay, let's move on to number three.
[01:42:59.520 --> 01:43:06.320] A recent study finds that the ability to recognize a previously heard piece of music significantly decreases with age in older adults.
[01:43:06.320 --> 01:43:08.880] You guys all think this one is the fiction.
[01:43:08.880 --> 01:43:12.640] Kara, you said this could be interpreted a number of different ways.
[01:43:12.640 --> 01:43:13.280] Yeah.
[01:43:13.280 --> 01:43:15.440] So let me tell you what they did.
[01:43:15.440 --> 01:43:19.960] So they had subjects of different ages, right?
[01:43:20.280 --> 01:43:23.040] Exposed them to a specific piece of music.
[01:43:23.040 --> 01:43:27.200] So it's not something necessarily that they already knew or something from their childhood.
[01:43:27.200 --> 01:43:29.360] It's a new piece of music to them.
[01:43:29.360 --> 01:43:35.240] They allowed them to hear it and they listened to it at least three times, right?
[01:43:35.240 --> 01:43:42.520] And then at a later time, they were then hearing music, different music, and this is like played by an orchestra.
[01:43:42.600 --> 01:43:45.880] The orchestra is like playing a medley, like they're going through different things.
[01:43:45.880 --> 01:43:56.440] And then when they got to the melody from the music that they heard recently, they were to hit a button, right, to indicate, oh, yes, that's the music that I heard.
[01:43:56.440 --> 01:43:57.000] Oh, okay.
[01:43:57.000 --> 01:43:59.080] So it's not previously heard like decades.
[01:43:59.240 --> 01:44:01.640] No, it's like just as part of the study.
[01:44:01.640 --> 01:44:03.160] So does that change your mind at all?
[01:44:03.480 --> 01:44:04.120] No.
[01:44:04.440 --> 01:44:06.200] It makes me less sure.
[01:44:07.160 --> 01:44:14.520] So what they found was that age had no effect on people's ability to recognize music.
[01:44:14.680 --> 01:44:14.920] Yeah.
[01:44:15.320 --> 01:44:16.920] It was not a variable.
[01:44:16.920 --> 01:44:19.080] It's a preserved area of your brain.
[01:44:19.080 --> 01:44:19.720] Yeah.
[01:44:19.720 --> 01:44:22.840] So, but here's the thing: don't celebrate too much.
[01:44:23.320 --> 01:44:34.520] But the point of the research was: I mean, one of the research questions really is like, what types of memory tend to slow down with normal aging and which ones don't?
[01:44:34.520 --> 01:44:37.320] Because not all memory is the same, right?
[01:44:37.320 --> 01:44:39.720] There's episodic memory, for example.
[01:44:39.720 --> 01:44:43.320] There is recognition and there is recall.
[01:44:43.320 --> 01:44:50.680] Now, recall is the one that, you know, healthy, normal aging people tend to have increased difficulty with.
[01:44:51.000 --> 01:44:52.840] We're experiencing that even now.
[01:44:52.840 --> 01:44:58.120] You know, just it takes longer as you get older to think of names and to think of words and to think of things.
[01:44:58.120 --> 01:45:01.080] Like, how many times, like, remember that guy from the movie?
[01:45:01.080 --> 01:45:01.720] What's his name?
[01:45:01.720 --> 01:45:04.280] Like, that's like a common occurrence now.
[01:45:04.520 --> 01:45:07.880] So, that kind of recall does get more challenging as you get older.
[01:45:07.880 --> 01:45:08.920] Absolutely.
[01:45:08.920 --> 01:45:12.440] But recognition does not apparently diminish as you get older.
[01:45:12.440 --> 01:45:17.680] Recognition remains intact, at least according to the research, and this study supports that.
[01:45:18.000 --> 01:45:25.520] So, that's really the issue here: the types of memory, the types of processing, and which ones are affected by aging and which ones are not.
[01:45:25.520 --> 01:45:33.600] Okay, this means that a recent analysis finds that the teeth of Komodo dragons are coated with iron to help maintain their strength and cutting edge.
[01:45:33.600 --> 01:45:34.720] Is science?
[01:45:34.720 --> 01:45:35.360] This was cool.
[01:45:35.360 --> 01:45:37.680] I thought I would get somebody on this one.
[01:45:38.320 --> 01:45:41.600] So, yeah, I mean, it's not obviously like they don't look like iron.
[01:45:41.680 --> 01:45:44.160] Doesn't look like Jaws from Moonraker, James Bond.
[01:45:44.160 --> 01:45:49.280] Right, but there's a lot of iron concentrated in the coating on their teeth.
[01:45:49.280 --> 01:45:58.320] It's not just enamel, it's a different kind of substance, and it evolved to be very, very hard and resistant to dulling and breaking.
[01:45:58.320 --> 01:46:02.480] So, they have serrated teeth for ripping and tearing.
[01:46:02.480 --> 01:46:09.920] So, they, you know, it is any wear and tear on the teeth would really significantly reduce their effectiveness.
[01:46:09.920 --> 01:46:10.560] Oh, cool.
[01:46:10.560 --> 01:46:24.720] And so, having a very hard coating with a lot of iron in it on the outside of the teeth is extremely adaptive for the anatomy of the teeth, the kinds of teeth that they have, and the eating, the way they eat.
[01:46:25.120 --> 01:46:29.520] They really have to rip flesh off of their prey.
[01:46:29.760 --> 01:46:36.000] They have similar teeth to certain kinds of dinosaurs, like Tyrannosaurus Rex.
[01:46:36.000 --> 01:46:44.960] So, the question is: is this adaptation sort of unique to the Komodo dragon line?
[01:46:44.960 --> 01:46:53.120] Or do other reptiles with similar teeth also have iron in their teeth?
[01:46:53.120 --> 01:46:57.200] Now, extant, both extinct and extant.
[01:46:57.440 --> 01:47:06.760] But they say that unfortunately, because of the way fossilization works, that this layer, this iron-rich layer, would not survive on fossils.
[01:47:07.240 --> 01:47:14.120] So we don't know if T-Rex had iron teeth, but they might have had a similar kind of adaptation.
[01:47:14.120 --> 01:47:14.680] Would they have needed?
[01:47:14.840 --> 01:47:17.880] Because they had serrated teeth they used for ripping.
[01:47:18.120 --> 01:47:22.280] They had very similar teeth to Komodo dragons, curved serrated teeth.
[01:47:22.280 --> 01:47:31.560] So it's reasonable, but it could just be that it could have just been an evolutionary innovation that is pretty unique to the Komodo dragon.
[01:47:31.560 --> 01:47:33.560] Or it could be widespread as hell.
[01:47:33.560 --> 01:47:34.680] Or it could be much more widespread.
[01:47:34.840 --> 01:47:35.480] You never know.
[01:47:36.200 --> 01:47:37.080] Yeah, so interesting.
[01:47:37.240 --> 01:47:38.120] But that was fascinating.
[01:47:38.120 --> 01:47:39.960] I'm hoping they could infer it some other way.
[01:47:39.960 --> 01:47:40.680] Yeah, maybe.
[01:47:40.680 --> 01:47:42.040] It's not impossible.
[01:47:42.280 --> 01:47:48.680] They have figured out a way to identify the color of bird feathers, dinosaur feathers.
[01:47:49.240 --> 01:47:52.600] And some vocalizations, like noises perhaps, they were making.
[01:47:52.920 --> 01:47:53.400] Yeah.
[01:47:53.400 --> 01:47:53.960] Yeah.
[01:47:54.280 --> 01:47:57.080] The anatomy of their vocal box.
[01:47:57.080 --> 01:47:57.480] All right.
[01:47:57.480 --> 01:47:58.520] Well, good job, everyone.
[01:47:58.520 --> 01:48:00.200] You swept me this week.
[01:48:00.200 --> 01:48:00.920] That's been a little bit.
[01:48:01.400 --> 01:48:03.720] Didn't get you on the iron teeth.
[01:48:03.720 --> 01:48:05.560] All right, Evan, give us a quote.
[01:48:05.880 --> 01:48:13.400] One of my biggest pet peeves is when people use science that they don't understand to try to justify their stupidity and hate.
[01:48:14.520 --> 01:48:15.560] Forrest.
[01:48:15.720 --> 01:48:19.400] Yeah, Forrest Valkai said that in an interview.
[01:48:19.400 --> 01:48:20.280] And who's Forrest?
[01:48:20.440 --> 01:48:21.800] Who is Forrest Valkai?
[01:48:21.800 --> 01:48:24.920] Well, he's a scientist and an educator.
[01:48:25.240 --> 01:48:39.720] He is a biologist, science communicator, uses his platform mainly on YouTube, Instagram, to teach exciting science lessons, promote compassion and skepticism, and share his boundless love for life with audiences of all ages.
[01:48:39.720 --> 01:48:47.120] And he is going to be sharing those very things with us when we meet him at CSICon this coming October.
[01:48:48.080 --> 01:48:49.280] Oh, cool.
[01:48:49.280 --> 01:48:51.280] Yeah, that's kind of a peeve of mine, too.
[01:48:44.600 --> 01:48:51.920] I know.
[01:48:53.200 --> 01:48:57.680] The whole using science you don't understand, like you could stop there.
[01:48:57.680 --> 01:48:59.360] Like, that bugs me.
[01:48:59.360 --> 01:48:59.760] Yeah.
[01:48:59.760 --> 01:49:02.240] But then to justify something hateful, come on.
[01:49:02.240 --> 01:49:04.480] Or stupid or pseudoscientific or whatever.
[01:49:05.040 --> 01:49:07.120] That's basically all of social media.
[01:49:07.120 --> 01:49:07.520] Yeah.
[01:49:07.680 --> 01:49:14.720] Yeah, like we're doing a lot of TikTok response videos and talking a lot about some of the crazy stuff on TikTok.
[01:49:14.720 --> 01:49:16.960] And it's a lot of it falls into this category.
[01:49:17.440 --> 01:49:29.040] You're talking about something you really don't understand and you're using it as a way of justifying some ridiculous belief, whether it's a conspiracy theory or some snake oil or something pseudoscientific.
[01:49:29.040 --> 01:49:29.840] Quantum mechanics.
[01:49:29.840 --> 01:49:33.760] Or something, yeah, or some bigoted opinion or whatever.
[01:49:33.840 --> 01:49:35.520] Yeah, it is very frustrating.
[01:49:35.520 --> 01:49:36.240] All right, guys.
[01:49:36.240 --> 01:49:38.160] Well, thank you all for joining me this week.
[01:49:38.160 --> 01:49:38.640] Thank you, Steve.
[01:49:38.800 --> 01:49:40.000] Got it, Steve.
[01:49:40.000 --> 01:49:44.320] And until next week, this is your Skeptic's Guide to the Universe.
[01:49:46.960 --> 01:49:53.600] The Skeptic's Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking.
[01:49:53.600 --> 01:49:58.240] For more information, visit us at the skepticsguide.org.
[01:49:58.240 --> 01:50:02.160] Send your questions to info at the skepticsguide.org.
[01:50:02.160 --> 01:50:12.880] And if you would like to support the show and all the work that we do, go to patreon.com/SkepticsGuide and consider becoming a patron and becoming part of the SGU community.
[01:50:12.880 --> 01:50:16.480] Our listeners and supporters are what make SGU possible.
Prompt 10: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 11: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
Prompt 12: Media Mentions
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
Full Transcript
[00:00:00.080 --> 00:00:03.040] Mint is still $15 a month for premium wireless.
[00:00:03.040 --> 00:00:07.440] And if you haven't made the switch yet, here are 15 reasons why you should.
[00:00:07.440 --> 00:00:09.520] One, it's $15 a month.
[00:00:09.520 --> 00:00:12.480] Two, seriously, it's $15 a month.
[00:00:12.480 --> 00:00:14.320] Three, no big contracts.
[00:00:14.320 --> 00:00:15.280] Four, I use it.
[00:00:15.280 --> 00:00:16.480] Five, my mom uses it.
[00:00:16.480 --> 00:00:17.760] Are you- Are you playing me off?
[00:00:17.840 --> 00:00:19.200] That's what's happening, right?
[00:00:19.200 --> 00:00:22.640] Okay, give it a try at mintmobile.com slash switch.
[00:00:22.640 --> 00:00:25.840] Upfront payment of $45 for three-month plan, $15 per month equivalent required.
[00:00:25.840 --> 00:00:27.040] New customer offer first three months only.
[00:00:27.040 --> 00:00:28.240] Then full price plan options available.
[00:00:28.240 --> 00:00:29.120] Taxes and fees extra.
[00:00:29.120 --> 00:00:30.720] See Mintmobile.com.
[00:00:33.280 --> 00:00:36.560] You're listening to the Skeptic's Guide to the Universe.
[00:00:36.560 --> 00:00:39.440] Your escape to reality.
[00:00:40.080 --> 00:00:42.720] Hello and welcome to The Skeptic's Guide to the Universe.
[00:00:42.720 --> 00:00:47.600] Today is Wednesday, July 24th, 2024, and this is your host, Stephen Novella.
[00:00:47.600 --> 00:00:49.520] Joining me this week are Bob Novella.
[00:00:49.520 --> 00:00:50.080] Hey, everybody.
[00:00:50.080 --> 00:00:51.440] Kara Santa Maria.
[00:00:51.440 --> 00:00:51.920] Howdy.
[00:00:51.920 --> 00:00:52.880] Jay Novella.
[00:00:52.880 --> 00:00:53.600] Hey, guys.
[00:00:53.600 --> 00:00:55.200] And Evan Bernstein.
[00:00:55.200 --> 00:00:56.320] Good evening, everyone.
[00:00:56.320 --> 00:01:03.760] So I want to start off by thanking the millions of people who emailed me about this past New York Times Sunday crossword puzzle.
[00:01:03.760 --> 00:01:04.640] Oh, boy.
[00:01:04.960 --> 00:01:06.640] Which I've been doing for a while.
[00:01:06.640 --> 00:01:11.680] The, you know, the New York Times crossword puzzle, my daily sort of bathroom game routine.
[00:01:11.680 --> 00:01:13.600] And the Sunday ones are tough.
[00:01:13.600 --> 00:01:14.640] Well, they're just big.
[00:01:14.640 --> 00:01:16.880] Yeah, they're a little, they're definitely the clues are more difficult.
[00:01:16.880 --> 00:01:17.600] They're a little bit more difficult.
[00:01:17.840 --> 00:01:19.200] Well, Sunday is supposed to be...
[00:01:19.200 --> 00:01:20.400] I've done a lot of research on this.
[00:01:20.400 --> 00:01:22.640] Sunday is supposed to be the same difficulty as Wednesday.
[00:01:22.640 --> 00:01:23.360] Is that right?
[00:01:23.360 --> 00:01:25.040] So it starts Monday, which is the easiest.
[00:01:25.200 --> 00:01:27.840] On Monday, yeah, I guess I always see the contrast between Sunday and Monday.
[00:01:27.840 --> 00:01:28.400] Monday's.
[00:01:28.480 --> 00:01:29.440] That's why, yeah.
[00:01:29.680 --> 00:01:31.680] The hardest puzzle is Saturday.
[00:01:31.680 --> 00:01:38.000] And then it's, I think it dates back to when people would only get the Sunday puzzle because they would only get the paper on Sunday.
[00:01:38.000 --> 00:01:41.360] So they tried to make that one right in the middle, but it's much larger.
[00:01:41.360 --> 00:01:42.000] Right.
[00:01:42.240 --> 00:01:43.280] So it takes a lot longer to do.
[00:01:43.440 --> 00:01:45.920] There was a theme this past Sunday.
[00:01:45.920 --> 00:01:46.720] And the theme was.
[00:01:46.960 --> 00:01:49.120] The theme was logical fallacies.
[00:01:49.120 --> 00:01:51.440] Yeah, it was so exciting when I was doing it.
[00:01:52.400 --> 00:01:53.200] I know the answer.
[00:01:53.440 --> 00:01:54.800] Wait, formal or informal?
[00:01:55.120 --> 00:01:55.600] Informal.
[00:01:55.600 --> 00:01:56.400] Informal, logical fallacy.
[00:01:56.480 --> 00:01:57.760] Those are the best ones.
[00:01:57.760 --> 00:02:05.400] It was great because you have like these five, whatever, six clues that span the whole puzzle because they're long answers that were really easy for us to get.
[00:02:06.280 --> 00:02:09.080] And it just sort of sets up the whole crossword puzzle.
[00:02:09.080 --> 00:02:11.080] So it was, yeah, it was sweet.
[00:02:11.080 --> 00:02:11.640] Yeah.
[00:02:13.560 --> 00:02:15.880] You know, like one of them was no true Scotsman.
[00:02:15.880 --> 00:02:16.920] Like that was the answer.
[00:02:16.920 --> 00:02:18.680] And for us, it was like, oh, no true Scotsman.
[00:02:18.680 --> 00:02:18.760] Yeah.
[00:02:18.920 --> 00:02:20.360] Post hoc ergo propter hoc.
[00:02:20.920 --> 00:02:21.400] Yeah.
[00:02:21.400 --> 00:02:24.760] It took a second, though, because the clues were so weird.
[00:02:24.760 --> 00:02:25.320] Well, they didn't.
[00:02:25.320 --> 00:02:27.960] Like, they just would be like, this is the best clue.
[00:02:27.960 --> 00:02:30.600] If you don't like this clue, you don't like crossword puzzles.
[00:02:30.760 --> 00:02:33.960] Yeah, they just gave like examples all self-referential.
[00:02:33.960 --> 00:02:34.360] Yeah.
[00:02:34.840 --> 00:02:37.640] But then once you got it, you're like, oh my God, this is so obvious.
[00:02:37.640 --> 00:02:41.480] And they even threw in a Lord of the Rings reference in one of the clues, too.
[00:02:41.480 --> 00:02:42.040] Really?
[00:02:42.040 --> 00:02:42.440] Yeah.
[00:02:42.440 --> 00:02:44.120] Yeah, those always screw me.
[00:02:45.560 --> 00:02:47.960] That was just a bonus easy one for me.
[00:02:47.960 --> 00:02:53.800] You can always text us, Kara, if a Star Wars or a Star Trek or Lord of the Rings comes.
[00:02:54.040 --> 00:03:03.720] Well, okay, so this is very nerdy, but one of my best friends who does not live in the same town as me, she and I do the crossword together every night.
[00:03:04.040 --> 00:03:05.720] It's like our bonding time.
[00:03:05.720 --> 00:03:07.480] And she does it digitally.
[00:03:07.480 --> 00:03:11.800] So she does it on her phone, and I do it auditorily.
[00:03:11.800 --> 00:03:13.480] Like I'm never looking at it.
[00:03:13.960 --> 00:03:17.240] So she just reads me the clues and then tells me how many letters.
[00:03:17.240 --> 00:03:19.720] And then collectively we always get them done.
[00:03:20.120 --> 00:03:20.600] Wait a minute.
[00:03:20.920 --> 00:03:22.280] That's challenging, Kara.
[00:03:22.440 --> 00:03:22.680] It is.
[00:03:23.160 --> 00:03:27.720] Because you don't have the benefit of seeing certain letters in certain positions that have already been established.
[00:03:27.720 --> 00:03:29.960] Yeah, it's difficult, but it's fun.
[00:03:30.280 --> 00:03:35.480] And because we do it together, we always finish because it's like, you know, obviously two heads are better than one on a crossroads.
[00:03:35.640 --> 00:03:37.400] Yeah, often it's my wife and I do it together.
[00:03:37.400 --> 00:03:43.480] If we're driving on a long trip, I'll read the puzzle while she's driving because it keeps her engaged and everything.
[00:03:43.480 --> 00:03:44.720] It is telling you you have to describe it.
[00:03:44.720 --> 00:03:47.440] It's the seven letters, the third letter is, oh, you know.
[00:03:47.440 --> 00:03:47.920] Yeah.
[00:03:44.520 --> 00:03:49.280] But it isn't the same thing.
[00:03:49.440 --> 00:03:50.240] It's really challenging, right?
[00:03:51.360 --> 00:03:54.080] And we have different skills, you know, my friend and I.
[00:03:54.080 --> 00:03:56.320] And so she'll be like, this one's one for you, you know.
[00:03:56.320 --> 00:03:57.360] And then we have this joke.
[00:03:57.360 --> 00:04:10.560] I can't believe I'm saying this on air, but every time we can't, every time we don't know the answer, especially if it's like a, I don't know, a reference to some obscure geographical thing or if it's like a Shakespeare reference or something, we're like, uncultured swine.
[00:04:13.120 --> 00:04:14.960] I don't know.
[00:04:16.240 --> 00:04:19.200] How dare you not know your Tolstoy by heart?
[00:04:19.200 --> 00:04:20.000] I know it.
[00:04:20.000 --> 00:04:21.920] So here's a life hack for you guys.
[00:04:21.920 --> 00:04:22.400] Okay.
[00:04:22.480 --> 00:04:23.200] Thank you.
[00:04:23.200 --> 00:04:30.320] If ever you're, you know, out of town, let's say in a remote area, make sure you have a set of backup keys for your car.
[00:04:31.280 --> 00:04:35.600] You mean like if you were, say, in a theoretical place like New Hampshire?
[00:04:35.600 --> 00:04:41.040] In New Hampshire, like northern New Hampshire, by a pond that was really remote.
[00:04:41.040 --> 00:04:43.360] Oh, so you don't have like limited cell phones.
[00:04:43.520 --> 00:04:44.720] You don't have cell reception.
[00:04:44.720 --> 00:04:45.440] Oh, no.
[00:04:45.440 --> 00:04:50.880] Well, we have Starlink there, so we're plugged into the world that way, as long as we have power.
[00:04:51.120 --> 00:04:55.360] But yeah, cell phones are useless without that in terms of connecting to anything.
[00:04:55.360 --> 00:05:00.800] So we were up there this past weekend trying to get it ready, you know, because there's a lot of work to do on it.
[00:05:00.800 --> 00:05:04.800] And my wife needed to borrow the keys for some reason to get into the car to get the whatever.
[00:05:04.800 --> 00:05:09.920] And then like basically, it's getting like three, four o'clock in the afternoon on Sunday.
[00:05:09.920 --> 00:05:11.920] We're just about ready to come home.
[00:05:12.320 --> 00:05:13.680] Cannot find the keys.
[00:05:13.680 --> 00:05:14.160] Yeah.
[00:05:15.120 --> 00:05:16.160] And we have one set.
[00:05:16.120 --> 00:05:16.960] It's like it's a fob.
[00:05:17.040 --> 00:05:19.760] You know, it's a very keyless, you know, it's like your push button.
[00:05:20.080 --> 00:05:20.720] Fob.
[00:05:22.400 --> 00:05:23.680] Don't give it away, Bob.
[00:05:24.320 --> 00:05:32.440] And the upshot is: you know, my wife had the key on her person and must have dropped it at some point during the day.
[00:05:29.840 --> 00:05:36.440] But it could have been any time over the last several hours.
[00:05:36.760 --> 00:05:39.720] Much of that time she spent on the pond.
[00:05:39.720 --> 00:05:43.480] So yeah, we figured 99% it's in the water.
[00:05:43.480 --> 00:05:44.840] Like that thing is gone.
[00:05:44.840 --> 00:05:47.080] Yeah, that we could not turn it up anywhere.
[00:05:47.080 --> 00:05:47.960] Okay, so what do you do?
[00:05:47.960 --> 00:05:48.120] Right?
[00:05:48.120 --> 00:05:49.640] What are the options at this point in time?
[00:05:49.640 --> 00:05:50.360] You know, I got to get home.
[00:05:50.360 --> 00:05:51.400] I've got to get to work on Monday.
[00:05:51.800 --> 00:05:53.160] Can it be remote started?
[00:05:53.160 --> 00:05:57.320] So theoretically, there are cars that have that feature.
[00:05:57.320 --> 00:05:59.640] This has the Acura link, right?
[00:05:59.640 --> 00:06:01.000] I check into that.
[00:06:01.080 --> 00:06:09.480] Turns out that the 3G on which it is based was discontinued and it's not available at all by hook or by crook.
[00:06:09.480 --> 00:06:11.160] Like the 3G is not available.
[00:06:11.560 --> 00:06:13.160] Like the bandwidth doesn't exist anymore.
[00:06:13.160 --> 00:06:15.960] Like Acura's like, well, I guess that service is over.
[00:06:17.960 --> 00:06:19.880] I hope you're not paying for that service anymore.
[00:06:19.880 --> 00:06:20.600] Well, I wasn't.
[00:06:20.600 --> 00:06:21.960] I was going to do whatever.
[00:06:21.960 --> 00:06:22.680] Just start it up.
[00:06:22.680 --> 00:06:23.960] I was like, how do I sign up for this?
[00:06:24.600 --> 00:06:30.040] And basically, the end of my hour of trying to solve this problem was it doesn't exist anymore.
[00:06:30.040 --> 00:06:34.440] So then, yeah, so then we're just like, all right, we've got to get an emergency locksmith, right?
[00:06:34.520 --> 00:06:37.800] We're doing this all, obviously, you know, at the same time.
[00:06:37.800 --> 00:06:42.280] And just after about a dozen connections, we have AAA, you know, everything.
[00:06:42.280 --> 00:06:43.720] We did everything we could.
[00:06:43.720 --> 00:06:45.960] There basically was nobody available.
[00:06:45.960 --> 00:06:46.520] No.
[00:06:46.520 --> 00:06:49.800] Yeah, so okay, well, we've got to rent a car to get back home.
[00:06:49.800 --> 00:06:52.120] We'll have to come back another weekend to pick up the car.
[00:06:52.120 --> 00:06:55.640] There was no rent-a-car available in northern New Hampshire.
[00:06:55.640 --> 00:06:58.440] Like, we could not get a rent-a-car.
[00:06:58.440 --> 00:06:59.480] So we were stuck.
[00:06:59.480 --> 00:07:03.640] We were literally stuck for want of this little, stupid little fob.
[00:07:03.640 --> 00:07:05.560] And there was no way to get that car started.
[00:07:05.560 --> 00:07:08.120] I mean, I wasn't going to hotwire it, you know what I mean?
[00:07:08.120 --> 00:07:11.240] But short of that, there was basically no way to do it.
[00:07:11.880 --> 00:07:13.320] I had to take the next day off from work.
[00:07:13.320 --> 00:07:14.880] We had to come home the next day.
[00:07:14.040 --> 00:07:16.480] So, you rented a car the next day?
[00:07:16.480 --> 00:07:18.800] No, there still wasn't a car available.
[00:07:14.440 --> 00:07:20.880] So, we had to rent a Bob.
[00:07:21.200 --> 00:07:21.840] A what?
[00:07:22.400 --> 00:07:23.280] Bob doesn't come cheap.
[00:07:23.920 --> 00:07:28.480] Bob had to drive up four hours each way.
[00:07:28.480 --> 00:07:30.480] Is that like a Bobby cab like they had on Mars?
[00:07:30.480 --> 00:07:31.440] Oh, that was a Johnny cab.
[00:07:31.520 --> 00:07:32.240] Johnny cab, yeah.
[00:07:33.040 --> 00:07:37.040] No, Bob had Bob, God bless him, had to come all the way up there and pick us up and bring us away.
[00:07:37.760 --> 00:07:38.480] Bob, that's rough.
[00:07:38.480 --> 00:07:41.760] I drove four hours this weekend to go camping, but I got to go camping.
[00:07:41.760 --> 00:07:43.360] Well, we did buy Bob a hamburger.
[00:07:43.360 --> 00:07:44.080] That's nice.
[00:07:44.320 --> 00:07:44.720] Two.
[00:07:44.720 --> 00:07:45.120] Two.
[00:07:45.120 --> 00:07:45.760] Two hamburgers.
[00:07:45.920 --> 00:07:46.720] Two of them, actually.
[00:07:46.720 --> 00:07:50.880] Well, he called me the night before, and I knew that they, you know, what was going on.
[00:07:50.880 --> 00:07:52.800] So he called at like six or seven.
[00:07:52.800 --> 00:08:01.040] And my first thought was, oh, crap, they're going to ask me to come right now, which means I probably wouldn't get back till like 3 a.m.
[00:08:01.200 --> 00:08:03.840] And I was like, please don't ask me to pick you up now.
[00:08:03.840 --> 00:08:06.000] And they're like, no, but tomorrow morning?
[00:08:06.000 --> 00:08:08.000] And like, all right, all right.
[00:08:08.320 --> 00:08:11.120] So no spare key to bring to you?
[00:08:11.760 --> 00:08:14.240] So listen, the car's 10 years old, right?
[00:08:14.480 --> 00:08:17.760] We lost that second key at some point along the line.
[00:08:18.160 --> 00:08:18.960] We didn't lose it.
[00:08:18.960 --> 00:08:20.480] We just sort of misplaced it.
[00:08:20.480 --> 00:08:22.400] It's somewhere in the house.
[00:08:22.400 --> 00:08:28.720] We never, I said, yeah, we've got to track that down at some point, but we didn't have it on us, is the bottom line.
[00:08:28.720 --> 00:08:31.760] And now we really, you know, we don't know where that second key is.
[00:08:31.760 --> 00:08:33.680] And, you know, this was definitely our bad.
[00:08:33.920 --> 00:08:34.560] That's what I'm saying.
[00:08:34.560 --> 00:08:35.760] Like, we just weren't on it.
[00:08:36.000 --> 00:08:40.080] I had no idea how difficult it was going to be to replace this key.
[00:08:40.080 --> 00:08:45.840] The other thing is, like, if you don't have one of the keys, you can't program a new key.
[00:08:45.840 --> 00:10:32.400] The only way to program this key is you need to either have it towed to a dealer, or you need to find an auto locksmith who has both a blank, right, one of these ancient, fucking 10-year-old fobs, and also has the computer equipment to do it and the software to do it. So we did eventually find somebody, although they didn't get there till like Monday night, when we were already back in Connecticut. So the end of the story is: they were able to program a key, but it was a saga for them to do it.
Was it expensive?
500 bucks.
Oh, God.
Yeah, so that's an expensive "I misplaced my key." So now I'm gonna make like two backups, so I always have an extra one on me. Never gonna be in that situation again.
Plus, Steve, you might not be able to find another fob for that car.
Oh, yeah, we'll be able to get it. Now that you have the car, you know, and you have a key, you could get them on Amazon or whatever.
I could also just bring it to the dealer and they'll make up keys for me. So that's what I'm gonna do. I can't exist with one key anymore, you know, now that I know that all the other options are shut down. I thought, oh, the AcuraLink or whatever, there was something, AAA will figure it out, whatever. But there was nothing. Nothing.
Yikes.
Yeah, it was terrible. That sense of when the realization settled, like: we are stuck here, we are not getting home tonight. I hate that feeling.
Yep.
Was anybody affected by the, like, global Microsoft outage?
Not me personally.
Are you guys affected by that?
No.
At work, we were affected. Yeah, there were certain systems we could not access till about 11 in the morning.
Yeah, we had like 100, 140 computers go down.
[00:10:32.560 --> 00:10:46.720] Just talking about that today with ian we're talking a little bit on the the wednesday TikTok live stream about the fact that cyber security and everything to do with the internet and whatever is such a huge part of our lives now.
[00:10:47.040 --> 00:10:49.760] And we're so dependent and vulnerable.
[00:10:50.000 --> 00:10:51.680] Oh, vulnerable in every way.
[00:10:52.400 --> 00:10:57.760] We need to be putting way more resources into locking this down than we are.
[00:10:57.760 --> 00:11:03.280] I do think this is like this should be a cabinet-level position or equivalent.
[00:11:03.280 --> 00:11:07.280] You know, something like the FDA or the EPA or whatever.
[00:11:07.280 --> 00:11:24.320] A massive organization dedicated to figuring out how to, you know, how to keep us secure, keep this kind of vulnerability from happening, building in resilience and redundancy before civilization collapses because of a rogue update from one company.
[00:11:24.320 --> 00:11:25.360] You know what I mean?
[00:11:27.200 --> 00:11:28.160] And where's their?
[00:11:28.160 --> 00:11:28.960] Yeah, I haven't heard that.
[00:11:29.120 --> 00:11:30.960] This type of error, though, would be tough to.
[00:11:31.280 --> 00:11:37.440] I mean, that's just, you know, this was like one bit of testing software that they use.
[00:11:37.440 --> 00:11:42.560] I mean, so would you have government controlling that granularly?
[00:11:42.560 --> 00:11:43.280] I don't know, man.
[00:11:43.920 --> 00:11:46.080] Just setting standards, doing reviews.
[00:11:46.080 --> 00:11:46.320] Yeah.
[00:11:46.560 --> 00:11:53.920] You know, are you going to say, are you going to have the FDA controlling what drugs we can take that granularly?
[00:11:53.920 --> 00:11:54.560] Yes.
[00:11:54.560 --> 00:11:55.520] Yes, we are.
[00:11:55.520 --> 00:11:56.720] That's how it works.
[00:11:56.720 --> 00:12:02.960] You have a system in place where you have to prove that what you're doing is safe and effective, you know?
[00:12:03.280 --> 00:12:09.600] I heard the patch required a correction at each computer.
[00:12:09.600 --> 00:12:10.400] Yeah, it was manual.
[00:12:10.880 --> 00:12:12.880] You had to delete a system file.
[00:12:13.360 --> 00:12:19.840] You had to log into your computer in safe mode, navigate around, and then and then delete multiple system files.
[00:12:19.840 --> 00:12:20.640] Oh, what a pain in the ass.
[00:12:20.720 --> 00:12:23.600] So, this is what, and right, and there were millions of these computers or something.
[00:12:23.840 --> 00:12:32.520] Yeah, I know that at the hospital where I work, the emails were like: if your computer's been affected, bring it to this place at this time so that we can fix it.
[00:12:29.840 --> 00:12:34.360] So, it's yeah, individual computer level.
[00:12:34.680 --> 00:12:40.600] And I hear the airlines have not yet caught up from the backlog that they suffered last Friday.
[00:12:40.760 --> 00:12:42.200] What a nightmare.
[00:12:42.200 --> 00:12:46.120] Airlines are very vulnerable because that's like you're keeping 100 plates spinning, right?
[00:12:46.120 --> 00:12:50.520] That's like that is an industry that is very intolerant to hiccups.
[00:12:50.520 --> 00:12:55.640] If that conveyor belt locks up, like Lucy with the chocolates, that's it.
[00:12:55.640 --> 00:12:58.680] You're going to have a mountain of chocolate that you have to deal with.
[00:12:58.920 --> 00:13:02.600] One more thing to chat about before we go on to the formal section.
[00:13:02.600 --> 00:13:09.720] So, we, according to scientists, those pesky scientists, we just experienced the hottest day on record.
[00:13:09.720 --> 00:13:10.200] Yep.
[00:13:10.200 --> 00:13:11.400] Yep, Sunday.
[00:13:11.400 --> 00:13:11.880] Yep.
[00:13:11.880 --> 00:13:13.640] Collectively or globally.
[00:13:13.640 --> 00:13:14.120] Globally.
[00:13:14.120 --> 00:13:18.760] Like, the global average temperature was higher than anything previously recorded.
[00:13:18.760 --> 00:13:19.480] Recorded.
[00:13:19.480 --> 00:13:19.880] Let's see.
[00:13:19.880 --> 00:13:24.520] The new record high is 17.15 degrees Celsius.
[00:13:24.520 --> 00:13:27.480] And governments are scrambling to fix it.
[00:13:27.480 --> 00:13:30.120] I bet we break that record again at some point this year.
[00:13:30.120 --> 00:13:30.760] Yeah.
[00:13:31.080 --> 00:13:33.160] Maybe, yeah, the year, yeah, the year is not done.
[00:13:33.160 --> 00:13:36.280] But what I didn't realize is that this only goes back to 1940.
[00:13:37.400 --> 00:13:40.600] It's the Copernicus Climate Change Service.
[00:13:40.600 --> 00:13:41.000] Yeah.
[00:13:41.320 --> 00:13:43.720] And their record books go back to 1940.
[00:13:43.800 --> 00:13:44.200] Yeah, yeah.
[00:13:44.360 --> 00:13:48.280] It's the hottest in 80 years, but it's not really the hottest in all history.
[00:13:48.840 --> 00:13:49.960] Oh, sure, of course not.
[00:13:50.280 --> 00:13:50.600] I mean, not in the entire history of the Earth.
[00:13:50.680 --> 00:13:54.200] But when you look at all the headlines, it's like hottest day ever recorded.
[00:13:54.280 --> 00:13:55.400] And it's like, yeah, that implies.
[00:13:55.960 --> 00:14:00.760] I know, but that implies like a gut reaction that is like the word matters.
[00:14:00.760 --> 00:14:06.080] I think most people think maybe turn of the century, 1900 is roughly about where those things start.
[00:14:06.080 --> 00:14:08.520] Even though they were measuring temperatures in the 1800s.
[00:14:08.520 --> 00:14:15.920] Yeah, I'm assuming people assume that it was from when we first started measuring temperature, but we didn't have global averages then.
[00:14:14.840 --> 00:14:21.440] Yeah, and these top 10 hottest days on record are all in the last 10 years.
[00:14:23.040 --> 00:14:25.120] Wait, that's just a coincidence, isn't it?
[00:14:25.280 --> 00:14:32.160] And this year and last year, like the last two were peaking way up above the line, you know what I mean?
[00:14:32.160 --> 00:14:33.120] Yep.
[00:14:33.120 --> 00:14:40.800] So, I mean, yeah, it's just, you know, at this point, it's undeniable, even though people deny it, that the Earth is in fact warming.
[00:14:40.800 --> 00:14:44.560] It's not even that trendy anymore to deny it.
[00:14:45.120 --> 00:14:53.920] Like, there are still climate change deniers, no doubt, but, you know, let's say the party of denial is no longer defined by that.
[00:14:53.920 --> 00:14:56.800] Well, they do the motte-and-bailey defense, right?
[00:14:56.800 --> 00:15:05.280] Which is when they'll say, oh, yeah, I mean, yes, the Earth is warming and, you know, man-made activity may be contributing to it, but there's nothing we could do about it.
[00:15:05.280 --> 00:15:07.520] And who's to say it's going to be a bad thing, right?
[00:15:07.840 --> 00:15:17.520] Or, but then when they think they can get away with it, you know, like if there's an anomalous cold day or something, they'll say, see, the Earth isn't really warming.
[00:15:17.520 --> 00:15:24.160] But then they'll storm forward, you know, again, if they, when they're very opportunistic, right?
[00:15:24.160 --> 00:15:27.840] So if they think they can get away with it, they'll deny every aspect of global warming.
[00:15:27.840 --> 00:15:32.960] But when push comes to shove, they'll retreat to the safest position, which is whatever.
[00:15:32.960 --> 00:15:38.080] Well, we can't prove it's all man-made, or there's nothing we can do about it, or, you know, you don't know what's going to be.
[00:15:38.160 --> 00:15:38.560] Yeah.
[00:15:38.560 --> 00:15:39.840] Because it's undeniable.
[00:15:39.840 --> 00:15:40.160] Right.
[00:15:40.160 --> 00:15:42.800] When they're in a situation where they can't deny it.
[00:15:42.800 --> 00:15:44.800] When they're in a situation where they can deny it, they will.
[00:15:44.800 --> 00:15:46.240] So they often will do that.
[00:15:46.240 --> 00:15:46.880] All right.
[00:15:46.880 --> 00:15:49.040] Let's go on with the actual show.
[00:15:49.040 --> 00:15:50.160] Kara.
[00:15:50.160 --> 00:15:51.920] You're going to do a what's the word?
[00:15:51.920 --> 00:15:52.400] I am.
[00:15:52.400 --> 00:15:55.520] I was brainstorming words the other day.
[00:15:55.760 --> 00:16:00.600] Actually, while I was doing the crossword, I have to give a shout out to my friend Sarah because this was her recommendation.
[00:16:00.600 --> 00:16:01.560] It was so good.
[00:16:01.560 --> 00:16:03.000] That's my crossword friend.
[00:15:59.920 --> 00:16:05.400] The word is calculus.
[00:16:05.960 --> 00:16:12.040] I love this word because its etymology is fascinating to me.
[00:16:12.040 --> 00:16:18.120] So, calculus, we all know, is a field or a method within mathematics.
[00:16:18.120 --> 00:16:23.080] It's a method of computation or calculation that uses special notation.
[00:16:23.080 --> 00:16:32.920] When we talk about calculus as a field of mathematics, what we're often talking about is continuously changing values, right?
[00:16:32.920 --> 00:16:34.920] Vectors, things of that nature.
[00:16:35.480 --> 00:16:41.560] We'll also use that same word, calculus, in, I guess, maybe a more literary or practical sense.
[00:16:41.560 --> 00:16:42.760] Like, I very often use it.
[00:16:42.760 --> 00:16:50.360] I'll say, you know, the calculus here is very difficult, you know, the way that I'm kind of, the algorithm I'm using to judge this situation.
[00:16:50.600 --> 00:16:53.400] But it also has medical definitions.
[00:16:53.400 --> 00:16:56.440] Steve, do you often talk about calculi?
[00:16:56.520 --> 00:16:56.920] Calculi.
[00:16:57.240 --> 00:16:59.560] Because you focus on the brain sometimes.
[00:17:00.040 --> 00:17:00.280] We do.
[00:17:00.280 --> 00:17:02.600] We do talk about calculi, calculus, yeah.
[00:17:02.600 --> 00:17:05.000] In terms of calcification, yeah.
[00:17:05.000 --> 00:17:05.400] Exactly.
[00:17:05.720 --> 00:17:11.320] It's a mineral substance that forms, you know, a hard kind of calculus of things.
[00:17:11.560 --> 00:17:16.760] And also, if you work in the dental field, apparently that's also used as another word for tartar.
[00:17:16.760 --> 00:17:20.520] So your calcified tartar on your teeth becomes calculus.
[00:17:20.840 --> 00:17:24.360] So what do you think came first?
[00:17:24.360 --> 00:17:29.400] When we think about this word calculus, where does it come from?
[00:17:29.400 --> 00:17:32.120] Any idea as to the root of the word?
[00:17:32.120 --> 00:17:36.680] And do you think it was first a math term or first a medical term?
[00:17:36.680 --> 00:17:38.600] I'm going to say medical.
[00:17:38.840 --> 00:17:39.960] Math, I will say.
[00:17:39.960 --> 00:17:40.840] I'll say math.
[00:17:41.160 --> 00:17:43.640] Okay, so three to one, math to medical.
[00:17:43.640 --> 00:17:46.320] And the answer is it's complicated.
[00:17:46.320 --> 00:17:47.040] Oh, boy.
[00:17:44.920 --> 00:17:48.960] So we're all right and wrong.
[00:17:49.280 --> 00:17:59.440] Because even though it was first used in the 1600s for math, it comes from the root that means pebble.
[00:17:59.440 --> 00:18:04.640] Because early on, calculations were performed using abaci.
[00:18:05.440 --> 00:18:16.160] And the stones on the abacus appear to be the root of the utilization of that term for mathematics, which then later was utilized in medicine.
[00:18:16.160 --> 00:18:17.920] Right, so it means stone.
[00:18:17.920 --> 00:18:24.000] It does mean used to mean math because of the abacus and medicine when you have a stone-like thing in medicine.
[00:18:24.160 --> 00:18:27.440] Do they still teach children what an abacus is and how to use one?
[00:18:27.440 --> 00:18:28.800] I learned when I was young.
[00:18:28.800 --> 00:18:36.960] I never learned, but I will tell you, Evan, one of my favorite weird YouTube rabbit holes is watching mental abacus competitions.
[00:18:36.960 --> 00:18:39.520] Oh, it's amazing.
[00:18:39.520 --> 00:18:41.200] Oh, yeah, they're moving their fingers in the air.
[00:18:41.440 --> 00:18:44.960] They move their fingers in the air and then do complex calculations.
[00:18:45.520 --> 00:18:46.320] Yeah, it's fascinating.
[00:18:46.640 --> 00:18:48.960] Kara, you watch that like you watch a sport?
[00:18:49.280 --> 00:18:56.000] I just sometimes on YouTube, if I start watching it, I'll click through a lot of different videos because they're fascinating.
[00:18:56.000 --> 00:18:57.200] It's very cool.
[00:18:57.200 --> 00:18:59.440] But no, I don't actually know how to use an abacus.
[00:18:59.440 --> 00:19:01.120] I had one as a kid, but I don't think I've learned how to use it.
[00:19:01.200 --> 00:19:02.000] They're very reliable.
[00:19:02.000 --> 00:19:03.680] You can always count on them.
[00:19:04.880 --> 00:19:06.320] Calculus.
[00:19:06.320 --> 00:19:07.760] Here come the emails.
[00:19:07.760 --> 00:19:08.720] There it is.
[00:19:09.360 --> 00:19:16.400] Now, why do I sometimes hear the calculus in reference to the mathematical discipline of calculus?
[00:19:16.480 --> 00:19:16.880] I don't know.
[00:19:16.920 --> 00:19:17.760] That's just archaic?
[00:19:17.760 --> 00:19:18.800] That's just quaint.
[00:19:19.120 --> 00:19:22.560] Or it may also be regional.
[00:19:22.560 --> 00:19:26.160] You know, when I just did a recording on Talk Nerdy, it's not out yet.
[00:19:26.160 --> 00:19:32.440] It'll be out in a couple of weeks with an Australian woman, a professor, who wrote a book about vectors.
[00:19:29.840 --> 00:19:34.280] And we talked quite a lot about calculus.
[00:19:34.600 --> 00:19:38.040] She didn't say the calculus, but of course, she said maths.
[00:19:38.040 --> 00:19:40.920] By the way, then they also say maths, right?
[00:19:40.920 --> 00:19:46.520] That the shorthand for mathematics for them is maths, and for us is math.
[00:19:46.680 --> 00:19:47.960] It means the same thing, obviously.
[00:19:47.960 --> 00:19:48.360] Yeah.
[00:19:48.360 --> 00:19:50.680] It always takes me for a loop when I hear maths.
[00:19:51.000 --> 00:19:52.120] It sounds so wrong.
[00:19:52.120 --> 00:19:52.760] Yeah, right.
[00:19:52.760 --> 00:19:53.880] I know it's objective.
[00:19:53.880 --> 00:19:54.600] It's subjective.
[00:19:55.240 --> 00:19:55.480] Totally.
[00:19:55.480 --> 00:19:56.040] I mean, it's regional.
[00:19:56.040 --> 00:19:56.840] It's all fine, yeah.
[00:19:56.840 --> 00:19:59.240] But it's just your ears get so attuned to something.
[00:19:59.640 --> 00:20:02.200] In your brain, there is one way that is right.
[00:20:02.440 --> 00:20:04.440] And everything else is like chocolate.
[00:20:04.840 --> 00:20:05.160] Oh, yeah.
[00:20:05.160 --> 00:20:05.720] I used to watch.
[00:20:05.800 --> 00:20:08.120] Did you guys help me remember the name of it?
[00:20:08.120 --> 00:20:10.360] There was an internet series.
[00:20:10.360 --> 00:20:12.920] It was one of Simon Pegg's early series.
[00:20:13.320 --> 00:20:14.200] Look Around You.
[00:20:14.200 --> 00:20:15.560] Oh, no, not Spaced.
[00:20:15.880 --> 00:20:17.880] I loved Space, but that was a TV show.
[00:20:17.880 --> 00:20:18.520] Look Around You.
[00:20:18.520 --> 00:20:22.200] It's like these series of educational videos, but they're like, you know, parodies.
[00:20:22.200 --> 00:20:25.000] And in one of them, they say maths over and over.
[00:20:25.000 --> 00:20:26.920] And it was my first time I was exposed to that.
[00:20:26.920 --> 00:20:28.520] And I thought that was part of the joke.
[00:20:29.240 --> 00:20:31.640] But then I realized that's just how they say math.
[00:20:32.600 --> 00:20:32.920] All right.
[00:20:32.920 --> 00:20:34.120] Thanks, Kara.
[00:20:34.120 --> 00:20:37.800] Jay, tell us about harvesting water from the air.
[00:20:37.800 --> 00:20:42.360] This is a really interesting thing that these researchers came up with.
[00:20:42.680 --> 00:20:46.120] I'm still a little blown away by the reporting on this.
[00:20:46.120 --> 00:20:57.160] So there's researchers at the University of Utah, and they developed an atmospheric water harvesting or AWH device that obviously we need this, right?
[00:20:57.160 --> 00:21:00.520] Like, you know, can we get water from the air?
[00:21:00.520 --> 00:21:02.280] You know, you'd think there's not enough.
[00:21:02.280 --> 00:21:07.480] And, like, you know, I've seen all these ways that people collect water when they're camping and stuff like that.
[00:21:07.480 --> 00:21:08.280] But it's not a lot.
[00:21:08.280 --> 00:21:10.280] It's never a lot of water, you know?
[00:21:10.280 --> 00:21:21.840] But they created this device that remarkably collects a ton of water from the air, even in places where there is not a lot of water vapor floating around.
[00:21:22.160 --> 00:21:35.040] So they use something called a metal organic framework or an MOF, and they use this thing to capture water vapor from the air, and then they can convert it into liquid water pretty efficiently.
[00:21:35.040 --> 00:21:41.520] So the metal organic framework, these are made in this particular device, they're made out of aluminum fumarate.
[00:21:41.520 --> 00:21:44.080] And I want you to visualize this.
[00:21:44.080 --> 00:21:48.560] It's like a highly porous lattice-like ball, right?
[00:21:48.560 --> 00:21:52.800] There is a ton of surface on this thing, inside, outside.
[00:21:52.800 --> 00:21:55.600] You know, it's like, you know, it's mostly a lattice work, right?
[00:21:55.600 --> 00:21:58.160] So it isn't like a solid thing at all.
[00:21:58.160 --> 00:22:00.800] It has tons of nooks and crannies all through this.
[00:22:00.800 --> 00:22:03.520] And these are nano-scale pores.
[00:22:03.520 --> 00:22:11.680] And these pores are shaped specifically to capture water molecules as the air passes through them.
[00:22:11.680 --> 00:22:12.560] Oh, cool.
[00:22:12.560 --> 00:22:20.160] Here's the mind-blowing part: a single gram of this material has the surface area equivalent to.
[00:22:20.160 --> 00:22:23.280] Now, I want to tell you, but I want to see what you guys think.
[00:22:23.280 --> 00:22:30.560] You know, what size plane would you think that this, you know, a gram of this material has surface area-wise?
[00:22:30.960 --> 00:22:32.560] The surface of the earth.
[00:22:32.880 --> 00:22:33.920] Oh, my God, Steve.
[00:22:35.280 --> 00:22:36.640] I can't follow that one.
[00:22:36.640 --> 00:22:38.080] Did I overshoot?
[00:22:40.000 --> 00:22:41.200] The moon?
[00:22:42.400 --> 00:22:43.040] Just shut up.
[00:22:43.040 --> 00:22:46.160] It has a surface area of two football fields.
[00:22:46.160 --> 00:22:46.800] Wow.
[00:22:47.440 --> 00:22:47.960] Wow.
[00:22:47.760 --> 00:22:48.520] American.
[00:22:49.200 --> 00:22:50.240] American football field.
[00:22:50.400 --> 00:22:51.280] Shut up, Kara.
[00:22:51.280 --> 00:22:53.920] But to bring it back to reality, though, it's a gram.
[00:22:53.920 --> 00:22:56.400] It's a single gram of this material.
[00:22:56.600 --> 00:22:56.760] Right?
[00:22:56.800 --> 00:22:57.560] So it's like a ball.
[00:22:57.360 --> 00:22:58.600] It's just folding it out.
[00:22:58.000 --> 00:22:58.800] Folding it out.
[00:23:00.040 --> 00:23:00.200] Yeah.
[00:22:58.960 --> 00:23:01.160] Two football fields.
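[Editor's note: for anyone who wants to sanity-check the "two football fields per gram" figure against the square-meters-per-gram units usually quoted for MOF surface areas, a quick back-of-the-envelope conversion, assuming a standard American field including end zones (360 ft by 160 ft):]

```python
# Convert "two American football fields" into m^2 per gram.
# The field dimensions are an assumption for illustration,
# not stated in the episode.
FT_TO_M = 0.3048
field_m2 = (360 * FT_TO_M) * (160 * FT_TO_M)  # one field, incl. end zones
two_fields_m2 = 2 * field_m2                  # quoted area per gram of MOF

print(f"one field ≈ {field_m2:,.0f} m^2")          # ≈ 5,351 m^2
print(f"per gram  ≈ {two_fields_m2:,.0f} m^2/g")   # ≈ 10,702 m^2/g
```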
[00:22:59.680 --> 00:23:06.600] So as the air flows through this thing, it traps water molecules.
[00:23:06.760 --> 00:23:12.280] Now, these water molecules can be released into liquid form simply by applying heat.
[00:23:12.280 --> 00:23:25.560] And I tried to find out exactly what is happening when the water molecules are, you know, when the heat is increased, I guess they vibrate more and they bump into each other and they gather together as water does, and then it comes out of the machine.
[00:23:25.560 --> 00:23:33.480] So they have an existing prototype, and it can produce five liters of water per day per kilogram of this absorbent material.
[00:23:33.480 --> 00:23:35.000] All right, but Jay, can I ask you a question?
[00:23:35.000 --> 00:23:35.400] Sure.
[00:23:35.400 --> 00:23:38.040] How many droids does it take to operate this machine?
[00:23:38.040 --> 00:23:38.760] That's the problem.
[00:23:38.760 --> 00:23:41.400] You need four droids, and one of them has to speak bocce.
[00:23:41.400 --> 00:23:42.200] I see.
[00:23:45.000 --> 00:23:51.000] So the next question you might have, which is the one that my brain was creeping to, is how do you power this thing?
[00:23:51.000 --> 00:23:52.520] Like, is it super power hungry?
[00:23:52.520 --> 00:23:53.720] What's the deal?
[00:23:53.720 --> 00:24:00.520] So in this particular one, right, because this isn't the only device that's been created like this, and I'll get into that a little bit later.
[00:24:00.520 --> 00:24:06.280] But this one, they use an energy-dense fuel, and they did this very specifically.
[00:24:06.280 --> 00:24:14.120] So in their prototype, they're using white gasoline, and this is the fuel that you would commonly find in a camping stove.
[00:24:14.120 --> 00:24:19.640] It's just very energy-dense, you know, and it's you don't want to carry around a fuel that's going to be super heavy.
[00:24:19.640 --> 00:24:21.160] You want something that's portable.
[00:24:21.160 --> 00:24:29.480] So they decided not to use solar power for this machine because they wanted it to be able to run 24 hours a day for a very specific reason.
[00:24:29.480 --> 00:24:34.760] Solar panels could be used, but they complicate deploying this machine.
[00:24:34.760 --> 00:24:36.920] They're bigger, they're heavier, they need batteries.
[00:24:36.920 --> 00:24:38.920] You know, that complicates the whole thing.
[00:24:38.920 --> 00:24:44.280] So, they just decided we're going to have a little energy pack here of this white gasoline.
[00:24:44.280 --> 00:24:52.480] So, this device was initially designed to provide hydration solutions for, I'm sure you can guess what I'm about to say, for the military, for soldiers.
[00:24:52.480 --> 00:25:01.840] These are people who are in remote areas, they have limited water resources, and this is where, you know, obviously, solar panels are not an optimal fuel source for this thing.
[00:25:01.840 --> 00:25:06.640] But immediately, they're like, well, let's ponder the civilian applications.
[00:25:06.640 --> 00:25:07.840] And it's huge.
[00:25:08.400 --> 00:25:12.720] The need for something like this is, right now, it's extraordinary.
[00:25:12.720 --> 00:25:16.640] We need sources of clean water in so many places around the world.
[00:25:16.640 --> 00:25:23.200] And one of these devices could easily provide daily drinking water and regular household water needs.
[00:25:23.200 --> 00:25:24.800] And I just think that's incredible.
[00:25:24.800 --> 00:25:28.080] Like, you know, it's not a big machine.
[00:25:28.080 --> 00:25:38.080] It's not extraordinarily going to be super expensive because it's not using any materials that are hard to get or are super expensive or anything like that.
[00:25:38.080 --> 00:25:39.600] It's just, it really works.
[00:25:39.600 --> 00:25:42.960] Like, the physics of this machine are amazing.
[00:25:42.960 --> 00:25:56.640] So the device is compact, it's highly efficient, and it solves all the downsides of existing technologies that harvest water because it's small, it doesn't cost a lot, and it's wicked efficient.
[00:25:56.640 --> 00:26:01.520] And everything else that came before it, you know, has problems with size, cost, and efficiency.
[00:26:01.520 --> 00:26:09.760] It can operate effectively in really low humidity conditions, which I find to be that's a bonus, right?
[00:26:09.760 --> 00:26:15.040] You know, you're in some of the most arid places on the planet, and this thing still functions.
[00:26:15.040 --> 00:26:16.320] Did they say how low it could go?
[00:26:16.280 --> 00:26:18.240] Like, could you use this in the Sahara?
[00:26:18.240 --> 00:26:20.720] They said that you could go to Death Valley and it'll work.
[00:26:20.720 --> 00:26:21.120] Really?
[00:26:21.120 --> 00:26:21.520] Yep.
[00:26:21.520 --> 00:26:29.440] These previous technologies that I was mentioning require, you know, pretty, pretty significant humidity levels to function properly.
[00:26:29.440 --> 00:26:36.360] This thing is designed to function in low humidity environments, which is remarkable.
[00:26:36.600 --> 00:26:41.640] I saw a rendering of what the MOF looks like, right?
[00:26:41.960 --> 00:26:50.200] And, you know, it just looks like somebody took a ball of silver metal and kind of melted it with alien acid.
[00:26:50.200 --> 00:26:55.240] You know, it has all these little divots and pivots and all these little things in it.
[00:26:55.240 --> 00:26:58.040] But when you look closely, like it's organized.
[00:26:58.040 --> 00:27:01.560] It really is like a lattice inside that thing.
[00:27:01.960 --> 00:27:03.480] And here's another one that I found.
[00:27:03.480 --> 00:27:08.440] There's another version of this machine that UC Berkeley came up with.
[00:27:08.440 --> 00:27:11.240] They call it the MOF-Powered Water Harvester.
[00:27:11.240 --> 00:27:12.920] And theirs works well too.
[00:27:12.920 --> 00:27:15.240] It's very, very similar in concept here.
[00:27:15.240 --> 00:27:17.000] The device is handheld.
[00:27:17.000 --> 00:27:20.280] It uses ambient sunlight to extract water from the air.
[00:27:20.280 --> 00:27:21.320] I guess there's no battery.
[00:27:21.320 --> 00:27:27.240] It just needs sunlight and it just converts it right into the energy it needs to heat up the machine.
[00:27:27.240 --> 00:27:32.440] And its key points include it can operate in extremely dry conditions.
[00:27:32.440 --> 00:27:35.880] It provides clean water and it only uses sunlight.
[00:27:35.880 --> 00:27:42.520] It can harvest up to 285 grams of water per kilogram of the MOF material, right, in one day.
[00:27:42.680 --> 00:27:47.480] It's 85 to 90% efficient in releasing captured water as drinkable water.
[00:27:47.480 --> 00:27:51.480] And this particular device is environmentally friendly.
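The Berkeley harvester's quoted numbers (285 grams of water per kilogram of MOF per day, 85 to 90% of captured water released as drinkable) allow a quick back-of-the-envelope sizing estimate. A minimal sketch, assuming a 3-liter daily drinking-water target per person (our assumption for illustration, not a figure from the episode):

```python
# Back-of-the-envelope sizing for a MOF water harvester, using the
# figures quoted above: 285 g of water per kg of MOF per day, with
# 85-90% of the captured water released as drinkable water.
# The 3-liter daily target is an assumed illustration value.

YIELD_G_PER_KG_DAY = 285   # grams of water captured per kg of MOF per day
RELEASE_EFFICIENCY = 0.85  # conservative end of the 85-90% range
DAILY_NEED_G = 3000        # ~3 L of drinking water per person (assumed)

def mof_needed_kg(daily_need_g=DAILY_NEED_G,
                  yield_g=YIELD_G_PER_KG_DAY,
                  efficiency=RELEASE_EFFICIENCY):
    """Kilograms of MOF needed to meet a daily drinking-water target."""
    return daily_need_g / (yield_g * efficiency)

print(f"{mof_needed_kg():.1f} kg of MOF per person per day")
```

On these assumptions, roughly a dozen kilograms of MOF would cover one person's daily drinking water, which is consistent with the hosts' framing of it as camping-scale rather than municipal-scale.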
[00:27:51.480 --> 00:27:55.400] So this is the kind of technology that the modern world needs.
[00:27:55.400 --> 00:28:03.080] We're going to have places where there's an absolute abundance of water because of weather, you know, whatever the new weather patterns are going to be.
[00:28:03.080 --> 00:28:08.920] And then there's going to be lots of places where there is an incredible and dangerous lack of water.
[00:28:08.920 --> 00:28:13.080] So, anyway, I just find this type of technology super interesting.
[00:28:13.080 --> 00:28:22.160] You know, the shapes that are found inside this MOF material, it's specifically designed to only interact with water molecules.
[00:28:22.160 --> 00:28:24.160] It doesn't collect anything else.
[00:28:24.160 --> 00:28:27.840] So, is the resulting water basically like distilled water?
[00:28:28.000 --> 00:28:29.840] I tried to confirm that, Steve.
[00:28:29.840 --> 00:28:32.880] I mean, they're saying that it's drinkable, clean water.
[00:28:32.880 --> 00:28:35.120] I guess everything else just passes through.
[00:28:35.120 --> 00:28:44.880] But I would imagine, you know, for safety, you know, unless this thing really, really only ever, ever attracts water under every circumstance, you know, they could just put a filter in there, I guess.
[00:28:45.040 --> 00:28:54.400] What I'm saying is, you know, is the resulting water like distilled water, meaning there's no minerals in it, there's no electrolytes, it's just H2O.
[00:28:54.400 --> 00:28:55.440] No, I don't think so.
[00:28:56.080 --> 00:29:00.080] I think that that stuff is carried along with the water because they're saying it's drinkable.
[00:29:00.080 --> 00:29:00.880] It's drinkable as is.
[00:29:00.880 --> 00:29:01.440] You don't have to treat it.
[00:29:02.320 --> 00:29:02.960] Tablet it in.
[00:29:03.120 --> 00:29:04.480] Yeah, because that would kill you, right?
[00:29:04.480 --> 00:29:04.960] That's not.
[00:29:05.120 --> 00:29:06.400] Well, I mean, you'd have to drink a lot of it.
[00:29:06.560 --> 00:29:08.480] It wouldn't be great for your electrolyte.
[00:29:08.480 --> 00:29:12.320] Yeah, or, you know, they'd have to give you salt to go along with it.
[00:29:12.320 --> 00:29:13.680] Very cool stuff, though, guys.
[00:29:13.680 --> 00:29:18.000] You know, I'm seeing interesting pieces of technology come out now.
[00:29:18.000 --> 00:29:19.520] NASA's doing a lot of cool things.
[00:29:19.520 --> 00:29:25.360] Like, you know, I love the MOXIE machine that NASA came out with that creates oxygen from CO2.
[00:29:25.360 --> 00:29:34.480] And this thing, you know, if it is what they're saying it is, and they're going to put this, you know, make this available and it's not going to be that expensive.
[00:29:34.480 --> 00:29:36.960] You know, I could see people using it who are camping.
[00:29:36.960 --> 00:29:40.000] And, you know, like, it's just a really, really usable thing.
[00:29:40.000 --> 00:29:42.000] And it probably will save lives.
[00:29:42.000 --> 00:29:46.000] You know what percentage of people lack access to clean drinking water?
[00:29:46.000 --> 00:29:47.200] That's a good statistic, Steve.
[00:29:47.200 --> 00:29:47.840] What is it?
[00:29:47.840 --> 00:29:48.480] Two-thirds.
[00:29:48.480 --> 00:29:49.600] No, it's 25%.
[00:29:49.600 --> 00:29:50.560] So 2 billion people.
[00:29:51.040 --> 00:29:51.520] That's a lot.
[00:29:51.760 --> 00:29:54.480] 2 billion people who don't have access to clean drinking water.
[00:29:54.480 --> 00:29:54.880] Yeah.
[00:29:54.880 --> 00:29:55.680] That's crazy.
[00:29:55.680 --> 00:29:57.040] How do they even survive?
[00:29:57.040 --> 00:29:57.840] Yeah, that's a good question.
[00:29:58.080 --> 00:30:00.840] They drink not clean water, right?
[00:30:03.000 --> 00:30:04.840] They get dysentery and everything else.
[00:29:59.920 --> 00:30:14.600] Okay, so the ocean contains about 1.3 billion cubic kilometers of water, and the atmosphere contains about 12,900 cubic kilometers of water.
[00:30:14.920 --> 00:30:19.480] It's not even close, but the atmosphere has a ton of water in it.
[00:30:19.480 --> 00:30:19.880] Oh, yeah.
[00:30:19.880 --> 00:30:22.920] Yeah, I've watched survival shows on how to extract water from the air.
[00:30:22.920 --> 00:30:31.320] But like you said, Jay, it always yields a piddly tiny amount that, you know, nothing of high significance to keep maybe more than one person alive.
[00:30:31.320 --> 00:30:34.440] Yeah, 97% of the water on Earth is in the oceans.
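The two water-inventory figures quoted here can be put side by side in a couple of lines; the point is that the atmosphere's roughly 12,900 cubic kilometers, while huge in absolute terms, is on the order of one part in a hundred thousand of the ocean's volume:

```python
# The water-inventory numbers quoted above, as a quick sanity check:
# the atmosphere holds a lot of water in absolute terms, but only a
# tiny fraction of what the oceans hold.

OCEAN_KM3 = 1.3e9        # ~1.3 billion cubic km of ocean water
ATMOSPHERE_KM3 = 12_900  # ~12,900 cubic km of atmospheric water

ratio = ATMOSPHERE_KM3 / OCEAN_KM3
print(f"Atmosphere / ocean = {ratio:.6f} "
      f"(about 1 part in {OCEAN_KM3 / ATMOSPHERE_KM3:,.0f})")
```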
[00:30:34.440 --> 00:30:35.240] Very cool.
[00:30:35.480 --> 00:30:37.880] I'm excited about stuff like this.
[00:30:37.880 --> 00:30:45.000] But I'm always fascinated by the fact that Europa has more water on it than the Earth's oceans.
[00:30:45.000 --> 00:30:45.400] Yeah.
[00:30:46.280 --> 00:30:47.160] Twice as much water.
[00:30:47.160 --> 00:30:49.880] Yeah, there's twice as much on Europa as there is in the Earth's ocean.
[00:30:49.880 --> 00:30:50.360] That's nuts.
[00:30:50.520 --> 00:30:51.080] Always fascinating.
[00:30:51.480 --> 00:30:52.120] All right, thanks, Jay.
[00:30:52.280 --> 00:30:53.320] Europa is a moon.
[00:30:53.320 --> 00:30:54.120] It's a moon.
[00:30:54.120 --> 00:30:56.760] All right, do you guys know what dark oxygen is?
[00:30:56.760 --> 00:30:57.640] I had never heard of it.
[00:30:57.800 --> 00:30:58.840] Dark oxygen.
[00:30:58.840 --> 00:31:00.280] Low albedo oxygen.
[00:31:00.440 --> 00:31:01.400] What do you think it is?
[00:31:01.960 --> 00:31:03.080] Hidden oxygen?
[00:31:03.080 --> 00:31:05.000] Oxygen with an isotope of oxygen?
[00:31:05.320 --> 00:31:07.400] Where does oxygen on the Earth usually come from?
[00:31:07.800 --> 00:31:08.840] It's hidden from the atmosphere.
[00:31:08.840 --> 00:31:09.960] So it's like underground.
[00:31:09.960 --> 00:31:10.280] Yeah.
[00:31:10.440 --> 00:31:11.320] Where does it come from?
[00:31:11.320 --> 00:31:12.760] Where does the oxygen come from?
[00:31:12.760 --> 00:31:13.080] Plants.
[00:31:13.240 --> 00:31:15.080] Plants, therefore, what process?
[00:31:15.320 --> 00:31:16.200] Photosynthesis.
[00:31:16.200 --> 00:31:17.960] Right, which involves the sun.
[00:31:17.960 --> 00:31:18.600] The sun.
[00:31:18.920 --> 00:31:23.000] So this is oxygen that comes from a process that doesn't involve the sun.
[00:31:23.000 --> 00:31:25.960] Therefore, it's chemosynthesis.
[00:31:27.000 --> 00:31:27.880] Is it chemosynthetic?
[00:31:28.040 --> 00:31:30.440] I did not hear that specific term used, Bob.
[00:31:30.440 --> 00:31:35.880] It could be, but it is from a chemical reaction that doesn't involve light.
[00:31:35.880 --> 00:31:38.760] I think then by definition it would be chemosynthetic.
[00:31:38.760 --> 00:31:39.320] All right.
[00:31:39.320 --> 00:31:39.720] Okay.
[00:31:39.720 --> 00:31:44.560] So this was actually discovered not by accident, but it was a surprise.
[00:31:44.120 --> 00:31:50.080] Researchers were trying to measure oxygen levels in the deep ocean.
[00:31:50.400 --> 00:32:00.400] When they put the detectors down at the sea floor, they got back this huge result, like way more oxygen than there should have been down there.
[00:32:00.400 --> 00:32:04.560] It was so surprising that they thought, well, these detectors are not calibrated properly.
[00:32:04.560 --> 00:32:06.160] They're clearly not functioning.
[00:32:06.160 --> 00:32:09.840] They sent them all back to the manufacturer saying that these were broken.
[00:32:09.840 --> 00:32:13.840] The manufacturer tested them all and said, nope, these are calibrated and working.
[00:32:13.840 --> 00:32:14.880] So they did it again.
[00:32:14.880 --> 00:32:16.560] They're like, this can't be right, though.
[00:32:16.560 --> 00:32:19.680] There's way more oxygen down there than there's supposed to be.
[00:32:19.680 --> 00:32:23.040] So here's the other thing: where were they doing this measurement?
[00:32:23.040 --> 00:32:26.480] They were doing it in the Clarion Clipperton zone.
[00:32:26.480 --> 00:32:28.640] Does that ring any bells for you guys?
[00:32:28.640 --> 00:32:30.720] Clarion Clipperton zone.
[00:32:30.720 --> 00:32:35.120] So there are something strewn across the floor, the ocean floor, in this area.
[00:32:35.600 --> 00:32:36.240] The Mariana.
[00:32:36.320 --> 00:32:36.560] No, no, no.
[00:32:36.800 --> 00:32:37.760] Is it the Titanic?
[00:32:37.760 --> 00:32:38.240] No.
[00:32:38.560 --> 00:32:40.000] But it is metal.
[00:32:40.000 --> 00:32:41.600] Polymetallic nodules.
[00:32:41.600 --> 00:32:42.240] Remember that?
[00:32:42.960 --> 00:32:45.040] Polymetallic nodules.
[00:32:45.360 --> 00:32:46.480] Lithium and stuff?
[00:32:46.640 --> 00:32:47.040] Is it lithium?
[00:32:47.120 --> 00:32:50.080] Yeah, manganese and cobalt.
[00:32:50.800 --> 00:32:52.800] All the stuff that people have talked about mining.
[00:32:52.800 --> 00:32:53.520] Yeah, this is so.
[00:32:53.840 --> 00:32:55.840] We talked about this once or twice before.
[00:32:55.840 --> 00:33:09.040] This is the batteries, you know, like all the material we need to make batteries are in these little potato-sized nodules that form slowly over millions of years, strewn across the ocean floor in certain regions.
[00:33:09.040 --> 00:33:15.360] And one of the biggest collections is in this Clarion Clipperton zone in the Pacific Ocean, right?
[00:33:15.360 --> 00:33:22.640] And there are plans, there's companies that are planning on mining these polymetallic nodules, right?
[00:33:22.640 --> 00:33:37.480] We talked about it as kind of a good thing, you know, like, okay, this could be great if it could produce the raw material to make enough batteries to make all of our cars electric, you know what I mean?
[00:33:37.480 --> 00:33:38.440] That kind of thing.
[00:33:38.440 --> 00:33:46.360] And at the time, we mentioned that, yeah, they just got to do some environmental studies, but you know, like, don't let that take too long.
[00:33:46.360 --> 00:33:53.080] Get that done so we can get mining these nodules and we could start, you know, push forward with our battery revolution.
[00:33:53.080 --> 00:34:01.720] But the more I follow this story, the more these pesky environmental issues are a major problem.
[00:34:01.720 --> 00:34:08.200] And it may actually prevent any significant mining of these nodules or should, right?
[00:34:08.520 --> 00:34:18.440] So this just adds to that because it's probably true that this dark oxygen is being made on these nodules, right?
[00:34:18.440 --> 00:34:23.640] That it's the metals in these nodules that is creating the oxygen.
[00:34:23.640 --> 00:34:24.280] How?
[00:34:24.280 --> 00:34:26.200] Well, they do not know.
[00:34:26.520 --> 00:34:35.240] And so that's why, you know, when Bob asked if it's chemosynthetics, like, well, probably, but they really haven't figured out what the actual process is.
[00:34:36.200 --> 00:34:47.560] But just the fact that they are making so much oxygen means that they may be, they probably are, a critical component to the ecosystem here.
[00:34:47.880 --> 00:34:59.640] And there are other locations where there have been mining and disruptions to the ocean floor that even after 20, 30 years, they haven't recovered.
[00:34:59.640 --> 00:35:04.120] Like they're dead zones, you know, 40 years later.
[00:35:04.120 --> 00:35:08.120] And the thinking is, well, maybe that this is why.
[00:35:08.120 --> 00:35:16.400] That it's because we basically removed the oxygen supply from this, you know, this deep ocean.
[00:35:16.400 --> 00:35:20.160] And because normally there's not a lot of oxygen because there's not a lot of light down there, right?
[00:35:14.840 --> 00:35:22.080] And therefore, not a lot of photosynthesis.
[00:35:22.320 --> 00:35:27.600] They're getting mostly whatever oxygen is being made close to the surface and it's just diffusing through the water.
[00:35:28.000 --> 00:35:31.920] And then, of course, a lot of nutrients get carried down by life on the surface.
[00:35:31.920 --> 00:35:34.560] But there's this vibrant ecosystem down there.
[00:35:34.560 --> 00:35:41.600] And this may explain why that's the case, you know, why there is so much biodiversity and so much life in this deep zone.
[00:35:41.600 --> 00:35:47.120] Because, yeah, because there's like dark oxygen being produced by these polymetallic nodules.
[00:35:47.120 --> 00:35:54.720] Therefore, if we do any significant mining of them, we may create a dead zone down there, you know, where there is now a vibrant ecosystem.
[00:35:54.720 --> 00:35:58.560] So it's not like it'll bounce back, it'll be disruptive, but it'll be fine in 10 years.
[00:35:58.800 --> 00:36:00.800] It may not be, you know.
[00:36:01.360 --> 00:36:08.800] Again, biologists looking at areas that were heavily mined 40 years ago found that essentially there's still no life there.
[00:36:08.800 --> 00:36:12.560] So that is a huge problem, unfortunately.
[00:36:12.720 --> 00:36:21.760] We definitely need to do more research and figure out how this oxygen is being made, confirm that it's coming from the polymetallic nodules, figure out what that means to the ecosystem.
[00:36:21.760 --> 00:36:25.440] We have to really review any proposed method.
[00:36:25.440 --> 00:36:41.680] Right now, one of the companies is planning on basically just sucking them, like vacuuming them off the ocean floor, and that would send up a plume of debris, as well as depriving that ecosystem of these nodules.
[00:36:41.680 --> 00:36:45.920] So it would probably be devastating to the ecosystem, unfortunately.
[00:36:45.920 --> 00:36:50.800] So this is something that, you know, a year ago, two years ago, I was like very, very hopeful about this.
[00:36:50.800 --> 00:36:56.080] And now I've had to modify my opinions because of all the environmental information that's coming out.
[00:36:56.080 --> 00:36:58.720] Unfortunately, I was hoping the answer was going to be like, yeah, you're fine.
[00:36:58.720 --> 00:36:59.280] Go ahead.
[00:36:59.280 --> 00:37:01.400] Vacuum them up, make batteries out of them.
[00:37:01.400 --> 00:37:03.000] But that doesn't seem to be the case.
[00:37:03.000 --> 00:37:12.760] And this one may be the death blow, you know, to any environmentally responsible mining, you know, of these polymetallic nodules, at least in this part of the ocean here.
[00:37:12.760 --> 00:37:16.040] So for the moment, until a new technology maybe can come along.
[00:37:16.040 --> 00:37:23.800] Yeah, maybe we need to just like when you like, you know, you go into a forest and you cut down every third tree, you know, or something and you leave the rest undisturbed.
[00:37:23.800 --> 00:37:26.760] Like maybe there might be some limited, careful mining.
[00:37:26.760 --> 00:37:33.000] Not like this like strip mining, like the equivalent of like just vacuuming up the ocean floor.
[00:37:33.000 --> 00:37:33.560] Right, bolts.
[00:37:33.800 --> 00:37:34.120] Yeah, right.
[00:37:34.120 --> 00:37:35.640] But maybe we might need to do.
[00:37:35.800 --> 00:37:38.680] But then, of course, that gets to the cost-effectiveness of the operation.
[00:37:38.680 --> 00:37:39.080] Sure.
[00:37:39.080 --> 00:37:40.840] And do they know how they form?
[00:37:40.840 --> 00:37:42.200] Like how long it takes for them to form?
[00:37:42.360 --> 00:37:43.000] Millions of years.
[00:37:43.000 --> 00:37:43.640] It takes millions.
[00:37:43.800 --> 00:37:44.680] Well, yeah, so I don't know.
[00:37:44.920 --> 00:37:45.800] So there's a limited amount of time.
[00:37:46.280 --> 00:37:47.000] Slow accretion.
[00:37:47.160 --> 00:37:48.760] Yeah, it's like millions and millions of years.
[00:37:49.160 --> 00:37:50.680] But then it's not a forest.
[00:37:50.680 --> 00:37:50.920] Right.
[00:37:50.920 --> 00:37:51.960] And we can't think of it like a forest.
[00:37:52.280 --> 00:37:53.400] They will not bounce right back.
[00:37:53.400 --> 00:37:53.640] Nope.
[00:37:54.360 --> 00:37:56.680] Yeah, it's basically geological time periods.
[00:37:56.840 --> 00:37:58.360] You got to think of it like fossil fuels.
[00:37:58.360 --> 00:37:58.600] Exactly.
[00:37:58.760 --> 00:37:59.480] Once they're gone.
[00:37:59.480 --> 00:37:59.880] Yeah.
[00:37:59.880 --> 00:38:03.800] Yeah, they don't come back except on geological time scales.
[00:38:04.120 --> 00:38:05.240] Yeah, that's a downer.
[00:38:05.320 --> 00:38:06.680] It's very, very unfortunate.
[00:38:06.760 --> 00:38:13.240] I mean, this is interesting finding, and it's exciting from that reason, but it does say, yeah, it's not looking good.
[00:38:13.240 --> 00:38:15.400] Not looking good for the polymetallic nodules.
[00:38:15.400 --> 00:38:17.080] We have to figure something else out.
[00:38:17.080 --> 00:38:18.760] And again, it's not like this was our one hope.
[00:38:19.160 --> 00:38:24.200] There's other options in terms of sourcing manganese, nickel, and cobalt.
[00:38:24.200 --> 00:38:30.360] And there's different battery designs that use different materials so that we're not as dependent on those three things.
[00:38:30.600 --> 00:38:31.880] Those are the big ones.
[00:38:31.880 --> 00:38:46.160] In addition to the lithium, like lithium, manganese, nickel, and cobalt, those are the four really that are the limiting supply line factors for the current most energy-dense lithium-ion batteries that we have.
[00:38:44.760 --> 00:38:52.400] You can make less energy-dense batteries by using a chemistry that does not involve the nickel and the cobalt.
[00:38:52.720 --> 00:38:55.840] But again, you're sacrificing a little bit of energy density.
[00:38:55.840 --> 00:39:07.600] But there are some entirely new battery designs, you know, that use salt or iron, like the really abundant stuff that's not limited to these rare earths or to these metals.
[00:39:07.600 --> 00:39:13.280] So that's, you know, hopefully those will come online before too long, you know, in a significant way.
[00:39:13.280 --> 00:39:17.440] Because we're not going to be soaking, you know, sucking up batteries from the ocean floor.
[00:39:17.440 --> 00:39:21.680] That doesn't seem to be like a viable option, unfortunately.
[00:39:21.680 --> 00:39:26.640] All right, Kara, tell us about the speed of chimp conversation.
[00:39:26.640 --> 00:39:30.160] Yeah, chimpanzees have conversations.
[00:39:30.160 --> 00:39:34.800] This is something we've known for a while, but what are their conversations made of?
[00:39:34.800 --> 00:39:37.440] I mean, they don't have words, right?
[00:39:37.440 --> 00:39:38.960] They don't speak.
[00:39:38.960 --> 00:39:40.640] How do they communicate with one another?
[00:39:40.640 --> 00:39:41.680] Sign language.
[00:39:41.680 --> 00:39:43.120] And hoots and hollers.
[00:39:43.440 --> 00:39:44.720] If we teach them sign language.
[00:43:46.800 --> 00:43:47.920] In the wild or in the lab.
[00:39:48.000 --> 00:39:48.640] They have their own signs.
[00:39:49.040 --> 00:39:50.320] They have their own sign.
[00:39:51.120 --> 00:39:54.160] I would say that they, well, I know that they make a lot of noises.
[00:39:54.160 --> 00:39:54.960] They do make a lot of noises.
[00:39:55.120 --> 00:39:58.000] And they probably do facial expressions and gesturing.
[00:39:58.000 --> 00:40:00.320] And they hold up different numbers of bananas.
[00:40:00.640 --> 00:40:02.240] Do they secrete an odor?
[00:40:03.120 --> 00:40:11.840] So, in this study, entitled, this will kind of give it away: chimpanzee gestural exchanges share temporal structure with human language.
[00:40:11.840 --> 00:40:16.880] This was published in Current Biology.
[00:40:17.200 --> 00:40:30.680] And this is a correspondence from multiple authors who decided to observe chimpanzees in the wild and to record a lot of data on how they interact with one another.
[00:40:31.000 --> 00:40:38.520] They, over the course of their data collection, studied five wild communities of chimpanzees in East Africa.
[00:40:38.520 --> 00:40:47.480] They collected data on more than 8,500 gestures across 252 individuals.
[00:40:47.800 --> 00:41:08.360] And when they were looking at those gestures, they found that the vast majority of these kinds of communications or conversations, upwards of 85, 86% of them, were a single gesture would be made by an individual, and then the other individual would engage in a behavior based on that gesture.
[00:41:08.360 --> 00:41:13.960] So for example, one individual gestures, another come here, the other one comes here.
[00:41:13.960 --> 00:41:21.240] They were looking specifically at the 14% of interactions where there was a gestural exchange.
[00:41:21.560 --> 00:41:23.880] I gesture to you, you gesture back to me.
[00:41:23.880 --> 00:41:26.120] I gesture to you, you gesture back to me.
[00:41:26.120 --> 00:41:31.080] So this was less about simple commands and behavioral responses.
[00:41:31.080 --> 00:41:39.160] And in the researchers' view, they talk about these almost like negotiations because they would often occur around something like food or grooming.
[00:41:39.160 --> 00:41:43.960] They were less likely to occur around simple commands or requests.
[00:41:43.960 --> 00:41:56.600] So across 14% of those observations, they found that there was an exchange of gestures between two individuals that were at least a two-part exchange, but sometimes they would go up to seven back and forths.
[00:41:56.600 --> 00:41:58.760] And they found something kind of interesting.
[00:41:58.760 --> 00:42:17.200] When we speak to one another in normal conversation, and when I say we, I mean human beings, face-to-face conversation, the responses, I say something, you say something, or I move my hands, you move your hands in reaction or response, is about on average 200 milliseconds.
[00:42:17.520 --> 00:42:26.240] They found that in the chimpanzees, the average, of course, this is based on a much smaller sample size, was about 120 milliseconds.
[00:42:26.240 --> 00:42:30.080] They're not saying it's faster, they're saying it's within the same range.
[00:42:30.400 --> 00:42:36.640] The behavioral responses were significantly longer, I think, closer to like 1500 milliseconds.
[00:42:36.640 --> 00:42:52.240] But the gestural responses were fast, so fast, in fact, that sometimes they would observe one chimpanzee gesturing to another and the other gesturing back before the first one had even finished their gesture, which is such a similar thing that we human beings do.
[00:42:52.240 --> 00:43:05.760] We interrupt each other, we react to, you know, we might have a facial expression or a hand motion that's reactive to something that said even before sitting, processing, and then being reactive.
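As a sketch of what measuring these latencies involves: given a gesture's end time and the response's start time, the latency is just the difference, and a negative value marks an overlapping response, where one chimp started replying before the other had finished gesturing. The event times below are made-up illustration data, not the study's measurements:

```python
# Toy computation of gesture-response latencies, in the spirit of the
# study described above. Observation times are hypothetical.
# A negative latency means the response began before the first
# gesture ended -- an "interruption," which the study also observed.

def latencies_ms(exchanges):
    """exchanges: list of (gesture_end_ms, response_start_ms) pairs."""
    return [start - end for end, start in exchanges]

observations = [(1000, 1120), (2000, 2090), (3000, 2950)]  # hypothetical
lats = latencies_ms(observations)
mean = sum(lats) / len(lats)
overlaps = sum(1 for lat in lats if lat < 0)
print(f"mean latency: {mean:.0f} ms, overlapping responses: {overlaps}")
```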
[00:43:05.760 --> 00:43:44.000] And so the researchers posit that this kind of shared behavior, this temporality, which is very, very quick, is likely either indicative of a previous evolutionary ancestor, so that this is something that was conserved prior to the speciation of both chimpanzees and human beings, or that this might have been a kind of convergent evolution, that like we developed it and they developed it, possibly because we had similar building blocks there.
[00:43:44.000 --> 00:44:04.520] But either way, this study, I believe, is important, and many of the writers who've done write-arounds of it, and even the authors themselves, are saying, this is important for kind of understanding the roots, sort of the evolutionary roots of human language, because these gestural exchanges are a form of communication.
[00:44:04.680 --> 00:44:13.480] And even though words are not being uttered, we're seeing conversations happening as opposed to just command and follow.
[00:44:13.480 --> 00:44:14.120] Wow.
[00:44:14.120 --> 00:44:15.320] It's pretty interesting.
[00:44:15.320 --> 00:44:19.560] And at the same rate, you know, it's not slower.
[00:44:19.560 --> 00:44:24.520] There's something going on there; the cognitive capability is very similar.
[00:44:24.760 --> 00:44:26.760] Which, yeah, is not surprising.
[00:44:26.920 --> 00:44:29.720] Do we have any idea what they're saying to each other?
[00:44:29.960 --> 00:44:32.760] I bet you a lot of these behaviorists do.
[00:44:32.760 --> 00:44:44.200] You know, these individuals who spend time in the field just watching and watching and observing, you know, through context clues, through behaviors that follow the gestures and the exchanges.
[00:44:44.200 --> 00:44:45.800] You know, how often do they do this?
[00:44:45.800 --> 00:44:49.000] And then the food is given or the food is taken away or the food is whatever.
[00:44:49.000 --> 00:44:51.720] I bet you they can kind of speak chimpanzee.
[00:44:51.720 --> 00:44:59.160] And I remember years ago having a woman on my show, gosh, I even remember her name, Rebecca Atencia.
[00:44:59.160 --> 00:45:07.080] She was the chief veterinarian at the Tchimpounga Reserve in the DRC.
[00:45:07.400 --> 00:45:12.760] And she told me, like, her kids grew up on the reserve with her as she was working with these chimps.
[00:45:12.760 --> 00:45:14.760] It was a Jane Goodall Foundation reserve.
[00:45:14.760 --> 00:45:17.320] She was like, I feel like my kids speak chimp.
[00:45:17.640 --> 00:45:20.280] Like, they know what the different calls mean.
[00:45:20.280 --> 00:45:25.400] They're so used to it because they just grew up hearing it all the time and seeing it all the time.
[00:45:25.400 --> 00:45:27.640] That in some ways it's like a language that they speak too.
[00:45:27.640 --> 00:45:28.040] Yeah, yeah.
[00:45:28.840 --> 00:45:29.960] It's fascinating.
[00:45:30.280 --> 00:45:33.000] Did you guys watch Kingdom of the Planet of the Apes, the most recent one?
[00:45:34.120 --> 00:45:35.000] No, I didn't see the most recent one.
[00:45:35.720 --> 00:45:36.280] It was okay.
[00:45:36.280 --> 00:45:36.920] I enjoyed it.
[00:45:36.920 --> 00:45:37.720] It was fine.
[00:45:37.720 --> 00:45:50.000] But one thing I did not like about it was that the way they represented the fact that the chimpanzees had language that was not as fully developed as human language.
[00:45:50.320 --> 00:45:53.440] They had them speak in a slow and halting fashion.
[00:45:53.440 --> 00:45:53.840] Right.
[00:45:53.840 --> 00:45:54.720] Which is not appropriate.
[00:45:54.880 --> 00:45:58.560] Yeah, I don't, well, even before, I'm like, that just doesn't, that's not right.
[00:45:58.880 --> 00:46:06.160] It is, that's like a Hollywood trope, you know, like, like, that's just the only way they know to represent, you know, that language is limited.
[00:46:06.320 --> 00:46:10.160] Like, they probably, there's no, I don't, it didn't, you know, ring true to me.
[00:46:10.160 --> 00:46:17.360] I thought they, if you're trying to represent that, they should have a completely fluent, just simplified language structure.
[00:46:17.360 --> 00:46:20.320] Yes, just fewer, yeah, fewer modifiers, simple.
[00:46:20.480 --> 00:46:22.400] Smaller vocabulary, whatever.
[00:46:22.640 --> 00:46:25.120] But it should still be completely fluid for them.
[00:46:25.120 --> 00:46:25.760] You know what I mean?
[00:46:26.480 --> 00:46:27.840] It's not like they're children learning.
[00:46:27.920 --> 00:46:28.080] Yeah.
[00:46:29.120 --> 00:46:31.440] Yeah, it's like trying to speak to a different speech.
[00:46:31.520 --> 00:46:36.080] Right, or they're developing speech again during speech therapy after a head injury or something.
[00:46:36.080 --> 00:46:39.600] Or like it's a second language, but halting, like broken language.
[00:46:39.600 --> 00:46:41.120] Like, no, this is their language.
[00:46:41.120 --> 00:46:42.960] This is what they grew up speaking.
[00:46:42.960 --> 00:46:44.880] Right, yeah, it would be more fluid than that.
[00:46:44.880 --> 00:46:45.840] But maybe you're right.
[00:46:45.840 --> 00:46:50.960] More kind of lexically, I guess you could say, simple.
[00:46:51.600 --> 00:46:57.680] Hi, I'm Chris Gethard, and I'm very excited to tell you about Beautiful Anonymous, a podcast where I talk to random people on the phone.
[00:46:57.680 --> 00:47:00.320] I tweet out a phone number, thousands of people try to call.
[00:47:00.320 --> 00:47:02.320] I talk to one of them, they stay anonymous.
[00:47:02.320 --> 00:47:03.200] I can't hang up.
[00:47:03.200 --> 00:47:04.240] That's all the rules.
[00:47:04.240 --> 00:47:05.760] I never know what's going to happen.
[00:47:05.760 --> 00:47:07.120] We get serious ones.
[00:47:07.120 --> 00:47:09.120] I've talked with meth dealers on their way to prison.
[00:47:09.120 --> 00:47:11.280] I've talked to people who survived mass shootings.
[00:47:11.280 --> 00:47:12.400] Crazy, funny ones.
[00:47:12.400 --> 00:47:16.320] I talked to a guy with a goose laugh, somebody who dresses up as a pirate on the weekends.
[00:47:16.320 --> 00:47:17.680] I never know what's going to happen.
[00:47:17.680 --> 00:47:18.880] It's a great show.
[00:47:18.880 --> 00:47:21.440] Subscribe today, Beautiful Anonymous.
[00:47:21.600 --> 00:47:24.080] All right, Bob, tell us about this new nuclear clock.
[00:47:24.080 --> 00:47:26.960] Yeah, researchers have taken an important step.
[00:47:26.960 --> 00:47:36.760] I think it's important in creating the first nuclear clock, which in many ways would be superior to all those old-school atomic clocks that you've heard so much about.
[00:47:37.640 --> 00:47:39.320] That's a big statement, Bob.
[00:47:39.320 --> 00:47:39.800] Oh, yeah.
[00:47:39.800 --> 00:47:41.720] Well, let's see what the science says here.
[00:47:41.720 --> 00:47:50.200] So, this is the result of a collaboration of physicists from the Federal Physical and Technical Institute (Physikalisch-Technische Bundesanstalt) in Braunschweig, Germany, and the Vienna University of Technology in Austria.
[00:47:50.200 --> 00:47:50.760] Okay?
[00:47:50.760 --> 00:47:53.240] Published in Physical Review Letters.
[00:47:53.640 --> 00:47:59.080] The title of the study is Laser Excitation of the Thorium-229 Nucleus.
[00:47:59.080 --> 00:48:08.920] Okay, so now to understand the future nuclear clocks that we will maybe see, it'd be helpful to know how plain old atomic clocks work.
[00:48:08.920 --> 00:48:11.880] And we've all, you think everyone's heard of atomic clocks, right?
[00:48:11.880 --> 00:48:14.120] At least the term.
[00:48:14.120 --> 00:48:24.840] And I think a lot of those people have their takeaway is that, yeah, atomic clocks use atoms and they're ridiculously accurate, like losing a second only after like millions of years or something.
[00:48:24.840 --> 00:48:28.600] So I think most people would agree with, would know about that.
[00:48:28.600 --> 00:48:30.680] And that's essentially correct.
[00:48:30.840 --> 00:48:37.000] The part of the atom that's most important, though, for atomic clocks is the electrons in their orbital shells around the nucleus.
[00:48:37.000 --> 00:48:38.680] That's what's really critical.
[00:48:38.680 --> 00:48:41.880] That's where all the hard work is happening.
[00:48:41.880 --> 00:48:56.760] Now, by exciting the electrons in those orbital shells with a precise amount of radiation, the electrons gain energy, right, rising to another orbital where they hang out for a very tiny amount of time.
[00:48:56.760 --> 00:49:02.120] Then they will release a specific amount of energy before going back down, right?
[00:49:02.280 --> 00:49:15.280] Now, that up and down transition of the electrons between energy levels is very predictable and very stable, and those transitions are essentially the ticks of a clock, like a swinging pendulum or a vibrating crystal, to tell time.
[00:49:14.760 --> 00:49:18.560] So, that's kind of my overview of atomic clocks.
[00:49:18.640 --> 00:49:29.840] Now, our current standard today uses microwaves to excite cesium atom electrons up and down 9,192,631,770 times per second.
[00:49:29.840 --> 00:49:39.840] In fact, that's exactly how our second is now defined: 9,192,631,770 oscillations of electrons in the cesium atom.
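To make that definition concrete, here's a quick Python sketch. The cesium frequency is exact by definition; the three-second tick count below is just an illustrative number.

```python
# The SI second is defined as exactly 9,192,631,770 oscillations of the
# cesium-133 hyperfine transition -- the "tick" of a microwave atomic clock.
CESIUM_HZ = 9_192_631_770  # ticks per second, exact by definition

period = 1 / CESIUM_HZ  # duration of a single tick, in seconds
print(f"one cesium tick lasts {period:.3e} s")  # roughly 1.088e-10 s

# A clock tells time by counting ticks:
elapsed = 27_577_895_310 / CESIUM_HZ  # ticks counted -> seconds elapsed
print(elapsed)  # → 3.0
```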
[00:49:39.840 --> 00:49:46.080] Now, the time standard the world depends on is called International Atomic Time, which I always love that.
[00:49:46.160 --> 00:49:47.040] Sounds pretty cool.
[00:49:47.040 --> 00:49:48.560] International Atomic Time.
[00:49:48.560 --> 00:49:58.880] And it uses not just one of these cesium microwave atomic clocks, but about 450 of them, spread out across roughly 80 labs throughout the world.
[00:49:58.880 --> 00:50:08.160] And the average of those 450 atomic clocks, that's the foundation, or I'll call it the raw time, that the world uses.
[00:50:08.160 --> 00:50:12.880] Now, I say raw because this atomic time does not add leap seconds.
[00:50:13.680 --> 00:50:21.040] It's just that raw time, just how many seconds have passed, with no leap seconds added.
[00:50:21.040 --> 00:50:28.320] Now, leap seconds are added to this atomic time to make coordinated universal time, UTC.
[00:50:28.320 --> 00:50:32.320] And I bet a lot of people have heard about UTC.
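A minimal sketch of the TAI-to-UTC relationship Bob describes, assuming the current offset of 37 seconds (the value in effect since the end of 2016; it changes whenever a new leap second is announced):

```python
from datetime import datetime, timedelta

# As of 2024, TAI runs ahead of UTC by 37 seconds
# (a 10-second initial offset plus 27 leap seconds added since 1972).
TAI_MINUS_UTC = timedelta(seconds=37)

def tai_to_utc(tai: datetime) -> datetime:
    """Convert International Atomic Time to UTC.

    Valid only while the leap-second count stays at 37 (2017 onward).
    """
    return tai - TAI_MINUS_UTC

print(tai_to_utc(datetime(2024, 7, 24, 12, 0, 37)))  # 2024-07-24 12:00:00
```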
[00:50:33.360 --> 00:50:40.720] So, this is like the foundation of our timekeeping, our civil timekeeping in a lot of the industrialized world.
[00:50:40.720 --> 00:50:42.800] And so, it's obviously pretty damn critical.
[00:50:42.800 --> 00:50:44.880] Yeah, so no need to improve on it, right?
[00:50:46.560 --> 00:50:49.120] Actually, it's getting old and crusty, man.
[00:50:49.120 --> 00:50:50.160] Old and crusty.
[00:50:50.160 --> 00:50:54.240] As great as microwave atomic clocks are, there's another that's better.
[00:50:54.240 --> 00:50:58.240] And it's still an atomic clock, but it's an optical atomic clock.
[00:50:58.240 --> 00:51:03.000] And it's actually been in the news like this past week for big advances.
[00:50:59.840 --> 00:51:07.240] It's really gotten amazingly good, amazingly precise, fascinating stuff.
[00:51:07.560 --> 00:51:09.960] Now, these are complicated as hell.
[00:51:09.960 --> 00:51:11.560] I was trying to understand how they're working.
[00:51:11.560 --> 00:51:22.280] It's really a lot of complication in there, but it's basically very similar to these microwave atomic clocks, but they don't use microwaves to make the electron transitions.
[00:51:22.280 --> 00:51:24.120] They use optical wavelengths.
[00:51:24.200 --> 00:51:27.480] Obviously, that's why they call it optical atomic clocks.
[00:51:27.480 --> 00:51:31.400] So they're using the higher frequency optical wavelengths of light.
[00:51:31.400 --> 00:51:35.080] And that higher frequency means that there's even more ticks of the clock per second.
[00:51:35.080 --> 00:51:42.920] They're perhaps 100 times more accurate than microwave atomic clocks, losing a second only once every 30 billion years.
[00:51:42.920 --> 00:51:47.400] So these are ridonkulous, these new optical clocks.
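The "one second every 30 billion years" figure translates into a fractional frequency error, which is how clock papers usually quote precision. A rough back-of-the-envelope check:

```python
# Fractional error of a clock that loses 1 second every 30 billion years.
YEAR_S = 365.25 * 86_400     # seconds in a Julian year
drift = 1 / (30e9 * YEAR_S)  # dimensionless fractional frequency error
print(f"{drift:.1e}")        # about 1e-18
```

That lands around 1e-18, consistent with the precision quoted for state-of-the-art optical lattice clocks.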
[00:51:47.400 --> 00:51:56.440] And many think that these optical atomic clocks will supplant the microwave clocks for our standard for this, our international atomic time.
[00:51:56.680 --> 00:51:58.120] And that might very well happen.
[00:51:58.120 --> 00:52:02.440] But before that does happen, there's a new game in town.
[00:52:03.240 --> 00:52:08.600] These nuclear clocks may be around in the near future at some point.
[00:52:08.920 --> 00:52:16.120] Now, as you may have guessed, nuclear clocks don't work off the tick of electrons transitioning between energy states.
[00:52:16.120 --> 00:52:20.440] Nuclear clocks would be based on the energy transitions of the nucleus, right?
[00:52:20.440 --> 00:52:27.720] So as the neutrons and protons themselves enter into a higher energy state and go back down to their ground state.
[00:52:27.720 --> 00:52:29.320] So that's what's oscillating.
[00:52:29.320 --> 00:52:33.560] It's the nucleus and not the electrons in their distant orbitals.
[00:52:33.560 --> 00:52:34.440] That's not easy, though.
[00:52:34.440 --> 00:52:35.400] It's not easy to do.
[00:52:35.400 --> 00:52:37.800] And they've been thinking about this for many decades.
[00:52:38.520 --> 00:52:44.480] For old school atomic clocks, we can use just like lab-grade lasers to force electrons to transition.
[00:52:44.200 --> 00:52:46.480] It's not that difficult.
[00:52:46.800 --> 00:52:57.600] But if you want to do that to a nucleus, if you want to cause that nucleus to enter into a greater energy state, you would need lasers that are at least a thousand times more powerful than what's common today.
[00:52:57.600 --> 00:52:59.280] So that's just not happening.
[00:52:59.600 --> 00:53:00.640] Not in the near future.
[00:53:01.120 --> 00:53:04.400] We just don't have the lasers that would be required to do that.
[00:53:04.400 --> 00:53:21.120] I mean, but even if we did, even if we did have a Star Trek laser to energize a nucleus properly, we would also need to know very precisely what the energy gap is between the ground state of the nucleus and the higher energy state.
[00:53:21.120 --> 00:53:26.400] We don't know what that precise frequency is, and it's not easy to find that out.
[00:53:26.400 --> 00:53:30.080] A key to the researchers' breakthrough was a very special atom.
[00:53:30.080 --> 00:53:32.320] This is the thorium-229 atom.
[00:53:32.320 --> 00:53:35.200] This is special because thorium is kind of weird.
[00:53:35.440 --> 00:53:40.640] Its energized state is very, very close to the lowest energy state or the ground state.
[00:53:41.040 --> 00:53:47.360] The energy gap is tiny-ish; the next level up is really, really close to the ground state.
[00:53:47.360 --> 00:53:51.360] And there's no other atom that has such a small transition energy.
[00:53:51.360 --> 00:53:56.240] And that's why we don't need a Star Trek laser to manipulate its nucleus.
[00:53:56.240 --> 00:54:02.080] And that's why they were able to do this, because thorium is very, very special in that regard.
[00:54:02.080 --> 00:54:05.680] You don't need a super powerful laser to energize a nucleus.
[00:54:05.680 --> 00:54:09.040] You can use something like an ultraviolet laser, which is what they did.
[00:54:09.040 --> 00:54:14.800] Okay, so the researchers used this vacuum ultraviolet laser, which they made themselves.
[00:54:14.800 --> 00:54:15.680] How cool is that?
[00:54:15.680 --> 00:54:17.120] They just like, yep, this is what we need.
[00:54:17.120 --> 00:54:19.120] We're just going to make this ultraviolet laser.
[00:54:19.120 --> 00:54:24.000] And they were able, for the first time ever, no one's ever been able to do this before.
[00:54:24.000 --> 00:54:27.280] They excited the nucleus of a thorium-229.
[00:54:27.280 --> 00:54:40.280] And not only that, they were able to estimate that ideal frequency, like you know, the perfect frequency, they were able to determine what that frequency needs to be with one-thousandth the previous uncertainty.
[00:54:40.280 --> 00:54:48.120] So they took the standard uncertainty for that frequency that would be the best frequency to stimulate it, and they chopped that into a thousand pieces.
[00:54:48.120 --> 00:54:54.360] Like, here, it's a thousand times more certain, I guess is one way to say it.
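For a sense of why a vacuum ultraviolet laser is the right tool here, a rough photon-energy-to-wavelength conversion works. The ~8.4 eV isomer energy used below is an approximate published value, not a figure from this transcript:

```python
# Convert the thorium-229 isomer's transition energy to a laser wavelength.
# The energy value is approximate and assumed for illustration.
HC_EV_NM = 1239.84   # h*c in eV·nm
energy_ev = 8.36     # approximate Th-229 isomer transition energy, in eV
wavelength_nm = HC_EV_NM / energy_ev
print(f"{wavelength_nm:.0f} nm")  # ~148 nm: vacuum ultraviolet
```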
[00:54:54.360 --> 00:54:56.520] Now, of course, we don't have nuclear clocks yet.
[00:54:56.520 --> 00:55:01.800] This is just kind of like setting the stage, I think, for making the first prototype.
[00:55:01.800 --> 00:55:08.040] And it could still take an amount of years before this is happening, but this was big.
[00:55:08.760 --> 00:55:09.880] This one will go down.
[00:55:09.880 --> 00:55:11.800] So, what is the future of timekeeping?
[00:55:11.800 --> 00:55:20.600] So, I've done a lot of thinking about what are we going to see in the near future in terms of timekeeping, especially with atomic clocks and nuclear clocks.
[00:55:20.600 --> 00:55:27.880] So, I think the current international atomic time standard that uses microwave atomic clocks, right, I think they're going to go away.
[00:55:28.520 --> 00:55:32.200] They're just not as precise as what we have now.
[00:55:32.520 --> 00:55:33.640] They're just not as good.
[00:55:33.640 --> 00:55:41.880] The electrons for these cesium atoms oscillate only about nine billion times per second, which is really limiting their precision at this point.
[00:55:42.040 --> 00:55:49.160] I think in the near future, and even now, optical atomic clocks are really kicking butt, especially this latest news item.
[00:55:49.720 --> 00:55:51.480] They really improve the hell out of it.
[00:55:51.480 --> 00:55:53.480] They have amazing precision because why?
[00:55:53.480 --> 00:55:55.240] They're using optical wavelengths, right?
[00:55:55.400 --> 00:56:03.720] The higher frequency means that there's more ticks to that clock, and so they can subdivide and be much more precise with their timing.
[00:56:03.720 --> 00:56:07.000] So, the clock tick rate is many orders of magnitude higher.
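A rough illustration of "many orders of magnitude more ticks," comparing the cesium microwave frequency with the frequency of one common optical clock transition; the strontium-87 value below is approximate and assumed for illustration:

```python
CESIUM_HZ = 9_192_631_770            # microwave standard (exact by definition)
STRONTIUM_HZ = 429_228_004_229_873   # Sr-87 optical clock transition (approx.)

ratio = STRONTIUM_HZ / CESIUM_HZ
print(f"the optical clock ticks ~{ratio:,.0f}x faster")  # tens of thousands
```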
[00:56:07.000 --> 00:56:10.520] The level of precision would probably make this a standard for timekeeping.
[00:56:11.560 --> 00:56:18.800] And maybe in five or six years, I think they may abandon the microwave cesium atomic clocks and go for the opticals.
[00:56:18.960 --> 00:56:24.240] Okay, but nuclear clocks, though, they don't have, I was very disappointed by this, actually.
[00:56:24.400 --> 00:56:28.800] The nuclear clocks won't have the precision of the optical atomic clocks.
[00:56:28.800 --> 00:56:32.400] And that's basically because of the frequency of the UV laser light, right?
[00:56:32.400 --> 00:56:36.000] I mean, the frequency of the UV laser light is not what the optical clocks use.
[00:56:36.000 --> 00:56:41.840] It's just not as good in terms of being precise and having so many ticks of that clock, as I've been saying.
[00:56:41.840 --> 00:56:46.400] But the nuclear clocks will, when we have them, they will have some very interesting advantages.
[00:56:46.400 --> 00:56:48.320] And one of these is stability.
[00:56:48.320 --> 00:56:49.600] One is stability.
[00:56:49.600 --> 00:56:58.560] These clocks will be far more stable because an atomic nucleus, think about it, it's much more isolated from the environment than an electron cloud is.
[00:56:58.560 --> 00:57:05.200] So think: the electron cloud that the atomic clocks depend on is huge compared to the nucleus.
[00:57:05.200 --> 00:57:09.760] A nucleus is five orders of magnitude smaller than the electron cloud.
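The five-orders-of-magnitude claim is just the ratio of typical length scales; a trivial check, using round-number radii as assumptions:

```python
import math

ATOM_RADIUS_M = 1e-10     # typical electron-cloud (atomic) radius, ~1 angstrom
NUCLEUS_RADIUS_M = 1e-15  # typical nuclear radius, ~1 femtometer

orders = math.log10(ATOM_RADIUS_M / NUCLEUS_RADIUS_M)
print(round(orders))  # → 5
```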
[00:57:09.760 --> 00:57:18.320] So that electron cloud is out there and it can be influenced by ambient electromagnetic fields and other things, and that's what hurts the stability of the atomic clocks.
[00:57:18.320 --> 00:57:23.120] Nuclear clocks, on the other hand, with their tiny nucleus, it's really isolated.
[00:57:23.280 --> 00:57:29.600] So it's much more able to just like ignore things that could interfere with the atomic clock.
[00:57:29.600 --> 00:57:34.960] So that's a huge advantage with that stability, much, much better than the regular atomic clocks.
[00:57:34.960 --> 00:57:37.680] And this stability leads to another advantage.
[00:57:37.840 --> 00:57:45.520] You can pack the atoms that are involved in the timekeeping very, very close together, meaning that nuclear clocks won't need thorium gas.
[00:57:45.520 --> 00:57:47.600] You know, they won't need to be in a gas form.
[00:57:47.600 --> 00:57:55.760] It could be embedded in a solid material, meaning that these nuclear clocks can be solid state.
[00:57:55.760 --> 00:57:58.560] And you know, solid state is a huge advantage.
[00:57:58.560 --> 00:58:04.280] It's like none to very few moving parts at all, which means that it'd be much more portable.
[00:58:04.280 --> 00:58:04.840] And who knows?
[00:57:59.680 --> 00:58:07.080] I mean, it's just a huge advantage for these.
[00:58:07.320 --> 00:58:12.520] The stability of this potential nuclear clock also has scientific advantages.
[00:58:12.520 --> 00:58:20.360] Nuclear clocks should have advantages testing theories of fundamental physics beyond the standard model, which I've been dying for for how many decades now?
[00:58:20.520 --> 00:58:27.000] So maybe it could actually contribute and find some tiny little hints of physics beyond the standard model.
[00:58:27.000 --> 00:58:46.520] We may find that fundamental constants are not as constant as we think, and that's one of the things that people really are pinning their hopes on for nuclear clocks: if anything can look at the constants of physics closely enough to see variation that we're not seeing now, these nuclear clocks might be able to do that.
[00:58:46.520 --> 00:58:49.320] And they even might be able to find clues to dark matter.
[00:58:49.320 --> 00:58:53.560] So I'll end with one advantage for nuclear clocks that I read.
[00:58:53.560 --> 00:59:00.360] I couldn't confirm it, but I did read this on one website, and they said that the impact on GPS could be dramatic.
[00:59:00.360 --> 00:59:01.160] So what do you think?
[00:59:02.040 --> 00:59:04.440] What's the accuracy of conventional GPS today?
[00:59:04.440 --> 00:59:05.080] It's like, what?
[00:59:05.400 --> 00:59:08.280] It's like 4.9 meters, so 16 feet.
[00:59:08.280 --> 00:59:09.640] So that's the accuracy.
[00:59:09.640 --> 00:59:19.400] So this one website was saying that nuclear clocks could improve the accuracy of GPS down to the millimeter scale, millimeters.
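GPS position error is roughly clock error times the speed of light, so the millimeter claim implies picosecond-level timing. A quick sketch of that conversion:

```python
C = 299_792_458  # speed of light, in m/s

def clock_error_for(position_error_m: float) -> float:
    """Timing error (seconds) corresponding to a given GPS ranging error."""
    return position_error_m / C

print(f"{clock_error_for(4.9):.1e} s")    # ~1.6e-8 s (16 ns) for today's ~4.9 m
print(f"{clock_error_for(0.001):.1e} s")  # ~3.3e-12 s for millimeter accuracy
```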
[00:59:19.400 --> 00:59:27.960] And so, Steve, if we had that and you had GPS on your keys, we could have said, yep, it's definitely in the lake, and you're never going to get it again.
[00:59:27.960 --> 00:59:28.760] Right.
[00:59:29.720 --> 00:59:30.760] Oh, nice tie-in.
[00:59:30.880 --> 00:59:31.640] Yeah, thank you.
[00:59:31.720 --> 00:59:32.040] I'm done.
[00:59:32.040 --> 00:59:33.960] Let's just end this damn segment.
[00:59:34.920 --> 00:59:36.120] All right, good job, Bob.
[00:59:36.680 --> 00:59:41.400] All right, Evan, tell us about this updated poll on creationism.
[00:59:41.400 --> 00:59:45.840] Yeah, poll on creationism that came out just a couple of days ago.
[00:59:44.840 --> 00:59:48.000] Poll by the firm called Gallup.
[00:59:48.240 --> 00:59:59.760] I'm sure we are all familiar with that, but for those who are not, it is recognized as one of the leading polling organizations in the United States, and they take polls on all sorts of things all the time.
[00:59:59.760 --> 01:00:02.320] But this one, they've been tracking for a while.
[01:00:02.320 --> 01:00:06.800] The last time they took this particular poll was 2019.
[01:00:06.800 --> 01:00:10.560] So we're five years afterwards, and they did it again.
[01:00:10.560 --> 01:00:13.760] Here is exactly how it went down.
[01:00:14.000 --> 01:00:30.640] These were telephone interviews that were conducted between May 1st and May 23rd of this year with a random sample of 1,024 adults ages 18 or over living in all 50 United States and the District of Columbia.
[01:00:30.640 --> 01:00:39.600] The results based on the sample of national adults, the margin of sampling error was plus or minus four percentage points at the 95% confidence level.
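As an aside, the textbook margin-of-error formula for a simple random sample of this size gives a bit under the ±4 points quoted; pollsters like Gallup typically report a larger figure to account for weighting and design effects. A sketch:

```python
import math

def margin_of_error(n: int, p: float = 0.5, z: float = 1.96) -> float:
    """Worst-case sampling margin of error at ~95% confidence."""
    return z * math.sqrt(p * (1 - p) / n)

print(f"{margin_of_error(1024):.1%}")  # 3.1% from pure sampling error alone
```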
[01:00:39.600 --> 01:00:40.800] So there you go.
[01:00:41.280 --> 01:00:44.080] Here's the question that they asked.
[01:00:44.080 --> 01:00:47.840] I mean, they asked a lot of questions in this particular poll, but one in particular was this one.
[01:00:47.920 --> 01:00:49.280] This is what made the headlines.
[01:00:49.280 --> 01:00:55.040] Which of the following statements comes closest to your views on the origin and development of human beings?
[01:00:55.040 --> 01:00:56.960] And there's basically three choices.
[01:00:56.960 --> 01:01:05.840] Number one: human beings have developed over millions of years from less advanced forms of life, but God guided this process.
[01:01:06.160 --> 01:01:16.160] Number two, human beings have developed over millions of years from less advanced forms of life, but God had no part in this process.
[01:01:16.480 --> 01:01:25.920] And option three: God created human beings pretty much in their present form at one time within the last 10,000 years or so.
[01:01:25.920 --> 01:01:32.760] The fourth option was basically no opinion or no answer, which 5% of these people gave.
[01:01:29.520 --> 01:01:34.440] So here's what I'd like to do.
[01:01:35.240 --> 01:01:36.440] Ignore the 5%, right?
[01:01:36.440 --> 01:01:41.240] So we have 95% of these poll respondents that did answer one, two, or three.
[01:01:41.560 --> 01:01:44.280] How do you think this divided up?
[01:01:44.600 --> 01:01:49.640] And I want to know each of your opinions as to how you think this broke down.
[01:01:49.640 --> 01:01:55.640] Again, the first option was evolution, but God guided the process.
[01:01:55.640 --> 01:01:59.160] Number two, evolution, God had no part of it.
[01:01:59.160 --> 01:02:05.080] Number three, God creating human beings as they are today 10,000 years or more recently.
[01:02:05.080 --> 01:02:10.760] So, Steve, how do you think of those three, of the 95% left, how did that divide up, do you think?
[01:02:10.760 --> 01:02:13.480] I mean, I've been following this poll for 30 years, so...
[01:02:13.720 --> 01:02:15.000] Bob, how do you think this?
[01:02:15.160 --> 01:02:17.640] And I know that I know I looked, I know all the recent numbers too.
[01:02:17.640 --> 01:02:18.600] I saw the SR.
[01:02:18.600 --> 01:02:20.040] So, yeah, I'll pass.
[01:02:20.040 --> 01:02:23.480] I'd say just God, 55%.
[01:02:23.480 --> 01:02:23.880] Okay.
[01:02:24.200 --> 01:02:24.840] What?
[01:02:24.840 --> 01:02:28.200] You mean like in the last 10,000 years?
[01:02:28.200 --> 01:02:28.520] Right.
[01:02:28.760 --> 01:02:30.040] No, I think that's like 20%.
[01:02:30.200 --> 01:02:30.520] What do you mean?
[01:02:30.680 --> 01:02:32.280] What does 10,000 years mean?
[01:02:32.280 --> 01:02:36.280] I thought just God meant like they don't believe in evolution.
[01:02:36.920 --> 01:02:38.680] Well, let me read to you the third option again.
[01:02:38.680 --> 01:02:43.160] God created human beings pretty much in their present form at one time within the last 10,000 years.
[01:02:43.160 --> 01:02:45.000] Oh, I missed that last 10,000 years.
[01:02:45.000 --> 01:02:45.640] Oh, what?
[01:02:45.640 --> 01:02:45.880] All right.
[01:02:45.880 --> 01:02:48.360] So put that one at 25%.
[01:02:48.360 --> 01:02:49.800] Okay, I would put it.
[01:02:49.800 --> 01:02:51.080] All right, Kara says 25.
[01:02:51.640 --> 01:02:53.480] Jay, you think, what do you think?
[01:02:53.480 --> 01:02:55.160] I'll say that's 20%.
[01:02:55.160 --> 01:02:55.640] 25%.
[01:02:55.800 --> 01:02:56.280] 20%.
[01:02:56.680 --> 01:02:57.080] 20%.
[01:02:57.320 --> 01:02:59.800] Now, for each of you, Bob, Kerry, and Jay, I'll skip Steve.
[01:02:59.800 --> 01:03:04.840] So, Bob, what do you think about evolution and God had no part in the process?
[01:03:04.840 --> 01:03:08.040] Does that mean that God definitely exists, though?
[01:03:08.040 --> 01:03:08.680] Is that the implication?
[01:03:08.880 --> 01:03:09.720] Don't read into it, Bob.
[01:03:09.960 --> 01:03:10.680] All right, I won't read into it.
[01:03:10.920 --> 01:03:11.720] There's only three choices.
[01:03:12.040 --> 01:03:12.440] This is how that is.
[01:03:12.520 --> 01:03:14.360] I'll say 35% for that.
[01:03:14.360 --> 01:03:16.480] Okay, 35%.
[01:03:14.520 --> 01:03:19.600] That's 60, so that means 35% for the first category.
[01:03:19.760 --> 01:03:20.240] Got it?
[01:03:20.400 --> 01:03:23.600] Kara, God had no part in the process?
[01:03:23.600 --> 01:03:28.720] I think it's 40% scientific evolution and 30% religious evolution.
[01:03:28.720 --> 01:03:29.840] And Jay, what do you think?
[01:03:29.840 --> 01:03:32.160] God had no part in the process.
[01:03:32.480 --> 01:03:33.840] Very low percentage.
[01:03:33.840 --> 01:03:34.800] You think it's low?
[01:03:34.800 --> 01:03:37.200] Lower than 40%, or 35% like Bob?
[01:03:37.600 --> 01:03:39.520] Yeah, I'd even go lower than 30%.
[01:03:39.520 --> 01:03:40.960] Should I put you in at a 30%?
[01:03:40.960 --> 01:03:41.680] Yeah, go ahead.
[01:03:41.680 --> 01:03:45.200] All right, so you're being safe, and therefore 45% in the first category.
[01:03:45.200 --> 01:03:48.480] Okay, so here are the results.
[01:03:48.480 --> 01:03:48.880] Okay.
[01:03:49.200 --> 01:03:53.840] Man developed with God's guiding hand, 34%.
[01:03:53.840 --> 01:03:56.160] So, Bob, you were pretty much on the mark with that.
[01:03:56.160 --> 01:03:56.320] Yeah.
[01:03:56.320 --> 01:03:57.200] Good job, Bob.
[01:03:57.200 --> 01:03:57.920] Very well done.
[01:03:58.560 --> 01:04:01.600] Man developed, but God had no part in the process.
[01:04:01.600 --> 01:04:03.200] 24%.
[01:04:03.840 --> 01:04:06.720] You guys basically flipped evolution and creationism in the poll.
[01:04:06.720 --> 01:04:07.600] Oh, Jesus.
[01:04:07.920 --> 01:04:08.880] Has it always been that way?
[01:04:09.440 --> 01:04:10.320] Oh, is it getting worse?
[01:04:11.200 --> 01:04:12.160] It was much worse.
[01:04:12.160 --> 01:04:12.960] Yeah, it was worse.
[01:04:13.440 --> 01:04:17.040] I'll give you the historical worsts on these in a second.
[01:04:17.040 --> 01:04:22.000] And God created man in present form, 37%.
[01:04:22.240 --> 01:04:23.840] Jesus, that's so high.
[01:04:24.080 --> 01:04:27.760] That is higher than what all of you basically guessed.
[01:04:28.560 --> 01:04:32.400] And probably, I think I would have fallen in the same category as you guys.
[01:04:33.120 --> 01:04:37.360] Because we also are keeping up with the Gallup polls on the nones, right?
[01:04:37.360 --> 01:04:39.520] On actual religious belief.
[01:04:39.520 --> 01:04:43.040] And fewer and fewer people are identifying as religious.
[01:04:43.360 --> 01:04:44.560] Yeah, but weird that.
[01:04:45.520 --> 01:04:49.600] I think that has more to do with not being down with organized religion.
[01:04:49.600 --> 01:04:53.760] It doesn't mean they're not spiritual and don't hold magical beliefs.
[01:04:53.760 --> 01:04:54.400] True.
[01:04:54.400 --> 01:04:54.880] Yeah.
[01:04:55.440 --> 01:04:59.280] So they've been doing this poll since 1982.
[01:04:59.280 --> 01:05:01.720] This was the first time they asked this question.
[01:05:01.720 --> 01:05:02.520] Yeah, I know.
[01:04:59.760 --> 01:05:04.200] They've been tracking this a while now.
[01:05:04.520 --> 01:05:11.960] And back in 1982, 44% were in the category of God-created man in present form.
[01:05:11.960 --> 01:05:17.720] So it's gone down, which is good, from 44 to 37.
[01:05:17.720 --> 01:05:19.640] Doesn't sound like a lot, but it is significant.
[01:05:19.640 --> 01:05:23.720] And this time it's the lowest that's ever been polled.
[01:05:23.720 --> 01:05:31.800] So that's the point, and that's kind of why it made the headlines is because it's the lowest that they've recorded.
[01:05:31.800 --> 01:05:38.200] But here, back in 1982, man developed, but God had no part in the process.
[01:05:38.200 --> 01:05:40.520] 1982, 9%.
[01:05:41.160 --> 01:05:41.800] 9%.
[01:05:41.800 --> 01:05:42.280] That's about 20%.
[01:05:42.520 --> 01:05:43.000] 24%.
[01:05:43.800 --> 01:05:44.840] Yeah, but now it's 24%.
[01:05:45.400 --> 01:05:45.720] So that's good.
[01:05:45.960 --> 01:05:46.520] That's huge.
[01:05:46.920 --> 01:05:47.880] Yeah, that's a big jump.
[01:05:47.880 --> 01:05:50.440] And that's tracking with the nones, I think, Kara.
[01:05:50.920 --> 01:05:51.480] Yeah.
[01:05:51.720 --> 01:05:52.440] N-O-N-E.
[01:05:52.760 --> 01:05:52.920] Yeah.
[01:05:52.920 --> 01:05:53.080] Yeah.
[01:05:53.560 --> 01:05:53.720] Right.
[01:05:54.440 --> 01:05:55.160] Not N-U-N.
[01:05:55.160 --> 01:05:55.480] Yeah.
[01:05:55.480 --> 01:05:55.640] Yeah.
[01:05:56.200 --> 01:05:57.080] Exactly.
[01:05:57.080 --> 01:06:05.960] So I guess the point is that it's a good trend, it's improving, and there's still a lot of work to do here.
[01:06:06.120 --> 01:06:07.000] It's still horrible.
[01:06:07.000 --> 01:06:07.480] Yeah.
[01:06:07.480 --> 01:06:08.360] It is horrible.
[01:06:08.360 --> 01:06:18.120] I also wonder how many of that 5% would have been in that category if not for exactly the point that Bob raised, which is the wording of this poll is not good.
[01:06:18.120 --> 01:06:18.840] It's not great.
[01:06:19.240 --> 01:06:24.200] It sounds like it assumes that God exists the way that it's worded.
[01:06:24.200 --> 01:06:29.640] So somebody who is, like, an atheist, who's like, God's hand wasn't involved?
[01:06:29.640 --> 01:06:31.000] Well, what if there's no God's hand?
[01:06:31.000 --> 01:06:31.880] How do I answer that question?
[01:06:32.040 --> 01:06:35.240] But it does say what's closest to your views.
[01:06:35.640 --> 01:06:36.680] But you're right.
[01:06:36.920 --> 01:06:40.360] The way you ask these questions has a big influence on it.
[01:06:40.680 --> 01:06:45.920] We talked about this with Jeannie Scott, you know, that the different surveys you ask the question different ways.
[01:06:46.240 --> 01:06:57.840] And the criticism of the Gallup poll has always been that people may answer, you know, may give the creationism answer because they don't want to seem anti-God.
[01:06:58.160 --> 01:07:08.160] You know, even if they don't really believe that. And you do get different results depending on how God-focused the questions are, you know, the way that they're asking these questions.
[01:07:08.160 --> 01:07:16.000] If you ask it more of a, from a science-focused perspective, more people will endorse evolution if you don't have God in the question.
[01:07:16.000 --> 01:07:16.480] You know what I mean?
[01:07:16.720 --> 01:07:17.440] Absolutely.
[01:07:17.440 --> 01:07:24.960] I, you know, I did my dissertation on medical aid and dying, and Gallup, I think it was Gallup, asked the medical aid and dying question two different ways.
[01:07:24.960 --> 01:07:27.840] In one of them, they called it assisted suicide.
[01:07:27.840 --> 01:07:30.240] And in another, they'd never use the word suicide.
[01:07:30.240 --> 01:07:31.680] And the answers were wildly different.
[01:07:31.920 --> 01:07:32.880] Wildly different, yeah.
[01:07:32.880 --> 01:07:34.240] Yeah, because it's prejudicial.
[01:07:34.240 --> 01:07:36.880] There's certain words that bias people's responses.
[01:07:36.880 --> 01:07:45.440] I also didn't like that the secular or scientific evolution choice said that we evolved from simpler forms.
[01:07:45.440 --> 01:07:47.600] Yeah, even that was weird.
[01:07:47.600 --> 01:07:50.160] And none of the other ones use the same verbiage.
[01:07:50.160 --> 01:07:51.520] Like there should be continuity.
[01:07:51.920 --> 01:07:52.400] Yeah.
[01:07:52.400 --> 01:07:53.040] Yeah.
[01:07:53.040 --> 01:07:53.280] Right.
[01:07:53.280 --> 01:07:54.720] It wasn't even a good, yeah.
[01:07:54.720 --> 01:07:59.280] If you're somebody who, you know, knows a lot about evolution, that wording would not sit well with you.
[01:07:59.280 --> 01:08:04.640] But the advantage of this poll, as Evan said, is this is 42 years.
[01:08:04.960 --> 01:08:07.280] It's the same exact question over time.
[01:08:07.280 --> 01:08:08.880] That has utility.
[01:08:09.520 --> 01:08:10.160] It does.
[01:08:10.320 --> 01:08:12.400] And saying which one is the closest to you.
[01:08:12.400 --> 01:08:13.520] Like, at least there's that qualifier.
[01:08:13.760 --> 01:08:17.760] Yeah, so it tells you more about the change than about the absolute numbers.
[01:08:18.080 --> 01:08:24.720] Right, because the absolute numbers have more to do with the wording of the survey, whereas the change tells you more about, I think, change in society.
[01:08:24.720 --> 01:08:29.520] So at least these, you know, at least these numbers are going, you're heading in the right direction.
[01:08:29.560 --> 01:08:31.000] Yeah, slowly trending in the right direction.
[01:08:31.160 --> 01:08:32.040] Not all trends are bad.
[01:08:32.040 --> 01:08:36.440] I do feel like we do sound like we're doomsayers, like everything's horrible and getting worse and whatever.
[01:08:29.680 --> 01:08:37.000] We're really not.
[01:08:37.080 --> 01:08:39.800] We just follow the evidence whichever way it goes.
[01:08:39.800 --> 01:08:44.920] And like there are things, some surprisingly good trends, like scientific literacy is way up.
[01:08:44.920 --> 01:08:45.720] You know what I mean?
[01:08:46.360 --> 01:08:47.160] Even given everything.
[01:08:47.640 --> 01:08:49.160] Most things are trending.
[01:08:49.480 --> 01:08:50.520] Most things are actually trending.
[01:08:51.240 --> 01:08:52.920] Yeah, when you zoom out enough.
[01:08:52.920 --> 01:08:53.400] Yeah.
[01:08:53.640 --> 01:08:54.200] Yeah, yeah, yeah.
[01:08:54.280 --> 01:08:57.560] I mean, short-term trends can go anywhere.
[01:08:57.560 --> 01:09:01.880] But yeah, if you look at the decadal, you know, kind of trend, people are getting smarter.
[01:09:01.880 --> 01:09:03.160] They're getting more savvy.
[01:09:03.160 --> 01:09:04.760] They're getting more scientifically literate.
[01:09:05.480 --> 01:09:06.520] Crime is way down.
[01:09:06.760 --> 01:09:09.480] Yeah, there's a lot of good trends happening.
[01:09:09.480 --> 01:09:11.080] Not all the trends are bad.
[01:09:11.080 --> 01:09:14.360] Obviously, the bad trends are newsworthy, right?
[01:09:14.360 --> 01:09:16.520] Like if it bleeds, it leads kind of thing.
[01:09:16.840 --> 01:09:20.840] And they're often taken out of context, and they're often, the resolution is very low.
[01:09:20.840 --> 01:09:21.400] Right.
[01:09:21.720 --> 01:09:22.040] Right.
[01:09:22.040 --> 01:09:23.240] Ooh, more of this.
[01:09:23.240 --> 01:09:25.800] And it's like, yeah, over the last six months, okay.
[01:09:25.800 --> 01:09:26.600] Yeah, right.
[01:09:27.320 --> 01:09:30.120] It's why my financial planner tells me not to look at my investment.
[01:09:30.360 --> 01:09:31.320] Don't even look at it.
[01:09:31.880 --> 01:09:36.200] Until you get within a few years or a decade, whatever.
[01:09:36.200 --> 01:09:41.400] Yeah, when you're at your age, I made sure my daughter started investing already.
[01:09:41.800 --> 01:09:43.480] And obviously, we helped them get started.
[01:09:43.480 --> 01:09:43.800] Same.
[01:09:44.360 --> 01:09:46.840] A very young age, absolutely just get started.
[01:09:46.840 --> 01:09:48.280] And I'm like, don't even look at it.
[01:09:48.280 --> 01:09:52.520] Just don't even let it ride for 50 years, whatever.
[01:09:53.320 --> 01:09:55.160] That's your retirement account.
[01:09:55.160 --> 01:09:56.440] Just don't worry about it.
[01:09:56.440 --> 01:09:56.920] Yep.
[01:09:56.920 --> 01:09:57.480] Yeah.
[01:09:58.120 --> 01:09:58.520] All right.
[01:09:58.520 --> 01:09:59.320] Thanks, Evan.
[01:09:59.320 --> 01:10:01.240] Jay, it's who's that noisy time?
[01:10:01.240 --> 01:10:01.960] All right, guys.
[01:10:01.960 --> 01:10:04.120] Last week I played This Noisy.
[01:10:15.840 --> 01:10:18.320] That's an annoying sound, right?
[01:10:20.240 --> 01:10:21.760] I had a ton of people email me.
[01:10:21.760 --> 01:10:22.400] You know, I was away.
[01:10:22.720 --> 01:10:25.040] You know, I had two weeks of answers this week.
[01:10:25.040 --> 01:10:26.800] You're not interested in what we think?
[01:10:26.800 --> 01:10:27.600] I am.
[01:10:31.600 --> 01:10:33.360] You have an opinion, Steve?
[01:10:33.360 --> 01:10:38.560] I mean, it does sound like a bird to me, but then a lot of things sound like birds to me.
[01:10:38.880 --> 01:10:40.160] But it could be something electronic.
[01:10:40.160 --> 01:10:41.680] But a lot of birds sound electronic.
[01:10:41.760 --> 01:10:42.320] They sound electronic.
[01:10:42.400 --> 01:10:42.640] They do.
[01:10:42.640 --> 01:10:43.680] A lot of birds sound electronic.
[01:10:43.840 --> 01:10:45.840] Starlings sound so electronic.
[01:10:45.840 --> 01:10:48.320] Brown-headed cowbirds are always the crazy ones.
[01:10:48.640 --> 01:10:49.600] All right, anybody else?
[01:10:49.600 --> 01:10:52.320] I mean, you know, stop me on my tricks.
[01:10:52.960 --> 01:10:59.040] All right, well, a listener named Keely Hill wrote in, and Keely said, it sounds outside, but I don't hear wind.
[01:10:59.040 --> 01:11:05.840] I'll guess a malfunctioning fire alarm of a warehouse in the woods heard from a short distance outside.
[01:11:05.840 --> 01:11:07.120] I think this is a great guess.
[01:11:07.120 --> 01:11:12.480] It's not correct, but yeah, this thing, whatever it is, sounds like an alarm.
[01:11:12.480 --> 01:11:14.000] Visto Tutti wrote in.
[01:11:14.000 --> 01:11:18.080] He says, sounds like a bird, but I always think it's a bird, and I'm always wrong.
[01:11:18.080 --> 01:11:24.000] So I'm saying it's a windmill with squeaky bearings, and it really needs some lubrication.
[01:11:24.000 --> 01:11:26.960] Ladies and gentlemen, Visto Tutti.
[01:11:26.960 --> 01:11:28.640] You are wrong, sir.
[01:11:28.960 --> 01:11:30.640] But in all the right ways.
[01:11:30.640 --> 01:11:31.840] Do you know what I mean?
[01:11:32.800 --> 01:11:36.000] Michael Blaney wrote in, Hey, Jay, sounds like a bird to me.
[01:11:36.000 --> 01:11:38.400] So I'm guessing it's the northern mockingbird.
[01:11:38.400 --> 01:11:41.760] As I read, they can sound like a car alarm going off.
[01:11:42.400 --> 01:11:48.960] Michael, you're the first person that guessed it was an actual bird, which puts you in Steve's camp.
[01:11:48.960 --> 01:11:50.080] I will say no more.
[01:11:50.080 --> 01:11:52.480] You know how you know when you're listening to a mockingbird?
[01:11:52.480 --> 01:11:52.880] How?
[01:11:52.360 --> 01:11:52.760] No.
[01:11:53.040 --> 01:11:56.960] They cycle through three or four different bird calls, all one after another.
[01:11:57.120 --> 01:11:57.840] That's cool.
[01:11:57.960 --> 01:11:59.600] A listener named Rob Webb wrote in.
[01:11:59.600 --> 01:12:07.560] He said, My 11-year-old son, Socorso, would love to finally guess, after listening to you all for many years.
[01:12:07.560 --> 01:12:10.920] He said it's a broken backing up car horn.
[01:12:10.920 --> 01:12:14.360] And I got to be honest with you guys, this is my favorite guess so far.
[01:12:14.840 --> 01:12:17.320] Because it does sound like that a lot.
[01:12:17.320 --> 01:12:22.280] And then Rob said that he thinks it's a parrot scratching its nails on a chalkboard.
[01:12:22.280 --> 01:12:28.200] If I was in a room with a parrot that was doing that, one of us would not make it out of the room alive.
[01:12:29.480 --> 01:12:31.880] So, okay, I have a few winners here.
[01:12:31.880 --> 01:12:40.120] The first person to write in the correct answer was Jennifer Narang, and she said, Evening, I think it is a bare-throated bellbird.
[01:12:40.120 --> 01:12:41.720] And she is correct.
[01:12:41.720 --> 01:12:43.960] Lydia Parsons wrote in, Hello, Jay and team.
[01:12:43.960 --> 01:12:47.080] My guess for this week's Who's That Noisy is the call of the Bellbird.
[01:12:47.400 --> 01:12:48.040] Bird.
[01:12:48.040 --> 01:12:49.880] And Lydia is correct.
[01:12:49.880 --> 01:12:55.240] And then Wilson Fazio Martins wrote in and said, Hi, Jay.
[01:12:55.240 --> 01:13:00.440] I listened to SGU since the beginning, but this is my first time on Who's That Noisy because I know exactly what the noisy is.
[01:13:00.440 --> 01:13:03.320] It's a Brazilian bird called Araponga.
[01:13:03.720 --> 01:13:05.640] And I can't pronounce the other words that he wrote here.
[01:13:05.640 --> 01:13:10.040] It sounds like an anvil, and some even call it a blacksmith bird.
[01:13:10.360 --> 01:13:13.560] Wikipedia translates the Araponga to a bellbird.
[01:13:13.560 --> 01:13:15.640] So, yeah, so that's a bellbird.
[01:13:15.640 --> 01:13:20.600] Apparently, there's different kinds of bellbirds, but they all make this ridiculous noise.
[01:13:20.600 --> 01:13:28.920] I'm going to play that for you again.
[01:13:28.920 --> 01:13:31.240] Okay, so you have one of these in your backyard.
[01:13:31.880 --> 01:13:36.360] How long until you start thinking creatively about how to get rid of that bird?
[01:13:36.600 --> 01:13:38.840] Maybe it just fades into the background at some point.
[01:13:38.840 --> 01:13:39.560] I don't think so.
[01:13:39.560 --> 01:13:40.520] You don't think so?
[01:13:40.520 --> 01:13:40.920] Nope.
[01:13:40.920 --> 01:13:41.160] Sorry.
[01:13:41.320 --> 01:13:43.000] Does it acclimate to that sound?
[01:13:43.000 --> 01:13:43.800] Nope.
[01:13:44.440 --> 01:13:44.960] No.
[01:13:44.960 --> 01:13:48.000] Some noises I pick are highly irritating.
[01:13:48.000 --> 01:13:50.080] But my bird radar was on this week.
[01:13:50.080 --> 01:13:50.880] Yeah, you got it.
[01:13:44.600 --> 01:13:51.600] I mean, you know, it wasn't.
[01:13:52.640 --> 01:13:56.400] When you hear anything like that, you know, you're 50% right if you're saying it's a bird.
[01:13:56.400 --> 01:14:00.080] You know, it's some crazy klaxon or a bird.
[01:14:00.080 --> 01:14:00.560] Yeah.
[01:14:00.560 --> 01:14:02.880] All right, I got a new noisy for you guys this week.
[01:14:03.040 --> 01:14:06.480] It was sent in by a listener named Alex Freshie.
[01:14:06.480 --> 01:14:09.760] F-R-E-S-C-H-I.
[01:14:09.760 --> 01:14:11.120] And here it is.
[01:14:19.040 --> 01:14:21.040] Very interesting, right?
[01:14:21.040 --> 01:14:22.000] Hmm.
[01:14:22.000 --> 01:14:24.480] If you think you know what it is, Kara, you know what to do, right?
[01:14:24.480 --> 01:14:25.200] You email me.
[01:14:25.200 --> 01:14:27.920] Well, you could email me at my regular email address, Kara, if you want to.
[01:14:28.160 --> 01:14:30.000] I could, but I won't give that out.
[01:14:30.000 --> 01:14:34.000] No, but email me at WTN at theskepticsguide.org.
[01:14:34.000 --> 01:14:37.040] And if you heard something cool, please take the time.
[01:14:37.040 --> 01:14:38.480] It takes you seconds to do this.
[01:14:38.480 --> 01:14:39.440] It's just email.
[01:14:39.440 --> 01:14:40.560] Just email me.
[01:14:40.560 --> 01:14:41.040] It's good.
[01:14:41.040 --> 01:14:42.560] It's good for the show.
[01:14:42.560 --> 01:14:43.040] All right.
[01:14:43.040 --> 01:14:49.760] So, Steve, I think I am formally shutting down ticket sales for the 1000th show.
[01:14:49.920 --> 01:14:50.400] Wow.
[01:14:50.560 --> 01:14:51.440] We're full.
[01:14:51.440 --> 01:14:52.720] We are full.
[01:14:52.720 --> 01:14:54.560] Don't say we didn't warn you.
[01:14:54.560 --> 01:14:55.040] That's it.
[01:14:55.120 --> 01:14:56.480] We're at capacity.
[01:14:57.360 --> 01:14:59.600] I might let one more ticket be sold.
[01:14:59.600 --> 01:15:00.320] A golden ticket.
[01:15:00.400 --> 01:15:01.040] A golden ticket.
[01:15:01.520 --> 01:15:02.160] I wish I could do that.
[01:15:02.160 --> 01:15:03.520] I wish I could flag it.
[01:15:03.760 --> 01:15:09.120] There's lots of things that the SGU would like you guys to know, and I'm going to tell you all of them as quickly as I can.
[01:15:09.120 --> 01:15:14.320] One thing is, we are coming up on our 1000th show, and this means a couple of things.
[01:15:14.320 --> 01:15:27.200] One, we've been doing this for 20 years, and if you find any value in the work that we do, if you think that the effort that we put into this show is good for humanity, then please consider becoming a patron of ours.
[01:15:27.200 --> 01:15:30.040] It really does help us do all the things that we want to do.
[01:15:30.040 --> 01:15:33.320] It allows us to put more time and energy into the SGU.
[01:15:29.600 --> 01:15:42.200] So you can go to patreon.com forward slash skepticsguide, that's one word, and you could read up on what it would mean to become a patron of the SGU.
[01:15:42.360 --> 01:15:44.120] Next thing, you can join our mailing list.
[01:15:44.120 --> 01:15:49.080] We send out a mailer every week that outlines everything that we've created the previous week.
[01:15:49.080 --> 01:15:50.760] And there's some fun stuff in there.
[01:15:50.760 --> 01:15:53.240] And, you know, we have a word of the week and all that stuff.
[01:15:53.240 --> 01:15:55.240] So if you're interested, please join us.
[01:15:55.640 --> 01:15:57.800] I will drop a little teaser here.
[01:15:57.800 --> 01:16:11.480] I am in the middle of it, it probably won't happen until after the Chicago show or 1000th show, but I'm in the middle of creating an SGU puzzle game that will be a weekly puzzle game.
[01:16:12.200 --> 01:16:13.720] And what version?
[01:16:15.160 --> 01:16:17.240] I'm testing out a bunch of different things.
[01:16:17.240 --> 01:16:22.680] I've been using ChatGPT to help me come up with what's possible.
[01:16:23.800 --> 01:16:25.000] It codes really well.
[01:16:25.240 --> 01:16:28.920] You can have ChatGPT make some simple mechanics and stuff.
[01:16:28.920 --> 01:16:34.120] And then I was going to ask a programmer that Ian and I know to probably help us actually do it for real.
[01:16:34.120 --> 01:16:37.160] But I'm just looking at different things right now.
[01:16:37.720 --> 01:16:42.360] I want to come up with something that's kind of like connections, like New York Times connections type of game.
[01:16:43.000 --> 01:16:51.240] But I'm not kidding, after the logical fallacy themed puzzle, I was thinking, huh, we should do a skeptical SGU-themed crossword puzzle.
[01:16:51.240 --> 01:16:52.840] We are on the same page, Steve.
[01:16:52.840 --> 01:16:53.320] Yeah.
[01:16:53.320 --> 01:16:55.240] But I'm actually getting the work done.
[01:16:55.240 --> 01:16:57.240] Yes, it's a good time.
[01:16:57.240 --> 01:16:58.360] This is your full-time job.
[01:16:59.560 --> 01:17:01.880] But how hard is it to make crossword puzzles?
[01:17:01.920 --> 01:17:03.160] I mean, it's got to be crazy hard.
[01:17:03.160 --> 01:17:06.760] But now I see that there are compilers that will kind of do all the, I guess, the hard work.
[01:17:06.760 --> 01:17:08.200] You're going to get the computer to do it for you.
[01:17:08.680 --> 01:17:11.880] Yeah, I think it's hard to make unique crossword puzzles.
[01:17:11.880 --> 01:17:16.400] It's probably easier to use what has already been made to your advantage.
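What those compiler tools are doing under the hood is basically constraint-satisfaction search. A minimal sketch of a backtracking fill, with a made-up grid and word list purely for illustration:

```python
# Minimal crossword-fill sketch: a "compiler" is essentially a
# backtracking search that places words into slots without letter
# conflicts. The grid, slots, and word list here are invented for
# illustration; real tools use large curated dictionaries.
WORDS = ["SGU", "GAS", "USE", "SUN", "GUS"]

# Each slot is a list of (row, col) cells; crossings are implicit
# wherever two slots share a cell. A tiny 3x3 grid:
SLOTS = [
    [(0, 0), (0, 1), (0, 2)],  # across, row 0
    [(2, 0), (2, 1), (2, 2)],  # across, row 2
    [(0, 1), (1, 1), (2, 1)],  # down, column 1
]

def fill(slots, grid=None, used=None):
    """Place one word per slot; backtrack on any letter conflict."""
    grid = grid or {}     # cell -> letter fixed so far
    used = used or set()  # words already placed
    if not slots:
        return grid
    slot, rest = slots[0], slots[1:]
    for word in WORDS:
        if word in used or len(word) != len(slot):
            continue
        # Accept the word only if it matches every letter already
        # fixed by a crossing slot.
        if all(grid.get(cell, ch) == ch for cell, ch in zip(slot, word)):
            result = fill(rest, {**grid, **dict(zip(slot, word))}, used | {word})
            if result is not None:
                return result
    return None  # dead end; the caller tries its next word

solution = fill(SLOTS)
```

This tiny example fills consistently (SGU across the top, USE across the bottom, GAS down the middle); real fillers layer scoring, grid symmetry, and much smarter pruning on top of the same idea.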
[01:17:16.720 --> 01:17:20.480] That's basically, yeah, like my father-in-law told me, I'm like, how do you do those crosswords?
[01:17:20.560 --> 01:17:23.600] He's like, Jay, you do it for 10 years and you know all the answers.
[01:17:23.600 --> 01:17:28.560] Yeah, I've noticed that when I'm doing the New York Times puzzles for a while, there's a lot of repeats.
[01:17:28.560 --> 01:17:29.840] There's a lot of big ads.
[01:17:30.160 --> 01:17:30.960] All right, to continue.
[01:17:31.760 --> 01:17:41.120] You can share our show, and you could also give the SGU podcast a rating on whatever podcast player you're using.
[01:17:41.120 --> 01:17:46.160] Do what you got to do to let other people find out about us if you think that people would be interested.
[01:17:46.160 --> 01:17:51.760] Now, we have tickets available for the Chicago Extravaganza, which is not sold out.
[01:17:51.760 --> 01:17:54.320] This is the one that's going to start at 2:30 p.m.
[01:17:54.320 --> 01:17:58.400] in the afternoon on Saturday, August 17th.
[01:17:58.400 --> 01:18:02.320] And hey, you could stay for the Democratic National Convention.
[01:18:02.320 --> 01:18:03.280] That's correct.
[01:18:03.840 --> 01:18:08.400] We are going to be trying out some new bits in that particular extravaganza that we've never done before.
[01:18:08.400 --> 01:18:12.160] So I guarantee you it's going to be a little weird and unhinged, which is great.
[01:18:12.480 --> 01:18:13.040] Cara loves it.
[01:18:13.200 --> 01:18:14.560] It's a feature, not a bug.
[01:18:14.560 --> 01:18:19.520] When we go flying directly off the rails on this show, Kara loves it.
[01:18:19.840 --> 01:18:20.800] Who wouldn't?
[01:18:20.800 --> 01:18:21.200] I know.
[01:18:21.200 --> 01:18:21.840] No, it's fun.
[01:18:21.840 --> 01:18:22.240] It's good.
[01:18:22.240 --> 01:18:23.120] We're just going to try some new stuff.
[01:18:23.280 --> 01:18:23.840] We have rails?
[01:18:24.320 --> 01:18:25.920] Oh, Stephen, I almost forgot.
[01:18:25.920 --> 01:18:29.440] I just scheduled us for an extravaganza in Washington, D.C.
[01:18:29.520 --> 01:18:30.880] on December 7th.
[01:18:30.880 --> 01:18:32.960] We're going to be going to the Miracle Theater.
[01:18:32.960 --> 01:18:38.320] And that weekend, we're also going to be having another SGU private show, of course.
[01:18:38.320 --> 01:18:40.800] But details on that will come out in a little bit.
[01:18:40.800 --> 01:18:44.640] But if you're interested in buying tickets for the extravaganza in D.C.
[01:18:44.640 --> 01:18:50.480] on December 7th, you can just go to the homepage, SGU homepage, and there's a button there for that, too.
[01:18:50.480 --> 01:18:51.840] More shows, man.
[01:18:51.840 --> 01:18:53.120] Thank you, brother.
[01:18:53.120 --> 01:18:58.640] Ford was built on the belief that the world doesn't get to decide what you're capable of.
[01:18:58.640 --> 01:18:59.600] You do.
[01:18:59.600 --> 01:19:03.160] So, ask yourself: can you or can't you?
[01:19:03.160 --> 01:19:07.960] Can you load up a Ford F-150 and build your dream with sweat and steel?
[01:19:07.960 --> 01:19:11.640] Can you chase thrills and conquer curves in a Mustang?
[01:19:11.640 --> 01:19:16.120] Can you take a Bronco to where the map ends and adventure begins?
[01:19:16.120 --> 01:19:20.200] Whether you think you can or think you can't, you're right.
[01:19:20.200 --> 01:19:21.080] Ready?
[01:19:21.080 --> 01:19:21.880] Set.
[01:19:21.880 --> 01:19:22.760] Ford.
[01:19:23.080 --> 01:19:26.280] We have a name that logical fallacy this week.
[01:19:26.680 --> 01:19:28.840] This comes from Tim.
[01:19:29.480 --> 01:19:30.600] Only something.
[01:19:30.680 --> 01:19:31.160] Tim?
[01:19:31.160 --> 01:19:33.000] Tim writes, Dr.
[01:19:33.000 --> 01:19:36.360] Steve, you've had email conversations in the past with a Tim Dowling.
[01:19:36.360 --> 01:19:37.080] That's my dad.
[01:19:37.400 --> 01:19:39.640] We have a second generation skeptic here.
[01:19:39.640 --> 01:19:40.360] Excellent.
[01:19:40.360 --> 01:19:43.400] He says, longtime listener, first-time writer, thanks for all you do.
[01:19:43.400 --> 01:19:46.600] You've been a big influence in my life and how I look at the world.
[01:19:46.600 --> 01:19:51.080] But to the point, I had a big religious conversation with my sister this evening.
[01:19:51.080 --> 01:19:56.680] Much of the discussion was about if God, evangelical Christian interpretation, is logical.
[01:19:56.680 --> 01:20:00.920] She was back and forth, but eventually settled on God being logical.
[01:20:00.920 --> 01:20:05.240] She then qualified that by saying, we just don't always understand it.
[01:20:05.240 --> 01:20:12.680] The example we were arguing was how God could be in control of everything, yet let things happen that are not of his will.
[01:20:12.680 --> 01:20:14.360] That sounds illogical to me.
[01:20:14.360 --> 01:20:20.520] You're either in control of everything, or everything that happens is your will, or you aren't, and it's not.
[01:20:20.520 --> 01:20:22.440] But that's a discussion for another day.
[01:20:22.440 --> 01:20:25.800] First question, but not the real one I'm asking.
[01:20:25.800 --> 01:20:27.560] What logical fallacy is
[01:20:27.560 --> 01:20:29.640] "we just don't understand it"?
[01:20:29.640 --> 01:20:31.560] Maybe moving the goalposts?
[01:20:31.560 --> 01:20:32.920] Now, for the real question.
[01:20:32.920 --> 01:20:37.720] It seems to me that she is really just conflating the words logic and reason.
[01:20:37.720 --> 01:20:43.160] I'm 100% happy with the statement, we just don't understand the reason God does things.
[01:20:43.160 --> 01:20:47.440] But God is logical, we just don't always understand it, doesn't sit right.
[01:20:47.680 --> 01:20:50.320] Isn't the very nature of logic knowable?
[01:20:50.320 --> 01:20:55.920] If you don't follow someone's logic, it isn't because it's a mystery or unknowable.
[01:20:55.920 --> 01:20:58.160] You just need more info to understand it.
[01:20:58.160 --> 01:21:04.080] Granted, we can't ask God questions, but we can look at the Bible and get that knowledge to a degree at least.
[01:21:04.080 --> 01:21:09.600] I don't think my sister would say we can't know God's logic because there isn't enough information in the Bible.
[01:21:09.600 --> 01:21:12.400] Bottom line: what is the nature of logic?
[01:21:12.400 --> 01:21:14.160] Can it be unknowable?
[01:21:14.160 --> 01:21:18.080] I know there is probably a lot of unstated baggage with this question.
[01:21:18.080 --> 01:21:20.560] I'm not trying to ask a religious question.
[01:21:20.560 --> 01:21:27.280] For the record, I'm a deist at best, and I don't find the Christian God terribly logical, or if he is, it's not a logic I want any part of.
[01:21:27.280 --> 01:21:28.960] Thanks, Tim.
[01:21:28.960 --> 01:21:33.840] So, this is kind of a classic theology philosophy question.
[01:21:34.160 --> 01:21:34.640] I don't know.
[01:21:34.640 --> 01:21:37.600] Can you help me understand the actual question that was being asked?
[01:21:37.600 --> 01:21:38.320] Because there's a lot.
[01:21:39.280 --> 01:21:41.120] Yeah, it comes down to two things.
[01:21:41.120 --> 01:21:48.000] The first one is: what is the logical fallacy of God is logical, we just don't understand his logic?
[01:21:48.320 --> 01:21:51.440] And I don't know if you guys have any immediate thoughts about that.
[01:21:51.760 --> 01:21:56.560] I mean, he suggested moving the goalpost, and I can kind of see hints of that in there, but it's not the best one.
[01:21:56.800 --> 01:22:04.000] No, I think the core of it's probably an appeal to ignorance, but it's also special pleading, which is kind of a generic one.
[01:22:04.400 --> 01:22:05.440] It's always a good fallback one.
[01:22:05.600 --> 01:22:17.600] Yeah, it's also, I mean, I don't think it's a no-true Scotsman, but it's kind of like, you know, the special pleading of they always come out at night, except during the day.
[01:22:17.600 --> 01:22:19.200] You know, it's sad.
[01:22:19.520 --> 01:22:23.440] You know, it's like you just ad hoc exceptions to the rule.
[01:22:23.440 --> 01:22:29.360] He's always logical, and if you don't, and if he appears not to be logical, it's because you don't understand the logic.
[01:22:29.680 --> 01:22:35.320] Well, and I also think, to a larger extent, it's kind of an appeal to authority.
[01:22:29.760 --> 01:22:35.880] Yeah, that's true.
[01:22:36.120 --> 01:22:37.720] It's that God Himself is the authority.
[01:22:37.960 --> 01:22:38.920] So He must be right.
[01:22:38.920 --> 01:22:40.120] We'll start there.
[01:22:40.440 --> 01:22:44.360] And therefore, if it seems illogical, it must be because we don't understand the logic.
[01:22:44.600 --> 01:22:46.200] Because we don't understand, because He's the one.
[01:22:46.600 --> 01:22:47.480] Yeah, that's a good point.
[01:22:47.480 --> 01:22:50.760] Yeah, so it's kind of the assumption of God as the ultimate authority.
[01:22:50.760 --> 01:22:57.320] And so, therefore, then, if that's the case, it also becomes circular logic, circular reasoning.
[01:22:58.040 --> 01:22:58.840] God is logical.
[01:22:58.840 --> 01:22:59.480] How do you know that?
[01:22:59.480 --> 01:23:01.640] Because God is logical, because God's always logical.
[01:23:02.280 --> 01:23:02.840] Logic comes from God.
[01:23:03.160 --> 01:23:06.440] Yes, logic comes from God, therefore he must be logical because he's God.
[01:23:06.680 --> 01:23:07.640] It's a Star Trek episode.
[01:23:07.880 --> 01:23:08.360] Exactly.
[01:23:08.680 --> 01:23:14.840] Yeah, because if He seems illogical, maybe we don't understand it, but maybe, you know, He's not logical.
[01:23:14.840 --> 01:23:17.800] That doesn't get you out of that damn conundrum.
[01:23:18.120 --> 01:23:30.040] But it does remind me of sort of this classic theological debate: if God is all-powerful, all-knowing, and all-loving, why are there innocent babies with cancer?
[01:23:30.040 --> 01:23:30.440] Right?
[01:23:32.120 --> 01:23:41.800] How do awful things happen through no real volition of anybody just happen to innocent, good people?
[01:23:41.800 --> 01:23:42.360] Right?
[01:23:42.360 --> 01:23:44.520] Again, it's hard to square that circle.
[01:23:44.520 --> 01:23:45.640] But a lot of people do.
[01:23:45.880 --> 01:23:48.680] I mean, there are all sorts of answers that people have come up with for that.
[01:23:49.000 --> 01:23:50.360] It's testing people.
[01:23:50.360 --> 01:23:52.600] Some babies were too good for this world.
[01:23:52.600 --> 01:23:53.640] Yeah, exactly.
[01:23:53.640 --> 01:23:53.880] Right.
[01:23:53.880 --> 01:23:56.760] It's all a lot of ad hoc bullshit reasoning, is what it is, right?
[01:23:56.760 --> 01:24:00.760] But it doesn't, none of it gets you away from the conundrum, though.
[01:24:01.080 --> 01:24:04.920] What some people say is, well, okay, he's not all three of those things, right?
[01:24:04.920 --> 01:24:06.760] Take your pick, but he's not all three.
[01:24:06.760 --> 01:24:08.760] He's only two of those things.
[01:24:09.560 --> 01:24:09.880] Right?
[01:24:09.880 --> 01:24:15.600] Like, and maybe, and I know of some theologians, like, well, all right, he's all-loving, he's all-knowing.
[01:24:14.920 --> 01:24:20.800] He's not exactly all-powerful because he can't interfere with his own creation, like some kind of logic like that.
[01:24:21.040 --> 01:24:29.360] You know, it's like he kind of set things in motion and in such a way that he doesn't interfere with it.
[01:24:29.360 --> 01:24:31.680] But of course, then that would mean all of this.
[01:24:31.680 --> 01:24:33.200] So there are no miracles then, huh?
[01:24:33.200 --> 01:24:36.160] There's no at no point does God intervene.
[01:24:36.160 --> 01:24:37.680] There's no miracles.
[01:24:37.680 --> 01:24:39.680] And if you say, well, well, there are miracles.
[01:24:39.680 --> 01:24:42.160] Okay, then why didn't he miracle that kid's cancer away?
[01:24:42.160 --> 01:24:43.760] He chose not to.
[01:24:44.080 --> 01:24:44.640] Right?
[01:24:44.960 --> 01:24:51.920] But so, yeah, then you have to say, well, God has some kind of motivation that's beyond us, he works mysteriously.
[01:24:52.000 --> 01:24:54.240] Right, but that interferes with the all-loving thing.
[01:24:54.240 --> 01:25:02.080] Because then you're saying there are things that are more important to him than doing for us what any parent would do for their child, right?
[01:25:02.080 --> 01:25:06.400] Or just any decent human being would do for any other human being.
[01:25:06.880 --> 01:25:09.440] There's a lot of the not all-loving shit in the Bible.
[01:25:09.440 --> 01:25:10.160] Yeah, oh, yeah.
[01:25:11.040 --> 01:25:12.480] That's like the first one to fall.
[01:25:12.480 --> 01:25:13.280] Yeah, yeah, yeah.
[01:25:13.600 --> 01:25:14.080] Right.
[01:25:14.080 --> 01:25:20.080] But again, but if you're trying to maintain that position, then you have a lot of explaining to do, basically.
[01:25:20.080 --> 01:25:25.840] All right, but the second half of this is just to say, is my sister conflating logic and reason?
[01:25:25.840 --> 01:25:27.840] Like, does God have a reason for it?
[01:25:27.920 --> 01:25:29.360] This kind of gets to what we're saying.
[01:25:29.360 --> 01:25:33.120] If God has a reason for it, that reason doesn't have to be logic.
[01:25:33.120 --> 01:25:37.680] You can't just assume that God is always logical.
[01:25:38.080 --> 01:25:42.320] And I agree with him that there is something objective about logic.
[01:25:42.320 --> 01:25:50.040] You know, saying that, well, we don't understand God's logic is kind of implying that God's logic is somehow subjective, right?
[01:25:50.080 --> 01:25:55.760] The only other way that you could really rescue that is say, well, it's so complicated that we can't understand it.
[01:25:55.760 --> 01:26:08.920] But if you could, you would see that in like this 12-dimensional kind of way, it actually is logical; it's just operating on too deep a level for you to understand.
[01:26:08.920 --> 01:26:14.440] But that is, again, like an ultimate appeal to ignorance and appeal to authority.
[01:26:15.240 --> 01:26:17.960] It's like, yeah, we can't understand it.
[01:26:17.960 --> 01:26:18.760] And so.
[01:26:19.080 --> 01:26:20.280] But that is what she argues.
[01:26:20.840 --> 01:26:24.200] Yeah, but he's the ultimate authority, so just believe it.
[01:26:24.200 --> 01:26:24.520] Right.
[01:26:25.640 --> 01:26:26.840] Okay, so it's pure belief.
[01:26:28.040 --> 01:26:28.600] That is faith.
[01:26:29.160 --> 01:26:39.480] And this is my personal problem with religion: at the end of the day, you're being asked to suspend your logic and reason to just have faith because.
[01:26:39.480 --> 01:26:42.600] Because faith is a circular reasoning of, well, because faith is good.
[01:26:43.720 --> 01:26:44.120] How do you know?
[01:26:44.280 --> 01:26:44.440] Exactly.
[01:26:44.600 --> 01:26:45.320] How do you know faith is good?
[01:26:45.400 --> 01:26:47.240] Because my faith tells me it is, you know.
[01:26:47.480 --> 01:26:48.520] Because the Bible says so.
[01:26:48.520 --> 01:26:49.160] How is the Bible right?
[01:26:49.160 --> 01:26:50.120] Because I have faith in the Bible.
[01:26:50.120 --> 01:26:51.000] Why do you have faith in the Bible?
[01:26:51.000 --> 01:26:52.120] Because it tells me I should.
[01:26:52.680 --> 01:26:57.480] It's all circular reasoning at the end of the day, and it just stays completely circular.
[01:26:57.480 --> 01:27:01.240] Yeah, it is impenetrable to logic at that point.
[01:27:01.480 --> 01:27:03.080] So just don't call it logic.
[01:27:03.080 --> 01:27:04.920] Just call it something else.
[01:27:05.720 --> 01:27:10.840] But I also think it comes from a place of, well, we don't want to say anything that can be perceived as negative about God.
[01:27:11.080 --> 01:27:12.600] So you don't want to say God's illogical.
[01:27:12.600 --> 01:27:17.160] So he has every virtue always all the time, even when they're mutually incompatible.
[01:27:17.160 --> 01:27:20.200] And then we just don't worry about the mutual incompatibility.
[01:27:20.200 --> 01:27:21.000] All right.
[01:27:21.000 --> 01:27:25.000] We do have a really excellent email, too.
[01:27:25.000 --> 01:27:31.240] This one comes from Mauser, who is a nuclear engineer at Los Alamos National Laboratory.
[01:27:31.480 --> 01:27:32.440] Yeah, this is interesting.
[01:27:32.440 --> 01:27:43.880] And he writes, Hey, gang, I really enjoyed episode 993, but I wanted to make a small correction to something that was said regarding piling up regolith on top of lunar habitats to protect from space radiation.
[01:27:43.880 --> 01:27:54.960] If you made a, now this is in quotes, and I believe I said this: if you made a protective structure on a moon base with two or three feet of mooncrete on the outside, that would go a long way towards protecting you from radiation.
[01:27:54.960 --> 01:28:03.360] It turns out that when incoming radiation enters shielding around a habitat, it can react with atoms in the shielding and produce secondary radiation.
[01:28:03.760 --> 01:28:12.480] The counterintuitive thing is that the secondary radiation could actually be more penetrating and harmful to the occupants of the habitat than the original primary radiation.
[01:28:12.480 --> 01:28:30.480] So, in order to effectively shield a habitat, you don't just need sufficient shielding to stop incoming cosmic rays and whatnot, but you also need enough shielding to shield against the spallation neutrons and the other secondary nasties that the cosmic rays generate within the shielding.
[01:28:30.480 --> 01:28:49.280] Some folks working on NASA's in-situ resource utilization efforts estimated that the break-even point for piled-up lunar regolith, where the effective dose within the habitat was the same as if there was no habitat shielding at all, could be as high as seven to nine meters of regolith.
[01:28:49.840 --> 01:28:53.440] After that, your shielding starts to actually be effective.
[01:28:53.440 --> 01:28:56.960] So, he says, Go, lava tubes, thanks for everything you do.
[01:28:56.960 --> 01:28:58.720] So, smart guy.
[01:28:58.720 --> 01:29:00.320] I had a quick back and forth with him.
[01:29:00.320 --> 01:29:02.000] I said, because I looked that up.
[01:29:02.000 --> 01:29:05.760] I don't always just take, even though he's a nuclear engineer, I don't take it for granted.
[01:29:05.760 --> 01:29:07.600] I said, All right, what can I find over here?
[01:29:07.600 --> 01:29:22.800] And I found multiple resources with studies and with charts, you know, at the curves of radiation, where the break-even point is somewhere around three meters at the low end and five-something meters at the high end.
[01:29:22.800 --> 01:29:25.600] I didn't find anything in the seven to nine-meter range.
[01:29:25.600 --> 01:29:30.200] I emailed this back to Mauser, and he said, Yeah, I'll have to dig up that reference.
[01:29:29.840 --> 01:29:35.960] It was a YouTube video of a lecture, you know, a presentation, but he hasn't sent that to me yet.
[01:29:36.040 --> 01:29:42.200] He said, And you're right, when he looks around, he also is finding the lower numbers, the three to five meters.
[01:29:42.200 --> 01:29:44.760] Still, three to five meters is a lot.
[01:29:44.760 --> 01:29:47.640] And again, that's 16 to 17 feet.
[01:29:47.720 --> 01:29:57.160] No, the break-even point means at that point, you're just the same amount of radiation with no shielding because you're basically this secondary radiation now is equal to the primary radiation.
[01:29:57.160 --> 01:29:59.880] So you have to have more than that.
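The break-even behavior they're describing can be illustrated with a toy model, assuming (purely for illustration, these functional forms and constants are invented, not real radiation-transport numbers) that primary dose falls off exponentially with shield thickness while secondary dose first builds up in the shield and is then attenuated itself:

```python
import math

def dose(x, lam_primary=1.0, k=1.5, lam_secondary=3.0):
    """Toy relative dose behind x meters of shielding (unshielded dose = 1).

    Primary cosmic-ray dose falls off exponentially; secondary radiation
    (e.g. spallation neutrons) is produced inside the shield and then
    attenuated, so its term rises before it falls. All constants here
    are made up for illustration, not measured lunar-regolith values.
    """
    primary = math.exp(-x / lam_primary)
    secondary = k * x * math.exp(-x / lam_secondary)
    return primary + secondary

def break_even(step=0.01, limit=20.0):
    """First thickness past zero where total dose drops back to the
    unshielded level; in this model, thinner shields are worse than
    no shield at all."""
    x = step
    while x < limit:
        if dose(x) <= 1.0:
            return x
        x += step
    return None

x_even = break_even()
```

With this made-up parameterization the total dose climbs above the unshielded level for thin shields and only falls back to break-even at around seven meters; the qualitative shape of the curve, not the specific numbers, is the point.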
[01:30:01.800 --> 01:30:05.160] But that assumes homogeneous shielding that's just all lunar regolith.
[01:30:05.400 --> 01:30:07.480] First, we assume it's a perfect sphere.
[01:30:07.480 --> 01:30:07.880] Yeah.
[01:30:08.200 --> 01:30:10.120] Right, like as physicists always do.
[01:30:10.120 --> 01:30:10.840] No, you're right.
[01:30:10.920 --> 01:30:17.720] But is there a way to layer the shielding so that it actually reduces that effect of secondary radiation?
[01:30:18.040 --> 01:30:19.000] Or put a trap in it.
[01:30:19.160 --> 01:30:20.280] Air gap, the radiation.
[01:30:21.560 --> 01:30:22.520] Yeah, like a bladder.
[01:30:22.760 --> 01:30:25.480] Let me tell you what I found in the time that I had to research this.
[01:30:25.480 --> 01:30:28.040] So, first of all, we knew about this effect.
[01:30:28.520 --> 01:30:29.160] With film.
[01:30:29.480 --> 01:30:30.280] Well, not just that.
[01:30:30.520 --> 01:30:32.200] Yeah, but with spaceships.
[01:30:32.200 --> 01:30:37.160] Like, we knew, like, oh, yeah, if you have inches of steel on a spaceship, it's not going to keep out cosmic rays.
[01:30:37.240 --> 01:30:43.400] It'll keep out cosmic rays, but it'll just create more secondary radiation, or it's just going to bounce around inside.
[01:30:43.400 --> 01:30:45.400] And so it's actually a net negative.
[01:30:45.400 --> 01:30:49.400] But I didn't think that would apply to feet of regolith, but apparently it does.
[01:30:49.400 --> 01:30:50.360] Yeah, I didn't think so either.
[01:30:50.520 --> 01:31:00.040] Steve, if you remember, I think the first time we heard of it was when they were trying to safeguard really important film, and they got back and it was worse.
[01:31:00.360 --> 01:31:10.520] It was completely exposed because of this secondary radiation; it was better to have one bit of radiation just fly through than have all these daughter particles flying into the film.
[01:31:10.600 --> 01:31:14.360] Yeah, and why is the secondary radiation worse?
[01:31:14.360 --> 01:31:15.760] Why is it more penetrating?
[01:31:16.080 --> 01:31:17.600] Because it's concentrated.
[01:31:17.600 --> 01:31:22.240] No, because it's neutrons and neutrons are neutral.
[01:31:22.240 --> 01:31:28.720] They're not charged, whereas electrons are charged, so electrons interact more and don't penetrate as deeply.
[01:31:28.720 --> 01:31:32.080] So that's why that secondary neutron-based radiation is worse.
[01:31:32.400 --> 01:31:33.920] It penetrates deeper.
[01:31:33.920 --> 01:31:35.600] It's harder to shield against.
[01:31:35.600 --> 01:31:36.320] Deeper.
[01:31:36.320 --> 01:31:39.920] And they're still high enough energy that they cause problems.
[01:31:39.920 --> 01:31:51.360] All right, but I did read a study that looked at different materials, and different materials have a different potential to kick off these secondary neutron radiation particles.
[01:31:51.360 --> 01:31:55.040] So regolith is bad for secondary particles.
[01:31:55.920 --> 01:31:57.440] It's just about the same as aluminum.
[01:31:57.440 --> 01:32:00.560] Aluminum and regolith are just about equivalent in terms of shielding.
[01:32:00.560 --> 01:32:04.080] They're effective, but they create a lot of secondary radiation.
[01:32:04.080 --> 01:32:13.200] There were three materials specifically tested that have a very low potential to generate secondary radiation.
[01:32:13.200 --> 01:32:15.120] One is water, which is good.
[01:32:15.440 --> 01:32:17.440] Human waste and Krell metal.
[01:32:17.920 --> 01:32:24.080] One is carbon fiber, and the other is PE, polyethylene.
[01:32:24.080 --> 01:32:34.640] So you could make layers of carbon fiber, polyethylene, and water that would be effective shielding and would not generate a lot of daughter particles for secondary radiation.
[01:32:34.960 --> 01:32:35.920] How thick would it have to be?
[01:32:36.240 --> 01:32:44.080] It's very hard for me to find that out because most of the studies give it in units that are weird from my perspective.
[01:32:44.080 --> 01:32:46.280] It's always like grams per centimeter squared, you know.
[01:32:46.480 --> 01:32:47.840] I was like, okay, but how thick is it?
[01:32:47.840 --> 01:32:49.120] And they don't say.
[01:32:49.120 --> 01:32:52.520] I guess it's a more precise way of doing it, but we want, like, the simplistic version.
[01:32:52.520 --> 01:32:54.080] How thick does it have to be?
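The "weird units" shielding studies use are areal density, grams per square centimeter, and converting them to a physical thickness just means dividing by the material's volumetric density. A minimal sketch, using rough textbook densities as assumptions, not numbers from the studies discussed:

```python
# Convert shielding areal density (g/cm^2) to physical thickness (cm):
# thickness = areal density / volumetric density of the material.
# Densities are rough textbook values, assumed for illustration only.
MATERIAL_DENSITY_G_CM3 = {
    "lunar regolith": 1.5,   # loose regolith; compacted is denser
    "aluminum": 2.70,
    "water": 1.00,
    "polyethylene": 0.95,
}

def thickness_cm(areal_density_g_cm2: float, material: str) -> float:
    """Thickness of a layer of the material that supplies the given areal density."""
    return areal_density_g_cm2 / MATERIAL_DENSITY_G_CM3[material]

# For example, an (assumed) requirement of 50 g/cm^2 means half a meter of
# water, but only about 18.5 cm of aluminum, or about 33 cm of loose regolith.
```

The same areal density therefore means very different physical thicknesses depending on the material, which is why the papers quote g/cm^2 rather than meters.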
[01:32:54.080 --> 01:33:06.920] But in any case, it would have to be thicker as far as primary shielding is concerned, but it would be more effective in that there would hardly be any secondary radiation.
[01:33:06.920 --> 01:33:13.960] So you might want to have that on the outside, you know, to block the cosmic rays and maybe have regolith on the inside.
[01:33:13.960 --> 01:33:16.040] And then maybe you can get away with less shielding.
[01:33:16.040 --> 01:33:18.920] But also, just being in lava tubes also fixes the problem.
[01:33:18.920 --> 01:33:24.600] You know, if you're going to be 10 or more meters underground, you're good.
[01:33:24.600 --> 01:33:27.400] This all applies on Mars too, by the way.
[01:33:27.640 --> 01:33:29.000] It's all the same.
[01:33:29.000 --> 01:33:32.120] So, yeah, being underground is probably going to be the best.
[01:33:32.120 --> 01:33:43.320] Unless we really can come up with some layered material where you can have a foot-thick layer of shielding, you know, or where that's where you store your water and you have like three feet of water or something, or three yards of water.
[01:33:43.320 --> 01:33:45.640] That's where all the water is stored.
[01:33:45.640 --> 01:33:47.080] And that is effective shielding.
[01:33:47.320 --> 01:33:49.320] We may have something like that.
[01:33:49.640 --> 01:33:50.600] But yeah, but interesting.
[01:33:50.600 --> 01:33:52.760] So it's harder than even we thought.
[01:33:52.760 --> 01:34:01.800] Yeah, so if you're living underground on Mars or on the moon, and okay, let's say Earth explodes and you need to find some place to live.
[01:34:01.800 --> 01:34:05.880] But if it doesn't, hopefully you won't be on the moon long, if that's the case.
[01:34:05.880 --> 01:34:07.880] Yeah, that's true too, exactly.
[01:34:08.600 --> 01:34:09.720] I just don't get it.
[01:34:09.960 --> 01:34:11.800] Why would you want to live underground?
[01:34:11.800 --> 01:34:12.120] I wouldn't.
[01:34:12.280 --> 01:34:14.360] It's like you're going to the moon to be in prison.
[01:34:14.360 --> 01:34:15.160] I don't get it.
[01:34:15.160 --> 01:34:15.640] Yeah, not real.
[01:34:17.000 --> 01:34:22.680] I think there's going to be scientists who are doing a one- or two-year stint there, and, you know, things like that.
[01:34:23.000 --> 01:34:24.280] That I get like a space station.
[01:34:24.760 --> 01:34:26.280] Antarctica or Antarctica.
[01:34:26.280 --> 01:34:27.080] Yeah, that I totally get.
[01:34:27.160 --> 01:34:35.240] Well, the other thing is, if you're in a lava tube, Kara, it could be big enough that it could be lighted, it could be pressurized, there could be plants.
[01:34:35.240 --> 01:34:38.200] It could be, you know, I could see it. Like, it's not going to be just a room.
[01:34:38.440 --> 01:34:40.520] It could have an outdoorsy feel to it.
[01:34:40.920 --> 01:34:41.240] Oh, for sure.
[01:34:41.560 --> 01:34:42.360] It would be big enough for it.
[01:34:42.440 --> 01:34:50.720] It could be a whole environment, but it would still always be like every dystopian Black Mirror post-Earth environment.
[01:34:51.120 --> 01:34:53.200] People born in that may like it.
[01:34:55.040 --> 01:34:56.960] I'm so glad I live here.
[01:34:57.920 --> 01:34:58.480] All right.
[01:34:59.040 --> 01:35:01.600] Let's go on with science or fiction.
[01:35:03.840 --> 01:35:08.880] It's time for science or fiction.
[01:35:14.240 --> 01:35:21.760] Each week I come up with three science news items or facts: two real and one fake, and then I challenge my panel of skeptics to sniff out the fake.
[01:35:22.080 --> 01:35:24.480] You guys can play along at home if you want to.
[01:35:24.480 --> 01:35:26.320] I have three regular news items.
[01:35:26.320 --> 01:35:27.600] Are you guys ready?
[01:35:27.600 --> 01:35:28.160] Yes.
[01:35:28.240 --> 01:35:29.440] I think these are fun.
[01:35:29.440 --> 01:35:30.800] Okay, here we go.
[01:35:30.800 --> 01:35:40.160] Item number one: a recent analysis finds that the teeth of Komodo dragons are coated with iron to help maintain their strength and cutting edge.
[01:35:40.160 --> 01:35:54.320] I number two: an extensive study finds that for about half of the sites analyzed, the cost per ton of carbon removal is lower when just letting the land naturally regenerate than planting trees.
[01:35:54.320 --> 01:36:04.240] And item number three: a recent study finds that the ability to recognize a previously heard piece of music significantly decreases with age in older adults.
[01:36:04.240 --> 01:36:05.360] Jay, go first.
[01:36:05.360 --> 01:36:13.600] All right, this first one about this recent analysis that says that Komodo teeth, the Komodo dragon, their teeth, they're coated with iron.
[01:36:13.600 --> 01:36:15.520] Why does that sound so crazy to me?
[01:36:15.520 --> 01:36:17.680] They're coated with iron.
[01:36:17.680 --> 01:36:19.280] How could that even happen?
[01:36:19.600 --> 01:36:22.160] I mean, it's badass, and it's cool.
[01:36:22.160 --> 01:36:24.560] I think I would have heard about this before.
[01:36:24.560 --> 01:36:26.240] All right, so I'm going to put that one on hold.
[01:36:26.240 --> 01:36:29.280] I think that one has a good chance of being the fake.
[01:36:29.600 --> 01:36:41.400] Next one here: this is an extensive study that finds that for about half of the sites analyzed, the cost per ton of carbon removal is lower with just letting the land naturally regenerate than planting trees.
[01:36:41.720 --> 01:36:46.120] Oh, I mean, I wouldn't think that one is that off the mark.
[01:36:46.120 --> 01:36:48.280] That one, to me, seems like science.
[01:36:48.280 --> 01:36:55.560] The third one, a recent study, finds that the ability to recognize a previously heard piece of music significantly decreases with age in older adults.
[01:36:55.560 --> 01:36:56.280] Damn.
[01:36:56.280 --> 01:36:57.800] Well, doesn't that suck?
[01:36:57.800 --> 01:36:59.400] You know, I think that one's the fiction.
[01:36:59.400 --> 01:37:00.520] The music one?
[01:37:00.840 --> 01:37:02.120] And I'll tell you why.
[01:37:02.120 --> 01:37:09.480] Because my kids, you know, they want to hear music in the car, and every single song that they play, I end up humming it and singing it as I'm bopping around.
[01:37:09.480 --> 01:37:11.640] I don't even like the songs.
[01:37:11.960 --> 01:37:12.840] You know?
[01:37:12.840 --> 01:37:15.320] I'll sing it, you know, like Taylor Swift, man.
[01:37:15.640 --> 01:37:19.960] It's like, I can't forget, you know, I hear the melody and I got it in my head forever.
[01:37:19.960 --> 01:37:21.320] So I think that one's a fiction.
[01:37:21.320 --> 01:37:22.200] Okay, Bob.
[01:37:22.200 --> 01:37:24.040] Yeah, Iron Teeth for a Komodo.
[01:37:25.400 --> 01:37:27.480] That sounds a little sketchy.
[01:37:28.200 --> 01:37:29.160] Damn, man.
[01:37:29.240 --> 01:37:32.600] Look like, what's that dude from Moonraker?
[01:37:32.600 --> 01:37:33.160] Jaws.
[01:37:33.160 --> 01:37:35.000] With the silver jaws.
[01:37:35.000 --> 01:37:38.200] The second one, carbon removal.
[01:37:38.200 --> 01:37:40.040] That sounds reasonable to me.
[01:37:40.280 --> 01:37:41.560] Yeah, let nature handle it.
[01:37:41.720 --> 01:37:42.600] Be cheaper.
[01:37:42.600 --> 01:37:49.400] I mean, also, it might be cheaper, but maybe, you know, nature takes a lot longer.
[01:37:49.400 --> 01:37:49.960] I don't know.
[01:37:49.960 --> 01:37:56.920] It's hard for me to disagree with the basic premise that getting old really sucks.
[01:37:58.280 --> 01:38:02.200] So it's going to be hard for me to make that one the fiction.
[01:38:02.200 --> 01:38:10.600] But yeah, but I may agree with Jay here that I haven't noticed that yet, at least for that specific deficit.
[01:38:10.600 --> 01:38:13.320] It hasn't really cropped up, reared its ugly head.
[01:38:13.320 --> 01:38:15.280] So, yeah, I'll agree with Jay.
[01:38:15.280 --> 01:38:20.000] Go with Jay, number three, the music recognition one.
[01:38:14.760 --> 01:38:20.880] Okay, Evan.
[01:38:21.200 --> 01:38:24.240] Well, I think I'm going to agree on this one as well.
[01:38:24.720 --> 01:38:27.680] I don't have any hard data to back it up.
[01:38:27.680 --> 01:38:29.840] I only have my anecdotes.
[01:38:29.840 --> 01:38:37.680] And one of them is that my wife Jennifer's mother died several years ago, suffering from Alzheimer's.
[01:38:37.680 --> 01:38:48.080] And one of the ways towards the end of her life that they would be able to sort of communicate and function, you know, have a point of reference was the music.
[01:38:48.480 --> 01:38:54.320] In that, yes, she didn't know who practically anybody was, names and things towards the end.
[01:38:54.320 --> 01:39:02.960] But if a piece of music came on that she was familiar with, she seemed to be able to show signs of having recognition of it.
[01:39:02.960 --> 01:39:10.880] So, you know, I understand that's just one case in my own personal experience with that.
[01:39:10.880 --> 01:39:13.440] But it lends to this being the fiction.
[01:39:13.440 --> 01:39:16.560] So, yeah, for that reason, I'm just going to go with the guys on that.
[01:39:16.560 --> 01:39:18.240] Okay, and Kara.
[01:39:18.240 --> 01:39:22.000] Yeah, I mean, iron teeth, pretty weird, but I don't know.
[01:39:23.360 --> 01:39:31.920] The one about carbon removal with natural, I guess, reforestation is complicated.
[01:39:31.920 --> 01:39:33.360] So, I don't know.
[01:39:35.120 --> 01:39:43.360] And a recent study finding that the ability to recognize a previously heard piece of music significantly decreases with age and older adults.
[01:39:43.360 --> 01:39:48.400] The way this is worded could be interpreted so many different ways.
[01:39:48.400 --> 01:39:54.080] But the thing that sticks out to me, kind of like you mentioned, Evan, it's not uncommon to use music.
[01:39:54.320 --> 01:39:57.440] I don't know, have we've seen Awakenings, right?
[01:39:57.440 --> 01:40:04.760] We've seen these like stories of people with severe neurological issues kind of finding flow again with music.
[01:39:59.760 --> 01:40:05.080] I don't know.
[01:40:05.160 --> 01:40:13.800] The thing that comes up for me is that it is well established that the older you get, the more likely you are to listen to music from your past.
[01:40:14.120 --> 01:40:15.720] Like you don't want new music.
[01:40:15.720 --> 01:40:18.200] It's like kids today, their music sucks.
[01:40:18.200 --> 01:40:25.480] And I think that like every generation ever has always said that because we actually are really nostalgic and comforted by our old music.
[01:40:25.480 --> 01:40:28.040] And I think if anything, maybe it's the opposite.
[01:40:28.040 --> 01:40:32.600] That like the older we are, the more reinforced the music we've listened to our whole lives becomes.
[01:40:32.600 --> 01:40:35.240] And we're like better at recognizing it.
[01:40:35.240 --> 01:40:38.120] So yeah, I'm going to say that that one's also the fiction.
[01:40:38.120 --> 01:40:38.440] All right.
[01:40:38.440 --> 01:40:42.200] So you guys all seem to be most, I think, in agreement on number two.
[01:40:42.200 --> 01:40:42.920] So we'll start there.
[01:40:42.920 --> 01:40:54.040] An extensive study finds that for about half of the sites analyzed, the cost per ton of carbon removal is lower when just letting the land naturally regenerate than planting trees.
[01:40:54.040 --> 01:40:59.400] You all think this one is science and this one is science.
[01:40:59.400 --> 01:41:00.120] This one is science.
[01:41:00.440 --> 01:41:00.840] Science.
[01:41:00.840 --> 01:41:08.760] Yeah, interesting, you know, that you could have a big, expensive tree planting program and doing nothing was more effective, basically.
[01:41:09.000 --> 01:41:11.640] And they examined, I think, thousands of sites.
[01:41:11.640 --> 01:41:14.120] You know, it's a pretty extensive study.
[01:41:14.120 --> 01:41:17.320] And they were looking at mainly it was cost effectiveness.
[01:41:17.320 --> 01:41:26.280] Again, like, how much does it cost to do the program, and how much carbon gets absorbed for that piece of land?
[01:41:26.280 --> 01:41:30.600] And for the, you know, the natural recovery thing, they're not doing nothing exactly.
[01:41:30.600 --> 01:41:36.680] They are sort of protecting the land and they are having to do some things, but they're just not planting trees.
[01:41:36.680 --> 01:41:40.920] And as I said, it is very specific to the land itself; it's not the same everywhere.
[01:41:40.920 --> 01:41:46.560] So natural regeneration was more cost-effective at 46% of the sites, and plantations at 54%.
[01:41:44.840 --> 01:41:51.760] So you have to pick and choose which ones you're going to plant trees on.
[01:41:52.000 --> 01:42:04.400] But they also said that if you take a combined approach where you do some planting but also let it recover naturally, that was more effective in most areas.
[01:42:04.720 --> 01:42:11.760] So it was 44% more effective than natural regeneration alone and 39% more effective than plantations alone.
[01:42:11.760 --> 01:42:15.600] So this mixed approach seems to be the best approach.
[01:42:15.600 --> 01:42:29.120] And what they also found was that the natural regeneration, when you allowed the forest to regenerate naturally rather than just planting a bunch of trees, it was better in other ways as well.
[01:42:29.120 --> 01:42:34.240] It was more biodiversity and better water management.
[01:42:34.240 --> 01:42:35.680] And it still stored a lot of wood.
[01:42:35.680 --> 01:42:40.400] The big advantage of planting trees, the plantation, was you get more wood out of it, right?
[01:42:40.400 --> 01:42:43.440] So as a source of wood, that's the way to go.
[01:42:43.440 --> 01:42:49.120] So if you're planting a tree for a forest for later harvesting, then yes, you'll want to plant trees.
[01:42:49.120 --> 01:42:57.440] But if you just want to absorb a lot of carbon, the best thing to do would be to plant some trees strategically, but let it recover mostly on its own.
[01:42:57.440 --> 01:42:59.520] Okay, let's move on to number three.
[01:42:59.520 --> 01:43:06.320] A recent study finds that the ability to recognize a previously heard piece of music significantly decreases with age in older adults.
[01:43:06.320 --> 01:43:08.880] You guys all think this one is the fiction.
[01:43:08.880 --> 01:43:12.640] Kara, you said this could be interpreted a number of different ways.
[01:43:12.640 --> 01:43:13.280] Yeah.
[01:43:13.280 --> 01:43:15.440] So let me tell you what they did.
[01:43:15.440 --> 01:43:19.960] So they had subjects of different ages, right?
[01:43:20.280 --> 01:43:23.040] Exposed them to a specific piece of music.
[01:43:23.040 --> 01:43:27.200] So it's not something necessarily that they already knew or something from their childhood.
[01:43:27.200 --> 01:43:29.360] It's a new piece of music to them.
[01:43:29.360 --> 01:43:35.240] They allowed them to hear it and they listened to it at least three times, right?
[01:43:35.240 --> 01:43:42.520] And then at a later time, they were then hearing music, different music, and this is like played by an orchestra.
[01:43:42.600 --> 01:43:45.880] The orchestra is like playing a medley, like they're going through different things.
[01:43:45.880 --> 01:43:56.440] And then when they got to the melody from the music that they heard recently, they were to hit a button, right, to indicate, oh, yes, that's the music that I heard.
[01:43:56.440 --> 01:43:57.000] Oh, okay.
[01:43:57.000 --> 01:43:59.080] So it's not previously heard like decades.
[01:43:59.240 --> 01:44:01.640] No, it's like just as part of the study.
[01:44:01.640 --> 01:44:03.160] So does that change your mind at all?
[01:44:03.480 --> 01:44:04.120] No.
[01:44:04.440 --> 01:44:06.200] It makes me less sure.
[01:44:07.160 --> 01:44:14.520] So what they found was that age had no effect on people's ability to recognize music.
[01:44:14.680 --> 01:44:14.920] Yeah.
[01:44:15.320 --> 01:44:16.920] It was not a variable.
[01:44:16.920 --> 01:44:19.080] It's a preserved area of your brain.
[01:44:19.080 --> 01:44:19.720] Yeah.
[01:44:19.720 --> 01:44:22.840] So, but here's the thing: don't celebrate too much.
[01:44:23.320 --> 01:44:34.520] But the point of the research was: I mean, one of the research questions really is like, what types of memory tend to slow down with normal aging and which ones don't?
[01:44:34.520 --> 01:44:37.320] Because not all memory is the same, right?
[01:44:37.320 --> 01:44:39.720] There's episodic memory, for example.
[01:44:39.720 --> 01:44:43.320] There is recognition and there is recall.
[01:44:43.320 --> 01:44:50.680] Now, recall is the one that, you know, healthy, normal aging people tend to have increased difficulty with.
[01:44:51.000 --> 01:44:52.840] We're experiencing that even now.
[01:44:52.840 --> 01:44:58.120] You know, just it takes longer as you get older to think of names and to think of words and to think of things.
[01:44:58.120 --> 01:45:01.080] Like, how many times, like, remember that guy from the movie?
[01:45:01.080 --> 01:45:01.720] What's his name?
[01:45:01.720 --> 01:45:04.280] Like, that's like a common occurrence now.
[01:45:04.520 --> 01:45:07.880] So, that kind of recall does get more challenging as you get older.
[01:45:07.880 --> 01:45:08.920] Absolutely.
[01:45:08.920 --> 01:45:12.440] But recognition does not apparently diminish as you get older.
[01:45:12.440 --> 01:45:17.680] Recognition remains intact, at least according to the research, and this study supports that.
[01:45:18.000 --> 01:45:25.520] So, that's really the issue here: the types of memory, the types of processing, and which ones are affected by aging and which ones are not.
[01:45:25.520 --> 01:45:33.600] Okay, this means that a recent analysis finds that the teeth of Komodo dragons are coated with iron to help maintain their strength and cutting edge.
[01:45:33.600 --> 01:45:34.720] Is science?
[01:45:34.720 --> 01:45:35.360] This was cool.
[01:45:35.360 --> 01:45:37.680] I thought I would get somebody on this one.
[01:45:38.320 --> 01:45:41.600] So, yeah, I mean, it's not obvious; like, they don't look like iron.
[01:45:41.680 --> 01:45:44.160] Doesn't look like Jaws from Moonraker, the James Bond movie.
[01:45:44.160 --> 01:45:49.280] Right, but there's a lot of iron concentrated in the coating on their teeth.
[01:45:49.280 --> 01:45:58.320] It's not just enamel, it's a different kind of substance, and it evolved to be very, very hard and resistant to dulling and breaking.
[01:45:58.320 --> 01:46:02.480] So, they have serrated teeth for ripping and tearing.
[01:46:02.480 --> 01:46:09.920] So, you know, any wear and tear on the teeth would really significantly reduce their effectiveness.
[01:46:09.920 --> 01:46:10.560] Oh, cool.
[01:46:10.560 --> 01:46:24.720] And so, having a very hard coating with a lot of iron in it on the outside of the teeth is extremely adaptive for the anatomy of the teeth, the kinds of teeth that they have, and the eating, the way they eat.
[01:46:25.120 --> 01:46:29.520] They really have to rip flesh off of their prey.
[01:46:29.760 --> 01:46:36.000] They have similar teeth to certain kinds of dinosaurs, like Tyrannosaurus Rex.
[01:46:36.000 --> 01:46:44.960] So, the question is: is this adaptation sort of unique to the Komodo dragon line?
[01:46:44.960 --> 01:46:53.120] Or do other reptiles with similar teeth also have iron in their teeth?
[01:46:53.120 --> 01:46:57.200] Both extinct and extant.
[01:46:57.440 --> 01:47:06.760] But they say that unfortunately, because of the way fossilization works, that this layer, this iron-rich layer, would not survive on fossils.
[01:47:07.240 --> 01:47:14.120] So we don't know if T-Rex had iron teeth, but they might have had a similar kind of adaptation.
[01:47:14.120 --> 01:47:14.680] Would they have needed it?
[01:47:14.840 --> 01:47:17.880] Because they had serrated teeth they used for ripping.
[01:47:18.120 --> 01:47:22.280] They had very similar teeth to Komodo dragons, curved serrated teeth.
[01:47:22.280 --> 01:47:31.560] So it's reasonable, but it could just be that it could have just been an evolutionary innovation that is pretty unique to the Komodo dragon.
[01:47:31.560 --> 01:47:33.560] Or it could be widespread as hell.
[01:47:33.560 --> 01:47:34.680] Or it could be much more widespread.
[01:47:34.840 --> 01:47:35.480] You never know.
[01:47:36.200 --> 01:47:37.080] Yeah, so interesting.
[01:47:37.240 --> 01:47:38.120] But that was fascinating.
[01:47:38.120 --> 01:47:39.960] I'm hoping they could infer it some other way.
[01:47:39.960 --> 01:47:40.680] Yeah, maybe.
[01:47:40.680 --> 01:47:42.040] It's not impossible.
[01:47:42.280 --> 01:47:48.680] They have figured out a way to identify the color of bird feathers, dinosaur feathers.
[01:47:49.240 --> 01:47:52.600] And some vocalizations, like noises perhaps, they were making.
[01:47:52.920 --> 01:47:53.400] Yeah.
[01:47:53.400 --> 01:47:53.960] Yeah.
[01:47:54.280 --> 01:47:57.080] The anatomy of their vocal box.
[01:47:57.080 --> 01:47:57.480] All right.
[01:47:57.480 --> 01:47:58.520] Well, good job, everyone.
[01:47:58.520 --> 01:48:00.200] You swept me this week.
[01:48:00.200 --> 01:48:00.920] That hasn't happened in a little bit.
[01:48:01.400 --> 01:48:03.720] Didn't get you on the iron teeth.
[01:48:03.720 --> 01:48:05.560] All right, Evan, give us a quote.
[01:48:05.880 --> 01:48:13.400] One of my biggest pet peeves is when people use science that they don't understand to try to justify their stupidity and hate.
[01:48:14.520 --> 01:48:15.560] Forrest.
[01:48:15.720 --> 01:48:19.400] Yeah, Forrest Valkai said that in an interview.
[01:48:19.400 --> 01:48:20.280] And who's Forrest?
[01:48:20.440 --> 01:48:21.800] Who is Forrest Valkai?
[01:48:21.800 --> 01:48:24.920] Well, he's a scientist and an educator.
[01:48:25.240 --> 01:48:39.720] He is a biologist, science communicator, uses his platform mainly on YouTube, Instagram, to teach exciting science lessons, promote compassion and skepticism, and share his boundless love for life with audiences of all ages.
[01:48:39.720 --> 01:48:47.120] And he is going to be sharing those very things with us when we meet him at CSICon this coming October.
[01:48:48.080 --> 01:48:49.280] Oh, cool.
[01:48:49.280 --> 01:48:51.280] Yeah, that's kind of a peeve of mine, too.
[01:48:44.600 --> 01:48:51.920] I know.
[01:48:53.200 --> 01:48:57.680] The whole using science you don't understand, like you could stop there.
[01:48:57.680 --> 01:48:59.360] Like, that bugs me.
[01:48:59.360 --> 01:48:59.760] Yeah.
[01:48:59.760 --> 01:49:02.240] But then to justify something hateful, come on.
[01:49:02.240 --> 01:49:04.480] Or stupid or pseudoscientific or whatever.
[01:49:05.040 --> 01:49:07.120] That's basically all of social media.
[01:49:07.120 --> 01:49:07.520] Yeah.
[01:49:07.680 --> 01:49:14.720] Yeah, like we're doing a lot of TikTok response videos and talking a lot about some of the crazy stuff on TikTok.
[01:49:14.720 --> 01:49:16.960] And it's a lot of it falls into this category.
[01:49:17.440 --> 01:49:29.040] You're talking about something you really don't understand and you're using it as a way of justifying some ridiculous belief, whether it's a conspiracy theory or some snake oil or something pseudoscientific.
[01:49:29.040 --> 01:49:29.840] Quantum mechanics.
[01:49:29.840 --> 01:49:33.760] Or something, yeah, or some bigoted opinion or whatever.
[01:49:33.840 --> 01:49:35.520] Yeah, it is very frustrating.
[01:49:35.520 --> 01:49:36.240] All right, guys.
[01:49:36.240 --> 01:49:38.160] Well, thank you all for joining me this week.
[01:49:38.160 --> 01:49:38.640] Thank you, Steve.
[01:49:38.800 --> 01:49:40.000] Got it, Steve.
[01:49:40.000 --> 01:49:44.320] And until next week, this is your Skeptic's Guide to the Universe.
[01:49:46.960 --> 01:49:53.600] Skeptics Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking.
[01:49:53.600 --> 01:49:58.240] For more information, visit us at the skepticsguide.org.
[01:49:58.240 --> 01:50:02.160] Send your questions to info at the skepticsguide.org.
[01:50:02.160 --> 01:50:12.880] And if you would like to support the show and all the work that we do, go to patreon.com/slash skepticsguide and consider becoming a patron and becoming part of the SGU community.
[01:50:12.880 --> 01:50:16.480] Our listeners and supporters are what make SGU possible.