Debug Information
Processing Details
- VTT File: skepticast2024-11-02.vtt
- Processing Time: September 09, 2025 at 06:08 PM
- Total Chunks: 2
- Transcript Length: 156,512 characters
- Caption Count: 1,636 captions
Prompts Used
Prompt 1: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 1 of 2 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
[00:00:00.800 --> 00:00:06.880] 10 years from today, Lisa Schneider will trade in her office job to become the leader of a pack of dogs.
[00:00:06.880 --> 00:00:09.600] As the owner of her own dog rescue, that is.
[00:00:09.600 --> 00:00:17.280] A second act made possible by the reskilling courses Lisa's taking now with AARP to help make sure her income lives as long as she does.
[00:00:17.280 --> 00:00:22.240] And she can finally run with the big dogs and the small dogs who just think they're big dogs.
[00:00:22.240 --> 00:00:25.920] That's why the younger you are, the more you need AARP.
[00:00:25.920 --> 00:00:29.680] Learn more at aarp.org/skills.
[00:00:33.200 --> 00:00:36.480] You're listening to the Skeptic's Guide to the Universe.
[00:00:36.480 --> 00:00:39.360] Your escape to reality.
[00:00:40.000 --> 00:00:42.720] Hello, and welcome to The Skeptic's Guide to the Universe.
[00:00:42.720 --> 00:00:47.920] Today is Wednesday, September 4th, 2024, and this is your host, Stephen Novella.
[00:00:47.920 --> 00:00:49.600] Joining me this week are Bob Novella.
[00:00:49.600 --> 00:00:50.240] Hey, everybody.
[00:00:50.240 --> 00:00:51.600] Kara Santa Maria.
[00:00:51.600 --> 00:00:52.080] Howdy.
[00:00:52.080 --> 00:00:53.040] Jay Novella.
[00:00:53.040 --> 00:00:55.360] Hey, guys, and Evan Bernstein.
[00:00:55.360 --> 00:00:56.480] Good evening, everyone.
[00:00:56.480 --> 00:00:59.040] Don't worry, this is the correct episode you're looking for.
[00:00:59.200 --> 00:01:10.000] We're recording this on September 4th because we have to put an extra episode in the can, as we like to say, for when we are going to CSICon at the end of October.
[00:01:10.000 --> 00:01:15.280] So, yes, this is the latest episode if you're downloading in October, even though we're recording it so much earlier.
[00:01:15.280 --> 00:01:23.120] And we tried to make this episode a little bit evergreen, so it's not going to be all like dated news items by the time it comes out.
[00:01:23.120 --> 00:01:24.480] That's the magic of podcasting.
[00:01:24.480 --> 00:01:25.680] It involves some time travel.
[00:01:25.760 --> 00:01:26.960] A lot of time travel.
[00:01:26.960 --> 00:01:33.040] So, I was showing Jay this earlier today, but this is the first time this has happened in my yard.
[00:01:33.200 --> 00:01:41.280] My wife and I noticed that there are these plants growing in our front yard, you know, but by where we have, by where we have like the shrubs, you know, by the porch in the front of the house.
[00:01:41.280 --> 00:01:45.120] And they look like pumpkin plants, which you grow in our garden.
[00:01:45.120 --> 00:01:47.200] So, like, okay, we didn't plant anything.
[00:01:47.200 --> 00:01:50.320] I didn't, you know, there's not, I don't plant anything there every year.
[00:01:50.320 --> 00:01:52.160] That's just where all the hedges and stuff are.
[00:01:52.160 --> 00:01:54.800] So, one of the plants has two pumpkins growing on it.
[00:01:54.800 --> 00:01:57.040] So, it's clear, it's clearly a pumpkin plant.
[00:01:57.040 --> 00:02:04.280] The other one, I didn't see anything growing on it, but now it has about six or seven of those gooseneck gourds.
[00:02:04.440 --> 00:02:06.360] You ever see those long neck gourds?
[00:02:06.360 --> 00:02:07.800] Yeah, sure, growing on it.
[00:02:07.800 --> 00:02:12.360] You can't really tell from the plant itself because they all look pretty much the same, all the squashy plants.
[00:02:12.360 --> 00:02:15.080] Yeah, so the question is, how did they get there?
[00:02:15.080 --> 00:02:17.480] So, do you grow both of those in your backyard?
[00:02:17.480 --> 00:02:18.520] No, only pumpkins.
[00:02:18.760 --> 00:02:20.760] We do grow pumpkins in the backyard.
[00:02:20.760 --> 00:02:22.760] We do not in the garden, you know.
[00:02:22.760 --> 00:02:31.800] We don't, I've never grown gourds, you know, in my garden, but we do buy them and have them out over Halloween.
[00:02:31.800 --> 00:02:34.360] You know, we might have a couple of gourds somewhere.
[00:02:34.360 --> 00:02:38.120] All right, so if they were not deliberately planted, then what?
[00:02:38.120 --> 00:02:41.720] The wind picked up the... could they have been pooped out?
[00:02:41.720 --> 00:02:42.840] Yeah, that's what I was wondering.
[00:02:43.000 --> 00:02:43.720] I think Bob's right.
[00:02:44.200 --> 00:02:46.920] Yeah, I was gonna say, have animals been eating your pumpkins.
[00:02:46.920 --> 00:02:47.720] Well, I don't know.
[00:02:47.720 --> 00:02:58.280] I mean, so they're out for Halloween and a little after, and then, you know, a few days after Halloween, everything goes into the compost.
[00:02:58.280 --> 00:03:09.560] Ah, so yeah, what I figure is that some animals were eating it in the compost, and then they pooped it in my front yard for some reason, or maybe that's just the place that doesn't get mowed.
[00:03:09.560 --> 00:03:17.480] You know, they probably would have been elsewhere, but that's like the well, because there's no food otherwise there, so you know, and the old saying is you don't go where you eat.
[00:03:17.480 --> 00:03:25.560] So, well, I'm just saying that maybe they might have been all over the place, but we mow everywhere else, you know, so it would have cut down those plants as they were coming up.
[00:03:25.560 --> 00:03:28.120] But here, we don't mow, obviously, it's just mulched.
[00:03:28.120 --> 00:03:28.760] So, it's interesting.
[00:03:28.760 --> 00:03:31.400] So, I got some free pumpkins and gourds growing in my front yard.
[00:03:31.400 --> 00:03:34.440] Does that ever happen to you where like a plant like that just starts to grow?
[00:03:34.440 --> 00:03:38.120] Yeah, I mean, I think there's like weed growing all over LA.
[00:03:38.360 --> 00:03:39.280] Yeah, it's always funny.
[00:03:39.760 --> 00:03:40.920] For real, yeah.
[00:03:40.920 --> 00:03:45.280] Like, it's always funny when you like see a little bit growing because somebody like tossed some.
[00:03:44.840 --> 00:03:48.000] It used to happen more.
[00:03:48.160 --> 00:03:53.920] Now people, I think, mostly are smoking like these crazy hydroponic strains that don't even have seeds.
[00:03:53.920 --> 00:04:00.080] But like, yeah, it definitely used to happen back in the day when people just like toss seeds and bags like on the side of the road.
[00:04:00.080 --> 00:04:00.640] Oh, interesting.
[00:04:00.640 --> 00:04:01.200] Yeah, so just like that.
[00:04:01.520 --> 00:04:02.640] Plants grow, you know?
[00:04:02.960 --> 00:04:03.360] Yeah.
[00:04:03.360 --> 00:04:04.080] They do it.
[00:04:04.080 --> 00:04:05.120] They find a way.
[00:04:05.120 --> 00:04:07.200] Yeah, especially if it's a place that you like.
[00:04:07.200 --> 00:04:10.080] You know, that's why it's good to plant native plants.
[00:04:10.800 --> 00:04:13.920] Because they're just good at growing in your environment.
[00:04:13.920 --> 00:04:14.720] Oh, absolutely.
[00:04:14.720 --> 00:04:18.160] And beware, invasive species can wreak havoc.
[00:04:18.160 --> 00:04:22.000] What's that one in New York that they tell you you can't even brush up against?
[00:04:22.000 --> 00:04:24.320] You can't even get a blade of this stuff on you.
[00:04:24.560 --> 00:04:25.360] It'll destroy you.
[00:04:25.840 --> 00:04:26.560] Hogsweed?
[00:04:26.560 --> 00:04:27.520] Is that what it's called?
[00:04:27.520 --> 00:04:29.360] I know in the south it's kudzu, right?
[00:04:29.360 --> 00:04:31.920] That's their invasive plant that's taking over.
[00:04:31.920 --> 00:04:32.480] Oh, I don't know.
[00:04:32.720 --> 00:04:34.080] Giant hogweed.
[00:04:34.400 --> 00:04:36.960] Yeah, this grows in New York State.
[00:04:37.280 --> 00:04:45.200] It's not native, but now it's all over the middle part of the state, and it'll mess you up.
[00:04:45.200 --> 00:04:46.880] It'll give you terrible rashes.
[00:04:46.880 --> 00:04:47.840] It'll make you sick.
[00:04:48.800 --> 00:04:49.760] And it's huge.
[00:04:49.760 --> 00:04:55.440] And you can't, you know, you go to cut it down, a little bit of its juices or anything gets onto you.
[00:04:55.840 --> 00:04:57.360] You wind up being in trouble.
[00:04:57.360 --> 00:05:02.640] Yeah, I know my friends who live in Oregon, everyone there has a problem with invasive blackberries.
[00:05:02.640 --> 00:05:04.400] They just grow like weeds.
[00:05:04.800 --> 00:05:07.520] They were an ag plant that ran away.
[00:05:07.520 --> 00:05:09.360] They could not get it under control.
[00:05:09.360 --> 00:05:10.480] And it's brutal.
[00:05:10.480 --> 00:05:12.320] It just takes over people's backyards.
[00:05:12.320 --> 00:05:14.400] Yeah, we have that in our backyard as well.
[00:05:14.400 --> 00:05:17.680] Not in the yard, but like, in the woods behind the yard.
[00:05:17.680 --> 00:05:20.400] And I have to whack that back every year.
[00:05:20.400 --> 00:05:23.520] Yeah, they say you've got to pull it by the root, but it's really hard to do.
[00:05:23.520 --> 00:05:23.760] It is.
[00:05:24.400 --> 00:05:27.760] Yeah, you've got to pull it up by the root, but the roots are runners, you know?
[00:05:27.760 --> 00:05:30.360] And so you never get them totally up.
[00:05:31.480 --> 00:05:32.840] At some point, it's going to break.
[00:05:33.480 --> 00:05:34.440] So it's hard to get it.
[00:05:29.840 --> 00:05:36.760] But I try to get as much of the roots as I can.
[00:05:37.080 --> 00:05:39.400] Yeah, because it's really like brambly, right?
[00:05:39.400 --> 00:05:40.280] And kind of painful.
[00:05:40.280 --> 00:05:42.680] And it, I don't know, thick leather gloves.
[00:05:42.680 --> 00:05:43.000] Yeah.
[00:05:43.080 --> 00:05:46.600] Pull it up by the roots, but you know, as far back as I can.
[00:05:46.600 --> 00:05:49.560] And it keeps it under control, but I got to keep on top of it.
[00:05:49.560 --> 00:05:53.480] If I didn't, if I go like a couple years without doing that, it would completely take over.
[00:05:53.480 --> 00:05:56.040] I'm so glad I don't have a yard.
[00:05:56.360 --> 00:05:57.560] It's so easy.
[00:05:57.800 --> 00:05:59.800] The upkeep of my house is so easy.
[00:05:59.800 --> 00:06:04.600] That said, I got a packet of wildflower seeds recently, so I planted them in a pot and put them in my window.
[00:06:04.600 --> 00:06:06.440] So I have wildflowers growing in a pot.
[00:06:06.440 --> 00:06:06.920] That's nice.
[00:06:06.920 --> 00:06:07.400] It's cheap.
[00:06:08.040 --> 00:06:11.640] My patio, I have some hooks where I buy some hanging plants each summer.
[00:06:11.640 --> 00:06:13.160] You know, that flower.
[00:06:13.400 --> 00:06:15.560] Last year, I bought one particular variety.
[00:06:15.560 --> 00:06:17.320] I have no idea what the heck it was.
[00:06:17.320 --> 00:06:24.840] But what happened this spring is to the, what borders my patio is a rock garden, right?
[00:06:24.840 --> 00:06:25.640] Just rocks.
[00:06:26.040 --> 00:06:27.080] Nothing else in there.
[00:06:27.400 --> 00:06:33.560] But those flowers from the plant from last year started to spontaneously, or not spontaneously, but they grew.
[00:06:34.120 --> 00:06:38.680] The seeds or whatever fell down to the, found its way down there, and there they are.
[00:06:38.840 --> 00:06:45.240] The little yellow flowers came up all spring and summer, just randomly throughout the rock patch.
[00:06:45.240 --> 00:06:46.120] It looks beautiful.
[00:06:46.120 --> 00:06:46.760] Wow.
[00:06:46.760 --> 00:06:51.880] Do you remember when we were in Texas and we were watching the eclipse in the field?
[00:06:52.200 --> 00:06:52.680] No.
[00:06:53.560 --> 00:06:55.160] I know it's like we were all kind of focused.
[00:06:55.160 --> 00:07:02.920] We were looking up, but when you looked down, and this is like, I guess, a Texas initiative, there are wildflowers everywhere.
[00:07:02.920 --> 00:07:03.560] Yeah, we noticed that.
[00:07:03.560 --> 00:07:04.400] The bluebonnets are.
[00:07:04.520 --> 00:07:05.640] They're beautiful.
[00:07:05.640 --> 00:07:06.760] Yeah, they're gorgeous.
[00:07:06.760 --> 00:07:09.800] Yeah, just little, and the kids were like picking them and making bouquets and stuff.
[00:07:10.120 --> 00:07:10.760] Yeah, they're everywhere.
[00:07:10.760 --> 00:07:12.760] And there's bees and butterflies.
[00:07:12.760 --> 00:07:13.520] I love that.
[00:07:13.160 --> 00:07:13.760] Love that.
[00:07:14.200 --> 00:07:18.320] We have a butterfly bush in our backyard, so I bought one and planted it in the backyard.
[00:07:18.560 --> 00:07:25.760] But now I have three butterfly bushes because it just spontaneously propagated, just started growing elsewhere.
[00:07:25.760 --> 00:07:33.520] Again, not where we mow, but basically along the edge of my property where we don't mow, where we just have, you know, bushes and stuff.
[00:07:33.840 --> 00:07:45.840] Steve, it's funny you say that because I got mom a butterfly bush last year, and we're looking, and five feet away from it there's like a stone patio on the ground.
[00:07:46.000 --> 00:07:54.320] So it's a stone patio, and growing between two stones is exactly one little butterfly bush with a beautiful bloom.
[00:07:54.560 --> 00:07:56.320] Yeah, with a peg.
[00:07:56.640 --> 00:07:57.120] What's that?
[00:07:57.120 --> 00:07:58.480] By the way, what's a butterfly bush?
[00:07:58.480 --> 00:08:00.160] Is that milkweed or is it no?
[00:08:00.160 --> 00:08:01.840] It's called butterfly bush.
[00:08:01.840 --> 00:08:02.000] Oh.
[00:08:02.480 --> 00:08:06.640] I'm sure it has a technical name, but that's the common name of it.
[00:08:06.640 --> 00:08:09.280] And it's, it, you know, it really attracts butterflies.
[00:08:09.280 --> 00:08:09.680] Oh, nice.
[00:08:09.680 --> 00:08:12.080] So it's after they're already butterflies.
[00:08:12.080 --> 00:08:16.480] It's also really good to plant milkweed if you want to provide habitat.
[00:08:16.640 --> 00:08:21.920] A lot of that in California, they have whole swaths of nature staked out just for the milkweed.
[00:08:22.480 --> 00:08:24.320] So the butterflies can make their trip, right?
[00:08:24.560 --> 00:08:31.200] But word of caution, if you're going to buy milkweed and plant it in your yard or whatever, make sure it's the right local native.
[00:08:31.440 --> 00:08:37.520] Because you can buy ones that the monarch butterflies do not use.
[00:08:37.520 --> 00:08:37.840] Yeah.
[00:08:38.160 --> 00:08:39.040] But it is cool.
[00:08:39.040 --> 00:08:42.720] Like I had some on my patio when I lived in Florida.
[00:08:42.720 --> 00:08:47.280] And just I remember looking one day and it was like, I don't know, 12 caterpillars.
[00:08:47.600 --> 00:08:48.880] Because that's what they eat.
[00:08:48.880 --> 00:08:50.560] And then they pupate right there on it.
[00:08:50.560 --> 00:08:53.360] It's so cool to watch the whole process.
[00:08:53.360 --> 00:08:54.400] It is, yeah.
[00:08:54.720 --> 00:08:56.880] Yeah, we had a lot of caterpillar action going on.
[00:08:56.880 --> 00:09:00.600] We had these black and yellow caterpillars in our herb garden.
[00:09:00.600 --> 00:09:02.200] We sort of left them there.
[00:08:59.920 --> 00:09:04.360] Well, monarch caterpillars are black and yellow.
[00:09:04.600 --> 00:09:07.640] Yeah, I think these were swallows.
[00:09:07.800 --> 00:09:08.920] Those are so pretty.
[00:09:08.920 --> 00:09:10.200] I love swallowtails.
[00:09:10.200 --> 00:09:11.000] Swallops?
[00:09:11.000 --> 00:09:11.800] Swallops?
[00:09:11.800 --> 00:09:12.680] Swallops?
[00:09:13.560 --> 00:09:17.000] All right, Bob, you're going to start us off with a quickie.
[00:09:17.000 --> 00:09:17.960] Thank you, Steve.
[00:09:17.960 --> 00:09:20.840] Predicting earthquakes in the news.
[00:09:20.840 --> 00:09:35.800] Researchers recently used machine learning techniques to examine data from two recent earthquakes, and they believe it might be possible, actually possible maybe, to detect major earthquakes months, like three months in advance.
[00:09:35.800 --> 00:09:36.440] What?
[00:09:36.440 --> 00:09:38.360] Yeah, that was a great way to get it.
[00:09:39.080 --> 00:09:40.120] Oh, my gosh, that can't be.
[00:09:40.280 --> 00:09:55.080] Well, the two earthquakes they looked at were the 2018 magnitude 7.1 Anchorage earthquake and the 2019 Ridgecrest California earthquake, which apparently went from a 6.4 magnitude to 7.1.
[00:09:55.080 --> 00:10:01.560] For both, they found unusual low-level seismicity for three months before the big quakes hit.
[00:10:01.560 --> 00:10:08.200] And by low-level, we're talking about, say, a magnitude 1.5 or below, which is something you wouldn't even notice.
[00:10:08.200 --> 00:10:15.880] Because, I mean, I think three, magnitude three is the threshold for somebody to just like noticing it walking around.
[00:10:16.040 --> 00:10:18.520] So this is 1.5, so you're not even going to notice that.
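To put those magnitudes in perspective, here is a rough back-of-the-envelope (not from the episode) using the standard Gutenberg-Richter energy relation, with E the radiated seismic energy in joules:

```latex
\log_{10} E \approx 1.5\,M + 4.8
\qquad\Rightarrow\qquad
\frac{E_{M=3.0}}{E_{M=1.5}} = 10^{1.5\,(3.0 - 1.5)} = 10^{2.25} \approx 180
```

So a just-noticeable magnitude 3 releases roughly 180 times the energy of a magnitude 1.5, and its ground-motion amplitude is about 10^1.5, roughly 32 times larger, which is why the magnitude 1.5 events go unnoticed without instruments.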
[00:10:18.680 --> 00:10:24.600] They even have a theory about what caused these little warning rumbles for months that they detected.
[00:10:24.840 --> 00:10:29.960] They think it might be a dramatic increase in pore fluid pressure within a fault.
[00:10:29.960 --> 00:10:31.400] So, okay, there's that.
[00:10:31.640 --> 00:10:45.760] The researchers said our paper demonstrates that advanced statistical techniques, particularly machine learning, have the potential to identify precursors to large magnitude earthquakes by analyzing data sets derived from earthquake catalogs.
[00:10:45.760 --> 00:10:57.520] So the idea is that you can go through these catalogs of seismic data and train the machine learning on the catalog, so then it'll be optimized for that area.
[00:10:57.520 --> 00:11:07.440] So they also say modern seismic networks produce enormous data sets that, when properly analyzed, can offer valuable insights into the precursors of seismic events.
[00:11:07.440 --> 00:11:18.080] This is where advancements in machine learning and high-performance computing can play a transformative role, enabling researchers to identify meaningful patterns that could signal an impending earthquake.
[00:11:18.080 --> 00:11:26.160] So remember, though, it's important that before attempting this in a new region, they say the system would have to be trained on the area's historical seismicity.
[00:11:26.160 --> 00:11:33.760] So you're not going to bring the system to a new area and just start cranking out some information of when, you know, if something's coming.
[00:11:34.000 --> 00:11:37.680] It's got to become well-versed, if you will, in the area itself.
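For readers curious what "training machine learning on an earthquake catalog" could look like mechanically, here is a minimal, purely illustrative sketch. It is not the researchers' actual pipeline; the synthetic catalog, the window features (mean, max, and trend of daily small-event counts), and the scikit-learn random forest are all assumptions chosen for illustration.

```python
# Illustrative sketch only: flag "precursor-like" windows in a seismicity catalog.
# Not the published method; features, labels, and model choice are assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic catalog: daily counts of small (M <= 1.5) events over ~8 years.
n_days = 3000
background = rng.poisson(lam=2.0, size=n_days).astype(float)
# Pretend a mainshock occurs on day 2500, preceded by ~90 days of elevated
# low-level seismicity (the kind of signal the study reports).
mainshock_day = 2500
background[mainshock_day - 90:mainshock_day] += rng.poisson(lam=3.0, size=90)

def window_features(daily_counts, day, width=90):
    """Features for the 'width' days ending at 'day': mean, max, and linear trend."""
    w = daily_counts[day - width:day]
    trend = np.polyfit(np.arange(width), w, 1)[0]  # slope of a straight-line fit
    return [w.mean(), w.max(), trend]

# Label a window positive if it ends within 90 days before the mainshock.
X, y = [], []
for day in range(90, n_days):
    X.append(window_features(background, day))
    y.append(1 if mainshock_day - 90 <= day < mainshock_day else 0)

X_train, X_test, y_train, y_test = train_test_split(
    np.array(X), np.array(y), test_size=0.3, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```

The point is only the shape of the workflow: turn a region's seismicity history into labeled feature windows, then classify them; the real study works from genuine regional catalogs with far richer features.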
[00:11:37.680 --> 00:11:40.880] So, but still, imagine three months' warning.
[00:11:40.880 --> 00:11:43.920] The potential to save lives is dramatic.
[00:11:43.920 --> 00:11:47.920] If only they had this technology in that fun movie, Dante's Peak.
[00:11:47.920 --> 00:11:54.480] That annoying grandmother and the cocky scientist might not have died, which would have been, yeah, I didn't care for them.
[00:11:54.480 --> 00:11:55.680] Anyway, spoiler alert.
[00:11:55.840 --> 00:11:58.320] So, yes, this has been your Quickie with Bob.
[00:11:58.320 --> 00:11:59.280] Back to you, Steve.
[00:11:59.280 --> 00:12:02.320] Doesn't the cocky scientist die in every disaster movie?
[00:12:02.320 --> 00:12:03.760] Isn't that like the thing?
[00:12:04.720 --> 00:12:09.600] The guy who's making the character unlikable, and that way when they die, it's like, all right.
[00:12:10.320 --> 00:12:11.360] That's the formula.
[00:12:11.360 --> 00:12:12.400] It's not going to happen.
[00:12:12.400 --> 00:12:13.200] Don't worry about it.
[00:12:13.200 --> 00:12:13.920] That guy's dead.
[00:12:13.920 --> 00:12:16.080] You know that guy's going to die.
[00:12:16.400 --> 00:12:16.880] Right?
[00:12:16.880 --> 00:12:19.360] Yeah, he wasn't nearly as annoying as the grandmother.
[00:12:19.360 --> 00:12:20.480] She was really annoying.
[00:12:20.480 --> 00:12:22.480] Yeah, this volcano's going to erupt.
[00:12:22.480 --> 00:12:26.480] There's really no doubt I'm going to stay on the mountain in my cabin.
[00:12:26.480 --> 00:12:31.720] And of course, her goofy little grandkids run up and save her and put everyone at risk.
[00:12:31.720 --> 00:12:32.520] But whatever.
[00:12:32.520 --> 00:12:33.240] Still a fun movie.
[00:12:33.240 --> 00:12:33.640] I like it.
[00:12:33.640 --> 00:12:35.080] And this technology sounds pretty cool.
[00:12:35.480 --> 00:12:37.160] I hope this is really a thing.
[00:12:37.160 --> 00:12:38.040] That'd be sweet.
[00:12:38.040 --> 00:12:38.680] Three months.
[00:12:38.680 --> 00:12:39.480] Yeah, but what are we going to do?
[00:12:39.480 --> 00:12:44.040] Like, if they say, all right, California's going to be hit by a big earthquake in three months, are they going to vacate it?
[00:12:44.680 --> 00:12:49.480] No, but we can start making sure that anything that's not up to code is reinforced.
[00:12:49.480 --> 00:12:56.600] You can, if you're in a home that's like landslide, you know, territory, I'm sure that there's things they can do to prepare.
[00:12:56.600 --> 00:13:01.320] And also just homeowners, if you know an earthquake is coming, you can take things off the shelves.
[00:13:01.320 --> 00:13:03.320] You can prevent things from breaking.
[00:13:03.640 --> 00:13:04.840] Yeah, go on vacation.
[00:13:05.400 --> 00:13:05.880] That too.
[00:13:05.880 --> 00:13:06.520] Don't be there.
[00:13:06.520 --> 00:13:11.240] It's kind of like when hurricanes are coming in in Florida, there's little things.
[00:13:11.240 --> 00:13:13.480] Board up your windows, take in your patio furniture.
[00:13:13.480 --> 00:13:14.520] It can really help.
[00:13:14.520 --> 00:13:24.040] Yeah, they were saying they were giving some confidence levels at like an 85% confidence level, which is pretty good.
[00:13:24.280 --> 00:13:28.920] And then it gets even a little bit better, I think, the closer you get to the actual big quake.
[00:13:28.920 --> 00:13:31.560] And really, a big part is just planning.
[00:13:31.560 --> 00:13:34.440] Like, anybody living in California should have an earthquake kit.
[00:13:34.680 --> 00:13:40.440] Yeah, we all have earthquake kits, but if you don't, then what's in an earthquake kit?
[00:13:40.440 --> 00:13:41.880] It's a bug-out bag.
[00:13:42.680 --> 00:13:43.240] Oh, oh, okay.
[00:13:43.240 --> 00:13:44.120] Just like a bug-out bag?
[00:13:44.680 --> 00:13:44.840] Yeah.
[00:13:45.080 --> 00:13:45.720] Oh, no, no, no.
[00:13:45.720 --> 00:13:56.360] It's for like if you are trapped in your house, it's food, it's a radio, it's flashlights, candles, whatever you would need if you don't have power, water.
[00:13:56.360 --> 00:13:57.000] Got it.
[00:13:57.000 --> 00:13:57.480] Yeah.
[00:13:57.480 --> 00:13:59.080] You could make sure that you have that.
[00:13:59.080 --> 00:14:04.200] And also, I think one of the big problems with an earthquake is like bridges, roads, mudslides.
[00:14:04.200 --> 00:14:10.200] So you could plan for that so people aren't, you know, it's not chaos afterward, like it always is.
[00:14:10.520 --> 00:14:18.960] I think everybody should be, you know, should be prepared to live at home with no outside contact for at least three weeks.
[00:14:14.840 --> 00:14:19.440] Three weeks.
[00:14:19.600 --> 00:14:22.400] I think the recommendation is three days, isn't it?
[00:14:22.400 --> 00:14:22.560] Yes.
[00:14:22.880 --> 00:14:23.440] 72 hours.
[00:14:23.600 --> 00:14:24.080] 72 hours.
[00:14:25.200 --> 00:14:27.280] Absolutely three days.
[00:14:27.280 --> 00:14:39.760] But I think if you were like ready to go for two to three weeks, that is, you know, I think that's a good middle ground between like basically living day to day and preppers.
[00:14:39.760 --> 00:14:40.720] You know what I mean?
[00:14:40.720 --> 00:14:43.280] This is essentially like the flu.
[00:14:43.280 --> 00:14:55.280] When the flu hit, and you couldn't, you know, couldn't get toilet paper for weeks or whatever, or there was a total run on the store and you had to stock up.
[00:14:55.280 --> 00:14:56.480] But it's not that hard.
[00:14:56.480 --> 00:15:04.720] It's not that hard to have two to three weeks of food and stuff in your house for most people who have a house, you know.
[00:15:04.720 --> 00:15:05.760] Right, if you have space.
[00:15:05.760 --> 00:15:06.000] Yeah.
[00:15:06.560 --> 00:15:07.840] But at least three days.
[00:15:07.840 --> 00:15:11.600] I think a lot of people, if they were to look right now, they don't have three days worth of water.
[00:15:12.400 --> 00:15:14.560] Need three days worth of clean drinking water.
[00:15:15.920 --> 00:15:16.640] I do.
[00:15:16.960 --> 00:15:18.000] Three months, though, Steve.
[00:15:18.000 --> 00:15:18.960] I mean, that's like...
[00:15:18.960 --> 00:15:19.760] I do as well.
[00:15:20.080 --> 00:15:24.240] The average person wouldn't, including myself, like really wouldn't know how to prep that.
[00:15:24.640 --> 00:15:26.480] I'm saying three weeks, Jay, not three months.
[00:15:26.480 --> 00:15:28.240] Three months, that's getting prepper level.
[00:15:28.240 --> 00:15:28.880] And I agree.
[00:15:28.880 --> 00:15:30.480] That's a total different level.
[00:15:30.480 --> 00:15:49.280] But it's not hard to have, again, like two to three weeks, essentially long enough that if there was a major disruption, an earthquake, a hurricane, a pandemic, something like that, you would be able, you and your family could survive without going to the store for a couple of weeks.
[00:15:49.280 --> 00:15:57.840] That would be time for the authorities to sort of re-establish some kind of infrastructure or whatever.
[00:15:57.840 --> 00:15:58.480] You know what I mean?
[00:15:59.200 --> 00:16:13.160] It wouldn't be like two days later, you're out of food, which I think a lot of people are in that situation where they buy their food for the week, maybe at most, and then by the end of the week, they have no food in their house or very, very little.
[00:16:13.160 --> 00:16:14.280] Yeah, I'm kind of like that.
[00:16:14.520 --> 00:16:15.080] And you're right.
[00:16:15.080 --> 00:16:17.800] It's good to have pantry staples.
[00:16:17.800 --> 00:16:22.280] Pantry staples, dried goods, rice, pasta, things like that, beans, things in cans.
[00:16:22.280 --> 00:16:28.200] Just have it so that your food has this two to three week cycle to it.
[00:16:28.200 --> 00:16:29.480] You know, it's not there forever.
[00:16:29.960 --> 00:16:30.520] You're eating it.
[00:16:30.520 --> 00:16:34.600] You're going through it, but it's just, you have this buffer you have.
[00:16:34.600 --> 00:16:35.240] Yeah, you just have to.
[00:16:35.400 --> 00:16:36.520] So you have all that, Steve?
[00:16:36.520 --> 00:16:37.240] Yeah, totally.
[00:16:37.240 --> 00:16:37.560] All right.
[00:16:37.560 --> 00:16:38.520] Shit hits the fan.
[00:16:38.520 --> 00:16:39.480] Steve's house.
[00:16:40.120 --> 00:16:42.360] And there goes his three-week supply; it becomes a lot less than three weeks.
[00:16:42.520 --> 00:16:43.320] Well, that's the other thing.
[00:16:43.800 --> 00:16:45.320] Yep, make sure you can accommodate everyone.
[00:16:45.400 --> 00:16:51.720] Yeah, if you have two to three weeks, if you do need to take on an extra person or two or whatever, it's not a disaster.
[00:16:52.120 --> 00:16:53.240] You can accommodate them.
[00:16:53.480 --> 00:16:55.480] What about if you don't have any power, though?
[00:16:55.480 --> 00:16:57.320] That's a different beast.
[00:16:57.560 --> 00:16:59.080] I camp, so I can cook that stuff.
[00:16:59.400 --> 00:17:04.760] So, like, even though I don't have a generator, I have a JetBoil.
[00:17:04.760 --> 00:17:07.080] I have, you know, propane.
[00:17:07.080 --> 00:17:14.200] I have all the things I need to be able to survive when I'm camping.
[00:17:14.200 --> 00:17:14.920] Yeah, right.
[00:17:14.920 --> 00:17:16.040] So treat it like you're camping.
[00:17:16.040 --> 00:17:16.440] Yeah, exactly.
[00:17:17.000 --> 00:17:17.720] We have that stuff too.
[00:17:17.720 --> 00:17:18.680] We also have a fireplace.
[00:17:18.840 --> 00:17:22.040] I always have a half a cord, at least a wood or so around.
[00:17:22.040 --> 00:17:25.080] So if we have absolutely had to, we could do that.
[00:17:25.080 --> 00:17:26.440] Anyway, something to think about.
[00:17:26.440 --> 00:17:27.960] Don't be caught unawares.
[00:17:27.960 --> 00:17:29.480] All right, let's move on.
[00:17:29.480 --> 00:17:41.080] I'm going to talk about the World Health Organization's recent systematic review of the potential association between cell phones and brain cancer.
[00:17:41.120 --> 00:17:41.640] Ah, yes.
[00:17:41.760 --> 00:17:42.440] We talked about that.
[00:17:42.600 --> 00:17:43.080] I saw that.
[00:17:43.640 --> 00:17:44.640] We've been talking about this before.
[00:17:44.800 --> 00:17:45.680] It's about 10 years.
[00:17:44.600 --> 00:17:51.840] I think my first article about it, one of the first articles I wrote on Science-Based Medicine, was in 2010.
[00:17:52.160 --> 00:17:52.880] Did you read it?
[00:17:52.880 --> 00:17:54.640] 14 years ago, of course.
[00:17:54.640 --> 00:17:58.240] And I've written about it a few times since then.
[00:17:58.240 --> 00:18:00.560] And so now here's nothing new, right?
[00:18:00.560 --> 00:18:01.360] This is nothing new.
[00:18:01.360 --> 00:18:03.040] This is just an update.
[00:18:04.000 --> 00:18:08.560] It's a systematic review of studies that we've already been reading and reviewing ourselves.
[00:18:08.560 --> 00:18:09.120] You know what I mean?
[00:18:09.120 --> 00:18:12.960] So, like, it shouldn't take you by surprise if it's not new data.
[00:18:12.960 --> 00:18:13.600] It's just reviewing the data.
[00:18:13.840 --> 00:18:14.720] Reviewing the existing data.
[00:18:14.880 --> 00:18:16.080] Yeah, exactly.
[00:18:16.080 --> 00:18:19.280] Here's the bottom line: they found no association, right?
[00:18:19.280 --> 00:18:21.120] So none whatsoever.
[00:18:21.440 --> 00:18:21.840] None.
[00:18:21.840 --> 00:18:32.880] So this is looking at there's different kinds of evidence you can bring to bear to ask the question: is there any risk from the radio frequency EMF from cell phones, right?
[00:18:32.880 --> 00:18:37.120] The concern being that you're holding it up to your head, you're doing it for many hours a day for years.
[00:18:37.120 --> 00:18:39.040] And, you know, is there a potential risk?
[00:18:39.200 --> 00:18:42.000] Which is funny because, like, who holds their phone up to their head anymore?
[00:18:42.000 --> 00:18:42.640] I do sometimes.
[00:18:42.640 --> 00:18:43.760] Sometimes I do it in front of me.
[00:18:43.760 --> 00:18:44.960] Sometimes I hold it up to my head.
[00:18:44.960 --> 00:18:45.680] All right, boomer.
[00:18:45.760 --> 00:18:46.800] No, I'm kidding.
[00:18:47.440 --> 00:18:48.000] Sometimes.
[00:18:48.480 --> 00:18:49.440] I'm kidding.
[00:18:50.000 --> 00:18:50.800] I know you know this.
[00:18:50.800 --> 00:18:54.080] I have many conversations that need to be confidential.
[00:18:54.080 --> 00:18:54.480] Yeah.
[00:18:54.480 --> 00:18:56.960] I can't have it on speakerphone when I'm talking to a patient.
[00:18:57.520 --> 00:18:59.520] That's when I pull out my AirPods.
[00:18:59.520 --> 00:19:00.240] But no, it's true.
[00:19:00.560 --> 00:19:08.480] I think we've talked about this before when you ask somebody younger than a millennial, like a Gen Z or an alpha, what's the universal symbol for the phone?
[00:19:08.480 --> 00:19:11.280] They don't hold their finger and thumb up to their head.
[00:19:12.080 --> 00:19:12.640] I love that.
[00:19:12.640 --> 00:19:12.960] I love that.
[00:19:13.200 --> 00:19:15.760] They hold it like they're holding a deck of cards, like they're holding an iPhone.
[00:19:15.920 --> 00:19:16.400] Right, right, right.
[00:19:16.640 --> 00:19:16.960] So funny.
[00:19:17.040 --> 00:19:17.440] Oh my gosh.
[00:19:18.000 --> 00:19:18.480] All right.
[00:19:18.480 --> 00:19:22.400] So, so this was a review of 63 studies.
[00:19:23.040 --> 00:19:24.640] These are observational studies.
[00:19:24.640 --> 00:19:35.640] Obviously, you can't do experimental studies on risk like this, but these were observational studies from 22 different countries, looking at a total of 116 different comparisons, right, within those studies.
[00:19:36.040 --> 00:19:38.200] They had multiple comparisons.
[00:19:38.200 --> 00:19:41.880] And from 1994 to 2022.
[00:19:42.200 --> 00:19:42.920] 94.
[00:19:43.240 --> 00:19:48.680] Wow, that's 28 flipping years of data.
[00:19:48.680 --> 00:20:02.040] And they found that there was no observable increase in the relative risk for most investigated neoplasms with increasing time since start of use of mobile phones, cumulative call time, or cumulative number of calls.
[00:20:02.280 --> 00:20:05.160] In other words, you try to do a dose response, right?
[00:20:05.480 --> 00:20:11.880] When you gather information about do you use a phone, do you not, how many times a day, how long are your calls, how many calls, whatever.
[00:20:11.880 --> 00:20:19.320] And any of those variables, there was no correlation with that and any of the brain cancers that they looked at.
[00:20:19.320 --> 00:20:25.160] They found no increased risk in children or adults, regardless of intensity or duration.
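For context on the statistic being summarized, relative risk in studies like these is just the ratio of the outcome rate in the exposed group to the rate in the unexposed group (the standard epidemiological definition, not a number from the review itself):

```latex
RR = \frac{\text{risk in exposed}}{\text{risk in unexposed}} = \frac{a/(a+b)}{c/(c+d)}
```

where a and b are exposed people with and without the cancer, and c and d are the unexposed counterparts; an RR near 1 across every exposure level is exactly the "no dose response" result being described.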
[00:20:25.160 --> 00:20:28.200] So that's all very reassuring, right?
[00:20:28.200 --> 00:20:38.040] I mean, that tells us if there is any tiny risk that's being missed in this pretty large set of data, it's got to be so small it's not worth worrying about.
[00:20:38.040 --> 00:20:41.800] This also correlates with other kinds of data that you could bring to bear as well.
[00:20:41.800 --> 00:20:43.800] So if you look at animal data, right?
[00:20:43.800 --> 00:20:47.480] Now, animal data doesn't look at risk, it looks at hazard.
[00:20:47.720 --> 00:20:50.280] I always like to point out the difference between those two things.
[00:20:50.280 --> 00:21:00.360] This is like, is there a potential biological mechanism by which it could cause harm versus what's the risk that it actually is causing harm, right?
[00:21:00.360 --> 00:21:05.720] Like a loaded gun is a hazard, but it's not a risk if you have it locked away in a gun cabinet, right?
[00:21:05.720 --> 00:21:08.600] If a kid playing with a gun, that's a risk, right?
[00:21:08.600 --> 00:21:10.920] So that's one way to look at the difference.
[00:21:10.880 --> 00:21:11.200] Right?
[00:21:11.160 --> 00:21:13.880] You could say cell phone's a hazard, but if you don't use it, there's no risk.
[00:21:14.200 --> 00:21:14.760] Same thing.
[00:21:14.880 --> 00:21:24.240] So when looking at the animal data, we have to divide it up further into two subcategories: thermal and non-thermal.
[00:21:24.960 --> 00:21:27.120] So let me back up one notch.
[00:21:27.120 --> 00:21:38.400] When we're talking about radio frequency EMF, this is what we call non-ionizing radiation, which I know I've mentioned before on the show, but just to reiterate, non-ionizing radiation is low-energy radiation.
[00:21:38.400 --> 00:21:39.280] How low is it?
[00:21:39.280 --> 00:21:42.720] It's so low that it cannot break chemical bonds.
[00:21:42.720 --> 00:21:44.080] So that's the definition.
[00:21:44.080 --> 00:21:52.640] It does not break chemical bonds, which means that it does not damage DNA, it does not damage protein, so it doesn't cause mutations, it doesn't cause any tissue damage.
[00:21:52.880 --> 00:21:57.440] And the thinking is, well, non-ionizing radiation is basically safe.
[00:21:57.440 --> 00:22:04.000] You know, ionizing radiation, that's the stuff we have to worry about and limit your exposure to, like gamma rays and things like that.
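To make the "too weak to break chemical bonds" point concrete, here is a rough worked comparison for a roughly 2 GHz cell-phone signal (ballpark numbers, not from the episode):

```latex
E_{\text{photon}} = h f \approx (6.63\times10^{-34}\,\text{J s})(2\times10^{9}\,\text{Hz})
\approx 1.3\times10^{-24}\,\text{J} \approx 8\times10^{-6}\,\text{eV}
```

A typical covalent bond is a few electron volts (a C-C bond is roughly 3.6 eV), so each RF photon carries on the order of half a million times too little energy to break a covalent bond, and tens of thousands of times too little even for weak hydrogen bonds, which is the physical basis for the "non-ionizing" label.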
[00:22:04.000 --> 00:22:16.400] But non-ionizing radiation can heat tissue a little bit, especially if you're holding it right up to the tissue over the, depending on, again, the intensity and duration, et cetera.
[00:22:16.400 --> 00:22:25.440] So any non-ionizing or radio frequency radiation that was not enough to cause any significant tissue heating is called non-thermal exposure.
[00:22:25.440 --> 00:22:32.880] And in that category, when doing animal studies, non-thermal RF, there's no risk.
[00:22:32.880 --> 00:22:36.640] I mean, it's basically there's no biological issue there.
[00:22:36.640 --> 00:22:50.560] For the thermal group, meaning that there was non-ionizing radiation exposure that was sufficient to cause measurable tissue heating, which could still just be like a degree or two, but it was measurable, it's still mostly negative.
[00:22:50.560 --> 00:22:52.800] It's still, you know, it's a little mixed, though.
[00:22:52.800 --> 00:22:54.320] But there were meaning some studies.
[00:22:54.640 --> 00:22:56.080] What could be risky about that?
[00:22:56.080 --> 00:22:57.120] About slightly warmer.
[00:22:57.280 --> 00:22:57.920] A superficial burn?
[00:22:58.080 --> 00:22:59.360] Yeah, who knows?
[00:22:59.360 --> 00:23:05.240] I mean, again, it's not that plausible a risk anyway, but at least you're saying something physically is happening.
[00:23:05.720 --> 00:23:12.200] Do you think that's why, like, the people who are like, don't keep your phone in your pocket by your balls because it'll affect your fertility?
[00:23:12.200 --> 00:23:14.680] Like, is that the conclusion they're trying to draw?
[00:23:14.680 --> 00:23:21.320] Yeah, it's completely different with testicles because those do need like one degree intimidated sperm production.
[00:23:21.560 --> 00:23:30.840] So, yeah, the idea is that cells, you know, we do carefully regulate our body temperature and small changes in body temperature can have an adverse effect on metabolism or whatever.
[00:23:30.840 --> 00:23:35.400] And so it's semi-plausible that a little bit of heating can be an issue.
[00:23:35.400 --> 00:23:37.000] But would it cause cancer?
[00:23:37.000 --> 00:23:39.800] There's really not really much of a plausible mechanism there.
[00:23:39.800 --> 00:23:42.040] And the evidence is still mostly negative.
[00:23:42.040 --> 00:23:46.600] So it's completely negative for non-thermal and mostly negative for thermal.
[00:23:46.920 --> 00:23:56.120] And I feel like if you looked at just the variability in like the temperature people keep their homes or like being near a lamp or you know there's so many other things.
[00:23:56.120 --> 00:23:57.320] But again, this is animal data.
[00:23:57.320 --> 00:24:04.600] So they're like exposing the animals to this, you know, and measuring the increase in temperature of the skin.
[00:24:04.600 --> 00:24:07.240] And then there's, you know, like ecological data.
[00:24:07.240 --> 00:24:12.520] So we could ask the question just in the last 30 years, has brain cancer increased?
[00:24:12.520 --> 00:24:12.840] Right?
[00:24:12.840 --> 00:24:15.720] And the answer, again, is basically no.
[00:24:15.720 --> 00:24:21.160] Again, there's a lot more complexity there because it depends on what subpopulations you're looking at, et cetera, et cetera.
[00:24:21.160 --> 00:24:25.080] Are you controlling for surveillance and for diagnostic, whatever?
[00:24:25.080 --> 00:24:30.440] But if you look at the totality of that research, it's basically a no.
[00:24:30.440 --> 00:24:34.360] There certainly isn't any big increase in brain cancer over the last 30 years.
[00:24:34.680 --> 00:24:38.480] Yeah, that's not one of the ones that I often see flagged, like colon cancer.
[00:24:38.800 --> 00:24:41.640] Yeah, yeah, increasing in younger and younger people.
[00:24:41.640 --> 00:24:44.080] Yeah, because there are lifestyle risk factors for that.
[00:24:43.720 --> 00:24:47.200] You know, there basically isn't, you know, for brain cancer.
[00:24:47.200 --> 00:24:47.600] Yeah.
[00:24:43.800 --> 00:24:50.720] Yeah, like ones that have big lifestyle factors, you can see differences.
[00:24:50.800 --> 00:25:00.480] You also see differences if we radically change our diagnostic procedures, like you know, doing mammograms or doing tests for prostate cancer, you know, PSA.
[00:25:01.280 --> 00:25:06.080] But for brain cancer, I don't think it's there hasn't really been anything significantly different.
[00:25:06.080 --> 00:25:09.200] So animal data, basically negative.
[00:25:09.520 --> 00:25:10.960] Ecological data, negative.
[00:25:10.960 --> 00:25:13.520] This observational data, negative.
[00:25:13.520 --> 00:25:22.080] Doesn't appear to be any plausible mechanism, and there's no risk that we could detect after 30 years out there in the public.
[00:25:22.080 --> 00:25:24.480] So this should pretty much put this to rest.
[00:25:24.480 --> 00:25:26.080] But of course it won't.
[00:25:26.480 --> 00:25:37.440] Because, you know, the press likes fear-mongering headlines, and there's already a cottage industry of quack snake oil devices to protect you from the RF in your cell phone.
[00:25:37.440 --> 00:25:46.160] And the people who want to keep this fear-mongering industry going always say the same kinds of things, which is like, well, we haven't been studying it long enough.
[00:25:46.160 --> 00:25:47.120] Well, what's long enough?
[00:25:47.120 --> 00:25:50.160] Longer than we've currently been studying it, apparently, whatever that is.
[00:25:50.160 --> 00:25:53.120] Right, longer than the amount of time we have available to us to study.
[00:25:53.360 --> 00:25:53.680] Yeah, right.
[00:25:53.680 --> 00:25:57.440] So if you had said that in the 90s and aughts, maybe you had a point.
[00:25:57.440 --> 00:25:59.520] Yes, we need to do longer, and that's what we said.
[00:25:59.520 --> 00:26:02.480] It's negative so far, but we need to do longer duration studies.
[00:26:02.480 --> 00:26:04.320] Now we have 30 years of data.
[00:26:04.320 --> 00:26:08.480] We would be seeing a signal if there was an actual effect.
[00:26:08.480 --> 00:26:12.560] And the other thing is, well, what, you know, 5G hasn't been out for that long.
[00:26:12.560 --> 00:26:14.160] Maybe 3G is safe.
[00:26:14.160 --> 00:26:19.280] And of course, once we get good data, like long-term data on 5G, 6G will be coming out.
[00:26:19.280 --> 00:26:24.880] Yeah, so there's always this sort of moving target that you can point to.
[00:26:24.880 --> 00:26:27.600] So I don't think this is ever going to completely go away.
[00:26:27.600 --> 00:26:37.720] But at this point, with this much data for this long and no detectable signal, to me, this is just not something you need to worry about.
[00:26:38.040 --> 00:26:47.560] And I think the fact that the basic science, it's non-ionizing radiation, it should be safe is also very reassuring.
[00:26:47.560 --> 00:26:53.080] Again, if you trust in the laws of physics, generally speaking, you're going to be good.
[00:26:53.080 --> 00:26:55.160] So anyway, big systematic review.
[00:26:55.160 --> 00:27:02.760] No surprise for those of us who have been following this research for the last 20 years, but it is good to see.
[00:27:03.080 --> 00:27:16.840] It seems like, tell me if this is a silly question, but it seems like the amount of non-ionizing radiation, which we know is safe, that you would get from having a cell phone near your head, is it really significantly different?
[00:27:16.840 --> 00:27:27.880] You know, given that it's safe, like put that on the shelf, is just the physical quantity really that significantly different from someone who uses a cell phone and someone who just walks around on Earth today?
[00:27:28.200 --> 00:27:28.600] Yeah, that's a good question.
[00:27:28.920 --> 00:27:31.400] We have so much technology around there.
[00:27:32.920 --> 00:27:36.600] But it is because of the inverse square law.
[00:27:37.640 --> 00:27:38.680] It's just near you.
[00:27:38.840 --> 00:27:42.840] When it's right, right next to your head, it's just going to be more than ambient RF.
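A quick numerical sketch of that inverse-square point, with assumed round-number transmit powers and distances (and ignoring antenna gain, absorption, and near-field effects):

```python
# Illustrative inverse-square comparison; the power levels and distances are
# assumptions for a rough order-of-magnitude estimate, not measured values.
import math

def power_density(p_watts, r_meters):
    """Free-space power density (W/m^2) from an isotropic source: S = P / (4*pi*r^2)."""
    return p_watts / (4 * math.pi * r_meters**2)

phone = power_density(0.25, 0.02)   # ~0.25 W handset held ~2 cm from the head
tower = power_density(40.0, 100.0)  # ~40 W macro-cell antenna ~100 m away

print(f"phone at 2 cm : {phone:.2e} W/m^2")
print(f"tower at 100 m: {tower:.2e} W/m^2")
print(f"ratio         : {phone / tower:.0f}x")
```

Even though a tower radiates far more total power, the handset a couple of centimeters from your head dominates your personal exposure, which is the point being made here.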
[00:27:42.920 --> 00:27:44.520] But like, does an alarm clock do that?
[00:27:44.520 --> 00:27:46.840] Because it's right next to your head, too.
[00:27:47.160 --> 00:27:54.520] I mean, it would give you some exposure, but alarm clocks, unless they're 5G connected, they shouldn't be putting out a lot of RF.
[00:27:54.520 --> 00:27:55.320] Oh, okay.
[00:27:55.960 --> 00:27:58.040] So, I mean, yes, it's more exposure.
[00:27:58.600 --> 00:28:07.560] But since you bring that up, this review also included some studies which looked at occupational exposure, including people working like near cell towers.
[00:28:07.560 --> 00:28:08.040] You know what I mean?
[00:28:08.440 --> 00:28:10.280] Yeah, because that's always a big scaremongering thing, too.
[00:28:10.520 --> 00:28:12.120] Yeah, and those are all negative, too.
[00:28:12.120 --> 00:28:18.800] So, even like the way more exposure than you're going to get just using a cell phone didn't seem to produce any increased risk.
[00:28:19.120 --> 00:28:33.280] So, anyway, it's good to know for our modern civilization that radio frequency non-ionizing radiation is safe, because it would be pretty bad if it weren't, you know, given how ubiquitous the technology is.
[00:28:33.520 --> 00:28:35.920] As I said, a lot of other things that you need to worry about.
[00:28:35.920 --> 00:28:37.360] That's not one of them.
[00:28:37.360 --> 00:28:42.560] Well, everyone, we're going to take a quick break from our show to talk about one of our sponsors this week, Mint Mobile.
[00:28:42.560 --> 00:28:45.440] Guys, I've been using Mint Mobile for the past few weeks.
[00:28:45.440 --> 00:28:47.200] I am really happy with the service, actually.
[00:28:47.440 --> 00:28:48.320] It's awesome.
[00:28:48.320 --> 00:28:50.480] You know, first off, the setup was easy.
[00:28:50.480 --> 00:28:52.000] You know, none of that BS.
[00:28:52.000 --> 00:28:53.920] There wasn't a lot of hoops to jump through.
[00:28:53.920 --> 00:28:58.400] The plan is 15 bucks a month if you purchase a three-month plan.
[00:28:58.400 --> 00:29:01.680] And it turns out that I've been paying way too much.
[00:29:01.840 --> 00:29:04.800] My God, you can't beat $15 a month.
[00:29:05.040 --> 00:29:08.320] And, you know, the bottom line is this: everything was easy.
[00:29:08.320 --> 00:29:09.920] The service is excellent.
[00:29:09.920 --> 00:29:11.440] And there was zero hassle.
[00:29:11.680 --> 00:29:12.960] My cell service is good.
[00:29:12.960 --> 00:29:14.080] Everything is good.
[00:29:14.080 --> 00:29:16.320] And I admit it, I do like Ryan Reynolds.
[00:29:16.320 --> 00:29:16.640] I do.
[00:29:16.640 --> 00:29:17.360] I like him.
[00:29:17.360 --> 00:29:20.720] To get started, go to mintmobile.com/SGU.
[00:29:20.720 --> 00:29:27.680] There you'll see that right now, all three-month plans are only $15 a month, as Jay said, including the unlimited plan.
[00:29:27.680 --> 00:29:34.240] All plans come with high-speed data and unlimited talk and text delivered on the nation's largest 5G network.
[00:29:34.240 --> 00:29:40.720] You can use your own phone with any Mint Mobile plan and bring your phone number along with all your existing contacts.
[00:29:40.720 --> 00:29:47.840] Find out how easy it is to switch to Mint Mobile and get three months of premium wireless service for $15 bucks a month.
[00:29:47.840 --> 00:29:56.080] To get this new customer offer and your new three-month premium wireless plan for just $15 a month, go to mintmobile.com/SGU.
[00:29:56.080 --> 00:29:58.560] That's mintmobile.com/SGU.
[00:29:58.560 --> 00:30:01.800] $45 upfront payment required, equivalent to $15 a month.
[00:30:01.800 --> 00:30:03.880] New customers on first three-month plan only.
[00:29:59.920 --> 00:30:06.280] Speeds slower above 40GB on unlimited plan.
[00:30:06.440 --> 00:30:08.440] Additional taxes, fees, and restrictions apply.
[00:30:08.440 --> 00:30:09.800] See Mintmobile for details.
[00:30:09.800 --> 00:30:12.040] All right, guys, let's get back to the show.
[00:30:12.680 --> 00:30:19.480] All right, Jay, tell us how we get gold from earthquakes, another earthquake-related item.
[00:30:19.480 --> 00:30:20.920] Do you love gold, Jay?
[00:30:20.920 --> 00:30:22.280] I love gold.
[00:30:22.280 --> 00:30:23.080] Gold!
[00:30:23.400 --> 00:30:27.000] I can't hear that word and not say that in my head.
[00:30:27.000 --> 00:30:33.400] This one really surprised me, and I learned a lot of things that I didn't know, and I was wrong about things.
[00:30:33.400 --> 00:30:34.680] So pay attention.
[00:30:34.680 --> 00:30:35.320] This is really cool.
[00:30:35.320 --> 00:30:40.280] So, a new study published in Nature Geoscience, this was on September 2nd of this year.
[00:30:40.280 --> 00:30:46.600] It proposes a pretty interesting explanation for how large gold nuggets form in quartz veins.
[00:30:46.600 --> 00:30:49.000] This has been a long-standing mystery in geology.
[00:30:49.000 --> 00:30:56.920] So, my first thing that I got wrong was I thought that gold was made in stars and that it was just in outer space, right?
[00:30:56.920 --> 00:30:59.240] I mean, that's where gold originates, correct, Steve?
[00:30:59.720 --> 00:31:01.080] Kilonovas, too.
[00:31:01.080 --> 00:31:04.600] Yeah, it's a heavier element.
[00:31:04.600 --> 00:31:12.840] So, basically, anything heavier than iron is not cooked up in the cores of stars just from fusion, right?
[00:31:13.160 --> 00:31:14.760] Because you can only get to iron.
[00:31:14.760 --> 00:31:20.040] Anything heavier, you know, there's a discussion among astronomers, like what are the processes that could do it?
[00:31:20.040 --> 00:31:25.480] So, supernova can do it, but there, but there's kilonovas as well.
[00:31:25.480 --> 00:31:32.120] Yeah, kilonovas, and I think also, just like in neutron stars or something, Bob, what am I remembering about an item from not too long ago?
[00:31:32.200 --> 00:31:33.560] No, that's well, that's what a kilonova is.
[00:31:33.720 --> 00:31:34.280] That's the kilonova.
[00:31:34.360 --> 00:31:35.360] Colliding neutron star.
[00:31:35.520 --> 00:31:36.440] Colliding neutron stars.
[00:31:36.520 --> 00:31:38.040] Yeah, that's what I was thinking of.
[00:31:38.280 --> 00:31:45.040] Yeah, but so you need something, some really violent, very energetic, powerful situation in order to generate the heavier elements.
[00:31:44.760 --> 00:31:48.080] And they seed the gas, interstellar gas clouds.
[00:31:48.320 --> 00:31:55.600] They talk about the metallicity of those clouds, depending on how much they've been seeded by supernova and kilonova.
[00:31:55.600 --> 00:32:04.960] And then when a new planet forms, a new system forms, that's where you have a whole bunch of heavier elements and they sort of clump together in asteroids and planets or whatever.
[00:32:04.960 --> 00:32:05.200] Right.
[00:32:05.200 --> 00:32:10.560] So the takeaway here is that the gold that exists on Earth has always been here.
[00:32:10.560 --> 00:32:12.880] It was here when Earth formed.
[00:32:12.880 --> 00:32:13.280] Right.
[00:32:13.280 --> 00:32:14.960] Or from later bombardment.
[00:32:14.960 --> 00:32:21.040] So it exists in lots of different places all around the planet at different depths and everything.
[00:32:21.040 --> 00:32:25.920] It's just kind of cooked in to all the regolith on Earth, right?
[00:32:25.920 --> 00:32:26.560] Okay.
[00:32:26.560 --> 00:32:35.440] So we're talking about like these quartz veins and why does gold accumulate on quartz veins?
[00:32:35.440 --> 00:32:37.200] Let me get into the details.
[00:32:37.440 --> 00:32:45.680] So when they say here that there's an explanation of why this is happening, and they say the word large gold nuggets, and I wanted to know what the hell that was.
[00:32:45.680 --> 00:32:50.160] So a large nugget would be up to hundreds of kilograms.
[00:32:50.160 --> 00:32:57.920] Specifically, the largest orogenic gold nuggets found can weigh around 60 kilograms.
[00:32:57.920 --> 00:32:59.520] That's 130 pounds.
[00:32:59.520 --> 00:33:10.320] The research suggests that earthquakes might create electric charges in quartz, causing free-floating gold particles to clump together, eventually resulting in these sizable nuggets.
[00:33:10.320 --> 00:33:12.960] This is really, I just think this is so cool, right?
[00:33:12.960 --> 00:33:17.680] Like there's a thing happening, and gold is collecting for a very specific reason here.
[00:33:17.680 --> 00:33:27.080] So, in quartz veins, right, we're talking about quartz as it as it has developed inside the Earth, where most significant gold deposits are found.
[00:33:27.080 --> 00:33:35.560] Typically, they form due to gold-rich hydrothermal fluids that are seeping through the cracks caused by earthquakes, right?
[00:33:29.680 --> 00:33:36.920] So, visualize that.
[00:33:37.240 --> 00:33:49.240] The earthquake happens, there's these super hot hydrothermal fluids that come flowing out of these cracks, they go into the soil, they meet up with a quartz vein, and something happens, right?
[00:33:49.240 --> 00:34:02.200] So, because gold isn't particularly soluble, traditional models would require impractical amounts of fluid to create a large nugget, meaning there'd have to be a ton of this hydrothermal fluid to make a certain amount of gold.
[00:34:02.200 --> 00:34:03.800] So, I tried to look it up and I figured it out.
[00:34:03.800 --> 00:34:14.200] So, for instance, to form a one-kilogram nugget, which is about the size of a standard chocolate bar, you'd need the equivalent of five Olympic swimming pools of hydrothermal fluid.
[00:34:14.200 --> 00:34:19.080] And that is considered to be very unrealistic because that's a lot of that fluid, right?
[00:34:19.080 --> 00:34:27.720] Now, hydrothermal fluid, you have to realize, has a higher gold density than just water that is out and you know that you would interact with regularly in your life.
[00:34:27.720 --> 00:34:35.800] Hydrothermal fluid has a lot more density of lots of different things, but there happens to be a lot of gold particles in there.
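As a rough sanity check on that swimming-pool figure (assuming about 2,500 m³ per Olympic pool and that essentially all of the dissolved gold ends up in the nugget):

```latex
\frac{1\ \text{kg Au}}{5 \times 2{,}500\ \text{m}^3 \times 1000\ \text{kg/m}^3}
= \frac{1\ \text{kg}}{1.25\times10^{7}\ \text{kg}}
\approx 8\times10^{-8}
\approx 0.08\ \text{mg of gold per kg of fluid}
```

That is on the order of tens of parts per billion of dissolved gold, which is why the traditional model needs such an enormous volume of fluid to pass through a single vein.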
[00:34:35.800 --> 00:34:47.480] So, this guy named Christopher Voisey, who's a geoscientist from Monash University, he and his team theorized that the answer might lie in the piezoelectric properties of quartz.
[00:34:47.480 --> 00:34:49.000] This is something else I didn't know.
[00:34:49.000 --> 00:34:57.960] I didn't know that. I know that quartz watches vibrate the crystal, and I know that if you pass electricity through quartz, it vibrates.
[00:34:57.960 --> 00:35:00.520] I didn't realize that it had a piezoelectric property to it.
[00:35:00.600 --> 00:35:01.320] It's like piezoelectric.
[00:35:01.400 --> 00:35:02.200] Piezo, piezo.
[00:35:02.200 --> 00:35:05.560] I would say piezio because it reminds me of pizza, and I'm hungry.
[00:35:05.880 --> 00:35:08.360] It is piezoelectric, Steve.
[00:35:08.360 --> 00:35:10.680] You got to get these words straight.
[00:35:10.680 --> 00:35:13.880] When quartz is mechanically stressed, what does that mean?
[00:35:13.880 --> 00:35:19.280] Typically, meaning that it's coming from pressure that is created by earthquakes.
[00:35:14.360 --> 00:35:22.160] So that pressure that's created by earthquakes, right?
[00:35:22.160 --> 00:35:29.120] When the tectonic plates hit each other and the earthquakes happen, it generates an electric charge in the quartz.
[00:35:29.120 --> 00:35:35.840] The charge could cause gold ions in the surrounding fluid to gain electrons and then they start to form solid gold, right?
[00:35:35.840 --> 00:35:38.160] They're not, they start to chunk up.
[00:35:38.160 --> 00:35:47.840] So once the gold starts solidifying, it in and of itself acts as a conductor within the electric field that is being generated by the earthquakes.
[00:35:47.840 --> 00:35:54.960] And that chunk of gold that has formed, it draws even more gold, gold ions to the same location.
[00:35:54.960 --> 00:35:59.280] And this allows the particles to clump together and get bigger and bigger and bigger over time.
[00:35:59.280 --> 00:36:02.160] Lots of time, but it happens.
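A toy numerical illustration of that runaway feedback (purely schematic; the per-event deposition amount and the feedback fraction are invented numbers, not values from the study):

```python
# Schematic positive-feedback model of nugget growth: each seismic event deposits
# a small baseline amount of gold plus an amount proportional to the gold already
# present (the existing nugget acting as a conductor). All rates are invented.
baseline_per_event = 1e-6   # grams deposited per event with no nugget present (assumed)
feedback_fraction  = 0.002  # extra growth per event, proportional to current mass (assumed)

mass = 0.0  # grams of gold in the growing nugget
for event in range(1, 5001):            # thousands of small quakes over geologic time
    mass += baseline_per_event + feedback_fraction * mass
    if event % 1000 == 0:
        print(f"after {event:5d} events: {mass:.4f} g")
```

Because each event adds an amount proportional to the gold already present, growth compounds, slow at first and then fast.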
[00:36:02.160 --> 00:36:16.320] So to test the idea, the team simulated seismic activity in the lab by applying mechanical stress to quartz submerged in a gold-rich solution, which is kind of like the hydrothermal fluids that I was talking about.
[00:36:16.320 --> 00:36:21.360] So they found that even moderate stress caused gold to accumulate on the quartz.
[00:36:21.440 --> 00:36:22.000] They saw it.
[00:36:22.000 --> 00:36:31.520] It happened in the lab, supporting their theory that this effect in quartz could very well be responsible for nugget formation.
[00:36:31.520 --> 00:36:36.720] So over time, the solid gold would strengthen the electric reaction and this speeds up the process.
[00:36:36.720 --> 00:36:39.040] So it's kind of like your retirement.
[00:36:39.040 --> 00:36:43.600] The longer it's there and the bigger it gets, the bigger it gets and it gets bigger faster.
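A toy sketch of that positive-feedback growth, assuming (purely for illustration) that each seismic event deposits gold in proportion to the mass already sitting there; the seed mass and rate constant are invented and have no geological meaning:

# Once solid gold is present it conducts, so in this toy model each
# "event" deposits an amount proportional to the existing mass.
def grow_nugget(seed_mass_g, events, rate=0.01):
    mass = seed_mass_g
    for _ in range(events):
        mass += rate * mass  # deposition scales with the gold already there
    return mass

print(f"{grow_nugget(1e-6, 1000):.3f} g after 1,000 events")  # roughly 0.02 g from a microgram seed
print(f"{grow_nugget(1e-6, 2000):.3f} g after 2,000 events")  # hundreds of grams: growth accelerates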
[00:36:43.600 --> 00:36:52.480] So while Voisey's findings have been thought of as very promising, of course, this is a perfect example of the scientific method.
[00:36:52.480 --> 00:36:59.880] Another scientist read his research and weighed in. This guy's name is Thomas Blenkinsop.
[00:36:59.880 --> 00:37:01.800] Blankensop, that's his name.
[00:37:01.800 --> 00:37:03.800] You know, it's like I get the hard names, Kara.
[00:36:57.760 --> 00:37:05.240] You get Jack Smith, and I get Thomas Blenkinsop.
[00:37:06.440 --> 00:37:08.360] I also love you, like thinking of Blankensop.
[00:37:08.360 --> 00:37:09.800] Blankensop, that's a name.
[00:37:09.800 --> 00:37:10.200] Yep.
[00:37:10.360 --> 00:37:12.600] Yes, I mean, I've never read it before.
[00:37:12.600 --> 00:37:14.040] You know, it's interesting.
[00:37:14.040 --> 00:37:16.200] I have no idea what nationality that name is.
[00:37:16.200 --> 00:37:25.240] So he is a structural geologist at Cardiff University, and he pointed out that more research is needed to confirm if the process occurs in natural settings, right?
[00:37:25.240 --> 00:37:29.320] Again, this is the scientific method in, you know, live in front of your eyes.
[00:37:29.320 --> 00:37:37.320] He challenges that for this electric effect to work, the quartz veins need to be oriented in a way that generates the necessary electric charge.
[00:37:37.320 --> 00:37:41.240] And it's not clear if gold deposits are found in these ideal orientations.
[00:37:41.240 --> 00:37:49.320] And right out of the gate, another geologist comes in with a really interesting and semi-hard to understand situation.
[00:37:49.320 --> 00:37:59.880] Like, no, the quartz has to actually be in this perfect orientation in order for this to happen, which I could not find a deeper explanation on, but it must be out there somewhere.
[00:37:59.880 --> 00:38:09.480] So Voisey also suggests that this electric effect might be just one piece of the puzzle, and that there's probably other factors that contribute to how these nuggets form.
[00:38:09.480 --> 00:38:17.560] And as long as people want gold, right, as long as we're out there for the hunt for gold, the scientists will still be paid to continue to do research like this.
[00:38:17.560 --> 00:38:21.560] Now, this study is really awesome for lots of reasons.
[00:38:21.560 --> 00:38:27.320] Because it could lead to interesting changes in how gold mining and gold exploration are done, right?
[00:38:27.320 --> 00:38:37.560] If this effect is proven to be the way it works, then they will absolutely change what they look for and they'll sharpen how they look for places where gold deposits might be.
[00:38:37.560 --> 00:38:49.920] Now, this effect that quartz has, if it's confirmed in real-world deposits, geologists would obviously start focusing on areas around the world where the quartz is oriented the right way.
[00:38:49.920 --> 00:38:51.360] Like that's going to be the key indicator.
[00:38:51.360 --> 00:38:54.000] They're going to look for giant quartz deposits.
[00:38:54.000 --> 00:38:59.840] And if it's oriented the right way, blah, blah, blah, then they could find a lot more gold, a lot more than typical.
[00:38:59.840 --> 00:39:07.040] So the lab replication of this process also suggests that gold could be synthesized in controlled conditions.
[00:39:07.040 --> 00:39:15.200] And this is a long way off, but it is very possible that they've found a way to collect gold better than any other way we have right now.
[00:39:15.200 --> 00:39:17.200] This discovery also does one other thing.
[00:39:17.200 --> 00:39:23.760] It opens the door to possibly finding, first off, we have to figure out if there's more complex mechanisms at play in the nugget formation, right?
[00:39:23.760 --> 00:39:25.200] We'll figure all that out.
[00:39:25.200 --> 00:39:34.080] And then once we start to really wrap our head around all that, we might be able to find other minerals that might be under similar circumstances.
[00:39:34.080 --> 00:39:42.640] So a thing that occurred to me as I was reading this was like, you know, if you just handed me a piece of quartz that had a bunch of gold hanging from it, I would say thank you and I'd go to the bank.
[00:39:42.640 --> 00:39:47.280] But before I did that, I would think, you know, how did the gold get there?
[00:39:47.280 --> 00:39:48.640] Why is it mixed in?
[00:39:48.640 --> 00:39:54.400] You know, was it because the quartz, like, developed in the Earth's crust, right?
[00:39:54.400 --> 00:39:57.120] The quartz is from mineral deposits and stuff like that.
[00:39:57.120 --> 00:39:59.360] But then why would gold be all interlaced in it?
[00:40:00.000 --> 00:40:00.960] I've seen this.
[00:40:00.960 --> 00:40:06.640] I've seen pictures of this before and never really understood the process or why it would come to be that way.
[00:40:06.640 --> 00:40:17.200] And again, usually the answer is much more complicated than you think, because I originally just thought it was just when the Earth formed and things kind of smashed together and the gold was there, and there's big chunks of gold.
[00:40:17.200 --> 00:40:21.680] No, there probably wasn't ever really big chunks of gold just sitting around.
[00:40:21.680 --> 00:40:25.760] They probably formed, which I find that to be fascinating, right?
[00:40:25.760 --> 00:40:32.840] Because there's so much stuff happening underground that we don't realize, especially if you're not a geologist and you aren't into that type of thing.
[00:40:29.680 --> 00:40:40.920] And yes, it takes a very long time, but those processes happen, and that's why you get all of these different types of formations of different minerals and stuff around the world.
[00:40:40.920 --> 00:40:43.160] There's reasons why these things happen.
[00:40:43.160 --> 00:40:50.680] So, anyway, I would love, I would absolutely love to find myself a giant piece of quartz with a whole bunch of gold on it.
[00:40:50.680 --> 00:40:52.040] Would that be an amazing find?
[00:40:52.040 --> 00:40:53.800] Imagine if you just could get your hands on that.
[00:40:53.800 --> 00:40:57.080] You ever watch those guys that, you know, they do the old panning?
[00:40:57.320 --> 00:41:03.800] They have like this bench and they have a whole bunch of water coming down the ramp, and there's all these little catch pockets in there.
[00:41:03.800 --> 00:41:10.040] And at the end of the day, after an entire day of, you know, they're putting in soil that is rich in gold.
[00:41:10.040 --> 00:41:11.640] Like they found soil that's rich in gold.
[00:41:11.640 --> 00:41:16.840] They do this panning process, and it's really advanced, like way, way more advanced than 100 years ago.
[00:41:16.840 --> 00:41:20.840] And they end up with literally like, I mean, guys, a thimble full.
[00:41:21.160 --> 00:41:23.000] Not even, no, not even, not even.
[00:41:23.000 --> 00:41:27.720] It's like a tiny little, like if you took a pinch of salt and dropped it in your hand, it'd be like half that.
[00:41:27.720 --> 00:41:29.560] And they're like, yeah, we really hit it today.
[00:41:29.560 --> 00:41:30.440] No, you didn't.
[00:41:30.440 --> 00:41:32.680] Like, oh my God, it's almost worth nothing.
[00:41:32.680 --> 00:41:35.240] Like, it's really hard to collect gold.
[00:41:35.240 --> 00:41:38.440] And the current processes that we have are super expensive.
[00:41:38.440 --> 00:41:50.440] They have to put a ton of soil into these reactors that do all these processes and tons of chemicals and all sorts of shit that they do to extract the gold and to change it from one form to another and all this craziness.
[00:41:50.440 --> 00:41:53.400] This thing right here could be a really big find for gold.
[00:41:53.400 --> 00:41:55.640] What do you mean, change it from one form to another?
[00:41:55.640 --> 00:41:57.480] Well, if you talk about alchemy.
[00:41:59.400 --> 00:42:06.360] I am talking about the chemistry of extracting gold in the soil because they have, if you looked, I watched videos of this.
[00:42:06.440 --> 00:42:09.160] So, maybe they need something to make it stick to something else.
[00:42:09.160 --> 00:42:10.520] Do they use cyanide for that?
[00:42:10.520 --> 00:42:11.640] Yes, they use cyanide.
[00:42:11.640 --> 00:42:23.920] They use all these, but they use different chemicals, and you see like the process as it goes, and you're like, okay, they got to do this, then they got to heat this thing up, and they get rid of all this stuff, and the stuff that's left over, then they mix it with this thing, and then they got to stir it for like 20 hours, and then they got to boil it again.
[00:42:24.000 --> 00:42:24.880] Then they got, you know what I mean?
[00:42:24.880 --> 00:42:31.760] Like, Kara, it's this ridiculous process, and at the end, yeah, they come out with gold, but like a thousand people worked on that.
[00:42:31.760 --> 00:42:34.080] Yeah, it doesn't seem worth it.
[00:42:34.080 --> 00:42:44.080] And it does turn a profit, but you know, still, like, imagine if this could really like make a really big difference with us finding gold.
[00:42:44.080 --> 00:42:46.560] And, you know, gold is ridiculously useful.
[00:42:47.280 --> 00:42:55.920] Imagine if gold became a lot more common and we were able to use it for a lot more things, it would be very helpful for future technology if that were the case.
[00:42:55.920 --> 00:43:00.880] Yeah, it's interesting to think of why certain minerals clump where they do.
[00:43:00.880 --> 00:43:07.920] One reason is like with gold, gold is one of the metals that's found in its pure form, not as an alloy, right?
[00:43:07.920 --> 00:43:09.040] I mean, not as an ore.
[00:43:09.040 --> 00:43:19.040] We don't, it's not like an iron ore where it's bound to chemically to some other substance, and we have to then figure, you know, we have to smelt it in order to purify it as iron.
[00:43:19.200 --> 00:43:20.880] When you find gold, it's just gold, right?
[00:43:20.880 --> 00:43:22.560] It's just pure gold.
[00:43:22.560 --> 00:43:26.880] And that's because it already has been purified in a way.
[00:43:26.880 --> 00:43:30.400] It clumps together through these reactions.
[00:43:30.400 --> 00:43:36.080] And you find it in veins and nuggets and everything because of the geological processes that form it.
[00:43:36.400 --> 00:43:38.720] Steve, we do smelt for gold, though.
[00:43:39.040 --> 00:43:43.680] Well, what I'm saying is, like, when you find a nugget of gold, that's gold, right?
[00:43:43.680 --> 00:43:47.920] It's not, it's not chemically anything else, it's just gold, right?
[00:43:47.920 --> 00:43:53.120] As opposed to, like, when you get iron ore, it's not iron, it's iron bound to something else.
[00:43:53.120 --> 00:43:53.760] You know what I'm saying?
[00:43:53.760 --> 00:43:54.240] That's the difference.
[00:43:54.400 --> 00:43:55.680] It's an ore, it's not the metal.
[00:43:56.160 --> 00:43:56.560] Yeah, you're right.
[00:43:56.560 --> 00:44:00.440] So the gold, you mean that the gold is already there in its complete form.
[00:44:00.440 --> 00:44:00.920] It's gold.
[00:43:59.920 --> 00:44:02.360] We just have to get it out.
[00:44:02.840 --> 00:44:08.840] Chemically, it's elemental gold, as opposed to, you know, sometimes it is bound to other things.
[00:44:08.840 --> 00:44:12.920] You know, gold can form chemical bonds, but you do find nuggets of pure gold.
[00:44:12.920 --> 00:44:15.320] Silver is another one that exists like that.
[00:44:15.320 --> 00:44:23.080] Whereas with iron and copper and those, you're finding ores that you then have to process. Aluminum has to go through that process too.
[00:44:23.320 --> 00:44:24.360] Aluminum is like in the dirt.
[00:44:24.360 --> 00:44:27.080] Yeah, it's not like in clumps or nuggets or anything.
[00:44:27.080 --> 00:44:34.440] But like with iron, what's interesting is that sometimes the reason why you find a deposit is because a meteor landed there.
[00:44:34.440 --> 00:44:36.440] Like an iron meteor crashed there.
[00:44:36.440 --> 00:44:36.840] Yeah, yeah, yeah.
[00:44:37.000 --> 00:44:38.360] Those things are impressive.
[00:44:38.360 --> 00:44:40.280] And then they make samurai sword, legit.
[00:44:40.360 --> 00:44:42.600] They make samurai swords out of that iron.
[00:44:42.600 --> 00:44:47.160] So you're, you're like, not you, Steve, because I wish I had the money to buy you that sword.
[00:44:47.160 --> 00:44:51.240] But you could buy a sword made out of, you know, iron from outer space.
[00:44:51.800 --> 00:44:54.600] And I think that is like about as cool as it gets, right?
[00:44:54.600 --> 00:44:56.040] Like, oh my God.
[00:44:56.040 --> 00:45:04.280] Yeah, although, you know, yes, swords forged from meteoric iron are not really very functional, right?
[00:45:04.280 --> 00:45:07.400] Because it's not, unless you make steel out of it.
[00:45:07.400 --> 00:45:11.880] But then you're kind of getting rid of the meteoric form of the iron.
[00:45:11.880 --> 00:45:19.160] It would be cool to have, you know, how like the iron meteorites have that, it's like iron and nickel in a crystalline pattern.
[00:45:19.160 --> 00:45:20.360] That blue tint, yeah.
[00:45:20.360 --> 00:45:26.120] It's not just the blue, it has a specific pattern to it, like when they slice it and then polish it.
[00:45:26.120 --> 00:45:28.920] The Widmanstätten pattern, did you guys ever hear that word?
[00:45:28.920 --> 00:45:29.880] I know all about it.
[00:45:29.880 --> 00:45:31.000] Oh, that one?
[00:45:31.000 --> 00:45:33.080] It's an iron-nickel structure.
[00:45:33.080 --> 00:45:33.640] Anyway.
[00:45:33.640 --> 00:45:34.120] Yeah, it's cool.
[00:45:34.440 --> 00:45:35.960] Let's move on.
[00:45:36.280 --> 00:45:39.480] Evan, do I have plastic in my brain?
[00:45:39.480 --> 00:45:40.600] Oh, my gosh.
[00:45:40.600 --> 00:45:42.440] Yeah, plastic and the brain.
[00:45:42.440 --> 00:45:44.120] Isn't that interesting?
[00:45:44.120 --> 00:45:53.760] That you mentioned those two words, Steve, because you know, all of all the things I've learned in my years of skepticism are the association of the words brain and plastic, right?
[00:45:53.760 --> 00:45:55.280] We've talked about this before, right?
[00:45:55.760 --> 00:45:58.880] Brain plasticity or neuroplasticity.
[00:45:58.880 --> 00:45:59.440] Oh, that's funny.
[00:45:59.600 --> 00:46:00.800] Different kind of plastic, yeah.
[00:46:00.800 --> 00:46:03.440] I know, it comes up, but we've talked about it before.
[00:46:03.440 --> 00:46:06.640] You know, that's the brain's ability to change and adapt over a person's lifetime.
[00:46:06.720 --> 00:46:16.880] That's funny because I was talking one time, like when my daughter was like six or something, I'm talking to my wife about neuroscience, and I mentioned that the brain is plastic.
[00:46:16.880 --> 00:46:21.440] And my daughter, who was listening in the back seat, is like, the brain is plastic?
[00:46:22.640 --> 00:46:26.320] And I explained I was talking about a different kind of plastic.
[00:46:26.320 --> 00:46:27.040] Right.
[00:46:28.320 --> 00:46:37.120] And so whenever a news item comes along and the words brain and plastic are there, well, that's where my mind immediately goes.
[00:46:37.120 --> 00:46:38.000] First thought.
[00:46:38.000 --> 00:46:40.160] Is that a heuristic of some kind, right?
[00:46:40.160 --> 00:46:41.760] Am I utilizing a shortcut?
[00:46:42.720 --> 00:46:50.080] So, but it's throwing me off because here's a news item with brain and plastic, but it's not about neuroplasticity.
[00:46:50.080 --> 00:46:53.280] It's about microplastics found in the human brain.
[00:46:54.560 --> 00:46:55.120] Yeah.
[00:46:56.000 --> 00:46:58.720] And more, you know, nanoplastics.
[00:46:58.720 --> 00:46:59.120] Oh, boy.
[00:46:59.840 --> 00:47:00.720] There you go.
[00:47:00.720 --> 00:47:01.120] Yeah.
[00:47:01.120 --> 00:47:04.480] Nanoplastics are less than a micron.
[00:47:04.480 --> 00:47:07.280] That's one one-thousandth of a millimeter.
[00:47:07.280 --> 00:47:14.560] Microplastics are just above that, from one thousandth of a millimeter up to five millimeters.
[00:47:14.560 --> 00:47:15.280] Whatever.
[00:47:15.440 --> 00:47:17.680] They're just throwing nano around, I think.
[00:47:18.800 --> 00:47:19.520] Well, yeah, but no.
[00:47:19.680 --> 00:47:25.520] A thousandth of a millimeter is a micrometer, and a thousandth of a micrometer is a nanometer.
[00:47:25.520 --> 00:47:26.800] Right, the micron, right?
[00:47:26.800 --> 00:47:28.560] One thousandth of a millimeter is a micron.
[00:47:28.880 --> 00:47:32.040] So, nanoplastics are less than a micron by definition.
[00:47:29.840 --> 00:47:33.720] Three orders of magnitude.
[00:47:35.720 --> 00:47:37.400] Wait, what's right under micro, guys?
[00:47:37.400 --> 00:47:38.680] Like, look at the actual thing.
[00:47:38.680 --> 00:47:40.600] Is it something people would recognize?
[00:47:40.600 --> 00:47:43.160] What's the prefix that's just underneath micro?
[00:47:43.160 --> 00:47:44.760] It's nano.
[00:47:45.080 --> 00:47:46.840] Yeah, then that's why they're nano.
[00:47:47.640 --> 00:47:49.080] Micro is a millionth, right?
[00:47:49.080 --> 00:47:50.600] And nano is a billionth.
[00:47:50.600 --> 00:47:51.400] Yeah.
[00:47:51.720 --> 00:47:53.560] It's milli, micro, nano.
[00:47:53.560 --> 00:47:58.520] Yeah, so they're saying underneath the cutoff for something that is micro, they're just jumping.
[00:47:58.520 --> 00:48:04.040] They're calling nano anything between nano and micro, and they're calling micro anything between micro and milli.
[00:48:04.040 --> 00:48:04.440] Right.
[00:48:04.440 --> 00:48:04.840] Yeah.
[00:48:04.840 --> 00:48:06.440] So I don't like that.
[00:48:06.440 --> 00:48:13.800] Well, I know that when you're talking about like nanotechnology and nanomaterials, they use 100 nanometers as the cutoff.
[00:48:13.800 --> 00:48:20.280] So like if you have one dimension less than 100 nanometers, that's a nano sheet.
[00:48:20.280 --> 00:48:23.880] If you have two dimensions, then that's a nano fiber.
[00:48:23.880 --> 00:48:26.040] If it's rolled up, then it's a nanotube.
[00:48:26.040 --> 00:48:29.000] If you have all three dimensions, then it's a nano dot.
[00:48:29.880 --> 00:48:33.400] But they use the 100 nanometers as the cutoff for nano.
[00:48:33.400 --> 00:48:33.960] That's reasonable.
[00:48:33.960 --> 00:48:35.000] Yeah, I mean, I guess.
[00:48:35.320 --> 00:48:38.040] I think it just depends on what materials you're using.
[00:48:38.040 --> 00:48:40.600] And also, it depends on how you want to look at the number line, right?
[00:48:40.600 --> 00:48:45.960] Because you would talk about kilograms as anything between kilo and mega, and then you would move to megagrams.
[00:48:45.960 --> 00:48:47.640] But you're going up in the number line.
[00:48:47.640 --> 00:48:53.480] So you guys are assuming you should go down in the number line, but what if you always go up in the number line, even if you're on the other side of it?
[00:48:53.480 --> 00:48:55.160] Does that make sense, what I just said?
[00:48:55.160 --> 00:48:55.560] I think so.
[00:48:57.240 --> 00:49:03.800] Anything from 10 to the negative 9 to 10 to the negative 6, why would we call that micro and not nano?
[00:49:03.800 --> 00:49:08.360] Because you're saying you should move away from the number line, whether it's in the positive or the negative.
[00:49:08.360 --> 00:49:09.800] Or sorry, whether it's, yeah, in the bigger.
[00:49:09.960 --> 00:49:12.760] They just decided, though, it was 100 nanometers.
[00:49:13.080 --> 00:49:13.480] You're right.
[00:49:13.960 --> 00:49:14.280] Exactly.
[00:49:14.280 --> 00:49:14.800] These are all different.
[00:49:15.120 --> 00:49:15.600] They just decided.
[00:49:16.480 --> 00:49:18.080] Had to make a call here of some kind.
[00:49:14.760 --> 00:49:20.800] But microplastics and nanoplastics.
[00:49:21.360 --> 00:49:27.040] These terms are often used, I think, interchangeably, but they each have a specific size.
[00:49:27.040 --> 00:49:34.720] And a microplastic can be up to five millimeters, which is what, the size of a grain of rice or so?
[00:49:34.720 --> 00:49:35.120] Yeah, yeah.
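A minimal sketch of the size convention being described, assuming the common working cutoffs mentioned here (under 1 micrometer counts as a nanoplastic, 1 micrometer up to 5 millimeters counts as a microplastic); these cutoffs are the usual convention rather than a universal standard:

# Classify a plastic particle by size, using the cutoffs discussed above.
def classify_plastic(size_m):
    if size_m < 1e-6:          # below one micrometer
        return "nanoplastic"
    elif size_m <= 5e-3:       # one micrometer up to five millimeters
        return "microplastic"
    return "larger debris"

print(classify_plastic(200e-9))  # a 200 nm shard: nanoplastic
print(classify_plastic(3e-3))    # a 3 mm fragment: microplastic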
[00:49:35.440 --> 00:49:36.560] So we're talking about the brain.
[00:49:36.560 --> 00:49:37.280] We're talking about what?
[00:49:37.280 --> 00:49:38.880] Microplastics and nanoplastics?
[00:49:38.880 --> 00:49:39.200] And I'm thinking about it.
[00:49:39.520 --> 00:49:41.360] Because nobody's going to call it a milliplastic.
[00:49:41.600 --> 00:49:42.480] Gosh, right?
[00:49:42.480 --> 00:49:42.880] Yeah.
[00:49:42.880 --> 00:49:43.120] Right?
[00:49:43.360 --> 00:49:48.160] But are doctors finding plastic shards the size of rice grains in people's brains?
[00:49:48.640 --> 00:49:51.120] Is that what this news item is about to tell me?
[00:49:51.280 --> 00:49:52.960] How horrifying would that be?
[00:49:52.960 --> 00:49:55.280] Most doctors aren't just digging around in people's brains.
[00:49:56.000 --> 00:49:57.760] Well, I mean, it would be an autopsy.
[00:49:58.400 --> 00:49:59.040] Yeah, it would be a plastic.
[00:49:59.200 --> 00:49:59.760] Yeah, that's right.
[00:50:00.000 --> 00:50:01.200] And that's what this is.
[00:50:01.520 --> 00:50:04.480] And CNN, they did a good job of covering this one.
[00:50:05.040 --> 00:50:11.360] There's their headline: Tiny shards of plastic are increasingly infiltrating our brains, a study says.
[00:50:11.360 --> 00:50:17.200] And the author of the article is Sandy Lamotte, L-A-M-O-T-T.
[00:50:17.200 --> 00:50:18.480] And here's the first line.
[00:50:18.480 --> 00:50:20.400] I like how she starts this.
[00:50:20.400 --> 00:50:32.400] Human brain samples collected at autopsy in early 2024 contained more tiny shards of plastic than samples collected eight years prior, according to a preprint posted online in May.
[00:50:32.400 --> 00:50:38.880] And then she writes, a preprint is a study which has not yet been peer-reviewed and published in a journal.
[00:50:38.880 --> 00:50:41.760] So, first of all, thanks for saying that right from the top.
[00:50:41.760 --> 00:50:44.160] That's a preprint article.
[00:50:44.160 --> 00:50:46.720] And what a preprint article is.
[00:50:46.720 --> 00:50:49.520] We've talked about preprinted articles before, right?
[00:50:49.520 --> 00:50:52.200] And we've cited a lot of the benefits of those.
[00:50:52.000 --> 00:50:52.360] Right.
[00:50:52.800 --> 00:50:55.040] And the ways you should be, you know, skeptical of them too.
[00:50:55.040 --> 00:50:55.520] Yeah, cautious.
[00:50:55.680 --> 00:50:57.200] Right, you should be cautious, right?
[00:50:57.240 --> 00:51:01.640] And but it's important to remember the preprint studies, they've not been pee
Prompt 2: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 3: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
Prompt 4: Media Mentions
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
Prompt 5: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 2 of 2 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
0:49:48.640 --> 00:49:51.120] Is that what this news item is about to tell me?
[00:49:51.280 --> 00:49:52.960] How horrifying would that be?
[00:49:52.960 --> 00:49:55.280] Most doctors aren't just digging around in people's brains.
[00:49:56.000 --> 00:49:57.760] Well, I mean, it would be an autopsy.
[00:49:58.400 --> 00:49:59.040] Yeah, it would be a plastic.
[00:49:59.200 --> 00:49:59.760] Yeah, that's right.
[00:50:00.000 --> 00:50:01.200] And that's what this is.
[00:50:01.520 --> 00:50:04.480] And CNN, they did a good job of covering this one.
[00:50:05.040 --> 00:50:11.360] There's their headline: Tiny shards of plastic are increasingly infiltrating our brains, a study says.
[00:50:11.360 --> 00:50:17.200] And the author of the article is Sandy Lamotte, L-A-M-O-T-T.
[00:50:17.200 --> 00:50:18.480] And here's the first line.
[00:50:18.480 --> 00:50:20.400] I like how she starts this.
[00:50:20.400 --> 00:50:32.400] Human brain samples collected at autopsy in early 2024 contained more tiny shards of plastic than samples collected eight years prior, according to a preprint posted online in May.
[00:50:32.400 --> 00:50:38.880] And then she writes, a preprint is a study which has not yet been peer-reviewed and published in a journal.
[00:50:38.880 --> 00:50:41.760] So, first of all, thanks for saying that right from the top.
[00:50:41.760 --> 00:50:44.160] That's a preprint article.
[00:50:44.160 --> 00:50:46.720] And what a preprint article is.
[00:50:46.720 --> 00:50:49.520] We've talked about preprinted articles before, right?
[00:50:49.520 --> 00:50:52.200] And we've cited a lot of the benefits of those.
[00:50:52.000 --> 00:50:52.360] Right.
[00:50:52.800 --> 00:50:55.040] And the ways you should be, you know, skeptical of them too.
[00:50:55.040 --> 00:50:55.520] Yeah, cautious.
[00:50:55.680 --> 00:50:57.200] Right, you should be cautious, right?
[00:50:57.240 --> 00:51:01.640] And but it's important to remember the preprint studies, they've not been peer-reviewed yet.
[00:51:01.640 --> 00:51:01.880] Exactly.
[00:50:59.680 --> 00:51:05.800] So you have to, so you have to weigh all of this appropriately.
[00:51:05.960 --> 00:51:08.520] And it was good that they kind of get that out right at the beginning.
[00:51:08.520 --> 00:51:10.200] They link right to the source.
[00:51:10.200 --> 00:51:14.600] Oh, by the way, preprints are free, no paywalls.
[00:51:14.600 --> 00:51:19.640] You can go see them, and that's another benefit of having preprints.
[00:51:19.640 --> 00:51:32.040] Here's the study: Bioaccumulation of Microplastics in decedent human brains assessed by pyrolysis gas chromatography mass spectrometry.
[00:51:32.040 --> 00:51:32.760] Good job.
[00:51:33.000 --> 00:51:33.640] Nice one.
[00:51:33.640 --> 00:51:34.040] Thank you.
[00:51:34.040 --> 00:51:34.600] Yeah.
[00:51:34.600 --> 00:51:35.160] Yeah.
[00:51:35.160 --> 00:51:36.920] Which I definitely had to do some looking at.
[00:51:37.000 --> 00:51:38.360] Evan, I don't even know how you did that.
[00:51:38.840 --> 00:51:40.440] Yeah, slowly.
[00:51:40.440 --> 00:51:41.160] Slowly.
[00:51:41.160 --> 00:51:43.160] And I practiced.
[00:51:43.480 --> 00:51:48.440] If you chunk the word, break it up, and it's easier to pronounce in bite-sized chunks rather than all of them.
[00:51:48.520 --> 00:51:49.480] Oh, is that how you do about it?
[00:51:49.560 --> 00:51:50.600] Is it gestalt?
[00:51:51.240 --> 00:51:54.680] Sometimes if it's difficult for me.
[00:51:54.680 --> 00:51:55.960] Nice use of gestalt.
[00:51:56.200 --> 00:51:56.760] I like that.
[00:51:56.760 --> 00:52:01.160] And for me, it's words like spectrometry that trip me up.
[00:52:01.160 --> 00:52:02.520] There are two R's in that word.
[00:52:03.160 --> 00:52:05.720] But it sounds like there should only be one R.
[00:52:05.720 --> 00:52:10.440] That's why I always say spec, because I always mix up spectroscopy and spectrometry.
[00:52:10.920 --> 00:52:12.440] I just say spec.
[00:52:12.760 --> 00:52:13.800] Smart and aspect.
[00:52:14.920 --> 00:52:18.680] All right, here are the highlights from the abstract and then from the conclusion.
[00:52:18.680 --> 00:52:20.200] I'll just touch on these.
[00:52:20.200 --> 00:52:20.920] Abstract.
[00:52:20.920 --> 00:52:23.720] We compared MNP accumulation.
[00:52:23.720 --> 00:52:28.360] Now, MNP means micro and nanoplastics, MNP.
[00:52:28.360 --> 00:52:32.600] We compared MNP accumulation in kidneys, livers, and brains.
[00:52:32.600 --> 00:52:38.600] These were autopsy samples collected in 2016 and in 2024.
[00:52:39.560 --> 00:52:41.960] It was an analysis of 12 polymers.
[00:52:41.960 --> 00:52:47.760] All organs exhibited significant increases from 2016 to 2024.
[00:52:44.680 --> 00:52:50.480] Polyethylene was the predominant polymer.
[00:52:50.720 --> 00:52:57.200] The polyethylene MNPs were greater in the brain samples than in the liver or kidney.
[00:52:57.200 --> 00:52:57.680] Oh my God.
[00:52:57.840 --> 00:52:58.000] Right?
[00:52:58.240 --> 00:52:59.760] Both bioaccumulating.
[00:53:00.240 --> 00:53:04.160] Yeah, for some reason it's gathering, it likes the brain.
[00:53:04.400 --> 00:53:05.520] And they verified it.
[00:53:05.520 --> 00:53:16.800] They verified the nanoscale nature of the isolated particles, which largely appeared to be aged shard-like plastics, remnants across a wide range of sizes.
[00:53:17.440 --> 00:53:18.320] Not good.
[00:53:18.320 --> 00:53:28.960] From the conclusion, MNP concentrations in decedent brain samples ranged from 7 to 30 times the concentrations seen in the livers and kidneys.
[00:53:28.960 --> 00:53:41.040] The majority of these are on the nanometer scale, so not the micro scale, the nanometer scale, the small, small pieces, but they are shard-like particles, definitely in their shape.
[00:53:41.040 --> 00:53:49.840] Lead study author Matthew Campen, who is a Regents' Professor of pharmaceutical sciences at the University of New Mexico in Albuquerque.
[00:53:49.840 --> 00:53:52.080] He had some things to say about this.
[00:53:52.080 --> 00:53:59.600] He says, somehow these nanoplastics hijack their way through the body and get to the brain, crossing the blood-brain barrier.
[00:53:59.600 --> 00:54:11.520] Plastics love fats or lipids, so one theory is that the plastics are hijacking their way with the fats we eat when they are delivered to the organs that really like lipids.
[00:54:11.520 --> 00:54:14.240] The brain is top among those.
[00:54:14.240 --> 00:54:30.600] He also says the concentrations we saw in the brain tissue of normal individuals, who had an average age of around 45 or 50 years old, were 4,800 micrograms per gram, or about half a percent by weight.
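As a quick unit check (illustrative only), 4,800 micrograms per gram does work out to roughly half a percent by weight:

micrograms_per_gram = 4800
percent_by_weight = micrograms_per_gram / 1e6 * 100  # micrograms per gram is parts per million
print(percent_by_weight)  # 0.48, i.e. roughly half a percent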
[00:54:30.760 --> 00:54:35.880] That's interesting that their average age of decedence was so young in their sample.
[00:54:29.920 --> 00:54:36.200] Yeah.
[00:54:36.520 --> 00:54:37.880] 45 to 50?
[00:54:37.880 --> 00:54:39.880] I wonder what brain bank they were using.
[00:54:40.040 --> 00:54:41.320] Yeah, why that?
[00:54:41.480 --> 00:54:43.320] Maybe they intentionally chose younger brains.
[00:54:43.320 --> 00:54:43.880] I don't know.
[00:54:43.880 --> 00:54:49.320] 92 people underwent the forensic autopsy to verify cause of death in both 2016 and 2024.
[00:54:49.320 --> 00:54:52.600] Brain tissue samples were gathered from the frontal cortex.
[00:54:53.400 --> 00:54:55.800] He went on to say, oh, that's frontal lobe.
[00:54:56.200 --> 00:54:57.560] This is cortical.
[00:54:57.560 --> 00:54:59.960] It's not like in the ventricle or something.
[00:55:00.280 --> 00:55:08.440] Because they were talking about perhaps what, a potential connection for Alzheimer's.
[00:55:08.600 --> 00:55:11.080] Yeah, we've seen lots of studies where they're trying to do it.
[00:55:11.800 --> 00:55:16.440] Again, they really, I think, more just touched on that rather than made that kind of the focus of this.
[00:55:16.600 --> 00:55:24.360] Steve, does that interest you that they were actually looking at cortical tissue and not like the lining of ventricles or like deeper structures?
[00:55:24.360 --> 00:55:25.080] Yeah, definitely.
[00:55:25.080 --> 00:55:27.000] I mean, you have to wonder how is it getting there?
[00:55:27.400 --> 00:55:27.880] Right.
[00:55:28.280 --> 00:55:37.560] Is it like, I was actually thinking, is there some process like the gold where it's like binding together from the blood and or whatever you're passing through the tissue?
[00:55:37.560 --> 00:55:39.720] I'm not really sure how it would get there.
[00:55:39.720 --> 00:55:51.560] Yeah, I guess I'd be curious, and Evan, I doubt you saw this, but like on histology, I'd be curious if it's almost like coating the brain or if it's like deep in the matrix, like deep into the cellular tissue.
[00:55:51.960 --> 00:55:57.480] And I think one of the things they said is they're unsure if the particles are fluid, right?
[00:55:57.480 --> 00:56:05.880] Kind of entering and leaving the brain or if they're collecting in the tissues and right, so are they just bathing it or are they working their way in?
[00:56:06.760 --> 00:56:14.280] So, but they definitely, the authors are the first to admit that further research is really needed here to understand what's going on.
[00:56:14.280 --> 00:56:18.240] I mean, there is just one study, it's not peer-reviewed yet, right?
[00:56:18.240 --> 00:56:23.920] So, again, this is where you have to take this a little bit with a grain of salt.
[00:56:23.920 --> 00:56:34.240] But it also makes you wonder, like, if they saw it in like all their brains and really they're just comparing concentrations, I think we're going to start seeing a lot of this research.
[00:56:34.240 --> 00:56:46.000] And they're saying that this study does not talk about brain damage or, right, what impact it's having, if any, frankly.
[00:56:46.000 --> 00:56:52.560] I mean, it must be doing something, but is it so negligible that it's not having an impact?
[00:56:52.560 --> 00:56:59.600] I mean, definitely worthy of more study, but they're saying that you can't just draw too many conclusions off of this one study yet.
[00:56:59.600 --> 00:57:01.600] A lot more has to be done here.
[00:57:01.600 --> 00:57:02.720] All right, thanks, Evan.
[00:57:02.720 --> 00:57:03.040] Yep.
[00:57:03.040 --> 00:57:06.480] All right, Bob, tell us about quantum neural networks.
[00:57:06.480 --> 00:57:07.120] Ooh.
[00:57:07.120 --> 00:57:08.000] Ooh.
[00:57:08.000 --> 00:57:09.040] Or that sounds like bullshit.
[00:57:09.200 --> 00:57:11.280] Sounds a little Deepak Chopra-y to me.
[00:57:11.440 --> 00:57:11.760] Yeah.
[00:57:11.760 --> 00:57:16.800] I was like, ooh, and then I was like, quantum will do that to you.
[00:57:17.760 --> 00:57:18.560] All right, guys.
[00:57:18.560 --> 00:57:26.720] A researcher has devised a quantum neural network that appears to interpret some optical illusions the way people see them.
[00:57:26.720 --> 00:57:28.960] That really was so fascinating.
[00:57:28.960 --> 00:57:30.800] This is by Ivan S.
[00:57:30.800 --> 00:57:34.320] Maksymov, and he published in the journal APL Machine Learning.
[00:57:34.320 --> 00:57:39.840] The title of his study is Quantum Tunneling: Deep Neural Network for Optical Illusion Recognition.
[00:57:39.840 --> 00:57:43.120] So a machine learning system falling prey to optical illusions.
[00:57:43.120 --> 00:57:47.840] It really caught my attention this week, but my buddy the devil is in the details, as they say.
[00:57:47.840 --> 00:57:49.360] So, what's going on here?
[00:57:49.600 --> 00:57:52.000] As usual, we need to set the table a bit.
[00:57:52.000 --> 00:57:56.320] So, let's start with neural networks, which we've covered here and there on the show before.
[00:57:56.320 --> 00:57:57.680] Let's do a quick overview.
[00:57:57.680 --> 00:58:05.400] Neural networks are a branch of machine learning that attempts to process data in a way that is loosely, very loosely inspired by the human brain.
[00:58:05.720 --> 00:58:12.520] It consists of a layer of interconnected artificial neuron-like elements that allow it to store and classify data.
[00:58:12.520 --> 00:58:21.080] Imagine you have a signal arriving to an artificial neuron, and it's met by this traffic cop, and that traffic cop is called an activation function.
[00:58:21.080 --> 00:58:35.240] This function looks at all the signals that are coming in from other neurons, and it applies these weights or importance to those signals, and then decides if that signal is worthy of being passed on to the next neuron or layer.
[00:58:35.240 --> 00:58:36.200] Okay, you got that?
[00:58:36.200 --> 00:58:44.600] This is the activation function, which determines whether these signals have reached a level where they can be passed to the next level of neurons.
[00:58:44.840 --> 00:58:48.280] And when I say neuron, I'm just gonna, it's gonna be, I mean artificial neuron.
[00:58:48.280 --> 00:58:49.640] These aren't real neurons here.
[00:58:49.640 --> 00:58:55.560] So, as the neuron receives the right signals, it gets closer and closer to being activated, right?
[00:58:55.560 --> 00:58:57.560] It's like climbing a tall wall.
[00:58:57.560 --> 00:59:02.920] Eventually, it reaches the top of the wall, and the signal can then go to the next neuron.
[00:59:03.400 --> 00:59:04.760] It would get passed on.
[00:59:04.760 --> 00:59:08.920] In other words, the activation function allows the signal to continue.
[00:59:09.160 --> 00:59:13.880] Sorry, Bob, to be clear, just and I feel like you clarified this, but I'm still confused.
[00:59:13.880 --> 00:59:14.280] No, that's fine.
[00:59:14.680 --> 00:59:18.280] An artificial neuron is software, not hardware, right?
[00:59:18.280 --> 00:59:20.200] Right, yeah, this is all part of the.
[00:59:20.360 --> 00:59:23.480] This is all decision-making gates on off, yes, no.
[00:59:23.480 --> 00:59:24.200] Okay, making sure.
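A minimal sketch of that traffic-cop idea, assuming a simple hard threshold; the inputs, weights, and threshold here are made up purely for illustration, and real networks use smooth activation functions and learn their weights:

# A bare-bones artificial neuron: weight the incoming signals, sum them,
# and pass the signal on only if the total clears the threshold ("the wall").
def step_neuron(inputs, weights, threshold=1.0):
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1.0 if total >= threshold else 0.0

print(step_neuron([0.2, 0.9, 0.4], [0.5, 1.0, 0.3]))  # weighted sum 1.12 >= 1.0, so it fires
print(step_neuron([0.2, 0.1, 0.4], [0.5, 1.0, 0.3]))  # weighted sum 0.32 < 1.0, so it stays silent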
[00:59:24.200 --> 00:59:30.680] So, Maximov here is using not just a neural network, though, but he's employing a quantum neural network.
[00:59:30.680 --> 00:59:33.800] And this is not that as complicated as you might think.
[00:59:33.800 --> 00:59:37.960] It's essentially a neural network that uses some principle of quantum mechanics.
[00:59:37.960 --> 00:59:41.480] So, in this case, the primary principle here is quantum tunneling.
[00:59:41.640 --> 00:59:44.200] Quantum tunneling, we've mentioned this a couple times.
[00:59:44.200 --> 00:59:53.440] Quantum tunneling occurs when a particle like an electron, say, or an atom, passes through a potential energy barrier that's higher in energy than the particle's kinetic energy.
[00:59:53.440 --> 00:59:54.800] So this is actually real.
[00:59:54.800 --> 00:59:58.320] Quantum tunneling is one of the more bizarre aspects of quantum mechanics.
[00:59:58.640 --> 00:59:59.840] This is the real deal.
[01:00:00.080 --> 01:00:03.840] Tunneling is a critical component to make stars shine.
[01:00:03.840 --> 01:00:09.200] Now, you know how hydrogen atoms in a star have to get close enough to fuse into helium and release the energy, right?
[01:00:09.200 --> 01:00:10.000] That's a kind of.
[01:00:10.000 --> 01:00:11.600] It's from the nuclear force, right?
[01:00:12.560 --> 01:00:14.240] That's what the star does.
[01:00:14.240 --> 01:00:27.520] It turns out that much of the time, though, the hydrogen doesn't quite get close enough on its own; it gets kind of close, but it crosses that final little distance because quantum tunneling lets it go the last little bit it needs to fuse.
[01:00:27.520 --> 01:00:33.360] So without quantum tunneling, stars basically would hardly be able to fuse at all.
[01:00:33.360 --> 01:00:38.160] I mean, it would still fuse some, but it wouldn't be able to fuse nearly as much as they do.
[01:00:38.160 --> 01:00:40.880] And stars themselves might not even exist.
[01:00:40.880 --> 01:00:44.880] And the universe would certainly be in a very different place than it is today.
[01:00:44.880 --> 01:00:50.560] So yeah, quantum tunneling is bizarre, it's real, and it's critical to the universe as we know it.
[01:00:50.560 --> 01:00:52.720] And that's only one of the things that it's critical for.
[01:00:52.720 --> 01:00:55.760] It's also critical for radioactive decay, blah, blah, blah.
[01:00:55.760 --> 01:00:56.240] All right.
[01:00:56.240 --> 01:01:07.440] Anywho, so putting it all together then, Maximov's quantum neural network allows the signal to pass right through that wall and not to have to slowly climb it signal after signal.
[01:01:07.440 --> 01:01:14.640] So put another way, a form of quantum tunneling is now part of the activation function, that traffic cop.
[01:01:14.800 --> 01:01:22.880] Quantum tunneling is part of that function, and every now and then it's going to say, all right, you don't have to climb the wall, you're just going to go right through the wall, in a sense.
[01:01:22.880 --> 01:01:35.880] Or, you know, more realistically, this artificial neuron will be activated immediately, and you don't have to go through the long, laborious process of collecting lots of signals before it's allowed.
[01:01:36.040 --> 01:01:38.040] So, that's basically what he implemented here.
[01:01:38.120 --> 01:01:41.160] No one's ever done this before.
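A loose sketch of the general idea only (this is not the paper's actual activation function): a sub-threshold signal can still get through, with a probability that falls off the farther it sits below the barrier, by analogy with quantum tunneling. The decay constant and inputs are invented for illustration:

import math
import random

# "Tunneling" activation: over the threshold fires as usual; under it,
# fire anyway with a probability that decays with the distance to the barrier.
def tunneling_neuron(total, threshold=1.0, decay=5.0):
    if total >= threshold:
        return 1.0
    p_tunnel = math.exp(-decay * (threshold - total))
    return 1.0 if random.random() < p_tunnel else 0.0

random.seed(0)
print(sum(tunneling_neuron(0.8) for _ in range(1000)))  # fires a few hundred times despite being sub-threshold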
[01:01:41.160 --> 01:01:44.040] So, and this is now where the optical illusion part comes in.
[01:01:44.040 --> 01:01:57.480] Maximov trained his quantum neural network on two famous optical illusions that I'm sure all of you have heard about, or at least will recognize once I describe them: the Necker cube and the Rubin vase, or faces, illusion.
[01:01:57.480 --> 01:02:05.880] Now, the Necker cube, that's just imagine a wireframe cube that's in that's three-dimensional, appears to be three-dimensional.
[01:02:05.880 --> 01:02:10.600] So, your brain can interpret it as oriented one way or the other, right?
[01:02:10.600 --> 01:02:13.320] Depending on which one you focus on.
[01:02:13.320 --> 01:02:19.880] So, that's just an example of these ambiguous optical illusions where your brain kind of decides which way it goes.
[01:02:19.880 --> 01:02:23.480] The other one that's even more famous, the Rubin vase.
[01:02:23.640 --> 01:02:31.560] This is the one where it looks like a vase, but the negative space can also appear to be two faces looking at each other.
[01:02:31.560 --> 01:02:32.600] And you go back and forth.
[01:02:32.600 --> 01:02:36.600] You see a vase, then you see two faces looking at each other, and you go back and forth.
[01:02:36.600 --> 01:02:38.280] So, that's the Rubin vase.
[01:02:38.280 --> 01:02:41.720] I'm sure you guys have all seen the vase one, right?
[01:02:41.720 --> 01:02:42.120] Yeah.
[01:02:43.080 --> 01:02:53.400] So, Maximov then says, Researchers believe we temporarily hold both interpretations at the same time until our brains decide which picture should be seen.
[01:02:53.400 --> 01:02:59.720] Now, this is an important, really important statement, maybe one of the most important statements in his paper.
[01:02:59.720 --> 01:03:04.520] And he even references five studies to back that statement up.
[01:03:05.000 --> 01:03:12.040] That idea that the brain, that researchers believe we temporarily hold both interpretations at the same time.
[01:03:12.040 --> 01:03:24.880] Now, that struck me as really odd because I've read about optical illusions many, many times, and they always seem to say that you can't hold both images in your mind at once.
[01:03:24.880 --> 01:03:30.080] So now he's saying that, oh, researchers think we can now, and that was very surprising to me.
[01:03:30.080 --> 01:03:36.800] I looked at the five studies he references, and I didn't really have time to dive into them, but I mean, that's what he's saying here.
[01:03:36.800 --> 01:03:49.920] He says in his paper, in the paper proper, he says that those studies have been interpreted as the ability of humans to see a quantum-like superposition of one and zero states.
[01:03:49.920 --> 01:03:53.680] So he's so quantum superposition, we've mentioned that on the show as well.
[01:03:53.920 --> 01:04:02.080] That's this idea that you're not one or the other, but you're some unusual mixture amalgamation of both of them at the same time.
[01:04:02.080 --> 01:04:03.680] That's what a superposition is.
[01:04:03.680 --> 01:04:08.320] You know, like, for example, an electron being spin up and spin down at the same time.
[01:04:08.320 --> 01:04:17.440] It's in the superposition of up-down until you interact with it or until something interacts with it, and then the wave function collapse and it picks up or down, one or the other.
[01:04:17.440 --> 01:04:26.400] So he's saying that, so humans basically have this ability to have a superposition of these faces/vases in their minds.
[01:04:26.400 --> 01:04:28.640] And that just struck me as very odd.
[01:04:28.640 --> 01:04:30.000] I had never heard of any of that before.
[01:04:30.000 --> 01:04:36.640] Steve, have you guys ever heard of anything like that about human perception taking advantage of quantum superposition?
[01:04:37.120 --> 01:04:37.760] So, yeah.
[01:04:37.760 --> 01:04:39.600] So, I'm a little skeptical of that for sure.
[01:04:39.920 --> 01:04:43.360] Yeah, because I feel like that's often used in, like, the woo-woo, What the Bleep Do We Know stuff.
[01:04:43.440 --> 01:04:44.560] Exactly, exactly.
[01:04:44.560 --> 01:04:49.920] And the reason why I'm belaboring this point is because it becomes obviously more important in a moment.
[01:04:50.240 --> 01:05:00.280] Okay, so when presented with these optical illusions, Maximov's neural network would sometimes describe a vase, basically saying, Yeah, hey, I'm seeing a vase here.
[01:05:00.440 --> 01:05:08.280] I don't know what his interface is with these quantum neural networks was, but essentially, if the neural network could speak, it would say, I see a vase.
[01:05:08.280 --> 01:05:10.600] And other times, it would see...
[01:05:10.600 --> 01:05:14.120] it would say, I see a face, just like a human does.
[01:05:14.120 --> 01:05:27.080] And that's interesting, but it's not unique because there are other implementations of neural networks, not necessarily quantum though, but other AI systems have had similar behavior.
[01:05:27.080 --> 01:05:31.560] They have also said, I see a vase, and then five minutes later, oh, I see a face.
[01:05:31.560 --> 01:05:38.040] So they go back and forth, which is interesting, but it's not unique for his implementation there.
[01:05:38.040 --> 01:05:51.960] Now, he says this, though, but in addition, my network produced some ambiguous results hovering between the two certain outputs, much like our own brains can hold both interpretations together before settling on one.
[01:05:51.960 --> 01:05:54.360] Now, this is what he claims is unique.
[01:05:54.360 --> 01:06:01.400] This is what sets his implementation apart from other neural networks, and this is unique to his quantum neural network.
[01:06:01.400 --> 01:06:17.640] What he's saying is that sometimes his neural network doesn't describe a face or a vase, but seems to describe something that has elements of both at the same time, like humans appear to, according to the studies that he cites.
[01:06:17.640 --> 01:06:18.760] Okay, you got that?
[01:06:18.760 --> 01:06:30.280] So he thinks he's really onto something here because both people and his quantum neural network appear to be able to perceive images this way in this apparent superposition.
[01:06:30.280 --> 01:06:33.240] That's what sets his apart according to him.
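A minimal sketch of what an "ambiguous" output could look like for a two-way classifier, assuming a softmax over made-up vase and faces scores; this is not the model from the paper, just an illustration of hovering between two certain answers:

import math

# Two-class output head: near-equal scores produce an "ambiguous" answer
# instead of committing to vase or faces.
def interpret(score_vase, score_faces):
    exp_v, exp_f = math.exp(score_vase), math.exp(score_faces)
    p_vase = exp_v / (exp_v + exp_f)
    if p_vase > 0.8:
        return f"vase (p={p_vase:.2f})"
    if p_vase < 0.2:
        return f"faces (p={p_vase:.2f})"
    return f"ambiguous, elements of both (p_vase={p_vase:.2f})"

print(interpret(3.0, 0.5))  # clearly a vase
print(interpret(1.1, 1.0))  # hovers between the two interpretations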
[01:06:33.240 --> 01:06:48.160] So, my biggest problem with his paper, if you haven't figured it out, is that his main achievement here seems to rely on this idea that human perception relies on quantum superposition at some level, which, as far as I can tell, is not the consensus at all.
[01:06:44.920 --> 01:06:55.120] You know, there may have been, I look, I did see references to some studies that may be interpreted that way.
[01:06:56.880 --> 01:06:59.040] Interpretated, yeah, interpreted that way.
[01:06:59.280 --> 01:07:04.160] But that's far, far from five sigma confidence levels at all.
[01:07:04.160 --> 01:07:10.560] But that said, all that said, Maximov does say the following in an article he wrote recently.
[01:07:10.560 --> 01:07:21.600] He said, while some have suggested that quantum effects play an important role in our brains, even if they don't, we may still find the laws of quantum mechanics useful to model human thinking.
[01:07:21.600 --> 01:07:37.840] So, okay, so here he's kind of saying, yeah, it kind of makes me think that maybe he doesn't totally buy this idea that quantum mechanics is critical or very, very important to human cognition or perception, which is not the consensus at all.
[01:07:38.080 --> 01:07:39.760] So that's the crux right there.
[01:07:39.760 --> 01:07:41.200] To me, this is the crux.
[01:07:41.200 --> 01:07:47.520] Does his study show that the laws of quantum mechanics might be useful to model human thinking?
[01:07:47.520 --> 01:07:49.600] And my answer to that is, maybe.
[01:07:49.600 --> 01:07:57.920] You know, maybe this type of modeling where you're modeling some aspect of human cognition using quantum mechanics, it may be a useful model.
[01:07:57.920 --> 01:08:02.400] It may have some interesting results that have some utility.
[01:08:02.400 --> 01:08:04.960] And I think that would be certainly very interesting.
[01:08:04.960 --> 01:08:09.360] And I think this is actually worth exploring more deeply because, I mean, who knows?
[01:08:09.360 --> 01:08:20.080] Even with the weird, counterintuitive aspects of quantum mechanics, it probably, in my mind, doesn't seem like it's a critical part of human thinking.
[01:08:20.080 --> 01:08:35.720] But if you can incorporate that into these quantum neural networks and have some and create some interesting models and get some insights into how humans think, or even just creating some very new and interesting tools, then I think this would be definitely worth some further research and some extra money for this.
[01:08:36.120 --> 01:08:43.720] But it doesn't mean that consciousness is a quantum phenomenon and that we are entangled with the universe.
[01:08:43.720 --> 01:08:44.040] Right.
[01:08:44.280 --> 01:08:45.000] Absolutely not.
[01:08:45.000 --> 01:08:45.560] It doesn't mean that.
[01:08:45.880 --> 01:08:46.920] It doesn't mean that at all.
[01:08:46.920 --> 01:08:53.640] And it can prove to be potentially a very helpful, you know, helpful model that can give us some insights.
[01:09:01.000 --> 01:09:04.920] But even then, in that case, which is a big if, if that happens, it still doesn't necessarily mean that cognition is quantum either.
[01:09:01.000 --> 01:09:04.920] It's just a model, an interesting model that has utility.
[01:09:04.920 --> 01:09:09.960] Well, everyone, we're going to take a quick break from our show to talk about one of our sponsors this week, Quince.
[01:09:09.960 --> 01:09:14.680] Yeah, guys, my wife and I have been shopping at Quince for years, and they're great.
[01:09:14.680 --> 01:09:17.160] I mean, let me tell you the breakdown here.
[01:09:17.240 --> 01:09:21.000] I recently got their Mongolian Cashmere pullover hoodie.
[01:09:21.000 --> 01:09:23.640] And when you hear Cashmere, you're like, it's going to cost a lot of money.
[01:09:23.640 --> 01:09:27.320] Well, the cool thing about Quince is that their prices are awesome.
[01:09:27.320 --> 01:09:28.360] And they're way less.
[01:09:28.360 --> 01:09:31.640] They're like 50 to 80% less than similar brands.
[01:09:31.640 --> 01:09:33.400] And these items are freaking great, man.
[01:09:33.400 --> 01:09:38.280] They're super well constructed, and that's important to me because I buy something and I want to have it for a long time.
[01:09:38.280 --> 01:09:41.400] They've got other stuff like leather jackets, cotton cardigans.
[01:09:41.400 --> 01:09:42.920] They have, you know, jeans.
[01:09:42.920 --> 01:09:43.880] They have a ton of stuff.
[01:09:43.880 --> 01:09:45.880] You just got to go to the website and check it out.
[01:09:45.880 --> 01:09:49.240] Get cozy in Quince's high-quality wardrobe essentials.
[01:09:49.240 --> 01:09:56.680] Go to quince.com/slash SGU for free shipping on your order and 365-day returns.
[01:09:56.680 --> 01:10:04.520] That's q-u-i-n-c-e.com/slash s-g-u to get free shipping and 365-day returns.
[01:10:04.520 --> 01:10:07.160] Quince.com/slash SGU.
[01:10:07.160 --> 01:10:09.720] All right, guys, let's get back to the show.
[01:10:10.680 --> 01:10:13.400] All right, I'm going to finish up with a very quick news item.
[01:10:13.400 --> 01:10:15.680] I'm going to start by asking you a question.
[01:10:16.000 --> 01:10:25.200] So, which animals have been shown to use individual names to identify members of their own group?
[01:10:25.200 --> 01:10:26.240] Oh, interesting.
[01:10:26.240 --> 01:10:26.880] Oh, yeah.
[01:10:26.880 --> 01:10:30.400] Well, it was Tennessee Tuxedo and his Walrus Chumley.
[01:10:30.640 --> 01:10:31.600] That's one.
[01:10:31.600 --> 01:10:32.240] Dolphins.
[01:10:32.640 --> 01:10:33.840] Dolphins is correct.
[01:10:33.840 --> 01:10:34.400] Oh, I know.
[01:10:34.400 --> 01:10:35.040] I know for sure.
[01:10:35.040 --> 01:10:37.040] Oh, you meant, like, in reality.
[01:10:37.440 --> 01:10:38.720] Yes, in the real world.
[01:10:39.920 --> 01:10:40.640] Humans for sure.
[01:10:41.120 --> 01:10:44.560] It was humans other than humans, dolphins, and two other species.
[01:10:45.120 --> 01:10:45.680] Chimpanzees.
[01:10:46.000 --> 01:10:47.280] Chimpanzees and elephants.
[01:10:47.680 --> 01:10:48.560] Elephants is correct.
[01:10:48.560 --> 01:10:50.160] Chimpanzees is wrong.
[01:10:50.160 --> 01:10:50.480] Right.
[01:10:50.640 --> 01:10:51.920] Is it another ape, though?
[01:10:51.920 --> 01:10:52.800] How about bonobos?
[01:10:53.440 --> 01:10:54.960] No, no primate.
[01:10:54.960 --> 01:10:55.600] No primate.
[01:10:55.840 --> 01:10:56.240] Interesting.
[01:10:56.320 --> 01:10:56.640] Bird.
[01:10:56.640 --> 01:10:57.680] It's got to be a bird.
[01:10:57.680 --> 01:10:58.400] Yes, it's a bird.
[01:10:58.400 --> 01:10:58.960] What kind of bird?
[01:10:59.360 --> 01:11:00.400] No, a cockatoo?
[01:11:00.960 --> 01:11:01.360] A mynah?
[01:11:01.520 --> 01:11:01.760] Parrots.
[01:11:01.920 --> 01:11:02.240] A raven.
[01:11:02.880 --> 01:11:03.680] Some parrots.
[01:11:03.680 --> 01:11:11.760] So dolphins, elephants, and some parrots have been shown to use individualized sounds to refer to specific individuals.
[01:11:11.760 --> 01:11:12.400] Wow.
[01:11:12.640 --> 01:11:19.120] No primate, no non-human primates until this new study.
[01:11:19.120 --> 01:11:19.680] Interesting.
[01:11:19.680 --> 01:11:20.080] Which foundation.
[01:11:20.240 --> 01:11:21.280] Can I ask a quick question?
[01:11:21.680 --> 01:11:30.480] Prior to this new study where they discovered vocalizations that were individualized, did they use hand signals or other sort of signs?
[01:11:30.800 --> 01:11:32.080] So, no.
[01:11:32.080 --> 01:11:36.160] So there's no evidence that they have individual names by whatever means.
[01:11:36.160 --> 01:11:37.200] Oh, okay, interesting.
[01:11:37.200 --> 01:11:37.520] Yeah.
[01:11:37.520 --> 01:11:44.800] But so a recent study has apparently demonstrated individual names in marmosets.
[01:11:45.200 --> 01:11:48.960] So again, not even apes, in marmosets, which is interesting.
[01:11:48.960 --> 01:11:58.320] So they make calls, which they call phee calls, which are specific to individual members of their group, and they can identify that.
[01:11:58.320 --> 01:12:03.640] Not only that, but they clearly teach each other these names, right?
[01:12:03.640 --> 01:12:05.400] These are learned.
[01:12:05.400 --> 01:12:14.600] And if you compare one family group to another family group, each family group tends to use names which sound similar.
[01:12:14.600 --> 01:12:21.560] So there appears to be a culture or a dialect within each group that differs from group to group.
[01:12:21.560 --> 01:12:25.640] Which, again, is just more evidence that they're teaching these to each other.
[01:12:25.880 --> 01:12:30.040] It's not like sounds that are innate or evolved.
[01:12:30.040 --> 01:12:30.360] Right.
[01:12:30.360 --> 01:12:31.720] Yeah, so it's interesting.
[01:12:31.720 --> 01:12:36.680] So the question is: why marmosets, you know, of all the primates?
[01:12:36.680 --> 01:12:41.080] Is this just because we just haven't been able to see it in other primates?
[01:12:41.080 --> 01:12:42.760] And like, why, again, why dolphins?
[01:12:42.760 --> 01:12:43.960] Why elephants?
[01:12:43.960 --> 01:12:48.680] So what do you guys think, first of all, but do you have any idea why marmosets?
[01:12:48.680 --> 01:12:49.640] Is it random?
[01:12:49.640 --> 01:12:51.320] Is it just do they have some other?
[01:12:51.480 --> 01:12:53.240] Something about their social structure?
[01:12:53.400 --> 01:12:58.360] That's a good hypothesis, something about their social structure, but it's something else too, probably.
[01:12:58.360 --> 01:13:06.520] And we don't know for sure, but there is something that all of these groups have in common, that dolphins, elephants, parrots, and marmosets have in common.
[01:13:06.520 --> 01:13:14.280] In addition to having the ability to have this level of hunting strategy, it's just their environment.
[01:13:14.280 --> 01:13:19.560] In their environment, they lose visual contact with other members of their group.
[01:13:19.560 --> 01:13:21.080] Oh, interesting.
[01:13:21.080 --> 01:13:24.360] So these are dolphins that live in murky water.
[01:13:24.360 --> 01:13:28.600] Elephants can, just by sheer distance, they can communicate over miles.
[01:13:28.600 --> 01:13:30.520] Yeah, the infrasound travels far.
[01:13:30.520 --> 01:13:35.000] They have lost visual contact with their members because of distance.
[01:13:35.000 --> 01:13:44.440] And parrots and marmosets live in dense jungles in the trees where it's so dense they can't see each other even a short distance away.
[01:13:44.440 --> 01:13:58.080] So, the researchers think that it's this combination of they're neurologically sophisticated enough to produce this, they have the ability to produce very specific sounds, and they have the need to because of their environment.
[01:13:58.080 --> 01:14:06.480] You know, they're not just like, for example, crows, which will send out warnings to all other crows within range of hearing their voice, right?
[01:14:06.480 --> 01:14:12.800] You know, they're just making sounds that are attracting a mate or a warning of a predator or whatever.
[01:14:12.800 --> 01:14:15.600] It's not directed at a specific individual.
[01:14:15.600 --> 01:14:26.800] But now, if you're a mommy marmoset and you've lost sight of your baby, you want to call them specifically, not just send out a call to all marmosets, right?
[01:14:27.040 --> 01:14:36.400] And so, yeah, it'd be evolutionarily advantageous to be able to do that, to summon your individual family member to you.
[01:14:36.400 --> 01:14:38.800] It's just weird because there are other monkeys that live in jungles.
[01:14:38.960 --> 01:14:39.360] Yeah, I know.
[01:14:39.360 --> 01:14:41.120] So, you know, it's like, how come they didn't, you know?
[01:14:41.280 --> 01:14:42.960] It's probably a combination of environment.
[01:14:43.040 --> 01:14:46.960] Well, it's also, even though they're probably also more common than we think.
[01:14:47.120 --> 01:14:48.640] Yeah, it might be more common than we think.
[01:14:48.640 --> 01:14:56.960] It may be that it's not just like they're in the jungle, it's like where in the jungle and what's their behavior and how big are they and how colorful are they, whatever.
[01:14:56.960 --> 01:15:05.520] And just a number of factors together that just create the evolutionary pressure of a specific advantage to being able to do this.
[01:15:05.520 --> 01:15:06.800] And then they have to hit upon it.
[01:15:07.200 --> 01:15:10.480] It's not going to evolve every time it would be beneficial.
[01:15:10.480 --> 01:15:21.200] But yeah, I wonder how many more species do this that we just haven't been able to observe long enough in the appropriate setting to document it.
[01:15:21.200 --> 01:15:27.120] And of course, they always raise, like, could this help us learn about the evolution of language in humans?
[01:15:27.120 --> 01:15:27.840] I don't know.
[01:15:27.840 --> 01:15:28.800] I guess so.
[01:15:28.800 --> 01:15:34.360] You know, unfortunately, there's a huge gap in our language evidence, right?
[01:15:34.360 --> 01:15:41.880] So we have evidence of human language going back 4,000 years or so to the earliest writings.
[01:15:42.440 --> 01:15:55.880] We have extant modern human languages that we could examine to see what they have in common and to try to reverse engineer the way languages evolved over human history.
[01:15:56.360 --> 01:16:01.560] So basically we have this several thousand year window into modern languages.
[01:16:01.560 --> 01:16:08.600] And then if we go to our non-human ancestors, we're going back eight plus million years.
[01:16:08.600 --> 01:16:13.560] And we basically have no information between eight million years and four thousand years ago.
[01:16:13.560 --> 01:16:13.960] Wow.
[01:16:14.280 --> 01:16:32.360] In terms of the evolution of language, we don't have any window into it. All we could say is anatomical: we could look anatomically and ask whether they could speak, but that's more about their physical ability to speak, not necessarily about their language.
[01:16:32.360 --> 01:16:39.080] And so we know that only humans, only Homo sapiens have fully modern larynxes.
[01:16:39.080 --> 01:16:43.880] Neanderthal, there's still a little bit of a debate about what kind of vocal range they actually had.
[01:16:43.880 --> 01:16:53.000] It was definitely a lot closer to human vocal cords than any other hominid, but not quite the same structure as modern humans.
[01:16:53.000 --> 01:16:57.080] And then going back, it's mainly about the hyoid bone.
[01:16:57.080 --> 01:16:57.800] Where is it?
[01:16:57.800 --> 01:17:00.600] What's the shape of it and where is it in the throat?
[01:17:00.600 --> 01:17:03.400] Yeah, it's a weird bone in the throat that's kind of hanging off by itself.
[01:17:03.640 --> 01:17:04.280] Yeah, yeah.
[01:17:04.600 --> 01:17:08.200] So the hominids did not have the full vocal range that humans do.
[01:17:08.440 --> 01:17:11.720] That doesn't mean that they didn't have language or they didn't speak.
[01:17:12.040 --> 01:17:14.520] And we don't really know anything about their language.
[01:17:14.640 --> 01:17:23.520] There's maybe a tiny little bit of evidence we can infer from skull casts, you know, seeing, like, oh yeah, the left temporal lobe is developed to a certain degree or not.
[01:17:23.520 --> 01:17:25.920] But again, it's very indirect evidence.
[01:17:26.160 --> 01:17:33.040] But so it's interesting, we just don't really have a good window into human language for most of its development and evolution.
[01:17:33.040 --> 01:17:39.680] So this maybe will shed a tiny bit of light on the early stages of the evolution of language.
[01:17:39.680 --> 01:17:44.800] But I do wonder which other species we'll eventually figure out have individual names.
[01:17:44.800 --> 01:17:45.440] Oh, totally.
[01:17:45.440 --> 01:17:52.400] And there may be something that's subtle that we aren't picking up on, like a subtle sign or a nonverbal one.
[01:17:53.040 --> 01:17:56.640] All right, there's no who's that noisy this week because we're out of sequence.
[01:17:56.640 --> 01:17:59.920] So we're just going to go right to some questions and emails.
[01:17:59.920 --> 01:18:00.160] Bob.
[01:18:00.320 --> 01:18:01.360] Ooh, I didn't know that.
[01:18:01.360 --> 01:18:04.800] This is a, yeah, I mentioned that.
[01:18:05.040 --> 01:18:05.440] Yeah.
[01:18:06.480 --> 01:18:12.320] Bob, this is a response to a news item that you did back in August.
[01:18:12.640 --> 01:18:14.400] This is on beetles.
[01:18:14.400 --> 01:18:34.160] You were talking about the fact that there are more beetle species than in any other group of animals, and discussing why that might be in terms of, you know, evolutionarily, they had an opportunity to expand their numbers dramatically.
[01:18:34.160 --> 01:18:41.600] But this email comes from a listener called Matjaz, I think, M-A-T-J-A-Z.
[01:18:41.600 --> 01:18:45.120] He didn't offer a pronunciation in the email, so we're on our own.
[01:18:45.120 --> 01:18:54.560] And they write, Hi, guys, in a recent show, you highlighted that beetles are by far the most diverse insect group and animal group on this taxonomic level.
[01:18:54.560 --> 01:19:02.120] Well, I have a correction that is not really a correction for you, but might be of interest since SGU is about science progress.
[01:18:59.760 --> 01:19:03.320] So I thought about writing in.
[01:19:03.480 --> 01:19:09.080] Anyway, beetles indeed currently are the most diverse, but will not stay so much longer.
[01:19:09.080 --> 01:19:17.640] Biologists are lately realizing, in large part thanks to molecular methods, yay science, that there is a big bias in what we know.
[01:19:17.640 --> 01:19:19.160] You can guess it, perhaps.
[01:19:19.160 --> 01:19:22.120] If it's small, it's less known.
[01:19:22.120 --> 01:19:35.160] There are two groups that biologists agree will easily surpass beetles: wasps and kin, Hymenoptera, and flies, Diptera.
[01:19:35.160 --> 01:19:43.800] So both have a crazy amount of tiny species that nobody studied and are super tough to taxonomically treat based on morphology.
[01:19:44.040 --> 01:19:49.480] These are called dark taxa, and there's a lot of research attention on these recently.
[01:19:49.480 --> 01:19:50.040] Yay science.
[01:19:51.080 --> 01:19:54.920] So as is the consensus today, flies are the largest insect order, followed by wasps.
[01:19:54.920 --> 01:20:00.920] Beetles are third, although species descriptions are slow, and so formally it will take a while.
[01:20:00.920 --> 01:20:03.800] There is a similar thing in arachnids with mites.
[01:20:03.800 --> 01:20:07.880] Mites are an order of mostly sub-millimeter animals.
[01:20:07.880 --> 01:20:12.840] The true diversity is likely tens to 100 times higher.
[01:20:12.840 --> 01:20:29.400] So yeah, so he's saying, yeah, in terms of named species, it is beetles, but biologists suspect that Hymenoptera, you know, the wasps and flies probably are more diverse once we fully characterize all of the small members of those groups.
[01:20:29.480 --> 01:20:30.760] Yeah, if we ever.
[01:20:30.920 --> 01:20:32.600] I wonder what their upper limit was.
[01:20:32.760 --> 01:20:37.480] If memory serves, for beetles, it was 400,000 different species.
[01:20:37.800 --> 01:20:46.880] But my research shows that they have estimates of the actual number for beetles going potentially as high as 2.1 million.
[01:20:47.360 --> 01:20:54.480] So I wonder if the upper potential limit for these other creatures is equally beyond that.
[01:20:54.880 --> 01:20:57.760] So, point taken, I wasn't aware of that.
[01:20:57.760 --> 01:21:05.520] That's interesting, that they may and probably even do exceed beetles, but they're just so tiny that we're just too lazy.
[01:21:05.920 --> 01:21:07.200] Well, no, it's just hard.
[01:21:07.920 --> 01:21:16.960] It's hard to find them, and then it's hard to categorize them because they're small, so it's hard to characterize their morphology.
[01:21:16.960 --> 01:21:23.760] So, there may be lots of instances where we are confusing multiple species for one, but it's really multiple species.
[01:21:23.760 --> 01:21:29.040] And if we knew that, it would dramatically increase the number of the known species.
[01:21:29.040 --> 01:21:29.600] So, we'll see.
[01:21:30.160 --> 01:21:40.880] Let the races begin, I guess, to see over time which order of insects has its described diversity increase the most.
[01:21:40.880 --> 01:21:47.360] Right now, they're just speculating that it might be Hymenoptera and flies, but they don't really know until they look.
[01:21:47.680 --> 01:21:48.960] And you're correct.
[01:21:48.960 --> 01:21:53.040] While those orders increase, maybe we'll find lots of more beetles, too.
[01:21:53.040 --> 01:21:53.360] Yeah.
[01:21:53.360 --> 01:21:56.160] And the mites one is interesting because mites are teeny tiny.
[01:21:56.160 --> 01:22:03.360] The other thing is, the smaller you are, the relatively bigger your ecosystem is, right?
[01:22:03.360 --> 01:22:06.240] Like, the world is a lot bigger place for you if you're teeny tiny.
[01:22:07.040 --> 01:22:09.280] That's why there's more bacteria than anything else.
[01:22:09.280 --> 01:22:20.560] So, it kind of makes sense, you know, relatively speaking, that there would be more, not only more individuals, but more species, more genera, et cetera, of teeny tiny groups like that.
[01:22:20.560 --> 01:22:21.440] Interesting.
[01:22:21.440 --> 01:22:25.840] All right, we're going to do a quick Name That Logical Fallacy, and then we'll go on to science or fiction.
[01:22:26.080 --> 01:22:33.000] This one is based on an email that came to us from Mario, and Mario said, Love the show, listened since the beginning.
[01:22:33.320 --> 01:22:37.720] What do you call it when a claim is technically true but deceptive?
[01:22:37.720 --> 01:22:41.640] Example: someone claims, "I have never lost a game of chess."
[01:22:41.640 --> 01:22:48.840] That gives the impression they are a great chess player, but it's simply because they have never played a single game of chess in their life.
[01:22:48.840 --> 01:22:54.680] Is that a logical fallacy or simply a lie of omission, Mario?
[01:22:55.000 --> 01:22:55.560] Did you guys?
[01:22:55.560 --> 01:22:58.440] I have to ask, did you guys ever watch That '70s Show?
[01:22:58.440 --> 01:22:59.080] Yeah, sure.
[01:22:59.080 --> 01:22:59.800] Oh, yeah.
[01:22:59.800 --> 01:23:02.280] Do you remember the comedian? Love him.
[01:23:03.000 --> 01:23:05.000] Miss him to this day: Mitch Hedberg.
[01:23:05.000 --> 01:23:05.480] Oh, yeah.
[01:23:05.480 --> 01:23:05.960] Oh, my God.
[01:23:05.960 --> 01:23:07.000] He's amazing.
[01:23:07.000 --> 01:23:18.680] So he was in a couple of episodes of That '70s Show, if you don't remember. He worked the counter at the restaurant, and there was an episode where one of them goes up to him and asks for, like, extra fries.
[01:23:18.680 --> 01:23:22.360] And he goes, I did not lose a leg in Vietnam just so you could have extra fries.
[01:23:22.360 --> 01:23:23.480] And they're like, What are you talking about?
[01:23:23.480 --> 01:23:24.520] You weren't in Vietnam.
[01:23:24.520 --> 01:23:26.440] He goes, That's what I said.
[01:23:26.440 --> 01:23:28.040] I did not lose a leg.
[01:23:30.440 --> 01:23:33.400] I don't know why that reminds me of this even.
[01:23:34.680 --> 01:23:46.760] Yeah, I mean, there's a lot of actual comedy based upon this, where you make a statement that implies something that isn't true, and then you subvert expectations by flipping it on its head, right?
[01:23:47.640 --> 01:23:48.280] It's the doctor.
[01:23:48.280 --> 01:23:50.760] Doctor, will I be able to play the violin again?
[01:23:50.760 --> 01:23:51.720] Okay, well, great.
[01:23:51.720 --> 01:23:53.000] I never did before.
[01:23:53.640 --> 01:23:54.120] Right.
[01:23:54.120 --> 01:23:57.320] Or yeah, although you shouldn't say again because that implies a different person.
[01:23:57.400 --> 01:23:58.280] Yeah, well, okay.
[01:23:58.280 --> 01:24:00.920] Like, yeah, yeah, will I be able to play the piano?
[01:24:00.920 --> 01:24:01.640] Sure, you will.
[01:24:01.640 --> 01:24:01.880] Great.
[01:24:01.880 --> 01:24:02.440] I've never been there.
[01:24:02.440 --> 01:24:03.880] Yeah, I never played before.
[01:24:03.880 --> 01:24:06.920] So, yeah, that's a common type of comedy.
[01:24:06.920 --> 01:24:09.320] But anyway, is this a logical fallacy?
[01:24:09.320 --> 01:24:10.840] And the answer is clearly no.
[01:24:10.840 --> 01:24:15.200] It's not a logical fallacy because the problem is not with the logic.
[01:24:15.200 --> 01:24:15.680] Right?
[01:24:14.520 --> 01:24:20.160] So an argument or a statement has three components.
[01:24:20.320 --> 01:24:27.680] You have one or more premises connected through some kind of logic to a conclusion.
[01:24:27.680 --> 01:24:35.600] And if the conclusion does not logically flow from the premises, that's a logical fallacy.
[01:24:35.600 --> 01:24:40.640] That's invalid logic, leading to an unsound conclusion.
[01:24:40.640 --> 01:24:47.280] But the other component of the argument that can be wrong is the premises, right?
[01:24:47.280 --> 01:24:59.120] If you have a problem with your premises, it also can be an unsound argument or misleading argument or whatever, even if there's no logical fallacy involved.
[01:24:59.120 --> 01:25:00.320] Does that make sense?
[01:25:00.320 --> 01:25:03.600] So in this case, we have an incomplete premise.
[01:25:03.840 --> 01:25:07.200] We're not being given all the pieces of information, right?
[01:25:07.200 --> 01:25:10.320] You're given only a partial piece of information.
[01:25:10.320 --> 01:25:14.240] And then it's not really even, it's an implied claim.
[01:25:14.240 --> 01:25:17.280] It's not even really, like he's implying he's great at chess.
[01:25:17.280 --> 01:25:19.280] He's not really saying it outright.
[01:25:19.280 --> 01:25:30.160] So, right, but I think that this is, it is the lie of omission, meaning something was omitted from the factual basis of the implied claim.
[01:25:30.160 --> 01:25:35.520] So don't forget to examine the premises of an argument as well, even if the logic is correct.
[01:25:35.520 --> 01:25:49.600] Sometimes a premise can be wrong, or a premise can be misleading, or it could be incomplete, like in this case where you're just missing information, or it could be controversial, or it could be subjective.
[01:25:49.560 --> 01:25:54.720] Like it's not an objective, it's a subjective assessment, it's not an objective fact.
[01:25:54.720 --> 01:25:58.480] Like you're taking as a given something that is not given, right?
[01:25:58.480 --> 01:26:01.240] But isn't that what unstated major premise is?
[01:26:01.240 --> 01:26:02.200] Yeah, that's an unstated premise.
[01:26:02.280 --> 01:26:03.240] Isn't that a logical fallacy?
[01:26:03.400 --> 01:26:05.880] No, it's not really a logical fallacy.
[01:26:05.880 --> 01:26:07.240] It's a premise problem.
[01:26:07.240 --> 01:26:09.960] It's not technically a fallacy.
[01:26:09.960 --> 01:26:10.520] Oh, really?
[01:26:10.520 --> 01:26:13.000] I always thought that it was utilized as an introduction.
[01:26:13.240 --> 01:26:20.600] We include it in the list of informal logical fallacies, again, because they're informal, because it's basically a problem with your argument.
[01:26:20.600 --> 01:26:28.600] You know, if you're using logical fallacy, as there's some problem with your argument, but it's technically not a logical fallacy because there's no logic involved.
[01:26:28.600 --> 01:26:35.400] It's just that you have not explained all of your premises or specifically identified all of your premises.
[01:26:35.400 --> 01:26:37.960] You're using a premise you're not stating.
[01:26:38.280 --> 01:26:38.760] Right.
[01:26:38.760 --> 01:26:49.320] And like, as I'm kind of digging into it, I'm seeing language used like an unstated premise is a premise that a deductive argument requires, but it's not explicitly stated.
[01:26:49.560 --> 01:26:54.280] And in this case, you would deduce that he plays chess.
[01:26:54.280 --> 01:26:57.400] Like it's an intentional twisting of that.
[01:26:57.400 --> 01:26:57.800] Yes.
[01:26:58.040 --> 01:27:00.200] It's more than an illogical deduction.
[01:27:00.680 --> 01:27:02.760] It's an implied unstated major premise.
[01:27:02.760 --> 01:27:03.080] Yes.
[01:27:03.480 --> 01:27:04.680] Yeah, I agree.
[01:27:04.680 --> 01:27:12.760] But the reason why that's a bad argument is because it doesn't give the other person the opportunity to challenge the premise.
[01:27:12.760 --> 01:27:16.040] You're trying to assume the premise by not even stating it.
[01:27:16.040 --> 01:27:18.440] Therefore, like it's not even up for discussion.
[01:27:18.440 --> 01:27:20.600] It's baked into your argument.
[01:27:20.920 --> 01:27:27.000] When people talk about things having to do with religion and they just blanket assume that God exists and everything's founded on that.
[01:27:27.000 --> 01:27:27.320] Right.
[01:27:27.560 --> 01:27:28.120] Yeah.
[01:27:28.120 --> 01:27:33.160] Like without saying, let's assume God exists, they assume God exists and they proceed from there.
[01:27:33.160 --> 01:27:35.080] It's an unstated major premise.
[01:27:35.080 --> 01:27:36.040] You always have to be careful.
[01:27:36.040 --> 01:27:39.440] You have to be careful about that because it could be a very stealthy problem with an argument.
[01:27:39.440 --> 01:27:41.480] Because, again, it's what's missing.
[01:27:41.480 --> 01:27:44.040] You know, what's missing is sometimes hard to identify.
[01:27:44.040 --> 01:27:46.320] That's why it's a great exercise.
[01:27:46.480 --> 01:27:58.160] Again, you should do this to police your own thinking, not just as a weapon against other people. To arrive at a consensus, you want to examine how sound all the arguments being put forward are.
[01:27:58.160 --> 01:28:00.640] And it's always good to say, okay, what are all the premises?
[01:28:00.640 --> 01:28:02.720] What's the logic being employed?
[01:28:02.720 --> 01:28:05.200] And is the conclusion justified by that?
[01:28:05.200 --> 01:28:15.280] And if you go consciously through that process, it becomes much easier to say, oh, wait a minute, there's an implied premise here you're not explicitly stating.
[01:28:15.280 --> 01:28:22.640] Or our disagreement is that we're making different assumptions about, we have different starting points.
[01:28:22.640 --> 01:28:25.840] You're assuming one thing, I'm assuming something else.
[01:28:26.320 --> 01:28:30.880] We have to talk about that first before we even get to talking about logic.
[01:28:32.560 --> 01:28:35.760] You have to make sure that you can agree on the premises.
[01:28:35.760 --> 01:28:45.200] Unfortunately, that's harder and harder to do these days because people have so much information they can cherry-pick whatever information they want to start as their premises.
[01:28:45.760 --> 01:28:46.880] All right, well, thank you, Mario.
[01:28:46.880 --> 01:28:47.600] That was a good question.
[01:28:47.600 --> 01:28:48.960] I enjoyed talking about that.
[01:28:48.960 --> 01:28:52.400] Guys, let's go on with science or fiction.
[01:28:54.640 --> 01:28:59.600] It's time for science or fiction.
[01:29:04.400 --> 01:29:13.120] Each week I come up with two science news items or facts, two real, one fake, and then I challenge my panel of skeptics to tell me which one is the fake.
[01:29:13.120 --> 01:29:18.240] There's a theme this week, because of course, we're not doing news items because we're out of sequence here.
[01:29:18.240 --> 01:29:22.800] The theme is science literacy.
[01:29:22.800 --> 01:29:28.320] So, these are three statements about what Americans believe.
[01:29:28.320 --> 01:29:31.160] I stuck with Americans just to make them all comparable.
[01:29:29.920 --> 01:29:33.960] So, you'll understand in a second what I'm talking about.
[01:29:34.280 --> 01:29:44.440] Item number one: a 2020 National Science Board survey found that 68% of Americans believe that all radioactivity is man-made.
[01:29:44.440 --> 01:29:46.680] In quotes, that last part.
[01:29:46.680 --> 01:29:57.160] Item number two: in a 2014 NSF survey, 26% of Americans stated that the sun revolves about the earth rather than the other way around.
[01:29:57.160 --> 01:30:12.040] And item number three, a 2015 YouGov online survey found that 41% of people believe dinosaurs and humans lived at the same time, while only 25% answered that they definitely did not.
[01:30:12.040 --> 01:30:13.000] Jay, go first.
[01:30:13.000 --> 01:30:20.600] Okay, yeah, this first one, 2020 National Science Board survey found that 68% of Americans believe that all radioactivity is man-made.
[01:30:21.000 --> 01:30:32.440] That's really interesting because I could see where people would confuse the idea of like, you know, the power plants and what's coming, you know, what's being used in the power plants.
[01:30:32.440 --> 01:30:36.120] And then there's, you know, then there's all the like the leftover material.
[01:30:36.120 --> 01:30:43.560] I could just see there's confusion there, but 68% is an awful lot of people to believe that radioactivity is man-made.
[01:30:43.560 --> 01:30:45.720] Boy, that's a tricky one.
[01:30:45.720 --> 01:30:53.400] Second one, in 2014, the NSF survey, 26% of Americans stated that the sun revolved around the earth.
[01:30:53.400 --> 01:30:55.080] I think that one is science.
[01:30:55.080 --> 01:30:57.000] My God, I think that one is science.
[01:30:57.000 --> 01:31:04.760] Okay, the third one, a 2015 YouGov online survey found that 41% of people believe dinosaurs and humans lived at the same time.
[01:31:04.760 --> 01:31:08.200] While only a quarter of the people answered that they definitely did not.
[01:31:08.200 --> 01:31:09.400] Oh my god, Steve.
[01:31:09.400 --> 01:31:11.480] This is such an interesting one.
[01:31:11.800 --> 01:31:17.040] Ah, 41% of people believe that dinosaurs and humans lived at the same time?
[01:31:17.040 --> 01:31:17.920] No.
[01:31:14.680 --> 01:31:21.760] So it's between the first one and the last one.
[01:31:21.840 --> 01:31:24.400] So either the radioactivity one.
[01:31:24.400 --> 01:31:25.440] Damn, that's a tip.
[01:31:25.440 --> 01:31:26.880] This is really hard, Steve.
[01:31:26.880 --> 01:31:30.400] You're basically asking us how stupid the average person is.
[01:31:30.720 --> 01:31:32.720] How scientifically literate are they?
[01:31:32.720 --> 01:31:37.440] All right, 41% of people think that dinosaurs.
[01:31:37.440 --> 01:31:39.600] And there's two percentages in that last one.
[01:31:39.600 --> 01:31:42.240] There's 41% that believe that they lived at the same time.
[01:31:42.240 --> 01:31:45.360] 25% were the number of people that answered correctly.
[01:31:45.360 --> 01:31:46.400] A quarter?
[01:31:46.400 --> 01:31:46.960] All right.
[01:31:46.960 --> 01:31:47.600] Okay.
[01:31:47.600 --> 01:31:51.360] I have been choosing so badly lately with science or fiction.
[01:31:51.360 --> 01:31:53.120] I know.
[01:31:53.120 --> 01:31:54.560] Bob, help me out, man.
[01:31:54.560 --> 01:31:55.280] No, no, no.
[01:31:55.280 --> 01:31:55.680] You're on the same side.
[01:31:56.640 --> 01:32:00.400] I'm going to say it's the dinosaur one, and I'm going to tell you exactly why.
[01:32:00.400 --> 01:32:07.600] I think more people know that dinosaurs, I think more than 25% people know for sure that it definitely didn't happen.
[01:32:07.840 --> 01:32:09.280] That's the indicator for me.
[01:32:09.280 --> 01:32:10.160] Okay, Kara.
[01:32:10.160 --> 01:32:14.800] I mean, when I first read all of them, that one was the obvious, like, what?
[01:32:14.800 --> 01:32:23.200] But then as I'm looking at the language, first of all, it's 2015, and we know that people are increasingly becoming more secular.
[01:32:23.200 --> 01:32:24.880] That's like almost 10 years ago now.
[01:32:24.880 --> 01:32:27.520] Oh, I hate to say that out loud.
[01:32:27.920 --> 01:32:30.640] 2015, you joined the skeptics guide, I believe.
[01:32:30.800 --> 01:32:30.960] Yeah.
[01:32:32.480 --> 01:32:33.120] Yes.
[01:32:33.120 --> 01:32:38.000] So 41% of people believing that dinosaurs and humans lived at the same time.
[01:32:38.000 --> 01:32:40.800] That's like a blanket statement, but you worded it funny here.
[01:32:40.800 --> 01:32:43.920] You said 25% said that they definitely didn't.
[01:32:43.920 --> 01:32:51.040] I'm assuming because a lot of times with these polls, there's like multiple, like maybe, probably, I'm completely sure of it, or whatever.
[01:32:51.040 --> 01:32:58.960] And yeah, I wouldn't be surprised if only 25% of people felt confident in their answer, in their correct answer, sadly.
[01:32:58.960 --> 01:33:00.600] And a lot of people are religious, man.
[01:33:00.600 --> 01:33:02.120] A lot of people are fundamentalists.
[01:33:02.120 --> 01:33:03.960] It's scary.
[01:32:59.920 --> 01:33:07.160] That probably tracks with people believing in evolution, too.
[01:33:07.480 --> 01:33:08.600] So I'm going to go backward.
[01:33:08.600 --> 01:33:16.520] The NSF survey, also exactly 10 years ago now, had about a quarter of people saying that the sun revolves about the earth.
[01:33:16.520 --> 01:33:17.480] How British?
[01:33:17.800 --> 01:33:21.000] Around the Earth rather than the other way around.
[01:33:21.320 --> 01:33:23.160] That's so sad.
[01:33:23.160 --> 01:33:25.800] But that doesn't surprise me either.
[01:33:26.760 --> 01:33:27.880] So I guess that leaves the.
[01:33:28.040 --> 01:33:31.880] Maybe they revolve around each other's barycenter, but that's over there.
[01:33:32.200 --> 01:33:33.320] Is that a choice?
[01:33:33.320 --> 01:33:33.640] Probably.
[01:33:36.680 --> 01:33:41.400] So that leaves the National Science Board survey, which is much more modern.
[01:33:41.400 --> 01:33:43.480] This was only four years ago.
[01:33:43.720 --> 01:33:51.640] 68% of Americans, so an overwhelming majority, believing that all radioactivity is man-made.
[01:33:51.640 --> 01:33:58.440] I think that my guess would be that that number wouldn't be so high for one of two reasons.
[01:33:58.440 --> 01:33:59.880] I'm hedging my bets here.
[01:33:59.880 --> 01:34:09.080] One is that hopefully more people than that understand the electromagnetic spectrum, but that is, I'm not confident in that assumption.
[01:34:09.080 --> 01:34:14.040] So my other guess would be that it's a coin flip, and this feels too high to be a coin flip.
[01:34:14.040 --> 01:34:16.680] So I'm going to say that that's the fiction.
[01:34:16.680 --> 01:34:17.480] Okay, Bob.
[01:34:17.480 --> 01:34:20.280] So, Kara, you're saying it's number one, the 68%?
[01:34:20.520 --> 01:34:21.880] Yeah, the radioactivity.
[01:34:21.880 --> 01:34:22.280] Yeah.
[01:34:22.600 --> 01:34:25.240] Yeah, that does sound pretty high.
[01:34:25.240 --> 01:34:29.560] And Jay, I like your pick as well about the dinosaurs.
[01:34:29.560 --> 01:34:34.280] The key thing there, though, definitely did not is rubbing me the wrong way as well.
[01:34:34.280 --> 01:34:36.280] I mean, definitely did not.
[01:34:36.280 --> 01:34:40.840] Did they do one of those surveys where you had, you know, you had that much granularity?
[01:34:40.840 --> 01:34:41.720] Probably did not.
[01:34:41.720 --> 01:34:42.600] Definitely did not.
[01:34:42.600 --> 01:34:47.200] I mean, I don't know, that doesn't seem as likely as I had thought.
[01:34:47.200 --> 01:34:50.240] And the second one, sun and earth revolve.
[01:34:44.600 --> 01:34:52.080] Yeah, that one sounds legit.
[01:34:52.560 --> 01:34:58.480] Yeah, so I agree it's between the radioactivity and the dinosaurs.
[01:34:58.480 --> 01:35:03.200] Ah, yeah, I'll go with Karen and say the radioactivity and 68%.
[01:35:03.200 --> 01:35:04.240] I'll say that's fiction.
[01:35:04.240 --> 01:35:07.840] But any of these, pretty much any of these, none of them would shock me.
[01:35:07.840 --> 01:35:09.520] Okay, everyone's true.
[01:35:09.520 --> 01:35:12.560] You realize how horrific this science or fiction is, right?
[01:35:13.280 --> 01:35:15.040] Because two of these are correct.
[01:35:15.040 --> 01:35:15.520] Okay?
[01:35:15.520 --> 01:35:16.000] That's right.
[01:35:16.000 --> 01:35:19.040] One of them is fake, so it may even be worse because it depends.
[01:35:19.040 --> 01:35:26.560] Let's say, for example, 98% of Americans believe that all radioactivity is man-made, and that turns out to be the fiction, only in the worse direction.
[01:35:26.560 --> 01:35:27.760] This is a nightmare.
[01:35:27.760 --> 01:35:30.160] This science or fiction is horrible.
[01:35:30.160 --> 01:35:31.120] Right, Ev?
[01:35:31.680 --> 01:35:33.120] But I'll cut right to it.
[01:35:33.120 --> 01:35:37.040] And I am going to, sorry, Jay, I am going to agree with Kara and Bob.
[01:35:37.040 --> 01:35:46.000] And my only reason for this is I have seen before polling regarding the Earth and the Sun relationship.
[01:35:46.000 --> 01:35:49.120] I've seen polling before about dinosaurs and human relationships.
[01:35:49.120 --> 01:35:56.160] I've never seen a poll that I can remember off the top of my head about radioactivity and what they're asking people questions about that.
[01:35:58.080 --> 01:35:59.760] I have no recollection of it.
[01:35:59.760 --> 01:36:02.400] So that's why that one's rubbing me the wrong way.
[01:36:02.400 --> 01:36:04.560] Okay, so you all agree on number two, so we'll start there.
[01:36:04.560 --> 01:36:14.320] In the 2014 National Science Foundation survey, 26% of Americans stated that the sun revolves about the earth rather than the other way around.
[01:36:14.320 --> 01:36:19.600] You all think this one is science, and this one is science.
[01:36:19.600 --> 01:36:20.720] This one is science.
[01:36:21.200 --> 01:36:32.360] Yeah, and that figure, I looked around for other surveys, you know, rather than just going on one survey, I found figures anywhere between 18 and 28 percent, some even a little bit worse.
[01:36:32.680 --> 01:36:40.200] You know, 25, 26 percent, basically a quarter, I think, seems to be the most typical number there.
[01:36:40.200 --> 01:36:46.040] And that's just like the basic fact that the Earth goes around the sun.
[01:36:46.040 --> 01:36:49.320] Forget about like how long it takes for that to happen.
[01:36:49.320 --> 01:36:51.640] Fewer people knew that, right?
[01:36:53.560 --> 01:36:57.080] I know, it's pretty disappointing.
[01:36:57.080 --> 01:37:08.280] And I got it, these were all in the U.S., but the numbers for these kinds of questions are pretty similar across many countries.
[01:37:08.600 --> 01:37:09.800] We're not an outlier.
[01:37:09.800 --> 01:37:11.240] We are not an outlier.
[01:37:12.360 --> 01:37:20.600] The worst country for those surveys that looked at multiple different countries and multiple different questions, what do you think the worst one was of, like, big countries?
[01:37:20.600 --> 01:37:22.280] Well, which are we talking globally?
[01:37:22.280 --> 01:37:23.080] Are we talking just in Europe?
[01:37:23.320 --> 01:37:23.640] Globally.
[01:37:24.680 --> 01:37:26.680] Just highly religious countries.
[01:37:26.680 --> 01:37:34.440] So the highly religious countries have specific questions where they're far worse than other countries.
[01:37:34.440 --> 01:37:45.720] So India and the United States stick out for any question that is where evolution is, you know, belief in evolution versus creationism is an issue.
[01:37:45.720 --> 01:37:51.720] But just in general, like not creationism-related questions, the worst country is basically China.
[01:37:51.720 --> 01:37:52.200] Oh, interesting.
[01:37:52.440 --> 01:37:58.840] Yeah, they're like way off the average for a lot of other countries.
[01:37:58.840 --> 01:38:01.320] Do you think that's because of income inequality?
[01:38:01.320 --> 01:38:09.080] Because, I don't know, there's a stereotype in the West that Chinese schools are very strict, but also very science-based.
[01:38:09.080 --> 01:38:11.240] But that's probably only for rich kids.
[01:38:11.240 --> 01:38:11.880] Yeah.
[01:38:11.880 --> 01:38:14.960] And Russia does very badly too, by the way.
[01:38:14.960 --> 01:38:17.040] Malaysia does very poorly.
[01:38:14.680 --> 01:38:20.240] Japan does very well in most things.
[01:38:20.720 --> 01:38:22.240] But again, there's other ones where, like, really?
[01:38:22.240 --> 01:38:22.720] They got.
[01:38:23.040 --> 01:38:26.400] Anyway, we can go over that a little bit more detail when I go over the rest of the questions.
[01:38:26.400 --> 01:38:28.640] But it's, yeah, the U.S.
[01:38:28.640 --> 01:38:35.280] wasn't generally an outlier, but whenever there was a question that was clearly evolution-based, we were.
[01:38:35.600 --> 01:38:36.960] So let's go back to number one.
[01:38:36.960 --> 01:38:45.040] A 2020 National Science Board survey found that 68% of Americans believe that all radioactivity is man-made.
[01:38:45.280 --> 01:38:48.400] Bob, Evan, and Kara think this one is the fiction.
[01:38:48.400 --> 01:38:50.000] Jay, you think this one is science?
[01:38:50.000 --> 01:38:54.240] And this one is the fiction.
[01:38:54.400 --> 01:38:55.360] I flipped it.
[01:38:55.360 --> 01:38:58.000] 68% got that question correct.
[01:38:58.640 --> 01:39:03.200] And so then 32% believed it, which is still high.
[01:39:03.600 --> 01:39:06.320] But 32% think that all radioactivity is man-made.
[01:39:06.320 --> 01:39:12.720] If you want to look at other countries on this question, to give you an example, in Canada, 72% got that question right.
[01:39:12.720 --> 01:39:14.560] China, 41%.
[01:39:14.560 --> 01:39:16.640] The EU, 59%.
[01:39:16.640 --> 01:39:18.320] Israel, 76%.
[01:39:18.320 --> 01:39:19.680] Japan, 64%.
[01:39:19.680 --> 01:39:21.040] Malaysia, 20.
[01:39:21.040 --> 01:39:22.880] Russia, 35%.
[01:39:22.880 --> 01:39:25.440] South Korea, 48.
[01:39:25.440 --> 01:39:28.800] So, yeah, we're actually, other than Canada, the U.S.
[01:39:28.800 --> 01:39:31.440] was better than most other countries there.
[01:39:31.440 --> 01:39:33.760] So that one is interesting, too.
[01:39:34.000 --> 01:39:35.200] It's not one we talk about a lot.
[01:39:35.520 --> 01:39:39.600] That misconception is not one that, you know, I think we've ever discussed.
[01:39:40.560 --> 01:39:42.080] It's a recollection of having read this.
[01:39:42.240 --> 01:39:43.760] But yeah, you're right, though, Jay.
[01:39:43.760 --> 01:39:50.080] It's like, yeah, you could see how, because we talk about it in the context of nuclear power and nuclear weapons and whatever.
[01:39:50.080 --> 01:39:54.960] And they just may not realize that, yeah, uranium is an ore and it's radioactive, you know.
[01:39:54.960 --> 01:39:56.560] Or the banana is radioactive.
[01:39:56.640 --> 01:39:56.800] Right.
[01:39:58.160 --> 01:39:58.720] I love the radioactive activity.
[01:39:58.880 --> 01:40:01.080] Don't even go there.
[01:39:59.600 --> 01:40:02.520] All right, but let's go to number three.
[01:40:03.080 --> 01:40:12.040] A 2015 YouGov online survey found that 41% of people believe dinosaurs and humans lived at the same time, while only 25% answered that they definitely did not.
[01:40:12.040 --> 01:40:13.640] That, of course, is science.
[01:40:13.640 --> 01:40:17.960] And yeah, I figured, like, if I just had the 41%, you go, that's the creationist, right?
[01:40:17.960 --> 01:40:18.680] And it's true.
[01:40:18.920 --> 01:40:22.040] That's 100% what that 41% figure is from.
[01:40:22.040 --> 01:40:24.760] That's why I included the 25%.
[01:40:24.760 --> 01:40:30.200] Only 25% were confident that we didn't live at the same time as dinosaurs.
[01:40:30.280 --> 01:40:33.480] That's way higher than the creationist population in this country.
[01:40:33.800 --> 01:40:40.440] And it was a definitely did, probably did, probably did not, definitely did not, I don't know, right?
[01:40:40.440 --> 01:40:41.880] Those are the five choices.
[01:40:42.360 --> 01:40:47.320] Only 25% were confident in saying we definitely did not live at the same time with dinosaurs.
[01:40:47.320 --> 01:40:49.800] So, yeah, that's pretty bad.
[01:40:49.800 --> 01:40:53.880] So, we still have a lot of work to do in terms of just basic scientific literacy.
[01:40:54.600 --> 01:40:56.360] Other questions that are typical?
[01:40:56.360 --> 01:41:03.480] And I have done this when I've given seminars before, and I'm going to do this when I'm in Dubai for a skeptical conference.
[01:41:03.880 --> 01:41:06.840] There, I have my scientific literacy,
[01:41:06.840 --> 01:41:12.200] critical thinking literacy, and media literacy sort of questionnaire that I've been giving out.
[01:41:12.200 --> 01:41:18.040] I'll do it sort of as a pre-test, you know, and then we can talk about the end of the seminar.
[01:41:18.280 --> 01:41:24.600] And I have some of the same questions in mine: like, well, here's one: electrons are smaller than atoms.
[01:41:24.840 --> 01:41:38.840] That ranged across different nations, from China up to the highest figure, 60%, in Israel; it was 46% in the U.S. Less than half of people know that electrons are smaller than atoms.
[01:41:38.840 --> 01:41:39.760] That's crazy, man.
[01:41:39.880 --> 01:41:41.080] Yeah, it's bad.
[01:41:41.320 --> 01:41:42.680] I mean, all right.
[01:41:43.720 --> 01:41:46.960] I mean, the universe began with a huge explosion.
[01:41:46.960 --> 01:41:47.520] What do you think?
[01:41:47.520 --> 01:41:49.360] How many people got that one correct?
[01:41:49.520 --> 01:41:50.800] 55.
[01:41:50.800 --> 01:41:51.760] Yeah, coin flip.
[01:41:51.920 --> 01:41:52.720] 38.
[01:41:53.920 --> 01:41:59.600] But it was higher in most other developed nations, because I think that's a creationism one; they don't believe in the Big Bang.
[01:41:59.920 --> 01:42:00.480] Right.
[01:42:00.480 --> 01:42:00.960] Yeah.
[01:42:00.960 --> 01:42:01.520] Yeah.
[01:42:02.480 --> 01:42:03.200] Here's one again.
[01:42:03.200 --> 01:42:08.800] Again, it's not one that you commonly think of as a source of misconception.
[01:42:08.800 --> 01:42:11.600] The center of the earth is very hot.
[01:42:12.000 --> 01:42:14.080] What percentage of people said that was true?
[01:42:14.080 --> 01:42:14.720] 60.
[01:42:14.720 --> 01:42:15.200] 50?
[01:42:16.160 --> 01:42:16.960] 86.
[01:42:16.960 --> 01:42:19.200] So that was how it was still opposite.
[01:42:19.520 --> 01:42:20.480] I got the investigation.
[01:42:20.720 --> 01:42:24.480] 14% of people didn't know the answer to that question.
[01:42:24.480 --> 01:42:26.400] And that was one of the highest.
[01:42:26.400 --> 01:42:27.920] I mean, it's just across the board, though.
[01:42:27.920 --> 01:42:29.440] 47% in China.
[01:42:29.440 --> 01:42:31.040] Again, that's kind of an outlier.
[01:42:31.200 --> 01:42:32.080] Something's up.
[01:42:32.560 --> 01:42:33.120] Wow.
[01:42:33.440 --> 01:42:38.400] And I'm not surprised by the number, like when the numbers are really high, where somebody says, I just don't know.
[01:42:38.400 --> 01:42:38.800] Yeah.
[01:42:38.800 --> 01:42:40.480] Like, I just never, I don't know.
[01:42:40.480 --> 01:42:45.920] But when somebody confidently has the wrong answer, those are pernicious beliefs.
[01:42:45.920 --> 01:42:55.440] But those do seem to be, not always, but often related to other strong beliefs, like religious beliefs or magical thinking or whatever.
[01:42:55.440 --> 01:42:57.600] All right, Evan, give us a quote.
[01:42:57.600 --> 01:43:06.800] Auditors and journalists and scientists are supposed to be trained in critical thinking, but they are subject to the same sorts of biases that we all have.
[01:43:06.800 --> 01:43:14.960] And the fact that we get some training about this doesn't necessarily immunize us against all the ways in which we can make mistakes.
[01:43:15.280 --> 01:43:28.560] And that was said by Daniel Simons, who's an experimental psychologist, cognitive scientist, and professor in the Department of Psychology and the Beckman Institute for Advanced Science and Technology at the University of Illinois.
[01:43:28.560 --> 01:43:34.440] And we will be meeting Daniel in Las Vegas this coming October because he'll be at CSICon 2024.
[01:43:34.440 --> 01:43:37.240] Yeah, that's probably the weekend this episode's going to air.
[01:43:37.240 --> 01:43:37.960] So now.
[01:43:38.280 --> 01:43:44.600] Hey, we're probably at Vegas right now if you're listening to this right as it gets downloaded.
[01:43:44.600 --> 01:43:45.000] All right.
[01:43:45.000 --> 01:43:51.400] Well, thank you guys for joining me for this out-of-sequence time traveling episode of the SGU.
[01:43:51.400 --> 01:43:51.800] Roger.
[01:43:51.880 --> 01:43:53.800] Hey, it'll be good to be with you.
[01:43:54.600 --> 01:43:59.560] And until next week, this is your Skeptic's Guide to the Universe.
[01:44:02.200 --> 01:44:08.840] Skeptics Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking.
[01:44:08.840 --> 01:44:13.480] For more information, visit us at theskepticsguide.org.
[01:44:13.480 --> 01:44:17.400] Send your questions to info at the skepticsguide.org.
[01:44:17.400 --> 01:44:28.120] And if you would like to support the show and all the work that we do, go to patreon.com/slash skepticsguide and consider becoming a patron and becoming part of the SGU community.
[01:44:28.120 --> 01:44:31.640] Our listeners and supporters are what make SGU possible.
[01:44:39.240 --> 01:44:43.000] At Azure Standard, we believe healthy food shouldn't be complicated.
[01:44:43.000 --> 01:44:51.640] That's why we've spent 30 years delivering clean organic groceries and earth-friendly products to families who care about what goes on their plates and into their lives.
[01:44:51.640 --> 01:45:00.600] From pantry staples to wellness essentials and garden-ready seeds, everything we offer is rooted in real nutrition, transparency, and trust.
[01:45:00.600 --> 01:45:03.560] Join a community that believes in better naturally.
[01:45:03.560 --> 01:45:09.720] Visit AzureStandard.com today and discover a simpler, healthier way to shop for the things that matter most.
Prompt 6: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 7: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
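To make the "never greater than the total length of the audio" rule concrete, here is a small sketch of how a validator might check each segment's START timestamp (Python; the helper names are hypothetical and only assume the JSON shape shown above):

import json
import re

TS_RE = re.compile(r"^\d{2}:[0-5]\d:[0-5]\d$")  # HH:MM:SS, e.g. "01:22:45"

def ts_to_seconds(ts):
    # Convert an "HH:MM:SS" timestamp to a total number of seconds.
    if not TS_RE.match(ts):
        raise ValueError("bad timestamp format: %r" % ts)
    h, m, s = (int(part) for part in ts.split(":"))
    return h * 3600 + m * 60 + s

def check_segments(response_text, audio_length_seconds):
    # Reject any segment whose START timestamp falls past the end of the audio.
    segments = json.loads(response_text)["segments"]
    for seg in segments:
        if ts_to_seconds(seg["timestamp"]) > audio_length_seconds:
            raise ValueError("segment %r starts past the end of the audio" % seg["segment_title"])
    return segments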
Prompt 8: Media Mentions
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
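As an illustration of the "use the EARLIER of the 2 timestamps in the range" rule, a caption cue such as 01:13:42.520 --> 01:13:46.720 would be reported as 01:13:42. A rough sketch of that conversion (an assumed helper, not the pipeline's actual code):

import re

CUE_RE = re.compile(r"(\d{2}:\d{2}:\d{2})\.\d{3}\s*-->\s*\d{2}:\d{2}:\d{2}\.\d{3}")

def mention_timestamp(cue):
    # Given a caption cue range like "[01:13:42.520 --> 01:13:46.720]",
    # return the EARLIER endpoint truncated to whole seconds: "01:13:42".
    m = CUE_RE.search(cue)
    if not m:
        raise ValueError("not a VTT cue range: %r" % cue)
    return m.group(1)

# mention_timestamp("[01:13:42.520 --> 01:13:46.720]")  ->  "01:13:42"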
Full Transcript
[00:00:00.800 --> 00:00:06.880] 10 years from today, Lisa Schneider will trade in her office job to become the leader of a pack of dogs.
[00:00:06.880 --> 00:00:09.600] As the owner of her own dog rescue, that is.
[00:00:09.600 --> 00:00:17.280] A second act made possible by the reskilling courses Lisa's taking now with AARP to help make sure her income lives as long as she does.
[00:00:17.280 --> 00:00:22.240] And she can finally run with the big dogs and the small dogs who just think they're big dogs.
[00:00:22.240 --> 00:00:25.920] That's why the younger you are, the more you need AARP.
[00:00:25.920 --> 00:00:29.680] Learn more at aarp.org/slash skills.
[00:00:33.200 --> 00:00:36.480] You're listening to the Skeptic's Guide to the Universe.
[00:00:36.480 --> 00:00:39.360] Your escape to reality.
[00:00:40.000 --> 00:00:42.720] Hello, and welcome to The Skeptic's Guide to the Universe.
[00:00:42.720 --> 00:00:47.920] Today is Wednesday, September 4th, 2024, and this is your host, Stephen Novella.
[00:00:47.920 --> 00:00:49.600] Joining me this week are Bob Novella.
[00:00:49.600 --> 00:00:50.240] Hey, everybody.
[00:00:50.240 --> 00:00:51.600] Kara Santa Maria.
[00:00:51.600 --> 00:00:52.080] Howdy.
[00:00:52.080 --> 00:00:53.040] Jay Novella.
[00:00:53.040 --> 00:00:55.360] Hey, guys, and Evan Bernstein.
[00:00:55.360 --> 00:00:56.480] Good evening, everyone.
[00:00:56.480 --> 00:00:59.040] Don't worry, this is the correct episode you're looking for.
[00:00:59.200 --> 00:01:10.000] We're recording this on September 4th because we have to put an extra episode in the can, as we like to say, for when we are going to SaiCon at the end of October.
[00:01:10.000 --> 00:01:15.280] So, yes, this is the latest episode if you're downloading in October, even though we're recording it so much earlier.
[00:01:15.280 --> 00:01:23.120] And we tried to make this episode a little bit evergreen, so it's not going to be all like dated news items by the time it comes out.
[00:01:23.120 --> 00:01:24.480] That's the magic of podcasting.
[00:01:24.480 --> 00:01:25.680] It involves some time travel.
[00:01:25.760 --> 00:01:26.960] A lot of time travel.
[00:01:26.960 --> 00:01:33.040] So, I was showing Jay this earlier today, but this is the first time this has happened in my yard.
[00:01:33.200 --> 00:01:41.280] My wife and I noticed that there are these plants growing in our front yard, you know, but by where we have, by where we have like the shrubs, you know, by the porch in the front of the house.
[00:01:41.280 --> 00:01:45.120] And they look like pumpkin plants, which you grow in our garden.
[00:01:45.120 --> 00:01:47.200] So, like, okay, we didn't plan plant anything.
[00:01:47.200 --> 00:01:50.320] I didn't, you know, that's that's there's not, I don't plant anything there every year.
[00:01:50.320 --> 00:01:52.160] That's just where all the hedges and stuff are.
[00:01:52.160 --> 00:01:54.800] So, one of the plants has two pumpkins growing on it.
[00:01:54.800 --> 00:01:57.040] So, it's clear, it's clearly a pumpkin plant.
[00:01:57.040 --> 00:02:04.280] The other one, I didn't see anything growing on it, but now it has about six or seven of those gooseneck gourds.
[00:02:04.440 --> 00:02:06.360] You ever see those long neck gourds?
[00:02:06.360 --> 00:02:07.800] Yeah, sure, growing on it.
[00:02:07.800 --> 00:02:12.360] You can't really tell from the plant itself because they all look pretty much the same, all the squashy plants.
[00:02:12.360 --> 00:02:15.080] Yeah, so the question is, how did they get there?
[00:02:15.080 --> 00:02:17.480] So, do you grow both of those in your backyard?
[00:02:17.480 --> 00:02:18.520] No, only pumpkins.
[00:02:18.760 --> 00:02:20.760] We do grow pumpkins in the backyard.
[00:02:20.760 --> 00:02:22.760] We do not in the garden, you know.
[00:02:22.760 --> 00:02:31.800] We don't, I've never grown gourds, you know, in my garden, but we do buy them and have them in the over Halloween.
[00:02:31.800 --> 00:02:34.360] You know, we might have a couple of gourds somewhere.
[00:02:34.360 --> 00:02:38.120] All right, so if they were not deliberately planted, then what?
[00:02:38.120 --> 00:02:41.720] The wind picked up the could they have been pooped out?
[00:02:41.720 --> 00:02:42.840] Yeah, that's what I was wondering.
[00:02:43.000 --> 00:02:43.720] I think Bob's right.
[00:02:44.200 --> 00:02:46.920] Yeah, I was gonna say, have animals been eating your pumpkins.
[00:02:46.920 --> 00:02:47.720] Well, I don't know.
[00:02:47.720 --> 00:02:58.280] I mean, so we so they're out for Halloween and a little after, and then you know, a few days after Halloween, Halloween, everything goes into the compost.
[00:02:58.280 --> 00:03:09.560] Ah, so yeah, what I figure is that some animals ate, we were eating it in the compost, and then they pooped it in my front yard for some reason, or maybe that's just the place that doesn't get mowed.
[00:03:09.560 --> 00:03:17.480] You know, they probably would have been elsewhere, but that's like the well, because there's no food otherwise there, so you know, and the old saying is you don't go where you eat.
[00:03:17.480 --> 00:03:25.560] So, well, I'm just saying that maybe they might have been all over the place, but we mow everywhere else, you know, so it would have cut down those plants as they were coming up.
[00:03:25.560 --> 00:03:28.120] But here, we don't mow, obviously, it's just mulched.
[00:03:28.120 --> 00:03:28.760] So, it's interesting.
[00:03:28.760 --> 00:03:31.400] So, I got some free pumpkins and gourds growing in my front yard.
[00:03:31.400 --> 00:03:34.440] Does that ever happen to you where like a plant like that just starts to grow?
[00:03:34.440 --> 00:03:38.120] Yeah, I mean, I think there's like weed growing all over LA.
[00:03:38.360 --> 00:03:39.280] Yeah, it's always funny.
[00:03:39.760 --> 00:03:40.920] For real, yeah.
[00:03:40.920 --> 00:03:45.280] Like, it's always funny when you like see a little bit growing because somebody like tossed some.
[00:03:44.840 --> 00:03:48.000] It used to happen more.
[00:03:48.160 --> 00:03:53.920] Now people, I think, mostly are smoking like these crazy hydroponic strains that don't even have seeds.
[00:03:53.920 --> 00:04:00.080] But like, yeah, it definitely used to happen back in the day when people just like toss seeds and bags like on the side of the road.
[00:04:00.080 --> 00:04:00.640] Oh, interesting.
[00:04:00.640 --> 00:04:01.200] Yeah, so just like that.
[00:04:01.520 --> 00:04:02.640] Plants grow, you know?
[00:04:02.960 --> 00:04:03.360] Yeah.
[00:04:03.360 --> 00:04:04.080] They do it.
[00:04:04.080 --> 00:04:05.120] They find a way.
[00:04:05.120 --> 00:04:07.200] Yeah, especially if it's a place that you like.
[00:04:07.200 --> 00:04:10.080] You know, that's why it's good to plant native plants.
[00:04:10.800 --> 00:04:13.920] Because they're just good at growing in your environment.
[00:04:13.920 --> 00:04:14.720] Oh, absolutely.
[00:04:14.720 --> 00:04:18.160] And beware, invasive species can wreak havoc.
[00:04:18.160 --> 00:04:22.000] What's that one in New York that they tell you you can't even brush up against?
[00:04:22.000 --> 00:04:24.320] You can't even get a blade of this stuff on you.
[00:04:24.560 --> 00:04:25.360] It'll destroy you.
[00:04:25.840 --> 00:04:26.560] Hogsweed?
[00:04:26.560 --> 00:04:27.520] Is that what it's called?
[00:04:27.520 --> 00:04:29.360] I know in the south it's kudzu, right?
[00:04:29.360 --> 00:04:31.920] That's their invasive plant that's taking over.
[00:04:31.920 --> 00:04:32.480] Oh, I don't know.
[00:04:32.720 --> 00:04:34.080] Giant hogweed.
[00:04:34.400 --> 00:04:36.960] Yeah, this grows in New York State.
[00:04:37.280 --> 00:04:45.200] It's not native, but now it's all over the middle part of the state, and it'll mess you up.
[00:04:45.200 --> 00:04:46.880] It'll give you terrible rashes.
[00:04:46.880 --> 00:04:47.840] It'll make you sick.
[00:04:48.800 --> 00:04:49.760] And it's huge.
[00:04:49.760 --> 00:04:55.440] And you can't, you know, you go to cut it down, a little bit of its juices or anything gets onto you.
[00:04:55.840 --> 00:04:57.360] You wind up being in trouble.
[00:04:57.360 --> 00:05:02.640] Yeah, I know my friends who live in Oregon, everyone there has a problem with invasive blackberries.
[00:05:02.640 --> 00:05:04.400] They just grow like weeds.
[00:05:04.800 --> 00:05:07.520] They were an ag plant that ran away.
[00:05:07.520 --> 00:05:09.360] They could not get it under control.
[00:05:09.360 --> 00:05:10.480] And it's brutal.
[00:05:10.480 --> 00:05:12.320] It just takes over people's backyards.
[00:05:12.320 --> 00:05:14.400] Yeah, we have that in our backyard as well.
[00:05:14.400 --> 00:05:17.680] Not in the yard, but like, in the woods behind the yard.
[00:05:17.680 --> 00:05:20.400] And I have to whack that back every year.
[00:05:20.400 --> 00:05:23.520] Yeah, they say you've got to pull it by the root, but it's really hard to do.
[00:05:23.520 --> 00:05:23.760] It is.
[00:05:24.400 --> 00:05:27.760] Yeah, you've got to pull it up by the root, but the roots are runners, you know?
[00:05:27.760 --> 00:05:30.360] And so you never get them totally up.
[00:05:31.480 --> 00:05:32.840] At some point, it's going to break.
[00:05:33.480 --> 00:05:34.440] So it's hard to get it.
[00:05:29.840 --> 00:05:36.760] But I try to get as much of the roots as I can.
[00:05:37.080 --> 00:05:39.400] Yeah, because it's really like brambly, right?
[00:05:39.400 --> 00:05:40.280] And kind of painful.
[00:05:40.280 --> 00:05:42.680] And you need, I don't know, thick leather gloves.
[00:05:42.680 --> 00:05:43.000] Yeah.
[00:05:43.080 --> 00:05:46.600] Pull it up by the roots, but you know, as far back as I can.
[00:05:46.600 --> 00:05:49.560] And it keeps it under control, but I got to keep on top of it.
[00:05:49.560 --> 00:05:53.480] If I didn't, if I go like a couple years without doing that, it would completely take over.
[00:05:53.480 --> 00:05:56.040] I'm so glad I don't have a yard.
[00:05:56.360 --> 00:05:57.560] It's so easy.
[00:05:57.800 --> 00:05:59.800] The upkeep of my house is so easy.
[00:05:59.800 --> 00:06:04.600] That said, I got a packet of wildflower seeds recently, so I planted them in a pot and put them in my window.
[00:06:04.600 --> 00:06:06.440] So I have wildflowers growing in a pot.
[00:06:06.440 --> 00:06:06.920] That's nice.
[00:06:06.920 --> 00:06:07.400] It's cheap.
[00:06:08.040 --> 00:06:11.640] On my patio, I have some hooks where I hang some plants I buy each summer.
[00:06:11.640 --> 00:06:13.160] You know, ones that flower.
[00:06:13.400 --> 00:06:15.560] Last year, I bought one particular variety.
[00:06:15.560 --> 00:06:17.320] I have no idea what the heck it was.
[00:06:17.320 --> 00:06:24.840] But what happened this spring is to the, what borders my patio is a rock garden, right?
[00:06:24.840 --> 00:06:25.640] Just rocks.
[00:06:26.040 --> 00:06:27.080] Nothing else in there.
[00:06:27.400 --> 00:06:33.560] But those flowers from the plant from last year started to spontaneously, or not spontaneously, but they grew.
[00:06:34.120 --> 00:06:38.680] The seeds or whatever fell down, found their way down there, and there they are.
[00:06:38.840 --> 00:06:45.240] The little yellow flowers came up all spring and summer, just randomly throughout the rock patch.
[00:06:45.240 --> 00:06:46.120] It looks beautiful.
[00:06:46.120 --> 00:06:46.760] Wow.
[00:06:46.760 --> 00:06:51.880] Do you remember when we were in Texas and we were watching the eclipse in the field?
[00:06:52.200 --> 00:06:52.680] No.
[00:06:53.560 --> 00:06:55.160] I know it's like we were all kind of focused.
[00:06:55.160 --> 00:07:02.920] We were looking up, but when you looked down, and this is like, I guess, a Texas initiative, there are wildflowers everywhere.
[00:07:02.920 --> 00:07:03.560] Yeah, we noticed that.
[00:07:03.560 --> 00:07:04.400] The blue yellows are.
[00:07:04.520 --> 00:07:05.640] They're beautiful.
[00:07:05.640 --> 00:07:06.760] Yeah, they're gorgeous.
[00:07:06.760 --> 00:07:09.800] Yeah, just little, and the kids were like picking them and making bouquets and stuff.
[00:07:10.120 --> 00:07:10.760] Yeah, they're everywhere.
[00:07:10.760 --> 00:07:12.760] And there's bees and butterflies.
[00:07:12.760 --> 00:07:13.520] I love that.
[00:07:13.160 --> 00:07:13.760] Love that.
[00:07:14.200 --> 00:07:18.320] We have a butterfly bush in our backyard; I bought one and planted it back there.
[00:07:18.560 --> 00:07:25.760] But now I have three butterfly bushes because it just spontaneously propagated, just started growing elsewhere.
[00:07:25.760 --> 00:07:33.520] Again, not where we mow, but basically along the edge of my property where we don't mow, where we just have, you know, bushes and stuff.
[00:07:33.840 --> 00:07:45.840] Steve, it's funny you say that because I got Mom a butterfly bush last year, and we were looking at it, and five feet away from it there's, like, a stone patio on the ground.
[00:07:46.000 --> 00:07:54.320] So it's a stone patio, and growing between two stones is exactly one little butterfly bush with a beautiful bloom.
[00:07:54.560 --> 00:07:56.320] Yeah, with a peg.
[00:07:56.640 --> 00:07:57.120] What's that?
[00:07:57.120 --> 00:07:58.480] By the way, what's a butterfly bush?
[00:07:58.480 --> 00:08:00.160] Is that milkweed or is it no?
[00:08:00.160 --> 00:08:01.840] It's called butterfly bush.
[00:08:01.840 --> 00:08:02.000] Oh.
[00:08:02.480 --> 00:08:06.640] I'm sure it has a technical name, but that's the common name of it.
[00:08:06.640 --> 00:08:09.280] And it's, it, you know, it really attracts butterflies.
[00:08:09.280 --> 00:08:09.680] Oh, nice.
[00:08:09.680 --> 00:08:12.080] So it's after they're already butterflies.
[00:08:12.080 --> 00:08:16.480] It's also really good to plant milkweed if you want to provide habitat.
[00:08:16.640 --> 00:08:21.920] A lot of that in California, they have whole swaths of nature staked out just for the milkweed.
[00:08:22.480 --> 00:08:24.320] So the butterflies can make their trip, right?
[00:08:24.560 --> 00:08:31.200] But word of caution, if you're going to buy milkweed and plant it in your yard or whatever, make sure it's the right local native.
[00:08:31.440 --> 00:08:37.520] Because you can buy ones that the monarch butterflies do not use.
[00:08:37.520 --> 00:08:37.840] Yeah.
[00:08:38.160 --> 00:08:39.040] But it is cool.
[00:08:39.040 --> 00:08:42.720] Like I had some on my patio when I lived in Florida.
[00:08:42.720 --> 00:08:47.280] And just I remember looking one day and it was like, I don't know, 12 caterpillars.
[00:08:47.600 --> 00:08:48.880] Because that's what they eat.
[00:08:48.880 --> 00:08:50.560] And then they pupate right there on it.
[00:08:50.560 --> 00:08:53.360] It's so cool to watch the whole process.
[00:08:53.360 --> 00:08:54.400] It is, yeah.
[00:08:54.720 --> 00:08:56.880] Yeah, we had a lot of caterpillar action going on.
[00:08:56.880 --> 00:09:00.600] We had these black and yellow caterpillars in our herb garden.
[00:09:00.600 --> 00:09:02.200] We sort of left them there.
[00:08:59.920 --> 00:09:04.360] Well, monarch caterpillars are black and yellow.
[00:09:04.600 --> 00:09:07.640] Yeah, I think these were swallows.
[00:09:07.800 --> 00:09:08.920] Those are so pretty.
[00:09:08.920 --> 00:09:10.200] I love swallowtails.
[00:09:10.200 --> 00:09:11.000] Swallops?
[00:09:11.000 --> 00:09:11.800] Swallops?
[00:09:11.800 --> 00:09:12.680] Swallops?
[00:09:13.560 --> 00:09:17.000] All right, Bob, you're going to start us off with a quickie.
[00:09:17.000 --> 00:09:17.960] Thank you, Steve.
[00:09:17.960 --> 00:09:20.840] Predicting earthquakes in the news.
[00:09:20.840 --> 00:09:35.800] Researchers recently used machine learning techniques to examine data from two recent earthquakes, and they believe it might be possible, actually possible maybe, to detect major earthquakes months, like three months in advance.
[00:09:35.800 --> 00:09:36.440] What?
[00:09:36.440 --> 00:09:38.360] Yeah, that was a great way to get it.
[00:09:39.080 --> 00:09:40.120] Oh, my gosh, that can't be.
[00:09:40.280 --> 00:09:55.080] Well, the two earthquakes they looked at were the 2018 magnitude 7.1 Anchorage earthquake and the 2019 Ridgecrest, California earthquake, which apparently went from a magnitude 6.4 to 7.1.
[00:09:55.080 --> 00:10:01.560] For both, they found unusual low-level seismicity for three months before the big quakes hit.
[00:10:01.560 --> 00:10:08.200] And by low-level, we're talking about, say, a magnitude 1.5 or below, which is something you wouldn't even notice.
[00:10:08.200 --> 00:10:15.880] Because, I mean, I think magnitude three is the threshold for somebody to just notice it walking around.
[00:10:16.040 --> 00:10:18.520] So this is 1.5, so you're not even going to notice that.
[00:10:18.680 --> 00:10:24.600] They even have a theory about what caused these little warning rumbles for months that they detected.
[00:10:24.840 --> 00:10:29.960] They think it might be a dramatic increase in pore fluid pressure within a fault.
[00:10:29.960 --> 00:10:31.400] So, okay, there's that.
[00:10:31.640 --> 00:10:45.760] The researchers said our paper demonstrates that advanced statistical techniques, particularly machine learning, have the potential to identify precursors to large magnitude earthquakes by analyzing data sets derived from earthquake catalogs.
[00:10:45.760 --> 00:10:57.520] So the idea is that you can go through these catalogs of seismic data and train the machine learning on the catalog, so then it'll be optimized for that area.
[00:10:57.520 --> 00:11:07.440] So they also say modern seismic networks produce enormous data sets that, when properly analyzed, can offer valuable insights into the precursors of seismic events.
[00:11:07.440 --> 00:11:18.080] This is where advancements in machine learning and high-performance computing can play a transformative role, enabling researchers to identify meaningful patterns that could signal an impending earthquake.
[00:11:18.080 --> 00:11:26.160] So remember, though, it's important: before attempting this in a new region, they say the system would have to be trained on the area's historical seismicity.
[00:11:26.160 --> 00:11:33.760] So you're not going to bring the system to a new area and just start cranking out some information of when, you know, if something's coming.
[00:11:34.000 --> 00:11:37.680] It's got to become well-versed, if you will, in the area itself.
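A rough sketch, in Python, of the kind of approach being described here; this is not the researchers' actual pipeline, and the feature names and training data below are made up for illustration. The idea is to train a classifier on windows of a region's historical earthquake catalog and then score the most recent window of low-level seismicity.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)

    # Hypothetical features per 30-day window of a regional catalog:
    # [rate of M<1.5 events, mean magnitude, mean depth, b-value estimate]
    X_train = rng.normal(size=(500, 4))      # stand-in for historical windows
    y_train = rng.integers(0, 2, size=500)   # 1 = window preceded a major quake

    # Train on that region's historical seismicity, as the authors emphasize.
    model = RandomForestClassifier(n_estimators=200, random_state=0)
    model.fit(X_train, y_train)

    # Score the most recent window of low-level seismicity for the same region.
    current_window = rng.normal(size=(1, 4))
    print("precursor probability:", model.predict_proba(current_window)[0, 1])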
[00:11:37.680 --> 00:11:40.880] So, but still, imagine three months' warning.
[00:11:40.880 --> 00:11:43.920] The potential to save lives is dramatic.
[00:11:43.920 --> 00:11:47.920] If only they had this technology in that fun movie, Dante's Peak.
[00:11:47.920 --> 00:11:54.480] That annoying grandmother and the cocky scientist might not have died, which would have been, yeah, I didn't care for them.
[00:11:54.480 --> 00:11:55.680] Anyway, spoiler alert.
[00:11:55.840 --> 00:11:58.320] So, yes, this has been your Quickie with Bob.
[00:11:58.320 --> 00:11:59.280] Back to you, Steve.
[00:11:59.280 --> 00:12:02.320] Doesn't the cocky scientist die in every disaster movie?
[00:12:02.320 --> 00:12:03.760] Isn't that like the thing?
[00:12:04.720 --> 00:12:09.600] They make the character unlikable, and that way when they die, it's like, all right.
[00:12:10.320 --> 00:12:11.360] That's the formula.
[00:12:11.360 --> 00:12:12.400] It's not going to happen.
[00:12:12.400 --> 00:12:13.200] Don't worry about it.
[00:12:13.200 --> 00:12:13.920] That guy's dead.
[00:12:13.920 --> 00:12:16.080] You know that guy's going to die.
[00:12:16.400 --> 00:12:16.880] Right?
[00:12:16.880 --> 00:12:19.360] Yeah, he wasn't nearly as annoying as the grandmother.
[00:12:19.360 --> 00:12:20.480] She was really annoying.
[00:12:20.480 --> 00:12:22.480] Yeah, this volcano's going to erupt.
[00:12:22.480 --> 00:12:26.480] There's really no doubt I'm going to stay on the mountain in my cabin.
[00:12:26.480 --> 00:12:31.720] And of course, her goofy little grandkids run up and save her and put everyone at risk.
[00:12:31.720 --> 00:12:32.520] But whatever.
[00:12:32.520 --> 00:12:33.240] Still a fun movie.
[00:12:33.240 --> 00:12:33.640] I like it.
[00:12:33.640 --> 00:12:35.080] And this technology sounds pretty cool.
[00:12:35.480 --> 00:12:37.160] I hope this is really a thing.
[00:12:37.160 --> 00:12:38.040] That'd be sweet.
[00:12:38.040 --> 00:12:38.680] Three months.
[00:12:38.680 --> 00:12:39.480] Yeah, but what are we going to do?
[00:12:39.480 --> 00:12:44.040] Like, if they say, all right, California's going to be hit by a big earthquake in three months, are they going to vacate it?
[00:12:44.680 --> 00:12:49.480] No, but we can start making sure that anything that's not up to code is reinforced.
[00:12:49.480 --> 00:12:56.600] If you're in a home that's in, like, landslide territory, you know, I'm sure that there's things you can do to prepare.
[00:12:56.600 --> 00:13:01.320] And also just homeowners, if you know an earthquake is coming, you can take things off the shelves.
[00:13:01.320 --> 00:13:03.320] You can prevent things from breaking.
[00:13:03.640 --> 00:13:04.840] Yeah, go on vacation.
[00:13:05.400 --> 00:13:05.880] That too.
[00:13:05.880 --> 00:13:06.520] Don't be there.
[00:13:06.520 --> 00:13:11.240] It's kind of like when hurricanes are coming in in Florida, there's little things.
[00:13:11.240 --> 00:13:13.480] Board up your windows, take in your patio furniture.
[00:13:13.480 --> 00:13:14.520] It can really help.
[00:13:14.520 --> 00:13:24.040] Yeah, they were saying they were giving confidence levels at, like, 85%, you know, an 85% confidence level, which is pretty good.
[00:13:24.280 --> 00:13:28.920] And then it gets even a little bit better, I think, the closer you get to the actual big quake.
[00:13:28.920 --> 00:13:31.560] And really, a big part is just planning.
[00:13:31.560 --> 00:13:34.440] Like, anybody living in California should have an earthquake kit.
[00:13:34.680 --> 00:13:40.440] Yeah, we all have earthquake kits, but if you don't, then what's in an earthquake kit?
[00:13:40.440 --> 00:13:41.880] It's a bug-out bag.
[00:13:42.680 --> 00:13:43.240] Oh, oh, okay.
[00:13:43.240 --> 00:13:44.120] Just like a bug-out bag?
[00:13:44.680 --> 00:13:44.840] Yeah.
[00:13:45.080 --> 00:13:45.720] Oh, no, no, no.
[00:13:45.720 --> 00:13:56.360] It's for like if you are trapped in your house, it's food, it's a radio, it's flashlights, candles, whatever you would need if you don't have power, water.
[00:13:56.360 --> 00:13:57.000] Got it.
[00:13:57.000 --> 00:13:57.480] Yeah.
[00:13:57.480 --> 00:13:59.080] You could make sure that you have that.
[00:13:59.080 --> 00:14:04.200] And also, I think one of the big problems with an earthquake is like bridges, roads, mudslides.
[00:14:04.200 --> 00:14:10.200] So you could plan for that so people aren't, you know, it's not chaos afterward, like it always is.
[00:14:10.520 --> 00:14:18.960] I think everybody should be prepared, you know, to live at home with no outside contact for at least three weeks.
[00:14:14.840 --> 00:14:19.440] Three weeks.
[00:14:19.600 --> 00:14:22.400] I think the recommendation is three days, isn't it?
[00:14:22.400 --> 00:14:22.560] Yes.
[00:14:22.880 --> 00:14:23.440] 72 hours.
[00:14:23.600 --> 00:14:24.080] 72 hours.
[00:14:25.200 --> 00:14:27.280] Absolutely three days.
[00:14:27.280 --> 00:14:39.760] But I think if you were like ready to go for two to three weeks, that is, you know, I think that's a good middle ground between like basically living day to day and preppers.
[00:14:39.760 --> 00:14:40.720] You know what I mean?
[00:14:40.720 --> 00:14:43.280] This is essentially like the flu.
[00:14:43.280 --> 00:14:55.280] When the flu hit, and you couldn't get toilet paper for weeks or whatever, or there was a total run on the store and you had to stock up.
[00:14:55.280 --> 00:14:56.480] But it's not that hard.
[00:14:56.480 --> 00:15:04.720] It's not that hard to have two to three weeks of food and stuff in your house for most people who have a house, you know.
[00:15:04.720 --> 00:15:05.760] Right, if you have space.
[00:15:05.760 --> 00:15:06.000] Yeah.
[00:15:06.560 --> 00:15:07.840] But at least three days.
[00:15:07.840 --> 00:15:11.600] I think a lot of people, if they were to look right now, they don't have three days worth of water.
[00:15:12.400 --> 00:15:14.560] Need three days worth of clean drinking water.
[00:15:15.920 --> 00:15:16.640] I do.
[00:15:16.960 --> 00:15:18.000] Three months, though, Steve.
[00:15:18.000 --> 00:15:18.960] I mean, that's like...
[00:15:18.960 --> 00:15:19.760] I do as well.
[00:15:20.080 --> 00:15:24.240] The average person wouldn't, including myself, like really wouldn't know how to prep that.
[00:15:24.640 --> 00:15:26.480] I'm saying three weeks, Jay, not three months.
[00:15:26.480 --> 00:15:28.240] Three months, that's getting prepper level.
[00:15:28.240 --> 00:15:28.880] And I agree.
[00:15:28.880 --> 00:15:30.480] That's a total different level.
[00:15:30.480 --> 00:15:49.280] But it's not hard to have, again, like two to three weeks, essentially long enough that if there was a major disruption, an earthquake, a hurricane, a pandemic, something like that, you would be able, you and your family could survive without going to the store for a couple of weeks.
[00:15:49.280 --> 00:15:57.840] That would be time for the authorities to sort of re-establish some kind of infrastructure or whatever.
[00:15:57.840 --> 00:15:58.480] You know what I mean?
[00:15:59.200 --> 00:16:13.160] It wouldn't be like two days later, you're out of food, which I think a lot of people are in that situation where they buy their food for the week, maybe at most, and then by the end of the week, they have no food in their house or very, very little.
[00:16:13.160 --> 00:16:14.280] Yeah, I'm kind of like that.
[00:16:14.520 --> 00:16:15.080] And you're right.
[00:16:15.080 --> 00:16:17.800] It's good to have pantry staples.
[00:16:17.800 --> 00:16:22.280] Pantry staples, dried goods, rice, pasta, things like that, beans, things in cans.
[00:16:22.280 --> 00:16:28.200] Just have it so that your food has this two to three week cycle to it.
[00:16:28.200 --> 00:16:29.480] You know, it's not there forever.
[00:16:29.960 --> 00:16:30.520] You're eating it.
[00:16:30.520 --> 00:16:34.600] You're going through it, but it's just, you have this buffer you have.
[00:16:34.600 --> 00:16:35.240] Yeah, you just have to.
[00:16:35.400 --> 00:16:36.520] So you have all that, Steve?
[00:16:36.520 --> 00:16:37.240] Yeah, totally.
[00:16:37.240 --> 00:16:37.560] All right.
[00:16:37.560 --> 00:16:38.520] Shit hits the fan.
[00:16:38.520 --> 00:16:39.480] Steve's house.
[00:16:40.120 --> 00:16:42.360] And there goes his three-week supply; it becomes three days.
[00:16:42.520 --> 00:16:43.320] Well, that's the other thing.
[00:16:43.800 --> 00:16:45.320] Yep, make sure you can accommodate forever.
[00:16:45.400 --> 00:16:51.720] Yeah, if you have two to three weeks, if you do need to take on an extra person or two or whatever, it's not a disaster.
[00:16:52.120 --> 00:16:53.240] You can accommodate them.
[00:16:53.480 --> 00:16:55.480] What about if you don't have any power, though?
[00:16:55.480 --> 00:16:57.320] That's a different beast.
[00:16:57.560 --> 00:16:59.080] I'm a camper, I can cook that stuff.
[00:16:59.400 --> 00:17:04.760] So, like, even though I don't have a generator, I have a Jetboil.
[00:17:04.760 --> 00:17:07.080] I have, you know, propane.
[00:17:07.080 --> 00:17:14.200] I have all the things I need to be able to survive when I'm camping.
[00:17:14.200 --> 00:17:14.920] Yeah, right.
[00:17:14.920 --> 00:17:16.040] So treat it like you're camping.
[00:17:16.040 --> 00:17:16.440] Yeah, exactly.
[00:17:17.000 --> 00:17:17.720] We have that stuff too.
[00:17:17.720 --> 00:17:18.680] We also have a fireplace.
[00:17:18.840 --> 00:17:22.040] I always have at least half a cord of wood or so around.
[00:17:22.040 --> 00:17:25.080] So if we absolutely had to, we could do that.
[00:17:25.080 --> 00:17:26.440] Anyway, something to think about.
[00:17:26.440 --> 00:17:27.960] Don't be caught unawares.
[00:17:27.960 --> 00:17:29.480] All right, let's move on.
[00:17:29.480 --> 00:17:41.080] I'm going to talk about the World Health Organization's recent systematic review of the potential association between cell phones and brain cancer.
[00:17:41.120 --> 00:17:41.640] Ah, yes.
[00:17:41.760 --> 00:17:42.440] We talked about that.
[00:17:42.600 --> 00:17:43.080] I saw that.
[00:17:43.640 --> 00:17:44.640] We've been talking about this before.
[00:17:44.800 --> 00:17:45.680] It's about 10 years.
[00:17:44.600 --> 00:17:51.840] I think my first article about it was one of the first articles I wrote in science-based medicine was in 2010.
[00:17:52.160 --> 00:17:52.880] Did you read it?
[00:17:52.880 --> 00:17:54.640] 14 years ago, of course.
[00:17:54.640 --> 00:17:58.240] And I've written about it a few times since then.
[00:17:58.240 --> 00:18:00.560] And so now, there's nothing new here, right?
[00:18:00.560 --> 00:18:01.360] This is nothing new.
[00:18:01.360 --> 00:18:03.040] This is just an update.
[00:18:04.000 --> 00:18:08.560] It's a systematic review of studies that we've already been reading and reviewing ourselves.
[00:18:08.560 --> 00:18:09.120] You know what I mean?
[00:18:09.120 --> 00:18:12.960] So, like, it shouldn't take you by surprise if it's not new data.
[00:18:12.960 --> 00:18:13.600] It's just reviewing the data.
[00:18:13.840 --> 00:18:14.720] Reviewing the existing data.
[00:18:14.880 --> 00:18:16.080] Yeah, exactly.
[00:18:16.080 --> 00:18:19.280] Here's the bottom line: they found no association, right?
[00:18:19.280 --> 00:18:21.120] So none whatsoever.
[00:18:21.440 --> 00:18:21.840] None.
[00:18:21.840 --> 00:18:32.880] So there's different kinds of evidence you can bring to bear to ask the question: is there any risk from the radio frequency EMF from cell phones, right?
[00:18:32.880 --> 00:18:37.120] The concern being that you're holding it up to your head, you're doing it for many hours a day for years.
[00:18:37.120 --> 00:18:39.040] And, you know, is there a potential risk?
[00:18:39.200 --> 00:18:42.000] Which is funny because, like, who holds their phone up to their head anymore?
[00:18:42.000 --> 00:18:42.640] I do sometimes.
[00:18:42.640 --> 00:18:43.760] Sometimes I do it in front of me.
[00:18:43.760 --> 00:18:44.960] Sometimes I hold it up to my head.
[00:18:44.960 --> 00:18:45.680] All right, boomer.
[00:18:45.760 --> 00:18:46.800] No, I'm kidding.
[00:18:47.440 --> 00:18:48.000] Sometimes.
[00:18:48.480 --> 00:18:49.440] I'm kidding.
[00:18:50.000 --> 00:18:50.800] I know you know this.
[00:18:50.800 --> 00:18:54.080] I have many conversations that need to be confidential.
[00:18:54.080 --> 00:18:54.480] Yeah.
[00:18:54.480 --> 00:18:56.960] I can't have it on speakerphone when I'm talking to a patient.
[00:18:57.520 --> 00:18:59.520] That's when I pull out my AirPods.
[00:18:59.520 --> 00:19:00.240] But no, it's true.
[00:19:00.560 --> 00:19:08.480] I think we've talked about this before when you ask somebody younger than a millennial, like a Gen Z or an alpha, what's the universal symbol for the phone?
[00:19:08.480 --> 00:19:11.280] They don't hold their finger and thumb up to their head.
[00:19:12.080 --> 00:19:12.640] I love that.
[00:19:12.640 --> 00:19:12.960] I love that.
[00:19:13.200 --> 00:19:15.760] They hold it like they're holding a deck of cards, like they're holding an iPhone.
[00:19:15.920 --> 00:19:16.400] Right, right, right.
[00:19:16.640 --> 00:19:16.960] So funny.
[00:19:17.040 --> 00:19:17.440] Oh my gosh.
[00:19:18.000 --> 00:19:18.480] All right.
[00:19:18.480 --> 00:19:22.400] So, so this was a review of 63 studies.
[00:19:23.040 --> 00:19:24.640] These are observational studies.
[00:19:24.640 --> 00:19:35.640] Obviously, you can't do experimental studies on risk like this, but these were observational studies from 22 different countries looking at a total of 116 different comparisons, right, within those studies.
[00:19:36.040 --> 00:19:38.200] They had multiple comparisons.
[00:19:38.200 --> 00:19:41.880] And from 1994 to 2022.
[00:19:42.200 --> 00:19:42.920] 94.
[00:19:43.240 --> 00:19:48.680] Wow, that's 28 flipping years of data.
[00:19:48.680 --> 00:20:02.040] And they found that there was no observable increase in the relative risk for most investigated neoplasms with increasing time since the start of mobile phone use, cumulative call time, or cumulative number of calls.
[00:20:02.280 --> 00:20:05.160] In other words, you try to do a dose response, right?
[00:20:05.480 --> 00:20:11.880] When you gather information about do you use a phone, do you not, how many times a day, how long are your calls, how many calls, whatever.
[00:20:11.880 --> 00:20:19.320] And any of those variables, there was no correlation with that and any of the brain cancers that they looked at.
[00:20:19.320 --> 00:20:25.160] They found no increased risk in children or adults, regardless of intensity or duration.
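As a sketch of what that dose-response check can look like in code (illustrative only; the data here are simulated, not taken from the review): fit a logistic regression of cancer status on cumulative call time and read off the odds ratio per unit of exposure. "No dose response" means that odds ratio sits near 1.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)

    # Simulated case-control data: exposure = cumulative call time (in thousands
    # of hours); the outcome is generated independently of exposure (a true null).
    exposure = rng.exponential(scale=2.0, size=5000).reshape(-1, 1)
    outcome = rng.integers(0, 2, size=5000)

    model = LogisticRegression().fit(exposure, outcome)
    odds_ratio = np.exp(model.coef_[0, 0])
    print(f"odds ratio per 1,000 call-hours: {odds_ratio:.2f}")  # ~1.0 under the null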
[00:20:25.160 --> 00:20:28.200] So that's all very reassuring, right?
[00:20:28.200 --> 00:20:38.040] I mean, that tells us if there is any tiny risk that's being missed in this pretty large set of data, it's got to be so small it's not worth worrying about.
[00:20:38.040 --> 00:20:41.800] This also correlates with other kinds of data that you could bring to bear as well.
[00:20:41.800 --> 00:20:43.800] So if you look at animal data, right?
[00:20:43.800 --> 00:20:47.480] Now, animal data doesn't look at risk, it looks at hazard.
[00:20:47.720 --> 00:20:50.280] I always like to point out the difference between those two things.
[00:20:50.280 --> 00:21:00.360] This is like, is there a potential biological mechanism by which it could cause harm versus what's the risk that it actually is causing harm, right?
[00:21:00.360 --> 00:21:05.720] Like a loaded gun is a hazard, but it's not a risk if you have it locked away in a gun cabinet, right?
[00:21:05.720 --> 00:21:08.600] If a kid is playing with a gun, that's a risk, right?
[00:21:08.600 --> 00:21:10.920] So that's one way to look at the difference.
[00:21:10.880 --> 00:21:11.200] Right?
[00:21:11.160 --> 00:21:13.880] You could say cell phone's a hazard, but if you don't use it, there's no risk.
[00:21:14.200 --> 00:21:14.760] Same thing.
[00:21:14.880 --> 00:21:24.240] So when looking at the animal data, we have to divide it up further into two subcategories: thermal and non-thermal.
[00:21:24.960 --> 00:21:27.120] So let me back up one notch.
[00:21:27.120 --> 00:21:38.400] When we're talking about radio frequency EMF, this is what we call non-ionizing radiation, which I know I've mentioned before on the show, but just to reiterate, non-ionizing radiation is low-energy radiation.
[00:21:38.400 --> 00:21:39.280] How low is it?
[00:21:39.280 --> 00:21:42.720] It's so low that it cannot break chemical bonds.
[00:21:42.720 --> 00:21:44.080] So that's the definition.
[00:21:44.080 --> 00:21:52.640] It does not break chemical bonds, which means that it does not damage DNA, it does not damage protein, so it doesn't cause mutations, it doesn't cause any tissue damage.
[00:21:52.880 --> 00:21:57.440] And the thinking is, well, non-ionizing radiation is basically safe.
[00:21:57.440 --> 00:22:04.000] You know, ionizing radiation, that's the stuff we have to worry about and limit your exposure to, like gamma rays and things like that.
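To put a rough number on "too low-energy to break chemical bonds" (a back-of-the-envelope check, not a figure from the review; the ~2 GHz frequency and the carbon-carbon bond energy are just representative values):

\[ E_{\text{photon}} = h\nu \approx (4.14 \times 10^{-15}\,\text{eV s})(2 \times 10^{9}\,\text{Hz}) \approx 8 \times 10^{-6}\,\text{eV} \]
\[ \frac{E_{\text{C-C bond}}}{E_{\text{photon}}} \approx \frac{3.6\,\text{eV}}{8 \times 10^{-6}\,\text{eV}} \sim 5 \times 10^{5} \]

So a single radio photon carries only about a millionth of the energy needed to break a typical bond, and at everyday intensities photons don't pool their energy to do it.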
[00:22:04.000 --> 00:22:16.400] But non-ionizing radiation can heat tissue a little bit, especially if you're holding it right up to the tissue, depending on, again, the intensity and duration, et cetera.
[00:22:16.400 --> 00:22:25.440] So any non-ionizing or radio frequency radiation that was not enough to cause any significant tissue heating is called non-thermal exposure.
[00:22:25.440 --> 00:22:32.880] And in that category, when doing animal studies, non-thermal RF, there's no risk.
[00:22:32.880 --> 00:22:36.640] I mean, it's basically there's no biological issue there.
[00:22:36.640 --> 00:22:50.560] For the thermal group, meaning that there was non-ionizing radiation exposure that was sufficient to cause measurable tissue heating, which could still just be like a degree or two, but it was measurable, it's still mostly negative.
[00:22:50.560 --> 00:22:52.800] It's still, you know, it's a little mixed, though.
[00:22:52.800 --> 00:22:54.320] But there were, I mean, some studies...
[00:22:54.640 --> 00:22:56.080] What could be risky about that?
[00:22:56.080 --> 00:22:57.120] About slightly warmer.
[00:22:57.280 --> 00:22:57.920] A superficial burn?
[00:22:58.080 --> 00:22:59.360] Yeah, who knows?
[00:22:59.360 --> 00:23:05.240] I mean, again, it's not that plausible a risk anyway, but at least you're saying something physically is happening.
[00:23:05.720 --> 00:23:12.200] Do you think that's why, like, the people who are like, don't keep your phone in your pocket by your balls because it'll affect your fertility?
[00:23:12.200 --> 00:23:14.680] Like, is that the conclusion they're trying to draw?
[00:23:14.680 --> 00:23:21.320] Yeah, it's completely different with testicles because those do need like one degree intimidated sperm production.
[00:23:21.560 --> 00:23:30.840] So, yeah, the idea is that cells, you know, we do carefully regulate our body temperature and small changes in body temperature can have an adverse effect on metabolism or whatever.
[00:23:30.840 --> 00:23:35.400] And so it's semi-plausible that a little bit of heating can be an issue.
[00:23:35.400 --> 00:23:37.000] But would it cause cancer?
[00:23:37.000 --> 00:23:39.800] There's really not really much of a plausible mechanism there.
[00:23:39.800 --> 00:23:42.040] And the evidence is still mostly negative.
[00:23:42.040 --> 00:23:46.600] So it's completely negative for non-thermal and mostly negative for thermal.
[00:23:46.920 --> 00:23:56.120] And I feel like, if you looked at just the variability in, like, the temperature people keep their homes at, or being near a lamp, or, you know, there's so many other things.
[00:23:56.120 --> 00:23:57.320] But again, this is animal data.
[00:23:57.320 --> 00:24:04.600] So they're like exposing the animals to this, you know, and measuring the increase in temperature of the skin.
[00:24:04.600 --> 00:24:07.240] And then there's, you know, like ecological data.
[00:24:07.240 --> 00:24:12.520] So we could ask the question just in the last 30 years, has brain cancer increased?
[00:24:12.520 --> 00:24:12.840] Right?
[00:24:12.840 --> 00:24:15.720] And the answer, again, is basically no.
[00:24:15.720 --> 00:24:21.160] Again, there's a lot more complexity there because it depends on what subpopulations you're looking at, et cetera, et cetera.
[00:24:21.160 --> 00:24:25.080] Are you controlling for surveillance and for diagnostic, whatever?
[00:24:25.080 --> 00:24:30.440] But if you look at the totality of that research, it's basically a no.
[00:24:30.440 --> 00:24:34.360] There certainly isn't any big increase in brain cancer over the last 30 years.
[00:24:34.680 --> 00:24:38.480] Yeah, that's not one of the ones that I often see flagged, like colon cancer.
[00:24:38.800 --> 00:24:41.640] Yeah, yeah, increasing in younger and younger people.
[00:24:41.640 --> 00:24:44.080] Yeah, because there are lifestyle risk factors for that.
[00:24:43.720 --> 00:24:47.200] There basically aren't, you know, for brain cancer.
[00:24:47.200 --> 00:24:47.600] Yeah.
[00:24:43.800 --> 00:24:50.720] Yeah, like ones that have big lifestyle factors, you can see differences.
[00:24:50.800 --> 00:25:00.480] You also see differences if we radically change our diagnostic procedures, like you know, doing mammograms or doing tests for prostate cancer, you know, PSA.
[00:25:01.280 --> 00:25:06.080] But for brain cancer, I don't think there has really been anything significantly different.
[00:25:06.080 --> 00:25:09.200] So animal data, basically negative.
[00:25:09.520 --> 00:25:10.960] Ecological data, negative.
[00:25:10.960 --> 00:25:13.520] This observational data, negative.
[00:25:13.520 --> 00:25:22.080] Doesn't appear to be any plausible mechanism, and there's no risk that we could detect after 30 years out there in the public.
[00:25:22.080 --> 00:25:24.480] So this should pretty much put this to rest.
[00:25:24.480 --> 00:25:26.080] But of course it won't.
[00:25:26.480 --> 00:25:37.440] Because, you know, the press likes fear-mongering headlines, and there's already a cottage industry of quack snake oil devices to protect you from the RF in your cell phone.
[00:25:37.440 --> 00:25:46.160] And the people who want to keep this fear-mongering industry going always say the same kinds of things, which is like, well, we haven't been studying it long enough.
[00:25:46.160 --> 00:25:47.120] Well, what's long enough?
[00:25:47.120 --> 00:25:50.160] Longer than we've currently been studying it, apparently, whatever that is.
[00:25:50.160 --> 00:25:53.120] Right, longer than the amount of time we have available to us to study.
[00:25:53.360 --> 00:25:53.680] Yeah, right.
[00:25:53.680 --> 00:25:57.440] So if you had said that in the 90s and aughts, maybe you had a point.
[00:25:57.440 --> 00:25:59.520] Yes, we need to do longer, and that's what we said.
[00:25:59.520 --> 00:26:02.480] It's negative so far, but we need to do longer duration studies.
[00:26:02.480 --> 00:26:04.320] Now we have 30 years of data.
[00:26:04.320 --> 00:26:08.480] We would be seeing a signal if there was an actual effect.
[00:26:08.480 --> 00:26:12.560] And the other thing is, well, what, you know, 5G hasn't been out for that long.
[00:26:12.560 --> 00:26:14.160] Maybe 3G is safe.
[00:26:14.160 --> 00:26:19.280] And of course, once we get good data, like long-term data on 5G, 6G will be coming out.
[00:26:19.280 --> 00:26:24.880] Yeah, so there's always this sort of moving target that you can point to.
[00:26:24.880 --> 00:26:27.600] So I don't think this is ever going to completely go away.
[00:26:27.600 --> 00:26:37.720] But at this point, with this much data for this long and no detectable signal, to me, this is just not something you need to worry about.
[00:26:38.040 --> 00:26:47.560] And I think the fact that the basic science, it's non-ionizing radiation, it should be safe is also very reassuring.
[00:26:47.560 --> 00:26:53.080] Again, if you trust in the laws of physics, generally speaking, you're going to be good.
[00:26:53.080 --> 00:26:55.160] So anyway, big systematic review.
[00:26:55.160 --> 00:27:02.760] No surprise for those of us who have been following this research for the last 20 years, but it is good to see.
[00:27:03.080 --> 00:27:16.840] It seems like, tell me if this is a silly question, but it seems like the amount of non-ionizing radiation, which we know is safe, that you would get from having a cell phone near your head, is it really significantly different?
[00:27:16.840 --> 00:27:27.880] You know, given that it's safe, like put that on the shelf, is just the physical quantity really that significantly different from someone who uses a cell phone and someone who just walks around on Earth today?
[00:27:28.200 --> 00:27:28.600] Yeah, that's a good question.
[00:27:28.920 --> 00:27:31.400] We have so much technology around us.
[00:27:32.920 --> 00:27:36.600] But it is because of the inverse square law.
[00:27:37.640 --> 00:27:38.680] It's just near you.
[00:27:38.840 --> 00:27:42.840] When it's right, right next to your head, it's just going to be more than ambient RF.
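For scale, this is the inverse-square relationship being invoked, treating the phone as an isotropic point source with an illustrative transmit power of 0.25 W (actual phone output varies; the number is just for the sketch):

\[ S = \frac{P}{4\pi r^{2}} \]
\[ S_{r=2\,\text{cm}} \approx \frac{0.25\,\text{W}}{4\pi (0.02\,\text{m})^{2}} \approx 50\,\text{W/m}^{2}, \qquad S_{r=2\,\text{m}} \approx \frac{0.25\,\text{W}}{4\pi (2\,\text{m})^{2}} \approx 5 \times 10^{-3}\,\text{W/m}^{2} \]

Moving the same source from against your ear to across the room cuts the power density by a factor of \(100^{2} = 10{,}000\), which is why a phone at your head dominates the ambient RF around you.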
[00:27:42.920 --> 00:27:44.520] But like, does an alarm clock do that?
[00:27:44.520 --> 00:27:46.840] Because it's right next to your head, too.
[00:27:47.160 --> 00:27:54.520] I mean, it would give you some exposure, but alarm clocks, unless they're 5G connected, shouldn't be putting out a lot of RF.
[00:27:54.520 --> 00:27:55.320] Oh, okay.
[00:27:55.960 --> 00:27:58.040] So, I mean, yes, it's more exposure.
[00:27:58.600 --> 00:28:07.560] But since you bring that up, this review also included some studies which looked at occupational exposure, including people working like near cell towers.
[00:28:07.560 --> 00:28:08.040] You know what I mean?
[00:28:08.440 --> 00:28:10.280] Yeah, because that's always a big scaremongering thing, too.
[00:28:10.520 --> 00:28:12.120] Yeah, and those are all negative, too.
[00:28:12.120 --> 00:28:18.800] So even way more exposure than you're going to get just using a cell phone didn't seem to produce any increased risk.
[00:28:19.120 --> 00:28:33.280] So, anyway, it's good to know for our modern civilization that radio frequency non-ionizing radiation is safe, because it would be pretty bad if it weren't, you know, given how ubiquitous the technology is.
[00:28:33.520 --> 00:28:35.920] As I said, there are a lot of other things that you need to worry about.
[00:28:35.920 --> 00:28:37.360] That's not one of them.
[00:28:37.360 --> 00:28:42.560] Well, everyone, we're going to take a quick break from our show to talk about one of our sponsors this week, Mint Mobile.
[00:28:42.560 --> 00:28:45.440] Guys, I've been using Mint Mobile for the past few weeks.
[00:28:45.440 --> 00:28:47.200] I am really happy with the service, actually.
[00:28:47.440 --> 00:28:48.320] It's awesome.
[00:28:48.320 --> 00:28:50.480] You know, first off, the setup was easy.
[00:28:50.480 --> 00:28:52.000] You know, none of that BS.
[00:28:52.000 --> 00:28:53.920] There wasn't a lot of hoops to jump through.
[00:28:53.920 --> 00:28:58.400] The plan is 15 bucks a month if you purchase a three-month plan.
[00:28:58.400 --> 00:29:01.680] And it turns out that I've been paying way too much.
[00:29:01.840 --> 00:29:04.800] My God, you can't beat $15 a month.
[00:29:05.040 --> 00:29:08.320] And, you know, the bottom line is this: everything was easy.
[00:29:08.320 --> 00:29:09.920] The service is excellent.
[00:29:09.920 --> 00:29:11.440] And there was zero hassle.
[00:29:11.680 --> 00:29:12.960] My cell service is good.
[00:29:12.960 --> 00:29:14.080] Everything is good.
[00:29:14.080 --> 00:29:16.320] And I admit it, I do like Ryan Reynolds.
[00:29:16.320 --> 00:29:16.640] I do.
[00:29:16.640 --> 00:29:17.360] I like him.
[00:29:17.360 --> 00:29:20.720] To get started, go to mintmobile.com/slash SGU.
[00:29:20.720 --> 00:29:27.680] There you'll see that right now, all three-month plans are only $15 a month, as Jay said, including the unlimited plan.
[00:29:27.680 --> 00:29:34.240] All plans come with high-speed data and unlimited talk and text delivered on the nation's largest 5G network.
[00:29:34.240 --> 00:29:40.720] You can use your own phone with any Mint Mobile plan and bring your phone number along with all your existing contacts.
[00:29:40.720 --> 00:29:47.840] Find out how easy it is to switch to Mint Mobile and get three months of premium wireless service for $15 bucks a month.
[00:29:47.840 --> 00:29:56.080] To get this new customer offer and your new three-month premium wireless plan for just $15 a month, go to mintmobile.com/slash SGU.
[00:29:56.080 --> 00:29:58.560] That's mintmobile.com/slash SGU.
[00:29:58.560 --> 00:30:01.800] $45 upfront payment required, equivalent to $15 a month.
[00:30:01.800 --> 00:30:03.880] New customers on first three-month plan only.
[00:29:59.920 --> 00:30:06.280] Speeds slower above 40 GB on the unlimited plan.
[00:30:06.440 --> 00:30:08.440] Additional taxes, fees, and restrictions apply.
[00:30:08.440 --> 00:30:09.800] See Mintmobile for details.
[00:30:09.800 --> 00:30:12.040] All right, guys, let's get back to the show.
[00:30:12.680 --> 00:30:19.480] All right, Jay, tell us how we get gold from earthquakes, another earthquake-related item.
[00:30:19.480 --> 00:30:20.920] Do you love gold, Jay?
[00:30:20.920 --> 00:30:22.280] I love gold.
[00:30:22.280 --> 00:30:23.080] Gold!
[00:30:23.400 --> 00:30:27.000] I can't hear that word and not say that in my head.
[00:30:27.000 --> 00:30:33.400] This one really surprised me, and I learned a lot of things that I didn't know, and I was wrong about things.
[00:30:33.400 --> 00:30:34.680] So pay attention.
[00:30:34.680 --> 00:30:35.320] This is really cool.
[00:30:35.320 --> 00:30:40.280] So, a new study published in Nature Geoscience, this was on September 2nd of this year.
[00:30:40.280 --> 00:30:46.600] It proposes a pretty interesting explanation for how large gold nuggets form in quartz veins.
[00:30:46.600 --> 00:30:49.000] This has been a long-standing mystery in geology.
[00:30:49.000 --> 00:30:56.920] So, my first thing that I got wrong was I thought that gold was made in stars and that it was just in outer space, right?
[00:30:56.920 --> 00:30:59.240] I mean, that's where gold originates, correct, Steve?
[00:30:59.720 --> 00:31:01.080] Kilonovas, too.
[00:31:01.080 --> 00:31:04.600] Yeah, it's a heavier element.
[00:31:04.600 --> 00:31:12.840] So, basically, anything heavier than iron is not cooked up in the cores of stars just from fusion, right?
[00:31:13.160 --> 00:31:14.760] Because you can only get to iron.
[00:31:14.760 --> 00:31:20.040] Anything heavier, you know, there's a discussion among astronomers, like what are the processes that could do it?
[00:31:20.040 --> 00:31:25.480] So, supernovae can do it, but there's kilonovas as well.
[00:31:25.480 --> 00:31:32.120] Yeah, kilonovas, and I think also, just like in neutron stars or something, Bob, what am I remembering about an item from not too long ago?
[00:31:32.200 --> 00:31:33.560] No, that's well, that's what a kilonova is.
[00:31:33.720 --> 00:31:34.280] That's the kilonova.
[00:31:34.360 --> 00:31:35.360] Colliding neutron star.
[00:31:35.520 --> 00:31:36.440] Colliding neutron stars.
[00:31:36.520 --> 00:31:38.040] Yeah, that's what I was thinking of.
[00:31:38.280 --> 00:31:45.040] Yeah, but so you need something, some really violent, very energetic, powerful situation in order to generate the heavier elements.
[00:31:44.760 --> 00:31:48.080] And they seed the gas, interstellar gas clouds.
[00:31:48.320 --> 00:31:55.600] They talk about the metallicity of those clouds, depending on how much they've been seeded by supernova and kilonova.
[00:31:55.600 --> 00:32:04.960] And then when a new planet forms, a new system forms, that's where you have a whole bunch of heavier elements and they sort of clump together in asteroids and planets or whatever.
[00:32:04.960 --> 00:32:05.200] Right.
[00:32:05.200 --> 00:32:10.560] So the takeaway here is that the gold that exists on Earth has always been here.
[00:32:10.560 --> 00:32:12.880] It was here when Earth formed.
[00:32:12.880 --> 00:32:13.280] Right.
[00:32:13.280 --> 00:32:14.960] Or from later bombardment.
[00:32:14.960 --> 00:32:21.040] So it exists in lots of different places all around the planet at different depths and everything.
[00:32:21.040 --> 00:32:25.920] It's just kind of cooked in to all the regolith on Earth, right?
[00:32:25.920 --> 00:32:26.560] Okay.
[00:32:26.560 --> 00:32:35.440] So we're talking about like these quartz veins and why does gold accumulate on quartz veins?
[00:32:35.440 --> 00:32:37.200] Let me get into the details.
[00:32:37.440 --> 00:32:45.680] So when they say here that there's an explanation of why this is happening, they say the words "large gold nuggets," and I wanted to know what the hell that meant.
[00:32:45.680 --> 00:32:50.160] So a large nugget would be up to hundreds of kilograms.
[00:32:50.160 --> 00:32:57.920] Specifically, the largest orogenic gold nuggets found can weigh around 60 kilograms.
[00:32:57.920 --> 00:32:59.520] That's 130 pounds.
[00:32:59.520 --> 00:33:10.320] The research suggests that earthquakes might create electric charges in quartz, causing free-floating gold particles to clump together, eventually resulting in these sizable nuggets.
[00:33:10.320 --> 00:33:12.960] This is really, I just think this is so cool, right?
[00:33:12.960 --> 00:33:17.680] Like there's a thing happening, and gold is collecting for a very specific reason here.
[00:33:17.680 --> 00:33:27.080] So, in quartz veins, right, we're talking about quartz as it has developed inside the Earth, where most significant gold deposits are found.
[00:33:27.080 --> 00:33:35.560] Typically, they form due to gold-rich hydrothermal fluids that are seeping through the cracks caused by earthquakes, right?
[00:33:29.680 --> 00:33:36.920] So, visualize that.
[00:33:37.240 --> 00:33:49.240] The earthquake happens, there's these super hot hydrothermal fluids that come flowing out of these cracks, they go into the soil, they meet up with a quartz vein, and something happens, right?
[00:33:49.240 --> 00:34:02.200] So, because gold isn't particularly soluble, traditional models would require impractical amounts of fluid to create a large nugget, meaning there'd have to be a ton of this hydrothermal fluid to make a certain amount of gold.
[00:34:02.200 --> 00:34:03.800] So, I tried to look it up and I figured it out.
[00:34:03.800 --> 00:34:14.200] So, for instance, to form a one-kilogram nugget, which is about the size of a standard chocolate bar, you'd need the equivalent of five Olympic swimming pools of hydrothermal fluid.
[00:34:14.200 --> 00:34:19.080] And that is considered to be very unrealistic because that's a lot of that fluid, right?
[00:34:19.080 --> 00:34:27.720] Now, hydrothermal fluid, you have to realize, has a higher gold concentration than just the ordinary water you'd interact with regularly in your life.
[00:34:27.720 --> 00:34:35.800] Hydrothermal fluid has a lot more of lots of different things dissolved in it, and there happen to be a lot of gold particles in there.
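As a sanity check on that figure (back-of-the-envelope, assuming a standard Olympic pool of about 2,500 cubic meters and a fluid density near that of water):

\[ 5 \times 2{,}500\,\text{m}^{3} = 12{,}500\,\text{m}^{3} \approx 1.25 \times 10^{7}\,\text{kg of fluid} \]
\[ \frac{1\,\text{kg Au}}{1.25 \times 10^{7}\,\text{kg fluid}} \approx 8 \times 10^{-8} \approx 0.08\,\text{mg of gold per kg of fluid, or roughly 80 parts per billion} \]

So even a "gold-rich" fluid carries only tens of parts per billion of gold, which is why a single large nugget implies either enormous fluid volumes or, as this study argues, a mechanism that keeps depositing gold at the same spot.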
[00:34:35.800 --> 00:34:47.480] So, this guy named Christopher Voisey, who's a geoscientist from Monash University, he and his team theorized that the answer might lie in the piezoelectric properties of quartz.
[00:34:47.480 --> 00:34:49.000] This is something else I didn't know.
[00:34:49.000 --> 00:34:57.960] I didn't know that. I know about quartz watches and how they vibrate the crystal, and I know that if you pass electricity through quartz, it vibrates.
[00:34:57.960 --> 00:35:00.520] I didn't realize that it had a piezoelectric property to it.
[00:35:00.600 --> 00:35:01.320] It's like piezoelectric.
[00:35:01.400 --> 00:35:02.200] Piezo, piezo.
[00:35:02.200 --> 00:35:05.560] I would say piezio because it reminds me of pizza, and I'm hungry.
[00:35:05.880 --> 00:35:08.360] It is piezoelectric, Steve.
[00:35:08.360 --> 00:35:10.680] You got to get these words straight.
[00:35:10.680 --> 00:35:13.880] When quartz is mechanically stressed, what does that mean?
[00:35:13.880 --> 00:35:19.280] Typically, meaning that it's coming from pressure that is created by earthquakes.
[00:35:14.360 --> 00:35:22.160] So that pressure that's created by earthquakes, right?
[00:35:22.160 --> 00:35:29.120] When the tectonic plates hit each other and the earthquakes happen, it generates an electric charge in the quartz.
[00:35:29.120 --> 00:35:35.840] The charge could cause gold ions in the surrounding fluid to gain electrons and then they start to form solid gold, right?
[00:35:35.840 --> 00:35:38.160] They're not, they start to chunk up.
[00:35:38.160 --> 00:35:47.840] So once the gold starts solidifying, it in and of itself acts as a conductor within the electric field that is being generated by the earthquakes.
[00:35:47.840 --> 00:35:54.960] And that chunk of gold that has formed, it draws even more gold ions to the same location.
[00:35:54.960 --> 00:35:59.280] And this allows the particles to clump together and get bigger and bigger and bigger over time.
[00:35:59.280 --> 00:36:02.160] Lots of time, but it happens.
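A schematic of the two steps just described, simplified to one axis; the piezoelectric coefficient is the commonly quoted value for quartz, and the bare-ion reaction is a stand-in, since gold in hydrothermal fluids typically travels as dissolved complexes rather than free ions:

\[ \sigma_{\text{charge}} = d_{11}\,T, \qquad d_{11}(\text{quartz}) \approx 2.3\,\text{pC/N} \]
\[ \text{Au}^{+}_{(aq)} + e^{-} \rightarrow \text{Au}_{(s)} \]

Mechanical stress \(T\) from the quake produces a surface charge density on the quartz; electrons supplied at that charged surface reduce dissolved gold to solid metal, and each new grain of conductive gold then concentrates further deposition at the same spot.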
[00:36:02.160 --> 00:36:16.320] So to test the idea, the team simulated seismic activity, this is all in the lab, by applying what they call mechanical stress to quartz, which is submerged in a gold-rich solution, which is kind of like the hydrothermal fluids that I was talking about.
[00:36:16.320 --> 00:36:21.360] So they found that even moderate stress caused gold to accumulate on the quartz.
[00:36:21.440 --> 00:36:22.000] They saw it.
[00:36:22.000 --> 00:36:31.520] It happened in the lab, supporting their theory that this effect in quartz could very well be responsible for the nugget formation.
[00:36:31.520 --> 00:36:36.720] So over time, the solid gold would strengthen the electric reaction and this speeds up the process.
[00:36:36.720 --> 00:36:39.040] So it's kind of like your retirement.
[00:36:39.040 --> 00:36:43.600] The longer it's there and the bigger it gets, the bigger it gets and it gets bigger faster.
[00:36:43.600 --> 00:36:52.480] So while Voisey's findings have been thought of as very promising, of course, this is a perfect example of the scientific method.
[00:36:52.480 --> 00:36:59.880] Another scientist read his research and said, this guy's name is Thomas Blankensop.
[00:36:59.880 --> 00:37:01.800] Blankensop, that's his name.
[00:37:01.800 --> 00:37:03.800] You know, it's like I get the hard names, Kara.
[00:36:57.760 --> 00:37:05.240] You get Jack Smith, and I get Thomas Blenkinsop.
[00:37:06.440 --> 00:37:08.360] I also love you, like thinking of Blankensop.
[00:37:08.360 --> 00:37:09.800] Blankensop, that's a name.
[00:37:09.800 --> 00:37:10.200] Yep.
[00:37:10.360 --> 00:37:12.600] Yes, I mean, I've never read it before.
[00:37:12.600 --> 00:37:14.040] You know, it's interesting.
[00:37:14.040 --> 00:37:16.200] I have no idea what nationality that name is.
[00:37:16.200 --> 00:37:25.240] So he is a structural geologist at Cardiff University, and he pointed out that more research is needed to confirm if the process occurs in natural settings, right?
[00:37:25.240 --> 00:37:29.320] Again, this is the scientific method in, you know, live in front of your eyes.
[00:37:29.320 --> 00:37:37.320] He challenges that for this electric effect to work, the quartz veins need to be oriented in a way that generates the necessary electric charge.
[00:37:37.320 --> 00:37:41.240] And it's not clear if gold deposits are found in these ideal orientations.
[00:37:41.240 --> 00:37:49.320] And right out of the gate, another geologist comes in with a really interesting and semi-hard to understand situation.
[00:37:49.320 --> 00:37:59.880] Like, no, the quartz has to actually be in this perfect orientation in order for this to happen, which I could not find a deeper explanation on, but it must be out there somewhere.
[00:37:59.880 --> 00:38:09.480] So Voisey also suggests that this electric effect might be just one piece of the puzzle, and that there's probably other factors that contribute to how these nuggets form.
[00:38:09.480 --> 00:38:17.560] And as long as people want gold, right, as long as we're out there for the hunt for gold, the scientists will still be paid to continue to do research like this.
[00:38:17.560 --> 00:38:21.560] Now, this study is really awesome for lots of reasons.
[00:38:21.560 --> 00:38:27.320] Because it could lead to interesting changes in how gold mining and gold exploration are done, right?
[00:38:27.320 --> 00:38:37.560] If this effect is proven to be the way it works, then they will absolutely change what they look for and they'll sharpen how they look for places where gold deposits might be.
[00:38:37.560 --> 00:38:49.920] Now, this effect that quartz has, if it's confirmed in real-world deposits, geologists would start focusing obviously on areas around the world where the quartz is oriented the right way.
[00:38:49.920 --> 00:38:51.360] Like that's going to be the key indicator.
[00:38:51.360 --> 00:38:54.000] They're going to look for giant quartz deposits.
[00:38:54.000 --> 00:38:59.840] And if it's oriented the right way, blah, blah, blah, then they could find a lot more gold, a lot more than typical.
[00:38:59.840 --> 00:39:07.040] So the lab replication of this process also suggests that gold could be accumulated under controlled conditions.
[00:39:07.040 --> 00:39:15.200] And this is a long way off, but it is very possible that they've found a way to collect gold better than any other way we have right now.
[00:39:15.200 --> 00:39:17.200] This discovery also does one other thing.
[00:39:17.200 --> 00:39:23.760] It opens the door to possibly finding, first off, we have to figure out if there's more complex mechanisms at play in the nugget formation, right?
[00:39:23.760 --> 00:39:25.200] We'll figure all that out.
[00:39:25.200 --> 00:39:34.080] And then once we start to really wrap our head around all that, we might be able to find other minerals that might be under similar circumstances.
[00:39:34.080 --> 00:39:42.640] So a thing that occurred to me as I was reading this was like, you know, if you just handed me a piece of quartz that had a bunch of gold hanging from it, I would say thank you and I'd go to the bank.
[00:39:42.640 --> 00:39:47.280] But before I did that, I would think, you know, how did the gold get there?
[00:39:47.280 --> 00:39:48.640] Why is it mixed in?
[00:39:48.640 --> 00:39:54.400] You know, was it because the quartz, like, developed in the Earth's ground, right?
[00:39:54.400 --> 00:39:57.120] The quartz is from mineral deposits and stuff like that.
[00:39:57.120 --> 00:39:59.360] But then why would gold be all interlaced in it?
[00:40:00.000 --> 00:40:00.960] I've seen this.
[00:40:00.960 --> 00:40:06.640] I've seen pictures of this before and never really understood the process or why it would come to be that way.
[00:40:06.640 --> 00:40:17.200] And again, usually the answer is much more complicated than you think, because I originally just thought it was just when the Earth formed and things kind of smashed together and the gold was there, and there's big chunks of gold.
[00:40:17.200 --> 00:40:21.680] No, there probably wasn't ever really big chunks of gold just sitting around.
[00:40:21.680 --> 00:40:25.760] They probably formed, which I find that to be fascinating, right?
[00:40:25.760 --> 00:40:32.840] Because there's so much stuff happening underground that we don't realize, especially if you're not a geologist and you aren't into that type of thing.
[00:40:29.680 --> 00:40:40.920] And yes, it takes a very long time, but those processes happen, and that's why you get all of these different types of formations of different minerals and stuff around the world.
[00:40:40.920 --> 00:40:43.160] There's reasons why these things happen.
[00:40:43.160 --> 00:40:50.680] So, anyway, I would love, I would absolutely love to find myself a giant piece of quartz with a whole bunch of gold on it.
[00:40:50.680 --> 00:40:52.040] Would that be an amazing find?
[00:40:52.040 --> 00:40:53.800] Imagine if you just could get your hands on that.
[00:40:53.800 --> 00:40:57.080] You ever watch those guys that, you know, they do the old panning?
[00:40:57.320 --> 00:41:03.800] They have like this bench and they have a whole bunch of water coming down the ramp, and there's all these little catch pockets in there.
[00:41:03.800 --> 00:41:10.040] And at the end of the day, after an entire day of, you know, they're putting in soil that is rich in gold.
[00:41:10.040 --> 00:41:11.640] Like they found soil that's rich in gold.
[00:41:11.640 --> 00:41:16.840] They do this panning process, and it's really advanced, like way, way more advanced than 100 years ago.
[00:41:16.840 --> 00:41:20.840] And they end up with literally like, I mean, guys, a thimble full.
[00:41:21.160 --> 00:41:23.000] Not even, no, not even, not even.
[00:41:23.000 --> 00:41:27.720] It's like a tiny little, like if you took a pinch of salt and dropped it in your hand, it'd be like half that.
[00:41:27.720 --> 00:41:29.560] And they're like, yeah, we really hit it today.
[00:41:29.560 --> 00:41:30.440] No, you didn't.
[00:41:30.440 --> 00:41:32.680] Like, oh my God, it's almost worth nothing.
[00:41:32.680 --> 00:41:35.240] Like, it's really hard to collect gold.
[00:41:35.240 --> 00:41:38.440] And the current processes that we have are super expensive.
[00:41:38.440 --> 00:41:50.440] They have to put a ton of soil into these reactors that do all these processes and tons of chemicals and all sorts of shit that they do to extract the gold and to change it from one form to another and all this craziness.
[00:41:50.440 --> 00:41:53.400] This thing right here could be a really big find for gold.
[00:41:53.400 --> 00:41:55.640] What do you mean, change it from one form to another?
[00:41:55.640 --> 00:41:57.480] Well, if you talk about alchemy.
[00:41:59.400 --> 00:42:06.360] I am talking about the chemistry of extracting gold from the soil. I've watched videos of this.
[00:42:06.440 --> 00:42:09.160] So, maybe they need something to make it stick to something else.
[00:42:09.160 --> 00:42:10.520] Do they use cyanide for that?
[00:42:10.520 --> 00:42:11.640] Yes, they use cyanide.
[00:42:11.640 --> 00:42:23.920] They use all these, but they use different chemicals, and you see like the process as it goes, and you're like, okay, they got to do this, then they got to heat this thing up, and they get rid of all this stuff, and the stuff that's left over, then they mix it with this thing, and then they got to stir it for like 20 hours, and then they got to boil it again.
[00:42:24.000 --> 00:42:24.880] Then they got, you know what I mean?
[00:42:24.880 --> 00:42:31.760] Like, Kara, it's this ridiculous process, and at the end, yeah, they come out with gold, but like a thousand people worked on that.
[00:42:31.760 --> 00:42:34.080] Yeah, it doesn't seem worth it.
[00:42:34.080 --> 00:42:44.080] And it does turn a profit, but you know, still, like, imagine if this could really like make a really big difference with us finding gold.
[00:42:44.080 --> 00:42:46.560] And, you know, gold is ridiculously useful.
[00:42:47.280 --> 00:42:55.920] Imagine if gold became a lot more common and we were able to use it for a lot more things, it would be very helpful for future technology if that were the case.
[00:42:55.920 --> 00:43:00.880] Yeah, it's interesting to think of why certain minerals clump where they do.
[00:43:00.880 --> 00:43:07.920] One reason is like with gold, gold is one of the metals that's found in its pure form, not as an alloy, right?
[00:43:07.920 --> 00:43:09.040] I mean, not as an ore.
[00:43:09.040 --> 00:43:19.040] It's not like an iron ore, where the iron is chemically bound to some other substance and we have to smelt it in order to purify the iron.
[00:43:19.200 --> 00:43:20.880] When you find gold, it's just gold, right?
[00:43:20.880 --> 00:43:22.560] It's just pure gold.
[00:43:22.560 --> 00:43:26.880] And that's because it already has been purified in a way.
[00:43:26.880 --> 00:43:30.400] It gets clumped together by these reactions.
[00:43:30.400 --> 00:43:36.080] And you find it in veins and nuggets and everything because of the geological processes that form it.
[00:43:36.400 --> 00:43:38.720] Steve, we do smelt for gold, though.
[00:43:39.040 --> 00:43:43.680] Well, what I'm saying is, like, when you find a nugget of gold, that's gold, right?
[00:43:43.680 --> 00:43:47.920] It's not, it's not chemically anything else, it's just gold, right?
[00:43:47.920 --> 00:43:53.120] As opposed to, like, when you get iron ore, it's not iron, it's iron bound to something else.
[00:43:53.120 --> 00:43:53.760] You know what I'm saying?
[00:43:53.760 --> 00:43:54.240] That's the difference.
[00:43:54.400 --> 00:43:55.680] It's an ore, it's not the metal.
[00:43:56.160 --> 00:43:56.560] Yeah, you're right.
[00:43:56.560 --> 00:44:00.440] So the gold, you mean that the gold is already there in its complete form.
[00:44:00.440 --> 00:44:00.920] It's gold.
[00:43:59.920 --> 00:44:02.360] We just have to get it out.
[00:44:02.840 --> 00:44:08.840] Chemically, it's elemental gold, as opposed to, you know, sometimes it is bound to other things.
[00:44:08.840 --> 00:44:12.920] You know, gold can form chemical bonds, but you do find nuggets of pure gold.
[00:44:12.920 --> 00:44:15.320] Silver is another one that exists like that.
[00:44:15.320 --> 00:44:23.080] Whereas like with iron and copper and those, you're finding ores that you then have to process. Aluminum has to go through that process too.
[00:44:23.320 --> 00:44:24.360] Aluminum is like in the dirt.
[00:44:24.360 --> 00:44:27.080] Yeah, it's not like in clumps or nuggets or anything.
[00:44:27.080 --> 00:44:34.440] But like with iron, what's interesting is that sometimes the reason why you find a deposit is because a meteor landed there.
[00:44:34.440 --> 00:44:36.440] Like an iron meteor crashed there.
[00:44:36.440 --> 00:44:36.840] Yeah, yeah, yeah.
[00:44:37.000 --> 00:44:38.360] Those things are impressive.
[00:44:38.360 --> 00:44:40.280] And then they make samurai sword, legit.
[00:44:40.360 --> 00:44:42.600] They make samurai swords out of that iron.
[00:44:42.600 --> 00:44:47.160] So you're, you're like, not you, Steve, because I wish I had the money to buy you that sword.
[00:44:47.160 --> 00:44:51.240] But you could buy a sword made out of, you know, iron from outer space.
[00:44:51.800 --> 00:44:54.600] And I think that is like about as cool as it gets, right?
[00:44:54.600 --> 00:44:56.040] Like, oh my God.
[00:44:56.040 --> 00:45:04.280] Yeah, although, you know, yes, swords forged from meteoric iron are not really very functional, right?
[00:45:04.280 --> 00:45:07.400] Because it's not, unless you make steel out of it.
[00:45:07.400 --> 00:45:11.880] But then you're kind of getting rid of the meteoric form of the iron.
[00:45:11.880 --> 00:45:19.160] It would be cool to have, you know, how like the iron meteorites have that, it's like iron and nickel in a crystalline pattern.
[00:45:19.160 --> 00:45:20.360] That blue tint, yeah.
[00:45:20.360 --> 00:45:26.120] It's not just the blue, it has a specific pattern to it, like when they slice it and then polish it.
[00:45:26.120 --> 00:45:28.920] The Widmanstätten pattern, did you guys ever hear that word?
[00:45:28.920 --> 00:45:29.880] I know all about it.
[00:45:29.880 --> 00:45:31.000] Oh, that one?
[00:45:31.000 --> 00:45:33.080] It's an iron-nickel crystal structure.
[00:45:33.080 --> 00:45:33.640] Anyway.
[00:45:33.640 --> 00:45:34.120] Yeah, it's cool.
[00:45:34.440 --> 00:45:35.960] Let's move on.
[00:45:36.280 --> 00:45:39.480] Evan, do I have plastic in my brain?
[00:45:39.480 --> 00:45:40.600] Oh, my gosh.
[00:45:40.600 --> 00:45:42.440] Yeah, plastic and the brain.
[00:45:42.440 --> 00:45:44.120] Isn't that interesting?
[00:45:44.120 --> 00:45:53.760] That you mentioned those two words, Steve, because, you know, of all the things I've learned in my years of skepticism, one is the association of the words brain and plastic, right?
[00:45:53.760 --> 00:45:55.280] We've talked about this before, right?
[00:45:55.760 --> 00:45:58.880] Brain plasticity or neuroplasticity.
[00:45:58.880 --> 00:45:59.440] Oh, that's funny.
[00:45:59.600 --> 00:46:00.800] Different kind of plastic, yeah.
[00:46:00.800 --> 00:46:03.440] I know, it comes up, but we've talked about it before.
[00:46:03.440 --> 00:46:06.640] You know, that's the brain's ability to change and adapt over a person's lifetime.
[00:46:06.720 --> 00:46:16.880] That's funny because I was talking one time, like when my daughter was like six or something, I'm talking to my wife about neuroscience, and I mentioned that the brain is plastic.
[00:46:16.880 --> 00:46:21.440] And my daughter, who was listening in the back seat, is like, the brain is plastic?
[00:46:22.640 --> 00:46:26.320] I explained that I was talking about a different kind of plastic.
[00:46:26.320 --> 00:46:27.040] Right.
[00:46:28.320 --> 00:46:37.120] And so whenever a news item comes along and the words brain and plastic are there, well, that's where my mind immediately goes.
[00:46:37.120 --> 00:46:38.000] First thought.
[00:46:38.000 --> 00:46:40.160] Is that a heuristic of some kind, right?
[00:46:40.160 --> 00:46:41.760] Am I utilizing a shortcut?
[00:46:42.720 --> 00:46:50.080] So, but it's throwing me off because here's a news item with brain and plastic, but it's not about neuroplasticity.
[00:46:50.080 --> 00:46:53.280] It's about microplastics found in the human brain.
[00:46:54.560 --> 00:46:55.120] Yeah.
[00:46:56.000 --> 00:46:58.720] And more, you know, nanoplastics.
[00:46:58.720 --> 00:46:59.120] Oh, boy.
[00:46:59.840 --> 00:47:00.720] There you go.
[00:47:00.720 --> 00:47:01.120] Yeah.
[00:47:01.120 --> 00:47:04.480] Nanoplastics are less than a micron.
[00:47:04.480 --> 00:47:07.280] That's one one-thousandth of a millimeter.
[00:47:07.280 --> 00:47:14.560] Microplastics are just above that, ranging up to five millimeters.
[00:47:14.560 --> 00:47:15.280] Whatever.
[00:47:15.440 --> 00:47:17.680] They're just throwing nano around, I think.
[00:47:18.800 --> 00:47:19.520] Well, yeah, but no.
[00:47:19.680 --> 00:47:25.520] A thousandth of a millimeter is a micrometer, and a thousandth of a micrometer is a nanometer.
[00:47:25.520 --> 00:47:26.800] Right, the micron, right?
[00:47:26.800 --> 00:47:28.560] One thousandth of a millimeter is a micron.
[00:47:28.880 --> 00:47:32.040] So, nanoplastics are less than a micron by definition.
[00:47:29.840 --> 00:47:33.720] Three orders of magnitude.
[00:47:35.720 --> 00:47:37.400] Wait, what's right under micro, guys?
[00:47:37.400 --> 00:47:38.680] Like, look at the actual thing.
[00:47:38.680 --> 00:47:40.600] Is it something people would recognize?
[00:47:40.600 --> 00:47:43.160] What's the prefix that's just underneath micro?
[00:47:43.160 --> 00:47:44.760] It's nano.
[00:47:45.080 --> 00:47:46.840] Yeah, then that's why they're nano.
[00:47:47.640 --> 00:47:49.080] Micro is a millionth, right?
[00:47:49.080 --> 00:47:50.600] And nano is a billionth.
[00:47:50.600 --> 00:47:51.400] Yeah.
[00:47:51.720 --> 00:47:53.560] It's milli, micro, nano.
[00:47:53.560 --> 00:47:58.520] Yeah, so they're saying underneath the cutoff for something that is micro, they're just jumping.
[00:47:58.520 --> 00:48:04.040] They're calling nano anything between nano and micro, and they're calling micro anything between micro and milli.
[00:48:04.040 --> 00:48:04.440] Right.
[00:48:04.440 --> 00:48:04.840] Yeah.
[00:48:04.840 --> 00:48:06.440] So I don't like that.
[00:48:06.440 --> 00:48:13.800] Well, I know that when you're talking about like nanotechnology and nanomaterials, they use 100 nanometers as the cutoff.
[00:48:13.800 --> 00:48:20.280] So like if you have one dimension less than 100 nanometers, that's a nano sheet.
[00:48:20.280 --> 00:48:23.880] If you have two dimensions, then that's a nano fiber.
[00:48:23.880 --> 00:48:26.040] If it's rolled up, then it's a nanotube.
[00:48:26.040 --> 00:48:29.000] If you have all three dimensions, then it's a nano dot.
[00:48:29.880 --> 00:48:33.400] But they use the 100 nanometers as the cutoff for nano.
[00:48:33.400 --> 00:48:33.960] That's reasonable.
[00:48:33.960 --> 00:48:35.000] Yeah, I mean, I guess.
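(A minimal sketch of the 100-nanometer convention Steve just described: one nanoscale dimension makes a nanosheet, two make a nanofiber or nanotube, three make a nanodot. The cutoff and the naming follow the discussion above; the function itself is only illustrative.)

```python
# Toy classifier for the 100 nm nanomaterial convention described above.
NANO_CUTOFF_NM = 100  # a dimension "counts" as nanoscale if it is below 100 nm

def classify_nanomaterial(dims_nm):
    """dims_nm: (x, y, z) dimensions of a particle, in nanometers."""
    nano_dims = sum(1 for d in dims_nm if d < NANO_CUTOFF_NM)
    labels = {
        0: "bulk material (no nanoscale dimension)",
        1: "nanosheet (one nanoscale dimension, e.g. thickness)",
        2: "nanofiber or nanotube (two nanoscale dimensions)",
        3: "nanodot (all three dimensions nanoscale)",
    }
    return labels[nano_dims]

print(classify_nanomaterial((5, 2000, 2000)))   # -> nanosheet
print(classify_nanomaterial((30, 30, 50000)))   # -> nanofiber or nanotube
print(classify_nanomaterial((20, 20, 20)))      # -> nanodot
```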
[00:48:35.320 --> 00:48:38.040] I think it just depends on what materials you're using.
[00:48:38.040 --> 00:48:40.600] And also, it depends on how you want to look at the number line, right?
[00:48:40.600 --> 00:48:45.960] Because you would talk about kilograms as anything between kilo and mega, and then you would move to megagrams.
[00:48:45.960 --> 00:48:47.640] But you're going up in the number line.
[00:48:47.640 --> 00:48:53.480] So you guys are assuming you should go down in the number line, but what if you always go up in the number line, even if you're on the other side of it?
[00:48:53.480 --> 00:48:55.160] Does that make sense, what I just said?
[00:48:55.160 --> 00:48:55.560] I think so.
[00:48:57.240 --> 00:49:03.800] Anything from 10 to the negative 9 to 10 to the negative 6, why would we call that micro and not nano?
[00:49:03.800 --> 00:49:08.360] Because you're saying you should move away from the number line, whether it's in the positive or the negative.
[00:49:08.360 --> 00:49:09.800] Or sorry, whether it's, yeah, toward the bigger end.
[00:49:09.960 --> 00:49:12.760] They just decided, though, it was 100 nanometers.
[00:49:13.080 --> 00:49:13.480] You're right.
[00:49:13.960 --> 00:49:14.280] Exactly.
[00:49:14.280 --> 00:49:14.800] These are all different.
[00:49:15.120 --> 00:49:15.600] They just decided.
[00:49:16.480 --> 00:49:18.080] Had to make a call here of some kind.
[00:49:14.760 --> 00:49:20.800] But microplastics and nanoplastics.
[00:49:21.360 --> 00:49:27.040] These terms are often used, I think, interchangeably, but they each have a specific size.
[00:49:27.040 --> 00:49:34.720] And a microplastic can be up to five millimeters, which is what, the size of a grain of rice or so?
[00:49:34.720 --> 00:49:35.120] Yeah, yeah.
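(To keep the size classes from this discussion straight: below 1 micron counts as a nanoplastic, and from 1 micron up to about 5 millimeters counts as a microplastic. The little sketch below only encodes those stated cutoffs; the function name and sample values are illustrative.)

```python
# Size classes as discussed above; working in millimeters.
# 1 mm = 1,000 micrometers = 1,000,000 nanometers.
MICRON_IN_MM = 0.001        # 1 micron = one thousandth of a millimeter
MICROPLASTIC_MAX_MM = 5.0   # upper bound quoted for "microplastic"

def classify_plastic_particle(size_mm):
    if size_mm < MICRON_IN_MM:
        return "nanoplastic (below 1 micron)"
    if size_mm <= MICROPLASTIC_MAX_MM:
        return "microplastic (1 micron up to 5 mm, roughly a grain of rice at the top end)"
    return "larger plastic debris"

print(classify_plastic_particle(0.0005))  # 0.5 micron -> nanoplastic
print(classify_plastic_particle(0.2))     # 0.2 mm     -> microplastic
```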
[00:49:35.440 --> 00:49:36.560] So we're talking about the brain.
[00:49:36.560 --> 00:49:37.280] We're talking about what?
[00:49:37.280 --> 00:49:38.880] Microplastics and nanoplastics?
[00:49:38.880 --> 00:49:39.200] And I'm thinking about it.
[00:49:39.520 --> 00:49:41.360] Because nobody's going to call it a milliplastic.
[00:49:41.600 --> 00:49:42.480] Gosh, right?
[00:49:42.480 --> 00:49:42.880] Yeah.
[00:49:42.880 --> 00:49:43.120] Right?
[00:49:43.360 --> 00:49:48.160] But are doctors finding plastic shards the size of rice grains in people's brains?
[00:49:48.640 --> 00:49:51.120] Is that what this news item is about to tell me?
[00:49:51.280 --> 00:49:52.960] How horrifying would that be?
[00:49:52.960 --> 00:49:55.280] Most doctors aren't just digging around in people's brains.
[00:49:56.000 --> 00:49:57.760] Well, I mean, it would be an autopsy.
[00:49:58.400 --> 00:49:59.040] Yeah, it would be a plastic.
[00:49:59.200 --> 00:49:59.760] Yeah, that's right.
[00:50:00.000 --> 00:50:01.200] And that's what this is.
[00:50:01.520 --> 00:50:04.480] And CNN, they did a good job of covering this one.
[00:50:05.040 --> 00:50:11.360] There's their headline: Tiny shards of plastic are increasingly infiltrating our brains, a study says.
[00:50:11.360 --> 00:50:17.200] And the author of the article is Sandee LaMotte, L-A-M-O-T-T-E.
[00:50:17.200 --> 00:50:18.480] And here's the first line.
[00:50:18.480 --> 00:50:20.400] I like how she starts this.
[00:50:20.400 --> 00:50:32.400] Human brain samples collected at autopsy in early 2024 contained more tiny shards of plastic than samples collected eight years prior, according to a preprint posted online in May.
[00:50:32.400 --> 00:50:38.880] And then she writes, a preprint is a study which has not yet been peer-reviewed and published in a journal.
[00:50:38.880 --> 00:50:41.760] So, first of all, thanks for saying that right from the top.
[00:50:41.760 --> 00:50:44.160] That's a preprint article.
[00:50:44.160 --> 00:50:46.720] And what a preprint article is.
[00:50:46.720 --> 00:50:49.520] We've talked about preprinted articles before, right?
[00:50:49.520 --> 00:50:52.200] And we've cited a lot of the benefits of those.
[00:50:52.000 --> 00:50:52.360] Right.
[00:50:52.800 --> 00:50:55.040] And the ways you should be, you know, skeptical, article too.
[00:50:55.040 --> 00:50:55.520] Yeah, cautious.
[00:50:55.680 --> 00:50:57.200] Right, you should be cautious, right?
[00:50:57.240 --> 00:51:01.640] And but it's important to remember the preprint studies, they've not been peer-reviewed yet.
[00:51:01.640 --> 00:51:01.880] Exactly.
[00:50:59.680 --> 00:51:05.800] So you have to, so you have to weigh all of this appropriately.
[00:51:05.960 --> 00:51:08.520] And it was good that they kind of get that out right at the beginning.
[00:51:08.520 --> 00:51:10.200] They link right to the source.
[00:51:10.200 --> 00:51:14.600] Oh, by the way, preprints are free, no paywalls.
[00:51:14.600 --> 00:51:19.640] You can go see them, and that's another benefit of having preprints.
[00:51:19.640 --> 00:51:32.040] Here's the study: Bioaccumulation of Microplastics in decedent human brains assessed by pyrolysis gas chromatography mass spectrometry.
[00:51:32.040 --> 00:51:32.760] Good job.
[00:51:33.000 --> 00:51:33.640] Nice one.
[00:51:33.640 --> 00:51:34.040] Thank you.
[00:51:34.040 --> 00:51:34.600] Yeah.
[00:51:34.600 --> 00:51:35.160] Yeah.
[00:51:35.160 --> 00:51:36.920] Which I definitely had to do some looking at.
[00:51:37.000 --> 00:51:38.360] Evan, I don't even know how you did that.
[00:51:38.840 --> 00:51:40.440] Yeah, slowly.
[00:51:40.440 --> 00:51:41.160] Slowly.
[00:51:41.160 --> 00:51:43.160] And I practiced.
[00:51:43.480 --> 00:51:48.440] If you chunk the word, break it up, and it's easier to pronounce in bite-sized chunks rather than all of them.
[00:51:48.520 --> 00:51:49.480] Oh, is that how you do about it?
[00:51:49.560 --> 00:51:50.600] Is it gestalt?
[00:51:51.240 --> 00:51:54.680] Sometimes if it's difficult for me.
[00:51:54.680 --> 00:51:55.960] Nice use of gestalt.
[00:51:56.200 --> 00:51:56.760] I like that.
[00:51:56.760 --> 00:52:01.160] And for me, it's words like spectrometry that trip me up.
[00:52:01.160 --> 00:52:02.520] There are two R's in that word.
[00:52:03.160 --> 00:52:05.720] But it sounds like there should only be one R.
[00:52:05.720 --> 00:52:10.440] That's why I always say spec, because I always mix up spectroscopy and spectrometry.
[00:52:10.920 --> 00:52:12.440] I just say spec.
[00:52:12.760 --> 00:52:13.800] Smart and aspect.
[00:52:14.920 --> 00:52:18.680] All right, here are the highlights from the abstract and then from the conclusion.
[00:52:18.680 --> 00:52:20.200] I'll just touch on these.
[00:52:20.200 --> 00:52:20.920] Abstract.
[00:52:20.920 --> 00:52:23.720] We compared MNP accumulation.
[00:52:23.720 --> 00:52:28.360] Now, MNP means micro and nanoplastics, MNP.
[00:52:28.360 --> 00:52:32.600] We compared MNP accumulation in kidneys, livers, and brains.
[00:52:32.600 --> 00:52:38.600] These were autopsy samples collected in 2016 and in 2024.
[00:52:39.560 --> 00:52:41.960] It was an analysis of 12 polymers.
[00:52:41.960 --> 00:52:47.760] All organs exhibited significant increases from 2016 to 2024.
[00:52:44.680 --> 00:52:50.480] Polyethylene was the predominant polymer.
[00:52:50.720 --> 00:52:57.200] The polyethylene MNPs were greater in the brain samples than in the liver or kidney.
[00:52:57.200 --> 00:52:57.680] Oh my God.
[00:52:57.840 --> 00:52:58.000] Right?
[00:52:58.240 --> 00:52:59.760] Both bioaccumulating.
[00:53:00.240 --> 00:53:04.160] Yeah, for some reason it's gathering, it likes the brain.
[00:53:04.400 --> 00:53:05.520] And they verified it.
[00:53:05.520 --> 00:53:16.800] They verified the nanoscale nature of the isolated particles, which largely appeared to be aged shard-like plastics, remnants across a wide range of sizes.
[00:53:17.440 --> 00:53:18.320] Not good.
[00:53:18.320 --> 00:53:28.960] From the conclusion, MNP concentrations in decedent brain samples ranged from 7 to 30 times the concentrations seen in the livers and kidneys.
[00:53:28.960 --> 00:53:41.040] The majority of these are on the nanometer scale, so not the micro scale, the nanometer scale, the small, small pieces, but they are shard-like particles, definitely in their shape.
[00:53:41.040 --> 00:53:49.840] Lead study author Matthew Campen, who is a Regents' Professor of pharmaceutical sciences at the University of New Mexico in Albuquerque.
[00:53:49.840 --> 00:53:52.080] He had some things to say about this.
[00:53:52.080 --> 00:53:59.600] He says, somehow these nanoplastics hijack their way through the body and get to the brain, crossing the blood-brain barrier.
[00:53:59.600 --> 00:54:11.520] Plastics love fats or lipids, so one theory is that the plastics are hijacking their way with the fats we eat when they are delivered to the organs that really like lipids.
[00:54:11.520 --> 00:54:14.240] The brain is top among those.
[00:54:14.240 --> 00:54:30.600] He also says the concentrations we saw in the brain tissue of normal individuals who had an average age of around 40 or 45 or 50 years old were 4,800 micrograms per gram, or about one-half percent by weight.
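(The arithmetic behind that "one-half percent by weight" figure, using nothing beyond the quoted numbers:)

```latex
4800~\frac{\mu\text{g}}{\text{g}}
  \;=\; \frac{4800 \times 10^{-6}~\text{g}}{1~\text{g}}
  \;=\; 0.0048
  \;\approx\; 0.48\%
  \;\approx\; \tfrac{1}{2}\% \text{ by weight}
```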
[00:54:30.760 --> 00:54:35.880] That's interesting that the average age of the decedents was so young in their sample.
[00:54:29.920 --> 00:54:36.200] Yeah.
[00:54:36.520 --> 00:54:37.880] 45 to 50?
[00:54:37.880 --> 00:54:39.880] I wonder what brain bank they were working with.
[00:54:40.040 --> 00:54:41.320] Yeah, why that?
[00:54:41.480 --> 00:54:43.320] Maybe they intentionally chose younger brains.
[00:54:43.320 --> 00:54:43.880] I don't know.
[00:54:43.880 --> 00:54:49.320] 92 people underwent the forensic autopsy to verify cause of death in both 2016 and 2024.
[00:54:49.320 --> 00:54:52.600] Brain tissue samples were gathered from the frontal cortex.
[00:54:53.400 --> 00:54:55.800] He went on to say, oh, that's frontal lobe.
[00:54:56.200 --> 00:54:57.560] This is cortical.
[00:54:57.560 --> 00:54:59.960] It's not like in the ventricle or something.
[00:55:00.280 --> 00:55:08.440] Because they were talking about perhaps what, a potential connection for Alzheimer's.
[00:55:08.600 --> 00:55:11.080] Yeah, we've seen lots of studies where they're trying to do it.
[00:55:11.800 --> 00:55:16.440] Again, they really, I think, more just touched on that rather than made that kind of the focus of this.
[00:55:16.600 --> 00:55:24.360] Steve, does that interest you that they were actually looking at cortical tissue and not like the lining of ventricles or like deeper structures?
[00:55:24.360 --> 00:55:25.080] Yeah, definitely.
[00:55:25.080 --> 00:55:27.000] I mean, you have to wonder how is it getting there?
[00:55:27.400 --> 00:55:27.880] Right.
[00:55:28.280 --> 00:55:37.560] Is it like, I was actually thinking, is there some process like the gold, where it's binding together from the blood or whatever as it's passing through the tissue?
[00:55:37.560 --> 00:55:39.720] I'm not really sure how it would get there.
[00:55:39.720 --> 00:55:51.560] Yeah, I guess I'd be curious, and Evan, I doubt you saw this, but like on histology, I'd be curious if it's almost like coating the brain or if it's like deep in the matrix, like deep into the cellular tissue.
[00:55:51.960 --> 00:55:57.480] And I think one of the things they said is they're unsure if the particles are fluid, right?
[00:55:57.480 --> 00:56:05.880] Kind of entering and leaving the brain or if they're collecting in the tissues and right, so are they just bathing it or are they working their way in?
[00:56:06.760 --> 00:56:14.280] So, but they definitely, the authors are the first to admit that further research is really needed here to understand what's going on.
[00:56:14.280 --> 00:56:18.240] I mean, there is just one study, it's not peer-reviewed yet, right?
[00:56:18.240 --> 00:56:23.920] So, again, this is where you have to take this a little bit with a grain of salt.
[00:56:23.920 --> 00:56:34.240] But it also makes you wonder, like, if they saw it in like all their brains and really they're just comparing concentrations, I think we're going to start seeing a lot of this research.
[00:56:34.240 --> 00:56:46.000] And they're saying that this study does not talk about brain damage or any sort of right, what the impact it's having, if at all, frankly.
[00:56:46.000 --> 00:56:52.560] I mean, it must be doing something, but is it so negligible that it's not having an impact?
[00:56:52.560 --> 00:56:59.600] I mean, definitely worthy of more study, but they're saying that you can't just draw too many conclusions off of this one study yet.
[00:56:59.600 --> 00:57:01.600] A lot more has to be done here.
[00:57:01.600 --> 00:57:02.720] All right, thanks, Evan.
[00:57:02.720 --> 00:57:03.040] Yep.
[00:57:03.040 --> 00:57:06.480] All right, Bob, tell us about quantum neural networks.
[00:57:06.480 --> 00:57:07.120] Ooh.
[00:57:07.120 --> 00:57:08.000] Ooh.
[00:57:08.000 --> 00:57:09.040] Or that sounds like bullshit.
[00:57:09.200 --> 00:57:11.280] Sounds a little Deepak Chopra-y to me.
[00:57:11.440 --> 00:57:11.760] Yeah.
[00:57:11.760 --> 00:57:16.800] I was like, ooh, and then I was like, quantum will do that to you.
[00:57:17.760 --> 00:57:18.560] All right, guys.
[00:57:18.560 --> 00:57:26.720] A researcher has devised a quantum neural network that appears to interpret some optical illusions the way people see them.
[00:57:26.720 --> 00:57:28.960] That really was so fascinating.
[00:57:28.960 --> 00:57:30.800] This is by Ivan S.
[00:57:30.800 --> 00:57:34.320] Maksymov, and he published in the journal APL Machine Learning.
[00:57:34.320 --> 00:57:39.840] The title of his study is Quantum-Tunneling Deep Neural Network for Optical Illusion Recognition.
[00:57:39.840 --> 00:57:43.120] So a machine learning system falling prey to optical illusions.
[00:57:43.120 --> 00:57:47.840] It really caught my attention this week, but my buddy the devil is in the details, as they say.
[00:57:47.840 --> 00:57:49.360] So, what's going on here?
[00:57:49.600 --> 00:57:52.000] As usual, we need to set the table a bit.
[00:57:52.000 --> 00:57:56.320] So, let's start with neural networks, which we've covered here and there on the show before.
[00:57:56.320 --> 00:57:57.680] Let's do a quick overview.
[00:57:57.680 --> 00:58:05.400] Neural networks are a branch of machine learning that attempts to process data in a way that is loosely, very loosely inspired by the human brain.
[00:58:05.720 --> 00:58:12.520] It consists of layers of interconnected artificial neuron-like elements that allow it to store and classify data.
[00:58:12.520 --> 00:58:21.080] Imagine you have a signal arriving to an artificial neuron, and it's met by this traffic cop, and that traffic cop is called an activation function.
[00:58:21.080 --> 00:58:35.240] This function looks at all the signals that are coming in from other neurons, and it applies these weights or importance to those signals, and then decides if that signal is worthy of being passed on to the next neuron or layer.
[00:58:35.240 --> 00:58:36.200] Okay, you got that?
[00:58:36.200 --> 00:58:44.600] This is the activation function, which determines whether these signals have reached a level where they can be passed to the next level of neurons.
[00:58:44.840 --> 00:58:48.280] And when I say neuron, I'm just gonna, it's gonna be, I mean artificial neuron.
[00:58:48.280 --> 00:58:49.640] These aren't real neurons here.
[00:58:49.640 --> 00:58:55.560] So, as the neuron receives the right signals, it gets closer and closer to being activated, right?
[00:58:55.560 --> 00:58:57.560] It's like climbing a tall wall.
[00:58:57.560 --> 00:59:02.920] Eventually, it reaches the top of the wall, and the signal can then go to the next neuron.
[00:59:03.400 --> 00:59:04.760] It would get passed on.
[00:59:04.760 --> 00:59:08.920] In other words, the activation function allows the signal to continue.
[00:59:09.160 --> 00:59:13.880] Sorry, Bob, to be clear, just and I feel like you clarified this, but I'm still confused.
[00:59:13.880 --> 00:59:14.280] No, that's fine.
[00:59:14.680 --> 00:59:18.280] An artificial neuron is software, not hardware, right?
[00:59:18.280 --> 00:59:20.200] Right, yeah, this is all part of the.
[00:59:20.360 --> 00:59:23.480] This is all decision-making gates on off, yes, no.
[00:59:23.480 --> 00:59:24.200] Okay, making sure.
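(A minimal sketch of the artificial neuron Bob is describing: weights applied to the incoming signals, then an activation function acting as the "traffic cop." This is the generic textbook structure with a sigmoid activation, not anything specific from Maksymov's paper; the names, weights, and threshold are illustrative.)

```python
import math

def weighted_sum(inputs, weights, bias=0.0):
    # Each incoming signal is scaled by its weight (its "importance"), then summed.
    return sum(x * w for x, w in zip(inputs, weights)) + bias

def sigmoid(z):
    # A common activation function: squashes the sum into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def artificial_neuron(inputs, weights, bias=0.0, threshold=0.5):
    activation = sigmoid(weighted_sum(inputs, weights, bias))
    # The "traffic cop": the signal is passed on only if it clears the threshold.
    fires = activation >= threshold
    return activation, fires

print(artificial_neuron([0.2, 0.9, 0.4], [0.5, 1.2, -0.3]))
```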
[00:59:24.200 --> 00:59:30.680] So, Maksymov here is using not just a neural network, though, but he's employing a quantum neural network.
[00:59:30.680 --> 00:59:33.800] And this is not that as complicated as you might think.
[00:59:33.800 --> 00:59:37.960] It's essentially a neural network that uses some principle of quantum mechanics.
[00:59:37.960 --> 00:59:41.480] So, in this case, the primary principle here is quantum tunneling.
[00:59:41.640 --> 00:59:44.200] Quantum tunneling, we've mentioned this a couple times.
[00:59:44.200 --> 00:59:53.440] Quantum tunneling occurs when a particle like an electron, say, or an atom, passes through a potential energy barrier that's higher in energy than the particle's kinetic energy.
[00:59:53.440 --> 00:59:54.800] So this is actually real.
[00:59:54.800 --> 00:59:58.320] Quantum tunneling is one of the more bizarre aspects of quantum mechanics.
[00:59:58.640 --> 00:59:59.840] This is the real deal.
[01:00:00.080 --> 01:00:03.840] Tunneling is a critical component to make stars shine.
[01:00:03.840 --> 01:00:09.200] Now, you know how hydrogen atoms in a star have to get close enough to fuse into helium and release the energy, right?
[01:00:09.200 --> 01:00:10.000] That's a kind of.
[01:00:10.000 --> 01:00:11.600] It's from the nuclear force, right?
[01:00:12.560 --> 01:00:14.240] That's what the star does.
[01:00:14.240 --> 01:00:27.520] It turns out that much of the time, though, the hydrogen only gets kind of close, and it covers that final little distance because quantum tunneling lets it go the last little bit it needs to fuse.
[01:00:27.520 --> 01:00:33.360] So without quantum tunneling, stars basically would hardly be able to fuse at all.
[01:00:33.360 --> 01:00:38.160] I mean, it would still fuse some, but it wouldn't be able to fuse nearly as much as they do.
[01:00:38.160 --> 01:00:40.880] And stars themselves might not even exist.
[01:00:40.880 --> 01:00:44.880] And the universe would certainly be in a very different place than it is today.
[01:00:44.880 --> 01:00:50.560] So yeah, quantum tunneling is bizarre, it's real, and it's critical to the universe as we know it.
[01:00:50.560 --> 01:00:52.720] And that's only one of the things that it's critical for.
[01:00:52.720 --> 01:00:55.760] It's also critical for radioactive decay, blah, blah, blah.
[01:00:55.760 --> 01:00:56.240] All right.
[01:00:56.240 --> 01:01:07.440] Anywho, so putting it all together then, Maksymov's quantum neural network allows the signal to pass right through that wall, rather than having to slowly climb it signal after signal.
[01:01:07.440 --> 01:01:14.640] So put another way, a form of quantum tunneling is now part of the activation function, that traffic cop.
[01:01:14.800 --> 01:01:22.880] Quantum tunneling is part of that function, and every now and then it's going to say, all right, you don't have to climb the wall, you're just going to go right through the wall, in a sense.
[01:01:22.880 --> 01:01:35.880] Or, you know, more realistically, this artificial neuron will be activated immediately, and you don't have to go through the long, laborious process of collecting lots of signals before it's allowed to fire.
[01:01:36.040 --> 01:01:38.040] So, that's basically what he implemented here.
[01:01:38.120 --> 01:01:41.160] No one's ever done this before.
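(A toy illustration of the idea just described: an activation function that usually requires the signal to climb the wall, but occasionally lets a sub-threshold signal tunnel straight through. This is not Maksymov's actual activation function, which is specified in his paper; the exponential fall-off below is just a common way to caricature a tunneling probability that shrinks as the remaining barrier grows.)

```python
import math
import random

def tunneling_activation(signal, threshold=1.0, barrier_scale=3.0):
    """Toy 'tunneling-style' activation (illustrative only, not the paper's).

    At or above the threshold the neuron fires as usual. Below the threshold
    there is still a small chance it fires, with a probability that decays
    exponentially with how far the signal sits below the barrier.
    """
    if signal >= threshold:
        return 1.0  # climbed the wall: ordinary activation
    p_tunnel = math.exp(-barrier_scale * (threshold - signal))
    return 1.0 if random.random() < p_tunnel else 0.0

random.seed(0)
fires = sum(tunneling_activation(0.6) for _ in range(10_000))
print(f"sub-threshold signal fired {int(fires)} times out of 10,000")
```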
[01:01:41.160 --> 01:01:44.040] So, and this is now where the optical illusion part comes in.
[01:01:44.040 --> 01:01:57.480] Maksymov trained his quantum neural network on two famous optical illusions that I'm sure all of you have heard about, or at least will recognize once I describe them: the Necker cube and the Rubin vase, or faces, illusion.
[01:01:57.480 --> 01:02:05.880] Now, the Necker cube, just imagine a wireframe cube drawn so that it appears to be three-dimensional.
[01:02:05.880 --> 01:02:10.600] So, your brain can interpret it as oriented one way or the other, right?
[01:02:10.600 --> 01:02:13.320] Depending on which one you focus on.
[01:02:13.320 --> 01:02:19.880] So, that's just an example of these ambiguous optical illusions where your brain kind of decides which way it goes.
[01:02:19.880 --> 01:02:23.480] The other one that's even more famous, the Rubin vase.
[01:02:23.640 --> 01:02:31.560] This is the one where it looks like a vase, but it also can look like the negative space could also appear to be two faces that are looking at each other.
[01:02:31.560 --> 01:02:32.600] And you go back and forth.
[01:02:32.600 --> 01:02:36.600] You see a vase, then you see two faces looking at each other, and you go back and forth.
[01:02:36.600 --> 01:02:38.280] So, that's the Rubin vase.
[01:02:38.280 --> 01:02:41.720] I'm sure you guys have all seen the vase one, right?
[01:02:41.720 --> 01:02:42.120] Yeah.
[01:02:43.080 --> 01:02:53.400] So, Maksymov then says, researchers believe we temporarily hold both interpretations at the same time until our brains decide which picture should be seen.
[01:02:53.400 --> 01:02:59.720] Now, this is an important, really important statement, maybe one of the most important statements in his paper.
[01:02:59.720 --> 01:03:04.520] And he even references five studies to back that statement up.
[01:03:05.000 --> 01:03:12.040] That idea that the brain, that researchers believe we temporarily hold both interpretations at the same time.
[01:03:12.040 --> 01:03:24.880] Now, that struck me as really odd because I've read about optical illusions many, many times, and they always seem to say that you can't hold both images in your mind at once.
[01:03:24.880 --> 01:03:30.080] So now he's saying that, oh, researchers think we can now, and that was very surprising to me.
[01:03:30.080 --> 01:03:36.800] He references five studies, and I didn't really have time to dive into them, but I mean, that's what he's saying here.
[01:03:36.800 --> 01:03:49.920] He says in his paper, in the paper proper, he says that those studies have been interpreted as the ability of humans to see a quantum-like superposition of one and zero states.
[01:03:49.920 --> 01:03:53.680] So, quantum superposition, we've mentioned that on the show as well.
[01:03:53.920 --> 01:04:02.080] That's this idea that you're not one or the other, but you're some unusual mixture amalgamation of both of them at the same time.
[01:04:02.080 --> 01:04:03.680] That's what a superposition is.
[01:04:03.680 --> 01:04:08.320] You know, like, for example, an electron being spin up and spin down at the same time.
[01:04:08.320 --> 01:04:17.440] It's in a superposition of up and down until you interact with it, or until something interacts with it, and then the wave function collapses and it picks up or down, one or the other.
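(In standard notation, the superposition Bob describes for a two-state system, spin up/down here, or vase/faces in Maksymov's analogy, is written as a weighted combination of both states, with the squared weights giving the probabilities once a measurement forces a choice:)

```latex
|\psi\rangle \;=\; \alpha\,|{\uparrow}\rangle + \beta\,|{\downarrow}\rangle,
\qquad |\alpha|^{2} + |\beta|^{2} = 1 .
```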
[01:04:17.440 --> 01:04:26.400] So he's saying that, so humans basically have this ability to have a superposition of these faces/vases in their minds.
[01:04:26.400 --> 01:04:28.640] And that just struck me as very odd.
[01:04:28.640 --> 01:04:30.000] I had never heard of any of that before.
[01:04:30.000 --> 01:04:36.640] Steve, have you guys ever heard of anything like that about human perception taking advantage of quantum superposition?
[01:04:37.120 --> 01:04:37.760] So, yeah.
[01:04:37.760 --> 01:04:39.600] So, I'm a little skeptical of that for sure.
[01:04:39.920 --> 01:04:43.360] Yeah, because I feel like that's often used in, like, the woo-woo, What the Bleep Do We Know kind of stuff.
[01:04:43.440 --> 01:04:44.560] Exactly, exactly.
[01:04:44.560 --> 01:04:49.920] And the reason why I'm belaboring this point is because it becomes obviously more important in a moment.
[01:04:50.240 --> 01:05:00.280] Okay, so when presented with these optical illusions, Maksymov's neural network would sometimes describe a vase, basically saying, yeah, hey, I'm seeing a vase here.
[01:05:00.440 --> 01:05:08.280] I don't know what his interface is with these quantum neural networks was, but essentially, if the neural network could speak, it would say, I see a vase.
[01:05:08.280 --> 01:05:10.600] And other times, it would see...
[01:05:10.600 --> 01:05:14.120] it would say, I see a face, just like a human does.
[01:05:14.120 --> 01:05:27.080] And that's interesting, but it's not unique because there are other implementations of neural networks, not necessarily quantum though, but other AI systems have had similar behavior.
[01:05:27.080 --> 01:05:31.560] They have also said, I see a vase, and then five minutes later, oh, I see a face.
[01:05:31.560 --> 01:05:38.040] So they go back and forth, which is interesting, but it's not unique for his implementation there.
[01:05:38.040 --> 01:05:51.960] Now, he says this, though, but in addition, my network produced some ambiguous results hovering between the two certain outputs, much like our own brains can hold both interpretations together before settling on one.
[01:05:51.960 --> 01:05:54.360] Now, this is what he claims is unique.
[01:05:54.360 --> 01:06:01.400] This is what sets his implementation apart from other neural networks, and this is unique to his quantum neural network.
[01:06:01.400 --> 01:06:17.640] What he's saying is that sometimes his neural network doesn't describe a face or a vase, but seems to describe something that has elements of both at the same time, like humans appear to, according to the studies that he cites.
[01:06:17.640 --> 01:06:18.760] Okay, you got that?
[01:06:18.760 --> 01:06:30.280] So he thinks he's really onto something here because both people and his quantum neural network appear to be able to perceive images this way in this apparent superposition.
[01:06:30.280 --> 01:06:33.240] That's what sets his apart according to him.
[01:06:33.240 --> 01:06:48.160] So, my biggest problem with his paper, if you haven't figured it out already: his main achievement here seems to rely on this idea that human perception relies on quantum superposition at some level, which, as far as I can tell, is not the consensus at all.
[01:06:44.920 --> 01:06:55.120] You know, there may have been, I look, I did see references to some studies that may be interpreted that way.
[01:06:56.880 --> 01:06:59.040] Interpretated, yeah, interpreted that way.
[01:06:59.280 --> 01:07:04.160] But that's far, far from five sigma confidence levels at all.
[01:07:04.160 --> 01:07:10.560] But all that said, Maksymov does say the following in an article he wrote recently.
[01:07:10.560 --> 01:07:21.600] He said, while some have suggested that quantum effects play an important role in our brains, even if they don't, we may still find the laws of quantum mechanics useful to model human thinking.
[01:07:21.600 --> 01:07:37.840] So, okay, so here he's kind of saying, yeah, it kind of makes me think that maybe he doesn't totally buy this idea that quantum mechanics is critical or very, very important to human cognition or perception, which is not the consensus at all.
[01:07:38.080 --> 01:07:39.760] So that's the crux right there.
[01:07:39.760 --> 01:07:41.200] To me, this is the crux.
[01:07:41.200 --> 01:07:47.520] Does his study show that the laws of quantum mechanics might be useful to model human thinking?
[01:07:47.520 --> 01:07:49.600] And my answer to that is, maybe.
[01:07:49.600 --> 01:07:57.920] You know, maybe this type of modeling where you're modeling some aspect of human cognition using quantum mechanics, it may be a useful model.
[01:07:57.920 --> 01:08:02.400] It may have some interesting results that have some utility.
[01:08:02.400 --> 01:08:04.960] And I think that would be certainly very interesting.
[01:08:04.960 --> 01:08:09.360] And I think this is actually worth exploring more deeply because, I mean, who knows?
[01:08:09.360 --> 01:08:20.080] Even though the weird counterintuitive aspects of quantum mechanics, maybe it's not, you know, it's probably, in my mind, it doesn't seem like it's part of a critical aspect of human thinking.
[01:08:20.080 --> 01:08:35.720] But if you can incorporate that into these quantum neural networks and have some and create some interesting models and get some insights into how humans think, or even just creating some very new and interesting tools, then I think this would be definitely worth some further research and some extra money for this.
[01:08:36.120 --> 01:08:43.720] But it doesn't mean that consciousness is a quantum phenomenon and that we are entangled with the universe.
[01:08:43.720 --> 01:08:44.040] Right.
[01:08:44.280 --> 01:08:45.000] Absolutely not.
[01:08:45.000 --> 01:08:45.560] It doesn't mean that.
[01:08:45.880 --> 01:08:46.920] It doesn't mean that at all.
[01:08:46.920 --> 01:08:53.640] And it can prove to be potentially a very helpful, you know, helpful model that can give us some insights.
[01:08:53.640 --> 01:09:00.440] But even then, in that case, which is a big if, it doesn't necessarily mean that consciousness is quantum either.
[01:09:01.000 --> 01:09:04.920] It's just a model, an interesting model that has utility.
[01:09:04.920 --> 01:09:09.960] Well, everyone, we're going to take a quick break from our show to talk about one of our sponsors this week, Quince.
[01:09:09.960 --> 01:09:14.680] Yeah, guys, my wife and I have been shopping at Quince for years, and they're great.
[01:09:14.680 --> 01:09:17.160] I mean, let me tell you the breakdown here.
[01:09:17.240 --> 01:09:21.000] I recently got their Mongolian Cashmere pullover hoodie.
[01:09:21.000 --> 01:09:23.640] And when you hear Cashmere, you're like, it's going to cost a lot of money.
[01:09:23.640 --> 01:09:27.320] Well, the cool thing about Quince is that their prices are awesome.
[01:09:27.320 --> 01:09:28.360] And they're way less.
[01:09:28.360 --> 01:09:31.640] They're like 50 to 80% less than similar brands.
[01:09:31.640 --> 01:09:33.400] And these items are freaking great, man.
[01:09:33.400 --> 01:09:38.280] They're super well constructed, and that's important to me because I buy something and I want to have it for a long time.
[01:09:38.280 --> 01:09:41.400] They've got other stuff like leather jackets, cotton cardigans.
[01:09:41.400 --> 01:09:42.920] They have, you know, jeans.
[01:09:42.920 --> 01:09:43.880] They have a ton of stuff.
[01:09:43.880 --> 01:09:45.880] You just got to go to the website and check it out.
[01:09:45.880 --> 01:09:49.240] Get cozy in Quince's high-quality wardrobe essentials.
[01:09:49.240 --> 01:09:56.680] Go to quince.com/slash SGU for free shipping on your order and 365-day returns.
[01:09:56.680 --> 01:10:04.520] That's q-u-i-n-c-e.com/slash s-g-u to get free shipping and 365-day returns.
[01:10:04.520 --> 01:10:07.160] Quince.com/slash SGU.
[01:10:07.160 --> 01:10:09.720] All right, guys, let's get back to the show.
[01:10:10.680 --> 01:10:13.400] All right, I'm going to finish up with a very quick news item.
[01:10:13.400 --> 01:10:15.680] I'm going to start by asking you a question.
[01:10:16.000 --> 01:10:25.200] So, which animals have been shown to use individual names to identify members of their own group?
[01:10:25.200 --> 01:10:26.240] Oh, interesting.
[01:10:26.240 --> 01:10:26.880] Oh, yeah.
[01:10:26.880 --> 01:10:30.400] Well, it was Tennessee Tuxedo and his Walrus Chumley.
[01:10:30.640 --> 01:10:31.600] That's one.
[01:10:31.600 --> 01:10:32.240] Dolphins.
[01:10:32.640 --> 01:10:33.840] Dolphins is correct.
[01:10:33.840 --> 01:10:34.400] Oh, I know.
[01:10:34.400 --> 01:10:35.040] I know for sure.
[01:10:35.040 --> 01:10:37.040] Oh, you meant, like, in reality.
[01:10:37.440 --> 01:10:38.720] Yes, in the real world.
[01:10:39.920 --> 01:10:40.640] Humans for sure.
[01:10:41.120 --> 01:10:44.560] It was humans other than humans, dolphins, and two other species.
[01:10:45.120 --> 01:10:45.680] Chimpanzees.
[01:10:46.000 --> 01:10:47.280] Chimpanzees and elephants.
[01:10:47.680 --> 01:10:48.560] Elephants is correct.
[01:10:48.560 --> 01:10:50.160] Chimpanzees is wrong.
[01:10:50.160 --> 01:10:50.480] Right.
[01:10:50.640 --> 01:10:51.920] Is it another ape, though?
[01:10:51.920 --> 01:10:52.800] How about bonobos?
[01:10:53.440 --> 01:10:54.960] No, no primate.
[01:10:54.960 --> 01:10:55.600] No primate.
[01:10:55.840 --> 01:10:56.240] Interesting.
[01:10:56.320 --> 01:10:56.640] Bird.
[01:10:56.640 --> 01:10:57.680] It's got to be a bird.
[01:10:57.680 --> 01:10:58.400] Yes, it's a bird.
[01:10:58.400 --> 01:10:58.960] What kind of bird?
[01:10:59.360 --> 01:11:00.400] No, a cockatoo.
[01:11:00.960 --> 01:11:01.360] My mama.
[01:11:01.520 --> 01:11:01.760] Parrots.
[01:11:01.920 --> 01:11:02.240] A raven.
[01:11:02.880 --> 01:11:03.680] Some parrots.
[01:11:03.680 --> 01:11:11.760] So dolphins, elephants, and some parrots have been shown to use individualized sounds to refer to specific individuals.
[01:11:11.760 --> 01:11:12.400] Wow.
[01:11:12.640 --> 01:11:19.120] No primate, no non-human primates until this new study.
[01:11:19.120 --> 01:11:19.680] Interesting.
[01:11:19.680 --> 01:11:20.080] Which foundation.
[01:11:20.240 --> 01:11:21.280] Can I ask a quick question?
[01:11:21.680 --> 01:11:30.480] Prior to this new study where they discovered vocalizations that were individualized, did they use hand signals or other sort of signs?
[01:11:30.800 --> 01:11:32.080] So, no.
[01:11:32.080 --> 01:11:36.160] So there's no evidence that they have individual names by whatever means.
[01:11:36.160 --> 01:11:37.200] Oh, okay, interesting.
[01:11:37.200 --> 01:11:37.520] Yeah.
[01:11:37.520 --> 01:11:44.800] But so a recent study has apparently demonstrated individual names in marmosets.
[01:11:45.200 --> 01:11:48.960] So again, not even apes, in marmosets, which is interesting.
[01:11:48.960 --> 01:11:58.320] So they make calls, which they call phee calls, which are specific to individual members of their group, and they can identify that.
[01:11:58.320 --> 01:12:03.640] Not only that, but they clearly teach each other these names, right?
[01:12:03.640 --> 01:12:05.400] These are learned.
[01:12:05.400 --> 01:12:14.600] And if you compare one family group to another family group, each family group tends to use names which sound similar.
[01:12:14.600 --> 01:12:21.560] So there appears to be a culture or a dialect within each group that differs from group to group.
[01:12:21.560 --> 01:12:25.640] Which, again, is just more evidence that they're teaching these to each other.
[01:12:25.880 --> 01:12:30.040] It's not like sounds that are innate or evolved.
[01:12:30.040 --> 01:12:30.360] Right.
[01:12:30.360 --> 01:12:31.720] Yeah, so it's interesting.
[01:12:31.720 --> 01:12:36.680] So the question is: why marmosets, you know, of all the primates?
[01:12:36.680 --> 01:12:41.080] Is this just because we just haven't been able to see it in other primates?
[01:12:41.080 --> 01:12:42.760] And like, why, again, why dolphins?
[01:12:42.760 --> 01:12:43.960] Why elephants?
[01:12:43.960 --> 01:12:48.680] So what do you guys think, first of all, but do you have any idea why marmosets?
[01:12:48.680 --> 01:12:49.640] Is it random?
[01:12:49.640 --> 01:12:51.320] Is it just do they have some other?
[01:12:51.480 --> 01:12:53.240] Something about their social structure?
[01:12:53.400 --> 01:12:58.360] That's a good hypothesis, something about their social structure, but it's something else too, probably.
[01:12:58.360 --> 01:13:06.520] And we don't know for sure, but there is something that all of these groups have in common, that dolphins, elephants, parrots, and marmosets have in common.
[01:13:06.520 --> 01:13:14.280] In addition to having the neurological sophistication to do this, it's just their environment.
[01:13:14.280 --> 01:13:19.560] In their environment, they lose visual contact with other members of their group.
[01:13:19.560 --> 01:13:21.080] Oh, interesting.
[01:13:21.080 --> 01:13:24.360] So these are dolphins that live in murky water.
[01:13:24.360 --> 01:13:28.600] Elephants can, just by sheer distance, they can communicate over miles.
[01:13:28.600 --> 01:13:30.520] Yeah, the infrasound travels far.
[01:13:30.520 --> 01:13:35.000] They have lost visual contact with their members because of distance.
[01:13:35.000 --> 01:13:44.440] And parrots and marmosets live in dense jungles in the trees where it's so dense they can't see each other even a short distance away.
[01:13:44.440 --> 01:13:58.080] So, the researchers think that it's this combination of they're neurologically sophisticated enough to produce this, they have the ability to produce very specific sounds, and they have the need to because of their environment.
[01:13:58.080 --> 01:14:06.480] You know, they're not just like, for example, crows, which will send out warnings to all other crows within range of their voice, right?
[01:14:06.480 --> 01:14:12.800] You know, they're just making sounds that are attracting a mate or a warning of a predator or whatever.
[01:14:12.800 --> 01:14:15.600] It's not directed at a specific individual.
[01:14:15.600 --> 01:14:26.800] But now, if you're a mommy marmoset and you've lost sight of your baby, you want to call them specifically, not just send out a call to all marmosets, right?
[01:14:27.040 --> 01:14:36.400] And so, yeah, it'd be evolutionarily advantageous to be able to do that, to summon your individual family member to you.
[01:14:36.400 --> 01:14:38.800] It's just weird because there are other monkeys that live in jungles.
[01:14:38.960 --> 01:14:39.360] Yeah, I know.
[01:14:39.360 --> 01:14:41.120] So, you know, it's like, how come they didn't know?
[01:14:41.280 --> 01:14:42.960] It's probably a combination of environment.
[01:14:43.040 --> 01:14:46.960] Well, it's also, even though they're probably also more common than we think.
[01:14:47.120 --> 01:14:48.640] Yeah, it might be more common than we think.
[01:14:48.640 --> 01:14:56.960] It may be that it's not just like they're in the jungle, it's like where in the jungle and what's their behavior and how big are they and how colorful are they, whatever.
[01:14:56.960 --> 01:15:05.520] And just a number of factors together that just create the evolutionary pressure of a specific advantage to being able to do this.
[01:15:05.520 --> 01:15:06.800] And then they have to hit upon it.
[01:15:07.200 --> 01:15:10.480] It's not going to evolve every time it would be beneficial.
[01:15:10.480 --> 01:15:21.200] But yeah, I wonder how many more species do this that we just haven't been able to observe long enough in the appropriate setting to document it.
[01:15:21.200 --> 01:15:27.120] And of course, they always raise, like, could this help us learn about the evolution of language in humans?
[01:15:27.120 --> 01:15:27.840] I don't know.
[01:15:27.840 --> 01:15:28.800] I guess so.
[01:15:28.800 --> 01:15:34.360] You know, unfortunately, there's a huge gap in our language evidence, right?
[01:15:34.360 --> 01:15:41.880] So we have evidence of human language going back 4,000 years or so to the earliest writings.
[01:15:42.440 --> 01:15:55.880] We have modern extant modern human languages that we could examine to see like what do they have in common and to try to reverse engineer like the way languages evolved over human history.
[01:15:56.360 --> 01:16:01.560] So basically we have this several thousand year window into modern languages.
[01:16:01.560 --> 01:16:08.600] And then if we go to our non-human ancestors, we're going back eight plus million years.
[01:16:08.600 --> 01:16:13.560] And we basically have no information between eight million years and four thousand years ago.
[01:16:13.560 --> 01:16:13.960] Wow.
[01:16:14.280 --> 01:16:32.360] In terms of the evolution of language, we don't have any window into it. All we can say is anatomical: we can look at the anatomy and ask whether they could have spoken, but that's more about their physical ability to speak, not necessarily about their language.
[01:16:32.360 --> 01:16:39.080] And so we know that only humans, only Homo sapiens have fully modern larynxes.
[01:16:39.080 --> 01:16:43.880] Neanderthal, there's still a little bit of a debate about what kind of vocal range they actually had.
[01:16:43.880 --> 01:16:53.000] It was definitely a lot closer to human vocal cords than any other hominid, but not quite the same structure as modern humans.
[01:16:53.000 --> 01:16:57.080] And then going back, it's mainly about the hyoid bone.
[01:16:57.080 --> 01:16:57.800] Where is it?
[01:16:57.800 --> 01:17:00.600] What's the shape of it and where is it in the throat?
[01:17:00.600 --> 01:17:03.400] Yeah, it's a weird bone in the neck that's kind of hanging off by itself.
[01:17:03.640 --> 01:17:04.280] Yeah, yeah.
[01:17:04.600 --> 01:17:08.200] So the hominids did not have the full vocal range that humans do.
[01:17:08.440 --> 01:17:11.720] That doesn't mean that they didn't have language or they didn't speak.
[01:17:12.040 --> 01:17:14.520] And we don't really know anything about their language.
[01:17:14.640 --> 01:17:23.520] There's maybe a tiny little bit of evidence we can infer from skull casts, you know, seeing, like, oh yeah, their left temporal lobe was developed to a certain degree or not.
[01:17:23.520 --> 01:17:25.920] But again, it's very indirect evidence.
[01:17:26.160 --> 01:17:33.040] But so it's interesting, we just don't really have a good window into human language for most of its development and evolution.
[01:17:33.040 --> 01:17:39.680] So this maybe will shed a tiny bit of light on the early stages of the evolution of language.
[01:17:39.680 --> 01:17:44.800] But I do wonder which other species will eventually figure out have individual names.
[01:17:44.800 --> 01:17:45.440] Oh, totally.
[01:17:45.440 --> 01:17:52.400] And there may be something that's subtle that we aren't picking up on, like a subtle sign or a nonverbal one.
[01:17:53.040 --> 01:17:56.640] All right, there's no who's that noisy this week because we're out of sequence.
[01:17:56.640 --> 01:17:59.920] So we're just going to go right to some questions and emails.
[01:17:59.920 --> 01:18:00.160] Bob.
[01:18:00.320 --> 01:18:01.360] Ooh, I didn't know that.
[01:18:01.360 --> 01:18:04.800] This is a, yeah, I mentioned that.
[01:18:05.040 --> 01:18:05.440] Yeah.
[01:18:06.480 --> 01:18:12.320] Bob, this is a response to a news item that you did back in August.
[01:18:12.640 --> 01:18:14.400] This is on beetles.
[01:18:14.400 --> 01:18:34.160] You were talking about the fact that there are more beetle species than any other group of animals, and discussing why that might be, you know, in terms of evolution: they had an opportunity to expand their numbers dramatically.
[01:18:34.160 --> 01:18:41.600] But this is a this email comes from a listener called Matiaz, I think, M-A-T-J-A-Z.
[01:18:41.600 --> 01:18:45.120] He didn't offer a pronunciation in the email, so we're on our own.
[01:18:45.120 --> 01:18:54.560] And they write, Hi, guys, in a recent show, you highlighted that beetles are by far the most diverse insect group and animal group on this taxonomic level.
[01:18:54.560 --> 01:19:02.120] Well, I have a correction that is not really a correction for you, but might be of interest since SGU is about science progress.
[01:18:59.760 --> 01:19:03.320] So I thought about writing in.
[01:19:03.480 --> 01:19:09.080] Anyway, beetles indeed currently are the most diverse, but will not stay so much longer.
[01:19:09.080 --> 01:19:17.640] Biologists are lately realizing, in large part thanks to molecular methods, yay science, that there is a big bias in what we know.
[01:19:17.640 --> 01:19:19.160] You can guess it, perhaps.
[01:19:19.160 --> 01:19:22.120] If it's small, it's less known.
[01:19:22.120 --> 01:19:35.160] There are two groups that biologists agree will easily surpass beetles: wasps and kin, Hymenoptera, and flies, Diptera.
[01:19:35.160 --> 01:19:43.800] Both have a crazy number of tiny species that nobody has studied and that are super tough to treat taxonomically based on morphology.
[01:19:44.040 --> 01:19:49.480] These are called dark taxa, and there's a lot of research attention on these recently.
[01:19:49.480 --> 01:19:50.040] Yay science.
[01:19:51.080 --> 01:19:54.920] So as is the consensus today, flies are the largest insect order, followed by wasps.
[01:19:54.920 --> 01:20:00.920] Beetles are third, although species descriptions are slow, and so formally it will take a while.
[01:20:00.920 --> 01:20:03.800] There is a similar thing in arachnids with mites.
[01:20:03.800 --> 01:20:07.880] Mites are an order of mostly sub-millimeter animals.
[01:20:07.880 --> 01:20:12.840] The true diversity is likely tens to a hundred times higher than what has been described.
[01:20:12.840 --> 01:20:29.400] So yeah, so he's saying, yeah, in terms of named species, it is beetles, but biologists suspect that Hymenoptera, you know, the wasps and flies probably are more diverse once we fully characterize all of the small members of those groups.
[01:20:29.480 --> 01:20:30.760] Yeah, if we ever.
[01:20:30.920 --> 01:20:32.600] I wonder what their upper limit was.
[01:20:32.760 --> 01:20:37.480] If memory serves, for beetles, it was 400,000 different species.
[01:20:37.800 --> 01:20:46.880] But my research shows that they have estimates of the actual number for beetles going potentially as high as 2.1 million.
[01:20:47.360 --> 01:20:54.480] So I wonder if the upper potential limit for these other creatures is equally beyond that.
[01:20:54.880 --> 01:20:57.760] So, point taken, I wasn't aware of that.
[01:20:57.760 --> 01:21:05.520] That's interesting, that they may, and probably even do, exceed beetles, but they're just so tiny that we're just too lazy.
[01:21:05.920 --> 01:21:07.200] Well, no, it's just hard.
[01:21:07.920 --> 01:21:16.960] It's hard to find them, and then it's hard to categorize them because they're small, so it's hard to characterize their morphology.
[01:21:16.960 --> 01:21:23.760] So, there may be lots of instances where we are confusing multiple species for one, but it's really multiple species.
[01:21:23.760 --> 01:21:29.040] And if we knew that, it would dramatically increase the number of known species.
[01:21:29.040 --> 01:21:29.600] So, we'll see.
[01:21:30.160 --> 01:21:40.880] Let the races begin, I guess, to see over time which order of insects has its described diversity increase the most.
[01:21:40.880 --> 01:21:47.360] Right now, they're just speculating that it might be Hymenoptera and flies, but they don't really know until they look.
[01:21:47.680 --> 01:21:48.960] And you're correct.
[01:21:48.960 --> 01:21:53.040] While those orders increase, maybe we'll find lots more beetles, too.
[01:21:53.040 --> 01:21:53.360] Yeah.
[01:21:53.360 --> 01:21:56.160] And the mites one is interesting because mites are teeny tiny.
[01:21:56.160 --> 01:22:03.360] The other thing is, the smaller you are, the relatively bigger your ecosystem is, right?
[01:22:03.360 --> 01:22:06.240] Like, the world is a lot bigger place for you if you're teeny tiny.
[01:22:07.040 --> 01:22:09.280] That's why there's more bacteria than anything else.
[01:22:09.280 --> 01:22:20.560] So, it kind of makes sense, you know, relatively speaking, that there would be more, not only more individuals, but more species, more genera, et cetera, of teeny tiny groups like that.
[01:22:20.560 --> 01:22:21.440] Interesting.
[01:22:21.440 --> 01:22:25.840] All right, we're going to do a quick name that logical fallacy, and then we'll go on to science or fiction.
[01:22:26.080 --> 01:22:33.000] This one is based on an email that came to us from Mario, and Mario said, Love the show, listened since the beginning.
[01:22:33.320 --> 01:22:37.720] What do you call it when a claim is technically true but deceptive?
[01:22:37.720 --> 01:22:41.640] Example: Someone claims, "I have never lost a game of chess."
[01:22:41.640 --> 01:22:48.840] That gives the impression they are a great chess player, but it's simply because they have never played a single game of chess in their life.
[01:22:48.840 --> 01:22:54.680] Is that a logical fallacy or simply a lie of omission? Mario.
[01:22:55.000 --> 01:22:55.560] Did you guys?
[01:22:55.560 --> 01:22:58.440] I have to ask, did you guys ever watch That '70s Show?
[01:22:58.440 --> 01:22:59.080] Yeah, sure.
[01:22:59.080 --> 01:22:59.800] Oh, yeah.
[01:22:59.800 --> 01:23:02.280] Do you remember the comedian? Love him.
[01:23:03.000 --> 01:23:05.000] Miss him to this day: Mitch Hedberg.
[01:23:05.000 --> 01:23:05.480] Oh, yeah.
[01:23:05.480 --> 01:23:05.960] Oh, my God.
[01:23:05.960 --> 01:23:07.000] He's amazing.
[01:23:07.000 --> 01:23:18.680] So he was in a couple episodes of That '70s Show, if you don't remember; he worked the counter at the restaurant, and there was an episode where one of them goes up to him and asks for, like, extra fries.
[01:23:18.680 --> 01:23:22.360] And he goes, I did not lose a leg in Vietnam just so you could have extra fries.
[01:23:22.360 --> 01:23:23.480] And they're like, What are you talking about?
[01:23:23.480 --> 01:23:24.520] You weren't in Vietnam.
[01:23:24.520 --> 01:23:26.440] He goes, That's what I said.
[01:23:26.440 --> 01:23:28.040] I did not lose a leg.
[01:23:30.440 --> 01:23:33.400] I don't know why that reminds me of this even.
[01:23:34.680 --> 01:23:46.760] Yeah, I mean, there's a lot of actual comedy based upon this, where you make a statement that implies something that isn't true, and then you subvert expectations by flipping it on its head, right?
[01:23:47.640 --> 01:23:48.280] It's the doctor joke.
[01:23:48.280 --> 01:23:50.760] Doctor, will I be able to play the violin again?
[01:23:50.760 --> 01:23:51.720] Okay, well, great.
[01:23:51.720 --> 01:23:53.000] I never did before.
[01:23:53.640 --> 01:23:54.120] Right.
[01:23:54.120 --> 01:23:57.320] Or, yeah, although you shouldn't say "again," because that implies he could play before.
[01:23:57.400 --> 01:23:58.280] Yeah, well, okay.
[01:23:58.280 --> 01:24:00.920] Like, yeah, yeah, will I be able to play the piano?
[01:24:00.920 --> 01:24:01.640] Sure, you will.
[01:24:01.640 --> 01:24:01.880] Great.
[01:24:01.880 --> 01:24:02.440] I've never been there.
[01:24:02.440 --> 01:24:03.880] Yeah, I never played before.
[01:24:03.880 --> 01:24:06.920] So, yeah, that's a common type of comedy.
[01:24:06.920 --> 01:24:09.320] But anyway, is this a logical fallacy?
[01:24:09.320 --> 01:24:10.840] And the answer is clearly no.
[01:24:10.840 --> 01:24:15.200] It's not a logical fallacy because the problem is not with the logic.
[01:24:15.200 --> 01:24:15.680] Right?
[01:24:14.520 --> 01:24:20.160] So an argument or a statement has three components.
[01:24:20.320 --> 01:24:27.680] You have one or more premises connected through some kind of logic to a conclusion.
[01:24:27.680 --> 01:24:35.600] And if the conclusion does not logically flow from the premises, that's a logical fallacy.
[01:24:35.600 --> 01:24:40.640] That's invalid logic, leading to an unsound conclusion.
[01:24:40.640 --> 01:24:47.280] But the other component of the argument that can be wrong is the premises, right?
[01:24:47.280 --> 01:24:59.120] If you have a problem with your premises, it also can be an unsound argument or misleading argument or whatever, even if there's no logical fallacy involved.
[01:24:59.120 --> 01:25:00.320] Does that make sense?
[01:25:00.320 --> 01:25:03.600] So in this case, we have an incomplete premise.
[01:25:03.840 --> 01:25:07.200] We're not being given all the pieces of information, right?
[01:25:07.200 --> 01:25:10.320] You're given only a partial piece of information.
[01:25:10.320 --> 01:25:14.240] And then it's not really even, it's an implied claim.
[01:25:14.240 --> 01:25:17.280] It's not even really, like he's implying he's great at chess.
[01:25:17.280 --> 01:25:19.280] He's not really saying it outright.
[01:25:19.280 --> 01:25:30.160] So, right, but I think this is a lie of omission, meaning something was omitted from the factual basis of the implied claim.
[01:25:30.160 --> 01:25:35.520] So don't forget to examine the premises of an argument as well, even if the logic is correct.
[01:25:35.520 --> 01:25:49.600] Sometimes a premise can be wrong, or a premise can be misleading, or it could be incomplete, like in this case where you're just missing information, or it could be controversial, or it could be subjective.
[01:25:49.560 --> 01:25:54.720] Like it's not an objective, it's a subjective assessment, it's not an objective fact.
[01:25:54.720 --> 01:25:58.480] Like you're taking as a given something that is not given, right?
[01:25:58.480 --> 01:26:01.240] But isn't that what an unstated major premise is?
[01:26:01.240 --> 01:26:02.200] Yeah, that's an unstated premise.
[01:26:02.280 --> 01:26:03.240] Isn't that a logical fallacy?
[01:26:03.400 --> 01:26:05.880] No, it's not really a logical fallacy.
[01:26:05.880 --> 01:26:07.240] It's a premise problem.
[01:26:07.240 --> 01:26:09.960] It's not technically a fallacy.
[01:26:09.960 --> 01:26:10.520] Oh, really?
[01:26:10.520 --> 01:26:13.000] I always thought that it was included as one.
[01:26:13.240 --> 01:26:20.600] We include it in the list of informal logical fallacies, again, because they're informal, because it's basically a problem with your argument.
[01:26:20.600 --> 01:26:28.600] You know, people use "logical fallacy" loosely to mean there's some problem with your argument, but it's technically not a logical fallacy because there's no logic involved.
[01:26:28.600 --> 01:26:35.400] It's just that you have not explained all of your premises or specifically identified all of your premises.
[01:26:35.400 --> 01:26:37.960] You're using a premise you're not stating.
[01:26:38.280 --> 01:26:38.760] Right.
[01:26:38.760 --> 01:26:49.320] And like, as I'm kind of digging into it, I'm seeing language used like an unstated premise is a premise that a deductive argument requires, but it's not explicitly stated.
[01:26:49.560 --> 01:26:54.280] And in this case, you would deduce that he plays chess.
[01:26:54.280 --> 01:26:57.400] Like it's an intentional twisting of that.
[01:26:57.400 --> 01:26:57.800] Yes.
[01:26:58.040 --> 01:27:00.200] It's more than an illogical deduction.
[01:27:00.680 --> 01:27:02.760] It's an implied unstated major premise.
[01:27:02.760 --> 01:27:03.080] Yes.
[01:27:03.480 --> 01:27:04.680] Yeah, I agree.
[01:27:04.680 --> 01:27:12.760] But the reason why that's a bad argument is because it doesn't give the other person the opportunity to challenge the premise.
[01:27:12.760 --> 01:27:16.040] You're trying to assume the premise by not even stating it.
[01:27:16.040 --> 01:27:18.440] Therefore, like it's not even up for discussion.
[01:27:18.440 --> 01:27:20.600] It's baked into your argument.
[01:27:20.920 --> 01:27:27.000] When people talk about things having to do with religion and they just blanket assume that God exists and everything's founded on that.
[01:27:27.000 --> 01:27:27.320] Right.
[01:27:27.560 --> 01:27:28.120] Yeah.
[01:27:28.120 --> 01:27:33.160] Like without saying, let's assume God exists, they assume God exists and they proceed from there.
[01:27:33.160 --> 01:27:35.080] It's an unstated major premise.
[01:27:35.080 --> 01:27:36.040] You always have to be careful.
[01:27:36.040 --> 01:27:39.440] You have to be careful about that because that could be a very stealth problem with an argument.
[01:27:39.440 --> 01:27:41.480] Because, again, it's what's missing.
[01:27:41.480 --> 01:27:44.040] You know, what's missing is sometimes hard to identify.
[01:27:44.040 --> 01:27:46.320] That's why it's a great exercise.
[01:27:46.480 --> 01:27:58.160] Again, you should do this to police your own thinking, not just as a weapon against other people; to arrive at a consensus, you want to examine how sound all the arguments being put forward are.
[01:27:58.160 --> 01:28:00.640] And it's always good to say, okay, what are all the premises?
[01:28:00.640 --> 01:28:02.720] What's the logic being employed?
[01:28:02.720 --> 01:28:05.200] And is the conclusion justified by that?
[01:28:05.200 --> 01:28:15.280] And if you go consciously through that process, it becomes much easier to say, oh, wait a minute, there's an implied premise here you're not explicitly stating.
[01:28:15.280 --> 01:28:22.640] Or our disagreement is that we're making different assumptions about, we have different starting points.
[01:28:22.640 --> 01:28:25.840] You're assuming one thing, I'm assuming something else.
[01:28:26.320 --> 01:28:30.880] We have to talk about that first before we even get to talking about logic.
[01:28:32.560 --> 01:28:35.760] You have to make sure that you can agree on the premises.
[01:28:35.760 --> 01:28:45.200] Unfortunately, that's harder and harder to do these days because people have so much information they can cherry-pick whatever information they want to start as their premises.
[01:28:45.760 --> 01:28:46.880] All right, well, thank you, Mario.
[01:28:46.880 --> 01:28:47.600] That was a good question.
[01:28:47.600 --> 01:28:48.960] I enjoyed talking about that.
[01:28:48.960 --> 01:28:52.400] Guys, let's go on with science or fiction.
[01:28:54.640 --> 01:28:59.600] It's time for science or fiction.
[01:29:04.400 --> 01:29:13.120] Each week I come up with three science news items or facts, two real and one fake, and then I challenge my panel of skeptics to tell me which one is the fake.
[01:29:13.120 --> 01:29:18.240] There's a theme this week, because of course, we're not doing news items because we're out of sequence here.
[01:29:18.240 --> 01:29:22.800] The theme is science literacy.
[01:29:22.800 --> 01:29:28.320] So, these are three statements about what Americans believe.
[01:29:28.320 --> 01:29:31.160] I stuck with Americans just to make them all comparable.
[01:29:29.920 --> 01:29:33.960] So, you'll understand in a second what I'm talking about.
[01:29:34.280 --> 01:29:44.440] Item number one: a 2020 National Science Board survey found that 68% of Americans believe that all radioactivity is man-made.
[01:29:44.440 --> 01:29:46.680] In quotes, that last part.
[01:29:46.680 --> 01:29:57.160] Item number two: in a 2014 NSF survey, 26% of Americans stated that the sun revolves about the earth rather than the other way around.
[01:29:57.160 --> 01:30:12.040] And item number three, a 2015 YouGov online survey found that 41% of people believe dinosaurs and humans lived at the same time, while only 25% answered that they definitely did not.
[01:30:12.040 --> 01:30:13.000] Jay, go first.
[01:30:13.000 --> 01:30:20.600] Okay, yeah, this first one, 2020 National Science Board survey found that 68% of Americans believe that all radioactivity is man-made.
[01:30:21.000 --> 01:30:32.440] That's really interesting because I could see where people would confuse the idea of like, you know, the power plants and what's coming, you know, what's being used in the power plants.
[01:30:32.440 --> 01:30:36.120] And then there's, you know, then there's all the like the leftover material.
[01:30:36.120 --> 01:30:43.560] I could just see there's confusion there, but 68% is an awful lot of people to believe that radioactivity is man-made.
[01:30:43.560 --> 01:30:45.720] Boy, that's a tricky one.
[01:30:45.720 --> 01:30:53.400] Second one, in 2014, the NSF survey, 26% of Americans stated that the sun revolved around the earth.
[01:30:53.400 --> 01:30:55.080] I think that one is science.
[01:30:55.080 --> 01:30:57.000] My God, I think that one is science.
[01:30:57.000 --> 01:31:04.760] Okay, the third one, a 2015 YouGov online survey found that 41% of people believe dinosaurs and humans lived at the same time.
[01:31:04.760 --> 01:31:08.200] While only a quarter of the people answered that they definitely did not.
[01:31:08.200 --> 01:31:09.400] Oh my god, Steve.
[01:31:09.400 --> 01:31:11.480] This is such an interesting one.
[01:31:11.800 --> 01:31:17.040] Ah, 41% of people believe that dinosaurs and humans lived at the same time?
[01:31:17.040 --> 01:31:17.920] No.
[01:31:14.680 --> 01:31:21.760] So it's between the first one and the last one.
[01:31:21.840 --> 01:31:24.400] So either the radioactivity one.
[01:31:24.400 --> 01:31:25.440] Damn, that's a tip.
[01:31:25.440 --> 01:31:26.880] This is really hard, Steve.
[01:31:26.880 --> 01:31:30.400] You're basically asking us how stupid the average person is.
[01:31:30.720 --> 01:31:32.720] How scientifically literate are they?
[01:31:32.720 --> 01:31:37.440] All right, 41% of people think that dinosaurs.
[01:31:37.440 --> 01:31:39.600] And there's two percentages in that last one.
[01:31:39.600 --> 01:31:42.240] There's 41% that believe that they lived at the same time.
[01:31:42.240 --> 01:31:45.360] 25% were the number of people that answered correctly.
[01:31:45.360 --> 01:31:46.400] A quarter?
[01:31:46.400 --> 01:31:46.960] All right.
[01:31:46.960 --> 01:31:47.600] Okay.
[01:31:47.600 --> 01:31:51.360] I have been choosing so badly lately with science or fiction.
[01:31:51.360 --> 01:31:53.120] I know.
[01:31:53.120 --> 01:31:54.560] Bob, help me out, man.
[01:31:54.560 --> 01:31:55.280] No, no, no.
[01:31:55.280 --> 01:31:55.680] You're on the same side.
[01:31:56.640 --> 01:32:00.400] I'm going to say it's the dinosaur one, and I'm going to tell you exactly why.
[01:32:00.400 --> 01:32:07.600] I think more people know that about dinosaurs; I think more than 25% of people know for sure that it definitely didn't happen.
[01:32:07.840 --> 01:32:09.280] That's the indicator for me.
[01:32:09.280 --> 01:32:10.160] Okay, Kara.
[01:32:10.160 --> 01:32:14.800] I mean, when I first read all of them, that one was the obvious, like, what?
[01:32:14.800 --> 01:32:23.200] But then as I'm looking at the language, first of all, it's 2015, and we know that people are increasingly becoming more secular.
[01:32:23.200 --> 01:32:24.880] That's like almost 10 years ago now.
[01:32:24.880 --> 01:32:27.520] Oh, I hate to say that out loud.
[01:32:27.920 --> 01:32:30.640] 2015, you joined the skeptics guide, I believe.
[01:32:30.800 --> 01:32:30.960] Yeah.
[01:32:32.480 --> 01:32:33.120] Yes.
[01:32:33.120 --> 01:32:38.000] So 41% of people believing that dinosaurs and humans lived at the same time.
[01:32:38.000 --> 01:32:40.800] That's like a blanket statement, but you worded it funny here.
[01:32:40.800 --> 01:32:43.920] You said 25% said that they definitely didn't.
[01:32:43.920 --> 01:32:51.040] I'm assuming because a lot of times with these polls, there's like multiple, like maybe, probably, I'm completely sure of it, or whatever.
[01:32:51.040 --> 01:32:58.960] And yeah, I wouldn't be surprised if only 25% of people felt confident in their answer, in their correct answer, sadly.
[01:32:58.960 --> 01:33:00.600] And a lot of people are religious, man.
[01:33:00.600 --> 01:33:02.120] A lot of people are fundamentalists.
[01:33:02.120 --> 01:33:03.960] It's scary.
[01:32:59.920 --> 01:33:07.160] That probably tracks with people believing in evolution, too.
[01:33:07.480 --> 01:33:08.600] So I'm going to go backward.
[01:33:08.600 --> 01:33:16.520] The NSF survey, also exactly 10 years ago now: about a quarter of people saying that the sun revolves about the earth.
[01:33:16.520 --> 01:33:17.480] How British?
[01:33:17.800 --> 01:33:21.000] Around the Earth rather than the other way around.
[01:33:21.320 --> 01:33:23.160] That's so sad.
[01:33:23.160 --> 01:33:25.800] But that doesn't surprise me either.
[01:33:26.760 --> 01:33:27.880] So I guess that leaves the.
[01:33:28.040 --> 01:33:31.880] Maybe they revolve around each other's barycenter, but that's over there.
[01:33:32.200 --> 01:33:33.320] Is that a choice?
[01:33:33.320 --> 01:33:33.640] Probably.
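As an aside on that barycenter quip, a quick back-of-the-envelope check shows why it is technically right but practically negligible. This is a minimal sketch using standard approximate values (the constants and variable names here are our own, not from the episode): the Sun-Earth barycenter sits only a few hundred kilometers from the Sun's center, deep inside the Sun.

```python
# Rough check of where the Sun-Earth barycenter actually sits.
# Standard approximate values; illustrative only.

M_SUN = 1.989e30    # kg
M_EARTH = 5.972e24  # kg
A = 1.496e11        # mean Sun-Earth distance in meters (about 1 AU)
R_SUN = 6.96e8      # solar radius in meters

# Distance of the barycenter from the Sun's center: a * M_earth / (M_sun + M_earth)
r_bary = A * M_EARTH / (M_SUN + M_EARTH)

print(f"Barycenter is about {r_bary / 1000:.0f} km from the Sun's center,")
print(f"or roughly {r_bary / R_SUN:.2%} of the Sun's radius,")
print("so for everyday purposes the Earth simply goes around the Sun.")
```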
[01:33:36.680 --> 01:33:41.400] So that leaves the National Science Board survey, which is much more modern.
[01:33:41.400 --> 01:33:43.480] This was only four years ago.
[01:33:43.720 --> 01:33:51.640] 68% of Americans, so an overwhelming majority, believing that all radioactivity is man-made.
[01:33:51.640 --> 01:33:58.440] I think that my guess would be that that number wouldn't be so high for one of two reasons.
[01:33:58.440 --> 01:33:59.880] I'm hedging my bets here.
[01:33:59.880 --> 01:34:09.080] One is that hopefully more people than that understand the electromagnetic spectrum, but that is, I'm not confident in that assumption.
[01:34:09.080 --> 01:34:14.040] So my other guess would be that it's a coin flip, and this feels too high to be a coin flip.
[01:34:14.040 --> 01:34:16.680] So I'm going to say that that's the fiction.
[01:34:16.680 --> 01:34:17.480] Okay, Bob.
[01:34:17.480 --> 01:34:20.280] So, Kara, you're saying this is number one, the 68%?
[01:34:20.520 --> 01:34:21.880] Yeah, the radioactivity.
[01:34:21.880 --> 01:34:22.280] Yeah.
[01:34:22.600 --> 01:34:25.240] Yeah, that does sound pretty high.
[01:34:25.240 --> 01:34:29.560] And Jay, I like your pick as well about the dinosaurs.
[01:34:29.560 --> 01:34:34.280] The key thing there, though: "definitely did not" is rubbing me the wrong way as well.
[01:34:34.280 --> 01:34:36.280] I mean, definitely did not.
[01:34:36.280 --> 01:34:40.840] Did they do one of those surveys where you had, you know, you had that much granularity?
[01:34:40.840 --> 01:34:41.720] Probably did not.
[01:34:41.720 --> 01:34:42.600] Definitely did not.
[01:34:42.600 --> 01:34:47.200] I mean, I don't know, that doesn't seem as likely as I had thought.
[01:34:47.200 --> 01:34:50.240] And the second one, sun and earth revolve.
[01:34:44.600 --> 01:34:52.080] Yeah, that one sounds legit.
[01:34:52.560 --> 01:34:58.480] Yeah, so I agree it's between the radioactivity and the dinosaurs.
[01:34:58.480 --> 01:35:03.200] Ah, yeah, I'll go with Kara and say the radioactivity and 68%.
[01:35:03.200 --> 01:35:04.240] I'll say that's fiction.
[01:35:04.240 --> 01:35:07.840] But any of these, pretty much any of these, none of them would shock me.
[01:35:07.840 --> 01:35:09.520] Okay, everyone's true.
[01:35:09.520 --> 01:35:12.560] You realize how horrific this science or fiction is, right?
[01:35:13.280 --> 01:35:15.040] Because two of these are correct.
[01:35:15.040 --> 01:35:15.520] Okay?
[01:35:15.520 --> 01:35:16.000] That's right.
[01:35:16.000 --> 01:35:19.040] One of them is fake, so it may even be worse, because it depends.
[01:35:19.040 --> 01:35:26.560] Let's say, for example, 98% of Americans believe that all radioactivity is man-made, and that 68% figure turns out to be the fiction only because it's wrong in the worse direction.
[01:35:26.560 --> 01:35:27.760] This is a nightmare.
[01:35:27.760 --> 01:35:30.160] This science or fiction is horrible.
[01:35:30.160 --> 01:35:31.120] Right, Ev?
[01:35:31.680 --> 01:35:33.120] But I'll cut right to it.
[01:35:33.120 --> 01:35:37.040] And I am going to, sorry, Jay, I am going to agree with Kara and Bob.
[01:35:37.040 --> 01:35:46.000] And my only reason for this is I have seen before polling regarding the Earth and the Sun relationship.
[01:35:46.000 --> 01:35:49.120] I've seen polling before about dinosaurs and human relationships.
[01:35:49.120 --> 01:35:56.160] I've never seen a poll that I can remember off the top of my head about radioactivity, where they're asking people questions about that.
[01:35:58.080 --> 01:35:59.760] I have no recollection of it.
[01:35:59.760 --> 01:36:02.400] So that's why that one's rubbing me the wrong way.
[01:36:02.400 --> 01:36:04.560] Okay, so you all agree on number two, so we'll start there.
[01:36:04.560 --> 01:36:14.320] In the 2014 National Science Foundation survey, 26% of Americans stated that the sun revolves about the earth rather than the other way around.
[01:36:14.320 --> 01:36:19.600] You all think this one is science, and this one is science.
[01:36:19.600 --> 01:36:20.720] This one is science.
[01:36:21.200 --> 01:36:32.360] Yeah, and that figure, I looked around for other surveys, you know, rather than just going on one survey, I found figures anywhere between 18 and 28 percent, some even a little bit worse.
[01:36:32.680 --> 01:36:40.200] You know, 25, 26 percent, basically a quarter, seems to be the most average number there.
[01:36:40.200 --> 01:36:46.040] And that's just like the basic fact that the Earth goes around the sun.
[01:36:46.040 --> 01:36:49.320] Forget about like how long it takes for that to happen.
[01:36:49.320 --> 01:36:51.640] Fewer people knew that, right?
[01:36:53.560 --> 01:36:57.080] I know, it's pretty disappointing.
[01:36:57.080 --> 01:37:08.280] And I get it, these were all in the U.S., but the numbers for these kinds of questions are pretty similar across many countries.
[01:37:08.600 --> 01:37:09.800] We're not an outlier.
[01:37:09.800 --> 01:37:11.240] We are not an outlier.
[01:37:12.360 --> 01:37:20.600] For those surveys that looked at multiple different countries and multiple different questions, what do you think the worst one was of, like, the big countries?
[01:37:20.600 --> 01:37:22.280] Well, which are we talking globally?
[01:37:22.280 --> 01:37:23.080] Are we talking just in Europe?
[01:37:23.320 --> 01:37:23.640] Globally.
[01:37:24.680 --> 01:37:26.680] Just highly religious countries.
[01:37:26.680 --> 01:37:34.440] So the highly religious countries have specific questions where they're far worse than other countries.
[01:37:34.440 --> 01:37:45.720] So India and the United States stick out for any question that is where evolution is, you know, belief in evolution versus creationism is an issue.
[01:37:45.720 --> 01:37:51.720] But just in general, like not creationism-related questions, the worst country is basically China.
[01:37:51.720 --> 01:37:52.200] Oh, interesting.
[01:37:52.440 --> 01:37:58.840] Yeah, they're like way off the average for a lot of other countries.
[01:37:58.840 --> 01:38:01.320] Do you think that's because of income inequality?
[01:38:01.320 --> 01:38:09.080] Because I don't know why there's a stereotype in the West that Chinese schools are very strict, but also very science-based.
[01:38:09.080 --> 01:38:11.240] But that's probably only for rich kids.
[01:38:11.240 --> 01:38:11.880] Yeah.
[01:38:11.880 --> 01:38:14.960] And Russia does very bad too, by the way.
[01:38:14.960 --> 01:38:17.040] Malaysia does very poorly.
[01:38:14.680 --> 01:38:20.240] Japan does very well in most things.
[01:38:20.720 --> 01:38:22.240] But again, there's other ones where, like, really?
[01:38:22.240 --> 01:38:22.720] They got.
[01:38:23.040 --> 01:38:26.400] Anyway, we can go over that in a little bit more detail when I go over the rest of the questions.
[01:38:26.400 --> 01:38:28.640] But it's, yeah, the U.S.
[01:38:28.640 --> 01:38:35.280] wasn't generally an outlier, but whenever there was a question that was clearly evolution-based, we were.
[01:38:35.600 --> 01:38:36.960] So let's go back to number one.
[01:38:36.960 --> 01:38:45.040] A 2020 National Science Board survey found that 68% of Americans believe that all radioactivity is man-made.
[01:38:45.280 --> 01:38:48.400] Bob, Evan, and Kara think this one is the fiction.
[01:38:48.400 --> 01:38:50.000] Jay, you think this one is science?
[01:38:50.000 --> 01:38:54.240] And this one is the fiction.
[01:38:54.400 --> 01:38:55.360] I flipped it.
[01:38:55.360 --> 01:38:58.000] 68% got that question correct.
[01:38:58.640 --> 01:39:03.200] And so then 32% believe that all that's still high.
[01:39:03.600 --> 01:39:06.320] But 32% think that all radioactivity is man-made.
[01:39:06.320 --> 01:39:12.720] If you want to look at other countries on this question, to give you an example, in Canada, 72% got that question right.
[01:39:12.720 --> 01:39:14.560] China, 41%.
[01:39:14.560 --> 01:39:16.640] The EU, 59%.
[01:39:16.640 --> 01:39:18.320] Israel, 76%.
[01:39:18.320 --> 01:39:19.680] Japan, 64%.
[01:39:19.680 --> 01:39:21.040] Malaysia, 20.
[01:39:21.040 --> 01:39:22.880] Russia, 35%.
[01:39:22.880 --> 01:39:25.440] South Korea, 48.
[01:39:25.440 --> 01:39:28.800] So, yeah, we're actually, other than Canada, the U.S.
[01:39:28.800 --> 01:39:31.440] was better than most other countries there.
[01:39:31.440 --> 01:39:33.760] So that one is interesting, too.
[01:39:34.000 --> 01:39:35.200] It's not one we talk about a lot.
[01:39:35.520 --> 01:39:39.600] That misconception is not one that, you know, I've ever really discussed.
[01:39:40.560 --> 01:39:42.080] It's a recollection of having read this.
[01:39:42.240 --> 01:39:43.760] But yeah, you're right, though, Jay.
[01:39:43.760 --> 01:39:50.080] It's like, yeah, you could see how, because we talk about it in the context of nuclear power and nuclear weapons and whatever.
[01:39:50.080 --> 01:39:54.960] And they just may not realize that, yeah, uranium is an ore and it's radioactive, you know.
[01:39:54.960 --> 01:39:56.560] Or the banana is radioactive.
[01:39:56.640 --> 01:39:56.800] Right.
[01:39:58.160 --> 01:39:58.720] I love the radioactive activity.
[01:39:58.880 --> 01:40:01.080] Don't even go there.
[01:39:59.600 --> 01:40:02.520] All right, but let's go to number three.
[01:40:03.080 --> 01:40:12.040] A 2015 YouGov online survey found that 41% of people believe dinosaurs and humans lived at the same time, while only 25% answered that they definitely did not.
[01:40:12.040 --> 01:40:13.640] That, of course, is science.
[01:40:13.640 --> 01:40:17.960] And yeah, I figured, like, if I just had the 41%, you'd go, that's the creationists, right?
[01:40:17.960 --> 01:40:18.680] And it's true.
[01:40:18.920 --> 01:40:22.040] That's 100% what that 41% figure is from.
[01:40:22.040 --> 01:40:24.760] That's why I included the 25%.
[01:40:24.760 --> 01:40:30.200] Only 25% were confident that we didn't live at the same time as dinosaurs.
[01:40:30.280 --> 01:40:33.480] The other 75%, who didn't say definitely not, is way higher than the creationist population in this country.
[01:40:33.800 --> 01:40:40.440] And it was a definitely did, probably did, probably did not, definitely did not, I don't know, right?
[01:40:40.440 --> 01:40:41.880] Those are the five choices.
[01:40:42.360 --> 01:40:47.320] Only 25% were confident in saying we definitely did not live at the same time with dinosaurs.
[01:40:47.320 --> 01:40:49.800] So, yeah, that's pretty bad.
[01:40:49.800 --> 01:40:53.880] So, we still have a lot of work to do in terms of just basic scientific literacy.
[01:40:54.600 --> 01:40:56.360] Other questions that are typical?
[01:40:56.360 --> 01:41:03.480] And I have done this when I've given seminars before, and I'm going to do it when I'm in Dubai for my skeptical conference.
[01:41:03.880 --> 01:41:06.840] There, I have my scientific literacy,
[01:41:06.840 --> 01:41:12.200] critical thinking literacy, and media literacy sort of questionnaire that I've been giving out.
[01:41:12.200 --> 01:41:18.040] I'll do it sort of as a pre-test, you know, and then we can talk about it at the end of the seminar.
[01:41:18.280 --> 01:41:24.600] And I have some of the same questions in mine: like, well, here's one: electrons are smaller than atoms.
[01:41:24.840 --> 01:41:38.840] That ranged across different nations: in China, the highest figure, it was 60%; in Israel, 46%; in the U.S., less than half of people know that electrons are smaller than atoms.
[01:41:38.840 --> 01:41:39.760] That's crazy, man.
[01:41:39.880 --> 01:41:41.080] Yeah, it's bad.
[01:41:41.320 --> 01:41:42.680] I mean, all right.
[01:41:43.720 --> 01:41:46.960] I mean, the universe began with a huge explosion.
[01:41:46.960 --> 01:41:47.520] What do you think?
[01:41:47.520 --> 01:41:49.360] How many people got that one correct?
[01:41:49.520 --> 01:41:50.800] 55.
[01:41:50.800 --> 01:41:51.760] Yeah, coin flip.
[01:41:51.920 --> 01:41:52.720] 38.
[01:41:53.920 --> 01:41:59.600] But it was higher in most other developed nations, because I think that's a creationism one; they don't believe in the Big Bang.
[01:41:59.920 --> 01:42:00.480] Right.
[01:42:00.480 --> 01:42:00.960] Yeah.
[01:42:00.960 --> 01:42:01.520] Yeah.
[01:42:02.480 --> 01:42:03.200] Here's another one.
[01:42:03.200 --> 01:42:08.800] Again, it's not one that you commonly think of as a source of misconception.
[01:42:08.800 --> 01:42:11.600] The center of the earth is very hot.
[01:42:12.000 --> 01:42:14.080] What percentage of people said that was true?
[01:42:14.080 --> 01:42:14.720] 60.
[01:42:14.720 --> 01:42:15.200] 50?
[01:42:16.160 --> 01:42:16.960] 86.
[01:42:16.960 --> 01:42:19.200] So that was how it was still opposite.
[01:42:19.520 --> 01:42:20.480] I got the investigation.
[01:42:20.720 --> 01:42:24.480] 14% of people didn't know the answer to that question.
[01:42:24.480 --> 01:42:26.400] And that was one of the highest.
[01:42:26.400 --> 01:42:27.920] I mean, it's just across the board, though.
[01:42:27.920 --> 01:42:29.440] 47% in China.
[01:42:29.440 --> 01:42:31.040] Again, that's kind of an outlier.
[01:42:31.200 --> 01:42:32.080] Something's up.
[01:42:32.560 --> 01:42:33.120] Wow.
[01:42:33.440 --> 01:42:38.400] And I'm not surprised by the number, like when the numbers are really high, where somebody says, I just don't know.
[01:42:38.400 --> 01:42:38.800] Yeah.
[01:42:38.800 --> 01:42:40.480] Like, I just never, I don't know.
[01:42:40.480 --> 01:42:45.920] But when somebody confidently has the wrong answer, those are pernicious beliefs.
[01:42:45.920 --> 01:42:55.440] But those do seem to be, not always, but often related to other strong beliefs, like religious beliefs or magical thinking or whatever.
[01:42:55.440 --> 01:42:57.600] All right, Evan, give us a quote.
[01:42:57.600 --> 01:43:06.800] Auditors and journalists and scientists are supposed to be trained in critical thinking, but they are subject to the same sorts of biases that we all have.
[01:43:06.800 --> 01:43:14.960] And the fact that we get some training about this doesn't necessarily immunize us against all the ways in which we can make mistakes.
[01:43:15.280 --> 01:43:28.560] And that was said by Daniel Simons, who's an experimental psychologist, cognitive scientist, and professor in the Department of Psychology and the Beckman Institute for Advanced Science and Technology at the University of Illinois.
[01:43:28.560 --> 01:43:34.440] And we will be meeting Daniel in Las Vegas this coming October because he'll be at CSICon 2024.
[01:43:34.440 --> 01:43:37.240] Yeah, that's probably the weekend this episode's going to air.
[01:43:37.240 --> 01:43:37.960] So now.
[01:43:38.280 --> 01:43:44.600] Hey, we're probably at Vegas right now if you're listening to this right as it gets downloaded.
[01:43:44.600 --> 01:43:45.000] All right.
[01:43:45.000 --> 01:43:51.400] Well, thank you guys for joining me for this out-of-sequence time traveling episode of the SGU.
[01:43:51.400 --> 01:43:51.800] Roger.
[01:43:51.880 --> 01:43:53.800] Hey, it'll be good to be with you.
[01:43:54.600 --> 01:43:59.560] And until next week, this is your Skeptic's Guide to the Universe.
[01:44:02.200 --> 01:44:08.840] Skeptics Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking.
[01:44:08.840 --> 01:44:13.480] For more information, visit us at theskepticsguide.org.
[01:44:13.480 --> 01:44:17.400] Send your questions to info at the skepticsguide.org.
[01:44:17.400 --> 01:44:28.120] And if you would like to support the show and all the work that we do, go to patreon.com/slash skepticsguide and consider becoming a patron and becoming part of the SGU community.
[01:44:28.120 --> 01:44:31.640] Our listeners and supporters are what make SGU possible.
[01:44:39.240 --> 01:44:43.000] At Azure Standard, we believe healthy food shouldn't be complicated.
[01:44:43.000 --> 01:44:51.640] That's why we've spent 30 years delivering clean organic groceries and earth-friendly products to families who care about what goes on their plates and into their lives.
[01:44:51.640 --> 01:45:00.600] From pantry staples to wellness essentials and garden-ready seeds, everything we offer is rooted in real nutrition, transparency, and trust.
[01:45:00.600 --> 01:45:03.560] Join a community that believes in better naturally.
[01:45:03.560 --> 01:45:09.720] Visit AzureStandard.com today and discover a simpler, healthier way to shop for the things that matter most.