Debug Information
Processing Details
- VTT File: skepticast2024-11-09.vtt
- Processing Time: September 09, 2025 at 10:15 AM
- Total Chunks: 3
- Transcript Length: 173,522 characters
- Caption Count: 1,809 captions
Prompts Used
Prompt 1: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 1 of 3 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
[00:00:00.800 --> 00:00:06.880] 10 years from today, Lisa Schneider will trade in her office job to become the leader of a pack of dogs.
[00:00:06.880 --> 00:00:09.600] As the owner of her own dog rescue, that is.
[00:00:09.600 --> 00:00:17.280] A second act made possible by the reskilling courses Lisa's taking now with AARP to help make sure her income lives as long as she does.
[00:00:17.280 --> 00:00:22.240] And she can finally run with the big dogs and the small dogs, who just think they're big dogs.
[00:00:22.240 --> 00:00:25.920] That's why the younger you are, the more you need AARP.
[00:00:25.920 --> 00:00:29.440] Learn more at aarp.org/slash skills.
[00:00:33.280 --> 00:00:36.480] You're listening to the Skeptic's Guide to the Universe.
[00:00:36.480 --> 00:00:39.360] Your escape to reality.
[00:00:40.000 --> 00:00:42.880] Hello, and welcome to the Skeptic's Guide to the Universe.
[00:00:42.880 --> 00:00:47.520] Today is Wednesday, November 6th, 2024, and this is your host, Stephen Novella.
[00:00:47.520 --> 00:00:49.200] Joining me this week are Bob Novella.
[00:00:49.200 --> 00:00:49.760] Hey, everybody.
[00:00:49.760 --> 00:00:50.960] Kara Santa Maria.
[00:00:50.960 --> 00:00:51.440] Howdy.
[00:00:51.440 --> 00:00:52.240] Jay Novella.
[00:00:52.240 --> 00:00:52.880] Hey, guys.
[00:00:52.880 --> 00:00:54.400] And Evan Bernstein.
[00:00:54.400 --> 00:00:55.440] Good evening, everyone.
[00:00:55.440 --> 00:00:58.320] Let's see what's going on in the world today.
[00:00:58.640 --> 00:01:01.360] Oh, I think it's a waxing crescent moon out.
[00:01:02.240 --> 00:01:02.880] It's warm.
[00:01:02.880 --> 00:01:03.120] Thanks.
[00:01:03.280 --> 00:01:03.920] Very warm.
[00:01:03.920 --> 00:01:05.200] It's unseasonably warm out there.
[00:01:05.600 --> 00:01:10.080] Oh, in Connecticut, it's been just beyond 20 degrees above normal temperature.
[00:01:10.080 --> 00:01:11.280] Oh, yeah, and Trump won the election.
[00:01:12.480 --> 00:01:12.800] Oh, yeah.
[00:01:12.960 --> 00:01:13.680] I forgot about that.
[00:01:13.680 --> 00:01:14.480] Oh, Halloween happened.
[00:01:15.280 --> 00:01:16.240] I didn't forget about that.
[00:01:16.240 --> 00:01:17.040] That was pretty awesome.
[00:01:18.000 --> 00:01:18.800] Thanksgiving coming up.
[00:01:19.920 --> 00:01:20.960] Yeah, some things happened.
[00:01:21.360 --> 00:01:24.080] Yeah, it was like a random assortment of miscellaneous stuff.
[00:01:24.080 --> 00:01:26.880] So, listen, you know, obviously, we're recording this the day after the election.
[00:01:26.880 --> 00:01:30.400] Trump, we just found out, you know, whatever, half a day ago that Trump won.
[00:01:30.400 --> 00:01:31.520] We're still processing it.
[00:01:31.520 --> 00:01:33.760] This is sort of a big thing to wrap your head around.
[00:01:33.760 --> 00:01:34.400] We're not happy.
[00:01:34.400 --> 00:01:36.000] We're not going to pretend we're happy.
[00:01:36.000 --> 00:01:38.240] There's a great deal to be concerned about.
[00:01:38.240 --> 00:01:39.920] He thinks global warming is a hoax.
[00:01:39.920 --> 00:01:43.280] He wants to roll back anything that we're doing about it.
[00:01:43.280 --> 00:01:46.320] He's going to probably put an anti-vaxxer in charge of our health care.
[00:01:46.320 --> 00:01:50.080] I mean, there's a lot to be concerned about, but we're not going to talk about the politics of this.
[00:01:50.080 --> 00:01:56.400] Obviously, there's a lot of scientific and skeptical considerations that we'll be exploring this episode and in the future.
[00:01:56.400 --> 00:01:57.760] But we're going to move on.
[00:01:57.760 --> 00:01:59.040] So, a lot actually happened.
[00:01:59.040 --> 00:02:06.360] It's been two weeks since we've recorded a show, and we went to SciCon last weekend.
[00:01:59.680 --> 00:02:07.160] That was a lot of fun.
[00:02:07.480 --> 00:02:07.960] That was great.
[00:02:08.440 --> 00:02:09.880] Yeah, we had a nice time.
[00:02:09.880 --> 00:02:13.560] Skeptical events are always recharging for me, like getting together.
[00:02:13.560 --> 00:02:15.800] We got to see some of our old friends.
[00:02:15.800 --> 00:02:25.880] I gave a talk on controversies within the skeptical group, things that skeptics disagree with each other about, because I think we should focus on that more at these conferences.
[00:02:25.880 --> 00:02:31.800] We should be working out with each other, you know, in a positive, constructive way, the things that we disagree about.
[00:02:31.800 --> 00:02:36.760] Because while you still learn some stuff, it gets kind of boring talking about the stuff we all agree on over and over again.
[00:02:36.760 --> 00:02:37.400] You know what I mean?
[00:02:37.640 --> 00:02:39.800] And then there's this like big elephant in the room.
[00:02:39.800 --> 00:02:40.280] Right, right.
[00:02:40.280 --> 00:02:42.600] And everyone's like, don't talk about the thing we can't talk about.
[00:02:42.600 --> 00:02:44.680] No, I was like, I just talked about it.
[00:02:44.680 --> 00:02:45.080] Oh, gosh.
[00:02:45.960 --> 00:02:47.240] That's when you should talk about it.
[00:02:47.400 --> 00:02:47.640] Exactly.
[00:02:48.040 --> 00:02:49.080] That was an amazing talk.
[00:02:49.240 --> 00:02:51.960] The best one I've ever heard, Steve, that you've done.
[00:02:51.960 --> 00:02:53.240] Standing ovation.
[00:02:53.240 --> 00:02:56.920] Well, certainly the few accolades and tears afterwards by people.
[00:02:56.920 --> 00:02:59.240] It was amazing to just experience.
[00:02:59.400 --> 00:03:02.360] Well, people were clearly hungry to address that issue, you know.
[00:03:02.360 --> 00:03:05.320] Yes, avoiding it is not the way to do it.
[00:03:05.800 --> 00:03:09.480] Part of the talk was on the biological sex controversy.
[00:03:09.480 --> 00:03:15.800] If you want to read about it, I actually blogged about it on Neurologica because there was a bit of disagreement, which, again, I think is healthy and good.
[00:03:15.800 --> 00:03:17.400] But you can read about that there.
[00:03:17.400 --> 00:03:19.400] We'll probably be putting out a video at some point.
[00:03:19.400 --> 00:03:25.160] SciCon will be putting out just recordings of the talks at some point.
[00:03:25.160 --> 00:03:29.000] So I do think we need to explore these issues.
[00:03:29.000 --> 00:03:39.880] And I think, you know, we can, if, hey, if there's a if there's a group of people who should be able to disagree politely and constructively and focus on logic and evidence, it's us, right?
[00:03:39.880 --> 00:03:43.160] I mean, if we can't do it, who how can we expect anybody to do it?
[00:03:43.160 --> 00:03:44.520] We have to practice what we preach.
[00:03:44.520 --> 00:03:44.920] You know what I'm saying?
[00:03:45.680 --> 00:03:46.240] I agree.
[00:03:46.240 --> 00:03:46.640] Holy.
[00:03:46.640 --> 00:03:47.600] Yeah, absolutely.
[00:03:47.600 --> 00:03:50.160] I mean, it's look, not gonna lie, it's hard.
[00:03:50.160 --> 00:03:51.200] You know, it's hard.
[00:03:51.200 --> 00:03:52.480] We've been doing this a long time.
[00:03:52.480 --> 00:03:58.880] We've been trying to teach people critical thinking, and sometimes the world doesn't bend the way that you'd like it to.
[00:03:58.880 --> 00:04:00.400] And it's quite all used.
[00:04:00.800 --> 00:04:04.080] And it's good to remind ourselves of a lot of these things as well.
[00:04:04.080 --> 00:04:05.440] Yeah, I mean, that was one of the themes as well.
[00:04:05.440 --> 00:04:06.960] It was an exercise in humility.
[00:04:06.960 --> 00:04:10.480] It's like, look, we disagree pretty much along political lines.
[00:04:10.480 --> 00:04:15.440] As much as we like to pretend we are trans partisan or whatever, we're not.
[00:04:15.440 --> 00:04:20.480] You know, we are just as susceptible to ideological bias as anybody else.
[00:04:20.480 --> 00:04:23.360] We just have to be more aware of it and confront it.
[00:04:23.360 --> 00:04:24.000] You know what I mean?
[00:04:24.000 --> 00:04:32.800] That's the whole metacognition angle, not ignore it or pretend like the other side is the only side that's ideologically biased and your side isn't.
[00:04:33.440 --> 00:04:35.360] But motivated reasoning is so powerful.
[00:04:35.360 --> 00:04:42.080] I think that's one of the hardest things, even for seasoned, knowledgeable skeptics to deal with.
[00:04:42.080 --> 00:04:43.120] Because we're good at it.
[00:04:43.120 --> 00:04:56.560] We're good at motivated reasoning, ironically, because people who are able to argue in a sophisticated way and put forth a nuanced argument and have a lot of knowledge at their fingertips are actually better at motivated reasoning.
[00:04:56.560 --> 00:04:58.800] And they're more confident in it as well.
[00:04:59.200 --> 00:05:01.760] So that we have to be very aware.
[00:05:01.760 --> 00:05:04.400] That's like our big Achilles heel in the movement.
[00:05:04.400 --> 00:05:05.600] We have to be very aware of it.
[00:05:05.600 --> 00:05:06.720] Yeah, so that was fun.
[00:05:06.720 --> 00:05:09.280] I think also kind of understanding.
[00:05:09.280 --> 00:05:13.360] Oftentimes we talk about causation and the directionality of causation.
[00:05:13.360 --> 00:05:16.160] It's such an important part of science literacy.
[00:05:16.160 --> 00:05:26.880] And I think we have to remember it when it comes to our own moral standings and our own decision-making around things like policy.
[00:05:27.360 --> 00:05:39.240] Because very often, we may think that something is a decision or an ideology that we came to via evidence when actually we came to it because, well, I agree with them about this.
[00:05:39.240 --> 00:05:41.480] Why wouldn't I agree with them about that?
[00:05:42.200 --> 00:05:51.640] You know, is my position based on my investigation, or am I open to somebody telling me what my position should be?
[00:05:51.640 --> 00:05:54.120] Because, yeah, we're on the same team.
[00:05:54.760 --> 00:06:11.880] That was one of the controversies I've referred to: the idea, the question of can all even ethical and moral questions be resolved with science, or does it require philosophy and morality and ethics, you know, separate from science?
[00:06:12.120 --> 00:06:17.080] I tend to align myself with the philosophers in the movement on this issue.
[00:06:17.080 --> 00:06:23.240] I've engaged with that discussion, and I think the philosophers absolutely crush it.
[00:06:23.240 --> 00:06:27.480] You know, when they come up against the other people, like, we don't need to do philosophy, philosophy is irrelevant to science.
[00:06:27.560 --> 00:06:29.480] Like, no, you're doing it, whether you know it or not.
[00:06:29.480 --> 00:06:30.200] You are doing it.
[00:06:30.200 --> 00:06:31.960] You're just doing it wrong.
[00:06:33.560 --> 00:06:36.600] And then you guys went home, and I went to Dubai.
[00:06:36.600 --> 00:06:39.560] It was the first time in an Arabic country.
[00:06:39.560 --> 00:06:42.360] It was very, you know, it was a good experience.
[00:06:42.360 --> 00:06:44.680] So, Dubai is a very interesting city.
[00:06:44.680 --> 00:06:46.440] It's very wealthy.
[00:06:46.440 --> 00:06:48.440] It's an international city.
[00:06:48.440 --> 00:06:50.120] There was English everywhere.
[00:06:50.120 --> 00:06:55.000] The only one touristy thing I did was entirely in English.
[00:06:55.240 --> 00:06:59.080] But there's this layer of Arabic Muslim culture there as well.
[00:06:59.080 --> 00:07:02.760] So it's kind of like, you're in an alternate universe in a sci-fi movie.
[00:07:02.760 --> 00:07:03.240] You know what I mean?
[00:07:03.240 --> 00:07:04.760] Where everything's the same but different?
[00:07:04.760 --> 00:07:05.080] Yeah.
[00:07:05.640 --> 00:07:11.800] So I gave what turned out to be a nine-hour seminar on scientific skepticism.
[00:07:11.800 --> 00:07:12.760] It's not a lot of time.
[00:07:12.760 --> 00:07:14.520] It went by super fast.
[00:07:14.520 --> 00:07:14.760] Really?
[00:07:15.040 --> 00:07:18.800] I had the perfect audience because they were very smart, very engaged.
[00:07:18.800 --> 00:07:23.680] They knew why they were there and they wanted something out of the conference, but this was all new to them.
[00:07:23.680 --> 00:07:24.800] Oh, yeah, that's exciting.
[00:07:24.800 --> 00:07:25.280] Yeah.
[00:07:25.280 --> 00:07:26.880] So they asked tons of questions.
[00:07:26.880 --> 00:07:29.280] Like every question was anticipating a future slides.
[00:07:29.360 --> 00:07:30.640] Like, that's a good question.
[00:07:30.640 --> 00:07:31.360] We're going to get to that.
[00:07:31.360 --> 00:07:32.240] Or let's talk about that.
[00:07:33.200 --> 00:07:43.680] And you realize when you're doing it, I didn't, I realized when I was writing it, but especially when I'm giving it, what a massive body of knowledge we have amassed under the umbrella of scientific skepticism.
[00:07:43.680 --> 00:07:45.360] There is so much to talk about.
[00:07:45.360 --> 00:07:49.520] Nine hours was nothing, it was scratching the surface.
[00:07:49.840 --> 00:07:54.400] I can't talk nine hours about anything, even if it was counting numbers.
[00:07:54.400 --> 00:07:55.680] I couldn't do it.
[00:07:56.320 --> 00:07:58.480] Well, Bob, I wasn't giving a nine-hour lecture.
[00:07:58.480 --> 00:07:59.360] You have to engage.
[00:07:59.360 --> 00:08:00.480] It was asking questions.
[00:08:00.480 --> 00:08:01.760] They were asking me questions.
[00:08:01.760 --> 00:08:02.880] We were doing demonstrations.
[00:08:02.880 --> 00:08:04.640] I did all the psychological stuff that we do.
[00:08:04.640 --> 00:08:05.280] You know what I mean?
[00:08:05.280 --> 00:08:06.880] It was dynamic.
[00:08:07.120 --> 00:08:07.920] The Socratic method.
[00:08:08.000 --> 00:08:09.280] It wasn't me blabbering for nine hours.
[00:08:09.600 --> 00:08:11.120] Assuming you had breaks, too.
[00:08:11.120 --> 00:08:12.320] And there were breaks in there, yeah.
[00:08:13.200 --> 00:08:16.640] There's three, three-hour sessions with a break in the middle of each session.
[00:08:16.640 --> 00:08:21.600] So I did what I usually do at these longer, longer classes that I give.
[00:08:21.920 --> 00:08:35.840] I gave a quiz that I've come up with, which is a quiz focusing on scientific literacy, critical thinking skills, media savvy, and just like belief in the paranormal, you know, kind of stuff.
[00:08:35.840 --> 00:08:37.360] 30 questions.
[00:08:37.520 --> 00:08:45.760] They were able to do the questions on their phones, and then I was able to get, you know, they did it all in Google Forms, so you get like a chart, pie chart of all the answers.
[00:08:46.000 --> 00:08:46.720] Yeah, everybody, yeah.
[00:08:46.720 --> 00:08:50.480] Yeah, so then we went over them at the end of the seminar, and it was a lot of fun.
[00:08:50.480 --> 00:08:55.120] But what was really interesting is that they basically were average.
[00:08:56.640 --> 00:09:02.280] Their results were aligned with a lot of published surveys about these questions.
[00:09:02.440 --> 00:09:07.320] Like one of the questions was: did humans and non-avian dinosaurs live at the same time?
[00:09:07.320 --> 00:09:09.640] 60% of them got that question wrong.
[00:09:10.440 --> 00:09:14.280] Oh, yeah, so we're all science nerds, right?
[00:09:14.280 --> 00:09:26.840] So, like, to us, these questions are blazingly obvious, but to even well-educated, these are all like CEOs, very power, you know, very high-end, high-achieving business people, but they're not science nerds.
[00:09:26.840 --> 00:09:27.720] So very few of them were.
[00:09:27.720 --> 00:09:29.400] There's a couple engineers thrown in there.
[00:09:29.720 --> 00:09:33.560] And you just realized how compartmentalized knowledge can be, you know?
[00:09:33.560 --> 00:09:34.040] Yeah.
[00:09:34.040 --> 00:09:37.080] Part of the whole theme of the talk was to be humbling.
[00:09:37.400 --> 00:09:43.480] It's like, yeah, even you, you know, the whole idea, like even though you might be very successful in one area, you don't know everything.
[00:09:43.480 --> 00:09:44.840] No one's an expert in everything.
[00:09:44.840 --> 00:09:48.520] And that was part of the demonstration of that fact.
[00:09:49.560 --> 00:09:50.920] You still have a brain.
[00:09:50.920 --> 00:09:51.960] You still have a human brain.
[00:09:51.960 --> 00:09:52.840] What's that, Bob?
[00:09:52.840 --> 00:09:53.720] What was the name of the talk?
[00:09:53.720 --> 00:09:55.080] You don't know Squat?
[00:09:55.080 --> 00:09:59.800] No, it was just, well, the whole theme of the conference was futurism, actually.
[00:09:59.800 --> 00:10:04.440] So it was like science, critical thinking, and the future, or something is what I called it.
[00:10:04.440 --> 00:10:07.160] But I did a little futurism at the end as well, like from our book.
[00:10:07.160 --> 00:10:07.720] Oh, damn.
[00:10:07.720 --> 00:10:08.040] Yeah.
[00:10:08.040 --> 00:10:08.680] Yeah, it was good.
[00:10:08.680 --> 00:10:09.640] It was very good.
[00:10:09.640 --> 00:10:12.680] Oh, so it wasn't specifically a skeptical conference.
[00:10:12.680 --> 00:10:13.640] No, it wasn't.
[00:10:13.960 --> 00:10:15.880] It was really, and it's month-long.
[00:10:15.880 --> 00:10:19.480] It was a month-long conference where people were coming and going, you know.
[00:10:19.480 --> 00:10:24.600] And it was just a series of lectures and seminars and classes on a bunch of topics.
[00:10:24.600 --> 00:10:31.800] The guy who is the dean of the Dubai Future Foundation, Muhammad Qassem, he's a big fan of the show.
[00:10:31.800 --> 00:10:33.080] He's been listening to us for years.
[00:10:33.080 --> 00:10:46.320] He's the one who invited me there, and he wants to inject skepticism into the United Arab Emirates and the whole region, not just the UAE, not just Dubai, but the whole into the whole culture.
[00:10:44.520 --> 00:10:49.600] We may all get invited there in the future.
[00:10:50.480 --> 00:10:53.440] He made it sound like this is the beginning of our relationship.
[00:10:54.000 --> 00:10:55.360] This is sort of a building kind of thing.
[00:10:55.360 --> 00:10:55.760] I love it.
[00:10:55.920 --> 00:10:56.240] Nice.
[00:10:56.240 --> 00:10:56.720] Nice.
[00:10:57.520 --> 00:11:03.120] That's why it's great to like, because we've obviously been operating within our own culture for 30 years.
[00:11:03.120 --> 00:11:10.000] It's nice to go somewhere else where there isn't an organized skeptical movement.
[00:11:10.160 --> 00:11:11.760] It's just virgin territory.
[00:11:11.760 --> 00:11:12.880] It's like, oh, my God.
[00:11:12.880 --> 00:11:14.080] I could be packed in 20 minutes.
[00:11:14.080 --> 00:11:15.280] It's just insane.
[00:11:15.920 --> 00:11:17.680] It's a 15-hour flight, just warning you.
[00:11:18.080 --> 00:11:20.000] Bob, you have to talk for nine hours about something.
[00:11:20.480 --> 00:11:22.880] But the food, the food was incredible.
[00:11:22.880 --> 00:11:25.680] I'm a fan of Middle Eastern food.
[00:11:25.840 --> 00:11:27.440] Not as if that's a monolithic thing.
[00:11:28.000 --> 00:11:31.040] It's like I had Iraqi food, never had Iraqi food before.
[00:11:31.040 --> 00:11:34.880] It was Middle Eastern food, same spice palate, same kind of vibe, but it was different.
[00:11:34.880 --> 00:11:36.320] It was everything was just cool.
[00:11:36.640 --> 00:11:38.800] Nothing I've ever had before was awesome.
[00:11:38.800 --> 00:11:40.000] It was incredible.
[00:11:40.000 --> 00:11:41.360] Anyway, that was a lot of fun.
[00:11:41.360 --> 00:11:46.000] Yeah, I hope I expect we will be going there sometime in the future.
[00:11:46.000 --> 00:11:47.920] Just please not October.
[00:11:48.240 --> 00:11:49.520] No promises.
[00:11:49.520 --> 00:11:54.080] But it was really interesting how similar it was.
[00:11:54.240 --> 00:11:59.120] I had conversations with people who were from the UAE.
[00:11:59.280 --> 00:12:03.680] I talked to somebody who was heavily involved in the power industry.
[00:12:04.400 --> 00:12:06.320] I could have been talking to somebody here.
[00:12:06.320 --> 00:12:07.760] It was really no different.
[00:12:07.760 --> 00:12:14.240] They had all the same issues, all the same information, as far as that was concerned, pretty much the same attitudes.
[00:12:14.240 --> 00:12:18.160] I was kind of surprised, but it's interesting how similar things were.
[00:12:18.160 --> 00:12:21.360] But I think that's because it's an international city.
[00:12:21.360 --> 00:12:33.160] And also, they are looking to, I think they're looking to elevate the UAE as a modern technological sort of state and Dubai as an international, modern city.
[00:12:33.160 --> 00:12:36.280] So they're very interested in that sort of thing.
[00:12:29.840 --> 00:12:37.160] Hey, George is here.
[00:12:37.320 --> 00:12:38.360] George, how are you doing, man?
[00:12:38.360 --> 00:12:38.840] George.
[00:12:38.840 --> 00:12:39.640] Hi, everybody.
[00:12:40.440 --> 00:12:41.080] What's going on?
[00:12:41.080 --> 00:12:41.800] What's going on?
[00:12:42.200 --> 00:12:45.240] Are you as excited as I am for December to get here?
[00:12:45.240 --> 00:12:46.600] Can we get the holidays happening?
[00:12:46.600 --> 00:12:49.800] Can we get the vibe of the holidays please happening?
[00:12:50.120 --> 00:12:54.280] I think we need to celebrate Pearl Harbor Day with a special event or two.
[00:12:55.720 --> 00:12:56.600] Perfect.
[00:12:56.600 --> 00:13:02.200] So we've got this monster day, December 7th, in Washington, D.C., of all places.
[00:13:02.200 --> 00:13:03.320] Are you ready for it?
[00:13:03.320 --> 00:13:04.360] I hope you're ready for it.
[00:13:04.760 --> 00:13:05.560] I hope you're ready.
[00:13:05.560 --> 00:13:09.480] Steve and I are working on supplementing the SGU swag stuff.
[00:13:09.480 --> 00:13:17.720] So if you're going to be there in particular, the best and pretty much the only time that you're going to get good SGU swag is if you come to a live event because we bring all the premium stuff.
[00:13:17.720 --> 00:13:20.040] George, the question is: are you excited?
[00:13:20.040 --> 00:13:25.000] We rarely do a big double show in one day like this.
[00:13:25.000 --> 00:13:39.240] And what's so great about doing both a private show earlier in the day and then doing the extravaganza at night is we get so wonked out, giddy, funny, silly, almost like skeptic drunk by the end of it.
[00:13:39.400 --> 00:13:40.840] It's a long day.
[00:13:40.840 --> 00:13:42.680] And those are the best shows.
[00:13:42.680 --> 00:13:49.720] So if you're going to go to one show this year, go to the extravaganza, go to the private show that's happening on the 7th because it's going to be this marathon day.
[00:13:49.720 --> 00:13:51.240] It is such a blast.
[00:13:51.240 --> 00:13:56.920] And what's super cool is this is one of our very special holiday-themed extravaganza shows.
[00:13:56.920 --> 00:13:58.840] We've only done this one other time.
[00:13:58.840 --> 00:14:00.200] I think it was in Arizona, right?
[00:14:00.440 --> 00:14:01.440] I think, yes, yes.
[00:14:01.440 --> 00:14:05.160] During that Arizona show, we had the famous Yukon Cornelius incident.
[00:14:05.160 --> 00:14:05.720] Do you remember that?
[00:14:05.880 --> 00:14:06.440] Of course, I do.
[00:14:07.080 --> 00:14:10.440] Oh my gosh, one of the biggest laughs of all time.
[00:14:10.440 --> 00:14:11.480] Ask us about that sometime.
[00:14:11.480 --> 00:14:12.600] If you see us, ask us.
[00:14:12.600 --> 00:14:14.200] We'll tell you in D.C.
[00:14:14.200 --> 00:14:18.080] about the Yukon Cornelius incident that happened in Arizona.
[00:14:18.080 --> 00:14:18.880] It was fantastic.
[00:14:14.920 --> 00:14:20.320] We'll try to recreate that on some level.
[00:14:20.960 --> 00:14:29.120] It's also important to say, guys, that the private show is a private show plus, which means there's an extra hour of content.
[00:14:29.120 --> 00:14:30.880] Basically, we do audience interaction.
[00:14:30.880 --> 00:14:32.160] It's different every time.
[00:14:32.160 --> 00:14:33.040] Yeah, we've done it.
[00:14:33.040 --> 00:14:34.480] We did a scavenger hunt once.
[00:14:34.480 --> 00:14:36.080] We did a bunch of trivia things once.
[00:14:36.080 --> 00:14:37.520] We did like some song stuff once.
[00:14:37.680 --> 00:14:39.600] Every time it's different, every time it's special.
[00:14:39.600 --> 00:14:43.600] So even if you've done this before, come to it again because it's guaranteed to be different.
[00:14:43.600 --> 00:14:45.200] Same thing with the extravaganzas.
[00:14:45.200 --> 00:14:50.320] Every extravaganza, the framework is the same, but there's so much improv and incidental stuff that happens.
[00:14:50.320 --> 00:14:51.840] Every show is different.
[00:14:51.840 --> 00:14:58.160] We literally had people going to two shows in a row when we did this back a couple months ago, and they fully enjoyed both shows.
[00:14:58.880 --> 00:15:04.800] I love doing those shows because we've had weekends where we've done like six shows in three days.
[00:15:04.800 --> 00:15:13.520] Like we've had some pretty serious tours that we've been on, but I really like it when all of us have to really work together to make it all happen.
[00:15:13.520 --> 00:15:17.200] And it's such a feel-good when we do the shows and we finish them.
[00:15:17.200 --> 00:15:19.840] It's like, oh my God, guys, it was so awesome, right?
[00:15:19.840 --> 00:15:21.520] And then we get to see everybody.
[00:15:21.520 --> 00:15:24.800] You know, for most of our lives, we're sitting here talking to microphones in our basements.
[00:15:24.800 --> 00:15:30.240] So it's kind of nice to see people's faces and hear everybody laugh and interact and enjoy themselves.
[00:15:30.240 --> 00:15:34.960] There's a lot of audience participation that happens at the extravaganza as well as the private show.
[00:15:34.960 --> 00:15:36.160] So come on out.
[00:15:36.880 --> 00:15:40.160] Talk about a great gift for someone who likes the show.
[00:15:40.160 --> 00:15:41.600] I mean, could you think of a better presentation?
[00:15:41.680 --> 00:15:45.920] And, George, everyone should know that you are one funny bastard.
[00:15:47.520 --> 00:15:48.240] It's true.
[00:15:48.560 --> 00:15:49.920] I just had my license renewed.
[00:15:50.320 --> 00:15:51.720] Yeah, funny bachelor license renewed.
[00:15:51.760 --> 00:15:52.000] That's great.
[00:15:52.160 --> 00:15:55.520] My funny bastard license renewed, freshened.
[00:15:55.520 --> 00:15:57.440] I got the new test taken for that.
[00:15:57.440 --> 00:15:58.480] So, yeah, all good.
[00:15:58.480 --> 00:15:59.280] No, we're excited.
[00:15:59.280 --> 00:15:59.680] I'm excited.
[00:15:59.680 --> 00:16:03.960] I hope you guys are as excited as I am because that means that you'll be equally as excited.
[00:15:59.840 --> 00:16:06.360] And then the equation's all even and good.
[00:16:06.680 --> 00:16:09.880] Go to the skepticsguide.org.
[00:16:09.880 --> 00:16:13.560] And Ian has put these special buttons on there that you can click.
[00:16:13.560 --> 00:16:17.400] And if you're interested in either show, you just click the button that you like.
[00:16:17.400 --> 00:16:21.800] And also, don't forget, I do think there's some VIP tickets left for the extravaganza.
[00:16:22.440 --> 00:16:27.960] And also, let's not, while we have George on, let's plug one more thing because it's actually a huge deal.
[00:16:27.960 --> 00:16:29.000] Not a con?
[00:16:29.000 --> 00:16:29.400] Yes.
[00:16:29.400 --> 00:16:31.000] We have Nauticon coming up.
[00:16:31.000 --> 00:16:33.880] This is, yes, the weekend of May 15th.
[00:16:33.880 --> 00:16:36.120] It's going to be in White Plains, New York.
[00:16:36.120 --> 00:16:39.800] And let me tell you, we had a goddamn awesome time.
[00:16:39.800 --> 00:16:40.440] It was epic.
[00:16:40.600 --> 00:16:41.080] Fabulous time.
[00:16:41.240 --> 00:16:45.320] It really was the best conference I've ever been to, and it was the best conference I was involved with.
[00:16:45.320 --> 00:16:48.280] And real quick, I must have said this story, but I'll just say it really quick.
[00:16:48.280 --> 00:16:50.840] There was this one guy that came up to us and said, you know, I came here.
[00:16:50.840 --> 00:16:51.800] I didn't have any friends.
[00:16:51.800 --> 00:16:52.600] I didn't know anyone.
[00:16:52.600 --> 00:16:54.040] I was almost not going to come.
[00:16:54.040 --> 00:16:54.920] And I said, what the hell?
[00:16:54.920 --> 00:16:56.760] I got to force myself to do it.
[00:16:56.760 --> 00:17:04.600] And we have an event-wide puzzle that you have to live through the entire event to be able to figure out the entirety of this puzzle.
[00:17:04.600 --> 00:17:11.960] And he like picks up the puzzle sheet and he said, the next thing you know, I'm sitting down with like six people I don't know and we're hanging out like we're friends.
[00:17:11.960 --> 00:17:12.680] And that was it.
[00:17:12.680 --> 00:17:13.880] He was bingo.
[00:17:13.880 --> 00:17:15.320] Yeah, he was hooked in.
[00:17:15.800 --> 00:17:16.920] The magic of Nauticon.
[00:17:17.080 --> 00:17:18.360] But isn't that what we saw, guys?
[00:17:18.360 --> 00:17:26.280] I mean, you know, we've known for a very long time that we've curated an amazing group of listeners and patrons.
[00:17:26.280 --> 00:17:35.720] But this group of people, man, I got to tell you, like, I feel like I have a lot of friendships with the people that come, and there's a lot of love and a lot of affection and a lot of awesome stuff.
[00:17:35.720 --> 00:17:38.040] So, if you're interested to come have a great time.
[00:17:38.040 --> 00:17:40.680] You know, this conference is about socializing.
[00:17:41.000 --> 00:17:45.600] If you're interested, you can go to naughticon.com.
[00:17:45.600 --> 00:17:48.000] Remember, George, that Ian made this weird URL?
[00:17:43.640 --> 00:17:49.200] NaticonCon.com.
[00:17:49.280 --> 00:17:50.320] NatakonCon.
[00:17:44.920 --> 00:17:51.280] Check your con, check your con.
[00:17:51.440 --> 00:17:54.160] Or you go to our homepage and just click on the natacon button.
[00:17:54.160 --> 00:17:55.360] Yeah, there's a natacon button there.
[00:17:55.840 --> 00:17:57.360] Natacon 2.
[00:17:57.360 --> 00:17:58.480] Skeptical.
[00:17:58.480 --> 00:17:59.040] Mystery.
[00:17:59.120 --> 00:18:01.280] And, George, what's happening at Natacon this year?
[00:18:01.280 --> 00:18:02.800] My God, what's happening at Specon?
[00:18:02.960 --> 00:18:03.920] I can't even keep track of it.
[00:18:04.000 --> 00:18:05.920] We have so many fantastic ideas happening.
[00:18:05.920 --> 00:18:08.080] We've got this kind of Beatles theme that's going through.
[00:18:08.080 --> 00:18:12.720] We might have like a special super game show, another game show, more audience participation.
[00:18:12.720 --> 00:18:18.320] We're going to have SG University, where you get to learn stuff in 15-minute chunks from all these experts on stage.
[00:18:18.560 --> 00:18:20.160] It's just overwhelmingly cool.
[00:18:20.160 --> 00:18:24.000] But the most important thing is, like you just said, Jay, you get to hang.
[00:18:24.880 --> 00:18:35.760] We've built it in so that there is this time to just meet and greet and hang and talk about stuff and have that very wonderful beer that Westchester's famous for.
[00:18:36.240 --> 00:18:37.040] It's just the best.
[00:18:37.040 --> 00:18:38.000] It's the best time.
[00:18:38.000 --> 00:18:40.080] Well, George, thank you for popping in.
[00:18:40.080 --> 00:18:40.560] Thanks.
[00:18:40.560 --> 00:18:41.760] I'll pop in anytime.
[00:18:41.760 --> 00:18:42.880] I'm going to go pop out now.
[00:18:42.880 --> 00:18:44.560] And we'll see you on the other side, friend.
[00:18:44.560 --> 00:18:44.960] All righty.
[00:18:44.960 --> 00:18:45.440] Bye, guys.
[00:18:45.440 --> 00:18:45.840] Hey, George.
[00:18:47.040 --> 00:18:51.120] While we were at SciCon, we actually did a number of interviews.
[00:18:51.120 --> 00:18:56.160] We have a really great interview coming up later in this episode with Brian Cox.
[00:18:56.160 --> 00:18:56.640] Oh, yeah.
[00:18:56.640 --> 00:18:57.520] The Brian Cox.
[00:18:57.520 --> 00:18:58.160] Brian Cox.
[00:18:58.160 --> 00:19:08.960] And Brian Wecht, who is our friend and collaborator on Natacon and other things, who's also the Brian Wecht has also been a physicist and a ninja and an actual ninja.
[00:19:09.120 --> 00:19:10.960] Don't even ask because we can't tell you.
[00:19:11.760 --> 00:19:14.880] He was there for the interview and made it extra special.
[00:19:14.880 --> 00:19:16.640] He was a special sauce.
[00:19:16.800 --> 00:19:18.880] So I'll be later in the show.
[00:19:18.880 --> 00:19:21.600] But now we're going to go on with our new items.
[00:19:21.600 --> 00:19:23.840] Bob, you're going to start us off with a quickie.
[00:19:23.840 --> 00:19:24.240] Sure.
[00:19:24.240 --> 00:19:25.200] Thank you, Steve.
[00:19:25.200 --> 00:19:27.280] This is your quickie with Bob.
[00:19:27.280 --> 00:19:39.240] So, the international team of scientists have actually determined the chemical properties of some super heavy elements, muscovium, element 115, and nihonium, element 113.
[00:19:39.800 --> 00:19:44.040] So this makes muscovium the heaviest element ever chemically studied.
[00:19:44.040 --> 00:19:51.400] And you might be thinking, how the hell do they test an element chemically if that element lasts for milliseconds before decaying?
[00:19:51.400 --> 00:19:52.360] Very good question.
[00:19:52.360 --> 00:19:53.400] So that's what I'm going to describe.
[00:19:53.400 --> 00:19:55.480] The lab setup was very slick.
[00:19:55.480 --> 00:20:06.280] Essentially, they create super heavy particles and using gas chromatography, they quickly kind of shuttle those elements through various detectors that are lined with quartz and gold.
[00:20:06.280 --> 00:20:19.240] Now, if the element, say muscovium, if it binds with the quartz or the gold, that is detected and recorded, as well as how long it binds with the gold or the quartz and where that super heavy element ends up.
[00:20:19.240 --> 00:20:25.480] So that binding information is invaluable data as to how these elements behave chemically.
[00:20:25.480 --> 00:20:34.120] Now they found generally, the results said that both super heavy elements, muscovium and nihonium, they had weak interactions with quartz.
[00:20:34.120 --> 00:20:36.840] And this actually lines up with their predictions.
[00:20:36.840 --> 00:20:38.440] So here's where it gets really interesting.
[00:20:38.440 --> 00:20:39.720] I wasn't aware of this.
[00:20:39.720 --> 00:20:51.640] They predicted that relativistic effects would make super heavy elements less reactive than similar elements on the periodic table with fewer protons, say like lead, for various reasons.
[00:20:51.640 --> 00:20:58.200] They think lead and muscovium should behave similarly in how it binds with quartz and gold, for example.
[00:20:58.200 --> 00:20:59.160] And it didn't.
[00:20:59.160 --> 00:21:03.640] The muscovium did not react, did not bind as well, and that was predicted.
[00:21:03.880 --> 00:21:10.040] So it's believed that relativistic effects come into play with super heavy elements because they have more protons.
[00:21:10.040 --> 00:21:11.240] All right, so imagine that.
[00:21:11.720 --> 00:21:15.200] You've got a super heavy element with 115 protons.
[00:21:14.280 --> 00:21:16.880] There's so many protons in the nucleus.
[00:21:17.040 --> 00:21:23.120] Now that increases the electromagnetic forces that are available, that are in play in the nucleus.
[00:21:23.120 --> 00:21:26.640] And that alters the electron, the electrons in the atom.
[00:21:26.640 --> 00:21:39.280] Now, classically, if you look at this classically, you could say that the electrons get closer to the protons because the electromagnetic force is stronger, and that makes them move at speeds closer to the speed of light.
[00:21:39.280 --> 00:21:41.520] And doing that, you're gaining mass.
[00:21:41.520 --> 00:21:43.600] Now, you can look at it quantum mechanically.
[00:21:43.600 --> 00:21:51.920] You can look at it as the electron cloud around the atom becoming distorted because of the stronger electromagnetic forces because of the protons, right?
[00:21:51.920 --> 00:21:52.640] You with me?
[00:21:52.640 --> 00:22:01.520] So either way you look at it, whether you look at it classically or quantum mechanically, the super heavy elements should change how they interact with other elements chemically.
[00:22:01.520 --> 00:22:04.000] And that's exactly what they showed with this experiment.
[00:22:04.400 --> 00:22:12.080] These relativistic effects come into play with these super heavy elements, causing them to behave differently chemically.
[00:22:12.240 --> 00:22:14.080] That's exactly what they showed.
[00:22:14.080 --> 00:22:25.600] So sure, this doesn't mean that we're going to add muscovium or nihonium to the iPhone 20, but understanding the chemical properties of super heavy elements could lead to breakthroughs in material science and other fields.
[00:22:25.600 --> 00:22:33.120] Who knows what kind of interesting breakthroughs could happen with a deeper understanding of relativistic effects influencing super heavy elements.
[00:22:33.120 --> 00:22:36.480] So this has been your Relativistic Super Heavy Quickie with Bob.
[00:22:36.480 --> 00:22:37.600] Back to you, Steve.
[00:22:37.600 --> 00:22:38.880] Thank you, Bob.
[00:22:38.880 --> 00:22:40.720] Jay, give us the first news item.
[00:22:40.720 --> 00:22:43.120] You're going to tell us about the Biogenome Project.
[00:22:43.120 --> 00:22:43.600] What is that?
[00:22:44.240 --> 00:22:45.600] All right, this is awesome, guys.
[00:22:45.600 --> 00:22:49.200] The Earth Biogenome Project, it's also called EBP.
[00:22:49.200 --> 00:22:51.280] This is a really amazing effort.
[00:22:51.280 --> 00:23:00.760] They started it in 2017, and their goal is to sequence the genomes of all 1.67 million named eukaryotic species.
[00:22:59.920 --> 00:23:05.720] There's 8.7 known species, but they're just talking about the ones that are currently named.
[00:23:06.040 --> 00:23:11.000] This encompasses, ready, plants, animals, fungi, and other microbes.
[00:23:11.000 --> 00:23:14.920] So the eukaryotic organisms include, let me give you the bigger list.
[00:23:14.920 --> 00:23:21.320] There's animals, plants, fungi, and protists encompassing all multicellular life and some unicellular species.
[00:23:21.640 --> 00:23:25.480] So basically, Jay, that's everything but bacteria and archaea?
[00:23:25.880 --> 00:23:26.200] Exactly.
[00:23:26.200 --> 00:23:26.680] That's exactly.
[00:23:26.760 --> 00:23:27.400] Is there anything else?
[00:23:27.800 --> 00:23:29.320] It's like algae single-cell?
[00:23:29.640 --> 00:23:30.120] Prokaryotic?
[00:23:30.680 --> 00:23:31.640] It depends on the type of algae.
[00:23:32.120 --> 00:23:33.240] Blue-green algae is not.
[00:23:33.240 --> 00:23:34.760] So there's some of each, okay.
[00:23:34.760 --> 00:23:44.600] Well, guys, they specifically are distinguishing eukaryotes from simpler, you know, the non-nucleated prokaryotic organisms like bacteria.
[00:23:44.600 --> 00:23:47.080] So I think we define that pretty good.
[00:23:47.080 --> 00:23:55.160] So the project scale literally, it's so massive, bigger than earlier efforts that have come before it, like the Human Genome Project.
[00:23:55.160 --> 00:23:57.320] And that was a cool thing that they did.
[00:23:57.320 --> 00:24:02.680] That project, it focused on mapping a single species genome, so the human species, right?
[00:24:02.680 --> 00:24:13.160] The Earth Biogenome Project's ultimate goal is to revolutionize this understanding of all the Earth's biodiversity, which will improve conservation strategies.
[00:24:13.160 --> 00:24:16.680] It'll drive advancement in agriculture and human health.
[00:24:16.680 --> 00:24:18.360] So there's three phases.
[00:24:18.360 --> 00:24:19.320] It's structured.
[00:24:19.320 --> 00:24:21.400] The first phase is in 2026.
[00:24:21.400 --> 00:24:23.720] They want to sequence 10,000 species.
[00:24:23.720 --> 00:24:29.560] This should be covering one representative per eukaryotic family.
[00:24:29.560 --> 00:24:32.920] Then phase two, 2030, is the end date.
[00:24:32.920 --> 00:24:37.800] They want to expand that to 150,000 genomes, which includes one for each genus.
[00:24:37.800 --> 00:24:46.160] And then by phase three, which is 2032, they want to complete the genomes for all 1.67 million named species.
[00:24:46.160 --> 00:24:48.080] It's huge, absolutely.
[00:24:48.080 --> 00:24:49.440] But they have a plan.
[00:24:49.440 --> 00:24:50.720] So let's go into that.
[00:24:44.920 --> 00:24:52.160] They don't have an idea or a concept.
[00:24:52.320 --> 00:24:53.920] They actually have a plan.
[00:24:54.880 --> 00:25:07.440] So far, the EBP, right, remember I told you that's what they call it, they've sequenced 3,000 genomes from 1,060 families, and this covers a lot of work, but that's far from its phase one target.
[00:25:07.440 --> 00:25:11.200] And the initial cost estimates for phase one were over $600 million.
[00:25:11.200 --> 00:25:15.680] But they were able to reduce that to $265 million because why, guys?
[00:25:15.920 --> 00:25:16.560] AI.
[00:25:16.560 --> 00:25:18.160] That's a great idea, but that wasn't it.
[00:25:18.560 --> 00:25:21.440] Because of rapid advancements in sequencing technology.
[00:25:21.440 --> 00:25:26.640] And again, keep in mind, like sequencing technology is constantly being improved on as the years go by.
[00:25:26.640 --> 00:25:32.400] It's been in a constant state of flux, which is awesome because look at that price difference, $600 million to $265 million.
[00:25:32.720 --> 00:25:39.600] So by comparison, the Human Genome Project's first draft of their genome project cost over $100 million.
[00:25:39.600 --> 00:25:43.840] But again, keep in mind, the Human Genome Project works on one species.
[00:25:43.840 --> 00:25:49.440] The EBP is going to work on 1.6 somewhat more, you know, 1.6 million, that's so much more.
[00:25:49.440 --> 00:25:50.480] It's ridiculous.
[00:25:50.480 --> 00:25:55.920] So despite the investments, the EBP's progress needs to happen faster than it currently is.
[00:25:55.920 --> 00:25:58.240] And they're trying to figure out how they're going to do it.
[00:25:58.240 --> 00:26:05.600] So what they want to do is they want the weekly genome production to scale from 20 to 721 by phase two.
[00:26:05.600 --> 00:26:10.080] And they've scoped it out that that'll meet their deadlines.
[00:26:10.080 --> 00:26:15.600] So unlike its predecessors, EBP is a decentralized global project.
[00:26:15.600 --> 00:26:16.160] This is great.
[00:26:16.160 --> 00:26:18.480] It involves affiliates from 28 countries.
[00:26:18.480 --> 00:26:23.600] Two major hubs are BGI in China and the Wellcome Sanger Institute in the UK.
[00:26:23.600 --> 00:26:30.000] So the project needs regions that have massive biodiversity to get funds to keep the project moving forward.
[00:26:29.960 --> 00:26:36.600] So the problem is that a lot of these places that have a huge biodiversity just don't have enough money to make it all happen.
[00:26:36.600 --> 00:26:43.720] Some regions have funding, they have these funding issues, but the people behind the EBP came up with a solution to help them out, and this is a great idea.
[00:26:43.720 --> 00:26:50.200] They developed a portable sequencing lab they call G-Boxes, and I could not find anything on why they call it G-Boxes.
[00:26:50.200 --> 00:26:53.720] So these labs, you know, they can be deployed wherever they need to.
[00:26:54.040 --> 00:27:00.440] Each unit costs approximately $5.5 million to install and operate over a three-year period.
[00:27:00.440 --> 00:27:04.520] And they can sequence between 1,000 and 3,000 genomes annually.
[00:27:04.520 --> 00:27:08.840] And they can just pump these, you know, these G-boxes out where they need to go.
[00:27:08.840 --> 00:27:12.920] And they're relatively, you know, inexpensive, all things considered.
[00:27:12.920 --> 00:27:17.240] So the funding has been ongoing, and it's been a huge challenge for them.
[00:27:17.240 --> 00:27:22.280] You know, right now, guys, as everybody knows, there's geopolitical tensions and they get in the way.
[00:27:22.280 --> 00:27:26.360] And it really matters when countries don't agree and don't have good relations with each other.
[00:27:26.360 --> 00:27:31.080] You know, science basically suffers a lot, and particularly between the U.S.
[00:27:31.080 --> 00:27:31.800] and China.
[00:27:31.800 --> 00:27:33.960] And you'd think, you know, why can't they just talk and fix it?
[00:27:33.960 --> 00:27:35.720] But they can't because we can't even agree.
[00:27:35.720 --> 00:27:38.520] And, you know, each country can't even agree on what's going on.
[00:27:38.520 --> 00:27:43.400] Even partial success here would still be awesome, you know, if they don't actually get all the way.
[00:27:43.400 --> 00:27:46.280] And they're actually describing it as transformative.
[00:27:46.280 --> 00:27:49.560] You know, a high-quality reference genomes could do lots of things.
[00:27:49.560 --> 00:27:52.840] It could accelerate conservation efforts for endangered species.
[00:27:52.840 --> 00:27:59.880] It could identify genes critical for crop resistance, as an example, or it can improve our understanding of ecosystem dynamics.
[00:27:59.880 --> 00:28:01.400] There's a lot to this.
[00:28:01.400 --> 00:28:05.640] It's not just getting the information, but there are applications there that they know of.
[00:28:05.640 --> 00:28:10.920] So, as it stands, EBP right now, it's the most ambitious genome initiative in history.
[00:28:10.920 --> 00:28:18.960] And if it's fully successful, it'll mark a new era in science, which is offering insights into the intricate tapestry of life on Earth.
[00:28:19.040 --> 00:28:31.200] You know, great, it sounds great, but really, they're saying we're going to basically have a backup of the genome of all these named species, and it will be the foundation for lots of future solutions and global challenges that are coming up, right?
[00:28:31.200 --> 00:28:35.200] Because if we want to save species, we can understand what their needs are better.
[00:28:35.200 --> 00:28:39.200] And if we lose the species, we might someday be able to bring it back, which would be great.
[00:28:39.200 --> 00:28:51.360] So, even in the project's early stages, the project does demonstrate, I would consider it to be extraordinary potential and a hugely collaborative workspace for science on a planetary scale.
[00:28:51.360 --> 00:28:58.320] You know, that's the exact environment that science I think is best when there's tons of people involved globally and it's something that's happening all over.
[00:28:58.320 --> 00:29:03.440] Yeah, the cool thing is, like, essentially, this will give us a map of all living things at the genetic level.
[00:29:03.440 --> 00:29:08.560] And we've got to start, we can start asking really interesting questions evolutionarily and otherwise.
[00:29:08.560 --> 00:29:10.720] You know, it's just a massive data set.
[00:29:10.720 --> 00:29:16.960] And imagine we sick AI after it to try to find patterns and stuff, yeah, in that vast data.
[00:29:16.960 --> 00:29:18.000] That's absolutely.
[00:29:18.000 --> 00:29:25.040] I mean, you would think this would be something that would have hopefully already been done, but it's still incredibly laborious.
[00:29:25.040 --> 00:29:28.080] It's very hard with today's technology.
[00:29:28.080 --> 00:29:31.680] But, you know, this type of science actually drives innovation as well.
[00:29:31.680 --> 00:29:44.320] So it's really good when, for example, I just read a cool statistic that like NASA, you know, in 2022, NASA brought tens of billions of dollars of industry by its inventions and everything that all the science that they're doing.
[00:29:44.320 --> 00:29:52.400] Like it's really valuable downstream when they literally hand over the technology that they're creating to companies in the United States.
[00:29:52.400 --> 00:29:55.680] So, bottom line is, I'm 100% behind this project.
[00:29:55.680 --> 00:29:56.480] I think it's wonderful.
[00:29:56.480 --> 00:30:03.960] It's the exact type of science that needs to be funded globally, and I can't wait to hear that bell ring when they finish.
[00:29:59.680 --> 00:30:05.240] Jay, let me ask you one more question.
[00:30:05.960 --> 00:30:07.800] Does the sun have a magnetic field?
[00:30:07.800 --> 00:30:08.360] It's gotta.
[00:30:08.520 --> 00:30:09.320] Of course, it does.
[00:30:09.320 --> 00:30:10.600] Yeah, it does.
[00:30:10.920 --> 00:30:11.880] How strong is it?
[00:30:12.280 --> 00:30:15.240] Steve, it's wicked strong, yeah.
[00:30:15.800 --> 00:30:19.960] It's like a 18/23 in Dungeons and Dragons.
[00:30:21.160 --> 00:30:24.840] It's the strongest force I would imagine in the galaxy, no?
[00:30:27.080 --> 00:30:28.680] It extends past Pluto.
[00:30:29.160 --> 00:30:30.200] It's big, it's big.
[00:30:30.200 --> 00:30:32.840] It's actually very variable.
[00:30:32.840 --> 00:30:38.840] So, the Earth's magnetic field at basically the Earth's surface is one Gauss?
[00:30:38.840 --> 00:30:41.160] It's 0.6 Gauss.
[00:30:41.160 --> 00:30:41.400] Oh.
[00:30:41.400 --> 00:30:41.720] Yeah.
[00:30:41.720 --> 00:30:42.280] Not even one.
[00:30:42.600 --> 00:30:43.000] Okay.
[00:30:44.120 --> 00:30:49.000] The average magnetic field of the Sun at its surface is one Gauss.
[00:30:49.000 --> 00:30:51.880] So just slightly stronger.
[00:30:51.880 --> 00:30:55.160] Average strength around one Gauss, but it's highly variable.
[00:30:55.160 --> 00:30:57.400] With sunspots being the strongest.
[00:30:57.400 --> 00:30:59.720] That's why that's what sunspots are.
[00:31:00.520 --> 00:31:05.800] Sunspots can reach magnetic field strengths of 2,000 to 3,000 Gauss.
[00:31:05.800 --> 00:31:06.760] That's attractive.
[00:31:06.760 --> 00:31:08.520] That's huge, right?
[00:31:08.520 --> 00:31:09.000] Yeah.
[00:31:09.560 --> 00:31:15.160] Yeah, you've got very powerful magnetic effects like reconnection happening that can unleash very powerful.
[00:31:15.240 --> 00:31:17.320] Now, it's a very big magnetic field.
[00:31:17.320 --> 00:31:20.520] It's bigger than Earth's magnetic field, but it still decreases with distance.
[00:31:20.520 --> 00:31:27.240] So at the Earth, how strong do you think the Sun's magnetic field is at the Earth's distance?
[00:31:27.240 --> 00:31:31.400] Probably pretty weak because it's 50 micro Gauss.
[00:31:31.400 --> 00:31:31.800] Yeah.
[00:31:31.800 --> 00:31:32.120] Ooh.
[00:31:32.840 --> 00:31:33.240] Wow.
[00:31:33.720 --> 00:31:35.640] One five hundred thousandth?
[00:31:35.720 --> 00:31:38.360] So yeah, a micro is a millionth.
[00:31:38.520 --> 00:31:39.920] So 50 millionths of a gauss.
[00:31:39.880 --> 00:31:40.480] 50 million.
[00:31:40.760 --> 00:31:42.360] Or five micro Tesla.
[00:31:42.360 --> 00:31:44.120] So I guess the Tesla is 10 Gauss.
[00:31:44.120 --> 00:31:45.520] Well, that's barely detectable, right?
[00:31:45.520 --> 00:31:45.920] Yep.
[00:31:45.920 --> 00:31:47.120] What about at Pluto?
[00:31:47.120 --> 00:31:47.760] Oh, it has to be.
[00:31:48.000 --> 00:31:48.960] Nano Gauss.
[00:31:48.960 --> 00:31:49.520] I mean, really?
[00:31:49.520 --> 00:31:51.920] Yeah, 0.2 nano Tesla.
[00:31:44.920 --> 00:31:52.240] Got it.
[00:31:52.480 --> 00:31:53.440] Oh, Tesla.
[00:31:55.120 --> 00:31:56.800] I got nano right.
[00:31:56.800 --> 00:31:57.040] Yeah.
[00:31:58.240 --> 00:31:59.760] Or so what would that be?
[00:32:00.240 --> 00:32:02.640] Or 2 nano Gauss.
[00:32:02.640 --> 00:32:03.280] Ah, still right.
[00:32:03.280 --> 00:32:03.520] Yeah.
[00:32:03.760 --> 00:32:04.400] Awesome.
[00:32:04.640 --> 00:32:06.320] What does that even do at that point?
[00:32:06.480 --> 00:32:07.520] It's negligible, right?
[00:32:07.520 --> 00:32:09.120] That's basically negligible.
[00:32:09.120 --> 00:32:12.160] I don't even know if we can measure calculating it.
[00:32:12.160 --> 00:32:16.000] But it still keeps out the interstellar medium, the intermission.
[00:32:16.160 --> 00:32:16.800] To some extent.
[00:32:17.520 --> 00:32:17.920] Oh, yeah.
[00:32:18.560 --> 00:32:21.200] It defines the boundary until you get to the helio.
[00:32:22.080 --> 00:32:22.400] Yeah.
[00:32:22.400 --> 00:32:23.040] Yeah.
[00:32:23.040 --> 00:32:26.880] And helio shock and all the helio sheath and all that other stuff.
[00:32:26.880 --> 00:32:40.560] Now the question is: has this always been the case, or has the, in the past, especially the distant past, especially the very early solar system, was the sun's magnetic field stronger?
[00:32:40.880 --> 00:32:42.160] I'd say much stronger.
[00:32:42.160 --> 00:32:46.560] Aren't there a lot of, don't some young stars are very kind of chaotic?
[00:32:46.560 --> 00:32:49.200] Like, were they like Wolf-Rayford stars?
[00:32:49.200 --> 00:32:49.680] Yeah.
[00:32:49.680 --> 00:32:51.680] How could we answer that question, do you think?
[00:32:51.680 --> 00:32:53.840] Well, we're doing a time machine would do.
[00:32:54.080 --> 00:32:54.960] Yeah, that'll do.
[00:32:54.960 --> 00:33:04.080] Is it the amount of elements in the current star versus what we estimate was in the early stars?
[00:33:04.080 --> 00:33:07.520] Or we could measure it from other stars.
[00:33:07.520 --> 00:33:07.920] No.
[00:33:08.560 --> 00:33:09.840] Through telescopes.
[00:33:10.400 --> 00:33:13.760] How do we know what the Earth's magnetic field was in the past?
[00:33:13.760 --> 00:33:15.520] Oh, you look at fossilized metal.
[00:33:15.760 --> 00:33:16.080] Rocks.
[00:33:16.320 --> 00:33:16.720] Yeah.
[00:33:16.720 --> 00:33:17.760] Oh, right.
[00:33:17.920 --> 00:33:18.800] Or the water samples?
[00:33:18.960 --> 00:33:19.360] No, water.
[00:33:19.600 --> 00:33:34.120] You're looking for rocks that have elements that will align themselves to a magnetic field, and you basically dealign them, and however much energy you have to put in to dealign them is how much energy was put in to align them, basically.
[00:33:29.840 --> 00:33:35.000] I think this is how it works.
[00:33:35.240 --> 00:33:39.320] So then we could say that, well, that's how strong the magnetic field was when this thing crystallized, right?
[00:33:39.320 --> 00:33:40.600] When this formed.
[00:33:40.600 --> 00:33:44.520] So how could we get rocks from 4 billion years ago?
[00:33:44.520 --> 00:33:45.320] Oh, asteroids.
[00:33:45.320 --> 00:33:46.520] Asteroids, exactly.
[00:33:46.600 --> 00:33:47.320] Meteors.
[00:33:47.320 --> 00:33:48.520] Asteroids and meteors.
[00:33:48.520 --> 00:33:56.840] So for meteors that hit the Earth, basically those are going to tell us about the magnetic field of the inner solar system.
[00:33:56.840 --> 00:34:01.720] Because most of the meteors that are going to land on the Earth probably formed in the inner solar system.
[00:34:01.720 --> 00:34:03.160] And we have done that.
[00:34:03.160 --> 00:34:18.760] But if we could get an asteroid from the outer solar system, then we could get a measure of the magnetic field, you know, when that asteroid formed, probably 4 billion years ago, whatever, 4.6 billion years ago, in the outer solar system.
[00:34:18.760 --> 00:34:20.600] So this is what was just done.
[00:34:20.920 --> 00:34:23.560] Do you guys remember Ryugu, the Japanese?
[00:34:23.960 --> 00:34:27.400] Oh, yes, it went and landed on an asteroid and took a sample.
[00:34:27.720 --> 00:34:29.800] So the asteroid was Ryugu, right?
[00:34:29.800 --> 00:34:34.120] We talked about this, I think, when they recovered it and they got the grains from.
[00:34:34.440 --> 00:34:34.600] Right.
[00:34:34.760 --> 00:34:37.960] Just like we talked about, Bennu was the NASA project.
[00:34:37.960 --> 00:34:39.800] This was the Japanese one.
[00:34:39.800 --> 00:34:50.680] So they recently, a team analyzed particles from Ryugu for signs of an ancient magnetic field in the outer solar system.
[00:34:51.000 --> 00:34:55.160] And what they found was nothing, right?
[00:34:55.720 --> 00:34:57.880] They found no evidence of a magnetic field.
[00:34:57.880 --> 00:35:12.280] But what they said, what this means is, given the techniques that they used, they can only say that there's an upper limit to how strong the magnetic field could have been, and that's 15 microtesla.
[00:35:12.280 --> 00:35:19.360] So if there was a magnetic field in the outer solar, so past Jupiter, when I say outer solar, I'm talking about past Jupiter.
[00:35:19.360 --> 00:35:30.640] If there was a magnetic field beyond Jupiter, it would have been at, you know, basically where that asteroid formed, it would have been less than 15 micro Tesla, which is not negligible.
[00:35:30.640 --> 00:35:32.880] It's small, but it's not negligible.
[00:35:32.880 --> 00:35:35.280] It's not in the nanotesla range.
[00:35:35.280 --> 00:35:38.000] And what do I mean by not negligible?
[00:35:38.000 --> 00:35:39.280] Something very specific here.
[00:35:39.280 --> 00:35:50.800] Because one of the questions was: what was the impact of the sun's magnetic field in the formation of planets and other objects circling the sun?
[00:35:50.800 --> 00:36:08.320] They analyzed other data, and the researchers concluded that there may have been a other data basically indicated that there was a five micro Tesla magnetic field in the outer solar system, which fits with the, it had to be less than 15 micro Tesla.
[00:36:08.320 --> 00:36:11.360] So this is not completely confirmed.
[00:36:11.360 --> 00:36:19.600] They are hoping to get the same analysis from the material brought back by asteroid Bennu and to see if they can confirm that.
[00:36:19.600 --> 00:36:27.840] But the data that we have so far is kind of lining up to there was probably a weak magnetic field, but not insignificant magnetic field in the outer solar system.
[00:36:27.840 --> 00:36:32.640] So what effect does the magnetic field have on the formation of the solar system?
[00:36:32.640 --> 00:36:45.920] You know, when the solar system condenses out of a cloud of gas, like it becomes basically a spinning disk, the sun forms in the middle, and you have a cloud of ionized gas, like a disk of ionized gas surrounding the sun.
[00:36:45.920 --> 00:36:49.680] It's ionized, so it responds to a magnetic field.
[00:36:49.920 --> 00:37:03.000] The thinking is that in the inner solar system, that magnetic field, the sun's magnetic field, would have caused clumping to occur, which probably formed into the inner rocky planets.
[00:37:03.000 --> 00:37:09.320] And so the question was: could the same process have been happening for the outer planets and even other stuff out there?
[00:37:09.320 --> 00:37:14.200] And this, so this is sort of moving in that direction and saying, yeah, it could have.
[00:37:14.200 --> 00:37:23.080] You know, if there was this five microtesla or whatever field out there, it was probably 50 to 200 microtesla in the inner solar system.
[00:37:23.080 --> 00:37:29.640] If it was like five or whatever microtesla in the outer solar system, that could have contributed to the formation of the gas giants.
[00:37:29.640 --> 00:37:36.200] And basically, everything condensed into like asteroids and comets and stuff past Jupiter.
[00:37:36.200 --> 00:37:45.320] So that's why they're so interested in this question, you know, because this affects their models of early planetary formation, right, in the early solar system.
[00:37:45.320 --> 00:37:45.720] Yeah.
[00:37:45.720 --> 00:38:02.520] So the next step, I think, is going to be doing the same analysis of particles from Bennu to see if that also lines up with the, maybe they can get a more precise number or confirm it that there is, well, there was like a five microtesla magnetic field in the outer solar system at that time.
[00:38:02.520 --> 00:38:03.880] So that's pretty cool.
[00:38:03.880 --> 00:38:04.520] Nice.
[00:38:04.760 --> 00:38:07.640] All right, Kara, we're going to go in a bit of a different direction here.
[00:38:07.640 --> 00:38:13.960] Tell us about the possibility of RFK being in charge of our health.
[00:38:13.960 --> 00:38:14.600] Oh, geez.
[00:38:14.600 --> 00:38:15.800] RFK Jr., I should say.
[00:38:16.520 --> 00:38:25.320] RFK Jr., who Trump has publicly verbalized, he plans to put into a, quote, position of power.
[00:38:25.320 --> 00:38:29.960] Actually, I guess the real quote was to go wild on health.
[00:38:31.160 --> 00:38:36.040] I think that's actually what he said, and possibly, quote, declare war on the FDA.
[00:38:36.040 --> 00:38:46.320] He's a proponent of this sort of newly dubbed health freedom movement, or what some are calling MAHA, Make America Ha Healthy Again.
[00:38:46.640 --> 00:38:57.520] When he suspended his campaign for the presidency in August of this year and backed Donald Trump, he promised to make him give him a big position.
[00:38:57.520 --> 00:39:03.440] He hasn't verbalized or he hasn't specified what that position might be, heading up health policy.
[00:39:03.440 --> 00:39:24.080] You know, some have speculated positions with the FDA or the CDC, although those require approval, like congressional approval, whereas, let's see, the Secretary of Health and Human Services or possibly a czar role, I think would be possibly easier to put him into.
[00:39:24.080 --> 00:39:25.600] Yeah, it wouldn't require approval.
[00:39:25.600 --> 00:39:25.760] Yeah.
[00:39:26.080 --> 00:39:26.720] You just appointed him.
[00:39:26.720 --> 00:39:28.080] You're the health czar and that's it.
[00:39:28.080 --> 00:39:32.000] You're the czar and now you're like his personal advisor in a way.
[00:39:32.000 --> 00:39:32.320] Right.
[00:39:33.680 --> 00:39:38.400] And so a lot of people that are writing about this are showing a tweet.
[00:39:38.400 --> 00:39:41.760] I mean, there's so much data to mine here.
[00:39:41.760 --> 00:39:46.720] There's so much evidence of what will happen that this is not crystal balling.
[00:39:46.720 --> 00:39:49.680] This is just saying, well, this is what the dude said he was going to do.
[00:39:49.680 --> 00:39:52.800] But here is a tweet from RFK Jr., Robert F.
[00:39:52.800 --> 00:40:12.480] Kennedy Jr., who quickly, like, I guess for those of you who don't know who he is, he is, he is an environmental lawyer and actually has a very long track record of sort of working to protect vulnerable people against environmental disaster.
[00:40:12.480 --> 00:40:16.080] But also, he's a hardcore anti-vaccine activist.
[00:40:16.080 --> 00:40:24.200] And I love that according to his Wikipedia page, he's an environmental lawyer, American politician, anti-vaccine activist, and conspiracy theorist.
[00:40:24.200 --> 00:40:25.840] Just right there in the first line.
[00:40:25.840 --> 00:40:31.400] So he founded the Children's Health Defense, which is an anti-vax group.
[00:40:29.840 --> 00:40:36.440] They're one of the biggest peddlers of COVID-19 vaccine misinformation.
[00:40:36.760 --> 00:40:40.840] Obviously, he tried, he attempted to run for president, and he is a Kennedy.
[00:40:40.840 --> 00:40:42.920] So he's the son of Robert F.
[00:40:42.920 --> 00:40:47.000] Kennedy, and he's the nephew of JFK and Ted Kennedy.
[00:40:47.000 --> 00:40:48.680] Back to the tweet here.
[00:40:48.680 --> 00:40:53.720] So this was October 25th, 2024, so only about a week ago, two weeks ago.
[00:40:53.720 --> 00:40:57.080] FDA's war on public health is about to end.
[00:40:57.080 --> 00:41:17.240] This includes its aggressive suppression of psychedelics, peptides, stem cells, raw milk, hyperbaric therapies, chelating compounds, ivermectin, hydroxychloroquine, vitamins, clean foods, sunshine, exercise, nutraceuticals, and anything else that advances human health and can't be patented by pharma.
[00:41:17.480 --> 00:41:19.000] If you laugh at last.
[00:41:23.160 --> 00:41:27.480] If you work for the FDA and are part of this corrupt system, I have two messages for you.
[00:41:27.480 --> 00:41:30.520] One, preserve your records, and two, pack your bags.
[00:41:30.520 --> 00:41:31.240] Oh, boy.
[00:41:31.240 --> 00:41:32.440] Oh, boy.
[00:41:32.440 --> 00:41:32.840] Okay, so.
[00:41:33.160 --> 00:41:38.600] David Gorski describes this as an extinction-level event for science-based federal health policy.
[00:41:39.320 --> 00:41:45.960] Yeah, that's in the headline of a very long piece that he wrote about how dangerous this would be.
[00:41:46.200 --> 00:41:50.440] And he, of course, wrote it the day before the election.
[00:41:50.760 --> 00:41:57.880] So let's kind of talk for a second about some of the things that RFK Jr.
[00:41:58.520 --> 00:42:00.760] wants to see changed.
[00:42:00.760 --> 00:42:03.640] A lot of people are focusing on fluoride.
[00:42:03.640 --> 00:42:07.560] I think we'll come back to fluoride because, Steve, you wrote a piece about fluoride.
[00:42:08.200 --> 00:42:09.320] Because of this, yeah.
[00:42:09.320 --> 00:42:10.680] Because of this, exactly.
[00:42:10.680 --> 00:42:15.000] But there are other things that have been highlighted across the board that are problematic.
[00:42:15.280 --> 00:42:20.640] Obviously, he is one of the biggest anti-vaccine voices out there.
[00:42:20.880 --> 00:42:30.640] The New York Times did a post where they sort of dug deep into whether or not can he actually affect real change here?
[00:42:30.640 --> 00:42:34.400] Like, could Trump ban vaccines, for example?
[00:42:34.720 --> 00:42:42.480] And what the New York Times, what this article at least is saying, is that, you know, the short answer is no, because public health in the U.S.
[00:42:42.480 --> 00:42:47.040] is mostly controlled by the states, not the federal government.
[00:42:47.040 --> 00:42:52.240] And then where there is federal kind of oversight, that's with the FDA.
[00:42:52.240 --> 00:43:04.800] They license the vaccines, and the president can't just remove a product that's lawful and licensed from the market, at least not without trying to do so through legal maneuvering.
[00:43:04.800 --> 00:43:08.640] But the president could put pressure on the FDA.
[00:43:08.640 --> 00:43:16.400] The president could make sure that the judges that are appointed limit the power of federal agencies.
[00:43:16.400 --> 00:43:25.760] We also know that he, before he left office last time, where he changed the classification of a lot of federal agencies.
[00:43:25.760 --> 00:43:27.120] Yeah, I forget the name of the order, but yeah.
[00:43:27.440 --> 00:43:33.440] Yeah, federal employees so that they could be let go and their jobs were not protected.
[00:43:33.440 --> 00:43:35.840] And so he passed this before he left office last time.
[00:43:35.840 --> 00:43:44.400] Like week one, Biden reversed that, but it's very likely that he's just going to dive right back in once he takes office.
[00:43:44.400 --> 00:43:47.920] So yes, there could be a lot of change here.
[00:43:47.920 --> 00:43:54.720] One person is quoted in this article saying, there's a lot of mischief that can be done, but a flat-out ban, no.
[00:43:55.040 --> 00:43:59.600] But they could pull funding if anybody doesn't follow their dictates.
[00:43:59.600 --> 00:44:01.320] That's how the federal government could put a lot of funding.
[00:44:01.720 --> 00:44:04.360] The state controls it, but the federal government could just pull funding.
[00:44:04.680 --> 00:44:05.720] Yes, exactly.
[00:43:59.840 --> 00:44:06.920] Affordable Care Act.
[00:44:07.880 --> 00:44:20.920] Recently, and I think that this was in a final bid to get elected, Trump kind of pulled back a little bit on his rhetoric around the Affordable Care Act and said that he never mentioned ending the program.
[00:44:20.920 --> 00:44:32.360] He never would have thought about such a thing, but there's a lot of, like, he's on tape, you know, saying that he did want to appeal Obamacare or the ACA.
[00:44:32.360 --> 00:44:44.360] As we know, that actually requires an act of Congress, but he could use executive power to undercut the law or to restrict access to the law.
[00:44:44.360 --> 00:44:49.960] And this article really details a lot of the ways that he would be able to do that.
[00:44:49.960 --> 00:45:01.480] So, like, for example, in January of 2021, Biden issued an executive order that strengthened Medicaid and the ACA, and Trump could just undo that as soon as he's in office.
[00:45:01.800 --> 00:45:06.280] And then, if Congress were to repeal the act, what would come next?
[00:45:06.280 --> 00:45:10.680] Well, going back to what you said, Jay, well, there's concepts of a plan.
[00:45:11.160 --> 00:45:12.120] They have no idea.
[00:45:13.080 --> 00:45:14.040] They have no idea.
[00:45:14.040 --> 00:45:15.160] They got nothing.
[00:45:15.160 --> 00:45:15.800] There is nothing.
[00:45:15.800 --> 00:45:17.640] There's only so many levers you could pull there.
[00:45:17.640 --> 00:45:19.880] You know, there's no magic here.
[00:45:20.520 --> 00:45:27.720] And so then we're going to backtrack a little bit to Fluoride, but I did want to point out a friend of the show, Dr.
[00:45:27.720 --> 00:45:38.760] Andrea Love, she has a blog called Immunologic, and she wrote a piece back in September about the Congressional American Health and Nutrition Roundtable.
[00:45:38.760 --> 00:45:46.640] The headline of her article is: The Congressional American Health and Nutrition Roundtable was an egregious display of anti-science disinformation.
[00:45:46.960 --> 00:46:21.280] So, basically, what happened is that on Monday, September 23rd, GOP senator for Wisconsin, Ron Johnson, hosted a public taxpayer dollar-funded event where he put together what he claimed to be a panel of quote experts who will provide a foundational and historical understanding of the changes that have occurred over the last century within public sanitation, agriculture, food processing, and healthcare industries, which impact the current state of national health, end quote.
[00:46:21.280 --> 00:46:27.120] But what really he put together was a panel including, I'm going to list some names: Robert F.
[00:46:27.120 --> 00:46:54.320] Kennedy Jr., Jordan Peterson, Michaela Fuller, Casey and Callie Means, Vanny Hari, Max Lugavir, Courtney Swan, Marty Mackery, Alex Clark, Jason Karp, Brigham Bueller, and even more, scrolling, scrolling, Gillian Michaels, Chris Palmer, and Grace Price, none of whom have any expertise relevant to what this panel is supposed to cover.
[00:46:54.320 --> 00:46:55.840] You're all a bunch of quacks and sharp.
[00:46:56.720 --> 00:46:59.120] That's food-based.
[00:46:59.120 --> 00:47:04.000] Yeah, all of whom are notorious anti-science quacks.
[00:47:04.320 --> 00:47:07.520] You know, they all peddle misinformation.
[00:47:07.520 --> 00:47:15.600] And not only do they peddle misinformation, the vast majority of them benefit financially from misinformation.
[00:47:15.600 --> 00:47:27.680] Many of these individuals have make their money having outlets, whether it's books or blog posts or podcasts or whatever, that are directly funded by supplement industries.
[00:47:27.680 --> 00:47:41.720] So, what they did during this panel is they demonized big pharma, they demonized big food, but then they talked a big game about, quote, big wellness and big organic food without calling them that, right?
[00:47:41.720 --> 00:47:46.840] And this is the sort of playbook that we've seen over and over and over.
[00:47:46.840 --> 00:48:02.680] Why would you trust the kind of mainstream individuals who have, you know, the FDA and the CDC have their boots on their necks, when instead you can trust the wellness industry because they have the real answers.
[00:48:03.080 --> 00:48:10.520] And the reason that they're able to tell you what's real is that the FDA isn't, you know, tying their hands behind their back.
[00:48:10.520 --> 00:48:15.240] And this is really the basis for the Make America Healthy Again movement.
[00:48:15.240 --> 00:48:16.920] Deregulation.
[00:48:17.240 --> 00:48:21.320] It's just about going in and deregulating, deregulating, deregulating.
[00:48:21.320 --> 00:48:36.280] The funny thing is, when we, we are already quite deregulated when it comes to alternative medicine, but when we start to deregulate legitimate medicine, that's when the pseudoscience is no longer compartmentalized.
[00:48:36.280 --> 00:48:40.520] It now bleeds its way into the legitimate medicine game.
[00:48:40.520 --> 00:48:40.840] Yeah.
[00:48:40.840 --> 00:48:53.880] And big pharma, the very quote, industry, and I'm saying that in, you know, air quotes, big pharma, the industry that they are so angry with is going to start peddling pseudoscience.
[00:48:54.120 --> 00:48:54.760] Totally.
[00:48:54.760 --> 00:48:56.440] They're happy to be deregulated.
[00:48:56.440 --> 00:48:57.880] It's like, oh, we could sell crap.
[00:48:57.880 --> 00:49:01.000] We don't have to research and charge up the wazoo for it.
[00:49:01.000 --> 00:49:01.800] Sure.
[00:49:02.120 --> 00:49:03.320] We're all in.
[00:49:03.320 --> 00:49:05.080] This is not an anti-big pharma bill.
[00:49:05.080 --> 00:49:07.880] It's a pro-quackery, pro-snake oil movement.
[00:49:08.120 --> 00:49:09.000] That's what it is.
[00:49:09.000 --> 00:49:09.400] Yep.
[00:49:09.400 --> 00:49:13.240] Deregulation will hurt so many people.
[00:49:13.880 --> 00:49:15.280] People will die.
[00:49:15.280 --> 00:49:15.760] They are dying.
[00:49:14.280 --> 00:49:18.480] I think it's they are dying and they will continue to die.
[00:49:19.280 --> 00:49:27.920] And very often the people that will die are women and children, individuals of color, LGBTQIA individuals, people who are vulnerable.
[00:49:27.920 --> 00:49:29.600] Yeah, and low socioeconomic status.
[00:49:29.600 --> 00:49:30.320] Yep.
[00:49:30.320 --> 00:49:31.360] Absolutely.
[00:49:31.680 --> 00:49:34.880] And so now I guess we should talk a little bit about fluoride.
[00:49:34.880 --> 00:49:38.960] I don't want it to take up the whole thing because we have covered it in the past.
[00:49:38.960 --> 00:49:42.080] But basically, RFK Jr.
[00:49:42.640 --> 00:49:46.480] is touting lines from Dr.
[00:49:46.480 --> 00:49:47.440] Strangelove.
[00:49:47.680 --> 00:49:54.000] Like, and I think, you know, it's the argument is that fluoride, I've got to find his quote because I got it right here.
[00:49:54.000 --> 00:49:54.560] You got it right here.
[00:49:55.760 --> 00:49:56.560] I have it too.
[00:49:56.720 --> 00:49:59.120] He described fluoride as an industrial waste.
[00:49:59.120 --> 00:50:08.320] Industrial waste associated with arthritis, bone fractures, bone cancer, IQ loss, neurodevelopmental disorders, and thyroid disease.
[00:50:08.320 --> 00:50:15.440] And then in an interview just this past Sunday, Trump said that the idea of doing away with fluoridation, quote, sounds okay to me.
[00:50:16.000 --> 00:50:21.600] And so, Steve, you did a really great job of going through each of the things in that list and debunking.
[00:50:21.600 --> 00:50:29.440] You know, we know that although some fluoride is produced through industrial processes, it is not industrial waste.
[00:50:29.440 --> 00:50:30.240] And it doesn't matter.
[00:50:30.240 --> 00:50:32.480] It's like one of those, it's a chemophobia thing.
[00:50:32.480 --> 00:50:37.280] Because at the end of the day, it completely dissociates into fluoride ions.
[00:50:37.280 --> 00:50:37.920] That's it.
[00:50:37.920 --> 00:50:38.960] It's a fluoride ion.
[00:50:38.960 --> 00:50:39.840] It's an element.
[00:50:39.840 --> 00:50:40.480] It doesn't matter.
[00:50:40.480 --> 00:50:43.040] It's like saying this hydrogen atom came from poison.
[00:50:43.040 --> 00:50:43.520] Who cares?
[00:50:43.520 --> 00:50:45.120] It's now a proton.
[00:50:45.120 --> 00:50:45.840] You know what I mean?
[00:50:45.840 --> 00:50:48.080] It doesn't matter where you sourced it from.
[00:50:48.320 --> 00:50:50.480] That's just fear-mongering chemophobia.
[00:50:50.480 --> 00:50:50.960] That's what that is.
[00:50:51.200 --> 00:50:53.680] Of course, but that's totally from the playbook, right?
[00:50:53.920 --> 00:50:55.840] That's from all of the rhetoric.
[00:50:55.840 --> 00:51:01.720] Arthritis, bone fractures, bone cancer, IQ loss, neurodevelopmental disorders, and thyroid disease.
[00:51:02.040 --> 00:51:15.160] You do a great job of going through and saying, okay, this was this one study, or this was when the levels that were looked at were X number of times greater than any acceptable level by the FDA.
[00:51:15.160 --> 00:51:18.120] And time and time again, what we have seen.
[00:51:18.360 --> 00:51:19.480] It's the EPA, actually.
[00:51:20.040 --> 00:51:20.440] Oh, thank you.
[00:51:20.600 --> 00:51:24.920] Yeah, because if fluoride is naturally occurring in water, and the U.S.
[00:51:24.920 --> 00:51:29.480] has patients with high levels of fluoride, the EPA sets limits.
[00:51:29.480 --> 00:51:33.000] If it gets higher than that, we actually reduce the level of fluoride.
[00:51:33.000 --> 00:51:38.760] We only add it up to a very tiny amount that's well below anything that causes any issues.
[00:51:38.760 --> 00:51:39.240] Doses.
[00:51:39.560 --> 00:51:45.400] And by the way, again, this is a decision that's made at the local level.
[00:51:45.800 --> 00:51:52.760] There are states in this country that don't, or I guess I should say cities within states in this country that don't fluoridate the water.
[00:51:52.760 --> 00:51:55.160] 72.3% of the U.S.
[00:51:55.160 --> 00:51:59.960] population has access to fluoridated water, according to the CDC.
[00:51:59.960 --> 00:52:09.480] And the CDC calls fluoridation one of the 10 great public health achievements of the 20th century because fluoridated water helps with oral health.
[00:52:09.480 --> 00:52:20.280] It reduced cavities and tooth decay by 60% in the Grand Rapids experiment in 1945, where the first efforts to fluoridate water began.
[00:52:20.280 --> 00:52:30.040] It was such a resounding win that other jurisdictions decided to do this because the evidence was overwhelming.
[00:52:30.360 --> 00:52:31.800] Yeah, and it saves money.
[00:52:31.800 --> 00:52:35.640] And then they say, well, yeah, with fluoridated toothpaste, you don't really need it these days.
[00:52:35.640 --> 00:52:38.760] But when they take it away, tooth decay goes up.
[00:52:38.760 --> 00:52:39.160] Right.
[00:52:39.160 --> 00:52:40.520] So clearly we do.
[00:52:41.000 --> 00:52:50.720] Because of course, when we democratize something like fluoride in the drinking water, everybody has access to it.
[00:52:50.720 --> 00:52:56.400] As opposed to requiring it to be viewed, I mean, I hate to say this because it shouldn't be a luxury item.
[00:52:56.400 --> 00:52:58.160] It should be a necessity item.
[00:52:58.160 --> 00:53:00.560] But for some people, it is a luxury item.
[00:53:00.560 --> 00:53:04.720] And for some people, they just don't have access to the oral hygiene that they need.
[00:53:04.720 --> 00:53:08.960] Maybe they're using fluoridated toothpaste, but they're not brushing their teeth multiple times a day.
[00:53:08.960 --> 00:53:14.800] Maybe they don't have an opportunity to go to the dentist and use the fluoride that they give to children at the dentist.
[00:53:14.800 --> 00:53:16.320] Well, this is what we're headed for.
[00:53:16.320 --> 00:53:18.160] It's really, really scary.
[00:53:18.160 --> 00:53:25.200] And that's just one of, you know, the ACA, fluoridated water, and vaccines, that's just scratching the surface.
[00:53:25.200 --> 00:53:28.960] There is so much pseudoscience
Prompt 2: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 3: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
Prompt 5: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 2 of 3 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
.960 --> 00:52:09.480] And the CDC calls fluoridation one of the 10 great public health achievements of the 20th century because fluoridated water helps with oral health.
[00:52:09.480 --> 00:52:20.280] It reduced cavities and tooth decay by 60% in the Grand Rapids experiment in 1945, where the first efforts to fluoridate water began.
[00:52:20.280 --> 00:52:30.040] It was such a resounding win that other jurisdictions decided to do this because the evidence was overwhelming.
[00:52:30.360 --> 00:52:31.800] Yeah, and it saves money.
[00:52:31.800 --> 00:52:35.640] And then they say, well, yeah, with fluoridated toothpaste, you don't really need it these days.
[00:52:35.640 --> 00:52:38.760] But when they take it away, tooth decay goes up.
[00:52:38.760 --> 00:52:39.160] Right.
[00:52:39.160 --> 00:52:40.520] So clearly we do.
[00:52:41.000 --> 00:52:50.720] Because of course, when we democratize something like fluoride in the drinking water, everybody has access to it.
[00:52:50.720 --> 00:52:56.400] As opposed to requiring it to be viewed, I mean, I hate to say this because it shouldn't be a luxury item.
[00:52:56.400 --> 00:52:58.160] It should be a necessity item.
[00:52:58.160 --> 00:53:00.560] But for some people, it is a luxury item.
[00:53:00.560 --> 00:53:04.720] And for some people, they just don't have access to the oral hygiene that they need.
[00:53:04.720 --> 00:53:08.960] Maybe they're using fluoridated toothpaste, but they're not brushing their teeth multiple times a day.
[00:53:08.960 --> 00:53:14.800] Maybe they don't have an opportunity to go to the dentist and use the fluoride that they give to children at the dentist.
[00:53:14.800 --> 00:53:16.320] Well, this is what we're headed for.
[00:53:16.320 --> 00:53:18.160] It's really, really scary.
[00:53:18.160 --> 00:53:25.200] And that's just one of, you know, the ACA, fluoridated water, and vaccines, that's just scratching the surface.
[00:53:25.200 --> 00:53:28.960] There is so much pseudoscience that RFK has peddled.
[00:53:28.960 --> 00:53:39.680] And when you, again, look at the things that were covered by the panel, that's sort of like a good early taste of what we could be up against.
[00:53:39.680 --> 00:53:43.840] All of the wellness industry bullshit that they were peddling.
[00:53:43.840 --> 00:53:45.360] It really, really scares me.
[00:53:45.360 --> 00:53:48.080] Just so much pseudoscience and just outright lies.
[00:53:48.800 --> 00:53:49.360] Yeah.
[00:53:49.360 --> 00:53:49.680] All right.
[00:53:49.680 --> 00:53:50.720] Thanks, Kara.
[00:53:50.720 --> 00:53:56.240] All right, Bob, I understand the moon Miranda has water and maybe something else.
[00:53:56.480 --> 00:53:57.520] Perhaps.
[00:53:57.520 --> 00:54:04.960] We may, tentatively, perhaps, maybe have yet another icy moon in our solar system with a subsurface ocean.
[00:54:04.960 --> 00:54:06.160] And we know what that means, right?
[00:54:06.480 --> 00:54:10.240] My first thought is like, oh, maybe there's ketosynthetic life in there.
[00:54:10.560 --> 00:54:12.400] But that's definitely jumping the gun.
[00:54:12.400 --> 00:54:19.280] The moon, though, doesn't orbit Jupiter or Saturn, but the far more distant and less well-branded planet, Uranus.
[00:54:19.280 --> 00:54:26.320] How could such an interesting oceanic possibility be teased out of such a distant small object?
[00:54:26.560 --> 00:54:33.800] The study published in the Planetary Science Journal led by Tom Nordheim, planetary scientist at the Johns Hopkins Applied Physics Laboratory.
[00:54:34.040 --> 00:54:38.280] So this starts with Uranus, the second farthest planet at 19 AUs.
[00:54:38.280 --> 00:54:42.120] That's 93 million miles, the distance Earth to the Sun.
[00:54:42.120 --> 00:54:43.800] Jupiter's only five AUs.
[00:54:43.800 --> 00:54:46.200] So this is like a lot farther away.
[00:54:46.440 --> 00:54:48.760] This is the planet that rotates on its side.
[00:54:48.760 --> 00:54:55.640] You know that one giving it crazy seasons, like at the poles, it's 42 years of sunlight and then 42 years of darkness.
[00:54:55.640 --> 00:54:56.360] Wow.
[00:54:56.760 --> 00:54:58.040] Yeah, that's nasty.
[00:54:58.040 --> 00:55:02.520] The star of the study, so to speak, is Uranus's innermost moon, Miranda.
[00:55:02.520 --> 00:55:03.880] The moon is tiny.
[00:55:03.880 --> 00:55:06.520] It's got a surface area of Texas.
[00:55:06.520 --> 00:55:09.560] Texas is big, but it's small for a moon.
[00:55:09.560 --> 00:55:18.040] With a diameter of only 470 kilometers, it's one of the smallest observed objects in hydrostatic equilibrium in our solar system.
[00:55:18.200 --> 00:55:23.320] If it's in hydrostatic equilibrium, that means that it is round because of gravity.
[00:55:23.880 --> 00:55:26.440] Like the moons of Mars are not round.
[00:55:26.440 --> 00:55:27.320] Right, right.
[00:55:27.320 --> 00:55:29.800] Depending, of course, what it's made of, but they're rocky moons.
[00:55:29.800 --> 00:55:37.000] So how do we go about determining that there might be water under the ice on a moon that's 2.7 billion kilometers away?
[00:55:37.000 --> 00:55:38.120] Call the water company.
[00:55:38.440 --> 00:55:39.160] Yes.
[00:55:39.480 --> 00:55:45.720] Another option would be to, well, in this case, it's ironically about what's on the surface of the moon.
[00:55:45.720 --> 00:55:50.920] And we first got a look at Miranda from the pictures Voyager took way back in 86.
[00:55:50.920 --> 00:55:51.560] Wow.
[00:55:51.560 --> 00:55:54.920] It looks like a patchwork of different moons that got stitched together.
[00:55:54.920 --> 00:55:56.600] It's really kind of bizarre.
[00:55:56.600 --> 00:56:01.640] There's like these grooves or canyons that are 12 times deeper than the Grand Canyon.
[00:56:01.640 --> 00:56:03.320] And there's these huge cliffs.
[00:56:03.320 --> 00:56:08.760] And there's these weird trapezoidal shapes, geological shapes called Koronae.
[00:56:08.760 --> 00:56:15.200] They think it just might be dense, you know, metallic or rocky material from previous collisions with meteors.
[00:56:14.840 --> 00:56:18.480] Clearly, Miranda has a strange and complicated geological past.
[00:56:18.720 --> 00:56:23.280] So, to reconstruct that past, the researchers combined the old with the new.
[00:56:23.280 --> 00:56:33.520] They used the old Voyager pictures, and because that's really the best images that we have of Miranda, even though it was some 86, I mean, Voyager got pretty damn close.
[00:56:33.520 --> 00:56:38.000] And we incorporated those old pictures into modern modeling techniques.
[00:56:38.320 --> 00:56:44.160] Nordheim described it as squeezing the last bit of science we can from Voyager 2's images.
[00:56:44.160 --> 00:56:48.640] So, how does the surface of Miranda shed light on its interior?
[00:56:48.640 --> 00:56:58.000] They say in their paper, in this paper, we will attempt to constrain Miranda's interior structure from interpretation and modeling of surface stress patterns.
[00:56:58.000 --> 00:57:15.920] So, the researchers claim that by determining what caused those weird, deformed surface geological shapes and structures, they will be able to winnow the possibilities of what the interior of Miranda is like, based primarily on the surface.
[00:57:15.920 --> 00:57:27.120] Now, the model showed that, or concluded, that 100 to 500 million years ago, Miranda could have had a sub-ocean, a subsurface ocean more than 100 kilometers deep.
[00:57:27.120 --> 00:57:31.760] So, now, how do you actually create an ocean on a small moon so far from the sun?
[00:57:32.080 --> 00:57:37.040] The most likely culprit, they think, are orbital resonances with nearby moons.
[00:57:37.280 --> 00:57:38.720] These are really fascinating.
[00:57:38.720 --> 00:57:41.040] Resonances like this are like pushing a kid on a swing.
[00:57:41.040 --> 00:57:46.960] You know, the kid's swinging and you push at the right time, and the kid goes farther and farther and farther.
[00:57:46.960 --> 00:57:48.720] Orbital resonances are like that.
[00:57:48.720 --> 00:57:58.320] The tidal forces between the moons of Uranus can be amplified by these resonances to the point where the moons actually experience a change in their orbits.
[00:57:58.320 --> 00:58:01.880] And that's why Miranda's orbits inclined a bit, they think.
[00:57:59.840 --> 00:58:05.640] But it's not only the orbit that changes because of these resonances.
[00:58:06.280 --> 00:58:10.040] It can change the axis of rotation itself and the tilt of its axis.
[00:58:10.520 --> 00:58:16.760] So it just wreaks havoc with these moons, these resonances when they line up, when they're properly set up.
[00:58:16.760 --> 00:58:20.680] So this can also obviously wreak havoc on the moon's surface.
[00:58:20.680 --> 00:58:29.160] And they calculated that all of this resonance movement and stuff actually compressed one side of the moon and stretched the other side.
[00:58:29.160 --> 00:58:34.200] And that's probably one of the main reasons why we're seeing such a weird surface going on there.
[00:58:34.200 --> 00:58:40.280] Now, all of this also creates friction and heat in the interior, and that's what they think could have created the ocean there.
[00:58:40.280 --> 00:58:47.400] So it basically kind of almost boils down to tidal forces again, which is what we see in the moons of Jupiter as well.
[00:58:47.400 --> 00:58:53.480] We've got moons that some of the most volcanically active moons around Jupiter are because of the tidal forces.
[00:58:53.480 --> 00:58:56.520] They're just constantly kneading and compressing the interior.
[00:58:56.520 --> 00:59:02.200] So it's kind of related to what we're seeing here, although these resonances are a little bit different.
[00:59:02.200 --> 00:59:05.480] So how can that ocean still exist though after a half a billion years?
[00:59:05.480 --> 00:59:08.520] Because this happened, you know, 100 million to a half a billion years ago.
[00:59:08.760 --> 00:59:10.680] Why is this ocean still there?
[00:59:10.680 --> 00:59:18.840] So they say that one reason is that this tidal heating could persist because of this eccentric orbit that the moon is now in.
[00:59:18.840 --> 00:59:22.520] Now, they're not saying that the ocean is still 100 kilometers deep.
[00:59:22.600 --> 00:59:25.960] They say it's probably smaller, but it's probably still there.
[00:59:25.960 --> 00:59:37.800] And the other bit of evidence for that is that if it were frozen solid, if the moon had no subsurface ocean, they say that there would have been evidence of that on the surface, and they do not see that evidence.
[00:59:37.800 --> 00:59:39.880] So that's basically the gist of their argument.
[00:59:39.880 --> 00:59:45.000] In the future, they may be able to more definitively demonstrate that Uranus still has a subsurface ocean.
[00:59:45.280 --> 00:59:46.400] That would be very cool.
[00:59:46.400 --> 00:59:48.800] I mean, I don't think, I mean, this is so far away.
[00:59:48.800 --> 00:59:51.600] I mean, imagine three times farther away than Jupiter.
[00:59:51.600 --> 00:59:56.400] I mean, I don't think we're going to be getting there anytime in our lifetimes at all.
[00:59:57.040 --> 01:00:17.840] But maybe someday we will be able to say that this is one subsurface ocean that not only exists, and like I said at the beginning, perhaps if there's a subsurface ocean, there could potentially be some sort of life, some single-celled organisms, some microorganisms that are based on chemosynthetic rather than photosynthetic.
[01:00:17.840 --> 01:00:20.320] And that would be, of course, mind-boggling.
[01:00:20.320 --> 01:00:26.320] But actually, I'm hoping that we'll find that nearer by with Saturn and Jupiter and not have to go all the way to Uranus.
[01:00:26.320 --> 01:00:30.000] But one more thing to keep an eye on.
[01:00:30.000 --> 01:00:32.800] Yeah, but it's not going to be a probe there anytime soon.
[01:00:32.800 --> 01:00:39.360] No, but just the process they went through to model this and come to that conclusion was different and fascinating to me.
[01:00:39.360 --> 01:00:41.680] Yep, no probing of Uranus.
[01:00:41.920 --> 01:00:43.120] Got it.
[01:00:44.400 --> 01:00:46.960] All right, Evan, what's the Club 27 myth?
[01:00:46.960 --> 01:00:49.200] Have you heard of this before?
[01:00:49.520 --> 01:00:50.080] No.
[01:00:50.080 --> 01:00:59.120] No, I think maybe you have, but you don't know it as the Club 27 myth or what they call the 27 Club.
[01:00:59.120 --> 01:01:00.480] That's for short.
[01:01:00.480 --> 01:01:14.720] But this is a cultural phenomenon referring to a group of famous musicians initially and then artists and some other actors who got folded into this group who have all died at the age of 27.
[01:01:15.040 --> 01:01:15.920] Oh, okay.
[01:01:15.920 --> 01:01:16.320] Yeah.
[01:01:16.320 --> 01:01:16.640] Yeah.
[01:01:16.720 --> 01:01:17.920] The Kurt Coco Band.
[01:01:17.920 --> 01:01:19.200] Right, right, right.
[01:01:19.440 --> 01:01:21.280] But Kara, it didn't start there.
[01:01:21.280 --> 01:01:22.640] It's not a modern phenomenon.
[01:01:22.640 --> 01:01:22.960] Oh, right.
[01:01:22.960 --> 01:01:24.080] Janice Shoplin, too?
[01:01:24.480 --> 01:01:28.640] Yeah, this dates back to actually the first one was 1969.
[01:01:28.960 --> 01:01:30.600] That was when I was born.
[01:01:30.600 --> 01:01:35.400] And Rolling Stone's co-founder, Brian Jones, he dies at age 27.
[01:01:29.840 --> 01:01:36.040] He drowned.
[01:01:36.200 --> 01:01:39.640] It was tragic, and the rock world was stunned by that loss.
[01:01:39.640 --> 01:01:46.440] But then in 1970, Kara, Janice Joplin, also at age 27 died.
[01:01:46.440 --> 01:01:51.320] And Jimi Hendrix, age 27, dies in 1970.
[01:01:51.320 --> 01:01:54.280] So, all right, you got Brian Jones, 69.
[01:01:54.280 --> 01:01:58.040] Now, Janice and Jimmy are gone in 1970, all aged 27.
[01:01:58.040 --> 01:02:04.120] And then one year later, 1971, Jim Morrison, the lead singer of the Doors, he was 27.
[01:02:04.440 --> 01:02:06.120] Dies at age 27.
[01:02:06.200 --> 01:02:06.600] Nice.
[01:02:06.600 --> 01:02:07.160] In the club.
[01:02:07.320 --> 01:02:08.040] Holy crap.
[01:02:08.040 --> 01:02:11.800] What is going on with our music stars dying at age 27?
[01:02:11.800 --> 01:02:13.560] Is it some kind of curse?
[01:02:13.880 --> 01:02:17.480] But regardless, forever, there it was, cemented in our culture.
[01:02:17.480 --> 01:02:27.880] If you were a fan of any kind of music growing up in the 70s, even the early 80s, my guess is you have some kind of memory of discussions about the 27 club phenomenon.
[01:02:28.440 --> 01:02:33.240] And then what started happening is people started taking a peek back in time before the club was realized.
[01:02:33.240 --> 01:02:39.800] And you find out that legendary bluesman Robert Johnson, he was age 27 when he died.
[01:02:39.800 --> 01:02:42.600] So that, yeah, you add that big name to the club.
[01:02:42.600 --> 01:02:48.120] Going forward, like you said, Kara, Kurt Cobain from Nirvana in the 1990s.
[01:02:48.120 --> 01:02:54.360] Amy Weinhaus, also aged 27 in 2011, she passed away.
[01:02:54.360 --> 01:03:09.400] And so you wind up casting sort of this larger net because you wind, because what people will do is they'll start throwing actors and artists and other media people effectively into this more, and you get more relevant data points.
[01:03:09.400 --> 01:03:11.080] So, what do our brains do?
[01:03:11.080 --> 01:03:23.200] Well, we like to soak up these types of celebrity-related cultural phenomenon and accept them sort of as, I don't know, like a quasi-fact or fact-ish sort of thing that exists.
[01:03:23.200 --> 01:03:25.840] And maybe you think, wow, what are those odds?
[01:03:25.840 --> 01:03:31.040] All these amazing musicians and artists and celebrities dying at age 27.
[01:03:31.040 --> 01:03:32.480] What are those chances?
[01:03:32.480 --> 01:03:39.040] Are we really capable of knowing and understanding what those actual chances are with statistical significance?
[01:03:39.040 --> 01:03:40.160] Probably not.
[01:03:40.160 --> 01:03:42.000] But this has been studied before.
[01:03:42.000 --> 01:03:52.400] I went back to an article over at Vox from 2015, and there was research done that year by a professor of psychology and music at the University of Sydney.
[01:03:52.400 --> 01:03:55.520] Her name was Diana Theodora Kenney.
[01:03:55.520 --> 01:04:00.400] And the most common age of death musicians was not 27 in her study.
[01:04:00.400 --> 01:04:05.600] Do you want to guess what the age was most common death for these artists?
[01:04:05.600 --> 01:04:06.320] 68.
[01:04:06.320 --> 01:04:06.960] 69.
[01:04:07.280 --> 01:04:08.560] Bob's a little closer.
[01:04:08.880 --> 01:04:09.520] 67.
[01:04:10.480 --> 01:04:11.440] 56.
[01:04:11.760 --> 01:04:13.920] 56 was the age.
[01:04:13.920 --> 01:04:15.360] That's because musicians live hard.
[01:04:16.640 --> 01:04:22.880] Over 11,000 musicians were in that study, and she studied people who died between 1950 and 2010.
[01:04:22.880 --> 01:04:27.200] Only 1.3% of those musicians died at age 27.
[01:04:27.200 --> 01:04:31.360] 2.3% were the most at age 56.
[01:04:31.680 --> 01:04:39.040] So, yeah, and if she graphed it on a chart, it makes a very nice, you know, bell curve, statistically speaking.
[01:04:39.040 --> 01:04:46.800] But regardless of that and other studies, this continues to be a subject of revisiting and more studies, but in different ways.
[01:04:46.800 --> 01:04:49.520] And this is where we ran into the news item this week.
[01:04:49.520 --> 01:04:56.320] It's being covered by a lot of places, but Scientific American, her name is Rachel Newer, N-U-W-E-R.
[01:04:56.320 --> 01:05:02.840] She wrote an article about this titled The Myth that Musicians Die at 27 shows how superstitions are made.
[01:04:59.840 --> 01:05:03.000] Yeah.
[01:05:03.560 --> 01:05:12.840] And she's referring to a new study that appeared in the Proceedings of the National Academy of Sciences, P-N-A-S.
[01:05:14.280 --> 01:05:29.240] Titled, All right, bear with me here: Path Dependence, Stigmergy, and Mimetic Reification of the Formation of the 27 Club Myth with authors Zachary Dunovan and Patrick Kaminsky.
[01:05:29.240 --> 01:05:48.200] What they were trying to do here is basically get into how a legend or a myth that emerged out of random but strange series of events went on to have a real-world impact by shaping the legacies of other famous people who subsequently died at age 27.
[01:05:48.520 --> 01:05:56.920] In effect, they're saying, yeah, it's a myth, but there is maybe something going on here that is of some significance.
[01:05:56.920 --> 01:06:10.600] And what they did is they looked, they used Wikipedia and they looked at various languages, obviously, throughout the world, and they used an analysis of people who were born after 1900 and who died before 2015.
[01:06:10.600 --> 01:06:15.080] And they came up with over 344,000 Wikipedia pages.
[01:06:15.080 --> 01:06:20.440] But then they used page visits as their proxy for fame, right?
[01:06:20.440 --> 01:06:22.200] So this is based on that.
[01:06:22.200 --> 01:06:24.520] So they put that model together.
[01:06:24.520 --> 01:06:27.640] And do you know what happened when they looked at it?
[01:06:27.640 --> 01:06:32.160] And as far as looking at all the artists who died at age 27.
[01:06:32.720 --> 01:06:34.760] Was it a significantly different number?
[01:06:35.400 --> 01:06:36.200] Was it the same?
[01:06:36.200 --> 01:06:37.160] What do you guys think?
[01:06:37.160 --> 01:06:37.880] I don't know.
[01:06:38.200 --> 01:06:39.160] It was the same.
[01:06:39.160 --> 01:06:39.640] It was the same.
[01:06:40.280 --> 01:06:40.560] I know.
[01:06:40.560 --> 01:06:40.920] We didn't know.
[01:06:41.160 --> 01:06:41.880] It was the same.
[01:06:41.880 --> 01:06:43.400] That did not change.
[01:06:43.400 --> 01:07:03.120] But here's what they did find: they said among those in the 90th percentile of fame and higher, for those that did die at age 27, they experienced an extra boost of popularity in the form of more page visits to their Wikipedia page that could not be accounted for by other factors.
[01:07:03.120 --> 01:07:12.240] They say the effect was particularly pronounced for the most famous of the famous or individuals who roughly achieved the 99th percentile of fame.
[01:07:12.240 --> 01:07:23.840] And that bump, they say, indicates that people who die at age 27 are considerably more likely to be more famous than comparatively those who even die at just age 26 or 28.
[01:07:23.840 --> 01:07:25.200] So that's confirmation bias.
[01:07:25.200 --> 01:07:25.920] I suppose so.
[01:07:25.920 --> 01:07:26.640] That's what that is.
[01:07:26.640 --> 01:07:27.840] Yeah, 100%.
[01:07:27.840 --> 01:07:33.760] Yeah, so you have like people underestimate how many potential musicians there are, right?
[01:07:33.760 --> 01:07:35.280] Especially if you include celebrities.
[01:07:35.280 --> 01:07:37.280] It's thousands and thousands.
[01:07:37.280 --> 01:07:45.120] You could come up with the same kind of number of people who die at any age, and there's a bell curve, you know, that has nothing to do with the age 27.
[01:07:45.120 --> 01:07:53.360] That's just a, you know, people notice a pattern and then they include the data in subsequent formulations.
[01:07:53.360 --> 01:07:53.840] You know what I mean?
[01:07:54.160 --> 01:07:57.920] They carry that quirky observation forward.
[01:07:57.920 --> 01:08:05.120] And then, especially then you engage in confirmation bias, you're looking for data to fit the pattern and without looking at all the data.
[01:08:05.120 --> 01:08:06.000] It's just classic.
[01:08:06.000 --> 01:08:15.920] You know, this is why even if we see this same phenomenon happen, even like in the clinic, where you, by coincidence, see a couple of patients with some kind of a correlation.
[01:08:15.920 --> 01:08:17.440] And say, oh, maybe there's something here.
[01:08:17.440 --> 01:08:21.120] So you look to see if there's other cases and you find them.
[01:08:21.120 --> 01:08:23.360] And you include the original observations in the data.
[01:08:23.360 --> 01:08:25.840] And you have a case series that makes it seem like something's happening.
[01:08:25.840 --> 01:08:28.880] It's all just random quirkiness that has nothing to do with anything.
[01:08:28.880 --> 01:08:34.840] You need an independent, thorough evaluation with new data to see if this holds up.
[01:08:34.840 --> 01:08:35.880] And of course, it doesn't.
[01:08:29.760 --> 01:08:37.240] It's just random nonsense.
[01:08:37.560 --> 01:08:38.360] Correct.
[01:08:38.360 --> 01:08:38.760] All right.
[01:08:38.760 --> 01:08:39.480] Thanks, Seven.
[01:08:39.480 --> 01:08:41.560] Jay, who's that noisy time?
[01:08:41.560 --> 01:08:44.600] All right, guys, going back at least two episodes.
[01:08:44.600 --> 01:08:46.920] Here's the noisy that I played.
[01:09:05.240 --> 01:09:06.600] You guys have any guesses?
[01:09:06.600 --> 01:09:10.440] Yeah, it's Donald Duck maneuvering inside of a tank.
[01:09:10.760 --> 01:09:11.880] That's pretty good.
[01:09:11.880 --> 01:09:12.760] Okay.
[01:09:13.400 --> 01:09:20.200] I got so many good, good guesses, meaning ones that I thought, you know, just were worthy of that sound.
[01:09:20.200 --> 01:09:28.680] The first one was from Joe Vandenenden, and he says, Is that the walker bringing the spacecraft carrier, the Europa Clipper, back out of the launch pad?
[01:09:29.000 --> 01:09:29.960] Very cool guess.
[01:09:29.960 --> 01:09:31.000] I mean, yeah, I could see that.
[01:09:31.000 --> 01:09:33.480] I hear what you're saying there, and I think that's a great guess.
[01:09:33.480 --> 01:09:37.240] You're not correct, and I would like to actually hear that sound if you ever can get it.
[01:09:37.240 --> 01:09:47.560] Another listener named Matthew Morrison said, Hi, Jay, my daughter Niamh, and I think it is a ship moving through the water where there is a layer of ice on top that is breaking as the ship moves through it.
[01:09:47.560 --> 01:09:53.160] Another fantastic guess because there are water-like sounds going on in that clip.
[01:09:53.160 --> 01:09:55.160] So, you know, I think that was a pretty cool guess.
[01:09:55.160 --> 01:09:57.880] You are incorrect, and tell Niamh no big deal.
[01:09:57.880 --> 01:09:58.840] Everybody tries.
[01:09:58.840 --> 01:10:00.120] It's great to try.
[01:10:00.120 --> 01:10:01.880] Sometimes we actually win, right?
[01:10:01.880 --> 01:10:02.920] So keep trying.
[01:10:02.920 --> 01:10:07.560] Next one is Matt Soskins, and he said, Jay, great to meet you at Saikon.
[01:10:07.560 --> 01:10:14.960] Before I give you my guess, I want to tell you about my grandmother's brownie recipe and how it led to me, led me to know this noisy.
[01:10:14.960 --> 01:10:19.680] And then he's, of course, he's kidding because I asked people, you know, please don't write me these big stories.
[01:10:14.680 --> 01:10:20.880] It just cuts the chase.
[01:10:22.400 --> 01:10:26.240] And then he said, water-powered organ or a bird.
[01:10:26.240 --> 01:10:35.920] Now, whenever anybody says or a bird, I just ignore that because, of course, I shouldn't even be taking guesses from people with more than one guess, but it was a joke.
[01:10:35.920 --> 01:10:36.480] I get it.
[01:10:36.480 --> 01:10:37.840] Water-powered organ.
[01:10:37.840 --> 01:10:44.720] It is not a water-powered organ, but again, there is, you know, there is a water kind of noise in there, and I can see where he's coming from.
[01:10:44.720 --> 01:10:46.880] Next one is from Ben Simon.
[01:10:46.880 --> 01:10:54.640] He said, This week's noisy puts in mind movie scenes of nervous naval officers quietly glancing at each other during a tense submarine dive.
[01:10:54.640 --> 01:11:00.640] So I'm going to say this is a recording of a record-breaking deep dive by a research submersible.
[01:11:00.640 --> 01:11:02.000] Another awesome guess.
[01:11:02.000 --> 01:11:04.960] It's not correct, but damn, that's an awesome guess.
[01:11:04.960 --> 01:11:07.760] But there was no winner this week, and that's perfectly okay.
[01:11:07.760 --> 01:11:10.240] Like I said, you know, we try, sometimes we fail.
[01:11:10.240 --> 01:11:11.280] What did Yoda say?
[01:11:11.280 --> 01:11:12.720] Try or do not.
[01:11:12.720 --> 01:11:13.600] There is no fail, right?
[01:11:13.600 --> 01:11:14.720] Remember, do or do not.
[01:11:14.720 --> 01:11:15.680] There is no try.
[01:11:15.680 --> 01:11:16.640] Right, that's it.
[01:11:16.960 --> 01:11:20.080] Well, that does not apply to who's that noisy.
[01:11:20.960 --> 01:11:36.080] So I want to thank everybody for being honest because since nobody won, that means I'm 100% sure that everybody that listens to this show that sent in a guess respected my request to not write in if you knew the answer because I knew a lot of people knew the answer out there.
[01:11:36.080 --> 01:11:36.960] So, you know what?
[01:11:36.960 --> 01:11:38.080] Props to you guys.
[01:11:38.400 --> 01:11:43.200] More reason to go to Nauticon because we have high-quality people that listen to this show.
[01:11:43.200 --> 01:11:45.120] So, guys, what is that sound?
[01:11:45.120 --> 01:11:52.000] I was really excited to hear it after I knew what it was, and then I listened to it, and it ended up being as cool as I hoped it should be.
[01:11:52.000 --> 01:11:59.200] Okay, this is the sound of molten metals swirling in the Earth's core as its magnetic field flips.
[01:11:59.280 --> 01:12:00.000] Can't record that.
[01:12:00.360 --> 01:12:02.120] The tape recorder would melt.
[01:12:02.120 --> 01:12:03.560] Yeah, that's what I thought.
[01:12:03.560 --> 01:12:07.560] No, but that, guys, that is the internal swirling sound of the Earth's core.
[01:12:07.560 --> 01:12:11.800] And it made me think of something really interesting, and luckily, it's not the case.
[01:12:11.800 --> 01:12:14.520] Imagine if we constantly heard that noise.
[01:12:14.520 --> 01:12:17.640] But that noise is, you know, think about how loud that noise must be.
[01:12:17.640 --> 01:12:21.000] It just can't penetrate all the rock and regolith and everything.
[01:12:21.160 --> 01:12:23.400] I just thought that was a wicked cool noise.
[01:12:23.400 --> 01:12:25.240] Let's listen to it again.
[01:12:34.440 --> 01:12:35.160] It's loud.
[01:12:35.160 --> 01:12:37.000] I still hear Donald Duck in there a little bit.
[01:12:37.080 --> 01:12:38.920] There's a boat creek noise in there.
[01:12:39.240 --> 01:12:41.320] So I get all those guesses.
[01:12:41.320 --> 01:12:43.720] Jay, you mentioned as the magnetic field flips.
[01:12:44.360 --> 01:12:45.320] What does that mean?
[01:12:45.320 --> 01:12:48.760] I guess they were recording it in anticipation of the flip.
[01:12:48.760 --> 01:12:52.760] I don't think it has anything to do with the flip, but that's why they were recording.
[01:12:53.080 --> 01:12:58.760] And before I move on to the new noisy for this week, I have a response to something that we talked about.
[01:12:58.760 --> 01:13:07.800] You know, Kara was mentioning that, let me see, back in episode 1007, we heard the voice of both Helen Keller and her interpreter.
[01:13:07.800 --> 01:13:17.960] Kara commented on how strange it is to hear the accent that was used by the interpreter whenever recordings from that time period are played because nobody speaks that way anymore.
[01:13:17.960 --> 01:13:22.600] And then Kara speculated that that's probably just how people spoke back then, but it sounds affected.
[01:13:22.600 --> 01:13:25.160] So a couple of people wrote in about this.
[01:13:25.400 --> 01:13:28.760] This particular email is from someone named Stephen Hopkins.
[01:13:28.760 --> 01:13:34.680] And he says, it's app it absolutely was affected as it was manufactured.
[01:13:34.680 --> 01:13:37.720] It was a manufactured accent, which did not evolve organically.
[01:13:37.720 --> 01:13:42.040] It's called mid-Atlantic accent or the transatlantic accent.
[01:13:42.040 --> 01:13:57.040] And it was a basically, it was a put-on accent by Northeastern American upper class in the early 20th century, and it was adopted by many broadcasters and actors of that time period because it made them sound cultured and because they felt it helped their voice come through more clearly.
[01:13:57.040 --> 01:13:59.280] Okay, so the voice is BS, right?
[01:13:59.920 --> 01:14:00.960] That's an affectation.
[01:14:00.960 --> 01:14:02.160] Well, we've spoken about this.
[01:14:02.160 --> 01:14:04.800] We absolutely have the mid-Atlantic accent.
[01:14:05.120 --> 01:14:08.800] Yeah, what's interesting about it is that it's not a regional accent.
[01:14:08.800 --> 01:14:10.640] Like, it doesn't exist anywhere.
[01:14:10.640 --> 01:14:13.040] It is a learned accent.
[01:14:13.040 --> 01:14:15.760] It's taught like in finishing school or whatever.
[01:14:15.760 --> 01:14:28.080] And it is essentially like an Eastern American accent with some British affectations and mixed in, like British sounds mixed in.
[01:14:28.080 --> 01:14:36.560] And so, and it is supposed to be a very like you enunciate things very clearly, and you, you know, but it's also got this like high and mighty sound to it.
[01:14:36.560 --> 01:14:38.240] Yes, right, exactly.
[01:14:38.240 --> 01:14:45.680] But the thing is, so, and so that person might not have been speaking that way because she was being recorded.
[01:14:45.680 --> 01:14:51.040] She might have been speaking that way because she was educated at one of the schools who taught her to speak that way.
[01:14:51.040 --> 01:14:54.560] And maybe she did lay it on a little bit thick because she was being recorded.
[01:14:54.560 --> 01:15:00.960] We don't know because everyone has their own individual manifestation of like what their mid-Atlantic accent is.
[01:15:00.960 --> 01:15:18.880] But that doesn't mean there isn't also temporal accents because if you watch documentaries of like I watched a lot of World War II documentaries and you watch people in the 40s being interviewed on film and they're not actors and they're not, you know, they're not upper class affectations.
[01:15:18.880 --> 01:15:22.080] They speak in an accent that doesn't exist today.
[01:15:22.080 --> 01:15:23.800] You know, it's like the that's right.
[01:15:23.800 --> 01:15:24.120] Yeah.
[01:15:26.520 --> 01:15:27.520] It's not that exactly.
[01:15:27.520 --> 01:15:29.000] It's not like that's kind of a stage accent.
[01:15:29.440 --> 01:15:40.920] But that's what that's when you that's when you hear like what real people sound like at the time because they're not actors, they're just people in the war, you know, or whatever.
[01:15:40.920 --> 01:15:43.480] So there is absolutely temporal accents as well.
[01:15:43.480 --> 01:15:50.520] But I do agree that that person's accent was probably a learned, you know, transatlantic or middle agent.
[01:15:50.600 --> 01:15:51.640] I really don't like it.
[01:15:51.640 --> 01:15:53.800] It sounds so put on.
[01:15:53.800 --> 01:15:55.240] You know, it's snobby.
[01:15:55.800 --> 01:15:56.600] Yeah, absolutely.
[01:15:57.720 --> 01:15:58.600] It was snobby.
[01:15:58.600 --> 01:15:59.800] It was literally snobby.
[01:15:59.800 --> 01:16:01.240] I mean, that was kind of that.
[01:16:01.240 --> 01:16:03.400] And it just sort of faded away after World War II.
[01:16:03.400 --> 01:16:03.800] All right.
[01:16:03.800 --> 01:16:06.760] So good job, everyone, that sent in all those great guesses.
[01:16:07.240 --> 01:16:09.080] I have a new noisy for you this week.
[01:16:09.080 --> 01:16:10.360] Check this out.
[01:16:24.280 --> 01:16:25.000] There you go.
[01:16:25.000 --> 01:16:29.080] That's ready whip whipped cream being dispensed.
[01:16:29.960 --> 01:16:30.760] Yeah, we know that.
[01:16:31.000 --> 01:16:34.280] It's like 11 o'clock at night, and you get it in the refrigerator.
[01:16:35.240 --> 01:16:35.800] Yeah.
[01:16:36.120 --> 01:16:37.640] That's straight into the mouth.
[01:16:38.600 --> 01:16:41.240] That's a right of goddamn passage in the United States.
[01:16:41.240 --> 01:16:43.400] Oh, we should all take one tonight.
[01:16:43.400 --> 01:16:51.720] So, okay, guys, if you know this week's noisy, or if you heard something really cool, email me at wtn at the skepticsguy.org.
[01:16:51.720 --> 01:16:56.040] All right, so let's go on with that interview with Brian Cox and special guest Brian Wecht.
[01:16:56.040 --> 01:17:02.520] And for those premium patron members, you get to listen to the full uncut version of that interview.
[01:17:02.520 --> 01:17:03.960] That'll be up this weekend.
[01:17:15.680 --> 01:17:18.000] Well, joining us now is Brian Cox.
[01:17:18.000 --> 01:17:19.760] Brian, welcome to the Skeptical Sky to the Universe.
[01:17:19.760 --> 01:17:20.720] Ah, pleasure.
[01:17:20.720 --> 01:17:21.760] Pleasure to be here.
[01:17:21.760 --> 01:17:24.720] You are one of the people that we've been hoping to get an interview with for a very long time.
[01:17:24.720 --> 01:17:27.440] You're obviously one of the superstar science communicators in the world.
[01:17:27.440 --> 01:17:29.440] We really appreciate what you do.
[01:17:29.760 --> 01:17:36.480] So we were talking about, you know, making the transition from being basically an academic, a scientist, to being a science communicator.
[01:17:36.640 --> 01:17:37.760] How has that worked out for you?
[01:17:37.760 --> 01:17:39.600] How do you feel about that?
[01:17:39.600 --> 01:17:42.000] Well, I mean, the first thing to say was an accident.
[01:17:42.000 --> 01:17:43.840] So I didn't really plan it.
[01:17:44.240 --> 01:17:54.960] In fact, in my early career as a PhD student and then postdoc, all I tried to do was get a research fellowship so no one would bother me and then just do research.
[01:17:54.960 --> 01:17:56.480] And I didn't even want to teach.
[01:17:56.720 --> 01:18:01.440] I just wanted to avoid everything apart from doing particle physics.
[01:18:01.440 --> 01:18:12.560] And then I got involved in one of the sort of funding crises that happen every now and again in all sorts of countries, I think, in the UK.
[01:18:12.880 --> 01:18:17.440] So I began to get involved in arguing for more funding for research.
[01:18:17.440 --> 01:18:21.840] And that brought me into contact, I suppose, with the media and the press.
[01:18:21.840 --> 01:18:24.000] And so it was an accident, really.
[01:18:24.240 --> 01:18:31.520] And then the BBC in the UK interviewed me a few times and then said, why don't you make a little documentary on the radio about particle physics?
[01:18:31.520 --> 01:18:35.280] And then why don't you make a little TV show, a little low-budget thing about particle physics?
[01:18:35.280 --> 01:18:38.560] And so it happened by accident.
[01:18:38.560 --> 01:18:41.200] And now, I mean, I love to teach.
[01:18:41.200 --> 01:18:43.840] So now I choose to teach at the University of Manchester.
[01:18:43.840 --> 01:18:46.480] I teach first years, quantum mechanics and relativity, actually.
[01:18:46.960 --> 01:18:52.720] And I obviously, as you've said, I get involved in making television programmes and so on.
[01:18:52.720 --> 01:18:57.440] So it was something that I came to later in my career.
[01:18:57.440 --> 01:19:05.880] But I very strongly believe that it's an important part of an academic career if you choose to do it.
[01:19:06.200 --> 01:19:08.760] And in fact, I was at the University of Manchester last week.
[01:19:08.760 --> 01:19:12.920] We had a bigger sort of worldwide universities conference there.
[01:19:12.920 --> 01:19:34.600] And I spoke at that and said that I think it's extremely important that if academics want to engage in whatever capacity, it doesn't have to be making television programmes, but just in speaking about climate science, as we've spoken about today, for example, then that should be seen as not only positive, but it should be part of an academic career if the academic chooses for it.
[01:19:34.600 --> 01:19:36.840] So promotion case and so on.
[01:19:36.840 --> 01:19:41.800] And so I've come to believe that it's extremely important, of course, now to engage.
[01:19:41.800 --> 01:19:45.240] And we can talk about why, all the reasons why I think that's the case.
[01:19:45.240 --> 01:19:48.920] Yeah, obviously I completely agree with you, also being an academic myself.
[01:19:49.080 --> 01:19:54.520] I'm always curious, asking my fellow sort of academic science communicators, how's that going for you?
[01:19:54.920 --> 01:20:02.760] And how specifically, how does the university, do they agree with you that this should be part of an academic career and that you get credit for it for promotion?
[01:20:02.760 --> 01:20:04.200] And I know there's a little bit of a divide.
[01:20:04.200 --> 01:20:06.840] I think it's worse in the US than in the UK.
[01:20:07.560 --> 01:20:09.240] So what's your experience been?
[01:20:09.240 --> 01:20:23.320] Yeah, at the University of Manchester, actually, so we have the promotion, they call it the legs of the promotion case, and they are research, teaching, administration, and we call it social responsibility.
[01:20:24.680 --> 01:20:30.600] So it's a quarter, basically, of the case it can be, which is public engagement, as we might call it.
[01:20:30.840 --> 01:20:37.320] But I think it depends very strongly on your vice-chancellor, the head of the president of the university.
[01:20:37.320 --> 01:20:51.440] But also, and I spoke about this as well, this conference last week, it can often be that I do find that the people right at the top are very they understand, as I think we all understand, that it's vitally important to communicate science.
[01:20:51.760 --> 01:21:04.880] Of course, in a democracy, as Carl Sagan said, the idea that if you have a population that has no contact with the way that we acquire reliable knowledge about the world, then the decisions that democracy makes will be flawed.
[01:21:06.400 --> 01:21:08.720] And I think people at the top know that.
[01:21:08.960 --> 01:21:18.160] You can have problems in universities with the kind of middle management, the heads of department level, and things like that, you know, because of the funding streams and so on.
[01:21:18.160 --> 01:21:24.880] So, I think that's where in the UK, if there's going to be a problem, it will be with your kind of line manager.
[01:21:24.880 --> 01:21:28.400] It won't be with the people at the top who understand the wider picture.
[01:21:28.400 --> 01:21:35.920] Brian, I talk a lot about particle physics on the show, and I want to get your sense of how you feel about your competence in the future.
[01:21:35.920 --> 01:21:43.200] You know, you've got the LHC who's they've scaled up now to what, 14 tera electron volts, and the Higgs boson is still like the biggest thing that they've done.
[01:21:43.440 --> 01:21:51.040] They've made a lot of discoveries, but how is your optimism in the future for being able to reveal some new physics beyond the standard model?
[01:21:51.200 --> 01:22:03.920] Do you think we'll ever, in a reasonable amount of time, get to a regime where we can discover new physics, or is it probably forever beyond technology that we could build and finance to discover?
[01:22:04.000 --> 01:22:06.720] Is it just too far beyond us for a long time?
[01:22:06.720 --> 01:22:11.920] It's a very good question, and the answer is for the first time in the history of particle physics, we don't know.
[01:22:12.560 --> 01:22:13.360] It's a little scary, right?
[01:22:13.360 --> 01:22:14.240] It's a little scary.
[01:22:14.240 --> 01:22:19.440] I mean, so the LHC, I should say, it was, why was it built at that energy?
[01:22:19.440 --> 01:22:29.040] It was built at that energy because we knew that the standard model without a light Higgs broke down, mathematically speaking, at those energies.
[01:22:29.040 --> 01:22:36.280] So, what that means in reality is you either discover a Higgs boson of some kind or some other mechanism.
[01:22:37.000 --> 01:22:43.640] In fact, I worked on physics without a light Higgs at the LHC before we turned the LHC on.
[01:22:43.640 --> 01:22:47.160] So, signatures, the model breaks down.
[01:22:47.800 --> 01:22:52.760] One of the places it breaks down most obviously is in the scattering of W bosons, for example.
[01:22:52.760 --> 01:22:56.600] So, you can bang W bosons together, WW scattering, it's called.
[01:22:57.000 --> 01:23:08.280] And if you calculate that process without a Higgs boson in the theory, then it gives you nonsense, basically, energies of actually 1.4 TV, right?
[01:23:08.280 --> 01:23:11.560] So, well within the scope of the LHC.
[01:23:11.880 --> 01:23:19.080] So, that's why you could, with absolute confidence, build that machine, because you knew you were going to discover something.
[01:23:19.080 --> 01:23:28.760] It's also true to say many of my colleagues who've been in parcel physics longer than I have would say, but it's also true that most of the machines that we built discovered things they weren't built to discover.
[01:23:31.320 --> 01:23:38.200] But at least you knew there was an energy threshold which was within the scope of that machine, that you'd see something.
[01:23:38.200 --> 01:23:40.280] Right, and justify the billions of dollars spent, isn't it?
[01:23:40.440 --> 01:23:48.680] Yeah, now you're absolutely right now that at the moment with LHC, it's exciting that we're in a precision physics regime.
[01:23:48.680 --> 01:23:51.560] So, we're still looking for, obviously, new particles.
[01:23:51.800 --> 01:23:55.000] Most people, I think, would have put money on supersymmetry.
[01:23:55.000 --> 01:23:55.640] So, super symmetric.
[01:23:56.120 --> 01:23:58.360] That was my field when I was in the super string theory.
[01:23:58.360 --> 01:23:59.400] It comes out of string theory.
[01:24:00.760 --> 01:24:03.560] I'm sure that nature is supersymmetric at some point.
[01:24:04.280 --> 01:24:05.880] It seems very plausible, right?
[01:24:06.120 --> 01:24:10.960] And also plausible that the LHC might have seen some LHC.
[01:24:11.680 --> 01:24:12.280] Nothing.
[01:24:12.760 --> 01:24:15.600] So, you're right that what do you do?
[01:24:15.600 --> 01:24:24.480] I mean, it's true that particle physics goes in phases where you then go into a precision measurement phase, like the accelerator before it, LEP, in the same tunnel.
[01:24:24.880 --> 01:24:31.600] You know, we were making high-precision measurements on the W and Z bosons, and that was necessary information.
[01:24:31.600 --> 01:24:35.200] So you kind of go to a Higgs factory type model, for example.
[01:24:35.200 --> 01:24:45.200] If there's nothing else, you start making high-precision measurements on the Higgs, which is what we're doing with the LHC at the moment with the upgrades while still looking for new signatures.
[01:24:45.440 --> 01:24:47.600] So we know that it's not complete, right?
[01:24:47.680 --> 01:24:50.880] We know that we don't know where the energy scale is.
[01:24:50.880 --> 01:24:51.360] Yeah, right.
[01:24:51.680 --> 01:24:53.200] So we don't know how big to make the next one.
[01:24:53.200 --> 01:24:54.480] Is that basically the answer?
[01:24:54.480 --> 01:24:57.040] Yeah, but it's not just big, it's the type.
[01:24:57.040 --> 01:24:59.040] So you could do a muon collider or something like that.
[01:24:59.360 --> 01:25:02.160] If you're going to do a perionic collider, then that looks pretty good.
[01:25:02.320 --> 01:25:04.400] You'd have to just scale it up, but you can do lots of other stuff.
[01:25:06.080 --> 01:25:17.680] But then again, having said all that, I'm a strong supporter of the big machine, whatever it's currently called, that CERN wants to build the super LHC, which is a 100-kilometer tunnel, I think.
[01:25:18.800 --> 01:25:19.280] Amazing.
[01:25:20.160 --> 01:25:30.560] Because what you do find is that, and I saw it firsthand with LHC, is that there aren't many people who know how to build accelerators on that scale.
[01:25:30.560 --> 01:25:31.920] And they're really difficult.
[01:25:31.920 --> 01:25:36.400] And you can forget, you can lose the expertise, and it's hard.
[01:25:36.400 --> 01:25:43.040] And actually, a lot of the people who worked on the LHC were towards the end of their careers, they're highly experienced people.
[01:25:43.040 --> 01:25:48.000] And so I think there's a very strong argument that it's not a lot of money, actually.
[01:25:48.000 --> 01:25:51.120] When you look at these are decadal projects.
[01:25:51.920 --> 01:25:57.760] We're talking about the machine for 20, 30, 40, 50 years in the future.
[01:25:57.760 --> 01:26:11.480] And so, at the level of a billion dollars a year or something in total, which CERN is, it looks expensive, CERN, but actually, its budget is less than my university, the University of Manchester.
[01:26:11.480 --> 01:26:18.520] So, its yearly budget out of which it builds the machines is of a medium-sized university.
[01:26:18.520 --> 01:26:22.600] It's a lower budget than Harvard and Princeton, those universities, right?
[01:26:23.000 --> 01:26:38.920] So, I think at that level, the idea that the world has this capability to build these machines and builds one of them, and it takes decades to build them, and then you operate it and do good physics with it for 50 years is compelling to me.
[01:26:39.080 --> 01:26:43.640] And, you know, I put money on there being interesting stuff.
[01:26:43.960 --> 01:26:45.880] It's not you know, it's getting theoretical physics.
[01:26:47.240 --> 01:26:48.520] You get into problems.
[01:26:48.520 --> 01:26:51.240] If you don't see it, it's getting narrower and narrower and narrower.
[01:26:53.720 --> 01:27:02.040] But having said all that, as I said earlier, it is the case that we couldn't guarantee it, knowing what we know at the moment.
[01:27:02.360 --> 01:27:06.360] It does seem like, I mean, supersymmetry, the constraints are getting tighter and tighter for that, right?
[01:27:06.360 --> 01:27:07.640] So, who knows what's going to happen with that?
[01:27:07.640 --> 01:27:19.400] But then you have something which is possibly related, although a priori not necessarily, dark matter, where, right, that seems like a much more plausible discovery to me at some point in the nearer future.
[01:27:19.640 --> 01:27:21.880] Sure, and the standard model doesn't say anything about that.
[01:27:21.880 --> 01:27:23.320] So, we need to go beyond.
[01:27:23.320 --> 01:27:25.080] We need to go beyond the standard model.
[01:27:25.080 --> 01:27:25.480] That's right.
[01:27:25.480 --> 01:27:27.000] And is that supersymmetry or something else?
[01:27:27.320 --> 01:27:33.160] I saw this very, there's a very cool work that I heard of the other day from string theory.
[01:27:33.160 --> 01:27:42.080] So, maybe you know it, where you're looking at the parameter space of string theory and looking at this landscape of possibilities.
[01:27:42.080 --> 01:28:00.240] And then saying, I think it's true to say that if you go, if you take the cosmological constant, which is what is it, 10 to the minus 122 ridiculously tiny number, and you use that as the parameter that you don't know why that's the case, but you've used that.
[01:28:00.800 --> 01:28:25.360] I think there's some theories now that are suggesting that you could link that to dark matter in the sense that you get, it can be sort of slight, large-ish extra dimensions about the micron scale that are implied by that low value of the cosmological constant, which, and then the gravitons, that the tower of excited states of gravitons seem to have the right properties to be dark matter.
[01:28:25.360 --> 01:28:25.920] So it was.
[01:28:26.000 --> 01:28:27.680] Oh, the decline states from that.
[01:28:28.160 --> 01:28:28.720] Yeah, I heard it.
[01:28:28.720 --> 01:28:31.760] It's actually, it's on a collaborator of mine.
[01:28:31.760 --> 01:28:32.880] Right, so it's his work.
[01:28:33.280 --> 01:28:33.840] I don't know this work.
[01:28:34.560 --> 01:28:47.040] Yeah, so swampland is kind of the there's a while since I thought about this, so I might be getting this wrong, but there's a question of is everything that's possible out there described by some theoretical model or not?
[01:28:47.360 --> 01:28:50.960] So is it how complete or, you know, is your theoretical framework?
[01:28:50.960 --> 01:28:58.960] And he coined this term swampland to ask what can we actually, you know, what's allowed in terms of the parameter space versus the actual theory.
[01:28:59.280 --> 01:29:02.080] Yeah, so it was, it was, I think there's a review paper.
[01:29:02.080 --> 01:29:02.800] I haven't read it yet.
[01:29:02.800 --> 01:29:04.320] I was only made aware of it the other week.
[01:29:04.320 --> 01:29:06.720] So I'm on the plane back tonight.
[01:29:06.960 --> 01:29:07.120] Okay.
[01:29:08.400 --> 01:29:09.600] But it looks really fascinating.
[01:29:09.600 --> 01:29:13.920] So it's just an example of where there's theoretical progress.
[01:29:14.640 --> 01:29:22.120] And string theory is a good example because I get asked a lot, you know, that people tend to think, oh, it kind of went away, it kind of failed, doesn't it?
[01:29:22.080 --> 01:29:22.800] But it's not.
[01:29:22.800 --> 01:29:25.600] It's a tremendous amount of progress.
[01:29:25.840 --> 01:29:38.920] And with holography as well, which is coming in there and ADSCRT and links to the tiny bit of research that I do into black holes, then it does seem that we're on the verge of, I think, a really exciting transformation.
[01:29:39.560 --> 01:29:40.040] Fundamentally.
[01:29:40.200 --> 01:29:58.040] The way I have always explained it to people is I think the original, so when string theory was first, you know, kind of out there in the mid-80s, some very optimistic people said, you know, we're going to be, we're going to have the, you know, electron mass from first principles in 10 years, which was just completely not true.
[01:29:58.040 --> 01:30:03.160] Because what they wanted is write down the theory, we get to a four-dimensional universe with the standard model, and that's it.
[01:30:03.160 --> 01:30:03.880] That's not true.
[01:30:03.880 --> 01:30:04.520] That didn't happen.
[01:30:04.520 --> 01:30:05.880] I think it's never going to happen.
[01:30:05.880 --> 01:30:12.680] But what string theory does provide is this tremendous toolbox that you can use to understand hard problems, like with holography.
[01:30:12.680 --> 01:30:20.920] And people are using this all the time to study amazing things that we didn't have access to before from a variety of standpoints.
[01:30:20.920 --> 01:30:25.800] Is it a shut up and calculate kind of moment where it's like it works, don't worry about how it relates to reality?
[01:30:26.760 --> 01:30:28.680] I think that's a valid philosophy.
[01:30:29.480 --> 01:30:32.120] What can we calculate with this that we couldn't calculate before?
[01:30:32.200 --> 01:30:34.600] It's also valid to ask, and what does this mean for the real world?
[01:30:35.000 --> 01:30:38.200] Can you use these techniques to actually calculate anything useful?
[01:30:38.200 --> 01:30:39.800] That is an open question right now.
[01:30:39.800 --> 01:30:45.480] There are some people who are just really digging deep and trying to get the standard model out of string theory.
[01:30:45.480 --> 01:30:50.760] Even just getting the standard model with the right particles and masses and interactions, that's very, very hard to do.
[01:30:50.760 --> 01:30:54.760] Guys, if we could just magically have these answers appear in front of us, right?
[01:30:55.240 --> 01:30:57.080] What's the practicality behind it?
[01:30:57.080 --> 01:31:04.520] Is the goal here to just understand how the world works, or are there actual applications that people like me could relate to?
[01:31:04.840 --> 01:31:06.200] Look what flowed from quantum mechanics.
[01:31:06.200 --> 01:31:08.040] I mean, the whole industry is a very good idea.
[01:31:08.120 --> 01:31:09.000] I'm not talking to you.
[01:31:09.000 --> 01:31:09.800] I'm not talking to you.
[01:31:12.280 --> 01:31:28.160] So, one of the fascinating areas, which I've been involved in a little bit, so I work with, I have a co-supervisor PhD student at Manchester, who happens to be funded, by the way, by an information technology company that is working on black holes, quantum information.
[01:31:29.120 --> 01:31:44.000] There's a direct link between at least the techniques that have been developed to try to understand things like the black hole information paradox and the techniques you use to build error correction codes, quantum computers, you know, to try to protect the memory from errors and so on.
[01:31:44.560 --> 01:31:58.320] So, if you'd have said, and I say this to funding agencies when I speak to them, if I'd have said to you, fund research into collapsed stars because that will help you build quantum computers and understand them, then they would have just laughed at you.
[01:31:59.440 --> 01:32:02.720] But it turns out that the skill sets, at least, are the same.
[01:32:02.720 --> 01:32:26.960] And it could, I mean, basically, you know, it's not only that, it is that this field of emergent space-time, which is very popular at the moment, and where essentially what you have, the simple way to say it is you have space-time emerging from a quantum theory, from quantum entanglement of some objects, which are probably at the scale from the Bekenstein entropy, right?
[01:32:26.960 --> 01:32:31.600] The scale that you tile the event horizon to work out the entropy of a black hole.
[01:32:31.600 --> 01:32:34.400] It's probably, we probably know the distance scales, actually.
[01:32:35.040 --> 01:32:36.960] It's probably string, it's a string scale.
[01:32:38.320 --> 01:32:49.200] But so, the idea that space-time is emerging from entanglement, that's becoming an experimental science now.
[01:32:49.360 --> 01:32:50.400] I find it interesting.
[01:32:50.400 --> 01:33:02.760] I find it fun that the use, probably the best use from a physicist perspective of quantum computers, like the Google quantum computer and Microsoft, is not actually as a quantum computer, but as a load of qubits.
[01:32:59.760 --> 01:33:05.320] Because they're really good arrays of qubits.
[01:33:05.640 --> 01:33:08.600] That's not why they spent billions of dollars building the things.
[01:33:08.600 --> 01:33:10.440] But if you're a physicist, you go, this is brilliant.
[01:33:10.440 --> 01:33:11.480] I've got a load of qubits.
[01:33:12.200 --> 01:33:14.840] And there was a paper recently where there's this filament.
[01:33:14.840 --> 01:33:15.640] I know you saw it.
[01:33:17.480 --> 01:33:23.240] So it's in a particular configuration of the qubits, a particular entanglement structure of the network.
[01:33:23.480 --> 01:33:28.760] You get something that you could interpret as a filament of space emerging.
[01:33:29.080 --> 01:33:29.640] Oh, my God.
[01:33:29.800 --> 01:33:32.760] It's sometimes described as a one-dimensional wormhole.
[01:33:32.760 --> 01:33:33.160] Right.
[01:33:34.040 --> 01:33:35.800] But that's a remarkable paper.
[01:33:36.680 --> 01:33:37.880] It's a published paper.
[01:33:37.880 --> 01:33:38.680] You can look at it.
[01:33:38.680 --> 01:33:42.280] There's some controversy about if that's the right interpretation of it.
[01:33:42.280 --> 01:33:48.520] But it's fascinating that quantum gravity is becoming an experimental science potential.
[01:33:48.600 --> 01:33:50.200] We seem to be on the verge of that.
[01:33:50.200 --> 01:33:51.400] For years now, too.
[01:33:51.960 --> 01:33:56.520] I mean, another question that a lot of people say, oh, it's nonsense, extra dimensions, right?
[01:33:56.520 --> 01:34:09.080] There's a, I don't know if they're still doing it, but for a while there was a group, the Adelberger group in Seattle, which was, so the idea is that if you have access to extra dimensions at very small scales, you'll see deviations from the inverse square law of gravity, right?
[01:34:09.080 --> 01:34:11.960] Because the gravitational flux can spread into the extra dimensions.
[01:34:11.960 --> 01:34:19.080] So what they were doing is they were moving things together in very tiny distance scales and checking for deviations from one over r squared.
[01:34:19.160 --> 01:34:22.360] There's one extra dimension, then it's one over r cubed, et cetera, et cetera.
[01:34:24.360 --> 01:34:25.880] If they had, you would know about it.
[01:34:26.600 --> 01:34:27.880] I haven't talked about it.
[01:34:28.520 --> 01:34:30.600] I've rushed this paper from that I heard of that.
[01:34:30.600 --> 01:34:33.240] I said I haven't read in detail yet, so I'll read it tonight.
[01:34:33.480 --> 01:34:42.800] But it does suggest that there may be a large-ish dimension at the micron scale, which so it'd be interesting to see what the experimental because they can put downs on it.
[01:34:42.800 --> 01:34:45.200] It's like the number and size of the extra dimensions.
[01:34:45.680 --> 01:34:50.720] Does that relate to the as an explanation for the weakness of gravity compared to the other fundamental forces?
[01:34:50.720 --> 01:34:53.840] Is that like kind of leaking into these other potential dimensions?
[01:34:53.840 --> 01:34:55.520] Is that where you were going with that?
[01:34:56.000 --> 01:34:56.880] No, not quite.
[01:34:56.880 --> 01:34:59.440] I mean, it's not totally unrelated, but it wouldn't be the same thing.
[01:34:59.760 --> 01:35:01.840] Historically, that was a thought, wasn't it?
[01:35:01.840 --> 01:35:02.080] Yeah, yeah.
[01:35:02.960 --> 01:35:08.080] So I'm not sure you know more than me in the modern string theory, whether that's guys.
[01:35:08.400 --> 01:35:12.800] Why do physics change when you go smaller, but they don't change when you go bigger?
[01:35:13.120 --> 01:35:13.840] Why do you?
[01:35:13.840 --> 01:35:14.480] You know what I mean?
[01:35:14.480 --> 01:35:21.200] Like, you know, because we know that when you go, when you get into a certain quantum regime, right?
[01:35:21.200 --> 01:35:23.520] That you're asking why isn't there a new regime of validity?
[01:35:24.320 --> 01:35:29.520] If you just opened up, you made the scale tremendous, would concepts of physics change?
[01:35:29.760 --> 01:35:37.600] Well, don't some people think that's exactly what happens, and that's why dark matter is actually gravity behaving differently in the supervisor.
[01:35:37.840 --> 01:35:40.560] Okay, so we have theories about that.
[01:35:41.040 --> 01:35:42.160] That's theoretically possible.
[01:35:42.160 --> 01:35:43.360] I think that's a minority viewpoint.
[01:35:43.520 --> 01:35:44.640] It's a minority video.
[01:35:46.400 --> 01:35:49.280] You think dark matter is the answer to the observations?
[01:35:49.920 --> 01:35:50.480] Well, I don't.
[01:35:50.480 --> 01:35:52.800] No, I mean, we've operated under the assumption.
[01:35:53.040 --> 01:36:04.560] The thing is, the assumption that it's a weakly interacting particle of some description, that fits quite a lot of things, including in particular the cosmic microwave background.
[01:36:04.880 --> 01:36:14.640] It's an important component of the way that these sound waves move through the plasma in the early universe before 380,000 years after the Big Bang.
[01:36:14.960 --> 01:36:17.760] And that so we have very good data there.
[01:36:17.760 --> 01:36:23.520] And essentially, what you're seeing, if you look at those pictures of CMB, you're seeing sound waves going through the plasma.
[01:36:24.000 --> 01:36:35.800] And the presence of some kind of weakly interacting particle in there that does that not electromagnetically interacting is a component of those fits that fit very well.
[01:36:36.120 --> 01:36:39.400] And that fits also gravity rotation curves and all the things as well.
[01:36:39.800 --> 01:36:43.080] So it's a good model.
[01:36:43.080 --> 01:36:47.640] But it's not to say it's right because we finally discovered what it is.
[01:36:47.960 --> 01:36:58.040] But it does fit multiple different independent phenomena that we see, not only the gravitational phenomena, but also the CMB.
[01:36:58.040 --> 01:37:06.760] And it tends to be the case, you said when you modify general relativity, for example, it tends to be the case you can modify it and fit something, but you mess up a lot of other things.
[01:37:07.160 --> 01:37:08.360] It's quite difficult, isn't it?
[01:37:08.600 --> 01:37:09.480] Or near impossible.
[01:37:09.800 --> 01:37:10.200] That's right.
[01:37:10.200 --> 01:37:16.280] And I think if you polled most working physicists now, they would, you know, for a while there was this WIMP versus macho around dark matter.
[01:37:16.600 --> 01:37:21.080] I think it seems fairly consistent that most people would say WIMPs at this point.
[01:37:21.080 --> 01:37:23.640] But you could find some WIMPs and people who want to.
[01:37:24.040 --> 01:37:26.040] Well, I mean, until we know we know, right?
[01:37:26.040 --> 01:37:26.200] Right.
[01:37:27.080 --> 01:37:27.880] Anything is possible.
[01:37:28.200 --> 01:37:36.120] But I find it this idea, clearly dark energy is even more perplexing, to say the least.
[01:37:36.680 --> 01:37:37.000] Certainly.
[01:37:37.640 --> 01:37:54.600] This idea that I find it fascinating that there may be a link between that and, of course, inflation, which looks similar, and the fact that our universe is not is on the edge of stability, because which we know from measurements of the Higgs mass and the top quart mass and so on.
[01:37:54.600 --> 01:38:01.080] So I think there's quite a few of my colleagues you speak to think that maybe these things are related in some way.
[01:38:01.080 --> 01:38:09.240] I mean, inflation looks, it's a very much at the different energy scale, but then you've got the inflation, you've got the Higgs, which is not com contributing to anything, it seems.
[01:38:09.240 --> 01:38:16.560] This scalar field that doesn't blow the universe apart, and then you've got dark energy that's a fine sound.
[01:38:14.520 --> 01:38:19.520] Yeah, so that's the cussing edge at the moment.
[01:38:19.600 --> 01:38:22.640] I think it's one of the most interesting fields in theoretical physics.
[01:38:22.640 --> 01:38:24.640] Trying to understand if those things are the same.
[01:38:24.640 --> 01:38:26.000] Maybe they're not all different.
[01:38:26.880 --> 01:38:35.680] Does it feel to you that we're still missing something absolutely fundamental about the universe that is making it impossible for us to really understand what's going on?
[01:38:35.840 --> 01:38:41.040] I mean, the example, maybe you could talk as well, is holography, I think.
[01:38:41.040 --> 01:38:45.360] You know, ADSCFT, which is an example of a holographic theory.
[01:38:45.360 --> 01:38:49.280] That, I think, is really radical.
[01:38:49.840 --> 01:38:54.960] It's this idea that we can find dual descriptions of reality.
[01:38:55.360 --> 01:38:58.320] So that toolbox surely is going to be.
[01:38:58.800 --> 01:39:07.120] Yeah, the rough idea is that you can use a 10-dimensional string theory to describe essentially a four-dimensional field theory.
[01:39:07.120 --> 01:39:11.040] And we're in the regime where one problem gets hard, the other gets easy.
[01:39:11.040 --> 01:39:18.560] So you can do kind of geometrical calculations in the gravity regime to give you field theory data in the particle thing.
[01:39:18.560 --> 01:39:27.280] And in very special, supersymmetric cases, there's an exact dictionary between, okay, the mass of this baryon is the dimension of this operator.
[01:39:27.280 --> 01:39:35.200] And you can get these really non-trivial matchings, you know, these crazy, like, irrational numbers that, you know, pop up very nicely in both.
[01:39:35.200 --> 01:39:36.720] And it's a wild thing.
[01:39:38.000 --> 01:39:45.840] There's no proof of it per se, but there's so much data to indicate that it's correct that I think it has to be true.
[01:39:45.840 --> 01:39:50.560] Now, does that help us describe our universe is another question entirely.
[01:39:51.040 --> 01:39:52.160] One last question, guys.
[01:39:52.160 --> 01:39:52.640] Who gets it?
[01:39:52.800 --> 01:39:55.440] Will AI help this field in any way?
[01:39:55.560 --> 01:39:58.920] It's a really, it's a question that gets asked a lot, isn't it?
[01:39:58.640 --> 01:40:03.720] And it does, you know, in data analysis, then you'd have to say yes, right?
[01:40:04.200 --> 01:40:06.120] Large data sets and so on.
[01:40:06.680 --> 01:40:14.840] Whether or not creating new physical things, like asking Chat GPT to build a quantum theory of gravity, right?
[01:40:15.400 --> 01:40:16.680] That's a different thing.
[01:40:17.240 --> 01:40:19.320] I don't know what the answer is to that.
[01:40:19.560 --> 01:40:20.440] Have you tried plugging that in?
[01:40:21.560 --> 01:40:22.200] You can try.
[01:40:22.440 --> 01:40:23.160] It won't do it.
[01:40:24.200 --> 01:40:36.680] Big thing, so I left physics about 10 years ago, and one of the big things that I see different that a lot of my colleagues are doing is a lot of them are doing machine, applying machine learning to complicated systems to see what they can do.
[01:40:36.680 --> 01:40:41.080] So it's not AI in a solve the problem, it's going to accelerate the research.
[01:40:41.080 --> 01:40:41.400] That's right.
[01:40:41.800 --> 01:40:45.800] It's kind of like what I would say a good analogy is what AI did with protein pulp.
[01:40:45.800 --> 01:40:46.040] Yeah.
[01:40:46.040 --> 01:40:46.440] Right, okay.
[01:40:46.440 --> 01:40:47.640] So it could speed things up a lot.
[01:40:47.720 --> 01:40:48.840] It could speed things up in that way.
[01:40:49.480 --> 01:40:53.080] So Brian, tell us about your current project, the Horizons show that you're doing.
[01:40:53.080 --> 01:40:55.800] Yeah, I've been doing these, as you said, show.
[01:40:56.120 --> 01:41:02.920] My friend Robin Ins, comedian that I work with on the VBC, said, you know, you should call it a lecture.
[01:41:02.920 --> 01:41:04.840] And they say, not at those ticket prices, though.
[01:41:04.920 --> 01:41:05.880] You can't call it a lecture.
[01:41:07.000 --> 01:41:07.880] So it is true.
[01:41:08.200 --> 01:41:13.000] So I ended up developing this live show, which is big LED screens, basically.
[01:41:13.560 --> 01:41:16.680] And then many of the concepts we've just discussed, actually.
[01:41:16.920 --> 01:41:18.680] So it has become a show.
[01:41:19.000 --> 01:41:21.720] And it was built for arenas in the UK.
[01:41:21.720 --> 01:41:26.280] So we've done 14,000, 15,000 people in the Ota Arena and Wembley and things like that.
[01:41:27.160 --> 01:41:28.680] Stadium cosmology.
[01:41:30.280 --> 01:41:33.640] And so I've enjoyed doing it a lot.
[01:41:34.200 --> 01:41:35.200] We've done it.
[01:41:35.200 --> 01:41:39.160] Over 400,000 people have seen the show across the world in the last year.
[01:41:39.240 --> 01:41:40.680] That does make me feel good about humanity.
[01:41:40.680 --> 01:41:43.640] They won't get that many people to sit for a silent show.
[01:41:44.040 --> 01:41:47.440] 15,000 people listening to you describe a Penrose diagram.
[01:41:44.840 --> 01:41:48.400] It's kind of strange.
[01:41:50.080 --> 01:41:58.640] So just to finish it off, because we came to the US actually very early on in the development of it, and it was just after COVID, and we did some small places.
[01:41:58.640 --> 01:42:00.080] And I just wanted to finish it off.
[01:42:00.240 --> 01:42:09.360] So we're bringing it back at the end of April, start of May to a few cities: LA, San Francisco, New York, Chicago, Seattle, Portland.
[01:42:11.120 --> 01:42:17.600] And it's just me really saying I want to say goodbye to this, this particular show.
[01:42:17.600 --> 01:42:23.760] And I'd like to do it here because we started here, in a sense, a long time ago, and it's changed a lot.
[01:42:23.760 --> 01:42:25.600] Where can people find dates and get tickets?
[01:42:25.840 --> 01:42:29.120] There's a website called Briancoxlive.co.uk.
[01:42:29.120 --> 01:42:32.480] It's.co.uk because I think brian.coxlive.com.
[01:42:32.480 --> 01:42:33.760] I don't know, probably we couldn't get it.
[01:42:34.960 --> 01:42:35.840] That doesn't matter, does it?
[01:42:35.840 --> 01:42:38.160] It says briancoxlive.co.uk.
[01:42:38.160 --> 01:42:39.600] And the tickets are there.
[01:42:39.600 --> 01:42:52.960] And, you know, we might try and extend it a bit and come to, you know, I've never done a show in Vegas, for example, so I think we could just ask David Copperfield to move out for a night or something and put it in there.
[01:42:53.200 --> 01:42:53.840] But yeah.
[01:42:54.240 --> 01:42:55.360] Well, Brian, this has been awesome.
[01:42:55.360 --> 01:42:56.720] Thank you so much for sitting down with us.
[01:42:56.880 --> 01:42:57.200] Thank you.
[01:42:57.360 --> 01:42:57.920] Fantastic.
[01:42:57.920 --> 01:42:58.320] Thank you.
[01:42:58.480 --> 01:42:58.720] Thanks.
[01:42:58.880 --> 01:42:59.760] All right.
[01:43:02.640 --> 01:43:07.840] It's time for science or fiction.
[01:43:12.640 --> 01:43:26.560] Each week I come up with three science news items or facts, two real and one fake, and I challenge my panel of skeptics to tell me which one is the fake, because panel is the plural of skeptics or the collective.
[01:43:26.560 --> 01:43:27.440] All right, you guys ready?
[01:43:27.440 --> 01:43:28.640] Three regular news items.
[01:43:28.640 --> 01:43:29.040] Yep.
[01:43:29.040 --> 01:43:29.760] Here we go.
[01:43:29.960 --> 01:43:45.560] Item number one: new research finds that higher penetration of weather-dependent renewable energy sources, wind and solar, on the grid does not increase vulnerability to blackouts and reduces their severity when they occur.
[01:43:45.560 --> 01:43:56.040] Item number two: a recent study finds that coyotes are thriving in North America, and in fact, direct hunting by humans results in larger populations.
[01:43:56.040 --> 01:44:05.960] And item number three: a population-based cohort study of preterm infants finds no significant economic or educational effects lasting into adulthood.
[01:44:05.960 --> 01:44:07.400] Bob, go first.
[01:44:07.400 --> 01:44:13.400] Okay, so wind and solar do not increase the vulnerability of blackouts.
[01:44:13.720 --> 01:44:15.160] Yeah, that kind of makes sense.
[01:44:15.560 --> 01:44:19.400] I think I'm just going to buy that, although I haven't read anything specific about that.
[01:44:20.040 --> 01:44:27.320] Coyotes are thriving in North America, and direct hunting by humans results in larger populations.
[01:44:27.640 --> 01:44:29.080] I don't know about that last bit.
[01:44:29.640 --> 01:44:33.080] Let's see, I did not even absorb this third one at all.
[01:44:33.080 --> 01:44:41.640] Population-based cohort study of preterm infants finds no significant economic or educational effects lasting into adulthood.
[01:44:41.640 --> 01:44:45.960] So basically, being preterm has no bad side effects.
[01:44:45.960 --> 01:44:47.400] Lasting into adulthood.
[01:44:47.400 --> 01:44:52.680] All right, tell me everything that you can say about these three for the first person going.
[01:44:53.000 --> 01:44:55.080] They're all pretty self-explanatory.
[01:44:55.080 --> 01:44:56.760] All right, I'm going to say the wolves.
[01:44:56.840 --> 01:44:57.800] The coyotes, you mean?
[01:44:58.040 --> 01:44:58.760] Fiction.
[01:44:58.760 --> 01:44:59.480] Wolves, coyotes.
[01:44:59.560 --> 01:45:00.520] Wait, you said coyotes?
[01:45:00.520 --> 01:45:01.680] It's coyotes and wolves.
[01:45:02.360 --> 01:45:03.000] Okay.
[01:45:03.320 --> 01:45:04.440] I'll say that's fiction.
[01:45:04.440 --> 01:45:05.160] All right, Jay.
[01:45:05.320 --> 01:45:06.280] Because I feel like it.
[01:45:06.280 --> 01:45:11.960] This first one about the higher penetration of weather-dependent renewable energy sources.
[01:45:11.960 --> 01:45:16.160] It doesn't increase the vulnerability to blackouts and reduces their severity when they occur.
[01:45:14.920 --> 01:45:20.320] I mean, yeah, I mean, I don't think it would increase vulnerability.
[01:45:20.640 --> 01:45:26.480] I think it, but these are, you know, these are things that definitely need electricity and have a lot of wiring and everything.
[01:45:26.480 --> 01:45:33.600] Yeah, so this is a really interesting thing here, Steve, because you're saying increase, they do not increase vulnerability to blackouts.
[01:45:33.600 --> 01:45:36.880] So there aren't more blackouts because of them.
[01:45:36.880 --> 01:45:37.360] Right.
[01:45:37.360 --> 01:45:40.480] And when blackouts do occur, they're less severe.
[01:45:40.480 --> 01:45:41.120] Yeah, all right.
[01:45:41.120 --> 01:45:42.080] I think that's science.
[01:45:42.080 --> 01:45:43.040] That just makes sense.
[01:45:43.040 --> 01:45:52.400] It took me a while to parse through it because at first, for some reason, I thought you meant during like an EMP or some solar flare or something, but that was just mistaken.
[01:45:52.400 --> 01:45:54.960] Okay, so that one, to me, 100% science.
[01:45:54.960 --> 01:46:02.480] The second one, a recent study, finds out that coyotes are thriving in North America, and in fact, direct hunting by humans results in larger populations.
[01:46:02.480 --> 01:46:06.880] So why would direct hunting by humans result in larger populations?
[01:46:06.880 --> 01:46:08.960] Maybe because they're killing off the weak ones?
[01:46:10.080 --> 01:46:11.680] Is that possibly be it?
[01:46:11.680 --> 01:46:12.880] That one's a maybe.
[01:46:12.880 --> 01:46:14.960] That doesn't seem to track with me.
[01:46:14.960 --> 01:46:23.440] The third one, a population-based cohort study of preterm infants finds no significant economic or educational effects lasting into adulthood.
[01:46:23.440 --> 01:46:24.720] Preterm infants.
[01:46:24.720 --> 01:46:28.080] Okay, so does it matter how early they are?
[01:46:28.480 --> 01:46:30.240] Yeah, there's a cutoff.
[01:46:30.240 --> 01:46:31.280] Okay, so preterm.
[01:46:31.600 --> 01:46:33.440] To qualify as preterm.
[01:46:33.440 --> 01:46:34.880] Can you tell me what the cutoff is?
[01:46:34.880 --> 01:46:38.800] So if you're born before 37 weeks, you are considered preterm.
[01:46:38.800 --> 01:46:40.960] Okay, so yeah, that's early.
[01:46:40.960 --> 01:46:42.640] I wish I knew that.
[01:46:42.640 --> 01:46:44.240] Would that have changed your answer?
[01:46:44.240 --> 01:46:45.840] Nah, just saying it.
[01:46:46.480 --> 01:46:48.000] All right, so let's think about this.
[01:46:48.000 --> 01:46:55.680] So, if a baby comes out early, I would imagine that there's reasons why that happens, but it doesn't necessarily mean that there's something wrong with the baby.
[01:46:55.680 --> 01:47:02.120] So, then it comes down to, you know, I mean, guys, what I'm about to say, I am not an expert, right?
[01:46:59.840 --> 01:47:07.000] I'm thinking off the cuff here, the baby's kind of nutrition changes.
[01:47:07.320 --> 01:47:11.080] I don't think nutrition is a maximal problem here.
[01:47:11.080 --> 01:47:16.840] I mean, there might be other things that the baby would be getting from the mother that could affect its development.
[01:47:16.840 --> 01:47:21.080] And, you know, babies can't breathe until they hit a certain age range.
[01:47:21.080 --> 01:47:23.320] So the baby would, that could be a factor.
[01:47:23.720 --> 01:47:25.080] This is a messy one.
[01:47:25.080 --> 01:47:33.560] I think that today, with proper care, babies are okay if they're born in Steve's preterm timeframe.
[01:47:33.560 --> 01:47:39.320] But this whole thing about like, you know, hunting the coyotes, I just, something rubbs me the wrong way.
[01:47:39.400 --> 01:47:40.680] That one's a fiction.
[01:47:40.680 --> 01:47:41.720] Okay, Evan.
[01:47:41.720 --> 01:47:47.080] Well, all right, the grid.
[01:47:47.080 --> 01:47:51.160] And it not increasing vulnerability to blackouts.
[01:47:51.160 --> 01:47:52.040] I believe that.
[01:47:52.040 --> 01:47:59.960] But this part about reducing their severity when they occur, I'm having a hard time understanding why that's the case.
[01:47:59.960 --> 01:48:03.560] Why would it reduce the severity of the blackout?
[01:48:03.560 --> 01:48:05.320] Yeah, hmm.
[01:48:06.280 --> 01:48:09.960] I'm going to be interested to hear that one if that one turns out to be science.
[01:48:09.960 --> 01:48:12.600] I have a feeling it's kind of fiction-y.
[01:48:12.600 --> 01:48:16.040] Coyotes, I have no idea, thriving in North America.
[01:48:16.040 --> 01:48:16.920] I don't know.
[01:48:16.920 --> 01:48:21.720] And in fact, direct hunting by humans results in larger populations.
[01:48:21.720 --> 01:48:23.000] Why would that be?
[01:48:23.000 --> 01:48:25.720] Because they cluster more?
[01:48:25.720 --> 01:48:28.520] And then is that how that will work?
[01:48:28.600 --> 01:48:29.400] Would work?
[01:48:29.400 --> 01:48:31.160] Because it creates pockets?
[01:48:31.160 --> 01:48:31.720] I don't know.
[01:48:31.720 --> 01:48:33.160] I don't know about that one either.
[01:48:33.320 --> 01:48:34.840] The last one, oh boy.
[01:48:34.840 --> 01:48:38.040] Preterm infants, no significant economic.
[01:48:38.520 --> 01:48:40.760] That one's like the one I know the least about.
[01:48:41.560 --> 01:48:42.760] They're all fiction.
[01:48:42.760 --> 01:48:43.480] Thank you.
[01:48:43.480 --> 01:48:43.880] Okay.
[01:48:44.200 --> 01
Prompt 6: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 7: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
Prompt 9: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 3 of 3 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
.080 --> 01:47:33.560] I think that today, with proper care, babies are okay if they're born in Steve's preterm timeframe.
[01:47:33.560 --> 01:47:39.320] But this whole thing about like, you know, hunting the coyotes, I just, something rubbs me the wrong way.
[01:47:39.400 --> 01:47:40.680] That one's a fiction.
[01:47:40.680 --> 01:47:41.720] Okay, Evan.
[01:47:41.720 --> 01:47:47.080] Well, all right, the grid.
[01:47:47.080 --> 01:47:51.160] And it not increasing vulnerability to blackouts.
[01:47:51.160 --> 01:47:52.040] I believe that.
[01:47:52.040 --> 01:47:59.960] But this part about reducing their severity when they occur, I'm having a hard time understanding why that's the case.
[01:47:59.960 --> 01:48:03.560] Why would it reduce the severity of the blackout?
[01:48:03.560 --> 01:48:05.320] Yeah, hmm.
[01:48:06.280 --> 01:48:09.960] I'm going to be interested to hear that one if that one turns out to be science.
[01:48:09.960 --> 01:48:12.600] I have a feeling it's kind of fiction-y.
[01:48:12.600 --> 01:48:16.040] Coyotes, I have no idea, thriving in North America.
[01:48:16.040 --> 01:48:16.920] I don't know.
[01:48:16.920 --> 01:48:21.720] And in fact, direct hunting by humans results in larger populations.
[01:48:21.720 --> 01:48:23.000] Why would that be?
[01:48:23.000 --> 01:48:25.720] Because they cluster more?
[01:48:25.720 --> 01:48:28.520] And then is that how that will work?
[01:48:28.600 --> 01:48:29.400] Would work?
[01:48:29.400 --> 01:48:31.160] Because it creates pockets?
[01:48:31.160 --> 01:48:31.720] I don't know.
[01:48:31.720 --> 01:48:33.160] I don't know about that one either.
[01:48:33.320 --> 01:48:34.840] The last one, oh boy.
[01:48:34.840 --> 01:48:38.040] Preterm infants, no significant economic.
[01:48:38.520 --> 01:48:40.760] That one's like the one I know the least about.
[01:48:41.560 --> 01:48:42.760] They're all fiction.
[01:48:42.760 --> 01:48:43.480] Thank you.
[01:48:43.480 --> 01:48:43.880] Okay.
[01:48:44.200 --> 01:48:45.600] And Kara says.
[01:48:45.920 --> 01:48:46.320] Darn it.
[01:48:46.560 --> 01:48:47.600] Automatic failure, but okay.
[01:48:47.600 --> 01:48:49.120] No, that's not an automatic failure.
[01:48:49.120 --> 01:48:50.240] That's an automatic win.
[01:48:50.240 --> 01:48:51.120] It's a forfeit.
[01:48:51.120 --> 01:48:54.400] I'm going to use my get out of jail free card on this episode.
[01:48:54.400 --> 01:48:55.920] Don't we all get one for the year?
[01:48:55.920 --> 01:48:56.320] No.
[01:48:57.280 --> 01:48:57.600] All right.
[01:48:57.600 --> 01:48:59.600] How about the preterm infants one?
[01:49:00.320 --> 01:49:07.360] I can't put my finger on it, but I think there's something, there's maybe an educational effect that lasts into adulthood that they've found.
[01:49:07.360 --> 01:49:09.040] So I'll say that one's the fiction.
[01:49:09.040 --> 01:49:10.080] Okay, Kara.
[01:49:10.080 --> 01:49:12.800] I think coyotes is science.
[01:49:12.800 --> 01:49:13.920] They are thriving.
[01:49:13.920 --> 01:49:16.800] They are everywhere, at least in LA.
[01:49:16.800 --> 01:49:27.520] I think that there are many examples of what's called conservation hunting, where culls are done intentionally to maintain or even grow populations.
[01:49:27.520 --> 01:49:34.240] I think it's about distribution more than anything because animals are not, you know, spread out evenly.
[01:49:34.240 --> 01:49:46.720] I think that the weather-dependent, so like wind and solar energy being in higher numbers, I don't think it would increase vulnerability to blackouts.
[01:49:46.720 --> 01:49:56.960] I know that the problem was with the severity, reducing the severity when they occur, but maybe they're just faster to get online, you know, or maybe they're cheaper and easier to fix, like who knows.
[01:49:56.960 --> 01:50:03.440] So yeah, the one that's really, really bothering me is this idea, because you said a study of preterm infants.
[01:50:03.440 --> 01:50:11.280] So just because 37 weeks is the cutoff doesn't mean that we're not also talking about babies that are born at 25, 26 weeks.
[01:50:11.280 --> 01:50:14.960] Babies that are born early do not have fully developed organs.
[01:50:14.960 --> 01:50:16.000] They're sick.
[01:50:16.000 --> 01:50:17.360] They need surgery.
[01:50:17.360 --> 01:50:19.280] They undergo a lot of treatments.
[01:50:19.280 --> 01:50:24.160] There is no way that that doesn't affect them economically well into adulthood.
[01:50:24.160 --> 01:50:26.720] This one just seems impossible to be science.
[01:50:26.720 --> 01:50:28.720] So I have to say that that's fiction.
[01:50:28.720 --> 01:50:29.120] All right.
[01:50:29.120 --> 01:50:30.840] So you all agree with the first one.
[01:50:29.920 --> 01:50:31.720] So we'll start there.
[01:50:31.880 --> 01:50:42.840] New research finds that higher penetration of weather-dependent renewable energy sources, wind and solar, on the grid does not increase vulnerability to blackouts and reduces their severity when they occur.
[01:50:42.840 --> 01:50:45.000] You all think this one is science.
[01:50:45.000 --> 01:50:48.120] And this one is science.
[01:50:48.120 --> 01:50:48.840] This is science.
[01:50:49.400 --> 01:50:50.120] That's great.
[01:50:50.120 --> 01:51:05.480] Now, this was part of the reason for this study was, I don't know if you guys remember the whole Texas blackout thing where they were blaming the renewable resources on the grid when it was in fact the coal-fired plants or gas plants that were going down.
[01:51:05.720 --> 01:51:17.800] So what they found was that, yeah, that having renewables, weather-dependent renewables on the grid does not make it more vulnerable, even to weather-based events, right?
[01:51:17.800 --> 01:51:26.520] So even if there's a storm or whatever, it doesn't make it more likely for these sources of energy to go down than traditional sources.
[01:51:26.760 --> 01:51:38.040] And when a blackout does occur, the amount of people who lose power is less because this is a more distributed power source and they do come back more quickly.
[01:51:38.040 --> 01:51:43.080] So yeah, it actually lends resiliency, if anything, to the grid.
[01:51:43.640 --> 01:51:44.440] Yeah, it's great.
[01:51:44.680 --> 01:51:46.280] I guess we'll just keep going in order.
[01:51:46.280 --> 01:51:52.040] A recent study finds that coyotes are thriving in North America, and in fact, the direct hunting by humans results in large populations.
[01:51:52.040 --> 01:51:54.760] Bob and Jay, you think this one is the fiction.
[01:51:54.760 --> 01:51:57.160] Evan and Carrie, you think this one is science.
[01:51:57.160 --> 01:52:04.840] And I guess the question comes down to how could directly hunting coyotes increase their population?
[01:52:05.080 --> 01:52:07.720] This wasn't a call to increase their population.
[01:52:07.720 --> 01:52:11.640] This is like when you try to reduce the population by hunting them.
[01:52:11.640 --> 01:52:13.400] And then backfired.
[01:52:13.400 --> 01:52:14.560] And it backfired, yeah.
[01:52:14.560 --> 01:52:15.840] So this is science.
[01:52:15.840 --> 01:52:16.640] This is science.
[01:52:14.280 --> 01:52:18.320] Yes.
[01:52:18.960 --> 01:52:22.080] This one is science because, yeah, this is all true.
[01:52:22.480 --> 01:52:34.400] What they think is happening is that even in locations where they have very liberal coyote hunting laws, meaning like it's open season, like there's no restricted season, there's no limit on how many you can kill.
[01:52:34.400 --> 01:52:46.640] And that when those populations actually, over time, increase, and the reason why they think this is happening is that the hunters are disproportionately killing older coyotes.
[01:52:46.640 --> 01:52:50.640] And then the younger coyotes have more resources and they have more pups.
[01:52:50.640 --> 01:52:52.560] And so they're just breeding more.
[01:52:52.560 --> 01:53:00.000] And so essentially their conclusion was that hunting is not an effective population control mechanism for coyotes.
[01:53:00.000 --> 01:53:02.560] They just bounce right back in even bigger numbers.
[01:53:02.560 --> 01:53:06.400] So we're disproportionately hunting older coyotes because they're slower and easier to kill.
[01:53:06.640 --> 01:53:08.320] I guess they're easier to kill, yeah.
[01:53:09.120 --> 01:53:10.240] Must be the case.
[01:53:10.240 --> 01:53:11.120] Whoops.
[01:53:11.600 --> 01:53:13.360] This study also looked at a lot of other things.
[01:53:13.360 --> 01:53:33.760] So coyotes also do not suffer when their region overlaps with wolves, except for certain, that's very, that's regionally dependent, but in many locations, basically depends on food supply, they do fine even when they're competing with wolves, but they don't do fine if they're competing with bears.
[01:53:34.160 --> 01:53:40.640] Yeah, it's because coyotes and bears, I think, they both hunt, but they also scavenge.
[01:53:40.640 --> 01:53:42.560] Yeah, so that's that's a good point.
[01:53:42.560 --> 01:54:01.160] So there's actually a mixed effect when coyotes overlap with other predators, there's a decrease in prey availability, but especially the larger predators tend to leave behind carcasses that the coyotes can then scavenge.
[01:53:59.840 --> 01:54:03.560] So they actually increase their food supply sometimes.
[01:54:03.640 --> 01:54:11.400] So depending on a lot of variables, sometimes it actually increases the coyote population because now they have more scavenging available.
[01:54:11.720 --> 01:54:19.240] And bears and coyotes also scavenge their garbage dump scavengers.
[01:54:19.640 --> 01:54:22.600] So they come into urban areas and take food.
[01:54:22.600 --> 01:54:25.640] Yeah, but coyotes are a mesopredator.
[01:54:25.640 --> 01:54:26.920] Have you ever heard that term?
[01:54:26.920 --> 01:54:27.640] They're in the middle.
[01:54:27.640 --> 01:54:28.200] They're not small.
[01:54:28.200 --> 01:54:28.760] They're not large.
[01:54:28.760 --> 01:54:29.480] They're meso.
[01:54:29.720 --> 01:54:30.920] And they're just thriving.
[01:54:30.920 --> 01:54:33.640] They're just adapting really well to human civilization.
[01:54:33.640 --> 01:54:35.000] Doing just fine.
[01:54:35.000 --> 01:54:35.400] All right.
[01:54:35.400 --> 01:54:35.720] All right.
[01:54:35.720 --> 01:54:48.040] That means that a population-based cohort study of preterm infants finds no significant economic or educational effects lasting into adulthood is the fiction because they found significant economic and developmental effects.
[01:54:48.200 --> 01:54:49.320] And educational effects, yes.
[01:54:49.320 --> 01:54:55.160] So they have lower rates of enrolling in a university, graduating from a university with a degree.
[01:54:55.160 --> 01:54:59.560] They have lower income per year, 17% lower.
[01:54:59.880 --> 01:55:02.280] So yeah, there are lasting effects.
[01:55:02.280 --> 01:55:04.360] It's hard to tease out exactly what that might be.
[01:55:04.360 --> 01:55:09.560] Is this due to just they're not as healthy or maybe the neurological development is delayed?
[01:55:09.560 --> 01:55:17.560] And the reason why this is an important research question is because clearly they are starting at a loss, you know, farther back.
[01:55:18.120 --> 01:55:19.960] They have to make up ground.
[01:55:21.000 --> 01:55:27.240] The question is, do they eventually catch up or do their deficits persist?
[01:55:27.240 --> 01:55:30.920] So this was, they were looking at 18 to 28 year olds.
[01:55:30.920 --> 01:55:35.400] They say they had followed them for a long time, like two couple decades.
[01:55:35.720 --> 01:55:37.640] So this is a long, long follow-up.
[01:55:37.680 --> 01:55:38.240] How long?
[01:55:38.280 --> 01:55:39.560] How did you define preterm?
[01:55:39.560 --> 01:55:39.960] How many?
[01:55:40.120 --> 01:55:41.640] 37 before 37.
[01:55:41.640 --> 01:55:42.200] Or less.
[01:55:42.200 --> 01:55:42.600] Or less.
[01:55:42.600 --> 01:55:44.440] So some of them were a lot less, yeah.
[01:55:44.480 --> 01:55:45.280] 37?
[01:55:44.440 --> 01:55:46.000] 37 weeks.
[01:55:46.240 --> 01:55:46.640] Yeah.
[01:55:46.880 --> 01:55:48.400] And that what's what's typical?
[01:55:48.400 --> 01:55:49.440] 40 what?
[01:55:49.920 --> 01:55:50.560] I was.
[01:55:44.840 --> 01:55:52.160] So, I mean, I was four weeks.
[01:55:52.480 --> 01:55:54.000] I was a whole month premature.
[01:55:54.000 --> 01:55:54.400] Yeah.
[01:55:54.400 --> 01:55:56.960] But it obviously gets worse the more premature you are.
[01:55:56.960 --> 01:56:04.400] So they looked at you know, like 34 to 36 weeks, 32 to 33, 28 to 21, 24 to 27.
[01:56:04.720 --> 01:56:08.800] And the effects got worse, you know, the more preterm you were.
[01:56:08.800 --> 01:56:12.400] Yeah, so it wasn't much for like 20 years, 36.
[01:56:12.400 --> 01:56:12.880] You know, that.
[01:56:13.120 --> 01:56:19.600] So 40 is the average, but a full-term baby is 37 or 37 plus, up to 42.
[01:56:19.600 --> 01:56:20.080] Yeah.
[01:56:20.400 --> 01:56:30.640] So, but, you know, but it is still true that with modern medical management, they do fine, but they are starting, you know, at a deficit.
[01:56:30.640 --> 01:56:34.080] And they don't quite make that up, at least not in their 20s.
[01:56:34.080 --> 01:56:39.040] That sounds preterm sounds like way too broad of a term, at least for this specific science or fiction.
[01:56:39.680 --> 01:56:41.840] But it was true, no matter how you slice it.
[01:56:41.840 --> 01:56:45.200] I was biased because I'm preterm and I'm awesome.
[01:56:46.320 --> 01:56:49.040] But think of how much more awesome you would be about because it's a little bit more.
[01:56:49.200 --> 01:56:50.080] Oh my gosh.
[01:56:52.880 --> 01:56:54.160] That's so mean.
[01:56:54.160 --> 01:56:55.520] Right, fighting for resources.
[01:56:55.760 --> 01:56:57.040] You don't want to think about all that stuff.
[01:56:57.040 --> 01:56:57.680] You were a twin.
[01:56:57.680 --> 01:57:02.240] Twins also are at a deficit because your twin was using up some nutritional resources.
[01:57:02.800 --> 01:57:05.120] So you overcame multiple options.
[01:57:05.120 --> 01:57:08.080] And our mother smoked when she was pregnant with all of us.
[01:57:08.080 --> 01:57:08.800] Damn.
[01:57:09.040 --> 01:57:10.240] And that takes a hit as well.
[01:57:10.240 --> 01:57:11.360] And we were exposed when we were kids.
[01:57:11.920 --> 01:57:12.800] Literally takes a hit.
[01:57:12.800 --> 01:57:14.720] So we had a sewer.
[01:57:14.720 --> 01:57:15.200] Yeah.
[01:57:15.200 --> 01:57:16.160] She wasn't a drinker.
[01:57:16.880 --> 01:57:18.160] My parents weren't drinkers.
[01:57:18.560 --> 01:57:21.120] They were not drinkers, but they, yeah, they were smoking.
[01:57:21.200 --> 01:57:21.440] Terrible.
[01:57:21.520 --> 01:57:22.320] They did all that meth.
[01:57:22.560 --> 01:57:23.360] Hated it.
[01:57:23.840 --> 01:57:24.640] Hated it.
[01:57:24.640 --> 01:57:27.600] The worst thing about my childhood was having to deal with that.
[01:57:27.600 --> 01:57:28.160] Yeah.
[01:57:28.160 --> 01:57:29.040] The worst.
[01:57:29.040 --> 01:57:29.640] Yeah.
[01:57:29.200 --> 01:57:33.240] I was like this one annoying constant presence in my childhood.
[01:57:29.520 --> 01:57:35.480] Otherwise, we had, I think our childhood was great.
[01:57:36.280 --> 01:57:37.640] Barely was a blip on my radar.
[01:57:37.720 --> 01:57:38.360] Yeah, I hated it.
[01:57:38.360 --> 01:57:39.400] Totally hated it.
[01:57:39.400 --> 01:57:40.840] All right, Evan, give us a quote.
[01:57:40.840 --> 01:57:53.640] When you get in a tight place and everything goes against you till it seems as though you could not hold on a minute longer, never give up then, for that is just the time and the place that the tide will turn.
[01:57:53.640 --> 01:57:57.560] Very comforting words from Harriet Beecher Stowe.
[01:57:57.560 --> 01:57:57.800] Yeah.
[01:57:58.120 --> 01:58:06.680] American abolitionist and author from the 19th century, and so key and critical to helping end slavery in America.
[01:58:06.920 --> 01:58:11.240] Can't be understated how important she was, overstated how important she was.
[01:58:11.240 --> 01:58:13.800] Political change is a never-ending marathon.
[01:58:13.800 --> 01:58:14.280] Right.
[01:58:14.280 --> 01:58:17.640] So keep this in mind, folks, especially these days.
[01:58:17.720 --> 01:58:17.960] Right.
[01:58:17.960 --> 01:58:19.000] Thanks, Evan.
[01:58:19.000 --> 01:58:19.720] Yep.
[01:58:19.720 --> 01:58:21.720] Well, thank you all for joining me this week.
[01:58:21.720 --> 01:58:22.520] You're welcome, Steve.
[01:58:22.520 --> 01:58:23.080] Thanks, Steve.
[01:58:23.240 --> 01:58:24.040] Thanks, Steve.
[01:58:24.040 --> 01:58:27.880] And until next week, this is your Skeptic's Guide to the Universe.
[01:58:30.200 --> 01:58:36.920] Skeptics Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking.
[01:58:36.920 --> 01:58:41.560] For more information, visit us at the skepticsguide.org.
[01:58:41.560 --> 01:58:45.480] Send your questions to info at the skepticsguide.org.
[01:58:45.480 --> 01:58:56.120] And if you would like to support the show and all the work that we do, go to patreon.com/slash skepticsguide and consider becoming a patron and becoming part of the SGU community.
[01:58:56.120 --> 01:58:59.800] Our listeners and supporters are what make SGU possible.
Prompt 10: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 11: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
Full Transcript
[00:00:00.800 --> 00:00:06.880] 10 years from today, Lisa Schneider will trade in her office job to become the leader of a pack of dogs.
[00:00:06.880 --> 00:00:09.600] As the owner of her own dog rescue, that is.
[00:00:09.600 --> 00:00:17.280] A second act made possible by the reskilling courses Lisa's taking now with AARP to help make sure her income lives as long as she does.
[00:00:17.280 --> 00:00:22.240] And she can finally run with the big dogs and the small dogs, who just think they're big dogs.
[00:00:22.240 --> 00:00:25.920] That's why the younger you are, the more you need AARP.
[00:00:25.920 --> 00:00:29.440] Learn more at aarp.org/slash skills.
[00:00:33.280 --> 00:00:36.480] You're listening to the Skeptic's Guide to the Universe.
[00:00:36.480 --> 00:00:39.360] Your escape to reality.
[00:00:40.000 --> 00:00:42.880] Hello, and welcome to the Skeptic's Guide to the Universe.
[00:00:42.880 --> 00:00:47.520] Today is Wednesday, November 6th, 2024, and this is your host, Stephen Novella.
[00:00:47.520 --> 00:00:49.200] Joining me this week are Bob Novella.
[00:00:49.200 --> 00:00:49.760] Hey, everybody.
[00:00:49.760 --> 00:00:50.960] Kara Santa Maria.
[00:00:50.960 --> 00:00:51.440] Howdy.
[00:00:51.440 --> 00:00:52.240] Jay Novella.
[00:00:52.240 --> 00:00:52.880] Hey, guys.
[00:00:52.880 --> 00:00:54.400] And Evan Bernstein.
[00:00:54.400 --> 00:00:55.440] Good evening, everyone.
[00:00:55.440 --> 00:00:58.320] Let's see what's going on in the world today.
[00:00:58.640 --> 00:01:01.360] Oh, I think it's a waxing crescent moon out.
[00:01:02.240 --> 00:01:02.880] It's warm.
[00:01:02.880 --> 00:01:03.120] Thanks.
[00:01:03.280 --> 00:01:03.920] Very warm.
[00:01:03.920 --> 00:01:05.200] It's unseasonably warm out there.
[00:01:05.600 --> 00:01:10.080] Oh, in Connecticut, it's been just beyond 20 degrees above normal temperature.
[00:01:10.080 --> 00:01:11.280] Oh, yeah, and Trump won the election.
[00:01:12.480 --> 00:01:12.800] Oh, yeah.
[00:01:12.960 --> 00:01:13.680] I forgot about that.
[00:01:13.680 --> 00:01:14.480] Oh, Halloween happened.
[00:01:15.280 --> 00:01:16.240] I didn't forget about that.
[00:01:16.240 --> 00:01:17.040] That was pretty awesome.
[00:01:18.000 --> 00:01:18.800] Thanksgiving coming up.
[00:01:19.920 --> 00:01:20.960] Yeah, some things happened.
[00:01:21.360 --> 00:01:24.080] Yeah, it was like a random assortment of miscellaneous stuff.
[00:01:24.080 --> 00:01:26.880] So, listen, you know, obviously, we're recording this the day after the election.
[00:01:26.880 --> 00:01:30.400] Trump, we just found out, you know, whatever, half a day ago that Trump won.
[00:01:30.400 --> 00:01:31.520] We're still processing it.
[00:01:31.520 --> 00:01:33.760] This is sort of a big thing to wrap your head around.
[00:01:33.760 --> 00:01:34.400] We're not happy.
[00:01:34.400 --> 00:01:36.000] We're not going to pretend we're happy.
[00:01:36.000 --> 00:01:38.240] There's a great deal to be concerned about.
[00:01:38.240 --> 00:01:39.920] He thinks global warming is a hoax.
[00:01:39.920 --> 00:01:43.280] He wants to roll back anything that we're doing about it.
[00:01:43.280 --> 00:01:46.320] He's going to probably put an anti-vaxxer in charge of our health care.
[00:01:46.320 --> 00:01:50.080] I mean, there's a lot to be concerned about, but we're not going to talk about the politics of this.
[00:01:50.080 --> 00:01:56.400] Obviously, there's a lot of scientific and skeptical considerations that we'll be exploring this episode and in the future.
[00:01:56.400 --> 00:01:57.760] But we're going to move on.
[00:01:57.760 --> 00:01:59.040] So, a lot actually happened.
[00:01:59.040 --> 00:02:06.360] It's been two weeks since we've recorded a show, and we went to SciCon last weekend.
[00:01:59.680 --> 00:02:07.160] That was a lot of fun.
[00:02:07.480 --> 00:02:07.960] That was great.
[00:02:08.440 --> 00:02:09.880] Yeah, we had a nice time.
[00:02:09.880 --> 00:02:13.560] Skeptical events are always recharging for me, like getting together.
[00:02:13.560 --> 00:02:15.800] We got to see some of our old friends.
[00:02:15.800 --> 00:02:25.880] I gave a talk on controversies within the skeptical group, things that skeptics disagree with each other about, because I think we should focus on that more at these conferences.
[00:02:25.880 --> 00:02:31.800] We should be working out with each other, you know, in a positive, constructive way, the things that we disagree about.
[00:02:31.800 --> 00:02:36.760] Because while you still learn some stuff, it gets kind of boring talking about the stuff we all agree on over and over again.
[00:02:36.760 --> 00:02:37.400] You know what I mean?
[00:02:37.640 --> 00:02:39.800] And then there's this like big elephant in the room.
[00:02:39.800 --> 00:02:40.280] Right, right.
[00:02:40.280 --> 00:02:42.600] And everyone's like, don't talk about the thing we can't talk about.
[00:02:42.600 --> 00:02:44.680] No, I was like, I just talked about it.
[00:02:44.680 --> 00:02:45.080] Oh, gosh.
[00:02:45.960 --> 00:02:47.240] That's when you should talk about it.
[00:02:47.400 --> 00:02:47.640] Exactly.
[00:02:48.040 --> 00:02:49.080] That was an amazing talk.
[00:02:49.240 --> 00:02:51.960] The best one I've ever heard, Steve, that you've done.
[00:02:51.960 --> 00:02:53.240] Standing ovation.
[00:02:53.240 --> 00:02:56.920] Well, certainly the few accolades and tears afterwards by people.
[00:02:56.920 --> 00:02:59.240] It was amazing to just experience.
[00:02:59.400 --> 00:03:02.360] Well, people were clearly hungry to address that issue, you know.
[00:03:02.360 --> 00:03:05.320] Yes, avoiding it is not the way to do it.
[00:03:05.800 --> 00:03:09.480] Part of the talk was on the biological sex controversy.
[00:03:09.480 --> 00:03:15.800] If you want to read about it, I actually blogged about it on Neurologica because there was a bit of disagreement, which, again, I think is healthy and good.
[00:03:15.800 --> 00:03:17.400] But you can read about that there.
[00:03:17.400 --> 00:03:19.400] We'll probably be putting out a video at some point.
[00:03:19.400 --> 00:03:25.160] SciCon will be putting out just recordings of the talks at some point.
[00:03:25.160 --> 00:03:29.000] So I do think we need to explore these issues.
[00:03:29.000 --> 00:03:39.880] And I think, you know, we can, if, hey, if there's a if there's a group of people who should be able to disagree politely and constructively and focus on logic and evidence, it's us, right?
[00:03:39.880 --> 00:03:43.160] I mean, if we can't do it, who how can we expect anybody to do it?
[00:03:43.160 --> 00:03:44.520] We have to practice what we preach.
[00:03:44.520 --> 00:03:44.920] You know what I'm saying?
[00:03:45.680 --> 00:03:46.240] I agree.
[00:03:46.240 --> 00:03:46.640] Holy.
[00:03:46.640 --> 00:03:47.600] Yeah, absolutely.
[00:03:47.600 --> 00:03:50.160] I mean, it's look, not gonna lie, it's hard.
[00:03:50.160 --> 00:03:51.200] You know, it's hard.
[00:03:51.200 --> 00:03:52.480] We've been doing this a long time.
[00:03:52.480 --> 00:03:58.880] We've been trying to teach people critical thinking, and sometimes the world doesn't bend the way that you'd like it to.
[00:03:58.880 --> 00:04:00.400] And it's quite all used.
[00:04:00.800 --> 00:04:04.080] And it's good to remind ourselves of a lot of these things as well.
[00:04:04.080 --> 00:04:05.440] Yeah, I mean, that was one of the themes as well.
[00:04:05.440 --> 00:04:06.960] It was an exercise in humility.
[00:04:06.960 --> 00:04:10.480] It's like, look, we disagree pretty much along political lines.
[00:04:10.480 --> 00:04:15.440] As much as we like to pretend we are trans partisan or whatever, we're not.
[00:04:15.440 --> 00:04:20.480] You know, we are just as susceptible to ideological bias as anybody else.
[00:04:20.480 --> 00:04:23.360] We just have to be more aware of it and confront it.
[00:04:23.360 --> 00:04:24.000] You know what I mean?
[00:04:24.000 --> 00:04:32.800] That's the whole metacognition angle, not ignore it or pretend like the other side is the only side that's ideologically biased and your side isn't.
[00:04:33.440 --> 00:04:35.360] But motivated reasoning is so powerful.
[00:04:35.360 --> 00:04:42.080] I think that's one of the hardest things, even for seasoned, knowledgeable skeptics to deal with.
[00:04:42.080 --> 00:04:43.120] Because we're good at it.
[00:04:43.120 --> 00:04:56.560] We're good at motivated reasoning, ironically, because people who are able to argue in a sophisticated way and put forth a nuanced argument and have a lot of knowledge at their fingertips are actually better at motivated reasoning.
[00:04:56.560 --> 00:04:58.800] And they're more confident in it as well.
[00:04:59.200 --> 00:05:01.760] So that we have to be very aware.
[00:05:01.760 --> 00:05:04.400] That's like our big Achilles heel in the movement.
[00:05:04.400 --> 00:05:05.600] We have to be very aware of it.
[00:05:05.600 --> 00:05:06.720] Yeah, so that was fun.
[00:05:06.720 --> 00:05:09.280] I think also kind of understanding.
[00:05:09.280 --> 00:05:13.360] Oftentimes we talk about causation and the directionality of causation.
[00:05:13.360 --> 00:05:16.160] It's such an important part of science literacy.
[00:05:16.160 --> 00:05:26.880] And I think we have to remember it when it comes to our own moral standings and our own decision-making around things like policy.
[00:05:27.360 --> 00:05:39.240] Because very often, we may think that something is a decision or an ideology that we came to via evidence when actually we came to it because, well, I agree with them about this.
[00:05:39.240 --> 00:05:41.480] Why wouldn't I agree with them about that?
[00:05:42.200 --> 00:05:51.640] You know, is my position based on my investigation, or am I open to somebody telling me what my position should be?
[00:05:51.640 --> 00:05:54.120] Because, yeah, we're on the same team.
[00:05:54.760 --> 00:06:11.880] That was one of the controversies I've referred to: the idea, the question of can all even ethical and moral questions be resolved with science, or does it require philosophy and morality and ethics, you know, separate from science?
[00:06:12.120 --> 00:06:17.080] I tend to align myself with the philosophers in the movement on this issue.
[00:06:17.080 --> 00:06:23.240] I've engaged with that discussion, and I think the philosophers absolutely crush it.
[00:06:23.240 --> 00:06:27.480] You know, when they come up against the other people, like, we don't need to do philosophy, philosophy is irrelevant to science.
[00:06:27.560 --> 00:06:29.480] Like, no, you're doing it, whether you know it or not.
[00:06:29.480 --> 00:06:30.200] You are doing it.
[00:06:30.200 --> 00:06:31.960] You're just doing it wrong.
[00:06:33.560 --> 00:06:36.600] And then you guys went home, and I went to Dubai.
[00:06:36.600 --> 00:06:39.560] It was the first time in an Arabic country.
[00:06:39.560 --> 00:06:42.360] It was very, you know, it was a good experience.
[00:06:42.360 --> 00:06:44.680] So, Dubai is a very interesting city.
[00:06:44.680 --> 00:06:46.440] It's very wealthy.
[00:06:46.440 --> 00:06:48.440] It's an international city.
[00:06:48.440 --> 00:06:50.120] There was English everywhere.
[00:06:50.120 --> 00:06:55.000] The only one touristy thing I did was entirely in English.
[00:06:55.240 --> 00:06:59.080] But there's this layer of Arabic Muslim culture there as well.
[00:06:59.080 --> 00:07:02.760] So it's kind of like, you're in an alternate universe in a sci-fi movie.
[00:07:02.760 --> 00:07:03.240] You know what I mean?
[00:07:03.240 --> 00:07:04.760] Where everything's the same but different?
[00:07:04.760 --> 00:07:05.080] Yeah.
[00:07:05.640 --> 00:07:11.800] So I gave what turned out to be a nine-hour seminar on scientific skepticism.
[00:07:11.800 --> 00:07:12.760] It's not a lot of time.
[00:07:12.760 --> 00:07:14.520] It went by super fast.
[00:07:14.520 --> 00:07:14.760] Really?
[00:07:15.040 --> 00:07:18.800] I had the perfect audience because they were very smart, very engaged.
[00:07:18.800 --> 00:07:23.680] They knew why they were there and they wanted something out of the conference, but this was all new to them.
[00:07:23.680 --> 00:07:24.800] Oh, yeah, that's exciting.
[00:07:24.800 --> 00:07:25.280] Yeah.
[00:07:25.280 --> 00:07:26.880] So they asked tons of questions.
[00:07:26.880 --> 00:07:29.280] Like every question was anticipating a future slides.
[00:07:29.360 --> 00:07:30.640] Like, that's a good question.
[00:07:30.640 --> 00:07:31.360] We're going to get to that.
[00:07:31.360 --> 00:07:32.240] Or let's talk about that.
[00:07:33.200 --> 00:07:43.680] And you realize when you're doing it, I didn't, I realized when I was writing it, but especially when I'm giving it, what a massive body of knowledge we have amassed under the umbrella of scientific skepticism.
[00:07:43.680 --> 00:07:45.360] There is so much to talk about.
[00:07:45.360 --> 00:07:49.520] Nine hours was nothing, it was scratching the surface.
[00:07:49.840 --> 00:07:54.400] I can't talk nine hours about anything, even if it was counting numbers.
[00:07:54.400 --> 00:07:55.680] I couldn't do it.
[00:07:56.320 --> 00:07:58.480] Well, Bob, I wasn't giving a nine-hour lecture.
[00:07:58.480 --> 00:07:59.360] You have to engage.
[00:07:59.360 --> 00:08:00.480] It was asking questions.
[00:08:00.480 --> 00:08:01.760] They were asking me questions.
[00:08:01.760 --> 00:08:02.880] We were doing demonstrations.
[00:08:02.880 --> 00:08:04.640] I did all the psychological stuff that we do.
[00:08:04.640 --> 00:08:05.280] You know what I mean?
[00:08:05.280 --> 00:08:06.880] It was dynamic.
[00:08:07.120 --> 00:08:07.920] The Socratic method.
[00:08:08.000 --> 00:08:09.280] It wasn't me blabbering for nine hours.
[00:08:09.600 --> 00:08:11.120] Assuming you had breaks, too.
[00:08:11.120 --> 00:08:12.320] And there were breaks in there, yeah.
[00:08:13.200 --> 00:08:16.640] There's three, three-hour sessions with a break in the middle of each session.
[00:08:16.640 --> 00:08:21.600] So I did what I usually do at these longer, longer classes that I give.
[00:08:21.920 --> 00:08:35.840] I gave a quiz that I've come up with, which is a quiz focusing on scientific literacy, critical thinking skills, media savvy, and just like belief in the paranormal, you know, kind of stuff.
[00:08:35.840 --> 00:08:37.360] 30 questions.
[00:08:37.520 --> 00:08:45.760] They were able to do the questions on their phones, and then I was able to get, you know, they did it all in Google Forms, so you get like a chart, pie chart of all the answers.
[00:08:46.000 --> 00:08:46.720] Yeah, everybody, yeah.
[00:08:46.720 --> 00:08:50.480] Yeah, so then we went over them at the end of the seminar, and it was a lot of fun.
[00:08:50.480 --> 00:08:55.120] But what was really interesting is that they basically were average.
[00:08:56.640 --> 00:09:02.280] Their results were aligned with a lot of published surveys about these questions.
[00:09:02.440 --> 00:09:07.320] Like one of the questions was: did humans and non-avian dinosaurs live at the same time?
[00:09:07.320 --> 00:09:09.640] 60% of them got that question wrong.
[00:09:10.440 --> 00:09:14.280] Oh, yeah, so we're all science nerds, right?
[00:09:14.280 --> 00:09:26.840] So, like, to us, these questions are blazingly obvious, but to even well-educated, these are all like CEOs, very power, you know, very high-end, high-achieving business people, but they're not science nerds.
[00:09:26.840 --> 00:09:27.720] So very few of them were.
[00:09:27.720 --> 00:09:29.400] There's a couple engineers thrown in there.
[00:09:29.720 --> 00:09:33.560] And you just realized how compartmentalized knowledge can be, you know?
[00:09:33.560 --> 00:09:34.040] Yeah.
[00:09:34.040 --> 00:09:37.080] Part of the whole theme of the talk was to be humbling.
[00:09:37.400 --> 00:09:43.480] It's like, yeah, even you, you know, the whole idea, like even though you might be very successful in one area, you don't know everything.
[00:09:43.480 --> 00:09:44.840] No one's an expert in everything.
[00:09:44.840 --> 00:09:48.520] And that was part of the demonstration of that fact.
[00:09:49.560 --> 00:09:50.920] You still have a brain.
[00:09:50.920 --> 00:09:51.960] You still have a human brain.
[00:09:51.960 --> 00:09:52.840] What's that, Bob?
[00:09:52.840 --> 00:09:53.720] What was the name of the talk?
[00:09:53.720 --> 00:09:55.080] You don't know Squat?
[00:09:55.080 --> 00:09:59.800] No, it was just, well, the whole theme of the conference was futurism, actually.
[00:09:59.800 --> 00:10:04.440] So it was like science, critical thinking, and the future, or something is what I called it.
[00:10:04.440 --> 00:10:07.160] But I did a little futurism at the end as well, like from our book.
[00:10:07.160 --> 00:10:07.720] Oh, damn.
[00:10:07.720 --> 00:10:08.040] Yeah.
[00:10:08.040 --> 00:10:08.680] Yeah, it was good.
[00:10:08.680 --> 00:10:09.640] It was very good.
[00:10:09.640 --> 00:10:12.680] Oh, so it wasn't specifically a skeptical conference.
[00:10:12.680 --> 00:10:13.640] No, it wasn't.
[00:10:13.960 --> 00:10:15.880] It was really, and it's month-long.
[00:10:15.880 --> 00:10:19.480] It was a month-long conference where people were coming and going, you know.
[00:10:19.480 --> 00:10:24.600] And it was just a series of lectures and seminars and classes on a bunch of topics.
[00:10:24.600 --> 00:10:31.800] The guy who is the dean of the Dubai Future Foundation, Muhammad Qassem, he's a big fan of the show.
[00:10:31.800 --> 00:10:33.080] He's been listening to us for years.
[00:10:33.080 --> 00:10:46.320] He's the one who invited me there, and he wants to inject skepticism into the United Arab Emirates and the whole region, not just the UAE, not just Dubai, but the whole into the whole culture.
[00:10:44.520 --> 00:10:49.600] We may all get invited there in the future.
[00:10:50.480 --> 00:10:53.440] He made it sound like this is the beginning of our relationship.
[00:10:54.000 --> 00:10:55.360] This is sort of a building kind of thing.
[00:10:55.360 --> 00:10:55.760] I love it.
[00:10:55.920 --> 00:10:56.240] Nice.
[00:10:56.240 --> 00:10:56.720] Nice.
[00:10:57.520 --> 00:11:03.120] That's why it's great to like, because we've obviously been operating within our own culture for 30 years.
[00:11:03.120 --> 00:11:10.000] It's nice to go somewhere else where there isn't an organized skeptical movement.
[00:11:10.160 --> 00:11:11.760] It's just virgin territory.
[00:11:11.760 --> 00:11:12.880] It's like, oh, my God.
[00:11:12.880 --> 00:11:14.080] I could be packed in 20 minutes.
[00:11:14.080 --> 00:11:15.280] It's just insane.
[00:11:15.920 --> 00:11:17.680] It's a 15-hour flight, just warning you.
[00:11:18.080 --> 00:11:20.000] Bob, you have to talk for nine hours about something.
[00:11:20.480 --> 00:11:22.880] But the food, the food was incredible.
[00:11:22.880 --> 00:11:25.680] I'm a fan of Middle Eastern food.
[00:11:25.840 --> 00:11:27.440] Not as if that's a monolithic thing.
[00:11:28.000 --> 00:11:31.040] It's like I had Iraqi food, never had Iraqi food before.
[00:11:31.040 --> 00:11:34.880] It was Middle Eastern food, same spice palate, same kind of vibe, but it was different.
[00:11:34.880 --> 00:11:36.320] It was everything was just cool.
[00:11:36.640 --> 00:11:38.800] Nothing I've ever had before was awesome.
[00:11:38.800 --> 00:11:40.000] It was incredible.
[00:11:40.000 --> 00:11:41.360] Anyway, that was a lot of fun.
[00:11:41.360 --> 00:11:46.000] Yeah, I hope I expect we will be going there sometime in the future.
[00:11:46.000 --> 00:11:47.920] Just please not October.
[00:11:48.240 --> 00:11:49.520] No promises.
[00:11:49.520 --> 00:11:54.080] But it was really interesting how similar it was.
[00:11:54.240 --> 00:11:59.120] I had conversations with people who were from the UAE.
[00:11:59.280 --> 00:12:03.680] I talked to somebody who was heavily involved in the power industry.
[00:12:04.400 --> 00:12:06.320] I could have been talking to somebody here.
[00:12:06.320 --> 00:12:07.760] It was really no different.
[00:12:07.760 --> 00:12:14.240] They had all the same issues, all the same information, as far as that was concerned, pretty much the same attitudes.
[00:12:14.240 --> 00:12:18.160] I was kind of surprised, but it's interesting how similar things were.
[00:12:18.160 --> 00:12:21.360] But I think that's because it's an international city.
[00:12:21.360 --> 00:12:33.160] And also, they are looking to, I think they're looking to elevate the UAE as a modern technological sort of state and Dubai as an international, modern city.
[00:12:33.160 --> 00:12:36.280] So they're very interested in that sort of thing.
[00:12:29.840 --> 00:12:37.160] Hey, George is here.
[00:12:37.320 --> 00:12:38.360] George, how are you doing, man?
[00:12:38.360 --> 00:12:38.840] George.
[00:12:38.840 --> 00:12:39.640] Hi, everybody.
[00:12:40.440 --> 00:12:41.080] What's going on?
[00:12:41.080 --> 00:12:41.800] What's going on?
[00:12:42.200 --> 00:12:45.240] Are you as excited as I am for December to get here?
[00:12:45.240 --> 00:12:46.600] Can we get the holidays happening?
[00:12:46.600 --> 00:12:49.800] Can we get the vibe of the holidays please happening?
[00:12:50.120 --> 00:12:54.280] I think we need to celebrate Pearl Harbor Day with a special event or two.
[00:12:55.720 --> 00:12:56.600] Perfect.
[00:12:56.600 --> 00:13:02.200] So we've got this monster day, December 7th, in Washington, D.C., of all places.
[00:13:02.200 --> 00:13:03.320] Are you ready for it?
[00:13:03.320 --> 00:13:04.360] I hope you're ready for it.
[00:13:04.760 --> 00:13:05.560] I hope you're ready.
[00:13:05.560 --> 00:13:09.480] Steve and I are working on supplementing the SGU swag stuff.
[00:13:09.480 --> 00:13:17.720] So if you're going to be there in particular, the best and pretty much the only time that you're going to get good SGU swag is if you come to a live event because we bring all the premium stuff.
[00:13:17.720 --> 00:13:20.040] George, the question is: are you excited?
[00:13:20.040 --> 00:13:25.000] We rarely do a big double show in one day like this.
[00:13:25.000 --> 00:13:39.240] And what's so great about doing both a private show earlier in the day and then doing the extravaganza at night is we get so wonked out, giddy, funny, silly, almost like skeptic drunk by the end of it.
[00:13:39.400 --> 00:13:40.840] It's a long day.
[00:13:40.840 --> 00:13:42.680] And those are the best shows.
[00:13:42.680 --> 00:13:49.720] So if you're going to go to one show this year, go to the extravaganza, go to the private show that's happening on the 7th because it's going to be this marathon day.
[00:13:49.720 --> 00:13:51.240] It is such a blast.
[00:13:51.240 --> 00:13:56.920] And what's super cool is this is one of our very special holiday-themed extravaganza shows.
[00:13:56.920 --> 00:13:58.840] We've only done this one other time.
[00:13:58.840 --> 00:14:00.200] I think it was in Arizona, right?
[00:14:00.440 --> 00:14:01.440] I think, yes, yes.
[00:14:01.440 --> 00:14:05.160] During that Arizona show, we had the famous Yukon Cornelius incident.
[00:14:05.160 --> 00:14:05.720] Do you remember that?
[00:14:05.880 --> 00:14:06.440] Of course, I do.
[00:14:07.080 --> 00:14:10.440] Oh my gosh, one of the biggest laughs of all time.
[00:14:10.440 --> 00:14:11.480] Ask us about that sometime.
[00:14:11.480 --> 00:14:12.600] If you see us, ask us.
[00:14:12.600 --> 00:14:14.200] We'll tell you in D.C.
[00:14:14.200 --> 00:14:18.080] about the Yukon Cornelius incident that happened in Arizona.
[00:14:18.080 --> 00:14:18.880] It was fantastic.
[00:14:14.920 --> 00:14:20.320] We'll try to recreate that on some level.
[00:14:20.960 --> 00:14:29.120] It's also important to say, guys, that the private show is a private show plus, which means there's an extra hour of content.
[00:14:29.120 --> 00:14:30.880] Basically, we do audience interaction.
[00:14:30.880 --> 00:14:32.160] It's different every time.
[00:14:32.160 --> 00:14:33.040] Yeah, we've done it.
[00:14:33.040 --> 00:14:34.480] We did a scavenger hunt once.
[00:14:34.480 --> 00:14:36.080] We did a bunch of trivia things once.
[00:14:36.080 --> 00:14:37.520] We did like some song stuff once.
[00:14:37.680 --> 00:14:39.600] Every time it's different, every time it's special.
[00:14:39.600 --> 00:14:43.600] So even if you've done this before, come to it again because it's guaranteed to be different.
[00:14:43.600 --> 00:14:45.200] Same thing with the extravaganzas.
[00:14:45.200 --> 00:14:50.320] Every extravaganza, the framework is the same, but there's so much improv and incidental stuff that happens.
[00:14:50.320 --> 00:14:51.840] Every show is different.
[00:14:51.840 --> 00:14:58.160] We literally had people going to two shows in a row when we did this back a couple months ago, and they fully enjoyed both shows.
[00:14:58.880 --> 00:15:04.800] I love doing those shows because we've had weekends where we've done like six shows in three days.
[00:15:04.800 --> 00:15:13.520] Like we've had some pretty serious tours that we've been on, but I really like it when all of us have to really work together to make it all happen.
[00:15:13.520 --> 00:15:17.200] And it's such a feel-good when we do the shows and we finish them.
[00:15:17.200 --> 00:15:19.840] It's like, oh my God, guys, it was so awesome, right?
[00:15:19.840 --> 00:15:21.520] And then we get to see everybody.
[00:15:21.520 --> 00:15:24.800] You know, for most of our lives, we're sitting here talking to microphones in our basements.
[00:15:24.800 --> 00:15:30.240] So it's kind of nice to see people's faces and hear everybody laugh and interact and enjoy themselves.
[00:15:30.240 --> 00:15:34.960] There's a lot of audience participation that happens at the extravaganza as well as the private show.
[00:15:34.960 --> 00:15:36.160] So come on out.
[00:15:36.880 --> 00:15:40.160] Talk about a great gift for someone who likes the show.
[00:15:40.160 --> 00:15:41.600] I mean, could you think of a better presentation?
[00:15:41.680 --> 00:15:45.920] And, George, everyone should know that you are one funny bastard.
[00:15:47.520 --> 00:15:48.240] It's true.
[00:15:48.560 --> 00:15:49.920] I just had my license renewed.
[00:15:50.320 --> 00:15:51.720] Yeah, funny bachelor license renewed.
[00:15:51.760 --> 00:15:52.000] That's great.
[00:15:52.160 --> 00:15:55.520] My funny bastard license renewed, freshened.
[00:15:55.520 --> 00:15:57.440] I got the new test taken for that.
[00:15:57.440 --> 00:15:58.480] So, yeah, all good.
[00:15:58.480 --> 00:15:59.280] No, we're excited.
[00:15:59.280 --> 00:15:59.680] I'm excited.
[00:15:59.680 --> 00:16:03.960] I hope you guys are as excited as I am because that means that you'll be equally as excited.
[00:15:59.840 --> 00:16:06.360] And then the equation's all even and good.
[00:16:06.680 --> 00:16:09.880] Go to the skepticsguide.org.
[00:16:09.880 --> 00:16:13.560] And Ian has put these special buttons on there that you can click.
[00:16:13.560 --> 00:16:17.400] And if you're interested in either show, you just click the button that you like.
[00:16:17.400 --> 00:16:21.800] And also, don't forget, I do think there's some VIP tickets left for the extravaganza.
[00:16:22.440 --> 00:16:27.960] And also, let's not, while we have George on, let's plug one more thing because it's actually a huge deal.
[00:16:27.960 --> 00:16:29.000] Not a con?
[00:16:29.000 --> 00:16:29.400] Yes.
[00:16:29.400 --> 00:16:31.000] We have Nauticon coming up.
[00:16:31.000 --> 00:16:33.880] This is, yes, the weekend of May 15th.
[00:16:33.880 --> 00:16:36.120] It's going to be in White Plains, New York.
[00:16:36.120 --> 00:16:39.800] And let me tell you, we had a goddamn awesome time.
[00:16:39.800 --> 00:16:40.440] It was epic.
[00:16:40.600 --> 00:16:41.080] Fabulous time.
[00:16:41.240 --> 00:16:45.320] It really was the best conference I've ever been to, and it was the best conference I was involved with.
[00:16:45.320 --> 00:16:48.280] And real quick, I must have said this story, but I'll just say it really quick.
[00:16:48.280 --> 00:16:50.840] There was this one guy that came up to us and said, you know, I came here.
[00:16:50.840 --> 00:16:51.800] I didn't have any friends.
[00:16:51.800 --> 00:16:52.600] I didn't know anyone.
[00:16:52.600 --> 00:16:54.040] I was almost not going to come.
[00:16:54.040 --> 00:16:54.920] And I said, what the hell?
[00:16:54.920 --> 00:16:56.760] I got to force myself to do it.
[00:16:56.760 --> 00:17:04.600] And we have an event-wide puzzle that you have to live through the entire event to be able to figure out the entirety of this puzzle.
[00:17:04.600 --> 00:17:11.960] And he like picks up the puzzle sheet and he said, the next thing you know, I'm sitting down with like six people I don't know and we're hanging out like we're friends.
[00:17:11.960 --> 00:17:12.680] And that was it.
[00:17:12.680 --> 00:17:13.880] He was bingo.
[00:17:13.880 --> 00:17:15.320] Yeah, he was hooked in.
[00:17:15.800 --> 00:17:16.920] The magic of Nauticon.
[00:17:17.080 --> 00:17:18.360] But isn't that what we saw, guys?
[00:17:18.360 --> 00:17:26.280] I mean, you know, we've known for a very long time that we've curated an amazing group of listeners and patrons.
[00:17:26.280 --> 00:17:35.720] But this group of people, man, I got to tell you, like, I feel like I have a lot of friendships with the people that come, and there's a lot of love and a lot of affection and a lot of awesome stuff.
[00:17:35.720 --> 00:17:38.040] So, if you're interested to come have a great time.
[00:17:38.040 --> 00:17:40.680] You know, this conference is about socializing.
[00:17:41.000 --> 00:17:45.600] If you're interested, you can go to naughticon.com.
[00:17:45.600 --> 00:17:48.000] Remember, George, that Ian made this weird URL?
[00:17:43.640 --> 00:17:49.200] NaticonCon.com.
[00:17:49.280 --> 00:17:50.320] NatakonCon.
[00:17:44.920 --> 00:17:51.280] Check your con, check your con.
[00:17:51.440 --> 00:17:54.160] Or you go to our homepage and just click on the natacon button.
[00:17:54.160 --> 00:17:55.360] Yeah, there's a natacon button there.
[00:17:55.840 --> 00:17:57.360] Natacon 2.
[00:17:57.360 --> 00:17:58.480] Skeptical.
[00:17:58.480 --> 00:17:59.040] Mystery.
[00:17:59.120 --> 00:18:01.280] And, George, what's happening at Natacon this year?
[00:18:01.280 --> 00:18:02.800] My God, what's happening at Specon?
[00:18:02.960 --> 00:18:03.920] I can't even keep track of it.
[00:18:04.000 --> 00:18:05.920] We have so many fantastic ideas happening.
[00:18:05.920 --> 00:18:08.080] We've got this kind of Beatles theme that's going through.
[00:18:08.080 --> 00:18:12.720] We might have like a special super game show, another game show, more audience participation.
[00:18:12.720 --> 00:18:18.320] We're going to have SG University, where you get to learn stuff in 15-minute chunks from all these experts on stage.
[00:18:18.560 --> 00:18:20.160] It's just overwhelmingly cool.
[00:18:20.160 --> 00:18:24.000] But the most important thing is, like you just said, Jay, you get to hang.
[00:18:24.880 --> 00:18:35.760] We've built it in so that there is this time to just meet and greet and hang and talk about stuff and have that very wonderful beer that Westchester's famous for.
[00:18:36.240 --> 00:18:37.040] It's just the best.
[00:18:37.040 --> 00:18:38.000] It's the best time.
[00:18:38.000 --> 00:18:40.080] Well, George, thank you for popping in.
[00:18:40.080 --> 00:18:40.560] Thanks.
[00:18:40.560 --> 00:18:41.760] I'll pop in anytime.
[00:18:41.760 --> 00:18:42.880] I'm going to go pop out now.
[00:18:42.880 --> 00:18:44.560] And we'll see you on the other side, friend.
[00:18:44.560 --> 00:18:44.960] All righty.
[00:18:44.960 --> 00:18:45.440] Bye, guys.
[00:18:45.440 --> 00:18:45.840] Hey, George.
[00:18:47.040 --> 00:18:51.120] While we were at SciCon, we actually did a number of interviews.
[00:18:51.120 --> 00:18:56.160] We have a really great interview coming up later in this episode with Brian Cox.
[00:18:56.160 --> 00:18:56.640] Oh, yeah.
[00:18:56.640 --> 00:18:57.520] The Brian Cox.
[00:18:57.520 --> 00:18:58.160] Brian Cox.
[00:18:58.160 --> 00:19:08.960] And Brian Wecht, who is our friend and collaborator on Natacon and other things, who's also the Brian Wecht has also been a physicist and a ninja and an actual ninja.
[00:19:09.120 --> 00:19:10.960] Don't even ask because we can't tell you.
[00:19:11.760 --> 00:19:14.880] He was there for the interview and made it extra special.
[00:19:14.880 --> 00:19:16.640] He was a special sauce.
[00:19:16.800 --> 00:19:18.880] So I'll be later in the show.
[00:19:18.880 --> 00:19:21.600] But now we're going to go on with our new items.
[00:19:21.600 --> 00:19:23.840] Bob, you're going to start us off with a quickie.
[00:19:23.840 --> 00:19:24.240] Sure.
[00:19:24.240 --> 00:19:25.200] Thank you, Steve.
[00:19:25.200 --> 00:19:27.280] This is your quickie with Bob.
[00:19:27.280 --> 00:19:39.240] So, the international team of scientists have actually determined the chemical properties of some super heavy elements, muscovium, element 115, and nihonium, element 113.
[00:19:39.800 --> 00:19:44.040] So this makes muscovium the heaviest element ever chemically studied.
[00:19:44.040 --> 00:19:51.400] And you might be thinking, how the hell do they test an element chemically if that element lasts for milliseconds before decaying?
[00:19:51.400 --> 00:19:52.360] Very good question.
[00:19:52.360 --> 00:19:53.400] So that's what I'm going to describe.
[00:19:53.400 --> 00:19:55.480] The lab setup was very slick.
[00:19:55.480 --> 00:20:06.280] Essentially, they create super heavy particles and using gas chromatography, they quickly kind of shuttle those elements through various detectors that are lined with quartz and gold.
[00:20:06.280 --> 00:20:19.240] Now, if the element, say muscovium, if it binds with the quartz or the gold, that is detected and recorded, as well as how long it binds with the gold or the quartz and where that super heavy element ends up.
[00:20:19.240 --> 00:20:25.480] So that binding information is invaluable data as to how these elements behave chemically.
[00:20:25.480 --> 00:20:34.120] Now they found generally, the results said that both super heavy elements, muscovium and nihonium, they had weak interactions with quartz.
[00:20:34.120 --> 00:20:36.840] And this actually lines up with their predictions.
[00:20:36.840 --> 00:20:38.440] So here's where it gets really interesting.
[00:20:38.440 --> 00:20:39.720] I wasn't aware of this.
[00:20:39.720 --> 00:20:51.640] They predicted that relativistic effects would make super heavy elements less reactive than similar elements on the periodic table with fewer protons, say like lead, for various reasons.
[00:20:51.640 --> 00:20:58.200] They think lead and muscovium should behave similarly in how it binds with quartz and gold, for example.
[00:20:58.200 --> 00:20:59.160] And it didn't.
[00:20:59.160 --> 00:21:03.640] The muscovium did not react, did not bind as well, and that was predicted.
[00:21:03.880 --> 00:21:10.040] So it's believed that relativistic effects come into play with super heavy elements because they have more protons.
[00:21:10.040 --> 00:21:11.240] All right, so imagine that.
[00:21:11.720 --> 00:21:15.200] You've got a super heavy element with 115 protons.
[00:21:14.280 --> 00:21:16.880] There's so many protons in the nucleus.
[00:21:17.040 --> 00:21:23.120] Now that increases the electromagnetic forces that are available, that are in play in the nucleus.
[00:21:23.120 --> 00:21:26.640] And that alters the electron, the electrons in the atom.
[00:21:26.640 --> 00:21:39.280] Now, classically, if you look at this classically, you could say that the electrons get closer to the protons because the electromagnetic force is stronger, and that makes them move at speeds closer to the speed of light.
[00:21:39.280 --> 00:21:41.520] And doing that, you're gaining mass.
[00:21:41.520 --> 00:21:43.600] Now, you can look at it quantum mechanically.
[00:21:43.600 --> 00:21:51.920] You can look at it as the electron cloud around the atom becoming distorted because of the stronger electromagnetic forces because of the protons, right?
[00:21:51.920 --> 00:21:52.640] You with me?
[00:21:52.640 --> 00:22:01.520] So either way you look at it, whether you look at it classically or quantum mechanically, the super heavy elements should change how they interact with other elements chemically.
[00:22:01.520 --> 00:22:04.000] And that's exactly what they showed with this experiment.
[00:22:04.400 --> 00:22:12.080] These relativistic effects come into play with these super heavy elements, causing them to behave differently chemically.
[00:22:12.240 --> 00:22:14.080] That's exactly what they showed.
[00:22:14.080 --> 00:22:25.600] So sure, this doesn't mean that we're going to add muscovium or nihonium to the iPhone 20, but understanding the chemical properties of super heavy elements could lead to breakthroughs in material science and other fields.
[00:22:25.600 --> 00:22:33.120] Who knows what kind of interesting breakthroughs could happen with a deeper understanding of relativistic effects influencing super heavy elements.
[00:22:33.120 --> 00:22:36.480] So this has been your Relativistic Super Heavy Quickie with Bob.
[00:22:36.480 --> 00:22:37.600] Back to you, Steve.
[00:22:37.600 --> 00:22:38.880] Thank you, Bob.
[00:22:38.880 --> 00:22:40.720] Jay, give us the first news item.
[00:22:40.720 --> 00:22:43.120] You're going to tell us about the Biogenome Project.
[00:22:43.120 --> 00:22:43.600] What is that?
[00:22:44.240 --> 00:22:45.600] All right, this is awesome, guys.
[00:22:45.600 --> 00:22:49.200] The Earth Biogenome Project, it's also called EBP.
[00:22:49.200 --> 00:22:51.280] This is a really amazing effort.
[00:22:51.280 --> 00:23:00.760] They started it in 2017, and their goal is to sequence the genomes of all 1.67 million named eukaryotic species.
[00:22:59.920 --> 00:23:05.720] There's 8.7 known species, but they're just talking about the ones that are currently named.
[00:23:06.040 --> 00:23:11.000] This encompasses, ready, plants, animals, fungi, and other microbes.
[00:23:11.000 --> 00:23:14.920] So the eukaryotic organisms include, let me give you the bigger list.
[00:23:14.920 --> 00:23:21.320] There's animals, plants, fungi, and protists encompassing all multicellular life and some unicellular species.
[00:23:21.640 --> 00:23:25.480] So basically, Jay, that's everything but bacteria and archaea?
[00:23:25.880 --> 00:23:26.200] Exactly.
[00:23:26.200 --> 00:23:26.680] That's exactly.
[00:23:26.760 --> 00:23:27.400] Is there anything else?
[00:23:27.800 --> 00:23:29.320] It's like algae single-cell?
[00:23:29.640 --> 00:23:30.120] Prokaryotic?
[00:23:30.680 --> 00:23:31.640] It depends on the type of algae.
[00:23:32.120 --> 00:23:33.240] Blue-green algae is not.
[00:23:33.240 --> 00:23:34.760] So there's some of each, okay.
[00:23:34.760 --> 00:23:44.600] Well, guys, they specifically are distinguishing eukaryotes from simpler, you know, the non-nucleated prokaryotic organisms like bacteria.
[00:23:44.600 --> 00:23:47.080] So I think we define that pretty good.
[00:23:47.080 --> 00:23:55.160] So the project scale literally, it's so massive, bigger than earlier efforts that have come before it, like the Human Genome Project.
[00:23:55.160 --> 00:23:57.320] And that was a cool thing that they did.
[00:23:57.320 --> 00:24:02.680] That project, it focused on mapping a single species genome, so the human species, right?
[00:24:02.680 --> 00:24:13.160] The Earth Biogenome Project's ultimate goal is to revolutionize this understanding of all the Earth's biodiversity, which will improve conservation strategies.
[00:24:13.160 --> 00:24:16.680] It'll drive advancement in agriculture and human health.
[00:24:16.680 --> 00:24:18.360] So there's three phases.
[00:24:18.360 --> 00:24:19.320] It's structured.
[00:24:19.320 --> 00:24:21.400] The first phase is in 2026.
[00:24:21.400 --> 00:24:23.720] They want to sequence 10,000 species.
[00:24:23.720 --> 00:24:29.560] This should be covering one representative per eukaryotic family.
[00:24:29.560 --> 00:24:32.920] Then phase two, 2030, is the end date.
[00:24:32.920 --> 00:24:37.800] They want to expand that to 150,000 genomes, which includes one for each genus.
[00:24:37.800 --> 00:24:46.160] And then by phase three, which is 2032, they want to complete the genomes for all 1.67 million named species.
[00:24:46.160 --> 00:24:48.080] It's huge, absolutely.
[00:24:48.080 --> 00:24:49.440] But they have a plan.
[00:24:49.440 --> 00:24:50.720] So let's go into that.
[00:24:44.920 --> 00:24:52.160] They don't have an idea or a concept.
[00:24:52.320 --> 00:24:53.920] They actually have a plan.
[00:24:54.880 --> 00:25:07.440] So far, the EBP, right, remember I told you that's what they call it, they've sequenced 3,000 genomes from 1,060 families, and this covers a lot of work, but that's far from its phase one target.
[00:25:07.440 --> 00:25:11.200] And the initial cost estimates for phase one were over $600 million.
[00:25:11.200 --> 00:25:15.680] But they were able to reduce that to $265 million because why, guys?
[00:25:15.920 --> 00:25:16.560] AI.
[00:25:16.560 --> 00:25:18.160] That's a great idea, but that wasn't it.
[00:25:18.560 --> 00:25:21.440] Because of rapid advancements in sequencing technology.
[00:25:21.440 --> 00:25:26.640] And again, keep in mind, like sequencing technology is constantly being improved on as the years go by.
[00:25:26.640 --> 00:25:32.400] It's been in a constant state of flux, which is awesome because look at that price difference, $600 million to $265 million.
[00:25:32.720 --> 00:25:39.600] So by comparison, the Human Genome Project's first draft of their genome project cost over $100 million.
[00:25:39.600 --> 00:25:43.840] But again, keep in mind, the Human Genome Project works on one species.
[00:25:43.840 --> 00:25:49.440] The EBP is going to work on 1.6 somewhat more, you know, 1.6 million, that's so much more.
[00:25:49.440 --> 00:25:50.480] It's ridiculous.
[00:25:50.480 --> 00:25:55.920] So despite the investments, the EBP's progress needs to happen faster than it currently is.
[00:25:55.920 --> 00:25:58.240] And they're trying to figure out how they're going to do it.
[00:25:58.240 --> 00:26:05.600] So what they want to do is they want the weekly genome production to scale from 20 to 721 by phase two.
[00:26:05.600 --> 00:26:10.080] And they've scoped it out that that'll meet their deadlines.
[00:26:10.080 --> 00:26:15.600] So unlike its predecessors, EBP is a decentralized global project.
[00:26:15.600 --> 00:26:16.160] This is great.
[00:26:16.160 --> 00:26:18.480] It involves affiliates from 28 countries.
[00:26:18.480 --> 00:26:23.600] Two major hubs are BGI in China and the Wellcome Sanger Institute in the UK.
[00:26:23.600 --> 00:26:30.000] So the project needs regions that have massive biodiversity to get funds to keep the project moving forward.
[00:26:29.960 --> 00:26:36.600] So the problem is that a lot of these places that have a huge biodiversity just don't have enough money to make it all happen.
[00:26:36.600 --> 00:26:43.720] Some regions have funding, they have these funding issues, but the people behind the EBP came up with a solution to help them out, and this is a great idea.
[00:26:43.720 --> 00:26:50.200] They developed a portable sequencing lab they call G-Boxes, and I could not find anything on why they call it G-Boxes.
[00:26:50.200 --> 00:26:53.720] So these labs, you know, they can be deployed wherever they need to.
[00:26:54.040 --> 00:27:00.440] Each unit costs approximately $5.5 million to install and operate over a three-year period.
[00:27:00.440 --> 00:27:04.520] And they can sequence between 1,000 and 3,000 genomes annually.
[00:27:04.520 --> 00:27:08.840] And they can just pump these, you know, these G-boxes out where they need to go.
[00:27:08.840 --> 00:27:12.920] And they're relatively, you know, inexpensive, all things considered.
[00:27:12.920 --> 00:27:17.240] So the funding has been ongoing, and it's been a huge challenge for them.
[00:27:17.240 --> 00:27:22.280] You know, right now, guys, as everybody knows, there's geopolitical tensions and they get in the way.
[00:27:22.280 --> 00:27:26.360] And it really matters when countries don't agree and don't have good relations with each other.
[00:27:26.360 --> 00:27:31.080] You know, science basically suffers a lot, and particularly between the U.S.
[00:27:31.080 --> 00:27:31.800] and China.
[00:27:31.800 --> 00:27:33.960] And you'd think, you know, why can't they just talk and fix it?
[00:27:33.960 --> 00:27:35.720] But they can't because we can't even agree.
[00:27:35.720 --> 00:27:38.520] And, you know, each country can't even agree on what's going on.
[00:27:38.520 --> 00:27:43.400] Even partial success here would still be awesome, you know, if they don't actually get all the way.
[00:27:43.400 --> 00:27:46.280] And they're actually describing it as transformative.
[00:27:46.280 --> 00:27:49.560] You know, a high-quality reference genomes could do lots of things.
[00:27:49.560 --> 00:27:52.840] It could accelerate conservation efforts for endangered species.
[00:27:52.840 --> 00:27:59.880] It could identify genes critical for crop resistance, as an example, or it can improve our understanding of ecosystem dynamics.
[00:27:59.880 --> 00:28:01.400] There's a lot to this.
[00:28:01.400 --> 00:28:05.640] It's not just getting the information, but there are applications there that they know of.
[00:28:05.640 --> 00:28:10.920] So, as it stands, EBP right now, it's the most ambitious genome initiative in history.
[00:28:10.920 --> 00:28:18.960] And if it's fully successful, it'll mark a new era in science, which is offering insights into the intricate tapestry of life on Earth.
[00:28:19.040 --> 00:28:31.200] You know, great, it sounds great, but really, they're saying we're going to basically have a backup of the genome of all these named species, and it will be the foundation for lots of future solutions and global challenges that are coming up, right?
[00:28:31.200 --> 00:28:35.200] Because if we want to save species, we can understand what their needs are better.
[00:28:35.200 --> 00:28:39.200] And if we lose the species, we might someday be able to bring it back, which would be great.
[00:28:39.200 --> 00:28:51.360] So, even in the project's early stages, the project does demonstrate, I would consider it to be extraordinary potential and a hugely collaborative workspace for science on a planetary scale.
[00:28:51.360 --> 00:28:58.320] You know, that's the exact environment that science I think is best when there's tons of people involved globally and it's something that's happening all over.
[00:28:58.320 --> 00:29:03.440] Yeah, the cool thing is, like, essentially, this will give us a map of all living things at the genetic level.
[00:29:03.440 --> 00:29:08.560] And we've got to start, we can start asking really interesting questions evolutionarily and otherwise.
[00:29:08.560 --> 00:29:10.720] You know, it's just a massive data set.
[00:29:10.720 --> 00:29:16.960] And imagine we sick AI after it to try to find patterns and stuff, yeah, in that vast data.
[00:29:16.960 --> 00:29:18.000] That's absolutely.
[00:29:18.000 --> 00:29:25.040] I mean, you would think this would be something that would have hopefully already been done, but it's still incredibly laborious.
[00:29:25.040 --> 00:29:28.080] It's very hard with today's technology.
[00:29:28.080 --> 00:29:31.680] But, you know, this type of science actually drives innovation as well.
[00:29:31.680 --> 00:29:44.320] So it's really good when, for example, I just read a cool statistic that like NASA, you know, in 2022, NASA brought tens of billions of dollars of industry by its inventions and everything that all the science that they're doing.
[00:29:44.320 --> 00:29:52.400] Like it's really valuable downstream when they literally hand over the technology that they're creating to companies in the United States.
[00:29:52.400 --> 00:29:55.680] So, bottom line is, I'm 100% behind this project.
[00:29:55.680 --> 00:29:56.480] I think it's wonderful.
[00:29:56.480 --> 00:30:03.960] It's the exact type of science that needs to be funded globally, and I can't wait to hear that bell ring when they finish.
[00:29:59.680 --> 00:30:05.240] Jay, let me ask you one more question.
[00:30:05.960 --> 00:30:07.800] Does the sun have a magnetic field?
[00:30:07.800 --> 00:30:08.360] It's gotta.
[00:30:08.520 --> 00:30:09.320] Of course, it does.
[00:30:09.320 --> 00:30:10.600] Yeah, it does.
[00:30:10.920 --> 00:30:11.880] How strong is it?
[00:30:12.280 --> 00:30:15.240] Steve, it's wicked strong, yeah.
[00:30:15.800 --> 00:30:19.960] It's like a 18/23 in Dungeons and Dragons.
[00:30:21.160 --> 00:30:24.840] It's the strongest force I would imagine in the galaxy, no?
[00:30:27.080 --> 00:30:28.680] It extends past Pluto.
[00:30:29.160 --> 00:30:30.200] It's big, it's big.
[00:30:30.200 --> 00:30:32.840] It's actually very variable.
[00:30:32.840 --> 00:30:38.840] So, the Earth's magnetic field at basically the Earth's surface is one Gauss?
[00:30:38.840 --> 00:30:41.160] It's 0.6 Gauss.
[00:30:41.160 --> 00:30:41.400] Oh.
[00:30:41.400 --> 00:30:41.720] Yeah.
[00:30:41.720 --> 00:30:42.280] Not even one.
[00:30:42.600 --> 00:30:43.000] Okay.
[00:30:44.120 --> 00:30:49.000] The average magnetic field of the Sun at its surface is one Gauss.
[00:30:49.000 --> 00:30:51.880] So just slightly stronger.
[00:30:51.880 --> 00:30:55.160] Average strength around one Gauss, but it's highly variable.
[00:30:55.160 --> 00:30:57.400] With sunspots being the strongest.
[00:30:57.400 --> 00:30:59.720] That's why that's what sunspots are.
[00:31:00.520 --> 00:31:05.800] Sunspots can reach magnetic field strengths of 2,000 to 3,000 Gauss.
[00:31:05.800 --> 00:31:06.760] That's attractive.
[00:31:06.760 --> 00:31:08.520] That's huge, right?
[00:31:08.520 --> 00:31:09.000] Yeah.
[00:31:09.560 --> 00:31:15.160] Yeah, you've got very powerful magnetic effects like reconnection happening that can unleash very powerful.
[00:31:15.240 --> 00:31:17.320] Now, it's a very big magnetic field.
[00:31:17.320 --> 00:31:20.520] It's bigger than Earth's magnetic field, but it still decreases with distance.
[00:31:20.520 --> 00:31:27.240] So at the Earth, how strong do you think the Sun's magnetic field is at the Earth's distance?
[00:31:27.240 --> 00:31:31.400] Probably pretty weak because it's 50 micro Gauss.
[00:31:31.400 --> 00:31:31.800] Yeah.
[00:31:31.800 --> 00:31:32.120] Ooh.
[00:31:32.840 --> 00:31:33.240] Wow.
[00:31:33.720 --> 00:31:35.640] One five hundred thousandth?
[00:31:35.720 --> 00:31:38.360] So yeah, a micro is a millionth.
[00:31:38.520 --> 00:31:39.920] So 50 millionths of a gauss.
[00:31:39.880 --> 00:31:40.480] 50 million.
[00:31:40.760 --> 00:31:42.360] Or five micro Tesla.
[00:31:42.360 --> 00:31:44.120] So I guess the Tesla is 10 Gauss.
[00:31:44.120 --> 00:31:45.520] Well, that's barely detectable, right?
[00:31:45.520 --> 00:31:45.920] Yep.
[00:31:45.920 --> 00:31:47.120] What about at Pluto?
[00:31:47.120 --> 00:31:47.760] Oh, it has to be.
[00:31:48.000 --> 00:31:48.960] Nano Gauss.
[00:31:48.960 --> 00:31:49.520] I mean, really?
[00:31:49.520 --> 00:31:51.920] Yeah, 0.2 nano Tesla.
[00:31:44.920 --> 00:31:52.240] Got it.
[00:31:52.480 --> 00:31:53.440] Oh, Tesla.
[00:31:55.120 --> 00:31:56.800] I got nano right.
[00:31:56.800 --> 00:31:57.040] Yeah.
[00:31:58.240 --> 00:31:59.760] Or so what would that be?
[00:32:00.240 --> 00:32:02.640] Or 2 nano Gauss.
[00:32:02.640 --> 00:32:03.280] Ah, still right.
[00:32:03.280 --> 00:32:03.520] Yeah.
[00:32:03.760 --> 00:32:04.400] Awesome.
[00:32:04.640 --> 00:32:06.320] What does that even do at that point?
[00:32:06.480 --> 00:32:07.520] It's negligible, right?
[00:32:07.520 --> 00:32:09.120] That's basically negligible.
[00:32:09.120 --> 00:32:12.160] I don't even know if we can measure calculating it.
[00:32:12.160 --> 00:32:16.000] But it still keeps out the interstellar medium, the intermission.
[00:32:16.160 --> 00:32:16.800] To some extent.
[00:32:17.520 --> 00:32:17.920] Oh, yeah.
[00:32:18.560 --> 00:32:21.200] It defines the boundary until you get to the helio.
[00:32:22.080 --> 00:32:22.400] Yeah.
[00:32:22.400 --> 00:32:23.040] Yeah.
[00:32:23.040 --> 00:32:26.880] And helio shock and all the helio sheath and all that other stuff.
[00:32:26.880 --> 00:32:40.560] Now the question is: has this always been the case, or has the, in the past, especially the distant past, especially the very early solar system, was the sun's magnetic field stronger?
[00:32:40.880 --> 00:32:42.160] I'd say much stronger.
[00:32:42.160 --> 00:32:46.560] Aren't there a lot of, don't some young stars are very kind of chaotic?
[00:32:46.560 --> 00:32:49.200] Like, were they like Wolf-Rayford stars?
[00:32:49.200 --> 00:32:49.680] Yeah.
[00:32:49.680 --> 00:32:51.680] How could we answer that question, do you think?
[00:32:51.680 --> 00:32:53.840] Well, we're doing a time machine would do.
[00:32:54.080 --> 00:32:54.960] Yeah, that'll do.
[00:32:54.960 --> 00:33:04.080] Is it the amount of elements in the current star versus what we estimate was in the early stars?
[00:33:04.080 --> 00:33:07.520] Or we could measure it from other stars.
[00:33:07.520 --> 00:33:07.920] No.
[00:33:08.560 --> 00:33:09.840] Through telescopes.
[00:33:10.400 --> 00:33:13.760] How do we know what the Earth's magnetic field was in the past?
[00:33:13.760 --> 00:33:15.520] Oh, you look at fossilized metal.
[00:33:15.760 --> 00:33:16.080] Rocks.
[00:33:16.320 --> 00:33:16.720] Yeah.
[00:33:16.720 --> 00:33:17.760] Oh, right.
[00:33:17.920 --> 00:33:18.800] Or the water samples?
[00:33:18.960 --> 00:33:19.360] No, water.
[00:33:19.600 --> 00:33:34.120] You're looking for rocks that have elements that will align themselves to a magnetic field, and you basically dealign them, and however much energy you have to put in to dealign them is how much energy was put in to align them, basically.
[00:33:29.840 --> 00:33:35.000] I think this is how it works.
[00:33:35.240 --> 00:33:39.320] So then we could say that, well, that's how strong the magnetic field was when this thing crystallized, right?
[00:33:39.320 --> 00:33:40.600] When this formed.
[00:33:40.600 --> 00:33:44.520] So how could we get rocks from 4 billion years ago?
[00:33:44.520 --> 00:33:45.320] Oh, asteroids.
[00:33:45.320 --> 00:33:46.520] Asteroids, exactly.
[00:33:46.600 --> 00:33:47.320] Meteors.
[00:33:47.320 --> 00:33:48.520] Asteroids and meteors.
[00:33:48.520 --> 00:33:56.840] So for meteors that hit the Earth, basically those are going to tell us about the magnetic field of the inner solar system.
[00:33:56.840 --> 00:34:01.720] Because most of the meteors that are going to land on the Earth probably formed in the inner solar system.
[00:34:01.720 --> 00:34:03.160] And we have done that.
[00:34:03.160 --> 00:34:18.760] But if we could get an asteroid from the outer solar system, then we could get a measure of the magnetic field, you know, when that asteroid formed, probably 4 billion years ago, whatever, 4.6 billion years ago, in the outer solar system.
[00:34:18.760 --> 00:34:20.600] So this is what was just done.
[00:34:20.920 --> 00:34:23.560] Do you guys remember Ryugu, the Japanese?
[00:34:23.960 --> 00:34:27.400] Oh, yes, it went and landed on an asteroid and took a sample.
[00:34:27.720 --> 00:34:29.800] So the asteroid was Ryugu, right?
[00:34:29.800 --> 00:34:34.120] We talked about this, I think, when they recovered it and they got the grains from.
[00:34:34.440 --> 00:34:34.600] Right.
[00:34:34.760 --> 00:34:37.960] Just like we talked about, Bennu was the NASA project.
[00:34:37.960 --> 00:34:39.800] This was the Japanese one.
[00:34:39.800 --> 00:34:50.680] So they recently, a team analyzed particles from Ryugu for signs of an ancient magnetic field in the outer solar system.
[00:34:51.000 --> 00:34:55.160] And what they found was nothing, right?
[00:34:55.720 --> 00:34:57.880] They found no evidence of a magnetic field.
[00:34:57.880 --> 00:35:12.280] But what they said, what this means is, given the techniques that they used, they can only say that there's an upper limit to how strong the magnetic field could have been, and that's 15 microtesla.
[00:35:12.280 --> 00:35:19.360] So if there was a magnetic field in the outer solar, so past Jupiter, when I say outer solar, I'm talking about past Jupiter.
[00:35:19.360 --> 00:35:30.640] If there was a magnetic field beyond Jupiter, it would have been at, you know, basically where that asteroid formed, it would have been less than 15 micro Tesla, which is not negligible.
[00:35:30.640 --> 00:35:32.880] It's small, but it's not negligible.
[00:35:32.880 --> 00:35:35.280] It's not in the nanotesla range.
[00:35:35.280 --> 00:35:38.000] And what do I mean by not negligible?
[00:35:38.000 --> 00:35:39.280] Something very specific here.
[00:35:39.280 --> 00:35:50.800] Because one of the questions was: what was the impact of the sun's magnetic field in the formation of planets and other objects circling the sun?
[00:35:50.800 --> 00:36:08.320] They analyzed other data, and the researchers concluded that there may have been a other data basically indicated that there was a five micro Tesla magnetic field in the outer solar system, which fits with the, it had to be less than 15 micro Tesla.
[00:36:08.320 --> 00:36:11.360] So this is not completely confirmed.
[00:36:11.360 --> 00:36:19.600] They are hoping to get the same analysis from the material brought back by asteroid Bennu and to see if they can confirm that.
[00:36:19.600 --> 00:36:27.840] But the data that we have so far is kind of lining up to there was probably a weak magnetic field, but not insignificant magnetic field in the outer solar system.
[00:36:27.840 --> 00:36:32.640] So what effect does the magnetic field have on the formation of the solar system?
[00:36:32.640 --> 00:36:45.920] You know, when the solar system condenses out of a cloud of gas, like it becomes basically a spinning disk, the sun forms in the middle, and you have a cloud of ionized gas, like a disk of ionized gas surrounding the sun.
[00:36:45.920 --> 00:36:49.680] It's ionized, so it responds to a magnetic field.
[00:36:49.920 --> 00:37:03.000] The thinking is that in the inner solar system, that magnetic field, the sun's magnetic field, would have caused clumping to occur, which probably formed into the inner rocky planets.
[00:37:03.000 --> 00:37:09.320] And so the question was: could the same process have been happening for the outer planets and even other stuff out there?
[00:37:09.320 --> 00:37:14.200] And this, so this is sort of moving in that direction and saying, yeah, it could have.
[00:37:14.200 --> 00:37:23.080] You know, if there was this five microtesla or whatever field out there, it was probably 50 to 200 microtesla in the inner solar system.
[00:37:23.080 --> 00:37:29.640] If it was like five or whatever microtesla in the outer solar system, that could have contributed to the formation of the gas giants.
[00:37:29.640 --> 00:37:36.200] And basically, everything condensed into like asteroids and comets and stuff past Jupiter.
[00:37:36.200 --> 00:37:45.320] So that's why they're so interested in this question, you know, because this affects their models of early planetary formation, right, in the early solar system.
[00:37:45.320 --> 00:37:45.720] Yeah.
[00:37:45.720 --> 00:38:02.520] So the next step, I think, is going to be doing the same analysis of particles from Bennu to see if that also lines up with the, maybe they can get a more precise number or confirm it that there is, well, there was like a five microtesla magnetic field in the outer solar system at that time.
[00:38:02.520 --> 00:38:03.880] So that's pretty cool.
[00:38:03.880 --> 00:38:04.520] Nice.
[00:38:04.760 --> 00:38:07.640] All right, Kara, we're going to go in a bit of a different direction here.
[00:38:07.640 --> 00:38:13.960] Tell us about the possibility of RFK being in charge of our health.
[00:38:13.960 --> 00:38:14.600] Oh, geez.
[00:38:14.600 --> 00:38:15.800] RFK Jr., I should say.
[00:38:16.520 --> 00:38:25.320] RFK Jr., who Trump has publicly verbalized, he plans to put into a, quote, position of power.
[00:38:25.320 --> 00:38:29.960] Actually, I guess the real quote was to go wild on health.
[00:38:31.160 --> 00:38:36.040] I think that's actually what he said, and possibly, quote, declare war on the FDA.
[00:38:36.040 --> 00:38:46.320] He's a proponent of this sort of newly dubbed health freedom movement, or what some are calling MAHA, Make America Ha Healthy Again.
[00:38:46.640 --> 00:38:57.520] When he suspended his campaign for the presidency in August of this year and backed Donald Trump, he promised to make him give him a big position.
[00:38:57.520 --> 00:39:03.440] He hasn't verbalized or he hasn't specified what that position might be, heading up health policy.
[00:39:03.440 --> 00:39:24.080] You know, some have speculated positions with the FDA or the CDC, although those require approval, like congressional approval, whereas, let's see, the Secretary of Health and Human Services or possibly a czar role, I think would be possibly easier to put him into.
[00:39:24.080 --> 00:39:25.600] Yeah, it wouldn't require approval.
[00:39:25.600 --> 00:39:25.760] Yeah.
[00:39:26.080 --> 00:39:26.720] You just appointed him.
[00:39:26.720 --> 00:39:28.080] You're the health czar and that's it.
[00:39:28.080 --> 00:39:32.000] You're the czar and now you're like his personal advisor in a way.
[00:39:32.000 --> 00:39:32.320] Right.
[00:39:33.680 --> 00:39:38.400] And so a lot of people that are writing about this are showing a tweet.
[00:39:38.400 --> 00:39:41.760] I mean, there's so much data to mine here.
[00:39:41.760 --> 00:39:46.720] There's so much evidence of what will happen that this is not crystal balling.
[00:39:46.720 --> 00:39:49.680] This is just saying, well, this is what the dude said he was going to do.
[00:39:49.680 --> 00:39:52.800] But here is a tweet from RFK Jr., Robert F.
[00:39:52.800 --> 00:40:12.480] Kennedy Jr., who quickly, like, I guess for those of you who don't know who he is, he is, he is an environmental lawyer and actually has a very long track record of sort of working to protect vulnerable people against environmental disaster.
[00:40:12.480 --> 00:40:16.080] But also, he's a hardcore anti-vaccine activist.
[00:40:16.080 --> 00:40:24.200] And I love that according to his Wikipedia page, he's an environmental lawyer, American politician, anti-vaccine activist, and conspiracy theorist.
[00:40:24.200 --> 00:40:25.840] Just right there in the first line.
[00:40:25.840 --> 00:40:31.400] So he founded the Children's Health Defense, which is an anti-vax group.
[00:40:29.840 --> 00:40:36.440] They're one of the biggest peddlers of COVID-19 vaccine misinformation.
[00:40:36.760 --> 00:40:40.840] Obviously, he tried, he attempted to run for president, and he is a Kennedy.
[00:40:40.840 --> 00:40:42.920] So he's the son of Robert F.
[00:40:42.920 --> 00:40:47.000] Kennedy, and he's the nephew of JFK and Ted Kennedy.
[00:40:47.000 --> 00:40:48.680] Back to the tweet here.
[00:40:48.680 --> 00:40:53.720] So this was October 25th, 2024, so only about a week ago, two weeks ago.
[00:40:53.720 --> 00:40:57.080] FDA's war on public health is about to end.
[00:40:57.080 --> 00:41:17.240] This includes its aggressive suppression of psychedelics, peptides, stem cells, raw milk, hyperbaric therapies, chelating compounds, ivermectin, hydroxychloroquine, vitamins, clean foods, sunshine, exercise, nutraceuticals, and anything else that advances human health and can't be patented by pharma.
[00:41:17.480 --> 00:41:19.000] If you laugh at last.
[00:41:23.160 --> 00:41:27.480] If you work for the FDA and are part of this corrupt system, I have two messages for you.
[00:41:27.480 --> 00:41:30.520] One, preserve your records, and two, pack your bags.
[00:41:30.520 --> 00:41:31.240] Oh, boy.
[00:41:31.240 --> 00:41:32.440] Oh, boy.
[00:41:32.440 --> 00:41:32.840] Okay, so.
[00:41:33.160 --> 00:41:38.600] David Gorski describes this as an extinction-level event for science-based federal health policy.
[00:41:39.320 --> 00:41:45.960] Yeah, that's in the headline of a very long piece that he wrote about how dangerous this would be.
[00:41:46.200 --> 00:41:50.440] And he, of course, wrote it the day before the election.
[00:41:50.760 --> 00:41:57.880] So let's kind of talk for a second about some of the things that RFK Jr.
[00:41:58.520 --> 00:42:00.760] wants to see changed.
[00:42:00.760 --> 00:42:03.640] A lot of people are focusing on fluoride.
[00:42:03.640 --> 00:42:07.560] I think we'll come back to fluoride because, Steve, you wrote a piece about fluoride.
[00:42:08.200 --> 00:42:09.320] Because of this, yeah.
[00:42:09.320 --> 00:42:10.680] Because of this, exactly.
[00:42:10.680 --> 00:42:15.000] But there are other things that have been highlighted across the board that are problematic.
[00:42:15.280 --> 00:42:20.640] Obviously, he is one of the biggest anti-vaccine voices out there.
[00:42:20.880 --> 00:42:30.640] The New York Times did a post where they sort of dug deep into whether or not can he actually affect real change here?
[00:42:30.640 --> 00:42:34.400] Like, could Trump ban vaccines, for example?
[00:42:34.720 --> 00:42:42.480] And what the New York Times, what this article at least is saying, is that, you know, the short answer is no, because public health in the U.S.
[00:42:42.480 --> 00:42:47.040] is mostly controlled by the states, not the federal government.
[00:42:47.040 --> 00:42:52.240] And then where there is federal kind of oversight, that's with the FDA.
[00:42:52.240 --> 00:43:04.800] They license the vaccines, and the president can't just remove a product that's lawful and licensed from the market, at least not without trying to do so through legal maneuvering.
[00:43:04.800 --> 00:43:08.640] But the president could put pressure on the FDA.
[00:43:08.640 --> 00:43:16.400] The president could make sure that the judges that are appointed limit the power of federal agencies.
[00:43:16.400 --> 00:43:25.760] We also know that he, before he left office last time, where he changed the classification of a lot of federal agencies.
[00:43:25.760 --> 00:43:27.120] Yeah, I forget the name of the order, but yeah.
[00:43:27.440 --> 00:43:33.440] Yeah, federal employees so that they could be let go and their jobs were not protected.
[00:43:33.440 --> 00:43:35.840] And so he passed this before he left office last time.
[00:43:35.840 --> 00:43:44.400] Like week one, Biden reversed that, but it's very likely that he's just going to dive right back in once he takes office.
[00:43:44.400 --> 00:43:47.920] So yes, there could be a lot of change here.
[00:43:47.920 --> 00:43:54.720] One person is quoted in this article saying, there's a lot of mischief that can be done, but a flat-out ban, no.
[00:43:55.040 --> 00:43:59.600] But they could pull funding if anybody doesn't follow their dictates.
[00:43:59.600 --> 00:44:01.320] That's how the federal government could put a lot of funding.
[00:44:01.720 --> 00:44:04.360] The state controls it, but the federal government could just pull funding.
[00:44:04.680 --> 00:44:05.720] Yes, exactly.
[00:43:59.840 --> 00:44:06.920] Affordable Care Act.
[00:44:07.880 --> 00:44:20.920] Recently, and I think that this was in a final bid to get elected, Trump kind of pulled back a little bit on his rhetoric around the Affordable Care Act and said that he never mentioned ending the program.
[00:44:20.920 --> 00:44:32.360] He never would have thought about such a thing, but there's a lot of, like, he's on tape, you know, saying that he did want to appeal Obamacare or the ACA.
[00:44:32.360 --> 00:44:44.360] As we know, that actually requires an act of Congress, but he could use executive power to undercut the law or to restrict access to the law.
[00:44:44.360 --> 00:44:49.960] And this article really details a lot of the ways that he would be able to do that.
[00:44:49.960 --> 00:45:01.480] So, like, for example, in January of 2021, Biden issued an executive order that strengthened Medicaid and the ACA, and Trump could just undo that as soon as he's in office.
[00:45:01.800 --> 00:45:06.280] And then, if Congress were to repeal the act, what would come next?
[00:45:06.280 --> 00:45:10.680] Well, going back to what you said, Jay, well, there's concepts of a plan.
[00:45:11.160 --> 00:45:12.120] They have no idea.
[00:45:13.080 --> 00:45:14.040] They have no idea.
[00:45:14.040 --> 00:45:15.160] They got nothing.
[00:45:15.160 --> 00:45:15.800] There is nothing.
[00:45:15.800 --> 00:45:17.640] There's only so many levers you could pull there.
[00:45:17.640 --> 00:45:19.880] You know, there's no magic here.
[00:45:20.520 --> 00:45:27.720] And so then we're going to backtrack a little bit to Fluoride, but I did want to point out a friend of the show, Dr.
[00:45:27.720 --> 00:45:38.760] Andrea Love, she has a blog called Immunologic, and she wrote a piece back in September about the Congressional American Health and Nutrition Roundtable.
[00:45:38.760 --> 00:45:46.640] The headline of her article is: The Congressional American Health and Nutrition Roundtable was an egregious display of anti-science disinformation.
[00:45:46.960 --> 00:46:21.280] So, basically, what happened is that on Monday, September 23rd, GOP senator for Wisconsin, Ron Johnson, hosted a public taxpayer dollar-funded event where he put together what he claimed to be a panel of quote experts who will provide a foundational and historical understanding of the changes that have occurred over the last century within public sanitation, agriculture, food processing, and healthcare industries, which impact the current state of national health, end quote.
[00:46:21.280 --> 00:46:27.120] But what really he put together was a panel including, I'm going to list some names: Robert F.
[00:46:27.120 --> 00:46:54.320] Kennedy Jr., Jordan Peterson, Michaela Fuller, Casey and Callie Means, Vanny Hari, Max Lugavir, Courtney Swan, Marty Mackery, Alex Clark, Jason Karp, Brigham Bueller, and even more, scrolling, scrolling, Gillian Michaels, Chris Palmer, and Grace Price, none of whom have any expertise relevant to what this panel is supposed to cover.
[00:46:54.320 --> 00:46:55.840] You're all a bunch of quacks and sharp.
[00:46:56.720 --> 00:46:59.120] That's food-based.
[00:46:59.120 --> 00:47:04.000] Yeah, all of whom are notorious anti-science quacks.
[00:47:04.320 --> 00:47:07.520] You know, they all peddle misinformation.
[00:47:07.520 --> 00:47:15.600] And not only do they peddle misinformation, the vast majority of them benefit financially from misinformation.
[00:47:15.600 --> 00:47:27.680] Many of these individuals have make their money having outlets, whether it's books or blog posts or podcasts or whatever, that are directly funded by supplement industries.
[00:47:27.680 --> 00:47:41.720] So, what they did during this panel is they demonized big pharma, they demonized big food, but then they talked a big game about, quote, big wellness and big organic food without calling them that, right?
[00:47:41.720 --> 00:47:46.840] And this is the sort of playbook that we've seen over and over and over.
[00:47:46.840 --> 00:48:02.680] Why would you trust the kind of mainstream individuals who have, you know, the FDA and the CDC have their boots on their necks, when instead you can trust the wellness industry because they have the real answers.
[00:48:03.080 --> 00:48:10.520] And the reason that they're able to tell you what's real is that the FDA isn't, you know, tying their hands behind their back.
[00:48:10.520 --> 00:48:15.240] And this is really the basis for the Make America Healthy Again movement.
[00:48:15.240 --> 00:48:16.920] Deregulation.
[00:48:17.240 --> 00:48:21.320] It's just about going in and deregulating, deregulating, deregulating.
[00:48:21.320 --> 00:48:36.280] The funny thing is, when we, we are already quite deregulated when it comes to alternative medicine, but when we start to deregulate legitimate medicine, that's when the pseudoscience is no longer compartmentalized.
[00:48:36.280 --> 00:48:40.520] It now bleeds its way into the legitimate medicine game.
[00:48:40.520 --> 00:48:40.840] Yeah.
[00:48:40.840 --> 00:48:53.880] And big pharma, the very quote, industry, and I'm saying that in, you know, air quotes, big pharma, the industry that they are so angry with is going to start peddling pseudoscience.
[00:48:54.120 --> 00:48:54.760] Totally.
[00:48:54.760 --> 00:48:56.440] They're happy to be deregulated.
[00:48:56.440 --> 00:48:57.880] It's like, oh, we could sell crap.
[00:48:57.880 --> 00:49:01.000] We don't have to research and charge up the wazoo for it.
[00:49:01.000 --> 00:49:01.800] Sure.
[00:49:02.120 --> 00:49:03.320] We're all in.
[00:49:03.320 --> 00:49:05.080] This is not an anti-big pharma bill.
[00:49:05.080 --> 00:49:07.880] It's a pro-quackery, pro-snake oil movement.
[00:49:08.120 --> 00:49:09.000] That's what it is.
[00:49:09.000 --> 00:49:09.400] Yep.
[00:49:09.400 --> 00:49:13.240] Deregulation will hurt so many people.
[00:49:13.880 --> 00:49:15.280] People will die.
[00:49:15.280 --> 00:49:15.760] They are dying.
[00:49:14.280 --> 00:49:18.480] I think it's they are dying and they will continue to die.
[00:49:19.280 --> 00:49:27.920] And very often the people that will die are women and children, individuals of color, LGBTQIA individuals, people who are vulnerable.
[00:49:27.920 --> 00:49:29.600] Yeah, and low socioeconomic status.
[00:49:29.600 --> 00:49:30.320] Yep.
[00:49:30.320 --> 00:49:31.360] Absolutely.
[00:49:31.680 --> 00:49:34.880] And so now I guess we should talk a little bit about fluoride.
[00:49:34.880 --> 00:49:38.960] I don't want it to take up the whole thing because we have covered it in the past.
[00:49:38.960 --> 00:49:42.080] But basically, RFK Jr.
[00:49:42.640 --> 00:49:46.480] is touting lines from Dr.
[00:49:46.480 --> 00:49:47.440] Strangelove.
[00:49:47.680 --> 00:49:54.000] Like, and I think, you know, it's the argument is that fluoride, I've got to find his quote because I got it right here.
[00:49:54.000 --> 00:49:54.560] You got it right here.
[00:49:55.760 --> 00:49:56.560] I have it too.
[00:49:56.720 --> 00:49:59.120] He described fluoride as an industrial waste.
[00:49:59.120 --> 00:50:08.320] Industrial waste associated with arthritis, bone fractures, bone cancer, IQ loss, neurodevelopmental disorders, and thyroid disease.
[00:50:08.320 --> 00:50:15.440] And then in an interview just this past Sunday, Trump said that the idea of doing away with fluoridation, quote, sounds okay to me.
[00:50:16.000 --> 00:50:21.600] And so, Steve, you did a really great job of going through each of the things in that list and debunking.
[00:50:21.600 --> 00:50:29.440] You know, we know that although some fluoride is produced through industrial processes, it is not industrial waste.
[00:50:29.440 --> 00:50:30.240] And it doesn't matter.
[00:50:30.240 --> 00:50:32.480] It's like one of those, it's a chemophobia thing.
[00:50:32.480 --> 00:50:37.280] Because at the end of the day, it completely dissociates into fluoride ions.
[00:50:37.280 --> 00:50:37.920] That's it.
[00:50:37.920 --> 00:50:38.960] It's a fluoride ion.
[00:50:38.960 --> 00:50:39.840] It's an element.
[00:50:39.840 --> 00:50:40.480] It doesn't matter.
[00:50:40.480 --> 00:50:43.040] It's like saying this hydrogen atom came from poison.
[00:50:43.040 --> 00:50:43.520] Who cares?
[00:50:43.520 --> 00:50:45.120] It's now a proton.
[00:50:45.120 --> 00:50:45.840] You know what I mean?
[00:50:45.840 --> 00:50:48.080] It doesn't matter where you sourced it from.
[00:50:48.320 --> 00:50:50.480] That's just fear-mongering chemophobia.
[00:50:50.480 --> 00:50:50.960] That's what that is.
[00:50:51.200 --> 00:50:53.680] Of course, but that's totally from the playbook, right?
[00:50:53.920 --> 00:50:55.840] That's from all of the rhetoric.
[00:50:55.840 --> 00:51:01.720] Arthritis, bone fractures, bone cancer, IQ loss, neurodevelopmental disorders, and thyroid disease.
[00:51:02.040 --> 00:51:15.160] You do a great job of going through and saying, okay, this was this one study, or this was when the levels that were looked at were X number of times greater than any acceptable level by the FDA.
[00:51:15.160 --> 00:51:18.120] And time and time again, what we have seen.
[00:51:18.360 --> 00:51:19.480] It's the EPA, actually.
[00:51:20.040 --> 00:51:20.440] Oh, thank you.
[00:51:20.600 --> 00:51:24.920] Yeah, because if fluoride is naturally occurring in water, and the U.S.
[00:51:24.920 --> 00:51:29.480] has patients with high levels of fluoride, the EPA sets limits.
[00:51:29.480 --> 00:51:33.000] If it gets higher than that, we actually reduce the level of fluoride.
[00:51:33.000 --> 00:51:38.760] We only add it up to a very tiny amount that's well below anything that causes any issues.
[00:51:38.760 --> 00:51:39.240] Doses.
[00:51:39.560 --> 00:51:45.400] And by the way, again, this is a decision that's made at the local level.
[00:51:45.800 --> 00:51:52.760] There are states in this country that don't, or I guess I should say cities within states in this country that don't fluoridate the water.
[00:51:52.760 --> 00:51:55.160] 72.3% of the U.S.
[00:51:55.160 --> 00:51:59.960] population has access to fluoridated water, according to the CDC.
[00:51:59.960 --> 00:52:09.480] And the CDC calls fluoridation one of the 10 great public health achievements of the 20th century because fluoridated water helps with oral health.
[00:52:09.480 --> 00:52:20.280] It reduced cavities and tooth decay by 60% in the Grand Rapids experiment in 1945, where the first efforts to fluoridate water began.
[00:52:20.280 --> 00:52:30.040] It was such a resounding win that other jurisdictions decided to do this because the evidence was overwhelming.
[00:52:30.360 --> 00:52:31.800] Yeah, and it saves money.
[00:52:31.800 --> 00:52:35.640] And then they say, well, yeah, with fluoridated toothpaste, you don't really need it these days.
[00:52:35.640 --> 00:52:38.760] But when they take it away, tooth decay goes up.
[00:52:38.760 --> 00:52:39.160] Right.
[00:52:39.160 --> 00:52:40.520] So clearly we do.
[00:52:41.000 --> 00:52:50.720] Because of course, when we democratize something like fluoride in the drinking water, everybody has access to it.
[00:52:50.720 --> 00:52:56.400] As opposed to requiring it to be viewed, I mean, I hate to say this because it shouldn't be a luxury item.
[00:52:56.400 --> 00:52:58.160] It should be a necessity item.
[00:52:58.160 --> 00:53:00.560] But for some people, it is a luxury item.
[00:53:00.560 --> 00:53:04.720] And for some people, they just don't have access to the oral hygiene that they need.
[00:53:04.720 --> 00:53:08.960] Maybe they're using fluoridated toothpaste, but they're not brushing their teeth multiple times a day.
[00:53:08.960 --> 00:53:14.800] Maybe they don't have an opportunity to go to the dentist and use the fluoride that they give to children at the dentist.
[00:53:14.800 --> 00:53:16.320] Well, this is what we're headed for.
[00:53:16.320 --> 00:53:18.160] It's really, really scary.
[00:53:18.160 --> 00:53:25.200] And that's just one of, you know, the ACA, fluoridated water, and vaccines, that's just scratching the surface.
[00:53:25.200 --> 00:53:28.960] There is so much pseudoscience that RFK has peddled.
[00:53:28.960 --> 00:53:39.680] And when you, again, look at the things that were covered by the panel, that's sort of like a good early taste of what we could be up against.
[00:53:39.680 --> 00:53:43.840] All of the wellness industry bullshit that they were peddling.
[00:53:43.840 --> 00:53:45.360] It really, really scares me.
[00:53:45.360 --> 00:53:48.080] Just so much pseudoscience and just outright lies.
[00:53:48.800 --> 00:53:49.360] Yeah.
[00:53:49.360 --> 00:53:49.680] All right.
[00:53:49.680 --> 00:53:50.720] Thanks, Kara.
[00:53:50.720 --> 00:53:56.240] All right, Bob, I understand the moon Miranda has water and maybe something else.
[00:53:56.480 --> 00:53:57.520] Perhaps.
[00:53:57.520 --> 00:54:04.960] We may, tentatively, perhaps, maybe have yet another icy moon in our solar system with a subsurface ocean.
[00:54:04.960 --> 00:54:06.160] And we know what that means, right?
[00:54:06.480 --> 00:54:10.240] My first thought is like, oh, maybe there's ketosynthetic life in there.
[00:54:10.560 --> 00:54:12.400] But that's definitely jumping the gun.
[00:54:12.400 --> 00:54:19.280] The moon, though, doesn't orbit Jupiter or Saturn, but the far more distant and less well-branded planet, Uranus.
[00:54:19.280 --> 00:54:26.320] How could such an interesting oceanic possibility be teased out of such a distant small object?
[00:54:26.560 --> 00:54:33.800] The study published in the Planetary Science Journal led by Tom Nordheim, planetary scientist at the Johns Hopkins Applied Physics Laboratory.
[00:54:34.040 --> 00:54:38.280] So this starts with Uranus, the second farthest planet at 19 AUs.
[00:54:38.280 --> 00:54:42.120] That's 93 million miles, the distance Earth to the Sun.
[00:54:42.120 --> 00:54:43.800] Jupiter's only five AUs.
[00:54:43.800 --> 00:54:46.200] So this is like a lot farther away.
[00:54:46.440 --> 00:54:48.760] This is the planet that rotates on its side.
[00:54:48.760 --> 00:54:55.640] You know that one giving it crazy seasons, like at the poles, it's 42 years of sunlight and then 42 years of darkness.
[00:54:55.640 --> 00:54:56.360] Wow.
[00:54:56.760 --> 00:54:58.040] Yeah, that's nasty.
[00:54:58.040 --> 00:55:02.520] The star of the study, so to speak, is Uranus's innermost moon, Miranda.
[00:55:02.520 --> 00:55:03.880] The moon is tiny.
[00:55:03.880 --> 00:55:06.520] It's got a surface area of Texas.
[00:55:06.520 --> 00:55:09.560] Texas is big, but it's small for a moon.
[00:55:09.560 --> 00:55:18.040] With a diameter of only 470 kilometers, it's one of the smallest observed objects in hydrostatic equilibrium in our solar system.
[00:55:18.200 --> 00:55:23.320] If it's in hydrostatic equilibrium, that means that it is round because of gravity.
[00:55:23.880 --> 00:55:26.440] Like the moons of Mars are not round.
[00:55:26.440 --> 00:55:27.320] Right, right.
[00:55:27.320 --> 00:55:29.800] Depending, of course, what it's made of, but they're rocky moons.
[00:55:29.800 --> 00:55:37.000] So how do we go about determining that there might be water under the ice on a moon that's 2.7 billion kilometers away?
[00:55:37.000 --> 00:55:38.120] Call the water company.
[00:55:38.440 --> 00:55:39.160] Yes.
[00:55:39.480 --> 00:55:45.720] Another option would be to, well, in this case, it's ironically about what's on the surface of the moon.
[00:55:45.720 --> 00:55:50.920] And we first got a look at Miranda from the pictures Voyager took way back in 86.
[00:55:50.920 --> 00:55:51.560] Wow.
[00:55:51.560 --> 00:55:54.920] It looks like a patchwork of different moons that got stitched together.
[00:55:54.920 --> 00:55:56.600] It's really kind of bizarre.
[00:55:56.600 --> 00:56:01.640] There's like these grooves or canyons that are 12 times deeper than the Grand Canyon.
[00:56:01.640 --> 00:56:03.320] And there's these huge cliffs.
[00:56:03.320 --> 00:56:08.760] And there's these weird trapezoidal shapes, geological shapes called Koronae.
[00:56:08.760 --> 00:56:15.200] They think it just might be dense, you know, metallic or rocky material from previous collisions with meteors.
[00:56:14.840 --> 00:56:18.480] Clearly, Miranda has a strange and complicated geological past.
[00:56:18.720 --> 00:56:23.280] So, to reconstruct that past, the researchers combined the old with the new.
[00:56:23.280 --> 00:56:33.520] They used the old Voyager pictures, and because that's really the best images that we have of Miranda, even though it was some 86, I mean, Voyager got pretty damn close.
[00:56:33.520 --> 00:56:38.000] And we incorporated those old pictures into modern modeling techniques.
[00:56:38.320 --> 00:56:44.160] Nordheim described it as squeezing the last bit of science we can from Voyager 2's images.
[00:56:44.160 --> 00:56:48.640] So, how does the surface of Miranda shed light on its interior?
[00:56:48.640 --> 00:56:58.000] They say in their paper, in this paper, we will attempt to constrain Miranda's interior structure from interpretation and modeling of surface stress patterns.
[00:56:58.000 --> 00:57:15.920] So, the researchers claim that by determining what caused those weird, deformed surface geological shapes and structures, they will be able to winnow the possibilities of what the interior of Miranda is like, based primarily on the surface.
[00:57:15.920 --> 00:57:27.120] Now, the model showed that, or concluded, that 100 to 500 million years ago, Miranda could have had a sub-ocean, a subsurface ocean more than 100 kilometers deep.
[00:57:27.120 --> 00:57:31.760] So, now, how do you actually create an ocean on a small moon so far from the sun?
[00:57:32.080 --> 00:57:37.040] The most likely culprit, they think, are orbital resonances with nearby moons.
[00:57:37.280 --> 00:57:38.720] These are really fascinating.
[00:57:38.720 --> 00:57:41.040] Resonances like this are like pushing a kid on a swing.
[00:57:41.040 --> 00:57:46.960] You know, the kid's swinging and you push at the right time, and the kid goes farther and farther and farther.
[00:57:46.960 --> 00:57:48.720] Orbital resonances are like that.
[00:57:48.720 --> 00:57:58.320] The tidal forces between the moons of Uranus can be amplified by these resonances to the point where the moons actually experience a change in their orbits.
[00:57:58.320 --> 00:58:01.880] And that's why Miranda's orbits inclined a bit, they think.
[00:57:59.840 --> 00:58:05.640] But it's not only the orbit that changes because of these resonances.
[00:58:06.280 --> 00:58:10.040] It can change the axis of rotation itself and the tilt of its axis.
[00:58:10.520 --> 00:58:16.760] So it just wreaks havoc with these moons, these resonances when they line up, when they're properly set up.
[00:58:16.760 --> 00:58:20.680] So this can also obviously wreak havoc on the moon's surface.
[00:58:20.680 --> 00:58:29.160] And they calculated that all of this resonance movement and stuff actually compressed one side of the moon and stretched the other side.
[00:58:29.160 --> 00:58:34.200] And that's probably one of the main reasons why we're seeing such a weird surface going on there.
[00:58:34.200 --> 00:58:40.280] Now, all of this also creates friction and heat in the interior, and that's what they think could have created the ocean there.
[00:58:40.280 --> 00:58:47.400] So it basically kind of almost boils down to tidal forces again, which is what we see in the moons of Jupiter as well.
[00:58:47.400 --> 00:58:53.480] We've got moons that some of the most volcanically active moons around Jupiter are because of the tidal forces.
[00:58:53.480 --> 00:58:56.520] They're just constantly kneading and compressing the interior.
[00:58:56.520 --> 00:59:02.200] So it's kind of related to what we're seeing here, although these resonances are a little bit different.
[00:59:02.200 --> 00:59:05.480] So how can that ocean still exist though after a half a billion years?
[00:59:05.480 --> 00:59:08.520] Because this happened, you know, 100 million to a half a billion years ago.
[00:59:08.760 --> 00:59:10.680] Why is this ocean still there?
[00:59:10.680 --> 00:59:18.840] So they say that one reason is that this tidal heating could persist because of this eccentric orbit that the moon is now in.
[00:59:18.840 --> 00:59:22.520] Now, they're not saying that the ocean is still 100 kilometers deep.
[00:59:22.600 --> 00:59:25.960] They say it's probably smaller, but it's probably still there.
[00:59:25.960 --> 00:59:37.800] And the other bit of evidence for that is that if it were frozen solid, if the moon had no subsurface ocean, they say that there would have been evidence of that on the surface, and they do not see that evidence.
[00:59:37.800 --> 00:59:39.880] So that's basically the gist of their argument.
[00:59:39.880 --> 00:59:45.000] In the future, they may be able to more definitively demonstrate that Uranus still has a subsurface ocean.
[00:59:45.280 --> 00:59:46.400] That would be very cool.
[00:59:46.400 --> 00:59:48.800] I mean, I don't think, I mean, this is so far away.
[00:59:48.800 --> 00:59:51.600] I mean, imagine three times farther away than Jupiter.
[00:59:51.600 --> 00:59:56.400] I mean, I don't think we're going to be getting there anytime in our lifetimes at all.
[00:59:57.040 --> 01:00:17.840] But maybe someday we will be able to say that this is one subsurface ocean that not only exists, and like I said at the beginning, perhaps if there's a subsurface ocean, there could potentially be some sort of life, some single-celled organisms, some microorganisms that are based on chemosynthetic rather than photosynthetic.
[01:00:17.840 --> 01:00:20.320] And that would be, of course, mind-boggling.
[01:00:20.320 --> 01:00:26.320] But actually, I'm hoping that we'll find that nearer by with Saturn and Jupiter and not have to go all the way to Uranus.
[01:00:26.320 --> 01:00:30.000] But one more thing to keep an eye on.
[01:00:30.000 --> 01:00:32.800] Yeah, but it's not going to be a probe there anytime soon.
[01:00:32.800 --> 01:00:39.360] No, but just the process they went through to model this and come to that conclusion was different and fascinating to me.
[01:00:39.360 --> 01:00:41.680] Yep, no probing of Uranus.
[01:00:41.920 --> 01:00:43.120] Got it.
[01:00:44.400 --> 01:00:46.960] All right, Evan, what's the Club 27 myth?
[01:00:46.960 --> 01:00:49.200] Have you heard of this before?
[01:00:49.520 --> 01:00:50.080] No.
[01:00:50.080 --> 01:00:59.120] No, I think maybe you have, but you don't know it as the Club 27 myth or what they call the 27 Club.
[01:00:59.120 --> 01:01:00.480] That's for short.
[01:01:00.480 --> 01:01:14.720] But this is a cultural phenomenon referring to a group of famous musicians initially and then artists and some other actors who got folded into this group who have all died at the age of 27.
[01:01:15.040 --> 01:01:15.920] Oh, okay.
[01:01:15.920 --> 01:01:16.320] Yeah.
[01:01:16.320 --> 01:01:16.640] Yeah.
[01:01:16.720 --> 01:01:17.920] The Kurt Coco Band.
[01:01:17.920 --> 01:01:19.200] Right, right, right.
[01:01:19.440 --> 01:01:21.280] But Kara, it didn't start there.
[01:01:21.280 --> 01:01:22.640] It's not a modern phenomenon.
[01:01:22.640 --> 01:01:22.960] Oh, right.
[01:01:22.960 --> 01:01:24.080] Janice Shoplin, too?
[01:01:24.480 --> 01:01:28.640] Yeah, this dates back to actually the first one was 1969.
[01:01:28.960 --> 01:01:30.600] That was when I was born.
[01:01:30.600 --> 01:01:35.400] And Rolling Stone's co-founder, Brian Jones, he dies at age 27.
[01:01:29.840 --> 01:01:36.040] He drowned.
[01:01:36.200 --> 01:01:39.640] It was tragic, and the rock world was stunned by that loss.
[01:01:39.640 --> 01:01:46.440] But then in 1970, Kara, Janice Joplin, also at age 27 died.
[01:01:46.440 --> 01:01:51.320] And Jimi Hendrix, age 27, dies in 1970.
[01:01:51.320 --> 01:01:54.280] So, all right, you got Brian Jones, 69.
[01:01:54.280 --> 01:01:58.040] Now, Janice and Jimmy are gone in 1970, all aged 27.
[01:01:58.040 --> 01:02:04.120] And then one year later, 1971, Jim Morrison, the lead singer of the Doors, he was 27.
[01:02:04.440 --> 01:02:06.120] Dies at age 27.
[01:02:06.200 --> 01:02:06.600] Nice.
[01:02:06.600 --> 01:02:07.160] In the club.
[01:02:07.320 --> 01:02:08.040] Holy crap.
[01:02:08.040 --> 01:02:11.800] What is going on with our music stars dying at age 27?
[01:02:11.800 --> 01:02:13.560] Is it some kind of curse?
[01:02:13.880 --> 01:02:17.480] But regardless, forever, there it was, cemented in our culture.
[01:02:17.480 --> 01:02:27.880] If you were a fan of any kind of music growing up in the 70s, even the early 80s, my guess is you have some kind of memory of discussions about the 27 club phenomenon.
[01:02:28.440 --> 01:02:33.240] And then what started happening is people started taking a peek back in time before the club was realized.
[01:02:33.240 --> 01:02:39.800] And you find out that legendary bluesman Robert Johnson, he was age 27 when he died.
[01:02:39.800 --> 01:02:42.600] So that, yeah, you add that big name to the club.
[01:02:42.600 --> 01:02:48.120] Going forward, like you said, Kara, Kurt Cobain from Nirvana in the 1990s.
[01:02:48.120 --> 01:02:54.360] Amy Weinhaus, also aged 27 in 2011, she passed away.
[01:02:54.360 --> 01:03:09.400] And so you wind up casting sort of this larger net because you wind, because what people will do is they'll start throwing actors and artists and other media people effectively into this more, and you get more relevant data points.
[01:03:09.400 --> 01:03:11.080] So, what do our brains do?
[01:03:11.080 --> 01:03:23.200] Well, we like to soak up these types of celebrity-related cultural phenomenon and accept them sort of as, I don't know, like a quasi-fact or fact-ish sort of thing that exists.
[01:03:23.200 --> 01:03:25.840] And maybe you think, wow, what are those odds?
[01:03:25.840 --> 01:03:31.040] All these amazing musicians and artists and celebrities dying at age 27.
[01:03:31.040 --> 01:03:32.480] What are those chances?
[01:03:32.480 --> 01:03:39.040] Are we really capable of knowing and understanding what those actual chances are with statistical significance?
[01:03:39.040 --> 01:03:40.160] Probably not.
[01:03:40.160 --> 01:03:42.000] But this has been studied before.
[01:03:42.000 --> 01:03:52.400] I went back to an article over at Vox from 2015, and there was research done that year by a professor of psychology and music at the University of Sydney.
[01:03:52.400 --> 01:03:55.520] Her name was Diana Theodora Kenney.
[01:03:55.520 --> 01:04:00.400] And the most common age of death musicians was not 27 in her study.
[01:04:00.400 --> 01:04:05.600] Do you want to guess what the age was most common death for these artists?
[01:04:05.600 --> 01:04:06.320] 68.
[01:04:06.320 --> 01:04:06.960] 69.
[01:04:07.280 --> 01:04:08.560] Bob's a little closer.
[01:04:08.880 --> 01:04:09.520] 67.
[01:04:10.480 --> 01:04:11.440] 56.
[01:04:11.760 --> 01:04:13.920] 56 was the age.
[01:04:13.920 --> 01:04:15.360] That's because musicians live hard.
[01:04:16.640 --> 01:04:22.880] Over 11,000 musicians were in that study, and she studied people who died between 1950 and 2010.
[01:04:22.880 --> 01:04:27.200] Only 1.3% of those musicians died at age 27.
[01:04:27.200 --> 01:04:31.360] 2.3% were the most at age 56.
[01:04:31.680 --> 01:04:39.040] So, yeah, and if she graphed it on a chart, it makes a very nice, you know, bell curve, statistically speaking.
[01:04:39.040 --> 01:04:46.800] But regardless of that and other studies, this continues to be a subject of revisiting and more studies, but in different ways.
[01:04:46.800 --> 01:04:49.520] And this is where we ran into the news item this week.
[01:04:49.520 --> 01:04:56.320] It's being covered by a lot of places, but Scientific American, her name is Rachel Newer, N-U-W-E-R.
[01:04:56.320 --> 01:05:02.840] She wrote an article about this titled The Myth that Musicians Die at 27 shows how superstitions are made.
[01:04:59.840 --> 01:05:03.000] Yeah.
[01:05:03.560 --> 01:05:12.840] And she's referring to a new study that appeared in the Proceedings of the National Academy of Sciences, P-N-A-S.
[01:05:14.280 --> 01:05:29.240] Titled, All right, bear with me here: Path Dependence, Stigmergy, and Mimetic Reification of the Formation of the 27 Club Myth with authors Zachary Dunovan and Patrick Kaminsky.
[01:05:29.240 --> 01:05:48.200] What they were trying to do here is basically get into how a legend or a myth that emerged out of random but strange series of events went on to have a real-world impact by shaping the legacies of other famous people who subsequently died at age 27.
[01:05:48.520 --> 01:05:56.920] In effect, they're saying, yeah, it's a myth, but there is maybe something going on here that is of some significance.
[01:05:56.920 --> 01:06:10.600] And what they did is they looked, they used Wikipedia and they looked at various languages, obviously, throughout the world, and they used an analysis of people who were born after 1900 and who died before 2015.
[01:06:10.600 --> 01:06:15.080] And they came up with over 344,000 Wikipedia pages.
[01:06:15.080 --> 01:06:20.440] But then they used page visits as their proxy for fame, right?
[01:06:20.440 --> 01:06:22.200] So this is based on that.
[01:06:22.200 --> 01:06:24.520] So they put that model together.
[01:06:24.520 --> 01:06:27.640] And do you know what happened when they looked at it?
[01:06:27.640 --> 01:06:32.160] And as far as looking at all the artists who died at age 27.
[01:06:32.720 --> 01:06:34.760] Was it a significantly different number?
[01:06:35.400 --> 01:06:36.200] Was it the same?
[01:06:36.200 --> 01:06:37.160] What do you guys think?
[01:06:37.160 --> 01:06:37.880] I don't know.
[01:06:38.200 --> 01:06:39.160] It was the same.
[01:06:39.160 --> 01:06:39.640] It was the same.
[01:06:40.280 --> 01:06:40.560] I know.
[01:06:40.560 --> 01:06:40.920] We didn't know.
[01:06:41.160 --> 01:06:41.880] It was the same.
[01:06:41.880 --> 01:06:43.400] That did not change.
[01:06:43.400 --> 01:07:03.120] But here's what they did find: they said among those in the 90th percentile of fame and higher, for those that did die at age 27, they experienced an extra boost of popularity in the form of more page visits to their Wikipedia page that could not be accounted for by other factors.
[01:07:03.120 --> 01:07:12.240] They say the effect was particularly pronounced for the most famous of the famous or individuals who roughly achieved the 99th percentile of fame.
[01:07:12.240 --> 01:07:23.840] And that bump, they say, indicates that people who die at age 27 are considerably more likely to be more famous than comparatively those who even die at just age 26 or 28.
[01:07:23.840 --> 01:07:25.200] So that's confirmation bias.
[01:07:25.200 --> 01:07:25.920] I suppose so.
[01:07:25.920 --> 01:07:26.640] That's what that is.
[01:07:26.640 --> 01:07:27.840] Yeah, 100%.
[01:07:27.840 --> 01:07:33.760] Yeah, so you have like people underestimate how many potential musicians there are, right?
[01:07:33.760 --> 01:07:35.280] Especially if you include celebrities.
[01:07:35.280 --> 01:07:37.280] It's thousands and thousands.
[01:07:37.280 --> 01:07:45.120] You could come up with the same kind of number of people who die at any age, and there's a bell curve, you know, that has nothing to do with the age 27.
[01:07:45.120 --> 01:07:53.360] That's just a, you know, people notice a pattern and then they include the data in subsequent formulations.
[01:07:53.360 --> 01:07:53.840] You know what I mean?
[01:07:54.160 --> 01:07:57.920] They carry that quirky observation forward.
[01:07:57.920 --> 01:08:05.120] And then, especially then you engage in confirmation bias, you're looking for data to fit the pattern and without looking at all the data.
[01:08:05.120 --> 01:08:06.000] It's just classic.
[01:08:06.000 --> 01:08:15.920] You know, this is why even if we see this same phenomenon happen, even like in the clinic, where you, by coincidence, see a couple of patients with some kind of a correlation.
[01:08:15.920 --> 01:08:17.440] And say, oh, maybe there's something here.
[01:08:17.440 --> 01:08:21.120] So you look to see if there's other cases and you find them.
[01:08:21.120 --> 01:08:23.360] And you include the original observations in the data.
[01:08:23.360 --> 01:08:25.840] And you have a case series that makes it seem like something's happening.
[01:08:25.840 --> 01:08:28.880] It's all just random quirkiness that has nothing to do with anything.
[01:08:28.880 --> 01:08:34.840] You need an independent, thorough evaluation with new data to see if this holds up.
[01:08:34.840 --> 01:08:35.880] And of course, it doesn't.
[01:08:29.760 --> 01:08:37.240] It's just random nonsense.
[01:08:37.560 --> 01:08:38.360] Correct.
[01:08:38.360 --> 01:08:38.760] All right.
[01:08:38.760 --> 01:08:39.480] Thanks, Seven.
[01:08:39.480 --> 01:08:41.560] Jay, who's that noisy time?
[01:08:41.560 --> 01:08:44.600] All right, guys, going back at least two episodes.
[01:08:44.600 --> 01:08:46.920] Here's the noisy that I played.
[01:09:05.240 --> 01:09:06.600] You guys have any guesses?
[01:09:06.600 --> 01:09:10.440] Yeah, it's Donald Duck maneuvering inside of a tank.
[01:09:10.760 --> 01:09:11.880] That's pretty good.
[01:09:11.880 --> 01:09:12.760] Okay.
[01:09:13.400 --> 01:09:20.200] I got so many good, good guesses, meaning ones that I thought, you know, just were worthy of that sound.
[01:09:20.200 --> 01:09:28.680] The first one was from Joe Vandenenden, and he says, Is that the walker bringing the spacecraft carrier, the Europa Clipper, back out of the launch pad?
[01:09:29.000 --> 01:09:29.960] Very cool guess.
[01:09:29.960 --> 01:09:31.000] I mean, yeah, I could see that.
[01:09:31.000 --> 01:09:33.480] I hear what you're saying there, and I think that's a great guess.
[01:09:33.480 --> 01:09:37.240] You're not correct, and I would like to actually hear that sound if you ever can get it.
[01:09:37.240 --> 01:09:47.560] Another listener named Matthew Morrison said, Hi, Jay, my daughter Niamh, and I think it is a ship moving through the water where there is a layer of ice on top that is breaking as the ship moves through it.
[01:09:47.560 --> 01:09:53.160] Another fantastic guess because there are water-like sounds going on in that clip.
[01:09:53.160 --> 01:09:55.160] So, you know, I think that was a pretty cool guess.
[01:09:55.160 --> 01:09:57.880] You are incorrect, and tell Niamh no big deal.
[01:09:57.880 --> 01:09:58.840] Everybody tries.
[01:09:58.840 --> 01:10:00.120] It's great to try.
[01:10:00.120 --> 01:10:01.880] Sometimes we actually win, right?
[01:10:01.880 --> 01:10:02.920] So keep trying.
[01:10:02.920 --> 01:10:07.560] Next one is Matt Soskins, and he said, Jay, great to meet you at Saikon.
[01:10:07.560 --> 01:10:14.960] Before I give you my guess, I want to tell you about my grandmother's brownie recipe and how it led to me, led me to know this noisy.
[01:10:14.960 --> 01:10:19.680] And then he's, of course, he's kidding because I asked people, you know, please don't write me these big stories.
[01:10:14.680 --> 01:10:20.880] It just cuts the chase.
[01:10:22.400 --> 01:10:26.240] And then he said, water-powered organ or a bird.
[01:10:26.240 --> 01:10:35.920] Now, whenever anybody says or a bird, I just ignore that because, of course, I shouldn't even be taking guesses from people with more than one guess, but it was a joke.
[01:10:35.920 --> 01:10:36.480] I get it.
[01:10:36.480 --> 01:10:37.840] Water-powered organ.
[01:10:37.840 --> 01:10:44.720] It is not a water-powered organ, but again, there is, you know, there is a water kind of noise in there, and I can see where he's coming from.
[01:10:44.720 --> 01:10:46.880] Next one is from Ben Simon.
[01:10:46.880 --> 01:10:54.640] He said, This week's noisy puts in mind movie scenes of nervous naval officers quietly glancing at each other during a tense submarine dive.
[01:10:54.640 --> 01:11:00.640] So I'm going to say this is a recording of a record-breaking deep dive by a research submersible.
[01:11:00.640 --> 01:11:02.000] Another awesome guess.
[01:11:02.000 --> 01:11:04.960] It's not correct, but damn, that's an awesome guess.
[01:11:04.960 --> 01:11:07.760] But there was no winner this week, and that's perfectly okay.
[01:11:07.760 --> 01:11:10.240] Like I said, you know, we try, sometimes we fail.
[01:11:10.240 --> 01:11:11.280] What did Yoda say?
[01:11:11.280 --> 01:11:12.720] Try or do not.
[01:11:12.720 --> 01:11:13.600] There is no fail, right?
[01:11:13.600 --> 01:11:14.720] Remember, do or do not.
[01:11:14.720 --> 01:11:15.680] There is no try.
[01:11:15.680 --> 01:11:16.640] Right, that's it.
[01:11:16.960 --> 01:11:20.080] Well, that does not apply to who's that noisy.
[01:11:20.960 --> 01:11:36.080] So I want to thank everybody for being honest because since nobody won, that means I'm 100% sure that everybody that listens to this show that sent in a guess respected my request to not write in if you knew the answer because I knew a lot of people knew the answer out there.
[01:11:36.080 --> 01:11:36.960] So, you know what?
[01:11:36.960 --> 01:11:38.080] Props to you guys.
[01:11:38.400 --> 01:11:43.200] More reason to go to Nauticon because we have high-quality people that listen to this show.
[01:11:43.200 --> 01:11:45.120] So, guys, what is that sound?
[01:11:45.120 --> 01:11:52.000] I was really excited to hear it after I knew what it was, and then I listened to it, and it ended up being as cool as I hoped it should be.
[01:11:52.000 --> 01:11:59.200] Okay, this is the sound of molten metals swirling in the Earth's core as its magnetic field flips.
[01:11:59.280 --> 01:12:00.000] Can't record that.
[01:12:00.360 --> 01:12:02.120] The tape recorder would melt.
[01:12:02.120 --> 01:12:03.560] Yeah, that's what I thought.
[01:12:03.560 --> 01:12:07.560] No, but that, guys, that is the internal swirling sound of the Earth's core.
[01:12:07.560 --> 01:12:11.800] And it made me think of something really interesting, and luckily, it's not the case.
[01:12:11.800 --> 01:12:14.520] Imagine if we constantly heard that noise.
[01:12:14.520 --> 01:12:17.640] But that noise is, you know, think about how loud that noise must be.
[01:12:17.640 --> 01:12:21.000] It just can't penetrate all the rock and regolith and everything.
[01:12:21.160 --> 01:12:23.400] I just thought that was a wicked cool noise.
[01:12:23.400 --> 01:12:25.240] Let's listen to it again.
[01:12:34.440 --> 01:12:35.160] It's loud.
[01:12:35.160 --> 01:12:37.000] I still hear Donald Duck in there a little bit.
[01:12:37.080 --> 01:12:38.920] There's a boat creek noise in there.
[01:12:39.240 --> 01:12:41.320] So I get all those guesses.
[01:12:41.320 --> 01:12:43.720] Jay, you mentioned as the magnetic field flips.
[01:12:44.360 --> 01:12:45.320] What does that mean?
[01:12:45.320 --> 01:12:48.760] I guess they were recording it in anticipation of the flip.
[01:12:48.760 --> 01:12:52.760] I don't think it has anything to do with the flip, but that's why they were recording.
[01:12:53.080 --> 01:12:58.760] And before I move on to the new noisy for this week, I have a response to something that we talked about.
[01:12:58.760 --> 01:13:07.800] You know, Kara was mentioning that, let me see, back in episode 1007, we heard the voice of both Helen Keller and her interpreter.
[01:13:07.800 --> 01:13:17.960] Kara commented on how strange it is to hear the accent that was used by the interpreter whenever recordings from that time period are played because nobody speaks that way anymore.
[01:13:17.960 --> 01:13:22.600] And then Kara speculated that that's probably just how people spoke back then, but it sounds affected.
[01:13:22.600 --> 01:13:25.160] So a couple of people wrote in about this.
[01:13:25.400 --> 01:13:28.760] This particular email is from someone named Stephen Hopkins.
[01:13:28.760 --> 01:13:34.680] And he says, it's app it absolutely was affected as it was manufactured.
[01:13:34.680 --> 01:13:37.720] It was a manufactured accent, which did not evolve organically.
[01:13:37.720 --> 01:13:42.040] It's called mid-Atlantic accent or the transatlantic accent.
[01:13:42.040 --> 01:13:57.040] And it was a basically, it was a put-on accent by Northeastern American upper class in the early 20th century, and it was adopted by many broadcasters and actors of that time period because it made them sound cultured and because they felt it helped their voice come through more clearly.
[01:13:57.040 --> 01:13:59.280] Okay, so the voice is BS, right?
[01:13:59.920 --> 01:14:00.960] That's an affectation.
[01:14:00.960 --> 01:14:02.160] Well, we've spoken about this.
[01:14:02.160 --> 01:14:04.800] We absolutely have the mid-Atlantic accent.
[01:14:05.120 --> 01:14:08.800] Yeah, what's interesting about it is that it's not a regional accent.
[01:14:08.800 --> 01:14:10.640] Like, it doesn't exist anywhere.
[01:14:10.640 --> 01:14:13.040] It is a learned accent.
[01:14:13.040 --> 01:14:15.760] It's taught like in finishing school or whatever.
[01:14:15.760 --> 01:14:28.080] And it is essentially like an Eastern American accent with some British affectations and mixed in, like British sounds mixed in.
[01:14:28.080 --> 01:14:36.560] And so, and it is supposed to be a very like you enunciate things very clearly, and you, you know, but it's also got this like high and mighty sound to it.
[01:14:36.560 --> 01:14:38.240] Yes, right, exactly.
[01:14:38.240 --> 01:14:45.680] But the thing is, so, and so that person might not have been speaking that way because she was being recorded.
[01:14:45.680 --> 01:14:51.040] She might have been speaking that way because she was educated at one of the schools who taught her to speak that way.
[01:14:51.040 --> 01:14:54.560] And maybe she did lay it on a little bit thick because she was being recorded.
[01:14:54.560 --> 01:15:00.960] We don't know because everyone has their own individual manifestation of like what their mid-Atlantic accent is.
[01:15:00.960 --> 01:15:18.880] But that doesn't mean there isn't also temporal accents because if you watch documentaries of like I watched a lot of World War II documentaries and you watch people in the 40s being interviewed on film and they're not actors and they're not, you know, they're not upper class affectations.
[01:15:18.880 --> 01:15:22.080] They speak in an accent that doesn't exist today.
[01:15:22.080 --> 01:15:23.800] You know, it's like the that's right.
[01:15:23.800 --> 01:15:24.120] Yeah.
[01:15:26.520 --> 01:15:27.520] It's not that exactly.
[01:15:27.520 --> 01:15:29.000] It's not like that's kind of a stage accent.
[01:15:29.440 --> 01:15:40.920] But that's what that's when you that's when you hear like what real people sound like at the time because they're not actors, they're just people in the war, you know, or whatever.
[01:15:40.920 --> 01:15:43.480] So there is absolutely temporal accents as well.
[01:15:43.480 --> 01:15:50.520] But I do agree that that person's accent was probably a learned, you know, transatlantic or middle agent.
[01:15:50.600 --> 01:15:51.640] I really don't like it.
[01:15:51.640 --> 01:15:53.800] It sounds so put on.
[01:15:53.800 --> 01:15:55.240] You know, it's snobby.
[01:15:55.800 --> 01:15:56.600] Yeah, absolutely.
[01:15:57.720 --> 01:15:58.600] It was snobby.
[01:15:58.600 --> 01:15:59.800] It was literally snobby.
[01:15:59.800 --> 01:16:01.240] I mean, that was kind of that.
[01:16:01.240 --> 01:16:03.400] And it just sort of faded away after World War II.
[01:16:03.400 --> 01:16:03.800] All right.
[01:16:03.800 --> 01:16:06.760] So good job, everyone, that sent in all those great guesses.
[01:16:07.240 --> 01:16:09.080] I have a new noisy for you this week.
[01:16:09.080 --> 01:16:10.360] Check this out.
[01:16:24.280 --> 01:16:25.000] There you go.
[01:16:25.000 --> 01:16:29.080] That's ready whip whipped cream being dispensed.
[01:16:29.960 --> 01:16:30.760] Yeah, we know that.
[01:16:31.000 --> 01:16:34.280] It's like 11 o'clock at night, and you get it in the refrigerator.
[01:16:35.240 --> 01:16:35.800] Yeah.
[01:16:36.120 --> 01:16:37.640] That's straight into the mouth.
[01:16:38.600 --> 01:16:41.240] That's a right of goddamn passage in the United States.
[01:16:41.240 --> 01:16:43.400] Oh, we should all take one tonight.
[01:16:43.400 --> 01:16:51.720] So, okay, guys, if you know this week's noisy, or if you heard something really cool, email me at wtn at the skepticsguy.org.
[01:16:51.720 --> 01:16:56.040] All right, so let's go on with that interview with Brian Cox and special guest Brian Wecht.
[01:16:56.040 --> 01:17:02.520] And for those premium patron members, you get to listen to the full uncut version of that interview.
[01:17:02.520 --> 01:17:03.960] That'll be up this weekend.
[01:17:15.680 --> 01:17:18.000] Well, joining us now is Brian Cox.
[01:17:18.000 --> 01:17:19.760] Brian, welcome to the Skeptical Sky to the Universe.
[01:17:19.760 --> 01:17:20.720] Ah, pleasure.
[01:17:20.720 --> 01:17:21.760] Pleasure to be here.
[01:17:21.760 --> 01:17:24.720] You are one of the people that we've been hoping to get an interview with for a very long time.
[01:17:24.720 --> 01:17:27.440] You're obviously one of the superstar science communicators in the world.
[01:17:27.440 --> 01:17:29.440] We really appreciate what you do.
[01:17:29.760 --> 01:17:36.480] So we were talking about, you know, making the transition from being basically an academic, a scientist, to being a science communicator.
[01:17:36.640 --> 01:17:37.760] How has that worked out for you?
[01:17:37.760 --> 01:17:39.600] How do you feel about that?
[01:17:39.600 --> 01:17:42.000] Well, I mean, the first thing to say was an accident.
[01:17:42.000 --> 01:17:43.840] So I didn't really plan it.
[01:17:44.240 --> 01:17:54.960] In fact, in my early career as a PhD student and then postdoc, all I tried to do was get a research fellowship so no one would bother me and then just do research.
[01:17:54.960 --> 01:17:56.480] And I didn't even want to teach.
[01:17:56.720 --> 01:18:01.440] I just wanted to avoid everything apart from doing particle physics.
[01:18:01.440 --> 01:18:12.560] And then I got involved in one of the sort of funding crises that happen every now and again in all sorts of countries, I think, in the UK.
[01:18:12.880 --> 01:18:17.440] So I began to get involved in arguing for more funding for research.
[01:18:17.440 --> 01:18:21.840] And that brought me into contact, I suppose, with the media and the press.
[01:18:21.840 --> 01:18:24.000] And so it was an accident, really.
[01:18:24.240 --> 01:18:31.520] And then the BBC in the UK interviewed me a few times and then said, why don't you make a little documentary on the radio about particle physics?
[01:18:31.520 --> 01:18:35.280] And then why don't you make a little TV show, a little low-budget thing about particle physics?
[01:18:35.280 --> 01:18:38.560] And so it happened by accident.
[01:18:38.560 --> 01:18:41.200] And now, I mean, I love to teach.
[01:18:41.200 --> 01:18:43.840] So now I choose to teach at the University of Manchester.
[01:18:43.840 --> 01:18:46.480] I teach first years, quantum mechanics and relativity, actually.
[01:18:46.960 --> 01:18:52.720] And I obviously, as you've said, I get involved in making television programmes and so on.
[01:18:52.720 --> 01:18:57.440] So it was something that I came to later in my career.
[01:18:57.440 --> 01:19:05.880] But I very strongly believe that it's an important part of an academic career if you choose to do it.
[01:19:06.200 --> 01:19:08.760] And in fact, I was at the University of Manchester last week.
[01:19:08.760 --> 01:19:12.920] We had a bigger sort of worldwide universities conference there.
[01:19:12.920 --> 01:19:34.600] And I spoke at that and said that I think it's extremely important that if academics want to engage in whatever capacity, it doesn't have to be making television programmes, but just in speaking about climate science, as we've spoken about today, for example, then that should be seen as not only positive, but it should be part of an academic career if the academic chooses for it.
[01:19:34.600 --> 01:19:36.840] So promotion case and so on.
[01:19:36.840 --> 01:19:41.800] And so I've come to believe that it's extremely important, of course, now to engage.
[01:19:41.800 --> 01:19:45.240] And we can talk about why, all the reasons why I think that's the case.
[01:19:45.240 --> 01:19:48.920] Yeah, obviously I completely agree with you, also being an academic myself.
[01:19:49.080 --> 01:19:54.520] I'm always curious, asking my fellow sort of academic science communicators, how's that going for you?
[01:19:54.920 --> 01:20:02.760] And how specifically, how does the university, do they agree with you that this should be part of an academic career and that you get credit for it for promotion?
[01:20:02.760 --> 01:20:04.200] And I know there's a little bit of a divide.
[01:20:04.200 --> 01:20:06.840] I think it's worse in the US than in the UK.
[01:20:07.560 --> 01:20:09.240] So what's your experience been?
[01:20:09.240 --> 01:20:23.320] Yeah, at the University of Manchester, actually, so we have the promotion, they call it the legs of the promotion case, and they are research, teaching, administration, and we call it social responsibility.
[01:20:24.680 --> 01:20:30.600] So it's a quarter, basically, of the case it can be, which is public engagement, as we might call it.
[01:20:30.840 --> 01:20:37.320] But I think it depends very strongly on your vice-chancellor, the head of the president of the university.
[01:20:37.320 --> 01:20:51.440] But also, and I spoke about this as well, this conference last week, it can often be that I do find that the people right at the top are very they understand, as I think we all understand, that it's vitally important to communicate science.
[01:20:51.760 --> 01:21:04.880] Of course, in a democracy, as Carl Sagan said, the idea that if you have a population that has no contact with the way that we acquire reliable knowledge about the world, then the decisions that democracy makes will be flawed.
[01:21:06.400 --> 01:21:08.720] And I think people at the top know that.
[01:21:08.960 --> 01:21:18.160] You can have problems in universities with the kind of middle management, the heads of department level, and things like that, you know, because of the funding streams and so on.
[01:21:18.160 --> 01:21:24.880] So, I think that's where in the UK, if there's going to be a problem, it will be with your kind of line manager.
[01:21:24.880 --> 01:21:28.400] It won't be with the people at the top who understand the wider picture.
[01:21:28.400 --> 01:21:35.920] Brian, I talk a lot about particle physics on the show, and I want to get your sense of how you feel about your competence in the future.
[01:21:35.920 --> 01:21:43.200] You know, you've got the LHC who's they've scaled up now to what, 14 tera electron volts, and the Higgs boson is still like the biggest thing that they've done.
[01:21:43.440 --> 01:21:51.040] They've made a lot of discoveries, but how is your optimism in the future for being able to reveal some new physics beyond the standard model?
[01:21:51.200 --> 01:22:03.920] Do you think we'll ever, in a reasonable amount of time, get to a regime where we can discover new physics, or is it probably forever beyond technology that we could build and finance to discover?
[01:22:04.000 --> 01:22:06.720] Is it just too far beyond us for a long time?
[01:22:06.720 --> 01:22:11.920] It's a very good question, and the answer is for the first time in the history of particle physics, we don't know.
[01:22:12.560 --> 01:22:13.360] It's a little scary, right?
[01:22:13.360 --> 01:22:14.240] It's a little scary.
[01:22:14.240 --> 01:22:19.440] I mean, so the LHC, I should say, it was, why was it built at that energy?
[01:22:19.440 --> 01:22:29.040] It was built at that energy because we knew that the standard model without a light Higgs broke down, mathematically speaking, at those energies.
[01:22:29.040 --> 01:22:36.280] So, what that means in reality is you either discover a Higgs boson of some kind or some other mechanism.
[01:22:37.000 --> 01:22:43.640] In fact, I worked on physics without a light Higgs at the LHC before we turned the LHC on.
[01:22:43.640 --> 01:22:47.160] So, signatures, the model breaks down.
[01:22:47.800 --> 01:22:52.760] One of the places it breaks down most obviously is in the scattering of W bosons, for example.
[01:22:52.760 --> 01:22:56.600] So, you can bang W bosons together, WW scattering, it's called.
[01:22:57.000 --> 01:23:08.280] And if you calculate that process without a Higgs boson in the theory, then it gives you nonsense, basically, energies of actually 1.4 TV, right?
[01:23:08.280 --> 01:23:11.560] So, well within the scope of the LHC.
[01:23:11.880 --> 01:23:19.080] So, that's why you could, with absolute confidence, build that machine, because you knew you were going to discover something.
[01:23:19.080 --> 01:23:28.760] It's also true to say many of my colleagues who've been in parcel physics longer than I have would say, but it's also true that most of the machines that we built discovered things they weren't built to discover.
[01:23:31.320 --> 01:23:38.200] But at least you knew there was an energy threshold which was within the scope of that machine, that you'd see something.
[01:23:38.200 --> 01:23:40.280] Right, and justify the billions of dollars spent, isn't it?
[01:23:40.440 --> 01:23:48.680] Yeah, now you're absolutely right now that at the moment with LHC, it's exciting that we're in a precision physics regime.
[01:23:48.680 --> 01:23:51.560] So, we're still looking for, obviously, new particles.
[01:23:51.800 --> 01:23:55.000] Most people, I think, would have put money on supersymmetry.
[01:23:55.000 --> 01:23:55.640] So, super symmetric.
[01:23:56.120 --> 01:23:58.360] That was my field when I was in the super string theory.
[01:23:58.360 --> 01:23:59.400] It comes out of string theory.
[01:24:00.760 --> 01:24:03.560] I'm sure that nature is supersymmetric at some point.
[01:24:04.280 --> 01:24:05.880] It seems very plausible, right?
[01:24:06.120 --> 01:24:10.960] And also plausible that the LHC might have seen some LHC.
[01:24:11.680 --> 01:24:12.280] Nothing.
[01:24:12.760 --> 01:24:15.600] So, you're right that what do you do?
[01:24:15.600 --> 01:24:24.480] I mean, it's true that particle physics goes in phases where you then go into a precision measurement phase, like the accelerator before it, LEP, in the same tunnel.
[01:24:24.880 --> 01:24:31.600] You know, we were making high-precision measurements on the W and Z bosons, and that was necessary information.
[01:24:31.600 --> 01:24:35.200] So you kind of go to a Higgs factory type model, for example.
[01:24:35.200 --> 01:24:45.200] If there's nothing else, you start making high-precision measurements on the Higgs, which is what we're doing with the LHC at the moment with the upgrades while still looking for new signatures.
[01:24:45.440 --> 01:24:47.600] So we know that it's not complete, right?
[01:24:47.680 --> 01:24:50.880] We know that we don't know where the energy scale is.
[01:24:50.880 --> 01:24:51.360] Yeah, right.
[01:24:51.680 --> 01:24:53.200] So we don't know how big to make the next one.
[01:24:53.200 --> 01:24:54.480] Is that basically the answer?
[01:24:54.480 --> 01:24:57.040] Yeah, but it's not just big, it's the type.
[01:24:57.040 --> 01:24:59.040] So you could do a muon collider or something like that.
[01:24:59.360 --> 01:25:02.160] If you're going to do a perionic collider, then that looks pretty good.
[01:25:02.320 --> 01:25:04.400] You'd have to just scale it up, but you can do lots of other stuff.
[01:25:06.080 --> 01:25:17.680] But then again, having said all that, I'm a strong supporter of the big machine, whatever it's currently called, that CERN wants to build the super LHC, which is a 100-kilometer tunnel, I think.
[01:25:18.800 --> 01:25:19.280] Amazing.
[01:25:20.160 --> 01:25:30.560] Because what you do find is that, and I saw it firsthand with LHC, is that there aren't many people who know how to build accelerators on that scale.
[01:25:30.560 --> 01:25:31.920] And they're really difficult.
[01:25:31.920 --> 01:25:36.400] And you can forget, you can lose the expertise, and it's hard.
[01:25:36.400 --> 01:25:43.040] And actually, a lot of the people who worked on the LHC were towards the end of their careers, they're highly experienced people.
[01:25:43.040 --> 01:25:48.000] And so I think there's a very strong argument that it's not a lot of money, actually.
[01:25:48.000 --> 01:25:51.120] When you look at these are decadal projects.
[01:25:51.920 --> 01:25:57.760] We're talking about the machine for 20, 30, 40, 50 years in the future.
[01:25:57.760 --> 01:26:11.480] And so, at the level of a billion dollars a year or something in total, which CERN is, it looks expensive, CERN, but actually, its budget is less than my university, the University of Manchester.
[01:26:11.480 --> 01:26:18.520] So, its yearly budget out of which it builds the machines is of a medium-sized university.
[01:26:18.520 --> 01:26:22.600] It's a lower budget than Harvard and Princeton, those universities, right?
[01:26:23.000 --> 01:26:38.920] So, I think at that level, the idea that the world has this capability to build these machines and builds one of them, and it takes decades to build them, and then you operate it and do good physics with it for 50 years is compelling to me.
[01:26:39.080 --> 01:26:43.640] And, you know, I put money on there being interesting stuff.
[01:26:43.960 --> 01:26:45.880] It's not you know, it's getting theoretical physics.
[01:26:47.240 --> 01:26:48.520] You get into problems.
[01:26:48.520 --> 01:26:51.240] If you don't see it, it's getting narrower and narrower and narrower.
[01:26:53.720 --> 01:27:02.040] But having said all that, as I said earlier, it is the case that we couldn't guarantee it, knowing what we know at the moment.
[01:27:02.360 --> 01:27:06.360] It does seem like, I mean, supersymmetry, the constraints are getting tighter and tighter for that, right?
[01:27:06.360 --> 01:27:07.640] So, who knows what's going to happen with that?
[01:27:07.640 --> 01:27:19.400] But then you have something which is possibly related, although a priori not necessarily, dark matter, where, right, that seems like a much more plausible discovery to me at some point in the nearer future.
[01:27:19.640 --> 01:27:21.880] Sure, and the standard model doesn't say anything about that.
[01:27:21.880 --> 01:27:23.320] So, we need to go beyond.
[01:27:23.320 --> 01:27:25.080] We need to go beyond the standard model.
[01:27:25.080 --> 01:27:25.480] That's right.
[01:27:25.480 --> 01:27:27.000] And is that supersymmetry or something else?
[01:27:27.320 --> 01:27:33.160] I saw this very, there's a very cool work that I heard of the other day from string theory.
[01:27:33.160 --> 01:27:42.080] So, maybe you know it, where you're looking at the parameter space of string theory and looking at this landscape of possibilities.
[01:27:42.080 --> 01:28:00.240] And then saying, I think it's true to say that if you go, if you take the cosmological constant, which is what is it, 10 to the minus 122 ridiculously tiny number, and you use that as the parameter that you don't know why that's the case, but you've used that.
[01:28:00.800 --> 01:28:25.360] I think there's some theories now that are suggesting that you could link that to dark matter in the sense that you get, it can be sort of slight, large-ish extra dimensions about the micron scale that are implied by that low value of the cosmological constant, which, and then the gravitons, that the tower of excited states of gravitons seem to have the right properties to be dark matter.
[01:28:25.360 --> 01:28:25.920] So it was.
[01:28:26.000 --> 01:28:27.680] Oh, the decline states from that.
[01:28:28.160 --> 01:28:28.720] Yeah, I heard it.
[01:28:28.720 --> 01:28:31.760] It's actually, it's on a collaborator of mine.
[01:28:31.760 --> 01:28:32.880] Right, so it's his work.
[01:28:33.280 --> 01:28:33.840] I don't know this work.
[01:28:34.560 --> 01:28:47.040] Yeah, so swampland is kind of the there's a while since I thought about this, so I might be getting this wrong, but there's a question of is everything that's possible out there described by some theoretical model or not?
[01:28:47.360 --> 01:28:50.960] So is it how complete or, you know, is your theoretical framework?
[01:28:50.960 --> 01:28:58.960] And he coined this term swampland to ask what can we actually, you know, what's allowed in terms of the parameter space versus the actual theory.
[01:28:59.280 --> 01:29:02.080] Yeah, so it was, it was, I think there's a review paper.
[01:29:02.080 --> 01:29:02.800] I haven't read it yet.
[01:29:02.800 --> 01:29:04.320] I was only made aware of it the other week.
[01:29:04.320 --> 01:29:06.720] So I'm on the plane back tonight.
[01:29:06.960 --> 01:29:07.120] Okay.
[01:29:08.400 --> 01:29:09.600] But it looks really fascinating.
[01:29:09.600 --> 01:29:13.920] So it's just an example of where there's theoretical progress.
[01:29:14.640 --> 01:29:22.120] And string theory is a good example because I get asked a lot, you know, that people tend to think, oh, it kind of went away, it kind of failed, doesn't it?
[01:29:22.080 --> 01:29:22.800] But it's not.
[01:29:22.800 --> 01:29:25.600] It's a tremendous amount of progress.
[01:29:25.840 --> 01:29:38.920] And with holography as well, which is coming in there and ADSCRT and links to the tiny bit of research that I do into black holes, then it does seem that we're on the verge of, I think, a really exciting transformation.
[01:29:39.560 --> 01:29:40.040] Fundamentally.
[01:29:40.200 --> 01:29:58.040] The way I have always explained it to people is I think the original, so when string theory was first, you know, kind of out there in the mid-80s, some very optimistic people said, you know, we're going to be, we're going to have the, you know, electron mass from first principles in 10 years, which was just completely not true.
[01:29:58.040 --> 01:30:03.160] Because what they wanted is write down the theory, we get to a four-dimensional universe with the standard model, and that's it.
[01:30:03.160 --> 01:30:03.880] That's not true.
[01:30:03.880 --> 01:30:04.520] That didn't happen.
[01:30:04.520 --> 01:30:05.880] I think it's never going to happen.
[01:30:05.880 --> 01:30:12.680] But what string theory does provide is this tremendous toolbox that you can use to understand hard problems, like with holography.
[01:30:12.680 --> 01:30:20.920] And people are using this all the time to study amazing things that we didn't have access to before from a variety of standpoints.
[01:30:20.920 --> 01:30:25.800] Is it a shut up and calculate kind of moment where it's like it works, don't worry about how it relates to reality?
[01:30:26.760 --> 01:30:28.680] I think that's a valid philosophy.
[01:30:29.480 --> 01:30:32.120] What can we calculate with this that we couldn't calculate before?
[01:30:32.200 --> 01:30:34.600] It's also valid to ask, and what does this mean for the real world?
[01:30:35.000 --> 01:30:38.200] Can you use these techniques to actually calculate anything useful?
[01:30:38.200 --> 01:30:39.800] That is an open question right now.
[01:30:39.800 --> 01:30:45.480] There are some people who are just really digging deep and trying to get the standard model out of string theory.
[01:30:45.480 --> 01:30:50.760] Even just getting the standard model with the right particles and masses and interactions, that's very, very hard to do.
[01:30:50.760 --> 01:30:54.760] Guys, if we could just magically have these answers appear in front of us, right?
[01:30:55.240 --> 01:30:57.080] What's the practicality behind it?
[01:30:57.080 --> 01:31:04.520] Is the goal here to just understand how the world works, or are there actual applications that people like me could relate to?
[01:31:04.840 --> 01:31:06.200] Look what flowed from quantum mechanics.
[01:31:06.200 --> 01:31:08.040] I mean, the whole industry is a very good idea.
[01:31:08.120 --> 01:31:09.000] I'm not talking to you.
[01:31:09.000 --> 01:31:09.800] I'm not talking to you.
[01:31:12.280 --> 01:31:28.160] So, one of the fascinating areas, which I've been involved in a little bit, so I work with, I have a co-supervisor PhD student at Manchester, who happens to be funded, by the way, by an information technology company that is working on black holes, quantum information.
[01:31:29.120 --> 01:31:44.000] There's a direct link between at least the techniques that have been developed to try to understand things like the black hole information paradox and the techniques you use to build error correction codes, quantum computers, you know, to try to protect the memory from errors and so on.
[01:31:44.560 --> 01:31:58.320] So, if you'd have said, and I say this to funding agencies when I speak to them, if I'd have said to you, fund research into collapsed stars because that will help you build quantum computers and understand them, then they would have just laughed at you.
[01:31:59.440 --> 01:32:02.720] But it turns out that the skill sets, at least, are the same.
[01:32:02.720 --> 01:32:26.960] And it could, I mean, basically, you know, it's not only that, it is that this field of emergent space-time, which is very popular at the moment, and where essentially what you have, the simple way to say it is you have space-time emerging from a quantum theory, from quantum entanglement of some objects, which are probably at the scale from the Bekenstein entropy, right?
[01:32:26.960 --> 01:32:31.600] The scale that you tile the event horizon to work out the entropy of a black hole.
[01:32:31.600 --> 01:32:34.400] It's probably, we probably know the distance scales, actually.
[01:32:35.040 --> 01:32:36.960] It's probably string, it's a string scale.
[01:32:38.320 --> 01:32:49.200] But so, the idea that space-time is emerging from entanglement, that's becoming an experimental science now.
[01:32:49.360 --> 01:32:50.400] I find it interesting.
[01:32:50.400 --> 01:33:02.760] I find it fun that the use, probably the best use from a physicist perspective of quantum computers, like the Google quantum computer and Microsoft, is not actually as a quantum computer, but as a load of qubits.
[01:32:59.760 --> 01:33:05.320] Because they're really good arrays of qubits.
[01:33:05.640 --> 01:33:08.600] That's not why they spent billions of dollars building the things.
[01:33:08.600 --> 01:33:10.440] But if you're a physicist, you go, this is brilliant.
[01:33:10.440 --> 01:33:11.480] I've got a load of qubits.
[01:33:12.200 --> 01:33:14.840] And there was a paper recently where there's this filament.
[01:33:14.840 --> 01:33:15.640] I know you saw it.
[01:33:17.480 --> 01:33:23.240] So it's in a particular configuration of the qubits, a particular entanglement structure of the network.
[01:33:23.480 --> 01:33:28.760] You get something that you could interpret as a filament of space emerging.
[01:33:29.080 --> 01:33:29.640] Oh, my God.
[01:33:29.800 --> 01:33:32.760] It's sometimes described as a one-dimensional wormhole.
[01:33:32.760 --> 01:33:33.160] Right.
[01:33:34.040 --> 01:33:35.800] But that's a remarkable paper.
[01:33:36.680 --> 01:33:37.880] It's a published paper.
[01:33:37.880 --> 01:33:38.680] You can look at it.
[01:33:38.680 --> 01:33:42.280] There's some controversy about if that's the right interpretation of it.
[01:33:42.280 --> 01:33:48.520] But it's fascinating that quantum gravity is becoming an experimental science potential.
[01:33:48.600 --> 01:33:50.200] We seem to be on the verge of that.
[01:33:50.200 --> 01:33:51.400] For years now, too.
[01:33:51.960 --> 01:33:56.520] I mean, another question that a lot of people say, oh, it's nonsense, extra dimensions, right?
[01:33:56.520 --> 01:34:09.080] There's a, I don't know if they're still doing it, but for a while there was a group, the Adelberger group in Seattle, which was, so the idea is that if you have access to extra dimensions at very small scales, you'll see deviations from the inverse square law of gravity, right?
[01:34:09.080 --> 01:34:11.960] Because the gravitational flux can spread into the extra dimensions.
[01:34:11.960 --> 01:34:19.080] So what they were doing is they were moving things together in very tiny distance scales and checking for deviations from one over r squared.
[01:34:19.160 --> 01:34:22.360] There's one extra dimension, then it's one over r cubed, et cetera, et cetera.
[01:34:24.360 --> 01:34:25.880] If they had, you would know about it.
[01:34:26.600 --> 01:34:27.880] I haven't talked about it.
[01:34:28.520 --> 01:34:30.600] I've rushed this paper from that I heard of that.
[01:34:30.600 --> 01:34:33.240] I said I haven't read in detail yet, so I'll read it tonight.
[01:34:33.480 --> 01:34:42.800] But it does suggest that there may be a large-ish dimension at the micron scale, which so it'd be interesting to see what the experimental because they can put downs on it.
[01:34:42.800 --> 01:34:45.200] It's like the number and size of the extra dimensions.
[01:34:45.680 --> 01:34:50.720] Does that relate to the as an explanation for the weakness of gravity compared to the other fundamental forces?
[01:34:50.720 --> 01:34:53.840] Is that like kind of leaking into these other potential dimensions?
[01:34:53.840 --> 01:34:55.520] Is that where you were going with that?
[01:34:56.000 --> 01:34:56.880] No, not quite.
[01:34:56.880 --> 01:34:59.440] I mean, it's not totally unrelated, but it wouldn't be the same thing.
[01:34:59.760 --> 01:35:01.840] Historically, that was a thought, wasn't it?
[01:35:01.840 --> 01:35:02.080] Yeah, yeah.
[01:35:02.960 --> 01:35:08.080] So I'm not sure you know more than me in the modern string theory, whether that's guys.
[01:35:08.400 --> 01:35:12.800] Why do physics change when you go smaller, but they don't change when you go bigger?
[01:35:13.120 --> 01:35:13.840] Why do you?
[01:35:13.840 --> 01:35:14.480] You know what I mean?
[01:35:14.480 --> 01:35:21.200] Like, you know, because we know that when you go, when you get into a certain quantum regime, right?
[01:35:21.200 --> 01:35:23.520] That you're asking why isn't there a new regime of validity?
[01:35:24.320 --> 01:35:29.520] If you just opened up, you made the scale tremendous, would concepts of physics change?
[01:35:29.760 --> 01:35:37.600] Well, don't some people think that's exactly what happens, and that's why dark matter is actually gravity behaving differently in the supervisor.
[01:35:37.840 --> 01:35:40.560] Okay, so we have theories about that.
[01:35:41.040 --> 01:35:42.160] That's theoretically possible.
[01:35:42.160 --> 01:35:43.360] I think that's a minority viewpoint.
[01:35:43.520 --> 01:35:44.640] It's a minority video.
[01:35:46.400 --> 01:35:49.280] You think dark matter is the answer to the observations?
[01:35:49.920 --> 01:35:50.480] Well, I don't.
[01:35:50.480 --> 01:35:52.800] No, I mean, we've operated under the assumption.
[01:35:53.040 --> 01:36:04.560] The thing is, the assumption that it's a weakly interacting particle of some description, that fits quite a lot of things, including in particular the cosmic microwave background.
[01:36:04.880 --> 01:36:14.640] It's an important component of the way that these sound waves move through the plasma in the early universe before 380,000 years after the Big Bang.
[01:36:14.960 --> 01:36:17.760] And that so we have very good data there.
[01:36:17.760 --> 01:36:23.520] And essentially, what you're seeing, if you look at those pictures of CMB, you're seeing sound waves going through the plasma.
[01:36:24.000 --> 01:36:35.800] And the presence of some kind of weakly interacting particle in there that does that not electromagnetically interacting is a component of those fits that fit very well.
[01:36:36.120 --> 01:36:39.400] And that fits also gravity rotation curves and all the things as well.
[01:36:39.800 --> 01:36:43.080] So it's a good model.
[01:36:43.080 --> 01:36:47.640] But it's not to say it's right because we finally discovered what it is.
[01:36:47.960 --> 01:36:58.040] But it does fit multiple different independent phenomena that we see, not only the gravitational phenomena, but also the CMB.
[01:36:58.040 --> 01:37:06.760] And it tends to be the case, you said when you modify general relativity, for example, it tends to be the case you can modify it and fit something, but you mess up a lot of other things.
[01:37:07.160 --> 01:37:08.360] It's quite difficult, isn't it?
[01:37:08.600 --> 01:37:09.480] Or near impossible.
[01:37:09.800 --> 01:37:10.200] That's right.
[01:37:10.200 --> 01:37:16.280] And I think if you polled most working physicists now, they would, you know, for a while there was this WIMP versus macho around dark matter.
[01:37:16.600 --> 01:37:21.080] I think it seems fairly consistent that most people would say WIMPs at this point.
[01:37:21.080 --> 01:37:23.640] But you could find some WIMPs and people who want to.
[01:37:24.040 --> 01:37:26.040] Well, I mean, until we know we know, right?
[01:37:26.040 --> 01:37:26.200] Right.
[01:37:27.080 --> 01:37:27.880] Anything is possible.
[01:37:28.200 --> 01:37:36.120] But I find it this idea, clearly dark energy is even more perplexing, to say the least.
[01:37:36.680 --> 01:37:37.000] Certainly.
[01:37:37.640 --> 01:37:54.600] This idea that I find it fascinating that there may be a link between that and, of course, inflation, which looks similar, and the fact that our universe is not is on the edge of stability, because which we know from measurements of the Higgs mass and the top quart mass and so on.
[01:37:54.600 --> 01:38:01.080] So I think there's quite a few of my colleagues you speak to think that maybe these things are related in some way.
[01:38:01.080 --> 01:38:09.240] I mean, inflation looks, it's a very much at the different energy scale, but then you've got the inflation, you've got the Higgs, which is not com contributing to anything, it seems.
[01:38:09.240 --> 01:38:16.560] This scalar field that doesn't blow the universe apart, and then you've got dark energy that's a fine sound.
[01:38:14.520 --> 01:38:19.520] Yeah, so that's the cussing edge at the moment.
[01:38:19.600 --> 01:38:22.640] I think it's one of the most interesting fields in theoretical physics.
[01:38:22.640 --> 01:38:24.640] Trying to understand if those things are the same.
[01:38:24.640 --> 01:38:26.000] Maybe they're not all different.
[01:38:26.880 --> 01:38:35.680] Does it feel to you that we're still missing something absolutely fundamental about the universe that is making it impossible for us to really understand what's going on?
[01:38:35.840 --> 01:38:41.040] I mean, the example, maybe you could talk as well, is holography, I think.
[01:38:41.040 --> 01:38:45.360] You know, ADSCFT, which is an example of a holographic theory.
[01:38:45.360 --> 01:38:49.280] That, I think, is really radical.
[01:38:49.840 --> 01:38:54.960] It's this idea that we can find dual descriptions of reality.
[01:38:55.360 --> 01:38:58.320] So that toolbox surely is going to be.
[01:38:58.800 --> 01:39:07.120] Yeah, the rough idea is that you can use a 10-dimensional string theory to describe essentially a four-dimensional field theory.
[01:39:07.120 --> 01:39:11.040] And we're in the regime where one problem gets hard, the other gets easy.
[01:39:11.040 --> 01:39:18.560] So you can do kind of geometrical calculations in the gravity regime to give you field theory data in the particle thing.
[01:39:18.560 --> 01:39:27.280] And in very special, supersymmetric cases, there's an exact dictionary between, okay, the mass of this baryon is the dimension of this operator.
[01:39:27.280 --> 01:39:35.200] And you can get these really non-trivial matchings, you know, these crazy, like, irrational numbers that, you know, pop up very nicely in both.
[01:39:35.200 --> 01:39:36.720] And it's a wild thing.
[01:39:38.000 --> 01:39:45.840] There's no proof of it per se, but there's so much data to indicate that it's correct that I think it has to be true.
[01:39:45.840 --> 01:39:50.560] Now, does that help us describe our universe is another question entirely.
[01:39:51.040 --> 01:39:52.160] One last question, guys.
[01:39:52.160 --> 01:39:52.640] Who gets it?
[01:39:52.800 --> 01:39:55.440] Will AI help this field in any way?
[01:39:55.560 --> 01:39:58.920] It's a really, it's a question that gets asked a lot, isn't it?
[01:39:58.640 --> 01:40:03.720] And it does, you know, in data analysis, then you'd have to say yes, right?
[01:40:04.200 --> 01:40:06.120] Large data sets and so on.
[01:40:06.680 --> 01:40:14.840] Whether or not creating new physical things, like asking Chat GPT to build a quantum theory of gravity, right?
[01:40:15.400 --> 01:40:16.680] That's a different thing.
[01:40:17.240 --> 01:40:19.320] I don't know what the answer is to that.
[01:40:19.560 --> 01:40:20.440] Have you tried plugging that in?
[01:40:21.560 --> 01:40:22.200] You can try.
[01:40:22.440 --> 01:40:23.160] It won't do it.
[01:40:24.200 --> 01:40:36.680] Big thing, so I left physics about 10 years ago, and one of the big things that I see different that a lot of my colleagues are doing is a lot of them are doing machine, applying machine learning to complicated systems to see what they can do.
[01:40:36.680 --> 01:40:41.080] So it's not AI in a solve the problem, it's going to accelerate the research.
[01:40:41.080 --> 01:40:41.400] That's right.
[01:40:41.800 --> 01:40:45.800] It's kind of like what I would say a good analogy is what AI did with protein pulp.
[01:40:45.800 --> 01:40:46.040] Yeah.
[01:40:46.040 --> 01:40:46.440] Right, okay.
[01:40:46.440 --> 01:40:47.640] So it could speed things up a lot.
[01:40:47.720 --> 01:40:48.840] It could speed things up in that way.
[01:40:49.480 --> 01:40:53.080] So Brian, tell us about your current project, the Horizons show that you're doing.
[01:40:53.080 --> 01:40:55.800] Yeah, I've been doing these, as you said, show.
[01:40:56.120 --> 01:41:02.920] My friend Robin Ins, comedian that I work with on the VBC, said, you know, you should call it a lecture.
[01:41:02.920 --> 01:41:04.840] And they say, not at those ticket prices, though.
[01:41:04.920 --> 01:41:05.880] You can't call it a lecture.
[01:41:07.000 --> 01:41:07.880] So it is true.
[01:41:08.200 --> 01:41:13.000] So I ended up developing this live show, which is big LED screens, basically.
[01:41:13.560 --> 01:41:16.680] And then many of the concepts we've just discussed, actually.
[01:41:16.920 --> 01:41:18.680] So it has become a show.
[01:41:19.000 --> 01:41:21.720] And it was built for arenas in the UK.
[01:41:21.720 --> 01:41:26.280] So we've done 14,000, 15,000 people in the Ota Arena and Wembley and things like that.
[01:41:27.160 --> 01:41:28.680] Stadium cosmology.
[01:41:30.280 --> 01:41:33.640] And so I've enjoyed doing it a lot.
[01:41:34.200 --> 01:41:35.200] We've done it.
[01:41:35.200 --> 01:41:39.160] Over 400,000 people have seen the show across the world in the last year.
[01:41:39.240 --> 01:41:40.680] That does make me feel good about humanity.
[01:41:40.680 --> 01:41:43.640] They won't get that many people to sit for a silent show.
[01:41:44.040 --> 01:41:47.440] 15,000 people listening to you describe a Penrose diagram.
[01:41:44.840 --> 01:41:48.400] It's kind of strange.
[01:41:50.080 --> 01:41:58.640] So just to finish it off, because we came to the US actually very early on in the development of it, and it was just after COVID, and we did some small places.
[01:41:58.640 --> 01:42:00.080] And I just wanted to finish it off.
[01:42:00.240 --> 01:42:09.360] So we're bringing it back at the end of April, start of May to a few cities: LA, San Francisco, New York, Chicago, Seattle, Portland.
[01:42:11.120 --> 01:42:17.600] And it's just me really saying I want to say goodbye to this, this particular show.
[01:42:17.600 --> 01:42:23.760] And I'd like to do it here because we started here, in a sense, a long time ago, and it's changed a lot.
[01:42:23.760 --> 01:42:25.600] Where can people find dates and get tickets?
[01:42:25.840 --> 01:42:29.120] There's a website called Briancoxlive.co.uk.
[01:42:29.120 --> 01:42:32.480] It's.co.uk because I think brian.coxlive.com.
[01:42:32.480 --> 01:42:33.760] I don't know, probably we couldn't get it.
[01:42:34.960 --> 01:42:35.840] That doesn't matter, does it?
[01:42:35.840 --> 01:42:38.160] It says briancoxlive.co.uk.
[01:42:38.160 --> 01:42:39.600] And the tickets are there.
[01:42:39.600 --> 01:42:52.960] And, you know, we might try and extend it a bit and come to, you know, I've never done a show in Vegas, for example, so I think we could just ask David Copperfield to move out for a night or something and put it in there.
[01:42:53.200 --> 01:42:53.840] But yeah.
[01:42:54.240 --> 01:42:55.360] Well, Brian, this has been awesome.
[01:42:55.360 --> 01:42:56.720] Thank you so much for sitting down with us.
[01:42:56.880 --> 01:42:57.200] Thank you.
[01:42:57.360 --> 01:42:57.920] Fantastic.
[01:42:57.920 --> 01:42:58.320] Thank you.
[01:42:58.480 --> 01:42:58.720] Thanks.
[01:42:58.880 --> 01:42:59.760] All right.
[01:43:02.640 --> 01:43:07.840] It's time for science or fiction.
[01:43:12.640 --> 01:43:26.560] Each week I come up with three science news items or facts, two real and one fake, and I challenge my panel of skeptics to tell me which one is the fake, because panel is the plural of skeptics or the collective.
[01:43:26.560 --> 01:43:27.440] All right, you guys ready?
[01:43:27.440 --> 01:43:28.640] Three regular news items.
[01:43:28.640 --> 01:43:29.040] Yep.
[01:43:29.040 --> 01:43:29.760] Here we go.
[01:43:29.960 --> 01:43:45.560] Item number one: new research finds that higher penetration of weather-dependent renewable energy sources, wind and solar, on the grid does not increase vulnerability to blackouts and reduces their severity when they occur.
[01:43:45.560 --> 01:43:56.040] Item number two: a recent study finds that coyotes are thriving in North America, and in fact, direct hunting by humans results in larger populations.
[01:43:56.040 --> 01:44:05.960] And item number three: a population-based cohort study of preterm infants finds no significant economic or educational effects lasting into adulthood.
[01:44:05.960 --> 01:44:07.400] Bob, go first.
[01:44:07.400 --> 01:44:13.400] Okay, so wind and solar do not increase the vulnerability of blackouts.
[01:44:13.720 --> 01:44:15.160] Yeah, that kind of makes sense.
[01:44:15.560 --> 01:44:19.400] I think I'm just going to buy that, although I haven't read anything specific about that.
[01:44:20.040 --> 01:44:27.320] Coyotes are thriving in North America, and direct hunting by humans results in larger populations.
[01:44:27.640 --> 01:44:29.080] I don't know about that last bit.
[01:44:29.640 --> 01:44:33.080] Let's see, I did not even absorb this third one at all.
[01:44:33.080 --> 01:44:41.640] Population-based cohort study of preterm infants finds no significant economic or educational effects lasting into adulthood.
[01:44:41.640 --> 01:44:45.960] So basically, being preterm has no bad side effects.
[01:44:45.960 --> 01:44:47.400] Lasting into adulthood.
[01:44:47.400 --> 01:44:52.680] All right, tell me everything that you can say about these three for the first person going.
[01:44:53.000 --> 01:44:55.080] They're all pretty self-explanatory.
[01:44:55.080 --> 01:44:56.760] All right, I'm going to say the wolves.
[01:44:56.840 --> 01:44:57.800] The coyotes, you mean?
[01:44:58.040 --> 01:44:58.760] Fiction.
[01:44:58.760 --> 01:44:59.480] Wolves, coyotes.
[01:44:59.560 --> 01:45:00.520] Wait, you said coyotes?
[01:45:00.520 --> 01:45:01.680] It's coyotes and wolves.
[01:45:02.360 --> 01:45:03.000] Okay.
[01:45:03.320 --> 01:45:04.440] I'll say that's fiction.
[01:45:04.440 --> 01:45:05.160] All right, Jay.
[01:45:05.320 --> 01:45:06.280] Because I feel like it.
[01:45:06.280 --> 01:45:11.960] This first one about the higher penetration of weather-dependent renewable energy sources.
[01:45:11.960 --> 01:45:16.160] It doesn't increase the vulnerability to blackouts and reduces their severity when they occur.
[01:45:14.920 --> 01:45:20.320] I mean, yeah, I mean, I don't think it would increase vulnerability.
[01:45:20.640 --> 01:45:26.480] I think it, but these are, you know, these are things that definitely need electricity and have a lot of wiring and everything.
[01:45:26.480 --> 01:45:33.600] Yeah, so this is a really interesting thing here, Steve, because you're saying increase, they do not increase vulnerability to blackouts.
[01:45:33.600 --> 01:45:36.880] So there aren't more blackouts because of them.
[01:45:36.880 --> 01:45:37.360] Right.
[01:45:37.360 --> 01:45:40.480] And when blackouts do occur, they're less severe.
[01:45:40.480 --> 01:45:41.120] Yeah, all right.
[01:45:41.120 --> 01:45:42.080] I think that's science.
[01:45:42.080 --> 01:45:43.040] That just makes sense.
[01:45:43.040 --> 01:45:52.400] It took me a while to parse through it because at first, for some reason, I thought you meant during like an EMP or some solar flare or something, but that was just mistaken.
[01:45:52.400 --> 01:45:54.960] Okay, so that one, to me, 100% science.
[01:45:54.960 --> 01:46:02.480] The second one, a recent study, finds out that coyotes are thriving in North America, and in fact, direct hunting by humans results in larger populations.
[01:46:02.480 --> 01:46:06.880] So why would direct hunting by humans result in larger populations?
[01:46:06.880 --> 01:46:08.960] Maybe because they're killing off the weak ones?
[01:46:10.080 --> 01:46:11.680] Is that possibly be it?
[01:46:11.680 --> 01:46:12.880] That one's a maybe.
[01:46:12.880 --> 01:46:14.960] That doesn't seem to track with me.
[01:46:14.960 --> 01:46:23.440] The third one, a population-based cohort study of preterm infants finds no significant economic or educational effects lasting into adulthood.
[01:46:23.440 --> 01:46:24.720] Preterm infants.
[01:46:24.720 --> 01:46:28.080] Okay, so does it matter how early they are?
[01:46:28.480 --> 01:46:30.240] Yeah, there's a cutoff.
[01:46:30.240 --> 01:46:31.280] Okay, so preterm.
[01:46:31.600 --> 01:46:33.440] To qualify as preterm.
[01:46:33.440 --> 01:46:34.880] Can you tell me what the cutoff is?
[01:46:34.880 --> 01:46:38.800] So if you're born before 37 weeks, you are considered preterm.
[01:46:38.800 --> 01:46:40.960] Okay, so yeah, that's early.
[01:46:40.960 --> 01:46:42.640] I wish I knew that.
[01:46:42.640 --> 01:46:44.240] Would that have changed your answer?
[01:46:44.240 --> 01:46:45.840] Nah, just saying it.
[01:46:46.480 --> 01:46:48.000] All right, so let's think about this.
[01:46:48.000 --> 01:46:55.680] So, if a baby comes out early, I would imagine that there's reasons why that happens, but it doesn't necessarily mean that there's something wrong with the baby.
[01:46:55.680 --> 01:47:02.120] So, then it comes down to, you know, I mean, guys, what I'm about to say, I am not an expert, right?
[01:46:59.840 --> 01:47:07.000] I'm thinking off the cuff here, the baby's kind of nutrition changes.
[01:47:07.320 --> 01:47:11.080] I don't think nutrition is a maximal problem here.
[01:47:11.080 --> 01:47:16.840] I mean, there might be other things that the baby would be getting from the mother that could affect its development.
[01:47:16.840 --> 01:47:21.080] And, you know, babies can't breathe until they hit a certain age range.
[01:47:21.080 --> 01:47:23.320] So the baby would, that could be a factor.
[01:47:23.720 --> 01:47:25.080] This is a messy one.
[01:47:25.080 --> 01:47:33.560] I think that today, with proper care, babies are okay if they're born in Steve's preterm timeframe.
[01:47:33.560 --> 01:47:39.320] But this whole thing about like, you know, hunting the coyotes, I just, something rubbs me the wrong way.
[01:47:39.400 --> 01:47:40.680] That one's a fiction.
[01:47:40.680 --> 01:47:41.720] Okay, Evan.
[01:47:41.720 --> 01:47:47.080] Well, all right, the grid.
[01:47:47.080 --> 01:47:51.160] And it not increasing vulnerability to blackouts.
[01:47:51.160 --> 01:47:52.040] I believe that.
[01:47:52.040 --> 01:47:59.960] But this part about reducing their severity when they occur, I'm having a hard time understanding why that's the case.
[01:47:59.960 --> 01:48:03.560] Why would it reduce the severity of the blackout?
[01:48:03.560 --> 01:48:05.320] Yeah, hmm.
[01:48:06.280 --> 01:48:09.960] I'm going to be interested to hear that one if that one turns out to be science.
[01:48:09.960 --> 01:48:12.600] I have a feeling it's kind of fiction-y.
[01:48:12.600 --> 01:48:16.040] Coyotes, I have no idea, thriving in North America.
[01:48:16.040 --> 01:48:16.920] I don't know.
[01:48:16.920 --> 01:48:21.720] And in fact, direct hunting by humans results in larger populations.
[01:48:21.720 --> 01:48:23.000] Why would that be?
[01:48:23.000 --> 01:48:25.720] Because they cluster more?
[01:48:25.720 --> 01:48:28.520] And then is that how that will work?
[01:48:28.600 --> 01:48:29.400] Would work?
[01:48:29.400 --> 01:48:31.160] Because it creates pockets?
[01:48:31.160 --> 01:48:31.720] I don't know.
[01:48:31.720 --> 01:48:33.160] I don't know about that one either.
[01:48:33.320 --> 01:48:34.840] The last one, oh boy.
[01:48:34.840 --> 01:48:38.040] Preterm infants, no significant economic.
[01:48:38.520 --> 01:48:40.760] That one's like the one I know the least about.
[01:48:41.560 --> 01:48:42.760] They're all fiction.
[01:48:42.760 --> 01:48:43.480] Thank you.
[01:48:43.480 --> 01:48:43.880] Okay.
[01:48:44.200 --> 01:48:45.600] And Kara says.
[01:48:45.920 --> 01:48:46.320] Darn it.
[01:48:46.560 --> 01:48:47.600] Automatic failure, but okay.
[01:48:47.600 --> 01:48:49.120] No, that's not an automatic failure.
[01:48:49.120 --> 01:48:50.240] That's an automatic win.
[01:48:50.240 --> 01:48:51.120] It's a forfeit.
[01:48:51.120 --> 01:48:54.400] I'm going to use my get out of jail free card on this episode.
[01:48:54.400 --> 01:48:55.920] Don't we all get one for the year?
[01:48:55.920 --> 01:48:56.320] No.
[01:48:57.280 --> 01:48:57.600] All right.
[01:48:57.600 --> 01:48:59.600] How about the preterm infants one?
[01:49:00.320 --> 01:49:07.360] I can't put my finger on it, but I think there's something, there's maybe an educational effect that lasts into adulthood that they've found.
[01:49:07.360 --> 01:49:09.040] So I'll say that one's the fiction.
[01:49:09.040 --> 01:49:10.080] Okay, Kara.
[01:49:10.080 --> 01:49:12.800] I think coyotes is science.
[01:49:12.800 --> 01:49:13.920] They are thriving.
[01:49:13.920 --> 01:49:16.800] They are everywhere, at least in LA.
[01:49:16.800 --> 01:49:27.520] I think that there are many examples of what's called conservation hunting, where culls are done intentionally to maintain or even grow populations.
[01:49:27.520 --> 01:49:34.240] I think it's about distribution more than anything because animals are not, you know, spread out evenly.
[01:49:34.240 --> 01:49:46.720] I think that the weather-dependent, so like wind and solar energy being in higher numbers, I don't think it would increase vulnerability to blackouts.
[01:49:46.720 --> 01:49:56.960] I know that the problem was with the severity, reducing the severity when they occur, but maybe they're just faster to get online, you know, or maybe they're cheaper and easier to fix, like who knows.
[01:49:56.960 --> 01:50:03.440] So yeah, the one that's really, really bothering me is this idea, because you said a study of preterm infants.
[01:50:03.440 --> 01:50:11.280] So just because 37 weeks is the cutoff doesn't mean that we're not also talking about babies that are born at 25, 26 weeks.
[01:50:11.280 --> 01:50:14.960] Babies that are born early do not have fully developed organs.
[01:50:14.960 --> 01:50:16.000] They're sick.
[01:50:16.000 --> 01:50:17.360] They need surgery.
[01:50:17.360 --> 01:50:19.280] They undergo a lot of treatments.
[01:50:19.280 --> 01:50:24.160] There is no way that that doesn't affect them economically well into adulthood.
[01:50:24.160 --> 01:50:26.720] This one just seems impossible to be science.
[01:50:26.720 --> 01:50:28.720] So I have to say that that's fiction.
[01:50:28.720 --> 01:50:29.120] All right.
[01:50:29.120 --> 01:50:30.840] So you all agree with the first one.
[01:50:29.920 --> 01:50:31.720] So we'll start there.
[01:50:31.880 --> 01:50:42.840] New research finds that higher penetration of weather-dependent renewable energy sources, wind and solar, on the grid does not increase vulnerability to blackouts and reduces their severity when they occur.
[01:50:42.840 --> 01:50:45.000] You all think this one is science.
[01:50:45.000 --> 01:50:48.120] And this one is science.
[01:50:48.120 --> 01:50:48.840] This is science.
[01:50:49.400 --> 01:50:50.120] That's great.
[01:50:50.120 --> 01:51:05.480] Now, this was part of the reason for this study was, I don't know if you guys remember the whole Texas blackout thing where they were blaming the renewable resources on the grid when it was in fact the coal-fired plants or gas plants that were going down.
[01:51:05.720 --> 01:51:17.800] So what they found was that, yeah, that having renewables, weather-dependent renewables on the grid does not make it more vulnerable, even to weather-based events, right?
[01:51:17.800 --> 01:51:26.520] So even if there's a storm or whatever, it doesn't make it more likely for these sources of energy to go down than traditional sources.
[01:51:26.760 --> 01:51:38.040] And when a blackout does occur, the amount of people who lose power is less because this is a more distributed power source and they do come back more quickly.
[01:51:38.040 --> 01:51:43.080] So yeah, it actually lends resiliency, if anything, to the grid.
[01:51:43.640 --> 01:51:44.440] Yeah, it's great.
[01:51:44.680 --> 01:51:46.280] I guess we'll just keep going in order.
[01:51:46.280 --> 01:51:52.040] A recent study finds that coyotes are thriving in North America, and in fact, the direct hunting by humans results in large populations.
[01:51:52.040 --> 01:51:54.760] Bob and Jay, you think this one is the fiction.
[01:51:54.760 --> 01:51:57.160] Evan and Carrie, you think this one is science.
[01:51:57.160 --> 01:52:04.840] And I guess the question comes down to how could directly hunting coyotes increase their population?
[01:52:05.080 --> 01:52:07.720] This wasn't a call to increase their population.
[01:52:07.720 --> 01:52:11.640] This is like when you try to reduce the population by hunting them.
[01:52:11.640 --> 01:52:13.400] And then backfired.
[01:52:13.400 --> 01:52:14.560] And it backfired, yeah.
[01:52:14.560 --> 01:52:15.840] So this is science.
[01:52:15.840 --> 01:52:16.640] This is science.
[01:52:14.280 --> 01:52:18.320] Yes.
[01:52:18.960 --> 01:52:22.080] This one is science because, yeah, this is all true.
[01:52:22.480 --> 01:52:34.400] What they think is happening is that even in locations where they have very liberal coyote hunting laws, meaning like it's open season, like there's no restricted season, there's no limit on how many you can kill.
[01:52:34.400 --> 01:52:46.640] And that when those populations actually, over time, increase, and the reason why they think this is happening is that the hunters are disproportionately killing older coyotes.
[01:52:46.640 --> 01:52:50.640] And then the younger coyotes have more resources and they have more pups.
[01:52:50.640 --> 01:52:52.560] And so they're just breeding more.
[01:52:52.560 --> 01:53:00.000] And so essentially their conclusion was that hunting is not an effective population control mechanism for coyotes.
[01:53:00.000 --> 01:53:02.560] They just bounce right back in even bigger numbers.
[01:53:02.560 --> 01:53:06.400] So we're disproportionately hunting older coyotes because they're slower and easier to kill.
[01:53:06.640 --> 01:53:08.320] I guess they're easier to kill, yeah.
[01:53:09.120 --> 01:53:10.240] Must be the case.
[01:53:10.240 --> 01:53:11.120] Whoops.
[01:53:11.600 --> 01:53:13.360] This study also looked at a lot of other things.
[01:53:13.360 --> 01:53:33.760] So coyotes also do not suffer when their region overlaps with wolves, except for certain, that's very, that's regionally dependent, but in many locations, basically depends on food supply, they do fine even when they're competing with wolves, but they don't do fine if they're competing with bears.
[01:53:34.160 --> 01:53:40.640] Yeah, it's because coyotes and bears, I think, they both hunt, but they also scavenge.
[01:53:40.640 --> 01:53:42.560] Yeah, so that's that's a good point.
[01:53:42.560 --> 01:54:01.160] So there's actually a mixed effect when coyotes overlap with other predators, there's a decrease in prey availability, but especially the larger predators tend to leave behind carcasses that the coyotes can then scavenge.
[01:53:59.840 --> 01:54:03.560] So they actually increase their food supply sometimes.
[01:54:03.640 --> 01:54:11.400] So depending on a lot of variables, sometimes it actually increases the coyote population because now they have more scavenging available.
[01:54:11.720 --> 01:54:19.240] And bears and coyotes also scavenge their garbage dump scavengers.
[01:54:19.640 --> 01:54:22.600] So they come into urban areas and take food.
[01:54:22.600 --> 01:54:25.640] Yeah, but coyotes are a mesopredator.
[01:54:25.640 --> 01:54:26.920] Have you ever heard that term?
[01:54:26.920 --> 01:54:27.640] They're in the middle.
[01:54:27.640 --> 01:54:28.200] They're not small.
[01:54:28.200 --> 01:54:28.760] They're not large.
[01:54:28.760 --> 01:54:29.480] They're meso.
[01:54:29.720 --> 01:54:30.920] And they're just thriving.
[01:54:30.920 --> 01:54:33.640] They're just adapting really well to human civilization.
[01:54:33.640 --> 01:54:35.000] Doing just fine.
[01:54:35.000 --> 01:54:35.400] All right.
[01:54:35.400 --> 01:54:35.720] All right.
[01:54:35.720 --> 01:54:48.040] That means that a population-based cohort study of preterm infants finds no significant economic or educational effects lasting into adulthood is the fiction because they found significant economic and developmental effects.
[01:54:48.200 --> 01:54:49.320] And educational effects, yes.
[01:54:49.320 --> 01:54:55.160] So they have lower rates of enrolling in a university, graduating from a university with a degree.
[01:54:55.160 --> 01:54:59.560] They have lower income per year, 17% lower.
[01:54:59.880 --> 01:55:02.280] So yeah, there are lasting effects.
[01:55:02.280 --> 01:55:04.360] It's hard to tease out exactly what that might be.
[01:55:04.360 --> 01:55:09.560] Is this due to just they're not as healthy or maybe the neurological development is delayed?
[01:55:09.560 --> 01:55:17.560] And the reason why this is an important research question is because clearly they are starting at a loss, you know, farther back.
[01:55:18.120 --> 01:55:19.960] They have to make up ground.
[01:55:21.000 --> 01:55:27.240] The question is, do they eventually catch up or do their deficits persist?
[01:55:27.240 --> 01:55:30.920] So this was, they were looking at 18 to 28 year olds.
[01:55:30.920 --> 01:55:35.400] They say they had followed them for a long time, like two couple decades.
[01:55:35.720 --> 01:55:37.640] So this is a long, long follow-up.
[01:55:37.680 --> 01:55:38.240] How long?
[01:55:38.280 --> 01:55:39.560] How did you define preterm?
[01:55:39.560 --> 01:55:39.960] How many?
[01:55:40.120 --> 01:55:41.640] 37 before 37.
[01:55:41.640 --> 01:55:42.200] Or less.
[01:55:42.200 --> 01:55:42.600] Or less.
[01:55:42.600 --> 01:55:44.440] So some of them were a lot less, yeah.
[01:55:44.480 --> 01:55:45.280] 37?
[01:55:44.440 --> 01:55:46.000] 37 weeks.
[01:55:46.240 --> 01:55:46.640] Yeah.
[01:55:46.880 --> 01:55:48.400] And that what's what's typical?
[01:55:48.400 --> 01:55:49.440] 40 what?
[01:55:49.920 --> 01:55:50.560] I was.
[01:55:44.840 --> 01:55:52.160] So, I mean, I was four weeks.
[01:55:52.480 --> 01:55:54.000] I was a whole month premature.
[01:55:54.000 --> 01:55:54.400] Yeah.
[01:55:54.400 --> 01:55:56.960] But it obviously gets worse the more premature you are.
[01:55:56.960 --> 01:56:04.400] So they looked at you know, like 34 to 36 weeks, 32 to 33, 28 to 21, 24 to 27.
[01:56:04.720 --> 01:56:08.800] And the effects got worse, you know, the more preterm you were.
[01:56:08.800 --> 01:56:12.400] Yeah, so it wasn't much for like 20 years, 36.
[01:56:12.400 --> 01:56:12.880] You know, that.
[01:56:13.120 --> 01:56:19.600] So 40 is the average, but a full-term baby is 37 or 37 plus, up to 42.
[01:56:19.600 --> 01:56:20.080] Yeah.
[01:56:20.400 --> 01:56:30.640] So, but, you know, but it is still true that with modern medical management, they do fine, but they are starting, you know, at a deficit.
[01:56:30.640 --> 01:56:34.080] And they don't quite make that up, at least not in their 20s.
[01:56:34.080 --> 01:56:39.040] That sounds preterm sounds like way too broad of a term, at least for this specific science or fiction.
[01:56:39.680 --> 01:56:41.840] But it was true, no matter how you slice it.
[01:56:41.840 --> 01:56:45.200] I was biased because I'm preterm and I'm awesome.
[01:56:46.320 --> 01:56:49.040] But think of how much more awesome you would be about because it's a little bit more.
[01:56:49.200 --> 01:56:50.080] Oh my gosh.
[01:56:52.880 --> 01:56:54.160] That's so mean.
[01:56:54.160 --> 01:56:55.520] Right, fighting for resources.
[01:56:55.760 --> 01:56:57.040] You don't want to think about all that stuff.
[01:56:57.040 --> 01:56:57.680] You were a twin.
[01:56:57.680 --> 01:57:02.240] Twins also are at a deficit because your twin was using up some nutritional resources.
[01:57:02.800 --> 01:57:05.120] So you overcame multiple options.
[01:57:05.120 --> 01:57:08.080] And our mother smoked when she was pregnant with all of us.
[01:57:08.080 --> 01:57:08.800] Damn.
[01:57:09.040 --> 01:57:10.240] And that takes a hit as well.
[01:57:10.240 --> 01:57:11.360] And we were exposed when we were kids.
[01:57:11.920 --> 01:57:12.800] Literally takes a hit.
[01:57:12.800 --> 01:57:14.720] So we had a sewer.
[01:57:14.720 --> 01:57:15.200] Yeah.
[01:57:15.200 --> 01:57:16.160] She wasn't a drinker.
[01:57:16.880 --> 01:57:18.160] My parents weren't drinkers.
[01:57:18.560 --> 01:57:21.120] They were not drinkers, but they, yeah, they were smoking.
[01:57:21.200 --> 01:57:21.440] Terrible.
[01:57:21.520 --> 01:57:22.320] They did all that meth.
[01:57:22.560 --> 01:57:23.360] Hated it.
[01:57:23.840 --> 01:57:24.640] Hated it.
[01:57:24.640 --> 01:57:27.600] The worst thing about my childhood was having to deal with that.
[01:57:27.600 --> 01:57:28.160] Yeah.
[01:57:28.160 --> 01:57:29.040] The worst.
[01:57:29.040 --> 01:57:29.640] Yeah.
[01:57:29.200 --> 01:57:33.240] I was like this one annoying constant presence in my childhood.
[01:57:29.520 --> 01:57:35.480] Otherwise, we had, I think our childhood was great.
[01:57:36.280 --> 01:57:37.640] Barely was a blip on my radar.
[01:57:37.720 --> 01:57:38.360] Yeah, I hated it.
[01:57:38.360 --> 01:57:39.400] Totally hated it.
[01:57:39.400 --> 01:57:40.840] All right, Evan, give us a quote.
[01:57:40.840 --> 01:57:53.640] When you get in a tight place and everything goes against you till it seems as though you could not hold on a minute longer, never give up then, for that is just the time and the place that the tide will turn.
[01:57:53.640 --> 01:57:57.560] Very comforting words from Harriet Beecher Stowe.
[01:57:57.560 --> 01:57:57.800] Yeah.
[01:57:58.120 --> 01:58:06.680] American abolitionist and author from the 19th century, and so key and critical to helping end slavery in America.
[01:58:06.920 --> 01:58:11.240] Can't be understated how important she was, overstated how important she was.
[01:58:11.240 --> 01:58:13.800] Political change is a never-ending marathon.
[01:58:13.800 --> 01:58:14.280] Right.
[01:58:14.280 --> 01:58:17.640] So keep this in mind, folks, especially these days.
[01:58:17.720 --> 01:58:17.960] Right.
[01:58:17.960 --> 01:58:19.000] Thanks, Evan.
[01:58:19.000 --> 01:58:19.720] Yep.
[01:58:19.720 --> 01:58:21.720] Well, thank you all for joining me this week.
[01:58:21.720 --> 01:58:22.520] You're welcome, Steve.
[01:58:22.520 --> 01:58:23.080] Thanks, Steve.
[01:58:23.240 --> 01:58:24.040] Thanks, Steve.
[01:58:24.040 --> 01:58:27.880] And until next week, this is your Skeptic's Guide to the Universe.
[01:58:30.200 --> 01:58:36.920] Skeptics Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking.
[01:58:36.920 --> 01:58:41.560] For more information, visit us at the skepticsguide.org.
[01:58:41.560 --> 01:58:45.480] Send your questions to info at the skepticsguide.org.
[01:58:45.480 --> 01:58:56.120] And if you would like to support the show and all the work that we do, go to patreon.com/slash skepticsguide and consider becoming a patron and becoming part of the SGU community.
[01:58:56.120 --> 01:58:59.800] Our listeners and supporters are what make SGU possible.