Debug Information
Processing Details
- VTT File: skepticast2024-11-23.vtt
- Processing Time: September 09, 2025 at 08:40 AM
- Total Chunks: 3
- Transcript Length: 167,250 characters
- Caption Count: 1,742 captions
Prompts Used
Prompt 1: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 1 of 3 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
[00:00:00.320 --> 00:00:04.640] The Skeptic's Guide to the Universe is brought to you by Progressive Insurance.
[00:00:04.640 --> 00:00:08.960] Do you ever think about switching insurance companies to see if you could save some cash?
[00:00:08.960 --> 00:00:14.480] Progressive makes it easy to see if you could save when you bundle your home and auto policies.
[00:00:14.480 --> 00:00:17.120] Try it at progressive.com.
[00:00:17.120 --> 00:00:19.920] Progressive casualty insurance company and affiliates.
[00:00:19.920 --> 00:00:21.440] Potential savings will vary.
[00:00:21.440 --> 00:00:23.600] Not available in all states.
[00:00:27.120 --> 00:00:30.320] You're listening to the skeptic's guide to the universe.
[00:00:30.320 --> 00:00:33.200] Your escape to reality.
[00:00:33.840 --> 00:00:36.480] Hello and welcome to the Skeptic's Guide to the Universe.
[00:00:36.480 --> 00:00:41.440] Today is Tuesday, November 19th, 2024, and this is your host, Stephen Novella.
[00:00:41.440 --> 00:00:43.120] Joining me this week are Bob Novella.
[00:00:43.120 --> 00:00:43.760] Hey, everybody.
[00:00:43.760 --> 00:00:45.120] Kara Santa Maria.
[00:00:45.120 --> 00:00:45.600] Howdy.
[00:00:45.600 --> 00:00:46.560] Jay Novella.
[00:00:46.560 --> 00:00:47.360] Hey, guys.
[00:00:47.360 --> 00:00:49.200] And Evan Bernstein.
[00:00:49.200 --> 00:00:50.720] Good evening, everyone.
[00:00:50.720 --> 00:00:51.680] How's everyone doing?
[00:00:51.680 --> 00:00:55.920] We're recording a little bit early because we're getting ready for the Thanksgiving break.
[00:00:56.240 --> 00:00:58.080] I'll be going away this weekend.
[00:00:58.080 --> 00:01:00.160] Yeah, I'm going to see my family in Denver.
[00:01:00.160 --> 00:01:03.520] Yeah, this is the holiday where the novella guys are all splitting up.
[00:01:03.520 --> 00:01:04.480] I'll go into our in-look.
[00:01:04.640 --> 00:01:04.960] Don't worry.
[00:01:04.960 --> 00:01:06.240] I'll hold down the state for you.
[00:01:06.240 --> 00:01:06.960] I'll be remaining.
[00:01:07.200 --> 00:01:07.680] Thank you.
[00:01:07.680 --> 00:01:08.160] Thank you.
[00:01:08.160 --> 00:01:08.960] I'll take care of things.
[00:01:08.960 --> 00:01:09.600] Just leave it to you.
[00:01:11.040 --> 00:01:12.160] I'm going to Oregon.
[00:01:12.160 --> 00:01:12.640] Oregon.
[00:01:12.640 --> 00:01:12.800] Oh.
[00:01:13.040 --> 00:01:13.760] Oregon.
[00:01:14.640 --> 00:01:17.280] I always mispronounce Oregon.
[00:01:17.280 --> 00:01:17.840] It's Oregon.
[00:01:18.720 --> 00:01:20.320] They don't want you to say Oregon.
[00:01:20.560 --> 00:01:21.360] No, it's Oregon.
[00:01:21.840 --> 00:01:22.240] Oregon.
[00:01:23.760 --> 00:01:26.720] And that's the people of the state who get to decide that, right?
[00:01:27.760 --> 00:01:28.880] I suppose that's fair.
[00:01:28.880 --> 00:01:35.280] So I know you guys have heard about the fact that the Onion has acquired InfoWars.
[00:01:35.760 --> 00:01:37.200] I mean, how wonderful is that?
[00:01:37.200 --> 00:01:37.920] Well, there's a wrinkle.
[00:01:37.920 --> 00:01:39.040] Did you guys hear the wrinkle?
[00:01:39.040 --> 00:01:39.680] No, what happened?
[00:01:39.920 --> 00:01:40.400] No.
[00:01:40.400 --> 00:01:42.080] It's being contested.
[00:01:42.240 --> 00:01:42.800] Of course it is.
[00:01:42.960 --> 00:01:43.600] Okay.
[00:01:44.160 --> 00:01:52.160] So, just as a quick review, InfoWars is the media outlet of Alex Jones, who got sued for lots of money.
[00:01:52.560 --> 00:02:02.920] It has to award like $1.5 billion to the families of the Sandy Hook massacre because of his hateful conspiracy theories, right?
[00:01:59.680 --> 00:02:05.240] One little bit of justice in the world.
[00:02:05.800 --> 00:02:11.400] And to pay off the money he owes, they're selling off his assets, including InfoWars.
[00:02:11.400 --> 00:02:21.720] And so that just was up for auction, and it was purchased by The Onion, or the parent company of The Onion, Global Tetrahedron is the name of the parent company.
[00:02:21.720 --> 00:02:27.240] And they plan on launching it as a basically satire of itself.
[00:02:27.240 --> 00:02:28.040] That's amazing.
[00:02:28.040 --> 00:02:29.000] Sometime next year.
[00:02:29.000 --> 00:02:31.240] It would be amazing, totally amazing.
[00:02:31.240 --> 00:02:42.760] But a company affiliated with Alex Jones is claiming that the bidding was not fair, that they did not have a fair opportunity to make a counterbid.
[00:02:42.760 --> 00:02:44.280] So now it has to go before the judge.
[00:02:44.280 --> 00:02:47.160] And hopefully this is just all a waste of time and it'll still happen.
[00:02:47.160 --> 00:02:47.960] A stall tactic.
[00:02:48.680 --> 00:02:53.160] The judge is not going to allow a company affiliated with Alex Jones to buy the company.
[00:02:53.160 --> 00:02:59.480] Well, their argument is that the goal of the bid is to get as much money for the families as possible.
[00:02:59.480 --> 00:03:02.680] And so if they have a higher bid, the judge should honor that.
[00:03:03.160 --> 00:03:05.560] But then are they just going to turn around and give it back to Alex Jones?
[00:03:05.560 --> 00:03:06.520] Well, they could.
[00:03:07.720 --> 00:03:08.520] I don't know.
[00:03:08.520 --> 00:03:16.280] Hopefully this won't sink the deal because it would be awesome if The Onion gets InfoWars.
[00:03:16.600 --> 00:03:26.040] Also, the people who sued Alex Jones, who were getting some of the money, essentially gave The Onion some of their money to get the bid up enough to win.
[00:03:26.040 --> 00:03:26.840] Oh, interesting.
[00:03:27.160 --> 00:03:28.840] So they're investors, in a sense.
[00:03:29.320 --> 00:03:31.880] I guess so, yeah, because they wanted this to happen, right?
[00:03:31.880 --> 00:03:36.600] And they didn't wanted to keep it away from Alex John, you know, a company affiliated with Alex Jones.
[00:03:36.600 --> 00:03:37.080] Okay.
[00:03:37.080 --> 00:03:37.400] All right.
[00:03:37.400 --> 00:03:40.360] Well, yeah, it sounds like a delay tactic more than anything to me.
[00:03:40.360 --> 00:03:42.280] Yeah, I mean, I don't know.
[00:03:42.280 --> 00:03:45.520] I think they legitimately want he wants to get it if he's got the money.
[00:03:45.680 --> 00:03:52.800] I don't know how much they paid for it, but I mean, it's, you know, look, people just don't, people will throw lawsuits out there like crazy today, you know?
[00:03:52.800 --> 00:03:54.640] Like, it's all BS.
[00:03:54.640 --> 00:03:56.000] Like, he didn't do it.
[00:03:56.000 --> 00:03:56.880] He lost it.
[00:03:56.880 --> 00:03:57.840] He wasn't on his game.
[00:03:57.840 --> 00:03:58.880] Whatever the problem was.
[00:03:58.880 --> 00:04:00.320] The time came and the time went.
[00:04:00.320 --> 00:04:02.320] It's all forecasted.
[00:04:02.320 --> 00:04:04.640] It's not like they, you know, secretly put it out.
[00:04:04.800 --> 00:04:06.560] Like, it was in the public eye.
[00:04:06.560 --> 00:04:07.120] That's it.
[00:04:07.120 --> 00:04:09.120] Yeah, we'll update you on if there's any.
[00:04:09.280 --> 00:04:12.160] I think it's going to be go before the judge next week.
[00:04:12.160 --> 00:04:12.400] Yeah.
[00:04:12.400 --> 00:04:12.880] Well, we'll see.
[00:04:13.120 --> 00:04:13.440] Okay.
[00:04:13.440 --> 00:04:14.160] Quick resolution.
[00:04:14.160 --> 00:04:14.640] Let's go.
[00:04:14.880 --> 00:04:16.720] Hopefully there'll be a good resolution.
[00:04:16.720 --> 00:04:16.880] All right.
[00:04:16.880 --> 00:04:22.480] Well, let's get into our news items because we have a great interview coming up later in the show with Kevin Fulta.
[00:04:22.480 --> 00:04:24.000] If you want to leave time for that.
[00:04:24.160 --> 00:04:31.440] I'm going to start us off by talking about giving robots a sense of self.
[00:04:31.440 --> 00:04:32.880] Oh, don't do that.
[00:04:33.520 --> 00:04:36.240] Well, brain circuit in there.
[00:04:36.720 --> 00:04:40.320] I've seen too many movies where that didn't work out as intended.
[00:04:40.320 --> 00:04:46.080] But don't they need that for like, I don't know, that embodied kind of thing, right?
[00:04:46.240 --> 00:04:51.440] So there was a recent paper published by three experts.
[00:04:51.440 --> 00:04:55.840] One is a robotic, a cognitive roboticist, right?
[00:04:55.840 --> 00:04:59.760] So they deal with AI that controls robots.
[00:04:59.760 --> 00:05:06.160] Another one is a cognitive psychologist who specializes in human-robot interactions.
[00:05:06.160 --> 00:05:08.160] And the third was a psychiatrist.
[00:05:08.160 --> 00:05:20.160] So they wrote a paper talking about the fact that we should be studying the elements of a human sense of self, the components that make that up in robots.
[00:05:20.160 --> 00:05:26.480] And then we could then do research on those elements in the robotic model.
[00:05:26.480 --> 00:05:34.040] To break this down, what's interesting about this, what I found interesting about this, obviously I'm a neuroscientist, I love the cognitive neurosciences, and I also love robots and AI.
[00:05:29.840 --> 00:05:35.080] So it all comes together.
[00:05:35.320 --> 00:05:40.280] What I've been saying for years is that AI is going to be a fantastic way to study neuroscience, right?
[00:05:40.280 --> 00:05:49.480] Because it essentially gives us an actual model that we could mess with, like what happens when we take this component out, or what happens when we dial this all the way up, or whatever.
[00:05:49.480 --> 00:05:52.360] And we could see how the pieces interact with each other.
[00:05:52.360 --> 00:05:55.000] But first, we have to know what all the pieces are, right?
[00:05:55.000 --> 00:06:00.520] So this is, you know, they're talking about what are the pieces of a sense of self.
[00:06:00.520 --> 00:06:03.960] Do you guys have, I mean, I've spoken about this before, but do you guys have any guesses?
[00:06:03.960 --> 00:06:16.280] Like, what do you think would be something very specific, like some circuit in the brain that doesn't create necessarily by itself a sense of self, but that would contribute to our sense of self?
[00:06:16.280 --> 00:06:23.240] Well, I think there's the obvious things like our proprioception, like our understanding of our own body map and where we are in space.
[00:06:23.720 --> 00:06:24.200] I don't know.
[00:06:24.200 --> 00:06:29.240] I did a whole podcast with this really interesting cognitive, I think he was a cognitive neuroscientist.
[00:06:29.240 --> 00:06:31.000] Ooh, he might have been a psychologist.
[00:06:31.000 --> 00:06:35.000] About how the concept of self only develops in relation to other.
[00:06:35.000 --> 00:06:38.280] And so you can't really have a sense of self if you're in a vacuum.
[00:06:38.280 --> 00:06:38.680] Right.
[00:06:38.680 --> 00:06:39.720] So that's correct.
[00:06:39.720 --> 00:06:45.400] And the different components, one that you're getting to, we would call is embodiment, right?
[00:06:45.400 --> 00:06:50.440] So there's a sense that you are in your body, right?
[00:06:50.440 --> 00:06:54.760] There's also a sense that you are separate from the rest of the universe.
[00:06:55.320 --> 00:06:59.560] At some point, you end and the rest of the universe begins.
[00:06:59.560 --> 00:07:02.280] What's interesting is that infants don't have that.
[00:07:02.840 --> 00:07:06.440] That develops after birth, like over time.
[00:07:06.680 --> 00:07:12.360] Then that module clicks in, and then they can see, like, this is my hand, and everything beyond that is not me.
[00:07:12.400 --> 00:07:13.960] You know, it's imagine that day in the world.
[00:07:14.200 --> 00:07:15.120] Oh, shit.
[00:07:16.800 --> 00:07:17.680] I'm not a god.
[00:07:17.680 --> 00:07:18.800] What the hell?
[00:07:14.600 --> 00:07:21.680] Well, you can see it when they recognize their own parts and stuff.
[00:07:22.000 --> 00:07:24.240] They start to see themselves in the mirror and all those good things.
[00:07:24.720 --> 00:07:26.000] They pass the mirror test.
[00:07:26.000 --> 00:07:31.680] So then another component is the sense that you own your body parts.
[00:07:31.680 --> 00:07:32.320] Oh, yeah.
[00:07:32.640 --> 00:07:34.400] And then agency, right?
[00:07:34.960 --> 00:07:35.840] Sometimes I rent out that.
[00:07:36.160 --> 00:07:38.240] Agency is a separate thing, Evan.
[00:07:38.240 --> 00:07:42.480] So then there's the sense that you control your body parts.
[00:07:42.480 --> 00:07:44.000] That's agency.
[00:07:44.000 --> 00:07:48.000] And then there's the sense that you can have an influence.
[00:07:48.000 --> 00:07:51.360] You can do stuff that influences the rest of the world, right?
[00:07:51.360 --> 00:07:54.640] You could do something that would affect something else.
[00:07:54.960 --> 00:08:00.880] And there's also the sense that other people have the same kind of sense of self that you have.
[00:08:00.880 --> 00:08:02.000] Yeah, that comes much later.
[00:08:02.000 --> 00:08:02.640] That comes later.
[00:08:02.640 --> 00:08:03.040] Yeah.
[00:08:03.520 --> 00:08:05.680] What does that have to do with your own sense of self?
[00:08:06.160 --> 00:08:08.720] Because you have to have a theory of mind, right?
[00:08:08.960 --> 00:08:15.520] And the theory of mind is that you have to know what an agent is in order to feel like you're an agent.
[00:08:15.520 --> 00:08:22.000] And if you are an agent, that has to mean that other people are also their own agents.
[00:08:22.000 --> 00:08:23.200] And they're not you.
[00:08:23.200 --> 00:08:24.960] They are separate from you.
[00:08:24.960 --> 00:08:39.920] The thing is, and I think a lot of people get tripped up who haven't, you know, who are not neuroscientists or who haven't thought about this, is that these things seem so fundamental, you might wonder, it's like, do we really need a circuit in the brain to make us feel that way?
[00:08:40.000 --> 00:08:41.680] These things are true.
[00:08:42.000 --> 00:08:43.120] But that's naive.
[00:08:43.120 --> 00:08:46.560] That's not how our consciousness works.
[00:08:46.560 --> 00:08:50.480] We don't feel that way simply because it's true.
[00:08:51.040 --> 00:08:57.920] We have to have it, it's a constructed, subjective experience that the brain has to actively make.
[00:08:57.920 --> 00:08:59.120] And it can be knocked out.
[00:08:59.120 --> 00:09:00.360] And it can get knocked out.
[00:08:59.440 --> 00:09:04.280] Mostly we know about these things when they get knocked out.
[00:09:04.600 --> 00:09:05.560] Brain injuries.
[00:08:59.840 --> 00:09:06.600] Injuries and drugs.
[00:09:06.760 --> 00:09:07.800] Those are two big ones, right?
[00:09:08.040 --> 00:09:11.080] And now we could do it with like transcranial magnetic stimulation.
[00:09:11.080 --> 00:09:13.880] We could turn off this circuit or turn off that circuit.
[00:09:13.880 --> 00:09:18.040] And so there are drugs, for example, that will make you have an out-of-body experience.
[00:09:18.040 --> 00:09:19.400] What's an out-of-body experience?
[00:09:19.400 --> 00:09:22.040] You lose your sense that you are in your body.
[00:09:23.720 --> 00:09:36.920] Or you feel like, and I remember when we interviewed a neuroscientist who did LSD, she said that when she took it, her body expanded to the size of the universe.
[00:09:37.400 --> 00:09:38.600] Shawn of Jennifer Willette.
[00:09:38.600 --> 00:09:41.000] Yeah, she and Sean did LSD for one of her books.
[00:09:41.000 --> 00:09:42.760] Although I don't think that's who that quote is from.
[00:09:42.760 --> 00:09:44.680] It's Susan Blackmore.
[00:09:44.680 --> 00:09:45.160] Oh, okay.
[00:09:45.640 --> 00:09:51.800] And but in any case, so what is that other than you not feeling you are separate from the universe?
[00:09:51.800 --> 00:09:53.720] You feel one with the universe, right?
[00:09:53.720 --> 00:10:00.680] Which feels like a spiritual experience, but it's just a breakdown of your sense that you are not one with the universe, right?
[00:10:00.680 --> 00:10:04.600] Which is, again, is an actively constructed sense that you have.
[00:10:04.600 --> 00:10:10.200] Because how can you have a sense of self unless you're distinguished from not self, right?
[00:10:10.520 --> 00:10:13.480] So you're in your body, self versus not-self.
[00:10:13.480 --> 00:10:15.720] And then we talk about alien hand syndrome.
[00:10:15.720 --> 00:10:17.000] What's alien hand syndrome?
[00:10:17.000 --> 00:10:22.840] The circuit that makes you feel as if you control your body is not working at some point.
[00:10:23.160 --> 00:10:24.760] How does that circuit work?
[00:10:24.760 --> 00:10:27.560] Well, first you decide you want to make a move.
[00:10:27.560 --> 00:10:31.160] You make a move, and then you feel and see the move.
[00:10:31.160 --> 00:10:34.440] And if it matches, your brain says you control your hand.
[00:10:34.440 --> 00:10:39.800] If it doesn't match, or that circuit is broken, you don't feel as if you are controlling your hand.
[00:10:39.800 --> 00:10:44.200] You feel like it's acting on its own agency, not your agency.
[00:10:44.200 --> 00:10:49.520] Even though subconsciously it may be your agency, you don't feel that it is.
[00:10:49.840 --> 00:10:59.040] So people will, with alien hand syndrome, will like be walking down the street and then their hand empties their pockets onto the sidewalk.
[00:10:59.840 --> 00:11:17.840] And there's the famous example in Oliver Sachs' book where a patient kept waking up on the floor and they discovered that he was, he thought that there was a cadaver leg in his bed and he would throw it out of the bed out of fear and then he would go with it because it was his own leg.
[00:11:18.160 --> 00:11:19.760] That's a different phenomenon.
[00:11:19.760 --> 00:11:20.720] That's neglect.
[00:11:20.720 --> 00:11:21.440] That's neglect.
[00:11:21.440 --> 00:11:36.400] Yeah, so that's the sense that a part of your body doesn't belong to you because it's not part of, because if you feel like a stroke in your right hemisphere, you won't know that the left side of your body belongs to you.
[00:11:36.720 --> 00:11:41.760] So you're talking about the difference between it feeling like it's possessed by somebody else or feeling like it's not possessed at all?
[00:11:41.760 --> 00:11:46.160] Well, so this is the difference between feeling like you own it versus feeling like you control it.
[00:11:46.160 --> 00:11:49.280] Alien hand syndrome is the lack of feeling that you control it.
[00:11:49.760 --> 00:11:50.960] And then neglect, it's a lack of feeling.
[00:11:51.200 --> 00:11:54.720] Neglect can result in a lack of sense that you own it.
[00:11:54.720 --> 00:11:58.720] So, yeah, and what patients will say, there's another patient in the bed with me, because that's not my leg.
[00:11:58.720 --> 00:12:00.240] It must be another patient's leg.
[00:12:00.240 --> 00:12:10.320] And the quickie bedside test we do is we take their paralyzed hand, you know, the one that they're neglecting, we hold it in front of their face, and we say, whose hand is this?
[00:12:10.320 --> 00:12:15.120] And they invariably say, that's your hand, even though you're showing them their own hand.
[00:12:15.440 --> 00:12:19.360] Because they don't feel like it's part of them, and therefore it isn't, right?
[00:12:19.760 --> 00:12:24.240] But a good example of both versions, or of like the two, how those things are actually separate.
[00:12:24.240 --> 00:12:24.560] They are.
[00:12:24.560 --> 00:12:29.880] They are distinct phenomena, although obviously these are all related, but these are the components that are distinct.
[00:12:29.880 --> 00:12:34.760] Steve, if you put on an alien costume, would that enhance your alien hand syndrome?
[00:12:34.760 --> 00:12:36.120] No, it wouldn't.
[00:12:29.440 --> 00:12:36.760] It's not necessary.
[00:12:37.240 --> 00:12:38.840] Now, one chuckle from anybody.
[00:12:38.840 --> 00:12:39.000] No.
[00:12:39.320 --> 00:12:40.280] It would look cool.
[00:12:41.720 --> 00:12:43.400] I was laughing on the inside.
[00:12:46.760 --> 00:12:48.520] A couple other interesting wrinkles here.
[00:12:48.520 --> 00:12:54.440] You can have a sense of ownership over a part of the body you don't, that doesn't exist, right?
[00:12:54.440 --> 00:12:58.920] So you can have what's called a supernumerary phantom limb.
[00:12:59.240 --> 00:13:00.280] So this happened.
[00:13:00.280 --> 00:13:07.640] These are now like you have a stroke, the ownership module gets disconnected from the paralyzed limb, but it's still working.
[00:13:07.640 --> 00:13:10.120] It's just not getting any feedback from your arm.
[00:13:10.120 --> 00:13:11.720] So it makes up an arm.
[00:13:12.840 --> 00:13:18.440] You have a phantom limb, and it's separate from your real physical limb.
[00:13:18.440 --> 00:13:20.360] And you feel like you own it.
[00:13:20.360 --> 00:13:21.240] It's part of you.
[00:13:21.240 --> 00:13:22.600] You control it.
[00:13:22.600 --> 00:13:27.080] You just can't obviously manipulate external reality with it because it exists only as a figment in your mind.
[00:13:27.320 --> 00:13:29.880] What about your homunculus in that scenario, Steve?
[00:13:29.880 --> 00:13:33.240] Yeah, I mean, the homunculus is somewhat plastic, right?
[00:13:33.720 --> 00:13:35.320] You can rearrange itself.
[00:13:35.320 --> 00:13:41.000] So it would have, by definition, would that have adapted to that phantom supernumerary limb?
[00:13:41.000 --> 00:13:42.840] They tend to go away over time.
[00:13:42.840 --> 00:13:45.960] So they tend not to persist indefinitely.
[00:13:47.560 --> 00:13:51.960] The most extreme case was somebody who had four phantom limbs.
[00:13:52.280 --> 00:13:54.760] Four supernumerary phantom limbs.
[00:13:54.760 --> 00:13:58.280] So he was literally Doc Ock because he had eight limbs.
[00:13:58.280 --> 00:14:00.040] But four of them weren't real.
[00:14:00.040 --> 00:14:09.480] So, you know, you know, and there's also phantom limb when you you like have a paralyzed arm, or you have like an amputated arm rather, and you still feel like it's there.
[00:14:09.480 --> 00:14:14.880] Because the arm is physically gone, but your ownership module still owns the limb.
[00:14:14.600 --> 00:14:18.240] It's that circuit is still there, even though there's no physical arm in place.
[00:14:18.560 --> 00:14:19.920] And Kare, have you ever heard of this?
[00:14:19.920 --> 00:14:28.160] I forget what the technical name of it is, but there's a very rare syndrome where people feel like they don't own parts of their actual body.
[00:14:28.160 --> 00:14:34.560] And they often present to doctors saying, I want you to amputate this thing attached to me.
[00:14:35.040 --> 00:14:35.840] Oh, interesting.
[00:14:35.840 --> 00:14:36.800] Because it's not me.
[00:14:36.800 --> 00:14:38.880] And it makes them feel very uncomfortable.
[00:14:38.880 --> 00:14:43.840] And there's a huge ethical discussion going on as to what is the appropriate thing to do about it.
[00:14:44.880 --> 00:14:45.840] Like their liver?
[00:14:46.160 --> 00:14:48.800] No, like they're not.
[00:14:48.880 --> 00:14:51.840] Like, let's say they don't want their, yeah, they're pinky.
[00:14:53.120 --> 00:14:55.600] But imagine feeling like your left arm is not you.
[00:14:55.600 --> 00:14:57.440] It's attached to you.
[00:14:57.760 --> 00:15:02.720] And there's something neurological going on, but they're probably often misdiagnosed as having some form of psychosis.
[00:15:02.720 --> 00:15:02.960] Right.
[00:15:02.960 --> 00:15:03.360] Yeah, where do you think that's a good idea?
[00:15:04.000 --> 00:15:05.200] Medicines help with that.
[00:15:06.160 --> 00:15:07.440] Not if that's not what's wrong.
[00:15:07.440 --> 00:15:09.760] He's saying there's a circuit in their brain that's faulty.
[00:15:09.760 --> 00:15:10.000] Right.
[00:15:10.400 --> 00:15:11.120] And it's so rare.
[00:15:11.120 --> 00:15:14.720] We don't really have a lot of research into it, so I don't know that we have it all fleshed out.
[00:15:14.720 --> 00:15:17.680] And certainly not a lot of treatment trials or anything with it.
[00:15:17.680 --> 00:15:18.240] Yeah.
[00:15:18.240 --> 00:15:23.040] So all of these components can be disconnected from each other is the cool part.
[00:15:23.040 --> 00:15:44.800] So now getting back to the robot thing with the paper, what they want to do is say, yeah, now that we kind of know all of these components, mainly from people with strokes or people on drugs or whatever, wherever these circuits go off for one reason or another, or anoxia or whatever, you know, this is, again, part of, often part of a near-death experience, for example, the out-of-body experience.
[00:15:44.800 --> 00:15:54.000] Let's see if we can replicate these components in a robot, make a robot feel as if it's inside its robot body.
[00:15:54.000 --> 00:15:56.080] And so they could replicate the circuits.
[00:15:56.080 --> 00:16:07.640] For example, for the agency module, you could say, like, the robot knows what it wants to do, meaning that at least there's the circuit that says you're going to raise your right arm.
[00:16:07.640 --> 00:16:11.560] So that information is in the AI controlling the robot.
[00:16:11.560 --> 00:16:20.920] It then will raise the arm, and then you have sensors that feed back to say how is the limb moving and does it match what you intended to do.
[00:16:20.920 --> 00:16:26.040] And if it does match, you have to give some kind of positive feedback to the algorithm.
[00:16:26.040 --> 00:16:33.160] And so that would essentially mimic that loop that exists in a human brain that makes you feel as if you own your body part, right?
[00:16:33.320 --> 00:16:35.080] Aren't they kind of doing that now, though?
[00:16:35.080 --> 00:16:46.600] To some extent, but they want to explicitly try to replicate these components of a sense of self that humans have in a robot and then see how that influences the robot's behavior.
[00:16:46.600 --> 00:16:51.240] Maybe it will improve the robot's ability to control its movements, for example.
[00:16:51.240 --> 00:16:53.960] Maybe it will have some consequences.
[00:16:54.520 --> 00:16:57.480] Probably we evolved these things for a reason.
[00:16:57.800 --> 00:17:00.920] Maybe give it a psychosis, some robotic psychosis.
[00:17:00.920 --> 00:17:01.480] Yeah.
[00:17:01.800 --> 00:17:09.400] Now, the question is: now, if we want to keep going forward with this, if you add all these things together, what does that add up to?
[00:17:09.720 --> 00:17:16.360] Does it add up to an actual sense of self that the robot has?
[00:17:16.360 --> 00:17:27.320] Now, I don't think that if we had all these circuits disconnected from each other and not connected to anything, that was also trying to replicate consciousness.
[00:17:27.320 --> 00:17:28.200] Consciousness, yeah.
[00:17:28.200 --> 00:17:28.520] Yeah.
[00:17:28.520 --> 00:17:29.960] So, and what is consciousness?
[00:17:29.960 --> 00:17:33.720] Again, that we could try to replicate that in an AI/slash robot.
[00:17:33.720 --> 00:17:39.320] That is, you know, essentially, if I had to strip it down, what we know now is wakeful consciousness.
[00:17:39.320 --> 00:17:42.760] If we're talking about clinically now, what is wakeful consciousness?
[00:17:42.760 --> 00:17:48.560] It is a constant communication that the brain is having with itself, right?
[00:17:48.800 --> 00:17:55.680] There's just this constant loop of neurological activity, and it's being activated by the brainstem.
[00:17:55.680 --> 00:18:03.920] The brainstem is constantly giving your cortex a kick in the ass, and then every time something happens, it leads to something else, which leads to something else, which leads to something else.
[00:18:03.920 --> 00:18:10.400] And if that just keeps happening, that chain of neurological events is your stream of consciousness, right?
[00:18:10.400 --> 00:18:21.760] And it's taking in sensory information, it's taking information from your own body, it's taking in information from other parts of your brain that are communicating with each other, and it produces the stream of consciousness, right?
[00:18:21.760 --> 00:18:23.680] That's sort of self-propelling.
[00:18:23.680 --> 00:18:27.440] And it's not just excitatory, it's like there's a lot of inhibitory action going on.
[00:18:27.440 --> 00:18:30.640] Well, yeah, then at the high levels, there's a lot of stuff's happening.
[00:18:30.640 --> 00:18:34.000] You have to sort of inhibit the stuff to control your behavior.
[00:18:34.000 --> 00:18:35.360] So it's not just chaos.
[00:18:35.600 --> 00:18:37.040] Or a massive seizure.
[00:18:37.040 --> 00:18:41.040] Yeah, well, that's the seizures are inhibited at a really basic level.
[00:18:41.040 --> 00:18:52.560] Like if you, every time you, like one neural circuit, bunch of clump of neurons sends a signal to another clump of neurons, they inhibit all of the adjacent neurons.
[00:18:52.560 --> 00:18:59.920] So that's to keep it from spreading outside of the circuit to prevent seizures and prevent what we call ephaptic transmission.
[00:18:59.920 --> 00:19:01.440] Ever hear that term, Kara?
[00:19:01.440 --> 00:19:02.080] Ephactic.
[00:19:02.240 --> 00:19:03.040] Ephaptic.
[00:19:03.040 --> 00:19:03.680] EPH.
[00:19:03.760 --> 00:19:04.160] Ephactic.
[00:19:04.240 --> 00:19:04.880] Ephaptic.
[00:19:04.880 --> 00:19:11.600] Yeah, basically meaning not through a circuit, but just spreading it to adjacent neurons.
[00:19:12.320 --> 00:19:21.040] Yeah, seizures are when the bunch of cells or neurons are firing, not along pathways, but just because they're all next to each other and they're just all firing.
[00:19:21.360 --> 00:19:21.600] Yep.
[00:19:21.600 --> 00:19:22.400] It's bad.
[00:19:22.400 --> 00:19:22.880] Right.
[00:19:22.880 --> 00:19:24.560] They're not good for the brain for that to happen.
[00:19:24.560 --> 00:19:24.960] Yeah.
[00:19:24.960 --> 00:19:27.200] And here's the final question I want to leave you guys with.
[00:19:27.520 --> 00:19:36.920] Is there a difference between a general sentient AI that exists only virtually and one that's in a robot?
[00:19:36.920 --> 00:19:42.680] And is it what would an AI that exists only virtually be like?
[00:19:42.680 --> 00:20:00.600] Now, the gray zone in between these two states is what I call the max headroom thing, which is that you could have a virtual body, an AI that's not in a robot, that's just on a computer, could have a virtual body and could have all of these sense of self modules running with the virtual body.
[00:20:00.600 --> 00:20:01.880] But what if you didn't do that?
[00:20:01.880 --> 00:20:05.560] What if you had none of these sense of self circuits running?
[00:20:05.560 --> 00:20:07.080] You just had the AI.
[00:20:07.080 --> 00:20:09.160] What would it experience?
[00:20:09.160 --> 00:20:11.160] And would that be sustainable?
[00:20:11.160 --> 00:20:11.960] Would that be functional?
[00:20:11.960 --> 00:20:17.000] Or do we really need to embody it in order for it to be a functioning self-aware AI?
[00:20:17.320 --> 00:20:28.440] Well, and I guess to make it even more complicated, if we're talking now about software, not hardware, software can control external hardware.
[00:20:28.440 --> 00:20:39.800] So even if it's not a robot body, if that AI has access to the grid or it has access to a server, could it then embody other machinery?
[00:20:39.960 --> 00:20:41.880] Yeah, maybe your house is its body.
[00:20:41.880 --> 00:20:42.280] Yeah.
[00:20:42.280 --> 00:20:43.800] Or a spaceship is its body.
[00:20:44.120 --> 00:20:45.720] That's Futurama right there.
[00:20:45.720 --> 00:20:46.920] Yeah, that's the best thing.
[00:20:47.080 --> 00:20:49.880] The spaceship is the, is the machine, is the machine.
[00:20:50.040 --> 00:20:51.720] But it's embodied in something.
[00:20:51.720 --> 00:20:52.840] Yeah, it's embodied in something.
[00:20:53.000 --> 00:20:55.880] It's embodied in something, but in a much more, I think, useful way.
[00:20:55.880 --> 00:20:59.880] Like that, to me, you could do a lot more with that than like a robot in a humanoid body.
[00:20:59.880 --> 00:21:01.560] Well, was HAL 9000 that?
[00:21:01.560 --> 00:21:05.080] I think so, because it had total control over the ship.
[00:21:05.560 --> 00:21:05.880] Yeah.
[00:21:06.280 --> 00:21:08.840] Yeah, except it was indistinguishable.
[00:21:08.800 --> 00:21:10.120] It seemed to be indistinguishable.
[00:21:10.280 --> 00:21:11.160] Anyway, it's fascinating.
[00:21:11.400 --> 00:21:16.320] The thing that's interesting is that we will be able to investigate all of these questions once we do it, right?
[00:21:16.320 --> 00:21:31.280] We could speculate now, but once we do it, we'll get a much better sense of how this sense of self and embodiment affects artificial intelligence, whether it's virtual or just in the void or if it's embodied in a robot.
[00:21:31.280 --> 00:21:32.400] We will see.
[00:21:32.400 --> 00:21:33.120] Cool.
[00:21:33.120 --> 00:21:39.840] All right, Kara, tell us about the new energy secretary, or at least the proposed new energy secretary.
[00:21:40.000 --> 00:21:42.320] You sound so happy when you say that.
[00:21:43.920 --> 00:21:49.520] I feel like this is going to be a new series because as I was prepping this, I found out that Dr.
[00:21:49.520 --> 00:21:53.920] Oz has been selected for Medicare and Medicaid.
[00:21:53.920 --> 00:21:54.560] Oh, what?
[00:21:57.040 --> 00:22:00.720] But I did not do a deep dive for that one.
[00:22:00.720 --> 00:22:01.600] So just set that up.
[00:22:01.840 --> 00:22:04.320] I thought a board of trustees dealt with that.
[00:22:04.320 --> 00:22:05.440] I guess there's somebody at the top.
[00:22:05.840 --> 00:22:07.200] There's always somebody at the top.
[00:22:07.520 --> 00:22:10.080] There's always somebody appointed who's in charge.
[00:22:10.080 --> 00:22:12.000] So instead, we're not going to talk about Dr.
[00:22:12.000 --> 00:22:13.760] Oz, at least not this week.
[00:22:14.080 --> 00:22:18.240] We will be talking about Chris Wright.
[00:22:18.640 --> 00:22:21.840] His full name is Christopher Allen Wright.
[00:22:21.840 --> 00:22:25.040] He's the CEO of Liberty Energy.
[00:22:25.040 --> 00:22:37.760] It's the second largest fracking company in the U.S., and he is the presumptive nominee for United States Secretary of Energy under this next Trump presidency.
[00:22:37.760 --> 00:22:42.400] He's obviously got a lot of experience in the energy sector.
[00:22:42.400 --> 00:22:51.520] He is a board member of a nuclear energy company, also a board member of a royalty payment company for mineral rights and mining rights.
[00:22:51.520 --> 00:22:56.480] But there's a little bit of a wrinkle in that he does not believe in climate change.
[00:22:56.480 --> 00:22:58.560] Didn't he also work for a solar company, too?
[00:22:58.560 --> 00:22:59.040] I heard?
[00:22:59.040 --> 00:23:03.320] He's worked for, yeah, he's been on boards and worked for like companies across the board.
[00:23:03.480 --> 00:23:18.600] And that's what Trump really pushed when he did his post on, I think, Truth Social, where he said, I'm thrilled to announce that Chris Wright will be joining my administration as both United States Secretary of Energy and member of the newly formed Council of National Energy.
[00:23:18.600 --> 00:23:21.560] He's been a leading technologist and entrepreneur in energy.
[00:23:21.560 --> 00:23:24.840] He's worked in nuclear, solar, geothermal, and oil and gas.
[00:23:25.080 --> 00:23:27.880] He is an oil and gas executive.
[00:23:27.880 --> 00:23:31.880] He is a firm believer.
[00:23:31.880 --> 00:23:39.880] Well, I actually, it's hard to know what somebody actually believes in their mind, but he's a firm proponent, not proponent, that's not the right word either.
[00:23:40.200 --> 00:23:48.280] He claims that there are no negative impacts from fossil fuel energy on the climate.
[00:23:48.920 --> 00:23:49.480] Wow.
[00:23:49.880 --> 00:23:51.320] That's a remarkable statement.
[00:23:51.640 --> 00:23:52.360] Pablo.
[00:23:52.360 --> 00:23:52.920] Thank you.
[00:23:52.920 --> 00:23:53.400] Despite that.
[00:23:53.640 --> 00:24:11.000] He claims in a video that he posted on his LinkedIn, and this is what he labeled the video: five commonly used words around energy and climate that are both deceptive and destructive: climate crisis, energy transition, carbon pollution, clean energy, and dirty energy.
[00:24:11.000 --> 00:24:13.240] Hashtag energy sopriety.
[00:24:13.240 --> 00:24:23.240] So he claims that, quote, we have seen no increase in the frequency or intensity of hurricanes, tornadoes, droughts, or floods, despite endless fear-mongering.
[00:24:23.240 --> 00:24:26.200] He says that there is no climate crisis.
[00:24:26.200 --> 00:24:41.640] And he goes on in this 12 and a half minute video that he posted to his LinkedIn about a year ago to basically argue that carbon dioxide cannot be a pollutant and carbon dioxide cannot have all of these downstream negative consequences because it's natural.
[00:24:42.600 --> 00:24:47.680] Because it's a natural phenomenon that occurs via photosynthesis.
[00:24:47.680 --> 00:24:49.600] Right, which is a nonsensical argument.
[00:24:49.600 --> 00:24:50.880] Yeah, and respiration.
[00:24:44.680 --> 00:24:51.120] Yeah.
[00:24:51.440 --> 00:24:53.200] So it's pretty scary.
[00:24:53.200 --> 00:25:02.400] He says that there is no climate crisis and the negative impacts from climate change, because of course he can't fully argue that climate change doesn't exist.
[00:25:02.400 --> 00:25:04.400] Like there are very few people who do that now.
[00:25:04.400 --> 00:25:06.400] Instead, they've sort of moved the goalposts.
[00:25:06.400 --> 00:25:11.760] And he says that the negative impacts from climate change are less than the benefits of using fossil fuels.
[00:25:11.760 --> 00:25:22.480] So he is a firm believer that we need to continue to drill, we need to continue to burn, that these approaches to energy are going to allow us energy independence.
[00:25:22.480 --> 00:25:26.640] And according to Donald Trump's Truth Social Post, energy, U.S.
[00:25:26.720 --> 00:25:29.280] energy dominance, he put it in all caps.
[00:25:30.640 --> 00:25:31.200] Dominant.
[00:25:31.520 --> 00:25:35.520] Yeah, which is a large goal of the administration.
[00:25:35.840 --> 00:25:41.120] You know, to be fair, Chris Wright has worked in alternative energy.
[00:25:41.120 --> 00:25:45.440] He works in energy, which means he's worked in renewables and non-renewables.
[00:25:45.600 --> 00:25:52.400] It does appear, I cannot speak for him, but it does appear that the motivation here is money.
[00:25:52.720 --> 00:25:54.480] It's not clean energy.
[00:25:54.480 --> 00:25:56.880] It's just energy, right?
[00:25:57.200 --> 00:26:04.640] And however it's going to be the most lucrative and the easiest to produce that energy is going to be the path.
[00:26:04.640 --> 00:26:10.320] And that's what's so scary because we know the cost now of natural gas.
[00:26:10.320 --> 00:26:13.360] We know the cost of crude oil.
[00:26:13.360 --> 00:26:16.000] We know the cost of fracking.
[00:26:16.000 --> 00:26:19.520] And these just aren't arguments that he's making.
[00:26:19.520 --> 00:26:25.520] And he's going to have inordinate power if he's confirmed by the Senate to lead the Department of Energy.
[00:26:25.760 --> 00:26:27.040] He will 100% be confirmed.
[00:26:27.040 --> 00:26:28.720] There's no way that they're not going to confirm him.
[00:26:28.720 --> 00:26:29.760] Oh, God, don't say that.
[00:26:30.200 --> 00:26:31.400] But let me tell you this, Kara.
[00:26:31.400 --> 00:26:34.520] I'm going to make an argument for why this is not as bad as it seems.
[00:26:34.840 --> 00:26:35.400] I know where you're going.
[00:26:35.640 --> 00:26:36.120] Good luck.
[00:26:37.800 --> 00:26:41.000] And I'm not just comparing him to the other secretary, whatever other people.
[00:26:41.240 --> 00:26:42.120] No, I know where you're going with this.
[00:26:42.440 --> 00:26:43.000] It's an argument.
[00:26:43.320 --> 00:26:43.720] It's an argument.
[00:26:44.120 --> 00:26:44.760] Here's my argument.
[00:26:44.920 --> 00:26:45.640] Let me put it out.
[00:26:46.120 --> 00:26:51.640] Obviously, it's bad to have somebody in that position who just straight up denies the science, right?
[00:26:51.640 --> 00:26:52.680] That's not a good thing.
[00:26:52.680 --> 00:27:02.520] And this will absolutely be a setback, and it would be worse, obviously, than if we had somebody who was fully on board with transitioning away from fossil fuels, which he isn't.
[00:27:03.320 --> 00:27:15.240] But at this point in time, essentially, we have two strategies for transitioning to renewable energy, green energy, low-carbon energy, and away from fossil fuels.
[00:27:15.240 --> 00:27:23.960] And these are not mutually exclusive, but there's some combination of reducing supply and reducing demand for fossil fuels, right?
[00:27:24.280 --> 00:27:29.640] So far, we are not taking the reduce the supply approach.
[00:27:29.640 --> 00:27:41.480] Under the Biden administration, the United States is producing more fossil fuels than any other country at any time ever in human history, including during Trump's administration.
[00:27:42.360 --> 00:27:43.400] So we're right on plan then.
[00:27:43.720 --> 00:27:46.440] Is that a function of just there being more people in more need?
[00:27:46.680 --> 00:27:48.760] It's a function of, you know what it is?
[00:27:48.760 --> 00:27:51.160] It's a function of Russia invading Ukraine.
[00:27:51.160 --> 00:27:59.960] So we, of course, we jacked up our oil and gas production to essentially displace Russia's sales to Europe.
[00:28:00.440 --> 00:28:09.560] We're trying to replace Russia's sales to Europe of natural gas and oil, and that put us over the top to more production than we've ever done before.
[00:28:09.560 --> 00:28:14.880] So, the idea that it's always been silly for Trump to say, we're going to bring oil back and we're going to be dominant.
[00:28:14.880 --> 00:28:15.760] We're already there, dude.
[00:28:14.600 --> 00:28:20.560] He's already producing more oil than we ever have before, or anyone has ever had.
[00:28:20.880 --> 00:28:22.800] But Biden didn't do that willy-nilly.
[00:28:23.120 --> 00:28:24.080] No, he did it deliberately.
[00:28:24.320 --> 00:28:26.240] He was deliberate with success.
[00:28:26.400 --> 00:28:28.320] But the point is, we're already there.
[00:28:28.320 --> 00:28:30.640] We're already producing all this oil.
[00:28:30.640 --> 00:28:33.280] And we're doing that to keep prices down.
[00:28:33.280 --> 00:28:38.720] Now, keeping prices down is actually a good thing because it lowers the value of pulling that oil out of the ground.
[00:28:38.720 --> 00:28:45.760] It's also good because it takes money away from authoritarians who are basically funded by the sale of fossil fuels.
[00:28:45.760 --> 00:28:46.240] Right.
[00:28:46.480 --> 00:28:50.720] So, ideally, ideally, we will reduce demand first.
[00:28:50.720 --> 00:28:52.240] How do we reduce demand first?
[00:28:52.240 --> 00:28:58.560] Because then that again reduces the cost, reduces the value of fossil fuel and the incentive to go after it.
[00:28:58.560 --> 00:29:02.160] You do that by transitioning to green energy, right?
[00:29:02.160 --> 00:29:09.920] So, fewer cars that run on gas, fewer cars that are running gas, and then fewer coal and gas-powered energy production.
[00:29:09.920 --> 00:29:19.520] So, for right now, it's more important that we build non-fossil fuel resources than that we restrict fossil fuel.
[00:29:19.520 --> 00:29:22.320] Eventually, we have to dial down the fossil fuel.
[00:29:22.320 --> 00:29:31.280] But for now, if we're just investing in expanding our non-fossil fuel infrastructure, that's fine.
[00:29:31.280 --> 00:29:42.560] And so, my hope is: again, this is just a hope, but there has been a lot of good news recently on the nuclear power front, which I just summarized in my blog post.
[00:29:42.560 --> 00:29:53.600] The Biden administration and also a consortium of countries around the world have pledged to triple nuclear power capacity by 2050.
[00:29:53.600 --> 00:29:54.320] Good.
[00:29:54.320 --> 00:29:55.040] Wow.
[00:29:55.040 --> 00:29:55.600] Triple.
[00:29:55.600 --> 00:29:56.560] That's huge.
[00:29:56.560 --> 00:29:57.600] That is huge.
[00:29:57.600 --> 00:29:58.880] It's a big piece of the puzzle.
[00:29:59.320 --> 00:30:09.000] Yeah, now, if by 2050 we have a 50% increase in our energy demand, that means doubling the nuclear percentage of production.
[00:30:09.000 --> 00:30:11.320] So right now we're about 19, 20%.
[00:30:11.320 --> 00:30:15.960] So we're talking about going to about 40% nuclear worldwide and in the U.S.
[00:30:16.200 --> 00:30:18.760] And that's probably where we should be.
[00:30:18.760 --> 00:30:26.840] So I don't know of any reason why this guy or why the Trump administration is going to undo nuclear.
[00:30:27.000 --> 00:30:28.840] No, I think he's pro-nuclear.
[00:30:28.920 --> 00:30:29.640] They're pro-nuclear.
[00:30:29.800 --> 00:30:31.240] This has broad bipartisan support.
[00:30:31.320 --> 00:30:34.200] So yeah, this has broad bipartisan support.
[00:30:34.200 --> 00:30:40.280] So as long as this keeps happening, that could keep us on pace to where we need to get by 2050.
[00:30:41.000 --> 00:30:43.240] It may not be good for the solar or the wind industry.
[00:30:43.560 --> 00:30:44.120] I get that.
[00:30:44.120 --> 00:30:46.120] That's where I'm more concerned.
[00:30:46.120 --> 00:30:57.000] But here's the thing: some people have argued that because wind and solar are currently the cheapest form of new energy to add to the grid, that it doesn't need a lot of subsidies at this point in time.
[00:30:57.000 --> 00:30:59.560] Companies are doing it because it's the cheapest.
[00:30:59.560 --> 00:31:10.520] And so hopefully that will have inertia unless they actively try to inhibit it, which they may, which Trump may just decide to mess with the wind industry just to do it because he doesn't like wind.
[00:31:10.920 --> 00:31:12.520] It kills birds, you know, and stuff.
[00:31:12.520 --> 00:31:15.720] I don't think this guy would do that because, as you say, he's kind of neutral.
[00:31:16.040 --> 00:31:22.040] Yeah, I think he's neutral about the source, like from a moralistic perspective, but that's actually a bad thing.
[00:31:22.360 --> 00:31:23.080] Yeah, I agree.
[00:31:23.080 --> 00:31:36.840] But I mean, but for now, doing the all of the above so that at least the renewables and the nuclear and the geothermal and the hydroelectric can still grow and expand, it may not be that much of a disaster, is what I'm saying.
[00:31:36.840 --> 00:31:38.040] It may not be bad.
[00:31:38.600 --> 00:31:39.160] I don't know.
[00:31:39.160 --> 00:31:42.760] I think that he wants to fully deregulate oil and gas.
[00:31:42.840 --> 00:31:47.440] They're going to deregulate, but they're also probably going to deregulate nuclear and deregulate solar and wind, too.
[00:31:44.840 --> 00:31:49.360] Yeah, but that's not going to.
[00:31:49.520 --> 00:31:56.160] So all of those things, yes, are going to make for more competition for alternative sources in the marketplace.
[00:31:56.160 --> 00:31:59.680] But what they don't do is they do not mitigate the pollution.
[00:32:00.000 --> 00:32:00.480] Of course.
[00:32:00.480 --> 00:32:00.880] Absolutely.
[00:32:01.680 --> 00:32:04.400] That is what is actually causing the climate crisis.
[00:32:04.400 --> 00:32:05.040] I agree.
[00:32:05.040 --> 00:32:14.320] But I think though my point is it's really complicated to try to figure out over the next four years what the net effect of this is going to be.
[00:32:14.320 --> 00:32:22.080] And if they continue to expand the non-fossil fuel infrastructure, it may not be that dramatic a difference.
[00:32:22.080 --> 00:32:32.240] And if we are in a much better place in terms of more nuclear, more wind, more solar in four years, that might be a better time to start really thinking of ways to dial back fossil fuels.
[00:32:32.240 --> 00:32:34.160] It wasn't going to happen in the short term anyway.
[00:32:34.160 --> 00:32:35.600] It wasn't happening under Biden.
[00:32:35.600 --> 00:32:38.000] It's definitely not going to happen under Trump.
[00:32:38.000 --> 00:32:42.880] So how quickly does, I mean, don't these nuclear plants take quite a while to get online.
[00:32:43.200 --> 00:32:48.960] But part of what Biden is already doing, he also put together, I mean, there's so many things going on.
[00:32:48.960 --> 00:32:53.520] So he announced $900 million to support startup Gen 3 nuclear reactors.
[00:32:53.520 --> 00:33:00.960] He was part of 25 signatories pledging tripling nuclear capacity by 2050.
[00:33:00.960 --> 00:33:13.360] And also there's the Advance Act, which was just passed with bipartisan support, which streamlines regulations and also provides sweeping support for the nuclear industry.
[00:33:13.360 --> 00:33:24.320] So they're trying to figure out ways, specifically, there's a commission, it's like figure out ways to make us be able to build nuclear reactors cheaper and faster and to streamline all the regulations.
[00:33:24.320 --> 00:33:25.680] That's already happening.
[00:33:25.680 --> 00:33:30.000] Again, I don't see that being undone under this guy or Trump.
[00:33:30.200 --> 00:33:45.400] You know, and this is my spidey senses picking up, but like, while I agree that the regulatory burden is high right now, and we've talked about this on the show before, and there does need to be some streamlining, I think that it is still very important.
[00:33:45.720 --> 00:33:46.200] Absolutely.
[00:33:46.200 --> 00:33:46.920] You can go too far.
[00:33:47.400 --> 00:33:48.200] Safely.
[00:33:48.200 --> 00:33:48.600] Absolutely.
[00:33:48.760 --> 00:33:54.120] And I'm very worried that too much streamlining could lead to disaster.
[00:33:54.120 --> 00:33:55.560] And we should be worried about that.
[00:33:55.560 --> 00:33:56.200] We should be worried about that.
[00:33:56.280 --> 00:33:59.480] Because then that will set us back, like the last disaster set us back.
[00:33:59.720 --> 00:34:01.720] So the devil's always going to be in the details.
[00:34:01.720 --> 00:34:03.560] And there is like a nuclear industry, too.
[00:34:03.560 --> 00:34:06.520] You know, they don't necessarily want to build unsafe reactors.
[00:34:06.520 --> 00:34:07.080] That's what I'm saying.
[00:34:07.080 --> 00:34:10.760] The net effect, it's hard to really calculate the net effect of all of this.
[00:34:10.760 --> 00:34:15.720] So, yes, they probably, because Trump deregulates with a machete, not with a scalpel, right?
[00:34:15.720 --> 00:34:16.680] We've seen that before.
[00:34:16.680 --> 00:34:18.920] That's clearly what they're going to do now.
[00:34:18.920 --> 00:34:22.040] And so, yeah, so that is a legitimate concern.
[00:34:22.040 --> 00:34:34.920] But, you know, the investors and the industry probably will welcome the deregulation, but hopefully, there's already international standards in place for the nuclear industry, and hopefully, they won't be eroded too much.
[00:34:35.720 --> 00:34:40.920] You know, it's like that is really where it bumps up, and that's the part where I don't have the same kind of hope that you have.
[00:34:41.400 --> 00:34:43.000] I'm trying to be positive over here, Karen.
[00:34:43.160 --> 00:34:46.760] But the other thing is, in four years, they could all snap back, you know what I mean?
[00:34:46.760 --> 00:34:52.120] Or at least some of them, or if they went too far, we could then, you know, we have time to claw that back.
[00:34:52.120 --> 00:34:54.680] It's not like whatever happens now is forever.
[00:34:54.680 --> 00:35:02.120] It's really just we're looking at four years with regards to regulation is forever if a plant is built under those regulations.
[00:35:02.280 --> 00:35:03.400] Yeah, probably not in four years.
[00:35:03.400 --> 00:35:04.280] That's a little fast.
[00:35:04.440 --> 00:35:05.720] Yeah, and that's the hope, right?
[00:35:05.720 --> 00:35:17.040] Because ultimately, while I agree with you that people who work in this sector do not want unsafe plants, there are many people who care more about profits than they do about it.
[00:35:17.040 --> 00:35:17.840] Absolutely.
[00:35:14.840 --> 00:35:19.760] So this is a complicated issue.
[00:35:20.000 --> 00:35:23.040] This is one that I am definitely going to be keeping my eye on.
[00:35:23.040 --> 00:35:26.240] I think it's not all doom and gloom.
[00:35:26.240 --> 00:35:35.920] This guy, you know, because of the nuclear thing, because that's all gaining momentum, I'm hoping that over the next four years, that's where they'll focus their efforts.
[00:35:35.920 --> 00:35:37.040] I hope so, too.
[00:35:37.040 --> 00:35:57.280] I mean, I do, I agree with you that it's not all doom and gloom, but I do think that the goalposts have been moved so far at this point that the reason it's not all doom and gloom is because, exactly like you're saying, maybe we'll have nuclear, but like the deregulation of fossil fuels is scaring the living shit out of me right now.
[00:35:57.360 --> 00:35:58.400] I got to be honest.
[00:35:58.400 --> 00:35:59.200] Yeah, I agree.
[00:35:59.200 --> 00:36:00.000] I agree.
[00:36:00.000 --> 00:36:02.000] I'm not sure how much more damage they can do.
[00:36:02.000 --> 00:36:04.480] We're already producing more than we've ever produced.
[00:36:04.480 --> 00:36:04.960] You know what I mean?
[00:36:05.040 --> 00:36:06.000] There's like only so much they can produce.
[00:36:06.160 --> 00:36:10.160] We're already producing more than we've ever produced with strong regulations in place.
[00:36:10.160 --> 00:36:13.200] So imagine when the lid comes off.
[00:36:13.840 --> 00:36:14.960] So we'll see.
[00:36:15.360 --> 00:36:16.720] We'll keep everyone updated.
[00:36:17.280 --> 00:36:22.720] We'll see where it is in the spectrum of worst case versus best case scenario.
[00:36:22.960 --> 00:36:26.720] I mean, of all of Trump's appointments, this is not the one that keeps me up at night.
[00:36:26.720 --> 00:36:27.040] Yeah.
[00:36:27.760 --> 00:36:28.880] Yes, this is not ideal.
[00:36:28.880 --> 00:36:32.400] This is basically what you would, this is exactly what I would have expected.
[00:36:32.400 --> 00:36:36.640] I guess it could have been worse, but I didn't expect somebody who fully denies climate change.
[00:36:36.800 --> 00:36:37.440] Oh, yeah, I did.
[00:36:37.440 --> 00:36:38.080] Oh, 100%.
[00:36:38.720 --> 00:36:39.680] I just thought we were past this.
[00:36:40.000 --> 00:36:41.120] No, because Trump denies it.
[00:36:41.120 --> 00:36:42.800] Trump is 100% denying climate change.
[00:36:43.120 --> 00:36:44.640] He's still saying it's a hoax.
[00:36:44.640 --> 00:36:49.920] Clearly, though, there's two people that keep me up at night in terms of the appointments right now.
[00:36:49.920 --> 00:36:53.600] One is Tulsi Gabbard, you know, because she's don't need to get into that.
[00:36:53.920 --> 00:36:57.600] As a head of intelligence, that's like actually dangerous for the country.
[00:36:57.600 --> 00:37:00.840] And the other one's RFK Jr., because he could destroy the federal health care.
[00:37:01.640 --> 00:37:02.520] Now I'm going to add Dr.
[00:37:02.520 --> 00:37:03.400] Oz to that.
[00:37:03.560 --> 00:37:03.960] Dr.
[00:36:59.840 --> 00:37:06.520] Oz is not nearly as bad as RFK Jr.
[00:37:06.840 --> 00:37:10.280] You know, I don't.
[00:37:10.280 --> 00:37:15.080] I don't know how much shift he could make at Medicaid and Medicaid.
[00:37:15.080 --> 00:37:20.760] Whereas RFK is like actively wants to cause mischief, wants to oppose vaccines.
[00:37:21.560 --> 00:37:23.800] He is the wrecking ball.
[00:37:24.120 --> 00:37:29.400] And, you know, Gabbard just doesn't know reality from fantasy, and that's dangerous as the head of intelligence.
[00:37:29.400 --> 00:37:31.240] That's very, very dangerous.
[00:37:31.240 --> 00:37:36.360] But I guess my thing is, you can say don't get vaccinated all you want.
[00:37:36.360 --> 00:37:43.720] I don't know if he can actually block a vaccine from being available to the public, but you can say don't get vaccinated all you want.
[00:37:43.720 --> 00:37:49.960] But if you defund or deeply change the structure of Medicaid and Medicare, people will die.
[00:37:49.960 --> 00:37:52.200] Lots and lots of people will die.
[00:37:52.200 --> 00:37:54.920] Well, there's lots of ways that RFK Jr.
[00:37:55.000 --> 00:37:59.160] can undermine our vaccine infrastructure in this country.
[00:37:59.800 --> 00:38:00.760] This is a separate talk.
[00:38:00.760 --> 00:38:07.480] Maybe we'll give this, you know, probably what we should do is bring David Gorsky on because he's been really writing a lot about RFK Jr.
[00:38:07.800 --> 00:38:10.680] And like we'll have a little primer last or two weeks ago.
[00:38:10.840 --> 00:38:14.600] We'll do a good deep dive on like what could he actually do?
[00:38:14.600 --> 00:38:16.600] Because that is a very interesting question.
[00:38:17.960 --> 00:38:22.200] But that's, I think, the most, you know, those two are the most scary appointees so far.
[00:38:22.360 --> 00:38:25.000] It's a big question: will he have a larger U.S.
[00:38:25.000 --> 00:38:27.000] body count than he had during COVID?
[00:38:27.000 --> 00:38:27.640] That's the big question.
[00:38:27.880 --> 00:38:28.280] Possibly.
[00:38:28.680 --> 00:38:29.800] It is possible.
[00:38:29.800 --> 00:38:31.720] Yeah, the next pandemic scares me.
[00:38:31.720 --> 00:38:34.920] All right, Bob, tell us about finding Planet 9.
[00:38:35.800 --> 00:38:36.760] My turn, is it?
[00:38:36.760 --> 00:38:37.160] Okay.
[00:38:37.480 --> 00:38:40.120] Planet X, or is it Planet 9?
[00:38:40.120 --> 00:38:41.320] Was in the news recently.
[00:38:41.320 --> 00:38:55.520] Scientists have published a proposal to use an array of 200 small telescopes that they say can prove if a massive planet indeed exists in the farthest reaches of our solar system in the region where the so-called trans-Neptunian objects dwell.
[00:38:55.520 --> 00:39:04.400] Daniel Gomez and Gary Bernstein from the Department of Physics and Astronomy, University of Pennsylvania, wrote on the online archive their paper.
[00:39:04.400 --> 00:39:11.520] It is named an automated occultation network for gravitational mapping of the trans-Neptunian solar system.
[00:39:11.520 --> 00:39:17.360] Okay, so to better appreciate this, let's explore the few bits of terminology typically found in these discussions.
[00:39:17.360 --> 00:39:20.640] First off, is it Planet X or is it Planet 9?
[00:39:20.640 --> 00:39:22.400] Planet X is more general.
[00:39:22.400 --> 00:39:25.600] That's a general term that's been used for many, many years.
[00:39:25.600 --> 00:39:27.120] Many years ago, a hundred years.
[00:39:28.560 --> 00:39:31.360] Right, used for the potential planet beyond Neptune.
[00:39:31.360 --> 00:39:32.240] That's Planet X.
[00:39:32.240 --> 00:39:49.200] Planet 9, on the other hand, is often used interchangeably with Planet X, of course, but it seems Planet 9 is used most often when referring to the idea that the ninth planet of our solar system could potentially be found by observing its impact on the orbits of trans-Neptunian objects.
[00:39:49.440 --> 00:39:55.120] So that's where you're going to mostly find the term Planet 9, and that makes sense, and that's fine.
[00:39:55.360 --> 00:39:58.480] All right, so this brings us to trans-Neptunian objects.
[00:39:58.480 --> 00:40:00.960] And it's not hard to predict what that term refers to.
[00:40:00.960 --> 00:40:04.240] It refers to objects beyond the orbit of Neptune.
[00:40:04.240 --> 00:40:07.760] But those objects have two primary categories.
[00:40:07.760 --> 00:40:15.200] The most distant trans-Neptunian objects exist in a region that I wasn't aware of called the scattered disk.
[00:40:15.520 --> 00:40:20.240] Now, these are really, really far away, up to 100 AUs, astronomical units.
[00:40:20.240 --> 00:40:24.800] Each AU is the distance from the Earth to the Sun, 93 million miles.
[00:40:25.040 --> 00:40:27.440] Sorry, I don't have the kilometers memorized.
[00:40:27.440 --> 00:40:34.840] The scattered disk contains small icy bodies, and they're in very eccentric orbits, really high off of the plane.
[00:40:34.840 --> 00:40:40.200] The other major area where trans-Neptunian objects exist, this is the place you really want to be.
[00:40:40.440 --> 00:40:43.480] If you ever hang out beyond Neptune, it's going to be the Kuiper Belt.
[00:40:43.480 --> 00:40:44.280] That's where you got to go.
[00:40:45.720 --> 00:40:47.000] Sure, sure, you know.
[00:40:47.800 --> 00:40:55.800] The Kuiper Belt starts at Neptune's orbit right beyond its orbit at 30 AUs and stretches out to 50 AUs or 55 AUs, I've heard as well.
[00:40:55.800 --> 00:40:57.480] So it's huge.
[00:40:57.480 --> 00:41:06.520] It's about 20 times as wide as the asteroid belt that we know between Mars, we know very well between Mars and Jupiter.
[00:41:06.520 --> 00:41:11.720] 20 times as wide and potentially 200 times as massive.
[00:41:11.720 --> 00:41:14.120] So the Kuiper Belt is gargantuan.
[00:41:14.120 --> 00:41:17.720] Kuiper Belt objects, though, they're not technically asteroids.
[00:41:17.720 --> 00:41:19.160] I wasn't quite aware of this.
[00:41:19.160 --> 00:41:29.000] As far as I can tell, it's because the word asteroid is mainly reserved for a location, not really what you're made of, but really where you exist.
[00:41:29.000 --> 00:41:34.920] So the large rocky objects between or near the orbits of Mars and Jupiter, those are asteroids.
[00:41:34.920 --> 00:41:38.440] So if you're there, if you come from there, you're an asteroid.
[00:41:38.440 --> 00:41:43.960] But Kuiper Belt objects are not referred to as asteroids.
[00:41:44.200 --> 00:41:46.520] They're just basically Kuiper Belt objects.
[00:41:46.520 --> 00:41:49.320] And they're actually, they're different also as well.
[00:41:49.320 --> 00:41:57.000] They're made up of frozen volatiles, various ices composed of methane, ammonia, and water, too, as well.
[00:41:57.000 --> 00:42:00.280] Trans-Neptunian objects are anything beyond Neptune.
[00:42:00.280 --> 00:42:07.800] And within that area, there's a huge Kuiper belt area, and there's also the more distant scattered disk object area.
[00:42:07.800 --> 00:42:13.160] So now, these objects are thought to be remnants from the solar system's formation.
[00:42:13.160 --> 00:42:14.480] So they are ancient.
[00:42:14.120 --> 00:42:18.000] And since they're so far away, they're basically unchanged.
[00:42:18.080 --> 00:42:24.160] So they would be amazing repositories of information of the early solar system because they have not been melted.
[00:42:24.160 --> 00:42:26.640] They have not been changed really in any way out there.
[00:42:26.640 --> 00:42:29.600] Now, their distant orbits, though, this was interesting.
[00:42:29.760 --> 00:43:51.080] They're in such distant orbits beyond Neptune, we think because Jupiter and Saturn got basically they got together and they imposed their gravitational will on these remnants and they forced them from out from in the early in the you know in the inner solar system closer maybe to Jupiter Saturn area perhaps but they've they've basically forced them out into the orbits that they are in now beyond Neptune the question then becomes can there be a true planet sized object out there a planet 9 or even multiple sub such objects hiding in the Kuiper belt many people think so now the evidence most often cited for this you know it's subtle it's nothing really that's overt but it's it is there and a lot of people are looking into it very closely it's this subtle clustering of orbits of some of these Kuiper belt objects to the scientists to the astronomers the orbits just don't seem to be as randomly distributed as you would expect them to be and one one next explanation they contend could be a very distant unseen planet some say it could have as much as as many as five earth masses a super earth out there waiting to be found that's five, I mean, that's would be i don't i mean, I don't believe that, but i think there could be something out there, and i hope i hope there is.
[00:43:51.080 --> 00:43:52.200] That would be amazing.
[00:43:52.200 --> 00:43:56.120] Now, of course they have they have searched and searched for Planet 9.
[00:43:56.360 --> 00:43:58.040] Nothing has has been found.
[00:43:58.040 --> 00:44:00.920] And this is where the paper comes in, this latest paper.
[00:44:00.920 --> 00:44:24.760] So the authors contend that using 200 small telescopes, like something like 20, 30 centimeters, I mean, pretty small, separated by five kilometers and lined up in an array that stretches end to end a thousand kilometers wide, that such an array of 200 telescopes could tell us these critical details about Planet 9, if it even exists.
[00:44:24.760 --> 00:44:31.080] So the key technique that they describe in detail in their paper is called occultation.
[00:44:31.080 --> 00:44:35.320] Occultation appears more capable and fascinating than I would have thought.
[00:44:35.320 --> 00:44:36.920] So here's how this works.
[00:44:36.920 --> 00:44:41.160] So imagine you're observing an asteroid or a trans-Neptunian object.
[00:44:41.160 --> 00:44:49.640] So you're observing it, and you precisely time to the nanosecond or so when it blocks a distant star, right?
[00:44:49.640 --> 00:44:59.560] It's moving in its orbit and it moves in front of a distant star that's in our galaxy somewhere, say, whatever, 10, 20, 30, 40 light years away, whatever it is.
[00:44:59.560 --> 00:45:06.280] So you time to the nanosecond when it's blocked, and then also to the nanosecond when the star reappears.
[00:45:06.280 --> 00:45:08.680] And we could do that very, very precisely.
[00:45:08.680 --> 00:45:17.000] Now, if you do that, not only with one telescope, but 200 of these telescopes, each one having their own slightly different angle, right?
[00:45:17.000 --> 00:45:22.600] Each one has its own specific angle onto that occultation event.
[00:45:22.600 --> 00:45:26.280] And so they'll have their own view, their own timings.
[00:45:26.280 --> 00:45:40.040] So if you take all these 200 timings and put them together, you combine all that data, what you get is you get an extremely precise understanding of the asteroid's orbit, where it is, and when, very, very precisely.
[00:45:40.040 --> 00:45:41.240] It gets even better than that.
[00:45:41.240 --> 00:45:50.480] The more of these occultations that you observe, the more accurate your timings and your positional data become, more so than any other method that's used alone.
[00:45:50.720 --> 00:45:59.920] So, then ultimately, then the idea here is that once you have these hyper-accurate orbits mapped out, we can then detect very subtle gravitational anomalies, right?
[00:45:59.920 --> 00:46:14.960] If we know down to down to the third, fourth, fifth, sixth decimal point when this star should be blocked by an asteroid or a trans-Neptunian object, if we know where that, and it doesn't happen, then you've then there's some, you had you have an anomaly.
[00:46:14.960 --> 00:46:20.320] You have a gravitational orbital anomaly, and that's something that can be investigated.
[00:46:20.320 --> 00:46:34.320] So, we may discover, for example, through these anomalies that various asteroids are moving, or trans-Neptunian objects, are moving in a way that points to an unknown large gravitational source in a very specific orbital location.
[00:46:34.320 --> 00:46:35.920] In other words, planet 9.
[00:46:35.920 --> 00:46:42.880] So, that's the hope that this hyper-accurate information can actually say there's got to be something over here.
[00:46:42.880 --> 00:46:50.160] Multiple asteroids, multiple trans-Neptunian objects are telling us that there's some mass in this specific area.
[00:46:50.640 --> 00:46:56.240] It seems like it's got, say, two Earth masses, and it's in this orbit this distance from the sun.
[00:46:56.240 --> 00:46:58.640] It could potentially be that specific.
[00:46:58.640 --> 00:47:08.320] So, all we would then have to do is just zoom in on that specific area, and we'd have a relatively, you know, very small parcel of space to investigate, and we could potentially find it.
[00:47:08.320 --> 00:47:09.600] Best case scenario.
[00:47:09.600 --> 00:47:12.320] That's how Obi-Wan Kenobi discovered Carino.
[00:47:12.320 --> 00:47:12.960] Yeah, exactly.
[00:47:12.960 --> 00:47:13.360] Red Law.
[00:47:13.520 --> 00:47:16.000] Oh, that's right, because something was missing.
[00:47:16.000 --> 00:47:18.160] Yeah, there was a gravitational source that was missing.
[00:47:18.160 --> 00:47:21.520] But there was no, right, but there was no body assigned to it.
[00:47:21.520 --> 00:47:24.640] It was a dead spot in space, but there had to have been something there.
[00:47:24.640 --> 00:47:25.040] Yeah.
[00:47:25.040 --> 00:47:25.760] All right.
[00:47:26.080 --> 00:47:27.200] So even if.
[00:47:27.440 --> 00:47:28.960] Bob, we're talking science here.
[00:47:29.280 --> 00:47:29.800] Yes.
[00:47:29.520 --> 00:47:30.680] Science.
[00:47:31.000 --> 00:47:38.040] So even if, even if Planet 9, though, is a bust, a server like this could be incredibly informative about our outer solar system, right?
[00:47:38.040 --> 00:47:41.080] There's still so much to learn even without Planet 9.
[00:47:41.320 --> 00:47:54.600] They believe, the researchers believe that a 10-year survey could find 1,800 new trans-Neptunian objects and reveal details about their properties, their orbital dynamics, their surfaces, so many different things.
[00:47:55.160 --> 00:47:59.880] It could refine also our understanding of the boundary of our solar system and how it evolved.
[00:47:59.880 --> 00:48:04.440] And Bob, a lot of the objects that it discovers could be dwarf planets, even if they're not full planets.
[00:48:04.440 --> 00:48:04.760] Oh, yeah.
[00:48:04.760 --> 00:48:15.240] Well, yeah, well, I didn't say, but yeah, if it's not clear, Kuiper Belt, dwarf planets basically are all in the Kuiper Belt, but the one exception, I think, is Cirrus in the main asteroid belt.
[00:48:15.720 --> 00:48:17.400] Because pretty much all the other ones are in.
[00:48:17.480 --> 00:48:20.120] So, yeah, that's where we find dwarf planets.
[00:48:20.360 --> 00:48:34.520] Favorite long shot, though, is the outside chance that it could reveal information about primordial black holes, black holes that existed from the early, early small ones that existed early in the universe, soon after the Big Bang.
[00:48:34.520 --> 00:48:42.760] So if a primordial back hole was to pass in front of a star, we could detect that by the microlensing event that would happen to the stars, right?
[00:48:42.760 --> 00:48:46.840] It would just bend the light, and we could say, oh, crap, there's a super dense mass right there.
[00:48:46.840 --> 00:48:48.920] It could be a primordial black hole.
[00:48:48.920 --> 00:48:51.400] Long shot, I know, but that would be cool.
[00:48:51.400 --> 00:48:56.760] But it would be truly amazing finding a true planet in the Kuiper Belt.
[00:48:57.000 --> 00:48:57.960] How epic would that be?
[00:48:57.960 --> 00:49:01.800] That would be the astronomical discovery of the millennia, really.
[00:49:01.800 --> 00:49:06.440] I mean, another planet, potentially, you know, multiple Earth masses.
[00:49:06.680 --> 00:49:10.600] You know, the number would go, you know, back from eight back up to nine.
[00:49:10.600 --> 00:49:19.680] I think it would make Pluto feel a lot better, you know, since one of its Kuiper Belt buddies was recognized as a true planet, even if even if Pluto could never reattain that.
[00:49:20.000 --> 00:49:26.000] And at this project is estimated to cost only 15 million USD.
[00:49:26.960 --> 00:49:30.400] That really, that's smaller than I would have anticipated.
[00:49:30.880 --> 00:49:32.000] That really is, I mean, sure.
[00:49:32.000 --> 00:49:32.880] I mean, I'd like to have that much.
[00:49:32.960 --> 00:49:33.680] It's a grounding error.
[00:49:33.680 --> 00:49:34.080] It's pretty small.
[00:49:34.240 --> 00:49:34.640] Yeah, that's just a trendy trend.
[00:49:34.800 --> 00:49:35.760] Yeah, it really is.
[00:49:35.760 --> 00:49:36.720] It's so tiny.
[00:49:36.720 --> 00:49:38.400] It sounds like an amazing deal.
[00:49:38.640 --> 00:49:40.560] It seems like a no-brainer deal to me.
[00:49:40.560 --> 00:49:48.640] Of course, once this is truly vetted by other scientists and astronomers and make sure their numbers look good, I mean, this sounds like a great idea.
[00:49:48.640 --> 00:49:49.520] I hope they do it.
[00:49:49.520 --> 00:49:49.840] All right.
[00:49:49.840 --> 00:49:50.720] Thanks, Bob.
[00:49:50.720 --> 00:49:51.360] Sure.
[00:49:51.360 --> 00:49:54.720] Evan, tell us about stress and paranormal belief.
[00:49:54.960 --> 00:49:56.640] It stresses me out.
[00:49:57.920 --> 00:49:58.960] Does it, though?
[00:49:59.280 --> 00:50:00.640] Mel, it depends.
[00:50:00.640 --> 00:50:07.440] Did you know there's something called the Revised Paranormal Belief Scale, the RPBS?
[00:50:07.760 --> 00:50:11.920] This is a tool I had not heard of before.
[00:50:12.560 --> 00:50:13.680] And shame on me.
[00:50:13.680 --> 00:50:16.880] I probably should have read about this before at some point.
[00:50:17.200 --> 00:50:23.440] It is a 26-item survey that measures a person's belief in paranormal phenomena.
[00:50:23.440 --> 00:50:35.360] It's a widely used tool in parapsychological research and was developed by Jerome Tabasic and published in the International Journal of Transpersonal Studies in 2004.
[00:50:35.360 --> 00:50:40.480] So this has been around a while, and I believe there are even references to this prior to that.
[00:50:40.480 --> 00:50:43.760] So yeah, for many decades, this tool has been there.
[00:50:43.760 --> 00:50:50.320] Basically, you have the participants respond on a Likert scale.
[00:50:50.320 --> 00:50:54.080] One, strongly disagree, to seven, strongly agree.
[00:50:54.080 --> 00:50:58.720] And they ask you questions about: well, how do you feel about witchcraft?
[00:50:58.720 --> 00:51:04.040] How do you feel about superstition or spiritualism or extraordinary life forms?
[00:51:04.200 --> 00:51:06.840] And down the list it goes.
[00:51:06.840 --> 00:51:11.640] So much prior research has relied on this, using this tool.
[00:51:11.640 --> 00:51:19.080] And the results have suggested that paranormal belief in general is not linked to poorer psychological well-being.
[00:51:19.080 --> 00:51:26.680] However, certain facets of paranormal belief, such as superstition, could be linked with stress vulnerability.
[00:51:27.000 --> 00:51:28.840] You're saying that's the claim of these authors.
[00:51:29.800 --> 00:51:32.440] Well, no, this is, again, this is the prior research.
[00:51:32.440 --> 00:51:34.120] I haven't even gotten to the current study.
[00:51:35.160 --> 00:51:35.960] That's what I've read as well.
[00:51:35.960 --> 00:51:43.640] It's like superstition is the one that gets triggered by feeling lack of control, feeling under stress, et cetera, depression.
[00:51:43.960 --> 00:51:53.720] Until along came a revised version of this tool called the Rash Purified Revised Paranormal Belief Scale, R-P-P-B-S for short.
[00:51:53.720 --> 00:52:17.960] Yes, another tool to measure people's beliefs in paranormal phenomena, but it's considered a better statistical method that they use with improved validity and reliability compared to the original one, enhanced ability to compare results across different populations, and a more robust measurement of paranormal belief as a unidimensional construct.
[00:52:17.960 --> 00:52:28.120] I also wanted to look up, I said, you know, because I'm unfamiliar with these tools, I did a little more research into it to figure out, is this legitimate?
[00:52:28.120 --> 00:52:30.120] Is this considered scientific?
[00:52:30.120 --> 00:52:37.000] Or is this kind of just, you know, something that experimental researchers are kind of throwing together on their own?
[00:52:37.000 --> 00:52:38.200] But no, they say it is.
[00:52:38.200 --> 00:52:41.080] They say this is legitimate.
[00:52:41.080 --> 00:52:43.560] It's part of psychological research.
[00:52:43.560 --> 00:52:48.480] It could also be used for a parapsychological bogus research, but it is part of just psychological research.
[00:52:44.920 --> 00:52:51.360] It's considered accepted as a reliable tool.
[00:52:51.600 --> 00:52:56.640] Yeah, like psychologists studying conspiracy theories doesn't mean that they believe in the conspiracy theories, they're studying them.
[00:52:56.640 --> 00:52:57.280] It's the same thing.
[00:52:57.280 --> 00:52:58.960] They're studying paranormal beliefs.
[00:52:58.960 --> 00:53:04.160] And that means, yeah, so that gets us now to the news item this week, which we can now better understand.
[00:53:04.160 --> 00:53:16.160] There was a new study that was published in PLUS1, P-L-O-S One, titled Re-evaluation of the Relationship Between Paranormal Belief and Perceived Stress Using Statistical Modeling.
[00:53:16.160 --> 00:53:21.600] The authors are Kenneth Drinkwater, Andrew Dinovian, and Neil Dagnall.
[00:53:21.600 --> 00:53:43.360] Drinkwater and his colleagues had 3,084 people complete the RASH model survey, which is the more refined survey, alongside a questionnaire evaluating different facets of perceived stress called the perceived stress scale, of course, to help deepen their understanding of potential links between paranormal belief and stress.
[00:53:43.360 --> 00:54:00.640] Here are some quotes of what the researchers found: finding support that the notion that traditional paranormal belief is associated with external control, specifically the notion that unknown supernatural forces and powers influence existence.
[00:54:00.640 --> 00:54:08.480] Feelings of distress and reduced ability to cope with stress were associated with these traditional paranormal beliefs.
[00:54:08.480 --> 00:54:14.480] So superstition is one of those considered traditional paranormal beliefs, right?
[00:54:14.480 --> 00:54:22.720] Sort of belief in witchcraft, ghosts, these sorts of things, external forces that you don't have control over.
[00:54:22.720 --> 00:54:32.920] And on the other side of this coin are the New Age philosophy, sort of paranormal beliefs, dealing with psy, spiritualism, precognition.
[00:54:33.240 --> 00:54:41.800] They could not find it to be statistically linked to any tendencies regarding distress or coping for these New Age philosophy.
[00:54:41.800 --> 00:54:44.280] And these, this is what was expected.
[00:54:44.440 --> 00:54:49.560] in their findings, in line with the idea that traditional paranormal belief reflects anxiety.
[00:54:49.560 --> 00:54:53.880] And again, it's about that lack of control over those external forces.
[00:54:53.880 --> 00:55:00.600] They do admit the study was exploratory and does not support any cause-effect relationship.
[00:55:00.920 --> 00:55:01.880] Why is it important?
[00:55:01.880 --> 00:55:06.040] So why did the authors, you know, why are they bothering with this?
[00:55:06.040 --> 00:55:12.120] And I thought they summed it up nicely in the abstract of the paper, which I will read to you now, this part of it.
[00:55:12.120 --> 00:55:22.600] Research into paranormal belief is important because supernatural credence persists within contemporary soc
Prompt 2: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 3: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
Prompt 5: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 2 of 3 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
eived stress scale, of course, to help deepen their understanding of potential links between paranormal belief and stress.
[00:53:43.360 --> 00:54:00.640] Here are some quotes of what the researchers found: finding support that the notion that traditional paranormal belief is associated with external control, specifically the notion that unknown supernatural forces and powers influence existence.
[00:54:00.640 --> 00:54:08.480] Feelings of distress and reduced ability to cope with stress were associated with these traditional paranormal beliefs.
[00:54:08.480 --> 00:54:14.480] So superstition is one of those considered traditional paranormal beliefs, right?
[00:54:14.480 --> 00:54:22.720] Sort of belief in witchcraft, ghosts, these sorts of things, external forces that you don't have control over.
[00:54:22.720 --> 00:54:32.920] And on the other side of this coin are the New Age philosophy, sort of paranormal beliefs, dealing with psy, spiritualism, precognition.
[00:54:33.240 --> 00:54:41.800] They could not find it to be statistically linked to any tendencies regarding distress or coping for these New Age philosophy.
[00:54:41.800 --> 00:54:44.280] And these, this is what was expected.
[00:54:44.440 --> 00:54:49.560] in their findings, in line with the idea that traditional paranormal belief reflects anxiety.
[00:54:49.560 --> 00:54:53.880] And again, it's about that lack of control over those external forces.
[00:54:53.880 --> 00:55:00.600] They do admit the study was exploratory and does not support any cause-effect relationship.
[00:55:00.920 --> 00:55:01.880] Why is it important?
[00:55:01.880 --> 00:55:06.040] So why did the authors, you know, why are they bothering with this?
[00:55:06.040 --> 00:55:12.120] And I thought they summed it up nicely in the abstract of the paper, which I will read to you now, this part of it.
[00:55:12.120 --> 00:55:22.600] Research into paranormal belief is important because supernatural credence persists within contemporary society and potentially influences everyday attitudes and behavior.
[00:55:22.600 --> 00:55:31.960] For instance, investigators report that paranormal belief is associated with lower levels of trust in science and higher anti-science attitudes.
[00:55:31.960 --> 00:55:40.520] These are notions not based upon reasoned or reliable evidence which conflict with prevailing conceptions of the world.
[00:55:40.520 --> 00:55:49.320] Specific examples allied to belief in the paranormal are endorsement of alternative medicine, anti-vaccination, and conspiracies.
[00:55:49.320 --> 00:56:01.080] Evidence suggests that paranormal belief is a form of non-clinical delusion arising from an over-reliance on emotional content and the failure to rigorously evaluate the validity of information.
[00:56:01.080 --> 00:56:12.520] And what is it we talk about on this show for the past 20 years and the 10 years prior to that that we've been a skeptical organization and all the shoulders of the giants that we've stood upon that came even before that.
[00:56:12.520 --> 00:56:14.600] It all boils down to this.
[00:56:14.960 --> 00:56:15.280] Right?
[00:56:15.280 --> 00:56:22.720] But I do think there's multiple moving parts here, you know, having followed this literature somewhat over the years, you know, as a skeptic.
[00:56:22.720 --> 00:56:31.040] There's also other studies which show that there's a correlation with intuitive thinking style versus an analytical thinking style, which makes perfect sense.
[00:56:31.440 --> 00:56:47.280] And there's also this question of, which I don't see as much in the literature, the fantasy-prone personality type, which I think is just an extreme version of this tendency to believe in the paranormal or maybe intuitive thinking.
[00:56:47.520 --> 00:56:49.840] But also, this is very context-dependent.
[00:56:49.840 --> 00:56:50.400] You know what I mean?
[00:56:50.400 --> 00:56:56.560] It's like what culture did you grow up in, and what is or how culturally acceptable are the beliefs that you're talking about?
[00:56:56.880 --> 00:57:12.080] And that may be where the real divide here is between traditional paranormal beliefs and New Age paranormal beliefs, in that the New Age one seems to be more of a, like you get into that subculture and that worldview.
[00:57:12.080 --> 00:57:24.960] And I think it tends to attract people who have, again, the sort of a fantasy-prone or intuitive thinking style, whereas like the more traditional ones may come about because of stress or whatever.
[00:57:24.960 --> 00:57:26.480] It makes sense that they would not be.
[00:57:26.480 --> 00:57:27.760] It's not all one phenomenon.
[00:57:27.760 --> 00:57:28.640] It's not monolithic.
[00:57:28.640 --> 00:57:40.080] I think there's also that conspiracy beliefs are its own phenomenon, you know, like the tendency to believe in conspiracies, while there's a ton of overlap, that is an entity unto itself.
[00:57:40.080 --> 00:57:40.880] Yeah, it's interesting.
[00:57:41.760 --> 00:57:50.040] I would imagine there would be, and of course more research is needed, correlations of stress and conspiracy theory.
[00:57:50.920 --> 00:57:52.080] Conspiracy thinking.
[00:57:53.040 --> 00:57:59.360] Yeah, maybe, but again, conspiracy thinking comes in two flavors: opportunistic conspiracy thinking and dedicated conspiracy thinking.
[00:57:59.360 --> 00:58:06.440] People who are conspiracy theorists, they believe in all conspiracies, and people who only believe in ones that support their worldview.
[00:58:06.920 --> 00:58:10.840] Opportunistically, those comforting ones won't cause the stress.
[00:58:11.080 --> 00:58:11.640] Right, probably.
[00:58:12.200 --> 00:58:14.920] Would not, probably not cause stress for them.
[00:58:15.240 --> 00:58:16.840] All right, it's an interesting area.
[00:58:16.840 --> 00:58:21.560] You know, again, I tend to follow this because it's pretty much in our sweet spot of what we do.
[00:58:21.560 --> 00:58:25.480] And it's fascinating to look at this as a psychological research project.
[00:58:25.800 --> 00:58:30.520] Well, everyone, we're going to take a quick break from our show to talk about our sponsor this week, Aura Frames.
[00:58:30.520 --> 00:58:31.560] Yeah, Aura Frames.
[00:58:31.560 --> 00:58:39.080] These are digital picture frames that allow users to display photos and videos directly from their smartphones or other devices.
[00:58:39.080 --> 00:58:45.240] These frames are Wi-Fi enabled, facilitating seamless photo sharing and display.
[00:58:45.240 --> 00:58:52.840] There's no memory cards or USBs required, and there's a reason Wirecutter named it the number one best digital photo frame.
[00:58:53.160 --> 00:58:58.360] I got my father-in-law, I got my mother-in-law, I got a couple of other relatives that we bought these frames for.
[00:58:58.360 --> 00:59:01.560] They're super easy to use, and they report back to me.
[00:59:01.560 --> 00:59:04.440] I have one too, but I'm just telling you what the people in my life thought.
[00:59:04.440 --> 00:59:05.720] They absolutely love it.
[00:59:05.720 --> 00:59:07.240] It's a fantastic present.
[00:59:07.240 --> 00:59:15.720] For a limited time, visit auraframes.com and get $45 off Aura's best-selling Carver Matte frames by using promo code skeptics at checkout.
[00:59:15.720 --> 00:59:20.040] That's A-U-R-A-Frames.com, promo code Skeptics.
[00:59:20.040 --> 00:59:24.920] This exclusive Black Friday Cyber Monday deal is their best of the year, so don't miss out.
[00:59:24.920 --> 00:59:26.360] Terms and conditions apply.
[00:59:26.360 --> 00:59:28.440] All right, guys, let's get back to the show.
[00:59:28.440 --> 00:59:30.760] All right, Jay, it's who's that noisy time?
[00:59:30.760 --> 00:59:33.720] All right, guys, last week I played this noisy.
[00:59:52.560 --> 00:59:54.000] In its entirety.
[00:59:54.000 --> 00:59:55.760] So, guys, have any guesses?
[00:59:55.760 --> 00:59:56.400] I have a guess.
[00:59:56.400 --> 00:59:59.760] That was Jay's first attempt at playing the Digi Dude.
[01:00:00.400 --> 01:00:02.800] It sounds like a super old recording.
[01:00:03.200 --> 01:00:04.160] That's a nice guess.
[01:00:04.160 --> 01:00:04.800] Anybody else?
[01:00:04.800 --> 01:00:06.880] It just sounded like an insect to me.
[01:00:06.880 --> 01:00:08.400] It has an insect-like quality.
[01:00:08.400 --> 01:00:08.720] Absolutely.
[01:00:08.880 --> 01:00:10.320] I thought I heard help me in there.
[01:00:10.800 --> 01:00:15.680] There was a listener, many that wrote in, one of them named Benjamin Davolt.
[01:00:15.920 --> 01:00:18.480] Ben here, the Frenchie from Japan.
[01:00:18.720 --> 01:00:20.000] I think this is a drone.
[01:00:20.000 --> 01:00:27.200] Quadcopter equipped with ultra-low noise propellers, probably the asymmetrical type with a counterweight on the side opposed to the blade.
[01:00:27.440 --> 01:00:30.400] That is, oh, and he says my name is pronounced Davu.
[01:00:30.400 --> 01:00:32.240] So Benjamin Davu.
[01:00:32.240 --> 01:00:35.600] So that's an interesting and very specific guess.
[01:00:35.600 --> 01:00:43.040] And, you know, quadcopters do make that kind of, you know, that, what would you call it, a buzzing sound, like almost like a droning noise?
[01:00:43.200 --> 01:00:44.240] A droning noise.
[01:00:44.560 --> 01:00:45.120] Yeah.
[01:00:45.440 --> 01:00:47.680] That is not correct, though, and we will continue on.
[01:00:47.680 --> 01:00:51.280] So a listener named Shane Hillier wrote in and said it's a murder hornet.
[01:00:51.280 --> 01:00:55.200] So yeah, Kara, somebody else agreed with you about the insect-like noise.
[01:00:55.600 --> 01:00:56.560] Murder hornets.
[01:00:56.640 --> 01:00:58.480] Listener named Stephen Walker wrote in.
[01:00:58.480 --> 01:01:05.360] He said, hi, our guess is a bee, a honeybee, doing its waggle dance to tell its buddies which direction to find the good stuff.
[01:01:05.360 --> 01:01:08.320] So yeah, there was a murder hornet and a bee.
[01:01:08.320 --> 01:01:10.800] So definitely people are hearing that kind of sound.
[01:01:10.800 --> 01:01:18.160] And I always find it fascinating that bees, you know, they talk with pheromones, which is basically, you know, smells.
[01:01:18.160 --> 01:01:20.800] It's pretty freaking cool that they can communicate with that.
[01:01:20.800 --> 01:01:23.040] So now I'm going to click right over to the winner.
[01:01:23.040 --> 01:01:26.720] We had multiple winners, but I'll play, I'll read one of them here.
[01:01:26.720 --> 01:01:34.520] So Frederick Neant was the first person to guess right, and he guessed this as the first known recording of a human voice.
[01:01:34.520 --> 01:01:34.920] Oh, wow.
[01:01:35.160 --> 01:01:38.440] I think at some point I've played this previously.
[01:01:38.760 --> 01:01:45.480] I had a different recording, and now this is like an update because something pretty remarkable happened where they were able to.
[01:01:47.080 --> 01:01:50.520] So let me give you another listener's answer.
[01:01:50.840 --> 01:01:52.760] This is Joshua Banta's answer.
[01:01:52.840 --> 01:01:55.320] Very, very nice description here.
[01:01:55.320 --> 01:02:03.480] He said it's an actual recording from 1860 of Eduard Leone Scott de Martinville singing a Claire de Lune.
[01:02:03.720 --> 01:02:08.360] This is something, it's played on something called a phonodogram.
[01:02:08.360 --> 01:02:09.880] But let me give you some more specifics here.
[01:02:09.880 --> 01:02:12.200] So James, he said James Buchanan was the U.S.
[01:02:12.200 --> 01:02:15.160] president at the time, pre-Civil War, pre-Abraham Lincoln.
[01:02:15.160 --> 01:02:30.040] And he said the Martinville invented a device called the phonodograph that collects sound by using a horn connected to a diaphragm, which caused a rigid bristle to vibrate and inscribe a visual representation on a hand-craked cylinder.
[01:02:30.040 --> 01:02:32.760] But this was never intended for playback, by the way.
[01:02:33.080 --> 01:02:36.920] It only produced visual images to show you what sound looked like.
[01:02:36.920 --> 01:02:38.760] It's just squiggles on paper.
[01:02:38.760 --> 01:02:42.920] But there was absolutely no capacity for there to be any playback.
[01:02:42.920 --> 01:02:52.520] Now we click forward to 2008 and the recording was transformed into a playable digital audio file by scientists at the Lawrence Berkeley National Laboratory in Berkeley, California.
[01:02:52.520 --> 01:02:53.640] And it worked really well.
[01:02:53.640 --> 01:03:00.120] So it was inferred to be played back at a 10-second speed, which caused the voice to sound like a woman or even a child.
[01:03:00.120 --> 01:03:03.000] But later, the scientists realized that this was the wrong speed.
[01:03:03.000 --> 01:03:08.440] And when they played it back at slower speeds, they found the one that they thought sounded the most correct, and it is of a man.
[01:03:08.440 --> 01:03:13.160] And they think it is Martinville himself singing Claire DeLune.
[01:03:13.160 --> 01:03:14.040] So I'll play it again.
[01:03:14.040 --> 01:03:18.720] Now, keep in mind, you know, this is the lowest fidelity recording you'll probably ever hear.
[01:03:33.840 --> 01:03:35.760] So that's pretty cool, guys.
[01:03:35.760 --> 01:03:36.160] Weird.
[01:03:37.200 --> 01:03:38.720] Unexpected past.
[01:03:38.720 --> 01:03:40.560] So thank you for sending that in.
[01:03:40.560 --> 01:03:46.640] I have a new noisy for you guys this week, and this was sent in by a listener named John Karabayk.
[01:03:46.640 --> 01:03:50.160] Thank you for sending in the phonetic pronunciation of your name.
[01:03:50.160 --> 01:03:52.720] And I'm going to play the sound now.
[01:04:01.280 --> 01:04:09.280] If you guys think you know what this week's noisy is, or you heard something cool, email us at wtn at the skepticsguide.org.
[01:04:09.280 --> 01:04:10.880] Steve, it's not too late.
[01:04:11.200 --> 01:04:14.720] It's actually, it is, it can be too late, but it isn't now.
[01:04:14.720 --> 01:04:26.160] If you're hearing this, and it's and it's basically like what, the 20th or the 21st or soon thereafter in November, you could buy tickets to, we have two shows going on in Washington, D.C.
[01:04:26.240 --> 01:04:40.320] We have a private show that is a live recording of the SGU, limited audience size, and we record the show and then for an hour, all of us, including George Hobb, will do something fun and unique that has never been done before in a live audience.
[01:04:40.320 --> 01:04:43.760] So if you want to have some fun, you can join us at our private show.
[01:04:43.760 --> 01:04:46.480] Or you can also go to the extravaganza.
[01:04:46.480 --> 01:04:50.160] This is the skeptical extravaganza stage show that we have.
[01:04:50.160 --> 01:04:51.520] It's going to be in D.C.
[01:04:51.520 --> 01:04:53.600] as well, and it's going to be that night, right?
[01:04:53.600 --> 01:04:56.000] The private show will be starting at 11:30 a.m.
[01:04:56.160 --> 01:04:59.520] and the extravaganza starts at 8 p.m.
[01:04:59.600 --> 01:05:01.400] Please get there around 7.
[01:04:59.760 --> 01:05:05.560] You can go to www.theskepticsguy.org.
[01:05:05.800 --> 01:05:10.920] We have buttons on there that link to tickets, which means you can buy them and you can come see us, and we'd love to have you.
[01:05:10.920 --> 01:05:16.280] And, Jay, we should tell people all of the social media stuff that we are on as well.
[01:05:16.280 --> 01:05:21.880] So, first of all, we do a live stream most Wednesdays starting at 1 p.m.
[01:05:22.200 --> 01:05:25.160] and most Fridays starting at 5 p.m.
[01:05:25.160 --> 01:05:26.360] Eastern.
[01:05:26.360 --> 01:05:28.520] We have a Facebook page.
[01:05:28.520 --> 01:05:33.400] Two blogs are affiliated with us: Neurological and Science-Based Medicine.
[01:05:33.720 --> 01:05:45.960] And we are still on X, but we are also now on Blue Sky, and we are on Instagram, and we post TikTok videos, two or three TikTok videos every week.
[01:05:46.280 --> 01:05:49.560] My most popular TikTok videos, what do you guys think?
[01:05:49.560 --> 01:05:50.360] What's it up to?
[01:05:50.680 --> 01:05:52.360] 4.5 million.
[01:05:52.920 --> 01:05:54.280] 5.7 million.
[01:05:54.520 --> 01:05:55.400] Oh, no.
[01:05:55.400 --> 01:05:56.600] 5.7 million?
[01:05:56.920 --> 01:05:57.800] Climbing.
[01:05:57.800 --> 01:05:58.840] Damn, man.
[01:05:59.160 --> 01:06:01.160] How is it really slowing down?
[01:06:01.400 --> 01:06:02.440] No, it's still ticking along.
[01:06:02.760 --> 01:06:03.320] It's still going strong.
[01:06:03.640 --> 01:06:04.040] Still going on.
[01:06:04.120 --> 01:06:04.840] Still tickety time.
[01:06:05.240 --> 01:06:08.440] The interesting thing is, like, there is just no rhyme or reason.
[01:06:09.240 --> 01:06:10.440] It just went viral.
[01:06:10.760 --> 01:06:18.920] So, sure, we could have slipped into TikTok's algorithm or whatever, but like, we make a ton of these videos, and it's all, you know, all revolving around the same theme.
[01:06:18.920 --> 01:06:23.080] Steve watches something on TikTok, and then he'll explain why that person is wrong or whatever.
[01:06:23.080 --> 01:06:31.720] Like, you know, we just kind of go into some of the more wilder things that people are talking about and bend the skeptical eye at it.
[01:06:31.720 --> 01:06:35.720] But this one, we, you know, me, Steve, and Ian were like, what happened?
[01:06:36.440 --> 01:06:37.400] I wish we knew the formula.
[01:06:37.400 --> 01:06:38.120] It just happened.
[01:06:38.120 --> 01:06:40.440] We're happy about it, but it is what it is.
[01:06:40.440 --> 01:06:41.720] Oh, and Steve, I can't forget.
[01:06:41.720 --> 01:06:42.200] Hold on.
[01:06:42.680 --> 01:06:46.000] These events in DC are fantastic, and I do hope that you can make it.
[01:06:46.000 --> 01:06:48.080] But my God, you got to go to Natacon.
[01:06:44.760 --> 01:06:50.800] This is our socializing conference.
[01:06:50.960 --> 01:06:53.680] This is a conference where people make connections with each other.
[01:06:53.680 --> 01:06:54.960] Tons of socializing.
[01:06:54.960 --> 01:06:56.640] We have a ton of entertainment that we do.
[01:06:56.640 --> 01:07:01.600] George Robb, Brian Wecht, and Andrea Jones Roy will join all of us here at the SGU.
[01:07:01.600 --> 01:07:06.640] It's 2.2 days of a lot of fun, and we really hope that you can join us.
[01:07:06.640 --> 01:07:09.840] You can go to nataconcon.com, right?
[01:07:09.840 --> 01:07:14.080] That's natacon con, c-o-n-c-on-dot c-o-m.com.
[01:07:14.080 --> 01:07:15.760] Bob, are you understanding what I'm saying?
[01:07:16.320 --> 01:07:16.960] Pretty sad.
[01:07:17.200 --> 01:07:20.080] It's nataconcon.com.
[01:07:20.080 --> 01:07:21.360] Ian, I swear to God.
[01:07:21.360 --> 01:07:22.720] All right, so anyway, guys.
[01:07:23.040 --> 01:07:25.360] Did you say 2.2 days?
[01:07:25.360 --> 01:07:26.160] Yes.
[01:07:28.080 --> 01:07:32.160] Roughly, yeah, because it's like you don't need to explain that, Jay.
[01:07:32.800 --> 01:07:34.640] Anyway, 210 days.
[01:07:34.640 --> 01:07:35.760] You'll see, because you'll be there.
[01:07:35.760 --> 01:07:37.600] So, anyway, please try to join us for that.
[01:07:37.600 --> 01:07:38.960] It's going to be a wonderful thing.
[01:07:38.960 --> 01:07:42.240] You know, we need some happiness in the world, and their happiness will be there.
[01:07:42.240 --> 01:07:43.440] So please join us.
[01:07:43.440 --> 01:07:43.760] All right.
[01:07:43.760 --> 01:07:44.720] Thanks, Jay.
[01:07:44.720 --> 01:07:46.240] We're going to do one quick email.
[01:07:46.240 --> 01:07:53.680] This comes from Mike Hampton, who writes, On Friday's live stream, you began talking about phrases like way anchor and such.
[01:07:53.680 --> 01:07:58.160] It reminded me of a phrase that I think wins the award for the dumbest phrase.
[01:07:58.720 --> 01:08:02.080] That is, you've got your work cut out for you.
[01:08:02.080 --> 01:08:09.520] I lived most of my life thinking this phrase meant you have an easy road ahead, which is what it should mean.
[01:08:09.520 --> 01:08:20.560] Any project you do that involves cutting, whether that be carpentry, papercrafts, sewing, et cetera, at least a quarter to a third of the project is cutting the material to the sizes and patterns you need.
[01:08:20.560 --> 01:08:30.600] So, if someone has prepared the material by cutting it out for you, the project is suddenly that much easier and going to make and going to take less time.
[01:08:29.840 --> 01:08:32.760] I was shocked to learn, shocked, shocked.
[01:08:33.720 --> 01:08:40.760] It actually means the opposite, which shot that phrase to the dumbest phrase in the English language as far as I'm concerned.
[01:08:41.080 --> 01:08:46.280] So, I looked into it because I love the etymology, especially of these kind of phrases.
[01:08:47.000 --> 01:08:47.640] What's your guess?
[01:08:47.640 --> 01:08:49.560] Where does that phrase come from?
[01:08:49.560 --> 01:08:50.920] What's the origin of you?
[01:08:51.080 --> 01:08:52.760] You've got your work cut out for you.
[01:08:52.760 --> 01:08:54.840] Oh, gosh, I'd only be guessing.
[01:08:54.840 --> 01:08:56.200] I mean, yeah, who knows?
[01:08:56.200 --> 01:08:57.640] Farming, something with farming?
[01:08:57.640 --> 01:08:59.000] Yeah, that tailoring.
[01:08:59.000 --> 01:09:02.920] It comes from tailoring, which was on his list, sewing.
[01:09:02.920 --> 01:09:09.560] And usually, the way a professional tailor would work is they would have an assistant.
[01:09:09.560 --> 01:09:20.600] The assistant would cut out all the patterns, and they would, for a dress or whatever, anything that they were going to make, and they would do all of that ahead of time.
[01:09:20.600 --> 01:09:24.840] And the primary reason for that was to make sure that they had everything, right?
[01:09:24.840 --> 01:09:36.600] So, you cut out all of the pieces, you make sure that every piece is there, and then the tailor would sew them all together into the final piece, the final dress or whatever.
[01:09:36.600 --> 01:09:47.480] Now, the sewing was the that's where all you know, most of the skill, the artistry, the artistry, and the, and that was very exacting and complicated work.
[01:09:47.480 --> 01:09:50.600] Whereas the cutting was you, you know, you're cutting it out of a pattern.
[01:09:50.600 --> 01:09:56.280] Not that that, you know, wasn't a lot of work, but that wasn't the tailor's work.
[01:09:56.280 --> 01:10:06.280] So, as a tailor, if you have your work cut out for you, that means you have a lot of work ahead of you, all that intricate sewing ahead of you.
[01:10:06.280 --> 01:10:07.080] I see.
[01:10:07.080 --> 01:10:07.560] Somebody else.
[01:10:07.720 --> 01:10:12.840] But I get what the dude is saying, because if you don't have your work cut out for you, there's even more work to write.
[01:10:13.080 --> 01:10:14.040] I hear what he's saying.
[01:10:14.800 --> 01:10:22.960] But because there was built into the origin of this phrase, this division of labor, not for the tailor, you have nothing to do because your assistant hasn't cut out the patterns yet.
[01:10:23.120 --> 01:10:27.440] Right, so you get to rest right now, but once your work is cut out for you, then your work begins.
[01:10:27.440 --> 01:10:32.240] So basically, this is when your work begins, when the work is quote unquote cut out for you.
[01:10:32.560 --> 01:10:34.400] But that's not really how we use the phrase.
[01:10:34.720 --> 01:10:36.640] No, we've twisted it a bit.
[01:10:37.040 --> 01:10:39.520] Well, it now means you have a long road ahead of you.
[01:10:39.520 --> 01:10:41.920] Yeah, it's going to get rough now from here on out.
[01:10:41.920 --> 01:10:42.160] Yeah.
[01:10:42.400 --> 01:10:48.160] Yeah, but it basically means you have a lot of work or difficult work or like, yeah, like you have a job to do.
[01:10:48.160 --> 01:10:50.400] Like, this is your job and you've got to do it.
[01:10:50.640 --> 01:10:57.440] Yeah, the uses evolve over time, but it does make sense in the context of its origin.
[01:10:57.440 --> 01:11:02.480] But yeah, some phrases do end up meaning the opposite of what they originally meant.
[01:11:02.480 --> 01:11:05.680] Like, for example, blood is thicker than water.
[01:11:05.680 --> 01:11:07.680] We've talked about this on the show before.
[01:11:07.760 --> 01:11:08.400] It means the opposite.
[01:11:08.640 --> 01:11:10.960] That's because it doesn't lob off the last part of the story.
[01:11:11.040 --> 01:11:14.080] Yeah, it's the blood of the, I forget now.
[01:11:14.080 --> 01:11:18.640] It was like the blood of the Christ is thicker than the water of the womb or something.
[01:11:18.640 --> 01:11:22.160] And it means the opposite of what people use it to mean now.
[01:11:22.800 --> 01:11:31.440] It means that your dedication to your religion is stronger than your familial ties, where people now use it to mean that your familial ties are the strongest.
[01:11:31.440 --> 01:11:32.800] The blood is thicker than water.
[01:11:33.600 --> 01:11:35.200] It was flipped in its meaning.
[01:11:35.200 --> 01:11:36.400] Yeah, interesting.
[01:11:36.800 --> 01:11:37.760] But it was always fascinating.
[01:11:38.080 --> 01:11:45.440] We do that a lot where we shorten the phrase, and the phrase, I've seen a few other examples of this where there's a long phrase with a moral at the end.
[01:11:45.440 --> 01:11:49.840] But when we shorten it, we only focus on the first sentence, which is actually the opposite of the point.
[01:11:49.840 --> 01:11:50.720] Like the proof is in the pudding.
[01:11:50.800 --> 01:11:51.520] The proof is in the pudding.
[01:11:51.840 --> 01:11:54.320] I mean, the proof of the pudding is in the taste.
[01:11:54.320 --> 01:11:54.800] It's the tasting.
[01:11:55.280 --> 01:11:57.360] Or this is one of my big P's.
[01:11:57.360 --> 01:11:59.120] When people say, I could care less.
[01:11:59.440 --> 01:11:59.680] Yes.
[01:11:59.680 --> 01:11:59.960] Oh, well.
[01:12:01.160 --> 01:12:01.640] You can?
[01:12:01.640 --> 01:12:01.880] Yeah.
[01:12:02.360 --> 01:12:03.080] You care more.
[01:12:03.080 --> 01:12:04.440] So you cared more.
[01:11:59.840 --> 01:12:06.600] I couldn't care less.
[01:12:07.000 --> 01:12:09.960] I care so little, I couldn't possibly care even less.
[01:12:09.960 --> 01:12:14.440] But people, they shorten it because we tend to shorten things, but that it re-flips the meaning when you shorten it.
[01:12:14.600 --> 01:12:15.080] That's annoying.
[01:12:15.080 --> 01:12:15.800] That's annoying.
[01:12:16.760 --> 01:12:18.920] Everyone out there, just stop that one.
[01:12:18.920 --> 01:12:20.200] Say, I couldn't.
[01:12:20.200 --> 01:12:21.240] Add the int.
[01:12:21.480 --> 01:12:23.960] I couldn't care less, please.
[01:12:27.400 --> 01:12:32.840] Guys, we have a great interview coming up with Kevin Folta, so let's go on with that interview now.
[01:12:44.920 --> 01:12:47.240] Well, we are joined now by Kevin Fulta.
[01:12:47.240 --> 01:12:48.760] Kevin, welcome back to the SGU.
[01:12:48.760 --> 01:12:49.320] Yeah, thank you.
[01:12:49.320 --> 01:12:50.680] It's really nice to be here again.
[01:12:50.680 --> 01:12:57.560] Yeah, it's been a while, and it's always good to interview people in person, you know, so we could look face-to-face and have a discussion rather than over the interwebbies.
[01:12:57.640 --> 01:13:07.720] So I've been dying to talk to you about a topic, about a new technique that we talked about just very, very briefly, about that plant biologists are using to make new cultivars.
[01:13:07.720 --> 01:13:08.680] You know what I'm talking about.
[01:13:08.680 --> 01:13:10.280] Well, yeah, this was the work that was done.
[01:13:10.600 --> 01:13:12.840] We mentioned this briefly with Judd Ward and the folks.
[01:13:12.840 --> 01:13:14.760] I can't remember the name of the company now.
[01:13:15.080 --> 01:13:22.120] But the newest, it's a new technique that is involving doubling the genetic material inside of a cell.
[01:13:22.120 --> 01:13:29.640] So basically creating a not just the old polyploids, but actually creating hybrids from hybrids.
[01:13:29.640 --> 01:13:32.600] So allowing complete genetic sets being to passed down.
[01:13:32.600 --> 01:13:35.240] So you're not getting genetic mixing in each generation.
[01:13:35.240 --> 01:13:40.280] So, yeah, so if you make a hybrid, then you can have those hybrid traits breed through to subsequent generations.
[01:13:40.280 --> 01:13:40.600] That's right.
[01:13:40.600 --> 01:13:42.520] You have to breed through going forward.
[01:13:42.520 --> 01:13:45.120] Yeah, because right now you can't do that with a hybrid.
[01:13:45.120 --> 01:13:45.680] That's right.
[01:13:44.840 --> 01:13:49.200] This is something that I know we bring up when we're talking about GMOs.
[01:13:49.280 --> 01:13:52.880] And we should probably remind our audience: you are a plant biologist.
[01:13:52.880 --> 01:13:53.840] And tell us, tell Kevin.
[01:13:54.080 --> 01:13:56.080] Well, I'm a molecular biologist by training.
[01:13:56.080 --> 01:14:07.280] I ended up working in plants, and we do a lot of work in genomics, mostly around flavors and aromas, but other major plant traits that are involved in traits that are important for farmers.
[01:14:07.280 --> 01:14:08.640] You're still a strawberry guy, though?
[01:14:08.880 --> 01:14:09.440] Not so much.
[01:14:09.440 --> 01:14:10.480] I'm out of strawberries now.
[01:14:10.880 --> 01:14:11.440] You don't have strawberries now?
[01:14:11.520 --> 01:14:11.840] No, no, no.
[01:14:11.920 --> 01:14:14.400] I wanted to get new strawberries from you from the corner.
[01:14:15.200 --> 01:14:17.280] I tried those strawberries on camera once.
[01:14:17.360 --> 01:14:17.760] That was amazing.
[01:14:18.240 --> 01:14:18.720] That's right.
[01:14:18.720 --> 01:14:19.120] That's right.
[01:14:19.440 --> 01:14:19.840] We did.
[01:14:20.160 --> 01:14:22.960] And all those strawberries went in the autoclave, unfortunately.
[01:14:22.960 --> 01:14:28.000] We created a fungus-resistant strawberry, but the industry wouldn't use it, so it's gone.
[01:14:28.480 --> 01:14:32.480] So, Kevin, you selectively bred these strawberries, right?
[01:14:32.880 --> 01:14:34.320] Just correct me when I'm wrong.
[01:14:34.320 --> 01:14:35.280] Well, you're wrong.
[01:14:36.240 --> 01:14:37.120] That's why I asked you.
[01:14:37.200 --> 01:14:37.680] Go ahead.
[01:14:37.680 --> 01:14:43.840] Yeah, these were a variety that already existed that we added a gene that would prime its immune system.
[01:14:43.840 --> 01:14:47.840] So even before a pathogen came along, it was ready to confront the pathogen.
[01:14:48.400 --> 01:14:50.800] And it was awesome because they didn't get as sick.
[01:14:51.120 --> 01:14:52.560] They would recover from disease.
[01:14:52.560 --> 01:14:53.360] It was great.
[01:14:53.360 --> 01:14:57.600] And as we know, fungicides or strawberries are fungicide-dependent crops.
[01:14:57.920 --> 01:15:06.480] And so it allowed us to potentially make something that would help the industry farm with fewer fungicides, which is great for the environment, great for farmers.
[01:15:06.480 --> 01:15:10.000] But there was a lukewarm feeling in the industry about it.
[01:15:10.480 --> 01:15:11.520] Because it was GM?
[01:15:11.520 --> 01:15:12.720] Because it was genetically homicide.
[01:15:15.200 --> 01:15:16.560] Was it a synthetic insertion?
[01:15:16.560 --> 01:15:18.880] Was it something from another organism?
[01:15:18.880 --> 01:15:20.160] It was a plant gene in a plant.
[01:15:20.160 --> 01:15:22.240] So it was a plant, it was from a rabbidopsis.
[01:15:22.560 --> 01:15:22.880] Cotton?
[01:15:23.120 --> 01:15:23.680] No, that's not cotton.
[01:15:23.760 --> 01:15:28.400] No, no, no rabidopsis is the little white lab mouse, a plants.
[01:15:26.440 --> 01:15:29.400] Yes, the little lab of plants, right?
[01:15:29.720 --> 01:15:31.880] And so we put that in, and it seemed to prime.
[01:15:31.880 --> 01:15:32.520] It did great.
[01:15:32.520 --> 01:15:34.120] The strawberry did wonderfully.
[01:15:29.280 --> 01:15:35.720] But it did have a yield hit.
[01:15:35.960 --> 01:15:40.360] So, in other words, you were resistant to disease, but you had a slight dip in yield.
[01:15:40.360 --> 01:15:43.960] And between that and the genetic engineering trait, they weren't so excited about that.
[01:15:44.200 --> 01:15:49.640] That's so interesting because a slight dip in yield, yes, I could see on paper that would be frightening to a farmer.
[01:15:49.640 --> 01:15:55.080] But when all of your crop gets taken by fungus, that's a big dip in yield.
[01:15:55.080 --> 01:15:56.120] So I guess they're willing to roll it up.
[01:15:56.440 --> 01:15:57.960] Long term, it might be positive.
[01:15:58.840 --> 01:16:00.120] But wait, way more importantly.
[01:16:00.440 --> 01:16:02.040] They were delicious.
[01:16:02.040 --> 01:16:07.960] And people need to, you know, people, I think, would be very responsive to it if they knew how good they were.
[01:16:07.960 --> 01:16:10.440] Yeah, but the base, that comes from the base hybrid.
[01:16:10.440 --> 01:16:14.600] The basic strawberry was so good that, you know, you had one more gene, you couldn't taste it.
[01:16:14.920 --> 01:16:16.600] It just purely was in the management.
[01:16:16.600 --> 01:16:20.840] It was allowing farmers to use less chemistry, which is really expensive to apply.
[01:16:20.840 --> 01:16:27.960] Right, so why you might not know this answer, but why would we still get these supermarket strawberries that don't have a lot of flavor?
[01:16:27.960 --> 01:16:31.800] Like, why aren't they just starting to grow these, even without the gene editing?
[01:16:31.800 --> 01:16:32.680] But that's changing.
[01:16:32.680 --> 01:16:40.200] And it's because over the last decade, my lab spent a lot of time identifying the genes that control the traits that consumers really like.
[01:16:40.200 --> 01:16:47.240] So we identified, we interviewed consumers, having them taste strawberries, and they tasted hundreds of different kinds of strawberries.
[01:16:47.240 --> 01:16:49.720] And then we went through and did principal component analysis.
[01:16:49.720 --> 01:16:56.200] So we took a strawberry, exploded it into its chemistry, and then said which ones always line up with consumer liking.
[01:16:56.200 --> 01:16:59.640] And there was always a list of 12 that consumers really liked.
[01:16:59.640 --> 01:17:04.200] You know, when the consumers liked it, one of those 12 or multiples of those 12 was available.
[01:17:04.200 --> 01:17:09.080] So then we found the genes that underlie those 12 volatile aroma traits.
[01:17:09.080 --> 01:17:19.440] And then once we had those genes nailed down, identified markers associated with them, so DNA signatures, that would then, as they were inherited, would allow us to get all of them in one place.
[01:17:20.160 --> 01:17:22.960] So that work is still ongoing by the strawberry breeding program.
[01:17:23.200 --> 01:17:24.080] And that's conventional breeding.
[01:17:24.400 --> 01:17:25.360] That's conventional breeding.
[01:17:25.360 --> 01:17:27.280] Yeah, you're not actually turning those genes on.
[01:17:27.280 --> 01:17:30.320] You're just finding cultivars that already have them and breeding them together.
[01:17:30.320 --> 01:17:30.640] That's right.
[01:17:30.960 --> 01:17:33.680] And the cool part is, this is all done in a seedling.
[01:17:33.680 --> 01:17:40.720] In the old days, we used to have to put out 10 acres of strawberries and taste them and run them through gas chromatography to find the ones that had it.
[01:17:40.720 --> 01:17:48.960] Now you take 384 seedlings in a petri dish, basically, do the assay, and then throw away the ones that don't have the markers.
[01:17:48.960 --> 01:17:51.520] So this work is still ongoing at the University of Florida.
[01:17:51.600 --> 01:17:53.840] And I'm a little bit separate from it these days.
[01:17:54.640 --> 01:17:55.600] Can I ask a quick question?
[01:17:55.600 --> 01:18:01.600] Like with this example, now that we have that kind of in our minds of how that works, could you just like CRISPR those things on?
[01:18:01.920 --> 01:18:03.600] Now that's more possible, yes.
[01:18:03.600 --> 01:18:05.040] And would that then be considered?
[01:18:05.040 --> 01:18:10.240] Obviously, it is genetic engineering, but would it be considered a genetically modified organism?
[01:18:10.240 --> 01:18:14.880] That would be only if we left around the hardware that did the edit.
[01:18:14.880 --> 01:18:23.040] So if you had Cas9, if you had the enzyme that does the little scissors trick, if that enzyme was engineered in to do the work, then it would be one thing.
[01:18:23.040 --> 01:18:31.760] But there's a lot of plants where you can engineer a single cell with CRISPR and do it in a single cell with the protein itself rather than have to install the genes.
[01:18:31.760 --> 01:18:35.920] So in other words, you just put in the hardware rather than have the cell make the hardware.
[01:18:35.920 --> 01:18:36.720] Oh my goodness.
[01:18:36.720 --> 01:18:38.800] This is so annoying that you have to worry about.
[01:18:38.800 --> 01:18:39.760] Oh yeah, totally.
[01:18:39.760 --> 01:18:46.160] But then you have that one cell and sometimes you can get that into two cells, five cells, ten cells, whatever, and then eventually having a whole plant from that one cell.
[01:18:46.160 --> 01:18:50.640] That one's foreign DNA-free, yet it contains the edit you're looking for.
[01:18:50.960 --> 01:18:53.120] That one is not a regulated article by the U.S.
[01:18:53.680 --> 01:18:57.280] genetically engineered, but not genetically modified in the U.S., is that correct?
[01:18:57.280 --> 01:19:00.920] Well, it's not a, as the USDA says, it's not a regulated article.
[01:18:59.920 --> 01:19:02.360] Yeah, it's such a bottom line.
[01:19:02.600 --> 01:19:11.560] Yeah, it just shows, it just goes to show that the fundamental choices that are made in legislation around this are so divorced from the science.
[01:19:11.560 --> 01:19:12.040] Yeah.
[01:19:12.280 --> 01:19:20.920] I mean, people, you know, have this perception that if they eat a genetically modified organism, that's going to do something to them, to their DNA, to their DNA and everything.
[01:19:20.920 --> 01:19:22.280] And again, it's misinformation.
[01:19:22.280 --> 01:19:23.240] It's disinformation.
[01:19:23.240 --> 01:19:24.200] It's big industry.
[01:19:24.680 --> 01:19:26.520] There's motivations behind all this stuff.
[01:19:26.680 --> 01:19:28.200] I want to get back to the hybrids.
[01:19:28.680 --> 01:19:39.240] So just as we were saying, that when you hybrid two plants together, the daughter plants have a certain mix of genes that, whatever, is a good crop.
[01:19:39.240 --> 01:19:45.320] But if you take those seeds and breed them together, you end up with a mix of genes that might not be what you want.
[01:19:45.320 --> 01:19:50.600] So, I know, before GMOs came out, like 95, 98% of crop seeds were hybrids.
[01:19:50.920 --> 01:19:51.640] Oh, cellar.
[01:19:51.960 --> 01:19:52.520] Cellar.
[01:19:52.520 --> 01:19:53.240] Yeah, it's still on.
[01:19:54.360 --> 01:19:58.360] You can't save your seeds and replant them, which is like an annoying thing about the whole anti-GMO thing.
[01:19:58.440 --> 01:19:59.240] You can't save seeds.
[01:19:59.400 --> 01:19:59.960] You never could.
[01:19:59.960 --> 01:20:01.640] You could never do that with the hybrids.
[01:20:01.640 --> 01:20:08.600] But now what you're saying is you can make a cultivar where the hybrid traits do go through to subsequent generations because...
[01:20:08.600 --> 01:20:12.600] So is that like in the seeds now, or is that you still have to do it every generation?
[01:20:12.920 --> 01:20:18.280] It is a, once you have the first plant, there's a process that you basically break meiosis, right?
[01:20:18.440 --> 01:20:25.240] So the segregation of alleles during, or of genetic complements during segregation of gametes.
[01:20:25.400 --> 01:20:26.760] I know I got to straighten that out.
[01:20:27.000 --> 01:20:27.880] He's talking about sex, yeah.
[01:20:28.200 --> 01:20:34.520] Basically, when you're making the pollen and the egg cells, you make sure that that doesn't reduce its gametes.
[01:20:34.520 --> 01:20:36.680] It doesn't go down with half the genetic complements.
[01:20:36.840 --> 01:20:37.240] Oh, interesting.
[01:20:37.480 --> 01:20:39.720] It passes the whole thing in one of the two.
[01:20:39.720 --> 01:20:41.720] So you're turning meiosis into mitosis.
[01:20:41.720 --> 01:20:43.880] You're basically turning meiosis into mitosis.
[01:20:43.880 --> 01:20:44.360] Interesting.
[01:20:44.360 --> 01:20:46.960] And that's passing down its genetic material.
[01:20:46.960 --> 01:20:47.200] Right.
[01:20:44.760 --> 01:20:50.720] So instead of reduction division, you're just dividing the cells.
[01:20:44.840 --> 01:20:51.120] Yeah.
[01:20:51.680 --> 01:20:55.760] And one of them gets the entire complement of the hybrid, the other one gets nothing.
[01:20:56.080 --> 01:20:58.160] And so, so, quick question, just to interject.
[01:20:58.400 --> 01:21:06.720] When a hybrid, in the before times, like you were talking about, which is still the now times, when a hybrid would breed with a hybrid, you might, it's like Punnett Squares, right?
[01:21:06.720 --> 01:21:09.680] Then you might get traits that were undesirable.
[01:21:09.680 --> 01:21:12.000] But are they always even viable?
[01:21:12.000 --> 01:21:15.120] Like, because I'm thinking about animal biology, which is where I come from.
[01:21:15.120 --> 01:21:17.600] And sometimes hybrids can't breed.
[01:21:17.600 --> 01:21:18.240] Right, that's true.
[01:21:18.240 --> 01:21:19.440] Does that happen in plants as well?
[01:21:19.440 --> 01:21:22.960] Yeah, there are examples where plants produce infertile offspring.
[01:21:22.960 --> 01:21:23.440] Interesting.
[01:21:23.440 --> 01:21:25.360] But that's usually a function of polyploidy.
[01:21:25.440 --> 01:21:37.680] So if you can, so like a seedless watermelon is a combination of one that has four times the genetic material bred against one that has the normal complement, and it comes out with 3x, and so that's where you get seedlessness.
[01:21:37.680 --> 01:21:39.200] Yeah, that's deliberate to make it seedless.
[01:21:39.200 --> 01:21:39.920] That's deliberate, yeah.
[01:21:39.920 --> 01:21:43.120] So you have to have the right female flowers in the field and the right male flowers.
[01:21:43.120 --> 01:21:43.600] Yeah, yeah.
[01:21:43.600 --> 01:21:44.480] Yeah, cool stuff.
[01:21:44.480 --> 01:21:48.320] So Kevin, there's a question that I've been wanting to ask you for a long time.
[01:21:48.320 --> 01:21:55.840] And I've been saying on the show a lot, you know, like everything we eat, all the food, all the meats, all this stuff, it's all been selectively bred, right?
[01:21:55.840 --> 01:21:58.000] And some of it has been genetically engineered.
[01:21:58.000 --> 01:22:03.040] So we have this limited spectrum of different types of fruits and vegetables and meats and everything.
[01:22:03.040 --> 01:22:07.840] But I'm curious to know, like, how broad could this flavor profile be?
[01:22:07.840 --> 01:22:14.480] Like, could there be a million more brand new fruits that don't taste like anything we've ever tasted before that just haven't been created yet?
[01:22:14.480 --> 01:22:15.200] Absolutely.
[01:22:15.200 --> 01:22:18.000] Yeah, there's so much volatile diversity out there.
[01:22:18.000 --> 01:22:21.360] And what you're looking at is a combination of two different things.
[01:22:21.360 --> 01:22:34.520] There's the volatiles that are out there, which are the major things that shape flavor along with acids and sugars and other aspects, where the tongue and the olfactory system collide in the brain, right, with all the different things it's sensing.
[01:22:34.840 --> 01:22:39.320] But there's so many unusual volatiles that are out there, and we find things all the time.
[01:22:39.320 --> 01:22:42.200] But also things that disrupt your ability to perceive them.
[01:22:42.200 --> 01:22:43.880] So things like miracle fruit.
[01:22:43.880 --> 01:22:48.360] You can eat a miracle fruit and then not be able to, and then when you taste acid, it tastes sweet.
[01:22:48.360 --> 01:22:52.040] So it's a weird sensory combination that's very interesting.
[01:22:52.040 --> 01:23:02.440] So to visualize it, this might be a hard thing to answer, but like we have like these little, you know, along the horizon, we have like a pineapple and watermelon, blah, blah, blah.
[01:23:02.440 --> 01:23:07.080] But literally, there could be like millions of fruit flavors that we've never tasted before.
[01:23:07.560 --> 01:23:09.880] Well, maybe you've tasted them, but you don't know you did.
[01:23:09.880 --> 01:23:17.000] Because what's happening in a strawberry, there are very small hints of the major flavor of peach and the major flavor of grape.
[01:23:17.000 --> 01:23:19.080] And you have to think of it like an orchestra.
[01:23:19.080 --> 01:23:24.760] All these things contribute just a tiny little bit that's barely perceptible unless you really know what you're looking for.
[01:23:24.760 --> 01:23:30.760] But we can use gas chromatography and other analytical methods, analytical chemistry methods, to be able to detect them.
[01:23:30.760 --> 01:23:34.840] And consumers notice when they're not there, even if they don't know why.
[01:23:35.640 --> 01:23:37.640] So maybe we can think about this in two ways.
[01:23:37.640 --> 01:23:47.320] One is, if you have all the fruity flavors that we could taste, right, then a fruit is some combination of those flavors, some subset, and some are more powerful than others.
[01:23:47.320 --> 01:23:50.840] So yes, strawberry has peach in it, but a peach has a lot of peach in it.
[01:23:50.840 --> 01:23:51.400] That's right.
[01:23:51.880 --> 01:24:05.720] But is there also potentially, like, you talk about the chemical space, you know, like, is there a space of flavors that nature hasn't necessarily fully explored all of those flavors, and maybe there is more room for even new flavors that we've never tasted before?
[01:24:05.880 --> 01:24:06.520] Can we even know?
[01:24:07.000 --> 01:24:07.720] Yeah, can we even know about that?
[01:24:07.880 --> 01:24:08.440] That's interesting.
[01:24:08.440 --> 01:24:09.480] I think there probably are.
[01:24:09.480 --> 01:24:29.840] I think if you start going into especially floral aromas where you're attracting pollinators and unusual pollinators, where you attract the enemies of herbivores, so you have plants that are being fed or are being fed upon, and then the plant will exude volatiles to attract the predator of the thing that's feeding on it.
[01:24:30.080 --> 01:24:31.680] It kind of calls in the troops.
[01:24:31.680 --> 01:24:38.560] But all these things may have some sort of flavor response in us, but we just haven't integrated them in the fruits and vegetables.
[01:24:38.560 --> 01:24:41.840] You know, there's the volatile ohm is expensive.
[01:24:42.480 --> 01:24:43.280] Yeah, yeah, right, exactly.
[01:24:43.440 --> 01:24:44.560] So remind me.
[01:24:44.880 --> 01:24:51.360] There may be flavors in plants that are delicious, but the plants are poisonous to humans.
[01:24:51.360 --> 01:24:51.840] Could be.
[01:24:51.840 --> 01:24:57.600] So we could make a non-poisonous version of them or get their genes into stuff we can't eat.
[01:24:57.600 --> 01:24:58.080] Absolutely.
[01:24:58.240 --> 01:25:02.320] And all kinds of other interesting stuff, not just flavors and aromas.
[01:25:02.320 --> 01:25:05.680] Look at like Sichuan peppercorns that have that numbing effect.
[01:25:05.920 --> 01:25:09.360] There's so many interesting sensory aspects of plant biology.
[01:25:09.360 --> 01:25:11.120] And I wish I still played in that field.
[01:25:11.280 --> 01:25:15.680] We really kind of moved out of it a bunch just because now it's in the hands of the breeders.
[01:25:15.680 --> 01:25:17.040] Now they've got to put it all together.
[01:25:17.040 --> 01:25:25.440] It reminds me, I was reading an article about medieval food and what it tasted like and the flavor profiles that they enjoyed.
[01:25:25.440 --> 01:25:28.320] If modern people had that, they would be like, what did I just eat?
[01:25:28.320 --> 01:25:29.280] That was horrible.
[01:25:29.280 --> 01:25:45.680] But it is making me think in the future that people of our future, our descendants might think, might feel bad for us because then, like, can you imagine they had so few things to eat, so few fruits, and they're enjoying, they have this cornucopia of tastes and flavors and volatiles that we just can't even imagine right now.
[01:25:45.680 --> 01:25:48.120] And you're just like, they would think of us.
[01:25:48.200 --> 01:25:50.560] Poor people that had horrible food.
[01:25:50.880 --> 01:25:51.520] Tainted.
[01:25:51.520 --> 01:25:54.640] People can be tainted by these artificial flavors that are out there.
[01:25:54.640 --> 01:25:57.920] They think they know what a strawberry tastes like because they taste enough artificial strawberry.
[01:25:57.920 --> 01:26:01.160] They don't really know what a strawberry really is supposed to taste like.
[01:26:01.800 --> 01:26:08.840] But isn't so much a food science understanding the basic interests of the consumer.
[01:26:08.840 --> 01:26:11.480] And consumers like fat, they like sugar.
[01:26:11.480 --> 01:26:16.120] They like things that historically, evolutionarily, were rare.
[01:26:16.120 --> 01:26:21.160] And if we can make healthy food taste that way, then we can have the best of both worlds.
[01:26:21.160 --> 01:26:22.360] And I'm down for that.
[01:26:22.360 --> 01:26:27.080] Yeah, well, that was one of the big issues: could you find volatiles that replace sweetness?
[01:26:27.400 --> 01:26:28.600] And you can.
[01:26:28.760 --> 01:26:34.440] That's been some work by Linda Bardaschuk in our institution, a guy named Thomas Calhoun.
[01:26:34.440 --> 01:26:40.040] They were looking at the flavor volatiles that made people sense sweetness in the absence of sugar.
[01:26:40.040 --> 01:26:40.280] Wow.
[01:26:40.520 --> 01:26:44.120] And so you would take two glasses of water, put in a tablespoon of sugar, and mix it.
[01:26:44.120 --> 01:26:49.400] In one of them, you would add some of these fruity volatiles, and people would say the one with the volatiles was sweeter.
[01:26:49.400 --> 01:26:49.800] Smarter.
[01:26:50.040 --> 01:26:51.640] Even if they couldn't perceive the volatiles.
[01:26:51.960 --> 01:26:57.720] But it's incredible when you think of like, you know, someone who enjoys cooking a lot of Italian food, right?
[01:26:57.720 --> 01:27:04.040] You know, onions and garlic, these are like the bases of so many different things in that, right?
[01:27:04.040 --> 01:27:11.560] And then, like, you can go to India now, and like the basis of their flavors, there's a lot of curries, which are constructed flavors anyway, right?
[01:27:11.560 --> 01:27:28.760] Like, the complexity is so broad that could you imagine if there was like another 50 culture versions of food that we just have no idea what it tastes like, but it's partly because we don't have the vegetables and the big beginnings to start those all those completely different types of foods that could be out there, you know?
[01:27:29.320 --> 01:27:34.040] I just love that because I just think, like Bob said, in the future, people might have access to this stuff.
[01:27:34.040 --> 01:27:41.080] They might be able to genetically engineer something like just by talking to an AI and then grow it and be like, oh my God, I made a new freaking fruit.
[01:27:41.480 --> 01:27:42.680] No one's ever tasted this before.
[01:27:43.080 --> 01:27:43.480] That's cool.
[01:27:43.880 --> 01:27:44.960] Are they using AI in?
[01:27:45.920 --> 01:27:46.800] Yeah, they are using this.
[01:27:46.800 --> 01:27:49.360] And mostly that's happening with the breeders now.
[01:27:49.360 --> 01:27:50.000] That's not anything.
[01:27:50.320 --> 01:27:52.480] My lab is working on the opposite of AI.
[01:27:53.040 --> 01:27:54.800] We decided everyone's going to AI.
[01:27:54.800 --> 01:27:58.640] We're just going to go to kind of ignorant randomness.
[01:27:58.640 --> 01:28:01.360] And so, what we decided to do, this is the coolest new science.
[01:28:01.360 --> 01:28:07.840] We were taking, we always wanted to take an organism and just put random DNA in it and see what comes from it.
[01:28:08.080 --> 01:28:09.280] And what do you mean by random DNA?
[01:28:09.440 --> 01:28:09.840] Random DNA.
[01:28:10.000 --> 01:28:12.080] So random genes, or like literally random sequence?
[01:28:12.240 --> 01:28:13.120] A random sequence.
[01:28:13.120 --> 01:28:14.000] So it's not even a gene.
[01:28:14.160 --> 01:28:15.040] It's not even a gene.
[01:28:16.160 --> 01:28:21.040] It encodes a protein product, a peptide, that has never existed in the universe before.
[01:28:21.200 --> 01:28:28.480] So instead of sort of having a hypothesis and going top-down theoretically, you're just like going bottom up and going, what happens if we just mix this stuff up?
[01:28:28.800 --> 01:28:30.080] It's just throwing a whole bunch of shit against a skin.
[01:28:30.160 --> 01:28:30.720] Yeah, see what sticks.
[01:28:31.920 --> 01:28:34.640] I always liken it to throwing monkey wrenches into the machine.
[01:28:34.880 --> 01:28:38.160] That you're standing next to this elaborate machine throwing monkey wrenches in.
[01:28:38.160 --> 01:28:39.440] Most of the time they don't do anything.
[01:28:39.760 --> 01:28:42.160] But once in a while, one sticks in the gears.
[01:28:42.160 --> 01:28:51.760] And so this is the coolest thing that we've been doing because we've been showing in bacteria as well as plants that we can identify new vulnerabilities for lethality by using random peptides.
[01:28:51.760 --> 01:29:01.840] And the random peptides stick in either in different places or the random RNAs suppress RNA that's required for certain developmental transitions.
[01:29:01.840 --> 01:29:10.400] And so why this is so cool is because we're not going to create a new herbicide or a new antibiotic by creating a peptide or a mimic of that peptide.
[01:29:10.400 --> 01:29:14.320] But what we will expose is a new vulnerability we didn't know about before.
[01:29:14.640 --> 01:29:19.280] And then have smart people who design molecules make something that fits that vulnerability.
[01:29:19.280 --> 01:29:24.280] Is this like a new kind of mutation farming where again you're just trying to make random shit happen to see if anything good comes of it?
[01:29:24.000 --> 01:29:24.840] Is it similar?
[01:29:25.120 --> 01:29:25.920] I think it's even weirder.
[01:29:26.400 --> 01:29:27.040] Even weirder.
[01:29:27.040 --> 01:29:32.280] Yeah, this is just, this is pure, just, and you have to have huge populations to be able to do this.
[01:29:32.600 --> 01:29:38.360] But still, with large populations, we find that lethality in about 5%, which is pretty amazing.
[01:29:38.360 --> 01:29:40.200] It's much higher than we would have predicted.
[01:29:40.200 --> 01:29:41.560] What plants are you doing this in?
[01:29:41.560 --> 01:29:44.840] We're doing this in a rabbidopsis, which is the laboratory plant.
[01:29:44.840 --> 01:29:52.600] But it also works in bacteria that we can disrupt very well-characterized bacterial processes with randomness.
[01:29:53.560 --> 01:29:57.160] How do you feel the GMO attitudes are out there in the country?
[01:29:57.800 --> 01:29:58.840] Have we made any progress?
[01:29:58.840 --> 01:29:59.880] Are things getting better?
[01:30:00.840 --> 01:30:01.960] Do we just stop talking about it?
[01:30:01.960 --> 01:30:02.600] What's going on?
[01:30:02.600 --> 01:30:04.040] We have absolutely made progress.
[01:30:04.520 --> 01:30:10.120] And I teach classes now on, I teach a class on critical thinking in agriculture and medicine.
[01:30:10.200 --> 01:30:12.760] I designed this course, and it is so much fun.
[01:30:12.760 --> 01:30:21.400] And we talk about all the different ways in which we are deceived, deceived ourselves, cognitive bias, statistical deception, all the things like that that you guys talk about every week.
[01:30:21.400 --> 01:30:22.360] It's fantastic.
[01:30:22.360 --> 01:30:23.880] We talk about alternative medicine.
[01:30:23.880 --> 01:30:25.800] Then we talk about genetic engineering.
[01:30:25.800 --> 01:30:28.440] And now when I talk about GMOs, everybody kind of glazes over.
[01:30:28.600 --> 01:30:29.480] Nobody cares.
[01:30:30.120 --> 01:30:32.920] None of the students are thinking it's a threat or a problem.
[01:30:32.920 --> 01:30:35.560] There's no problem to solve in that room.
[01:30:35.560 --> 01:30:39.640] But then it turns into how do we solve the problem that's still out there in the public?
[01:30:39.960 --> 01:30:45.720] And how do I deputize this room of 30 students to engage their skeptical friends?
[01:30:45.720 --> 01:30:48.200] And how do I get them to engage online?
[01:30:48.200 --> 01:30:50.840] We're in social media where this stuff runs rampant.
[01:30:50.840 --> 01:30:52.520] And how do we do it effectively?
[01:30:52.520 --> 01:31:02.520] And so that's where I've really just taken such a turn away from biology science into much more sociology and psychology and understand how we can be better persuaders.
[01:31:03.240 --> 01:31:06.600] And then I think that's been just the magic in the last 10 years.
[01:31:06.920 --> 01:31:07.640] I mean, absolutely.
[01:31:07.640 --> 01:31:08.520] Obviously, this is what we do.
[01:31:09.000 --> 01:31:17.760] Thinking very carefully about how do we persuade the j people individually, the general population, the people who matter in terms of regulators and whatnot.
[01:31:14.680 --> 01:31:19.760] And it's just tricky, and each topic is different.
[01:31:19.920 --> 01:31:24.480] But I think, yeah, the GMOs might, I also think that we've been moving the needle on that.
[01:31:24.480 --> 01:31:28.640] I do think because this topic is amenable to information.
[01:31:28.640 --> 01:31:32.720] You know, there was just, we are just combating misinformation and it's very correctable.
[01:31:32.720 --> 01:31:47.040] I also think, and this is just now just my gut feeling, is that a lot of the pushback against the GMOs, the anti-GMO attitude, was just an unfamiliarity with the technology and just a disgust kind of reaction to it.
[01:31:47.040 --> 01:31:56.000] I think, this is my hope, my thought and my hope, that much like IVF in vitro fertilization, remember the test two babies and all the protests and everything?
[01:31:56.000 --> 01:31:57.840] And now nobody cares, right?
[01:31:57.840 --> 01:31:59.440] I'm just hoping that the same thing is going to happen.
[01:31:59.440 --> 01:32:01.520] Like genetically ensuring everything is genetically engineered.
[01:32:01.520 --> 01:32:02.640] Who cares, right?
[01:32:02.640 --> 01:32:04.160] Do you think that we're heading in that direction?
[01:32:04.480 --> 01:32:05.440] I agree 100%.
[01:32:05.440 --> 01:32:15.040] I also think there's a lot of disaster fatigue that we've been told, well, you're going to get lumpy and things are going to fall off and all the problems that we're going to have never materialized.
[01:32:15.200 --> 01:32:15.680] Never happened.
[01:32:16.000 --> 01:32:18.720] And I think that has a big role in that too.
[01:32:18.720 --> 01:32:30.720] And so now when we can remind people of what they said, of what the opponents said, and then we can show the progress of where it's going and we can show all the beautiful things that we could be doing, that people do change their minds.
[01:32:31.040 --> 01:32:32.720] And you can kind of persuade.
[01:32:33.040 --> 01:32:38.160] There still is a rather vigorous anti-GMO movement out there that if you look for it, you can find it.
[01:32:38.160 --> 01:32:39.120] Oh, yeah, no question.
[01:32:39.120 --> 01:32:41.120] But it is changing.
[01:32:41.120 --> 01:32:48.400] And the big problem is that so many of the really good innovations that we have still haven't hit the road.
[01:32:48.640 --> 01:32:50.800] And that rubber hasn't hit the road.
[01:32:51.280 --> 01:32:54.000] And so all the, you know, we don't have golden rice yet.
[01:32:54.040 --> 01:32:59.520] You know, we don't have the bananas, the soybeans, all the other stuff that could solve vitamin A deficiency.
[01:33:01.000 --> 01:33:15.080] But also, it's like when the solutions feel like they're happening in the background to solve ever-increasing problem problems, like, oh, here's a solution to solve a blight that you never would have even known was there because we got out in front of it before it devastated the crop.
[01:33:15.080 --> 01:33:18.040] People don't recognize all of the progress.
[01:33:18.040 --> 01:33:22.280] They only recognize the progress when there's like a fundamental or radical change.
[01:33:22.280 --> 01:33:30.040] And so I think that's always a problem: is that in science, so much of what we have to do is to try to prevent devastation.
[01:33:30.040 --> 01:33:32.600] And then people don't recognize that we prevented the devastation.
[01:33:32.840 --> 01:33:34.520] If we stopped COVID from happening, nobody would know.
[01:33:34.920 --> 01:33:35.160] Exactly.
[01:33:35.720 --> 01:33:37.240] That's my job as a producer.
[01:33:37.240 --> 01:33:37.480] Right.
[01:33:37.720 --> 01:33:41.080] If you did your job well, nobody knows that you did it or that it was hard.
[01:33:41.720 --> 01:33:44.280] So I think that's crazy.
[01:33:44.840 --> 01:33:45.880] And I agree with you.
[01:33:45.880 --> 01:33:51.240] And I think we were in a situation where we fight hard, many fronts.
[01:33:51.240 --> 01:33:56.200] And GMOs, I think it's ironic when there was, what was it, Steve?
[01:33:56.200 --> 01:34:01.240] Was it papaya or was it when Hawaii had a papaya blight?
[01:34:01.240 --> 01:34:04.440] They just were like, we're going to plant the GMOs and we're just not going to be able to do it.
[01:34:04.600 --> 01:34:08.040] We're going to finally ignore the papaya because otherwise the papaya industry should go away.
[01:34:08.600 --> 01:34:09.560] I like that example.
[01:34:09.560 --> 01:34:10.920] I also use the cheese example.
[01:34:10.920 --> 01:34:14.760] It's like, there would be no cheese industry without GMO rennet.
[01:34:15.240 --> 01:34:15.720] Forget about it.
[01:34:16.200 --> 01:34:17.160] But you know what I think, though?
[01:34:17.160 --> 01:34:19.480] I think the big one is going to be chocolate.
[01:34:19.720 --> 01:34:21.800] Because there is a child blood problem.
[01:34:22.520 --> 01:34:22.840] You know this?
[01:34:23.160 --> 01:34:25.400] If GMOs save chocolate, then we've won.
[01:34:25.640 --> 01:34:26.360] No, but that's what I'm saying.
[01:34:26.440 --> 01:34:26.920] No, I know.
[01:34:27.240 --> 01:34:27.720] It's funny.
[01:34:27.720 --> 01:34:31.240] It's funny, but I really do think, imagine if chocolate went away for a little while.
[01:34:31.320 --> 01:34:32.520] They'll go, we could bring it back.
[01:34:32.520 --> 01:34:33.320] We have GMO.
[01:34:33.320 --> 01:34:33.960] We'll bring it back.
[01:34:33.960 --> 01:34:34.920] You'll have it in six months.
[01:34:34.920 --> 01:34:35.240] People are going to be able to do it.
[01:34:35.400 --> 01:34:37.040] That's like the American chestnut tree.
[01:34:37.000 --> 01:34:38.160] Yeah, that's right.
[01:34:38.600 --> 01:34:39.480] But yeah, I agree.
[01:34:39.640 --> 01:35:05.680] I do think the anti-GMO crowd, panicking a little bit about golden rice and some of these applications, because I think they know that if we solve vitamin A deficiency or make a huge improvement with a GMO food that breaks all of their propaganda, it's not patented, it's not controlled by Monsanto or some big corporation, it's free to farmers, they can plant their own seeds, whatever.
[01:35:06.000 --> 01:35:10.080] None of your boogeymen is true, and it's going to save in blankets.
[01:35:10.320 --> 01:35:16.960] It's like a perfect PR, superstorm pro-GMO against the whole anti-German.
[01:35:17.040 --> 01:35:18.240] I've never thought of it that way, Steve.
[01:35:18.240 --> 01:35:18.640] You're right.
[01:35:18.880 --> 01:35:19.680] They're panicking about it.
[01:35:19.760 --> 01:35:22.160] That's why they're so desperate for it not to come to market.
[01:35:22.480 --> 01:35:33.760] It would also help, I think, a lot of the people who are on the fence or who are confused understand the difference between genetically modified organisms and corporate practices.
[01:35:33.760 --> 01:35:34.080] Right.
[01:35:34.080 --> 01:35:35.840] Because very often they conflate the two.
[01:35:36.320 --> 01:35:36.960] 100%.
[01:35:37.360 --> 01:35:40.400] Almost always when I'm talking to a skeptic who's anti-GMO, it's that sort of thing.
[01:35:40.480 --> 01:35:41.440] Yeah, it's corporate stuff that they're doing.
[01:35:41.600 --> 01:35:44.160] It's like it's really, I don't think corporations should have that much power.
[01:35:44.160 --> 01:35:45.040] Well, they already do.
[01:35:45.040 --> 01:35:46.800] Yeah, that's not what we're talking about.
[01:35:46.800 --> 01:35:47.760] Yeah, that's what we're talking about.
[01:35:47.760 --> 01:35:50.160] Is the anti-GMO lobby waning?
[01:35:50.480 --> 01:35:54.720] Yeah, I think so, especially because gene editing has been so democratizing.
[01:35:54.720 --> 01:36:05.600] So going back to this CRISPR Cas9, the ability to change a letter or two, that technology has been very in the hands of universities and small companies.
[01:36:05.600 --> 01:36:07.360] And it's just a different feel.
[01:36:07.360 --> 01:36:08.800] Small governments can do it.
[01:36:08.800 --> 01:36:09.920] Everybody can do it.
[01:36:09.920 --> 01:36:16.800] And with that kind of ability, it really changes the dynamic of who can bring a product to market.
[01:36:16.800 --> 01:36:29.120] The traditional transgenic approach, what we usually think about as genetically modified, the fact that it was so regulated and that people pushed for more regulation meant that only a couple of companies had the ability to do it.
[01:36:29.200 --> 01:36:31.880] It was like three or four companies, right, that were putting out all the GMOs.
[01:36:29.520 --> 01:36:34.120] And those three or four companies said, you know what?
[01:36:34.120 --> 01:36:35.640] Make the process harder.
[01:36:29.840 --> 01:36:37.000] Make it harder.
[01:36:37.160 --> 01:36:41.880] Because if you make that process harder, it gives us exclusivity in that space.
[01:36:41.880 --> 01:36:48.440] And so the anti-GMO folks who were out there who said we hate these companies were doing nothing more than empowering the companies they hated.
[01:36:48.760 --> 01:36:49.640] Hilarious.
[01:36:50.120 --> 01:36:52.280] Yeah, that these companies were like, yeah, keep it up, guys.
[01:36:53.080 --> 01:36:54.360] And they were protests.
[01:36:54.600 --> 01:36:56.440] They were actually really doing the devil's thing.
[01:36:57.160 --> 01:36:58.600] Yeah, regulate the hell out of it, right?
[01:36:58.600 --> 01:37:00.600] Because then nobody can compete with us, right?
[01:37:00.600 --> 01:37:01.080] Yeah.
[01:37:01.080 --> 01:37:03.080] Anything else you want us to talk about that we didn't get to?
[01:37:03.080 --> 01:37:04.280] No, I think that's pretty much it.
[01:37:04.600 --> 01:37:12.040] The one thing that it may be to mention is people do need to be participating in these conversations still, and that it's not a dead issue.
[01:37:12.040 --> 01:37:13.000] We can't get complacent.
[01:37:13.800 --> 01:37:14.680] It'll come right back.
[01:37:15.000 --> 01:37:17.320] It has the possibility to come back.
[01:37:17.320 --> 01:37:24.520] And now they're fighting things like appeal, you know, this coding that you put on fruit to make it last longer, saying that it's Bill Gates poison, you know.
[01:37:25.000 --> 01:37:25.560] Poor Bill Gates.
[01:37:26.680 --> 01:37:28.440] I never give away billions of dollars.
[01:37:30.760 --> 01:37:33.800] But all these things are still being discussed.
[01:37:33.800 --> 01:37:39.560] But the bottom line is, they limit how technology can reach the poorest people on the planet.
[01:37:39.560 --> 01:37:41.560] The affluent are not missing meals.
[01:37:41.960 --> 01:37:50.680] And so being active in this and remembering who we're really trying to help here and making food last longer, make it taste better, all that stuff, that's what we have to be doing.
[01:37:50.680 --> 01:37:51.560] We've got to keep on it.
[01:37:51.880 --> 01:37:53.240] Well, thank you for your service.
[01:37:53.880 --> 01:37:55.240] I appreciate it, Kevin.
[01:37:55.400 --> 01:37:56.840] Doing the good work.
[01:37:59.720 --> 01:38:04.760] It's time for science or fiction.
[01:38:09.560 --> 01:38:14.360] Each week I come up with three science news items or facts: two real and one fake.
[01:38:14.440 --> 01:38:19.200] Then I challenge my panel of skeptics to tell me which one is the fake.
[01:38:14.840 --> 01:38:20.720] We have a theme this week.
[01:38:21.040 --> 01:38:23.440] The theme is U.S.
[01:38:23.440 --> 01:38:24.400] trivia.
[01:38:24.480 --> 01:38:27.520] That's randomoid facts about the U.S.
[01:38:27.760 --> 01:38:28.400] Random.
[01:38:28.400 --> 01:38:29.360] All right, here we go.
[01:38:30.000 --> 01:38:40.480] Kansas is not only the flattest state in the U.S., it is literally flatter than a pancake with a flatness score of 0.9997.
[01:38:40.800 --> 01:38:48.720] I number two, the coastline of Alaska is longer than the coastlines of all the other 49 states combined.
[01:38:48.720 --> 01:38:56.000] And item number three, the first telephone directory in the world was published in New Haven, Connecticut in 1878.
[01:38:56.000 --> 01:39:00.160] The names were not alphabetized and there were no phone numbers included.
[01:39:00.160 --> 01:39:01.040] Evan, go first.
[01:39:01.040 --> 01:39:06.720] Well, Kansas, yes, that's the name of a star, according to a movie I once watched.
[01:39:06.720 --> 01:39:07.920] Wizard of Oz, anyone?
[01:39:07.920 --> 01:39:08.400] No?
[01:39:09.280 --> 01:39:12.080] Literally flatter than a pancake.
[01:39:12.480 --> 01:39:14.320] Wow, that's flat.
[01:39:14.320 --> 01:39:15.920] That's pretty flat.
[01:39:15.920 --> 01:39:20.560] Although, well, I didn't even know there was this thing called a flatness score, frankly.
[01:39:20.880 --> 01:39:28.160] Even in these flat, flat states, you know, you are going to have some elevation differences, right?
[01:39:29.120 --> 01:39:32.800] So saying it's flatter than the pancake, I have no, really can't say.
[01:39:32.960 --> 01:39:38.320] The coastline of Alaska, number two, longer than the coastlines of all the other 49 states combined.
[01:39:38.320 --> 01:39:41.840] Okay, so the thing you got to remember about Alaska, Alaska's big.
[01:39:42.160 --> 01:39:45.040] It is the largest state for area.
[01:39:45.040 --> 01:39:48.880] And also, Alaska's got all these little islands among so many other things.
[01:39:48.840 --> 01:39:52.800] So and that coastline is a kind of a jagged maze.
[01:39:52.800 --> 01:39:57.040] So could it be could it be longer than the rest of all the coastlines?
[01:39:57.040 --> 01:39:58.000] I think it could be.
[01:39:58.000 --> 01:40:01.000] I have a feeling that one's going to wind up being science.
[01:40:01.320 --> 01:40:09.480] And the last one about the first telephone directory in New Haven, Connecticut, not far from here, 1878.
[01:40:09.480 --> 01:40:14.040] Names were not alphabetized and there were no phone numbers included.
[01:40:14.040 --> 01:40:16.920] I have a feeling that one is also science.
[01:40:16.920 --> 01:40:21.560] Therefore, I'm dubious about the whole pancake flatness Kansas thing.
[01:40:21.560 --> 01:40:23.400] That one, I think, is fiction.
[01:40:23.400 --> 01:40:24.200] Okay, Bob.
[01:40:24.360 --> 01:40:26.440] All right, Kansas, not the flattest state.
[01:40:26.440 --> 01:40:27.960] Oh, it is a flattest.
[01:40:28.200 --> 01:40:30.200] Yeah, I could see it being flatter than a pancake.
[01:40:30.200 --> 01:40:39.640] I mean, pancakes generally are a little domed, you know, because just the nature of how you just spread it, it just glop it on, and then it spreads, right?
[01:40:39.640 --> 01:40:45.320] It's got a viscosity, it spreads, and, you know, the middle is never going to spread quite as much as the outer edges.
[01:40:45.320 --> 01:40:48.120] It just seems to me that that's not that big of a deal.
[01:40:48.440 --> 01:40:50.040] The Alaska one, that's tough.
[01:40:50.280 --> 01:40:58.360] I assume that when comparing the coastline of Alaska and the rest of the United States, that they use the same fractal dimension for all of those measurements.
[01:40:59.480 --> 01:41:00.280] I'll make that assumption.
[01:41:00.600 --> 01:41:01.480] We will make that assumption.
[01:41:01.640 --> 01:41:02.280] You have to.
[01:41:02.280 --> 01:41:09.960] Because then you could say Florida has more coastline than every other country on the planet if you mess around with your fractal dimensions.
[01:41:09.960 --> 01:41:17.560] Thinking about it, it seems like it would be close, but I think there's kind of like lots of little ins and outs of Alaska.
[01:41:17.560 --> 01:41:20.280] So it might sneak by and actually be a little bit long.
[01:41:20.760 --> 01:41:25.960] I don't think it's by a dramatic amount, but so that would mean this last, I don't know what to make of this stupid.
[01:41:26.120 --> 01:41:29.240] So it's a phone directory, not in alphabetical order.
[01:41:29.240 --> 01:41:30.840] And oh, yeah, it doesn't have phone numbers.
[01:41:30.840 --> 01:41:31.480] What the hell?
[01:41:31.480 --> 01:41:35.400] Why, then, then it's not, then it's not a phone directory, a telephone directory.
[01:41:35.400 --> 01:41:37.880] I don't know what you would call it, but not a telephone directory.
[01:41:37.880 --> 01:41:39.640] So I don't know what the hell's going on with that.
[01:41:39.640 --> 01:41:42.040] I'm just going to say Alaska fiction.
[01:41:42.040 --> 01:41:43.000] Okay, Jay.
[01:41:43.000 --> 01:41:47.680] Yeah, so going backwards, I mean, you live close to New Haven.
[01:41:44.840 --> 01:41:48.480] You know, we both do.
[01:41:48.800 --> 01:41:57.040] I never heard of this, but this is one of those news items where I'm like, I don't see any reason why, you know, the first telephone directory wasn't published in New Haven.
[01:41:57.040 --> 01:41:57.280] Okay.
[01:41:57.280 --> 01:42:00.320] I mean, there's nothing really there that's making me go, hey, you know.
[01:42:00.320 --> 01:42:04.720] Now, let's keep in mind about item number two here: that Alaska is freaking huge.
[01:42:04.720 --> 01:42:11.520] And most of the times that you see a picture of it, it isn't relative size, not the actual size compared to the other things on the map.
[01:42:11.520 --> 01:42:12.080] That's true.
[01:42:12.080 --> 01:42:14.720] The Mercator projection will explain.
[01:42:14.800 --> 01:42:19.440] And, you know, it's got tons and tons of jigs and jags and islands and blah, blah, blah.
[01:42:19.520 --> 01:42:20.640] You know, it just like adds up.
[01:42:20.640 --> 01:42:22.960] So I, you know, I think that's probably true.
[01:42:22.960 --> 01:42:27.280] I mean, I can't imagine, I can imagine easily that it has more coastline just because of all the shapes.
[01:42:27.520 --> 01:42:30.240] You know, you look down California, you know, relatively straight.
[01:42:30.560 --> 01:42:32.480] Look down the East Coast, relatively straight.
[01:42:32.480 --> 01:42:34.720] You know, Alaska's just got so much busy going on.
[01:42:34.720 --> 01:42:37.520] So that leaves me to the first one here.
[01:42:37.520 --> 01:42:43.680] I wonder if they took into account the curvature of the earth when they said how flat Kansas is.
[01:42:43.680 --> 01:42:45.840] But it could be in like a basin, I guess.
[01:42:46.160 --> 01:42:50.160] And I have no reason to not believe that it's wicked flat.
[01:42:50.160 --> 01:42:55.520] I don't know about the number that Steve said, but I do not think that Kansas is the flattest state.
[01:42:55.520 --> 01:42:57.040] I just don't think it is.
[01:42:57.040 --> 01:43:01.200] I have reasons to believe this, and I will say that this one is a fiction.
[01:43:01.200 --> 01:43:02.160] And Kara.
[01:43:02.160 --> 01:43:06.320] Yeah, I got to go with, is that just Jay or is that Jay and Evan?
[01:43:06.320 --> 01:43:14.400] I got to go with Jay and Evan on this because I seem to remember living in Florida and Florida being really damn flat, like really flat.
[01:43:14.400 --> 01:43:18.800] And I don't know, maybe Kansas is flatter, but I bet you Florida gives it a run for its money.
[01:43:18.800 --> 01:43:20.160] I buy the Alaska one.
[01:43:20.160 --> 01:43:23.440] I think we have to remember that also a lot of the U.S.
[01:43:23.440 --> 01:43:25.280] is actually landlocked.
[01:43:25.280 --> 01:43:27.840] So, like, we're not just coastline all the way around.
[01:43:27.840 --> 01:43:31.160] I'm with Bob on the whole, like, why is it called a telephone directory?
[01:43:31.160 --> 01:43:34.680] But my assumption is 1878, did they not have phone numbers?
[01:43:34.680 --> 01:43:37.000] Was it like telegrams or something?
[01:43:37.000 --> 01:43:37.640] What was before?
[01:43:37.640 --> 01:43:38.200] Phones?
[01:43:38.200 --> 01:43:39.080] Telegraphs?
[01:43:39.080 --> 01:43:39.480] Smoke signal.
[01:43:39.640 --> 01:43:40.040] Telegraphs.
[01:43:40.360 --> 01:43:41.400] Yeah, it was a smoke signal directory.
[01:43:41.560 --> 01:43:41.880] Telegraph.
[01:43:42.120 --> 01:43:43.000] So maybe it was something like that.
[01:43:43.000 --> 01:43:44.440] And they just.
[01:43:44.440 --> 01:43:49.400] But I think I'm going to go with Eben and Jay and say that Kansas is not the flattest state in the U.S.
[01:43:49.560 --> 01:43:51.640] I have no idea if it's flatter than a pancake, though.
[01:43:51.640 --> 01:43:53.560] All right, so you all agree on the third one.
[01:43:53.560 --> 01:43:54.280] So we'll start there.
[01:43:54.280 --> 01:43:59.000] The first telephone directory in the world was published in New Haven, Connecticut in 1878.
[01:43:59.000 --> 01:44:03.080] The names were not alphabetized, and there were no phone numbers included.
[01:44:03.080 --> 01:44:08.280] You guys all think this one is science, and this one is science.
[01:44:08.280 --> 01:44:14.200] And yes, the telephone directory was a telephone directory, not some other kind of directory that they just called a telephone directory.
[01:44:14.200 --> 01:44:16.040] Do you know why there were no phone numbers listed?
[01:44:16.200 --> 01:44:16.840] Because operators.
[01:44:17.000 --> 01:44:18.600] There were only like eight names or something.
[01:44:19.240 --> 01:44:20.360] Yeah, you would just call the operator.
[01:44:20.600 --> 01:44:21.720] Phone numbers did not exist.
[01:44:21.720 --> 01:44:22.600] Yeah, you would call up the data.
[01:44:23.080 --> 01:44:26.920] You would just pick up a candle and talk to the connection.
[01:44:27.480 --> 01:44:28.600] Give me Joe Bag of Donuts.
[01:44:29.000 --> 01:44:34.680] And then the operator would know who he was and would literally just connect you to that person.
[01:44:34.680 --> 01:44:35.320] That's right.
[01:44:35.3
Prompt 6: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 7: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
Prompt 9: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 3 of 3 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
37.640 --> 01:43:38.200] Phones?
[01:43:38.200 --> 01:43:39.080] Telegraphs?
[01:43:39.080 --> 01:43:39.480] Smoke signal.
[01:43:39.640 --> 01:43:40.040] Telegraphs.
[01:43:40.360 --> 01:43:41.400] Yeah, it was a smoke signal directory.
[01:43:41.560 --> 01:43:41.880] Telegraph.
[01:43:42.120 --> 01:43:43.000] So maybe it was something like that.
[01:43:43.000 --> 01:43:44.440] And they just.
[01:43:44.440 --> 01:43:49.400] But I think I'm going to go with Eben and Jay and say that Kansas is not the flattest state in the U.S.
[01:43:49.560 --> 01:43:51.640] I have no idea if it's flatter than a pancake, though.
[01:43:51.640 --> 01:43:53.560] All right, so you all agree on the third one.
[01:43:53.560 --> 01:43:54.280] So we'll start there.
[01:43:54.280 --> 01:43:59.000] The first telephone directory in the world was published in New Haven, Connecticut in 1878.
[01:43:59.000 --> 01:44:03.080] The names were not alphabetized, and there were no phone numbers included.
[01:44:03.080 --> 01:44:08.280] You guys all think this one is science, and this one is science.
[01:44:08.280 --> 01:44:14.200] And yes, the telephone directory was a telephone directory, not some other kind of directory that they just called a telephone directory.
[01:44:14.200 --> 01:44:16.040] Do you know why there were no phone numbers listed?
[01:44:16.200 --> 01:44:16.840] Because operators.
[01:44:17.000 --> 01:44:18.600] There were only like eight names or something.
[01:44:19.240 --> 01:44:20.360] Yeah, you would just call the operator.
[01:44:20.600 --> 01:44:21.720] Phone numbers did not exist.
[01:44:21.720 --> 01:44:22.600] Yeah, you would call up the data.
[01:44:23.080 --> 01:44:26.920] You would just pick up a candle and talk to the connection.
[01:44:27.480 --> 01:44:28.600] Give me Joe Bag of Donuts.
[01:44:29.000 --> 01:44:34.680] And then the operator would know who he was and would literally just connect you to that person.
[01:44:34.680 --> 01:44:35.320] That's right.
[01:44:35.320 --> 01:44:37.560] Guess how many names were included?
[01:44:38.120 --> 01:44:38.520] 1870.
[01:44:38.680 --> 01:44:40.360] I think it was like eight or nine.
[01:44:40.920 --> 01:44:41.560] 50.
[01:44:41.640 --> 01:44:41.880] 50.
[01:44:42.200 --> 01:44:42.440] 50.
[01:44:42.680 --> 01:44:43.400] 5-0.
[01:44:43.400 --> 01:44:48.520] Yeah, so that was basically all the businesses and people who had phones in the New Haven area.
[01:44:48.840 --> 01:44:49.320] How many?
[01:44:49.320 --> 01:44:49.560] Yeah.
[01:44:49.560 --> 01:44:52.840] And you'd call the switchboards, they connect me to this person.
[01:44:53.160 --> 01:44:59.240] And then, you know, a few years later, when they had more people, they said, you know, we should probably put these names in alphabetical order.
[01:44:59.240 --> 01:45:07.480] And why don't we just assign numbers to them in case we have somebody like, you know, there's an operator who's who's covering who doesn't know everyone's name?
[01:45:07.480 --> 01:45:09.160] You know, like that was the next step.
[01:45:09.160 --> 01:45:11.640] And then that's when phone numbers were invented.
[01:45:11.640 --> 01:45:13.320] But yeah, first one in New Haven, Connecticut.
[01:45:13.320 --> 01:45:13.640] Cool.
[01:45:13.640 --> 01:45:15.280] All right, I guess we'll go backwards.
[01:45:14.680 --> 01:45:19.600] The coastline of Alaska is longer than the coastlines of all the other 49 states combined.
[01:45:19.680 --> 01:45:21.360] Bob, you think this one is the fiction?
[01:45:21.360 --> 01:45:23.280] Everyone else thinks this one is science.
[01:45:23.280 --> 01:45:26.400] And this one is science.
[01:45:26.400 --> 01:45:27.120] This is science.
[01:45:27.120 --> 01:45:27.920] Sorry, Bob.
[01:45:27.920 --> 01:45:30.000] So what about Mercator distortion?
[01:45:30.640 --> 01:45:30.960] Right?
[01:45:30.960 --> 01:45:38.240] Because if you look at a flat map, if you yeah, Alaska.
[01:45:38.800 --> 01:45:43.280] It's not as big as it looks on the Mercator projection, but it's still big.
[01:45:43.280 --> 01:45:45.920] But it is really because of what some of you said.
[01:45:45.920 --> 01:45:57.600] It's got so many jigs and jags and islands that it just, and there's like that one huge, you know, panhandle thing at the whisker thing at the bottom of Alaska.
[01:45:57.600 --> 01:46:00.560] It's just, if you look at it, it's a lot of coastline.
[01:46:00.560 --> 01:46:01.440] The rest of the U.S.
[01:46:01.440 --> 01:46:04.720] has a lot of coastline too, but it doesn't get up to quite as much as Alaska.
[01:46:05.040 --> 01:46:05.920] What's the ratio?
[01:46:05.920 --> 01:46:10.400] Well, Alaska has 34,000 miles of coastline, officially.
[01:46:11.520 --> 01:46:13.920] What's the next most coastline?
[01:46:14.160 --> 01:46:14.720] Which state?
[01:46:14.720 --> 01:46:15.280] Which state?
[01:46:15.280 --> 01:46:15.840] Oh, Hawaii?
[01:46:15.920 --> 01:46:16.640] Maybe California.
[01:46:17.920 --> 01:46:20.400] It's Florida, which has like 8,000.
[01:46:20.560 --> 01:46:21.520] It's not even close.
[01:46:22.720 --> 01:46:24.320] What's the total, though, of the states?
[01:46:24.560 --> 01:46:25.600] Less than 34,000.
[01:46:25.600 --> 01:46:25.920] I don't know.
[01:46:28.960 --> 01:46:30.400] Yeah, it's less than that.
[01:46:30.400 --> 01:46:31.600] Yeah, I think it's by a lot.
[01:46:31.600 --> 01:46:32.960] Like, it's not even close.
[01:46:32.960 --> 01:46:37.600] So, yeah, because it's because of how crazy the Alaska coastline is.
[01:46:37.600 --> 01:46:40.880] As Jay said, like, the West Coast is pretty straight.
[01:46:40.880 --> 01:46:42.960] The East Coast is mostly straight.
[01:46:43.040 --> 01:46:45.440] Then you get Florida, and then it's straight again.
[01:46:45.440 --> 01:46:46.880] Okay, let's go back number one.
[01:46:46.880 --> 01:46:55.200] Kansas is not only the flattest state in the U.S., it is literally flatter than a pancake with a flatness score of 0.9997.
[01:46:55.200 --> 01:46:57.600] That, of course, is the fiction.
[01:46:57.600 --> 01:47:00.440] The flattest state, CARA, is Florida.
[01:46:59.600 --> 01:47:11.480] Now, the other way to designate flattest, you could do a calculation of the difference between the highest point and the lowest point compared to its area, right?
[01:47:11.480 --> 01:47:15.320] I mean, Evan, you have to imagine a pancake the size of a state.
[01:47:15.320 --> 01:47:17.640] It would have a massive mountain in the middle.
[01:47:17.640 --> 01:47:18.040] You know what I mean?
[01:47:18.040 --> 01:47:21.240] Like, it would be, it would be a massive difference between the height.
[01:47:21.240 --> 01:47:26.120] But anyway, what do you think is the second flattest state after Florida?
[01:47:26.760 --> 01:47:28.520] I'm assuming not Kansas.
[01:47:28.520 --> 01:47:29.480] The second flattest.
[01:47:29.720 --> 01:47:30.760] Kentucky.
[01:47:30.760 --> 01:47:31.320] Really?
[01:47:31.320 --> 01:47:42.840] Then Delaware, Louisiana, Mississippi, Rhode Island, Indiana, Illinois, Ohio, Iowa, Wisconsin, Michigan, Missouri, Minnesota, New Jersey, Connecticut, Alabama, Arkansas, North Dakota, Pennsylvania, and then Kansas.
[01:47:42.840 --> 01:47:44.440] It's like right in the middle of the pack as well.
[01:47:44.920 --> 01:47:45.560] Close to the flat point.
[01:47:45.640 --> 01:47:47.080] It's the middle of the country, middle of the.
[01:47:47.960 --> 01:47:49.800] All you need is one hill, and then that screws it up.
[01:47:49.800 --> 01:47:50.440] It screws you over.
[01:47:50.600 --> 01:47:55.240] So the other way to designate it is just the difference between the high point and the low point, right?
[01:47:55.240 --> 01:47:57.640] So for Florida, it's 345 feet.
[01:47:57.640 --> 01:47:58.120] That's it.
[01:47:58.120 --> 01:47:58.680] That's the difference.
[01:47:58.760 --> 01:47:59.080] I knew that.
[01:48:00.360 --> 01:48:01.960] And the lowest point in Florida.
[01:48:01.960 --> 01:48:03.960] Kentucky's close, 388.
[01:48:03.960 --> 01:48:05.480] Delaware's 450.
[01:48:05.480 --> 01:48:07.080] Delaware's pretty flat as well.
[01:48:07.080 --> 01:48:08.200] I knew that fact about Florida.
[01:48:08.200 --> 01:48:10.040] I would not have guessed Kentucky.
[01:48:10.040 --> 01:48:13.160] Kansas is 3,363.
[01:48:13.160 --> 01:48:18.600] It's not anywhere near as flat as Florida, but it just has that reputation of being just like a big cornfield.
[01:48:18.600 --> 01:48:20.200] You know, it's just a flat state.
[01:48:20.840 --> 01:48:22.680] But it is flatter than a pancake.
[01:48:22.680 --> 01:48:23.560] That part is true.
[01:48:23.880 --> 01:48:24.440] All right.
[01:48:24.440 --> 01:48:25.880] Well, good job, guys.
[01:48:25.880 --> 01:48:27.000] Evan, give us a quote.
[01:48:27.000 --> 01:48:35.160] There is only one thing worse than coming home from the lab to a sink full of dirty dishes, and that's not going to the lab at all.
[01:48:35.160 --> 01:48:40.840] That's a wonderful quote by Shane Shwing Wu, experimental physicist.
[01:48:40.840 --> 01:48:49.840] She is considered the first, she was considered the first lady of physics, the Chinese Madame Curie, and the queen of nuclear research.
[01:48:44.520 --> 01:48:50.240] Awesome.
[01:48:50.560 --> 01:48:53.520] A very impressive resume she has.
[01:48:53.520 --> 01:48:56.720] She should have won a Nobel Prize in Physics.
[01:48:56.720 --> 01:49:08.080] She was part of a team that did win the 1957 Nobel Prize in Physics with two other Chinese researchers, but the men got the credit and she did not.
[01:49:09.120 --> 01:49:15.120] So, again, another injustice that should have been, should not have gone down that way.
[01:49:15.600 --> 01:49:17.040] Should not have happened that way.
[01:49:17.040 --> 01:49:17.520] All right.
[01:49:17.520 --> 01:49:18.800] Well, thank you, Evan.
[01:49:18.800 --> 01:49:19.280] Yep.
[01:49:19.280 --> 01:49:21.120] And thank you all for joining me this week.
[01:49:21.120 --> 01:49:21.440] Thanks, Steve.
[01:49:22.400 --> 01:49:23.440] Thank you, Steve.
[01:49:23.440 --> 01:49:25.120] And happy Thanksgiving, everyone.
[01:49:25.680 --> 01:49:26.400] Happy Thanksgiving.
[01:49:26.560 --> 01:49:28.080] Be safe out there, please.
[01:49:28.080 --> 01:49:33.840] Next episode will come out after Thanksgiving, and that'll be the episode we recorded while we were at Saikon.
[01:49:33.840 --> 01:49:34.480] Right.
[01:49:34.480 --> 01:49:37.520] Yep, and then we'll be back with a new episode in two weeks.
[01:49:37.520 --> 01:49:42.400] And until next week, this is your Skeptic's Guide to the Universe.
[01:49:44.960 --> 01:49:51.680] Skeptic's Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking.
[01:49:51.680 --> 01:49:56.320] For more information, visit us at theskepticsguide.org.
[01:49:56.320 --> 01:50:00.240] Send your questions to info at the skepticsguide.org.
[01:50:00.240 --> 01:50:10.880] And if you would like to support the show and all the work that we do, go to patreon.com/slash skepticsguide and consider becoming a patron and becoming part of the SGU community.
[01:50:10.880 --> 01:50:14.240] Our listeners and supporters are what make SGU possible.
Prompt 10: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 11: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
Full Transcript
[00:00:00.320 --> 00:00:04.640] The Skeptic's Guide to the Universe is brought to you by Progressive Insurance.
[00:00:04.640 --> 00:00:08.960] Do you ever think about switching insurance companies to see if you could save some cash?
[00:00:08.960 --> 00:00:14.480] Progressive makes it easy to see if you could save when you bundle your home and auto policies.
[00:00:14.480 --> 00:00:17.120] Try it at progressive.com.
[00:00:17.120 --> 00:00:19.920] Progressive casualty insurance company and affiliates.
[00:00:19.920 --> 00:00:21.440] Potential savings will vary.
[00:00:21.440 --> 00:00:23.600] Not available in all states.
[00:00:27.120 --> 00:00:30.320] You're listening to the skeptic's guide to the universe.
[00:00:30.320 --> 00:00:33.200] Your escape to reality.
[00:00:33.840 --> 00:00:36.480] Hello and welcome to the Skeptic's Guide to the Universe.
[00:00:36.480 --> 00:00:41.440] Today is Tuesday, November 19th, 2024, and this is your host, Stephen Novella.
[00:00:41.440 --> 00:00:43.120] Joining me this week are Bob Novella.
[00:00:43.120 --> 00:00:43.760] Hey, everybody.
[00:00:43.760 --> 00:00:45.120] Kara Santa Maria.
[00:00:45.120 --> 00:00:45.600] Howdy.
[00:00:45.600 --> 00:00:46.560] Jay Novella.
[00:00:46.560 --> 00:00:47.360] Hey, guys.
[00:00:47.360 --> 00:00:49.200] And Evan Bernstein.
[00:00:49.200 --> 00:00:50.720] Good evening, everyone.
[00:00:50.720 --> 00:00:51.680] How's everyone doing?
[00:00:51.680 --> 00:00:55.920] We're recording a little bit early because we're getting ready for the Thanksgiving break.
[00:00:56.240 --> 00:00:58.080] I'll be going away this weekend.
[00:00:58.080 --> 00:01:00.160] Yeah, I'm going to see my family in Denver.
[00:01:00.160 --> 00:01:03.520] Yeah, this is the holiday where the novella guys are all splitting up.
[00:01:03.520 --> 00:01:04.480] I'll go into our in-look.
[00:01:04.640 --> 00:01:04.960] Don't worry.
[00:01:04.960 --> 00:01:06.240] I'll hold down the state for you.
[00:01:06.240 --> 00:01:06.960] I'll be remaining.
[00:01:07.200 --> 00:01:07.680] Thank you.
[00:01:07.680 --> 00:01:08.160] Thank you.
[00:01:08.160 --> 00:01:08.960] I'll take care of things.
[00:01:08.960 --> 00:01:09.600] Just leave it to you.
[00:01:11.040 --> 00:01:12.160] I'm going to Oregon.
[00:01:12.160 --> 00:01:12.640] Oregon.
[00:01:12.640 --> 00:01:12.800] Oh.
[00:01:13.040 --> 00:01:13.760] Oregon.
[00:01:14.640 --> 00:01:17.280] I always mispronounce Oregon.
[00:01:17.280 --> 00:01:17.840] It's Oregon.
[00:01:18.720 --> 00:01:20.320] They don't want you to say Oregon.
[00:01:20.560 --> 00:01:21.360] No, it's Oregon.
[00:01:21.840 --> 00:01:22.240] Oregon.
[00:01:23.760 --> 00:01:26.720] And that's the people of the state who get to decide that, right?
[00:01:27.760 --> 00:01:28.880] I suppose that's fair.
[00:01:28.880 --> 00:01:35.280] So I know you guys have heard about the fact that the Onion has acquired InfoWars.
[00:01:35.760 --> 00:01:37.200] I mean, how wonderful is that?
[00:01:37.200 --> 00:01:37.920] Well, there's a wrinkle.
[00:01:37.920 --> 00:01:39.040] Did you guys hear the wrinkle?
[00:01:39.040 --> 00:01:39.680] No, what happened?
[00:01:39.920 --> 00:01:40.400] No.
[00:01:40.400 --> 00:01:42.080] It's being contested.
[00:01:42.240 --> 00:01:42.800] Of course it is.
[00:01:42.960 --> 00:01:43.600] Okay.
[00:01:44.160 --> 00:01:52.160] So, just as a quick review, InfoWars is the media outlet of Alex Jones, who got sued for lots of money.
[00:01:52.560 --> 00:02:02.920] It has to award like $1.5 billion to the families of the Sandy Hook massacre because of his hateful conspiracy theories, right?
[00:01:59.680 --> 00:02:05.240] One little bit of justice in the world.
[00:02:05.800 --> 00:02:11.400] And to pay off the money he owes, they're selling off his assets, including InfoWars.
[00:02:11.400 --> 00:02:21.720] And so that just was up for auction, and it was purchased by The Onion, or the parent company of The Onion, Global Tetrahedron is the name of the parent company.
[00:02:21.720 --> 00:02:27.240] And they plan on launching it as a basically satire of itself.
[00:02:27.240 --> 00:02:28.040] That's amazing.
[00:02:28.040 --> 00:02:29.000] Sometime next year.
[00:02:29.000 --> 00:02:31.240] It would be amazing, totally amazing.
[00:02:31.240 --> 00:02:42.760] But a company affiliated with Alex Jones is claiming that the bidding was not fair, that they did not have a fair opportunity to make a counterbid.
[00:02:42.760 --> 00:02:44.280] So now it has to go before the judge.
[00:02:44.280 --> 00:02:47.160] And hopefully this is just all a waste of time and it'll still happen.
[00:02:47.160 --> 00:02:47.960] A stall tactic.
[00:02:48.680 --> 00:02:53.160] The judge is not going to allow a company affiliated with Alex Jones to buy the company.
[00:02:53.160 --> 00:02:59.480] Well, their argument is that the goal of the bid is to get as much money for the families as possible.
[00:02:59.480 --> 00:03:02.680] And so if they have a higher bid, the judge should honor that.
[00:03:03.160 --> 00:03:05.560] But then are they just going to turn around and give it back to Alex Jones?
[00:03:05.560 --> 00:03:06.520] Well, they could.
[00:03:07.720 --> 00:03:08.520] I don't know.
[00:03:08.520 --> 00:03:16.280] Hopefully this won't sink the deal because it would be awesome if The Onion gets InfoWars.
[00:03:16.600 --> 00:03:26.040] Also, the people who sued Alex Jones, who were getting some of the money, essentially gave The Onion some of their money to get the bid up enough to win.
[00:03:26.040 --> 00:03:26.840] Oh, interesting.
[00:03:27.160 --> 00:03:28.840] So they're investors, in a sense.
[00:03:29.320 --> 00:03:31.880] I guess so, yeah, because they wanted this to happen, right?
[00:03:31.880 --> 00:03:36.600] And they didn't wanted to keep it away from Alex John, you know, a company affiliated with Alex Jones.
[00:03:36.600 --> 00:03:37.080] Okay.
[00:03:37.080 --> 00:03:37.400] All right.
[00:03:37.400 --> 00:03:40.360] Well, yeah, it sounds like a delay tactic more than anything to me.
[00:03:40.360 --> 00:03:42.280] Yeah, I mean, I don't know.
[00:03:42.280 --> 00:03:45.520] I think they legitimately want he wants to get it if he's got the money.
[00:03:45.680 --> 00:03:52.800] I don't know how much they paid for it, but I mean, it's, you know, look, people just don't, people will throw lawsuits out there like crazy today, you know?
[00:03:52.800 --> 00:03:54.640] Like, it's all BS.
[00:03:54.640 --> 00:03:56.000] Like, he didn't do it.
[00:03:56.000 --> 00:03:56.880] He lost it.
[00:03:56.880 --> 00:03:57.840] He wasn't on his game.
[00:03:57.840 --> 00:03:58.880] Whatever the problem was.
[00:03:58.880 --> 00:04:00.320] The time came and the time went.
[00:04:00.320 --> 00:04:02.320] It's all forecasted.
[00:04:02.320 --> 00:04:04.640] It's not like they, you know, secretly put it out.
[00:04:04.800 --> 00:04:06.560] Like, it was in the public eye.
[00:04:06.560 --> 00:04:07.120] That's it.
[00:04:07.120 --> 00:04:09.120] Yeah, we'll update you on if there's any.
[00:04:09.280 --> 00:04:12.160] I think it's going to be go before the judge next week.
[00:04:12.160 --> 00:04:12.400] Yeah.
[00:04:12.400 --> 00:04:12.880] Well, we'll see.
[00:04:13.120 --> 00:04:13.440] Okay.
[00:04:13.440 --> 00:04:14.160] Quick resolution.
[00:04:14.160 --> 00:04:14.640] Let's go.
[00:04:14.880 --> 00:04:16.720] Hopefully there'll be a good resolution.
[00:04:16.720 --> 00:04:16.880] All right.
[00:04:16.880 --> 00:04:22.480] Well, let's get into our news items because we have a great interview coming up later in the show with Kevin Fulta.
[00:04:22.480 --> 00:04:24.000] If you want to leave time for that.
[00:04:24.160 --> 00:04:31.440] I'm going to start us off by talking about giving robots a sense of self.
[00:04:31.440 --> 00:04:32.880] Oh, don't do that.
[00:04:33.520 --> 00:04:36.240] Well, brain circuit in there.
[00:04:36.720 --> 00:04:40.320] I've seen too many movies where that didn't work out as intended.
[00:04:40.320 --> 00:04:46.080] But don't they need that for like, I don't know, that embodied kind of thing, right?
[00:04:46.240 --> 00:04:51.440] So there was a recent paper published by three experts.
[00:04:51.440 --> 00:04:55.840] One is a robotic, a cognitive roboticist, right?
[00:04:55.840 --> 00:04:59.760] So they deal with AI that controls robots.
[00:04:59.760 --> 00:05:06.160] Another one is a cognitive psychologist who specializes in human-robot interactions.
[00:05:06.160 --> 00:05:08.160] And the third was a psychiatrist.
[00:05:08.160 --> 00:05:20.160] So they wrote a paper talking about the fact that we should be studying the elements of a human sense of self, the components that make that up in robots.
[00:05:20.160 --> 00:05:26.480] And then we could then do research on those elements in the robotic model.
[00:05:26.480 --> 00:05:34.040] To break this down, what's interesting about this, what I found interesting about this, obviously I'm a neuroscientist, I love the cognitive neurosciences, and I also love robots and AI.
[00:05:29.840 --> 00:05:35.080] So it all comes together.
[00:05:35.320 --> 00:05:40.280] What I've been saying for years is that AI is going to be a fantastic way to study neuroscience, right?
[00:05:40.280 --> 00:05:49.480] Because it essentially gives us an actual model that we could mess with, like what happens when we take this component out, or what happens when we dial this all the way up, or whatever.
[00:05:49.480 --> 00:05:52.360] And we could see how the pieces interact with each other.
[00:05:52.360 --> 00:05:55.000] But first, we have to know what all the pieces are, right?
[00:05:55.000 --> 00:06:00.520] So this is, you know, they're talking about what are the pieces of a sense of self.
[00:06:00.520 --> 00:06:03.960] Do you guys have, I mean, I've spoken about this before, but do you guys have any guesses?
[00:06:03.960 --> 00:06:16.280] Like, what do you think would be something very specific, like some circuit in the brain that doesn't create necessarily by itself a sense of self, but that would contribute to our sense of self?
[00:06:16.280 --> 00:06:23.240] Well, I think there's the obvious things like our proprioception, like our understanding of our own body map and where we are in space.
[00:06:23.720 --> 00:06:24.200] I don't know.
[00:06:24.200 --> 00:06:29.240] I did a whole podcast with this really interesting cognitive, I think he was a cognitive neuroscientist.
[00:06:29.240 --> 00:06:31.000] Ooh, he might have been a psychologist.
[00:06:31.000 --> 00:06:35.000] About how the concept of self only develops in relation to other.
[00:06:35.000 --> 00:06:38.280] And so you can't really have a sense of self if you're in a vacuum.
[00:06:38.280 --> 00:06:38.680] Right.
[00:06:38.680 --> 00:06:39.720] So that's correct.
[00:06:39.720 --> 00:06:45.400] And the different components, one that you're getting to, we would call is embodiment, right?
[00:06:45.400 --> 00:06:50.440] So there's a sense that you are in your body, right?
[00:06:50.440 --> 00:06:54.760] There's also a sense that you are separate from the rest of the universe.
[00:06:55.320 --> 00:06:59.560] At some point, you end and the rest of the universe begins.
[00:06:59.560 --> 00:07:02.280] What's interesting is that infants don't have that.
[00:07:02.840 --> 00:07:06.440] That develops after birth, like over time.
[00:07:06.680 --> 00:07:12.360] Then that module clicks in, and then they can see, like, this is my hand, and everything beyond that is not me.
[00:07:12.400 --> 00:07:13.960] You know, it's imagine that day in the world.
[00:07:14.200 --> 00:07:15.120] Oh, shit.
[00:07:16.800 --> 00:07:17.680] I'm not a god.
[00:07:17.680 --> 00:07:18.800] What the hell?
[00:07:14.600 --> 00:07:21.680] Well, you can see it when they recognize their own parts and stuff.
[00:07:22.000 --> 00:07:24.240] They start to see themselves in the mirror and all those good things.
[00:07:24.720 --> 00:07:26.000] They pass the mirror test.
[00:07:26.000 --> 00:07:31.680] So then another component is the sense that you own your body parts.
[00:07:31.680 --> 00:07:32.320] Oh, yeah.
[00:07:32.640 --> 00:07:34.400] And then agency, right?
[00:07:34.960 --> 00:07:35.840] Sometimes I rent out that.
[00:07:36.160 --> 00:07:38.240] Agency is a separate thing, Evan.
[00:07:38.240 --> 00:07:42.480] So then there's the sense that you control your body parts.
[00:07:42.480 --> 00:07:44.000] That's agency.
[00:07:44.000 --> 00:07:48.000] And then there's the sense that you can have an influence.
[00:07:48.000 --> 00:07:51.360] You can do stuff that influences the rest of the world, right?
[00:07:51.360 --> 00:07:54.640] You could do something that would affect something else.
[00:07:54.960 --> 00:08:00.880] And there's also the sense that other people have the same kind of sense of self that you have.
[00:08:00.880 --> 00:08:02.000] Yeah, that comes much later.
[00:08:02.000 --> 00:08:02.640] That comes later.
[00:08:02.640 --> 00:08:03.040] Yeah.
[00:08:03.520 --> 00:08:05.680] What does that have to do with your own sense of self?
[00:08:06.160 --> 00:08:08.720] Because you have to have a theory of mind, right?
[00:08:08.960 --> 00:08:15.520] And the theory of mind is that you have to know what an agent is in order to feel like you're an agent.
[00:08:15.520 --> 00:08:22.000] And if you are an agent, that has to mean that other people are also their own agents.
[00:08:22.000 --> 00:08:23.200] And they're not you.
[00:08:23.200 --> 00:08:24.960] They are separate from you.
[00:08:24.960 --> 00:08:39.920] The thing is, and I think a lot of people get tripped up who haven't, you know, who are not neuroscientists or who haven't thought about this, is that these things seem so fundamental, you might wonder, it's like, do we really need a circuit in the brain to make us feel that way?
[00:08:40.000 --> 00:08:41.680] These things are true.
[00:08:42.000 --> 00:08:43.120] But that's naive.
[00:08:43.120 --> 00:08:46.560] That's not how our consciousness works.
[00:08:46.560 --> 00:08:50.480] We don't feel that way simply because it's true.
[00:08:51.040 --> 00:08:57.920] We have to have it, it's a constructed, subjective experience that the brain has to actively make.
[00:08:57.920 --> 00:08:59.120] And it can be knocked out.
[00:08:59.120 --> 00:09:00.360] And it can get knocked out.
[00:08:59.440 --> 00:09:04.280] Mostly we know about these things when they get knocked out.
[00:09:04.600 --> 00:09:05.560] Brain injuries.
[00:08:59.840 --> 00:09:06.600] Injuries and drugs.
[00:09:06.760 --> 00:09:07.800] Those are two big ones, right?
[00:09:08.040 --> 00:09:11.080] And now we could do it with like transcranial magnetic stimulation.
[00:09:11.080 --> 00:09:13.880] We could turn off this circuit or turn off that circuit.
[00:09:13.880 --> 00:09:18.040] And so there are drugs, for example, that will make you have an out-of-body experience.
[00:09:18.040 --> 00:09:19.400] What's an out-of-body experience?
[00:09:19.400 --> 00:09:22.040] You lose your sense that you are in your body.
[00:09:23.720 --> 00:09:36.920] Or you feel like, and I remember when we interviewed a neuroscientist who did LSD, she said that when she took it, her body expanded to the size of the universe.
[00:09:37.400 --> 00:09:38.600] Shawn of Jennifer Willette.
[00:09:38.600 --> 00:09:41.000] Yeah, she and Sean did LSD for one of her books.
[00:09:41.000 --> 00:09:42.760] Although I don't think that's who that quote is from.
[00:09:42.760 --> 00:09:44.680] It's Susan Blackmore.
[00:09:44.680 --> 00:09:45.160] Oh, okay.
[00:09:45.640 --> 00:09:51.800] And but in any case, so what is that other than you not feeling you are separate from the universe?
[00:09:51.800 --> 00:09:53.720] You feel one with the universe, right?
[00:09:53.720 --> 00:10:00.680] Which feels like a spiritual experience, but it's just a breakdown of your sense that you are not one with the universe, right?
[00:10:00.680 --> 00:10:04.600] Which is, again, is an actively constructed sense that you have.
[00:10:04.600 --> 00:10:10.200] Because how can you have a sense of self unless you're distinguished from not self, right?
[00:10:10.520 --> 00:10:13.480] So you're in your body, self versus not-self.
[00:10:13.480 --> 00:10:15.720] And then we talk about alien hand syndrome.
[00:10:15.720 --> 00:10:17.000] What's alien hand syndrome?
[00:10:17.000 --> 00:10:22.840] The circuit that makes you feel as if you control your body is not working at some point.
[00:10:23.160 --> 00:10:24.760] How does that circuit work?
[00:10:24.760 --> 00:10:27.560] Well, first you decide you want to make a move.
[00:10:27.560 --> 00:10:31.160] You make a move, and then you feel and see the move.
[00:10:31.160 --> 00:10:34.440] And if it matches, your brain says you control your hand.
[00:10:34.440 --> 00:10:39.800] If it doesn't match, or that circuit is broken, you don't feel as if you are controlling your hand.
[00:10:39.800 --> 00:10:44.200] You feel like it's acting on its own agency, not your agency.
[00:10:44.200 --> 00:10:49.520] Even though subconsciously it may be your agency, you don't feel that it is.
[00:10:49.840 --> 00:10:59.040] So people will, with alien hand syndrome, will like be walking down the street and then their hand empties their pockets onto the sidewalk.
[00:10:59.840 --> 00:11:17.840] And there's the famous example in Oliver Sachs' book where a patient kept waking up on the floor and they discovered that he was, he thought that there was a cadaver leg in his bed and he would throw it out of the bed out of fear and then he would go with it because it was his own leg.
[00:11:18.160 --> 00:11:19.760] That's a different phenomenon.
[00:11:19.760 --> 00:11:20.720] That's neglect.
[00:11:20.720 --> 00:11:21.440] That's neglect.
[00:11:21.440 --> 00:11:36.400] Yeah, so that's the sense that a part of your body doesn't belong to you because it's not part of, because if you feel like a stroke in your right hemisphere, you won't know that the left side of your body belongs to you.
[00:11:36.720 --> 00:11:41.760] So you're talking about the difference between it feeling like it's possessed by somebody else or feeling like it's not possessed at all?
[00:11:41.760 --> 00:11:46.160] Well, so this is the difference between feeling like you own it versus feeling like you control it.
[00:11:46.160 --> 00:11:49.280] Alien hand syndrome is the lack of feeling that you control it.
[00:11:49.760 --> 00:11:50.960] And then neglect, it's a lack of feeling.
[00:11:51.200 --> 00:11:54.720] Neglect can result in a lack of sense that you own it.
[00:11:54.720 --> 00:11:58.720] So, yeah, and what patients will say, there's another patient in the bed with me, because that's not my leg.
[00:11:58.720 --> 00:12:00.240] It must be another patient's leg.
[00:12:00.240 --> 00:12:10.320] And the quickie bedside test we do is we take their paralyzed hand, you know, the one that they're neglecting, we hold it in front of their face, and we say, whose hand is this?
[00:12:10.320 --> 00:12:15.120] And they invariably say, that's your hand, even though you're showing them their own hand.
[00:12:15.440 --> 00:12:19.360] Because they don't feel like it's part of them, and therefore it isn't, right?
[00:12:19.760 --> 00:12:24.240] But a good example of both versions, or of like the two, how those things are actually separate.
[00:12:24.240 --> 00:12:24.560] They are.
[00:12:24.560 --> 00:12:29.880] They are distinct phenomena, although obviously these are all related, but these are the components that are distinct.
[00:12:29.880 --> 00:12:34.760] Steve, if you put on an alien costume, would that enhance your alien hand syndrome?
[00:12:34.760 --> 00:12:36.120] No, it wouldn't.
[00:12:29.440 --> 00:12:36.760] It's not necessary.
[00:12:37.240 --> 00:12:38.840] Now, one chuckle from anybody.
[00:12:38.840 --> 00:12:39.000] No.
[00:12:39.320 --> 00:12:40.280] It would look cool.
[00:12:41.720 --> 00:12:43.400] I was laughing on the inside.
[00:12:46.760 --> 00:12:48.520] A couple other interesting wrinkles here.
[00:12:48.520 --> 00:12:54.440] You can have a sense of ownership over a part of the body you don't, that doesn't exist, right?
[00:12:54.440 --> 00:12:58.920] So you can have what's called a supernumerary phantom limb.
[00:12:59.240 --> 00:13:00.280] So this happened.
[00:13:00.280 --> 00:13:07.640] These are now like you have a stroke, the ownership module gets disconnected from the paralyzed limb, but it's still working.
[00:13:07.640 --> 00:13:10.120] It's just not getting any feedback from your arm.
[00:13:10.120 --> 00:13:11.720] So it makes up an arm.
[00:13:12.840 --> 00:13:18.440] You have a phantom limb, and it's separate from your real physical limb.
[00:13:18.440 --> 00:13:20.360] And you feel like you own it.
[00:13:20.360 --> 00:13:21.240] It's part of you.
[00:13:21.240 --> 00:13:22.600] You control it.
[00:13:22.600 --> 00:13:27.080] You just can't obviously manipulate external reality with it because it exists only as a figment in your mind.
[00:13:27.320 --> 00:13:29.880] What about your homunculus in that scenario, Steve?
[00:13:29.880 --> 00:13:33.240] Yeah, I mean, the homunculus is somewhat plastic, right?
[00:13:33.720 --> 00:13:35.320] You can rearrange itself.
[00:13:35.320 --> 00:13:41.000] So it would have, by definition, would that have adapted to that phantom supernumerary limb?
[00:13:41.000 --> 00:13:42.840] They tend to go away over time.
[00:13:42.840 --> 00:13:45.960] So they tend not to persist indefinitely.
[00:13:47.560 --> 00:13:51.960] The most extreme case was somebody who had four phantom limbs.
[00:13:52.280 --> 00:13:54.760] Four supernumerary phantom limbs.
[00:13:54.760 --> 00:13:58.280] So he was literally Doc Ock because he had eight limbs.
[00:13:58.280 --> 00:14:00.040] But four of them weren't real.
[00:14:00.040 --> 00:14:09.480] So, you know, you know, and there's also phantom limb when you you like have a paralyzed arm, or you have like an amputated arm rather, and you still feel like it's there.
[00:14:09.480 --> 00:14:14.880] Because the arm is physically gone, but your ownership module still owns the limb.
[00:14:14.600 --> 00:14:18.240] It's that circuit is still there, even though there's no physical arm in place.
[00:14:18.560 --> 00:14:19.920] And Kare, have you ever heard of this?
[00:14:19.920 --> 00:14:28.160] I forget what the technical name of it is, but there's a very rare syndrome where people feel like they don't own parts of their actual body.
[00:14:28.160 --> 00:14:34.560] And they often present to doctors saying, I want you to amputate this thing attached to me.
[00:14:35.040 --> 00:14:35.840] Oh, interesting.
[00:14:35.840 --> 00:14:36.800] Because it's not me.
[00:14:36.800 --> 00:14:38.880] And it makes them feel very uncomfortable.
[00:14:38.880 --> 00:14:43.840] And there's a huge ethical discussion going on as to what is the appropriate thing to do about it.
[00:14:44.880 --> 00:14:45.840] Like their liver?
[00:14:46.160 --> 00:14:48.800] No, like they're not.
[00:14:48.880 --> 00:14:51.840] Like, let's say they don't want their, yeah, they're pinky.
[00:14:53.120 --> 00:14:55.600] But imagine feeling like your left arm is not you.
[00:14:55.600 --> 00:14:57.440] It's attached to you.
[00:14:57.760 --> 00:15:02.720] And there's something neurological going on, but they're probably often misdiagnosed as having some form of psychosis.
[00:15:02.720 --> 00:15:02.960] Right.
[00:15:02.960 --> 00:15:03.360] Yeah, where do you think that's a good idea?
[00:15:04.000 --> 00:15:05.200] Medicines help with that.
[00:15:06.160 --> 00:15:07.440] Not if that's not what's wrong.
[00:15:07.440 --> 00:15:09.760] He's saying there's a circuit in their brain that's faulty.
[00:15:09.760 --> 00:15:10.000] Right.
[00:15:10.400 --> 00:15:11.120] And it's so rare.
[00:15:11.120 --> 00:15:14.720] We don't really have a lot of research into it, so I don't know that we have it all fleshed out.
[00:15:14.720 --> 00:15:17.680] And certainly not a lot of treatment trials or anything with it.
[00:15:17.680 --> 00:15:18.240] Yeah.
[00:15:18.240 --> 00:15:23.040] So all of these components can be disconnected from each other is the cool part.
[00:15:23.040 --> 00:15:44.800] So now getting back to the robot thing with the paper, what they want to do is say, yeah, now that we kind of know all of these components, mainly from people with strokes or people on drugs or whatever, wherever these circuits go off for one reason or another, or anoxia or whatever, you know, this is, again, part of, often part of a near-death experience, for example, the out-of-body experience.
[00:15:44.800 --> 00:15:54.000] Let's see if we can replicate these components in a robot, make a robot feel as if it's inside its robot body.
[00:15:54.000 --> 00:15:56.080] And so they could replicate the circuits.
[00:15:56.080 --> 00:16:07.640] For example, for the agency module, you could say, like, the robot knows what it wants to do, meaning that at least there's the circuit that says you're going to raise your right arm.
[00:16:07.640 --> 00:16:11.560] So that information is in the AI controlling the robot.
[00:16:11.560 --> 00:16:20.920] It then will raise the arm, and then you have sensors that feed back to say how is the limb moving and does it match what you intended to do.
[00:16:20.920 --> 00:16:26.040] And if it does match, you have to give some kind of positive feedback to the algorithm.
[00:16:26.040 --> 00:16:33.160] And so that would essentially mimic that loop that exists in a human brain that makes you feel as if you own your body part, right?
[00:16:33.320 --> 00:16:35.080] Aren't they kind of doing that now, though?
[00:16:35.080 --> 00:16:46.600] To some extent, but they want to explicitly try to replicate these components of a sense of self that humans have in a robot and then see how that influences the robot's behavior.
[00:16:46.600 --> 00:16:51.240] Maybe it will improve the robot's ability to control its movements, for example.
[00:16:51.240 --> 00:16:53.960] Maybe it will have some consequences.
[00:16:54.520 --> 00:16:57.480] Probably we evolved these things for a reason.
[00:16:57.800 --> 00:17:00.920] Maybe give it a psychosis, some robotic psychosis.
[00:17:00.920 --> 00:17:01.480] Yeah.
[00:17:01.800 --> 00:17:09.400] Now, the question is: now, if we want to keep going forward with this, if you add all these things together, what does that add up to?
[00:17:09.720 --> 00:17:16.360] Does it add up to an actual sense of self that the robot has?
[00:17:16.360 --> 00:17:27.320] Now, I don't think that if we had all these circuits disconnected from each other and not connected to anything, that was also trying to replicate consciousness.
[00:17:27.320 --> 00:17:28.200] Consciousness, yeah.
[00:17:28.200 --> 00:17:28.520] Yeah.
[00:17:28.520 --> 00:17:29.960] So, and what is consciousness?
[00:17:29.960 --> 00:17:33.720] Again, that we could try to replicate that in an AI/slash robot.
[00:17:33.720 --> 00:17:39.320] That is, you know, essentially, if I had to strip it down, what we know now is wakeful consciousness.
[00:17:39.320 --> 00:17:42.760] If we're talking about clinically now, what is wakeful consciousness?
[00:17:42.760 --> 00:17:48.560] It is a constant communication that the brain is having with itself, right?
[00:17:48.800 --> 00:17:55.680] There's just this constant loop of neurological activity, and it's being activated by the brainstem.
[00:17:55.680 --> 00:18:03.920] The brainstem is constantly giving your cortex a kick in the ass, and then every time something happens, it leads to something else, which leads to something else, which leads to something else.
[00:18:03.920 --> 00:18:10.400] And if that just keeps happening, that chain of neurological events is your stream of consciousness, right?
[00:18:10.400 --> 00:18:21.760] And it's taking in sensory information, it's taking information from your own body, it's taking in information from other parts of your brain that are communicating with each other, and it produces the stream of consciousness, right?
[00:18:21.760 --> 00:18:23.680] That's sort of self-propelling.
[00:18:23.680 --> 00:18:27.440] And it's not just excitatory, it's like there's a lot of inhibitory action going on.
[00:18:27.440 --> 00:18:30.640] Well, yeah, then at the high levels, there's a lot of stuff's happening.
[00:18:30.640 --> 00:18:34.000] You have to sort of inhibit the stuff to control your behavior.
[00:18:34.000 --> 00:18:35.360] So it's not just chaos.
[00:18:35.600 --> 00:18:37.040] Or a massive seizure.
[00:18:37.040 --> 00:18:41.040] Yeah, well, that's the seizures are inhibited at a really basic level.
[00:18:41.040 --> 00:18:52.560] Like if you, every time you, like one neural circuit, bunch of clump of neurons sends a signal to another clump of neurons, they inhibit all of the adjacent neurons.
[00:18:52.560 --> 00:18:59.920] So that's to keep it from spreading outside of the circuit to prevent seizures and prevent what we call ephaptic transmission.
[00:18:59.920 --> 00:19:01.440] Ever hear that term, Kara?
[00:19:01.440 --> 00:19:02.080] Ephactic.
[00:19:02.240 --> 00:19:03.040] Ephaptic.
[00:19:03.040 --> 00:19:03.680] EPH.
[00:19:03.760 --> 00:19:04.160] Ephactic.
[00:19:04.240 --> 00:19:04.880] Ephaptic.
[00:19:04.880 --> 00:19:11.600] Yeah, basically meaning not through a circuit, but just spreading it to adjacent neurons.
[00:19:12.320 --> 00:19:21.040] Yeah, seizures are when the bunch of cells or neurons are firing, not along pathways, but just because they're all next to each other and they're just all firing.
[00:19:21.360 --> 00:19:21.600] Yep.
[00:19:21.600 --> 00:19:22.400] It's bad.
[00:19:22.400 --> 00:19:22.880] Right.
[00:19:22.880 --> 00:19:24.560] They're not good for the brain for that to happen.
[00:19:24.560 --> 00:19:24.960] Yeah.
[00:19:24.960 --> 00:19:27.200] And here's the final question I want to leave you guys with.
[00:19:27.520 --> 00:19:36.920] Is there a difference between a general sentient AI that exists only virtually and one that's in a robot?
[00:19:36.920 --> 00:19:42.680] And is it what would an AI that exists only virtually be like?
[00:19:42.680 --> 00:20:00.600] Now, the gray zone in between these two states is what I call the max headroom thing, which is that you could have a virtual body, an AI that's not in a robot, that's just on a computer, could have a virtual body and could have all of these sense of self modules running with the virtual body.
[00:20:00.600 --> 00:20:01.880] But what if you didn't do that?
[00:20:01.880 --> 00:20:05.560] What if you had none of these sense of self circuits running?
[00:20:05.560 --> 00:20:07.080] You just had the AI.
[00:20:07.080 --> 00:20:09.160] What would it experience?
[00:20:09.160 --> 00:20:11.160] And would that be sustainable?
[00:20:11.160 --> 00:20:11.960] Would that be functional?
[00:20:11.960 --> 00:20:17.000] Or do we really need to embody it in order for it to be a functioning self-aware AI?
[00:20:17.320 --> 00:20:28.440] Well, and I guess to make it even more complicated, if we're talking now about software, not hardware, software can control external hardware.
[00:20:28.440 --> 00:20:39.800] So even if it's not a robot body, if that AI has access to the grid or it has access to a server, could it then embody other machinery?
[00:20:39.960 --> 00:20:41.880] Yeah, maybe your house is its body.
[00:20:41.880 --> 00:20:42.280] Yeah.
[00:20:42.280 --> 00:20:43.800] Or a spaceship is its body.
[00:20:44.120 --> 00:20:45.720] That's Futurama right there.
[00:20:45.720 --> 00:20:46.920] Yeah, that's the best thing.
[00:20:47.080 --> 00:20:49.880] The spaceship is the, is the machine, is the machine.
[00:20:50.040 --> 00:20:51.720] But it's embodied in something.
[00:20:51.720 --> 00:20:52.840] Yeah, it's embodied in something.
[00:20:53.000 --> 00:20:55.880] It's embodied in something, but in a much more, I think, useful way.
[00:20:55.880 --> 00:20:59.880] Like that, to me, you could do a lot more with that than like a robot in a humanoid body.
[00:20:59.880 --> 00:21:01.560] Well, was HAL 9000 that?
[00:21:01.560 --> 00:21:05.080] I think so, because it had total control over the ship.
[00:21:05.560 --> 00:21:05.880] Yeah.
[00:21:06.280 --> 00:21:08.840] Yeah, except it was indistinguishable.
[00:21:08.800 --> 00:21:10.120] It seemed to be indistinguishable.
[00:21:10.280 --> 00:21:11.160] Anyway, it's fascinating.
[00:21:11.400 --> 00:21:16.320] The thing that's interesting is that we will be able to investigate all of these questions once we do it, right?
[00:21:16.320 --> 00:21:31.280] We could speculate now, but once we do it, we'll get a much better sense of how this sense of self and embodiment affects artificial intelligence, whether it's virtual or just in the void or if it's embodied in a robot.
[00:21:31.280 --> 00:21:32.400] We will see.
[00:21:32.400 --> 00:21:33.120] Cool.
[00:21:33.120 --> 00:21:39.840] All right, Kara, tell us about the new energy secretary, or at least the proposed new energy secretary.
[00:21:40.000 --> 00:21:42.320] You sound so happy when you say that.
[00:21:43.920 --> 00:21:49.520] I feel like this is going to be a new series because as I was prepping this, I found out that Dr.
[00:21:49.520 --> 00:21:53.920] Oz has been selected for Medicare and Medicaid.
[00:21:53.920 --> 00:21:54.560] Oh, what?
[00:21:57.040 --> 00:22:00.720] But I did not do a deep dive for that one.
[00:22:00.720 --> 00:22:01.600] So just set that up.
[00:22:01.840 --> 00:22:04.320] I thought a board of trustees dealt with that.
[00:22:04.320 --> 00:22:05.440] I guess there's somebody at the top.
[00:22:05.840 --> 00:22:07.200] There's always somebody at the top.
[00:22:07.520 --> 00:22:10.080] There's always somebody appointed who's in charge.
[00:22:10.080 --> 00:22:12.000] So instead, we're not going to talk about Dr.
[00:22:12.000 --> 00:22:13.760] Oz, at least not this week.
[00:22:14.080 --> 00:22:18.240] We will be talking about Chris Wright.
[00:22:18.640 --> 00:22:21.840] His full name is Christopher Allen Wright.
[00:22:21.840 --> 00:22:25.040] He's the CEO of Liberty Energy.
[00:22:25.040 --> 00:22:37.760] It's the second largest fracking company in the U.S., and he is the presumptive nominee for United States Secretary of Energy under this next Trump presidency.
[00:22:37.760 --> 00:22:42.400] He's obviously got a lot of experience in the energy sector.
[00:22:42.400 --> 00:22:51.520] He is a board member of a nuclear energy company, also a board member of a royalty payment company for mineral rights and mining rights.
[00:22:51.520 --> 00:22:56.480] But there's a little bit of a wrinkle in that he does not believe in climate change.
[00:22:56.480 --> 00:22:58.560] Didn't he also work for a solar company, too?
[00:22:58.560 --> 00:22:59.040] I heard?
[00:22:59.040 --> 00:23:03.320] He's worked for, yeah, he's been on boards and worked for like companies across the board.
[00:23:03.480 --> 00:23:18.600] And that's what Trump really pushed when he did his post on, I think, Truth Social, where he said, I'm thrilled to announce that Chris Wright will be joining my administration as both United States Secretary of Energy and member of the newly formed Council of National Energy.
[00:23:18.600 --> 00:23:21.560] He's been a leading technologist and entrepreneur in energy.
[00:23:21.560 --> 00:23:24.840] He's worked in nuclear, solar, geothermal, and oil and gas.
[00:23:25.080 --> 00:23:27.880] He is an oil and gas executive.
[00:23:27.880 --> 00:23:31.880] He is a firm believer.
[00:23:31.880 --> 00:23:39.880] Well, I actually, it's hard to know what somebody actually believes in their mind, but he's a firm proponent, not proponent, that's not the right word either.
[00:23:40.200 --> 00:23:48.280] He claims that there are no negative impacts from fossil fuel energy on the climate.
[00:23:48.920 --> 00:23:49.480] Wow.
[00:23:49.880 --> 00:23:51.320] That's a remarkable statement.
[00:23:51.640 --> 00:23:52.360] Pablo.
[00:23:52.360 --> 00:23:52.920] Thank you.
[00:23:52.920 --> 00:23:53.400] Despite that.
[00:23:53.640 --> 00:24:11.000] He claims in a video that he posted on his LinkedIn, and this is what he labeled the video: five commonly used words around energy and climate that are both deceptive and destructive: climate crisis, energy transition, carbon pollution, clean energy, and dirty energy.
[00:24:11.000 --> 00:24:13.240] Hashtag energy sopriety.
[00:24:13.240 --> 00:24:23.240] So he claims that, quote, we have seen no increase in the frequency or intensity of hurricanes, tornadoes, droughts, or floods, despite endless fear-mongering.
[00:24:23.240 --> 00:24:26.200] He says that there is no climate crisis.
[00:24:26.200 --> 00:24:41.640] And he goes on in this 12 and a half minute video that he posted to his LinkedIn about a year ago to basically argue that carbon dioxide cannot be a pollutant and carbon dioxide cannot have all of these downstream negative consequences because it's natural.
[00:24:42.600 --> 00:24:47.680] Because it's a natural phenomenon that occurs via photosynthesis.
[00:24:47.680 --> 00:24:49.600] Right, which is a nonsensical argument.
[00:24:49.600 --> 00:24:50.880] Yeah, and respiration.
[00:24:44.680 --> 00:24:51.120] Yeah.
[00:24:51.440 --> 00:24:53.200] So it's pretty scary.
[00:24:53.200 --> 00:25:02.400] He says that there is no climate crisis and the negative impacts from climate change, because of course he can't fully argue that climate change doesn't exist.
[00:25:02.400 --> 00:25:04.400] Like there are very few people who do that now.
[00:25:04.400 --> 00:25:06.400] Instead, they've sort of moved the goalposts.
[00:25:06.400 --> 00:25:11.760] And he says that the negative impacts from climate change are less than the benefits of using fossil fuels.
[00:25:11.760 --> 00:25:22.480] So he is a firm believer that we need to continue to drill, we need to continue to burn, that these approaches to energy are going to allow us energy independence.
[00:25:22.480 --> 00:25:26.640] And according to Donald Trump's Truth Social Post, energy, U.S.
[00:25:26.720 --> 00:25:29.280] energy dominance, he put it in all caps.
[00:25:30.640 --> 00:25:31.200] Dominant.
[00:25:31.520 --> 00:25:35.520] Yeah, which is a large goal of the administration.
[00:25:35.840 --> 00:25:41.120] You know, to be fair, Chris Wright has worked in alternative energy.
[00:25:41.120 --> 00:25:45.440] He works in energy, which means he's worked in renewables and non-renewables.
[00:25:45.600 --> 00:25:52.400] It does appear, I cannot speak for him, but it does appear that the motivation here is money.
[00:25:52.720 --> 00:25:54.480] It's not clean energy.
[00:25:54.480 --> 00:25:56.880] It's just energy, right?
[00:25:57.200 --> 00:26:04.640] And however it's going to be the most lucrative and the easiest to produce that energy is going to be the path.
[00:26:04.640 --> 00:26:10.320] And that's what's so scary because we know the cost now of natural gas.
[00:26:10.320 --> 00:26:13.360] We know the cost of crude oil.
[00:26:13.360 --> 00:26:16.000] We know the cost of fracking.
[00:26:16.000 --> 00:26:19.520] And these just aren't arguments that he's making.
[00:26:19.520 --> 00:26:25.520] And he's going to have inordinate power if he's confirmed by the Senate to lead the Department of Energy.
[00:26:25.760 --> 00:26:27.040] He will 100% be confirmed.
[00:26:27.040 --> 00:26:28.720] There's no way that they're not going to confirm him.
[00:26:28.720 --> 00:26:29.760] Oh, God, don't say that.
[00:26:30.200 --> 00:26:31.400] But let me tell you this, Kara.
[00:26:31.400 --> 00:26:34.520] I'm going to make an argument for why this is not as bad as it seems.
[00:26:34.840 --> 00:26:35.400] I know where you're going.
[00:26:35.640 --> 00:26:36.120] Good luck.
[00:26:37.800 --> 00:26:41.000] And I'm not just comparing him to the other secretary, whatever other people.
[00:26:41.240 --> 00:26:42.120] No, I know where you're going with this.
[00:26:42.440 --> 00:26:43.000] It's an argument.
[00:26:43.320 --> 00:26:43.720] It's an argument.
[00:26:44.120 --> 00:26:44.760] Here's my argument.
[00:26:44.920 --> 00:26:45.640] Let me put it out.
[00:26:46.120 --> 00:26:51.640] Obviously, it's bad to have somebody in that position who just straight up denies the science, right?
[00:26:51.640 --> 00:26:52.680] That's not a good thing.
[00:26:52.680 --> 00:27:02.520] And this will absolutely be a setback, and it would be worse, obviously, than if we had somebody who was fully on board with transitioning away from fossil fuels, which he isn't.
[00:27:03.320 --> 00:27:15.240] But at this point in time, essentially, we have two strategies for transitioning to renewable energy, green energy, low-carbon energy, and away from fossil fuels.
[00:27:15.240 --> 00:27:23.960] And these are not mutually exclusive, but there's some combination of reducing supply and reducing demand for fossil fuels, right?
[00:27:24.280 --> 00:27:29.640] So far, we are not taking the reduce the supply approach.
[00:27:29.640 --> 00:27:41.480] Under the Biden administration, the United States is producing more fossil fuels than any other country at any time ever in human history, including during Trump's administration.
[00:27:42.360 --> 00:27:43.400] So we're right on plan then.
[00:27:43.720 --> 00:27:46.440] Is that a function of just there being more people in more need?
[00:27:46.680 --> 00:27:48.760] It's a function of, you know what it is?
[00:27:48.760 --> 00:27:51.160] It's a function of Russia invading Ukraine.
[00:27:51.160 --> 00:27:59.960] So we, of course, we jacked up our oil and gas production to essentially displace Russia's sales to Europe.
[00:28:00.440 --> 00:28:09.560] We're trying to replace Russia's sales to Europe of natural gas and oil, and that put us over the top to more production than we've ever done before.
[00:28:09.560 --> 00:28:14.880] So, the idea that it's always been silly for Trump to say, we're going to bring oil back and we're going to be dominant.
[00:28:14.880 --> 00:28:15.760] We're already there, dude.
[00:28:14.600 --> 00:28:20.560] He's already producing more oil than we ever have before, or anyone has ever had.
[00:28:20.880 --> 00:28:22.800] But Biden didn't do that willy-nilly.
[00:28:23.120 --> 00:28:24.080] No, he did it deliberately.
[00:28:24.320 --> 00:28:26.240] He was deliberate with success.
[00:28:26.400 --> 00:28:28.320] But the point is, we're already there.
[00:28:28.320 --> 00:28:30.640] We're already producing all this oil.
[00:28:30.640 --> 00:28:33.280] And we're doing that to keep prices down.
[00:28:33.280 --> 00:28:38.720] Now, keeping prices down is actually a good thing because it lowers the value of pulling that oil out of the ground.
[00:28:38.720 --> 00:28:45.760] It's also good because it takes money away from authoritarians who are basically funded by the sale of fossil fuels.
[00:28:45.760 --> 00:28:46.240] Right.
[00:28:46.480 --> 00:28:50.720] So, ideally, ideally, we will reduce demand first.
[00:28:50.720 --> 00:28:52.240] How do we reduce demand first?
[00:28:52.240 --> 00:28:58.560] Because then that again reduces the cost, reduces the value of fossil fuel and the incentive to go after it.
[00:28:58.560 --> 00:29:02.160] You do that by transitioning to green energy, right?
[00:29:02.160 --> 00:29:09.920] So, fewer cars that run on gas, fewer cars that are running gas, and then fewer coal and gas-powered energy production.
[00:29:09.920 --> 00:29:19.520] So, for right now, it's more important that we build non-fossil fuel resources than that we restrict fossil fuel.
[00:29:19.520 --> 00:29:22.320] Eventually, we have to dial down the fossil fuel.
[00:29:22.320 --> 00:29:31.280] But for now, if we're just investing in expanding our non-fossil fuel infrastructure, that's fine.
[00:29:31.280 --> 00:29:42.560] And so, my hope is: again, this is just a hope, but there has been a lot of good news recently on the nuclear power front, which I just summarized in my blog post.
[00:29:42.560 --> 00:29:53.600] The Biden administration and also a consortium of countries around the world have pledged to triple nuclear power capacity by 2050.
[00:29:53.600 --> 00:29:54.320] Good.
[00:29:54.320 --> 00:29:55.040] Wow.
[00:29:55.040 --> 00:29:55.600] Triple.
[00:29:55.600 --> 00:29:56.560] That's huge.
[00:29:56.560 --> 00:29:57.600] That is huge.
[00:29:57.600 --> 00:29:58.880] It's a big piece of the puzzle.
[00:29:59.320 --> 00:30:09.000] Yeah, now, if by 2050 we have a 50% increase in our energy demand, that means doubling the nuclear percentage of production.
[00:30:09.000 --> 00:30:11.320] So right now we're about 19, 20%.
[00:30:11.320 --> 00:30:15.960] So we're talking about going to about 40% nuclear worldwide and in the U.S.
[00:30:16.200 --> 00:30:18.760] And that's probably where we should be.
[00:30:18.760 --> 00:30:26.840] So I don't know of any reason why this guy or why the Trump administration is going to undo nuclear.
[00:30:27.000 --> 00:30:28.840] No, I think he's pro-nuclear.
[00:30:28.920 --> 00:30:29.640] They're pro-nuclear.
[00:30:29.800 --> 00:30:31.240] This has broad bipartisan support.
[00:30:31.320 --> 00:30:34.200] So yeah, this has broad bipartisan support.
[00:30:34.200 --> 00:30:40.280] So as long as this keeps happening, that could keep us on pace to where we need to get by 2050.
[00:30:41.000 --> 00:30:43.240] It may not be good for the solar or the wind industry.
[00:30:43.560 --> 00:30:44.120] I get that.
[00:30:44.120 --> 00:30:46.120] That's where I'm more concerned.
[00:30:46.120 --> 00:30:57.000] But here's the thing: some people have argued that because wind and solar are currently the cheapest form of new energy to add to the grid, that it doesn't need a lot of subsidies at this point in time.
[00:30:57.000 --> 00:30:59.560] Companies are doing it because it's the cheapest.
[00:30:59.560 --> 00:31:10.520] And so hopefully that will have inertia unless they actively try to inhibit it, which they may, which Trump may just decide to mess with the wind industry just to do it because he doesn't like wind.
[00:31:10.920 --> 00:31:12.520] It kills birds, you know, and stuff.
[00:31:12.520 --> 00:31:15.720] I don't think this guy would do that because, as you say, he's kind of neutral.
[00:31:16.040 --> 00:31:22.040] Yeah, I think he's neutral about the source, like from a moralistic perspective, but that's actually a bad thing.
[00:31:22.360 --> 00:31:23.080] Yeah, I agree.
[00:31:23.080 --> 00:31:36.840] But I mean, but for now, doing the all of the above so that at least the renewables and the nuclear and the geothermal and the hydroelectric can still grow and expand, it may not be that much of a disaster, is what I'm saying.
[00:31:36.840 --> 00:31:38.040] It may not be bad.
[00:31:38.600 --> 00:31:39.160] I don't know.
[00:31:39.160 --> 00:31:42.760] I think that he wants to fully deregulate oil and gas.
[00:31:42.840 --> 00:31:47.440] They're going to deregulate, but they're also probably going to deregulate nuclear and deregulate solar and wind, too.
[00:31:44.840 --> 00:31:49.360] Yeah, but that's not going to.
[00:31:49.520 --> 00:31:56.160] So all of those things, yes, are going to make for more competition for alternative sources in the marketplace.
[00:31:56.160 --> 00:31:59.680] But what they don't do is they do not mitigate the pollution.
[00:32:00.000 --> 00:32:00.480] Of course.
[00:32:00.480 --> 00:32:00.880] Absolutely.
[00:32:01.680 --> 00:32:04.400] That is what is actually causing the climate crisis.
[00:32:04.400 --> 00:32:05.040] I agree.
[00:32:05.040 --> 00:32:14.320] But I think though my point is it's really complicated to try to figure out over the next four years what the net effect of this is going to be.
[00:32:14.320 --> 00:32:22.080] And if they continue to expand the non-fossil fuel infrastructure, it may not be that dramatic a difference.
[00:32:22.080 --> 00:32:32.240] And if we are in a much better place in terms of more nuclear, more wind, more solar in four years, that might be a better time to start really thinking of ways to dial back fossil fuels.
[00:32:32.240 --> 00:32:34.160] It wasn't going to happen in the short term anyway.
[00:32:34.160 --> 00:32:35.600] It wasn't happening under Biden.
[00:32:35.600 --> 00:32:38.000] It's definitely not going to happen under Trump.
[00:32:38.000 --> 00:32:42.880] So how quickly does, I mean, don't these nuclear plants take quite a while to get online.
[00:32:43.200 --> 00:32:48.960] But part of what Biden is already doing, he also put together, I mean, there's so many things going on.
[00:32:48.960 --> 00:32:53.520] So he announced $900 million to support startup Gen 3 nuclear reactors.
[00:32:53.520 --> 00:33:00.960] He was part of 25 signatories pledging tripling nuclear capacity by 2050.
[00:33:00.960 --> 00:33:13.360] And also there's the Advance Act, which was just passed with bipartisan support, which streamlines regulations and also provides sweeping support for the nuclear industry.
[00:33:13.360 --> 00:33:24.320] So they're trying to figure out ways, specifically, there's a commission, it's like figure out ways to make us be able to build nuclear reactors cheaper and faster and to streamline all the regulations.
[00:33:24.320 --> 00:33:25.680] That's already happening.
[00:33:25.680 --> 00:33:30.000] Again, I don't see that being undone under this guy or Trump.
[00:33:30.200 --> 00:33:45.400] You know, and this is my spidey senses picking up, but like, while I agree that the regulatory burden is high right now, and we've talked about this on the show before, and there does need to be some streamlining, I think that it is still very important.
[00:33:45.720 --> 00:33:46.200] Absolutely.
[00:33:46.200 --> 00:33:46.920] You can go too far.
[00:33:47.400 --> 00:33:48.200] Safely.
[00:33:48.200 --> 00:33:48.600] Absolutely.
[00:33:48.760 --> 00:33:54.120] And I'm very worried that too much streamlining could lead to disaster.
[00:33:54.120 --> 00:33:55.560] And we should be worried about that.
[00:33:55.560 --> 00:33:56.200] We should be worried about that.
[00:33:56.280 --> 00:33:59.480] Because then that will set us back, like the last disaster set us back.
[00:33:59.720 --> 00:34:01.720] So the devil's always going to be in the details.
[00:34:01.720 --> 00:34:03.560] And there is like a nuclear industry, too.
[00:34:03.560 --> 00:34:06.520] You know, they don't necessarily want to build unsafe reactors.
[00:34:06.520 --> 00:34:07.080] That's what I'm saying.
[00:34:07.080 --> 00:34:10.760] The net effect, it's hard to really calculate the net effect of all of this.
[00:34:10.760 --> 00:34:15.720] So, yes, they probably, because Trump deregulates with a machete, not with a scalpel, right?
[00:34:15.720 --> 00:34:16.680] We've seen that before.
[00:34:16.680 --> 00:34:18.920] That's clearly what they're going to do now.
[00:34:18.920 --> 00:34:22.040] And so, yeah, so that is a legitimate concern.
[00:34:22.040 --> 00:34:34.920] But, you know, the investors and the industry probably will welcome the deregulation, but hopefully, there's already international standards in place for the nuclear industry, and hopefully, they won't be eroded too much.
[00:34:35.720 --> 00:34:40.920] You know, it's like that is really where it bumps up, and that's the part where I don't have the same kind of hope that you have.
[00:34:41.400 --> 00:34:43.000] I'm trying to be positive over here, Karen.
[00:34:43.160 --> 00:34:46.760] But the other thing is, in four years, they could all snap back, you know what I mean?
[00:34:46.760 --> 00:34:52.120] Or at least some of them, or if they went too far, we could then, you know, we have time to claw that back.
[00:34:52.120 --> 00:34:54.680] It's not like whatever happens now is forever.
[00:34:54.680 --> 00:35:02.120] It's really just we're looking at four years with regards to regulation is forever if a plant is built under those regulations.
[00:35:02.280 --> 00:35:03.400] Yeah, probably not in four years.
[00:35:03.400 --> 00:35:04.280] That's a little fast.
[00:35:04.440 --> 00:35:05.720] Yeah, and that's the hope, right?
[00:35:05.720 --> 00:35:17.040] Because ultimately, while I agree with you that people who work in this sector do not want unsafe plants, there are many people who care more about profits than they do about it.
[00:35:17.040 --> 00:35:17.840] Absolutely.
[00:35:14.840 --> 00:35:19.760] So this is a complicated issue.
[00:35:20.000 --> 00:35:23.040] This is one that I am definitely going to be keeping my eye on.
[00:35:23.040 --> 00:35:26.240] I think it's not all doom and gloom.
[00:35:26.240 --> 00:35:35.920] This guy, you know, because of the nuclear thing, because that's all gaining momentum, I'm hoping that over the next four years, that's where they'll focus their efforts.
[00:35:35.920 --> 00:35:37.040] I hope so, too.
[00:35:37.040 --> 00:35:57.280] I mean, I do, I agree with you that it's not all doom and gloom, but I do think that the goalposts have been moved so far at this point that the reason it's not all doom and gloom is because, exactly like you're saying, maybe we'll have nuclear, but like the deregulation of fossil fuels is scaring the living shit out of me right now.
[00:35:57.360 --> 00:35:58.400] I got to be honest.
[00:35:58.400 --> 00:35:59.200] Yeah, I agree.
[00:35:59.200 --> 00:36:00.000] I agree.
[00:36:00.000 --> 00:36:02.000] I'm not sure how much more damage they can do.
[00:36:02.000 --> 00:36:04.480] We're already producing more than we've ever produced.
[00:36:04.480 --> 00:36:04.960] You know what I mean?
[00:36:05.040 --> 00:36:06.000] There's like only so much they can produce.
[00:36:06.160 --> 00:36:10.160] We're already producing more than we've ever produced with strong regulations in place.
[00:36:10.160 --> 00:36:13.200] So imagine when the lid comes off.
[00:36:13.840 --> 00:36:14.960] So we'll see.
[00:36:15.360 --> 00:36:16.720] We'll keep everyone updated.
[00:36:17.280 --> 00:36:22.720] We'll see where it is in the spectrum of worst case versus best case scenario.
[00:36:22.960 --> 00:36:26.720] I mean, of all of Trump's appointments, this is not the one that keeps me up at night.
[00:36:26.720 --> 00:36:27.040] Yeah.
[00:36:27.760 --> 00:36:28.880] Yes, this is not ideal.
[00:36:28.880 --> 00:36:32.400] This is basically what you would, this is exactly what I would have expected.
[00:36:32.400 --> 00:36:36.640] I guess it could have been worse, but I didn't expect somebody who fully denies climate change.
[00:36:36.800 --> 00:36:37.440] Oh, yeah, I did.
[00:36:37.440 --> 00:36:38.080] Oh, 100%.
[00:36:38.720 --> 00:36:39.680] I just thought we were past this.
[00:36:40.000 --> 00:36:41.120] No, because Trump denies it.
[00:36:41.120 --> 00:36:42.800] Trump is 100% denying climate change.
[00:36:43.120 --> 00:36:44.640] He's still saying it's a hoax.
[00:36:44.640 --> 00:36:49.920] Clearly, though, there's two people that keep me up at night in terms of the appointments right now.
[00:36:49.920 --> 00:36:53.600] One is Tulsi Gabbard, you know, because she's don't need to get into that.
[00:36:53.920 --> 00:36:57.600] As a head of intelligence, that's like actually dangerous for the country.
[00:36:57.600 --> 00:37:00.840] And the other one's RFK Jr., because he could destroy the federal health care.
[00:37:01.640 --> 00:37:02.520] Now I'm going to add Dr.
[00:37:02.520 --> 00:37:03.400] Oz to that.
[00:37:03.560 --> 00:37:03.960] Dr.
[00:36:59.840 --> 00:37:06.520] Oz is not nearly as bad as RFK Jr.
[00:37:06.840 --> 00:37:10.280] You know, I don't.
[00:37:10.280 --> 00:37:15.080] I don't know how much shift he could make at Medicaid and Medicaid.
[00:37:15.080 --> 00:37:20.760] Whereas RFK is like actively wants to cause mischief, wants to oppose vaccines.
[00:37:21.560 --> 00:37:23.800] He is the wrecking ball.
[00:37:24.120 --> 00:37:29.400] And, you know, Gabbard just doesn't know reality from fantasy, and that's dangerous as the head of intelligence.
[00:37:29.400 --> 00:37:31.240] That's very, very dangerous.
[00:37:31.240 --> 00:37:36.360] But I guess my thing is, you can say don't get vaccinated all you want.
[00:37:36.360 --> 00:37:43.720] I don't know if he can actually block a vaccine from being available to the public, but you can say don't get vaccinated all you want.
[00:37:43.720 --> 00:37:49.960] But if you defund or deeply change the structure of Medicaid and Medicare, people will die.
[00:37:49.960 --> 00:37:52.200] Lots and lots of people will die.
[00:37:52.200 --> 00:37:54.920] Well, there's lots of ways that RFK Jr.
[00:37:55.000 --> 00:37:59.160] can undermine our vaccine infrastructure in this country.
[00:37:59.800 --> 00:38:00.760] This is a separate talk.
[00:38:00.760 --> 00:38:07.480] Maybe we'll give this, you know, probably what we should do is bring David Gorsky on because he's been really writing a lot about RFK Jr.
[00:38:07.800 --> 00:38:10.680] And like we'll have a little primer last or two weeks ago.
[00:38:10.840 --> 00:38:14.600] We'll do a good deep dive on like what could he actually do?
[00:38:14.600 --> 00:38:16.600] Because that is a very interesting question.
[00:38:17.960 --> 00:38:22.200] But that's, I think, the most, you know, those two are the most scary appointees so far.
[00:38:22.360 --> 00:38:25.000] It's a big question: will he have a larger U.S.
[00:38:25.000 --> 00:38:27.000] body count than he had during COVID?
[00:38:27.000 --> 00:38:27.640] That's the big question.
[00:38:27.880 --> 00:38:28.280] Possibly.
[00:38:28.680 --> 00:38:29.800] It is possible.
[00:38:29.800 --> 00:38:31.720] Yeah, the next pandemic scares me.
[00:38:31.720 --> 00:38:34.920] All right, Bob, tell us about finding Planet 9.
[00:38:35.800 --> 00:38:36.760] My turn, is it?
[00:38:36.760 --> 00:38:37.160] Okay.
[00:38:37.480 --> 00:38:40.120] Planet X, or is it Planet 9?
[00:38:40.120 --> 00:38:41.320] Was in the news recently.
[00:38:41.320 --> 00:38:55.520] Scientists have published a proposal to use an array of 200 small telescopes that they say can prove if a massive planet indeed exists in the farthest reaches of our solar system in the region where the so-called trans-Neptunian objects dwell.
[00:38:55.520 --> 00:39:04.400] Daniel Gomez and Gary Bernstein from the Department of Physics and Astronomy, University of Pennsylvania, wrote on the online archive their paper.
[00:39:04.400 --> 00:39:11.520] It is named an automated occultation network for gravitational mapping of the trans-Neptunian solar system.
[00:39:11.520 --> 00:39:17.360] Okay, so to better appreciate this, let's explore the few bits of terminology typically found in these discussions.
[00:39:17.360 --> 00:39:20.640] First off, is it Planet X or is it Planet 9?
[00:39:20.640 --> 00:39:22.400] Planet X is more general.
[00:39:22.400 --> 00:39:25.600] That's a general term that's been used for many, many years.
[00:39:25.600 --> 00:39:27.120] Many years ago, a hundred years.
[00:39:28.560 --> 00:39:31.360] Right, used for the potential planet beyond Neptune.
[00:39:31.360 --> 00:39:32.240] That's Planet X.
[00:39:32.240 --> 00:39:49.200] Planet 9, on the other hand, is often used interchangeably with Planet X, of course, but it seems Planet 9 is used most often when referring to the idea that the ninth planet of our solar system could potentially be found by observing its impact on the orbits of trans-Neptunian objects.
[00:39:49.440 --> 00:39:55.120] So that's where you're going to mostly find the term Planet 9, and that makes sense, and that's fine.
[00:39:55.360 --> 00:39:58.480] All right, so this brings us to trans-Neptunian objects.
[00:39:58.480 --> 00:40:00.960] And it's not hard to predict what that term refers to.
[00:40:00.960 --> 00:40:04.240] It refers to objects beyond the orbit of Neptune.
[00:40:04.240 --> 00:40:07.760] But those objects have two primary categories.
[00:40:07.760 --> 00:40:15.200] The most distant trans-Neptunian objects exist in a region that I wasn't aware of called the scattered disk.
[00:40:15.520 --> 00:40:20.240] Now, these are really, really far away, up to 100 AUs, astronomical units.
[00:40:20.240 --> 00:40:24.800] Each AU is the distance from the Earth to the Sun, 93 million miles.
[00:40:25.040 --> 00:40:27.440] Sorry, I don't have the kilometers memorized.
[00:40:27.440 --> 00:40:34.840] The scattered disk contains small icy bodies, and they're in very eccentric orbits, really high off of the plane.
[00:40:34.840 --> 00:40:40.200] The other major area where trans-Neptunian objects exist, this is the place you really want to be.
[00:40:40.440 --> 00:40:43.480] If you ever hang out beyond Neptune, it's going to be the Kuiper Belt.
[00:40:43.480 --> 00:40:44.280] That's where you got to go.
[00:40:45.720 --> 00:40:47.000] Sure, sure, you know.
[00:40:47.800 --> 00:40:55.800] The Kuiper Belt starts at Neptune's orbit right beyond its orbit at 30 AUs and stretches out to 50 AUs or 55 AUs, I've heard as well.
[00:40:55.800 --> 00:40:57.480] So it's huge.
[00:40:57.480 --> 00:41:06.520] It's about 20 times as wide as the asteroid belt that we know between Mars, we know very well between Mars and Jupiter.
[00:41:06.520 --> 00:41:11.720] 20 times as wide and potentially 200 times as massive.
[00:41:11.720 --> 00:41:14.120] So the Kuiper Belt is gargantuan.
[00:41:14.120 --> 00:41:17.720] Kuiper Belt objects, though, they're not technically asteroids.
[00:41:17.720 --> 00:41:19.160] I wasn't quite aware of this.
[00:41:19.160 --> 00:41:29.000] As far as I can tell, it's because the word asteroid is mainly reserved for a location, not really what you're made of, but really where you exist.
[00:41:29.000 --> 00:41:34.920] So the large rocky objects between or near the orbits of Mars and Jupiter, those are asteroids.
[00:41:34.920 --> 00:41:38.440] So if you're there, if you come from there, you're an asteroid.
[00:41:38.440 --> 00:41:43.960] But Kuiper Belt objects are not referred to as asteroids.
[00:41:44.200 --> 00:41:46.520] They're just basically Kuiper Belt objects.
[00:41:46.520 --> 00:41:49.320] And they're actually, they're different also as well.
[00:41:49.320 --> 00:41:57.000] They're made up of frozen volatiles, various ices composed of methane, ammonia, and water, too, as well.
[00:41:57.000 --> 00:42:00.280] Trans-Neptunian objects are anything beyond Neptune.
[00:42:00.280 --> 00:42:07.800] And within that area, there's a huge Kuiper belt area, and there's also the more distant scattered disk object area.
[00:42:07.800 --> 00:42:13.160] So now, these objects are thought to be remnants from the solar system's formation.
[00:42:13.160 --> 00:42:14.480] So they are ancient.
[00:42:14.120 --> 00:42:18.000] And since they're so far away, they're basically unchanged.
[00:42:18.080 --> 00:42:24.160] So they would be amazing repositories of information of the early solar system because they have not been melted.
[00:42:24.160 --> 00:42:26.640] They have not been changed really in any way out there.
[00:42:26.640 --> 00:42:29.600] Now, their distant orbits, though, this was interesting.
[00:42:29.760 --> 00:43:51.080] They're in such distant orbits beyond Neptune, we think because Jupiter and Saturn got basically they got together and they imposed their gravitational will on these remnants and they forced them from out from in the early in the you know in the inner solar system closer maybe to Jupiter Saturn area perhaps but they've they've basically forced them out into the orbits that they are in now beyond Neptune the question then becomes can there be a true planet sized object out there a planet 9 or even multiple sub such objects hiding in the Kuiper belt many people think so now the evidence most often cited for this you know it's subtle it's nothing really that's overt but it's it is there and a lot of people are looking into it very closely it's this subtle clustering of orbits of some of these Kuiper belt objects to the scientists to the astronomers the orbits just don't seem to be as randomly distributed as you would expect them to be and one one next explanation they contend could be a very distant unseen planet some say it could have as much as as many as five earth masses a super earth out there waiting to be found that's five, I mean, that's would be i don't i mean, I don't believe that, but i think there could be something out there, and i hope i hope there is.
[00:43:51.080 --> 00:43:52.200] That would be amazing.
[00:43:52.200 --> 00:43:56.120] Now, of course they have they have searched and searched for Planet 9.
[00:43:56.360 --> 00:43:58.040] Nothing has has been found.
[00:43:58.040 --> 00:44:00.920] And this is where the paper comes in, this latest paper.
[00:44:00.920 --> 00:44:24.760] So the authors contend that using 200 small telescopes, like something like 20, 30 centimeters, I mean, pretty small, separated by five kilometers and lined up in an array that stretches end to end a thousand kilometers wide, that such an array of 200 telescopes could tell us these critical details about Planet 9, if it even exists.
[00:44:24.760 --> 00:44:31.080] So the key technique that they describe in detail in their paper is called occultation.
[00:44:31.080 --> 00:44:35.320] Occultation appears more capable and fascinating than I would have thought.
[00:44:35.320 --> 00:44:36.920] So here's how this works.
[00:44:36.920 --> 00:44:41.160] So imagine you're observing an asteroid or a trans-Neptunian object.
[00:44:41.160 --> 00:44:49.640] So you're observing it, and you precisely time to the nanosecond or so when it blocks a distant star, right?
[00:44:49.640 --> 00:44:59.560] It's moving in its orbit and it moves in front of a distant star that's in our galaxy somewhere, say, whatever, 10, 20, 30, 40 light years away, whatever it is.
[00:44:59.560 --> 00:45:06.280] So you time to the nanosecond when it's blocked, and then also to the nanosecond when the star reappears.
[00:45:06.280 --> 00:45:08.680] And we could do that very, very precisely.
[00:45:08.680 --> 00:45:17.000] Now, if you do that, not only with one telescope, but 200 of these telescopes, each one having their own slightly different angle, right?
[00:45:17.000 --> 00:45:22.600] Each one has its own specific angle onto that occultation event.
[00:45:22.600 --> 00:45:26.280] And so they'll have their own view, their own timings.
[00:45:26.280 --> 00:45:40.040] So if you take all these 200 timings and put them together, you combine all that data, what you get is you get an extremely precise understanding of the asteroid's orbit, where it is, and when, very, very precisely.
[00:45:40.040 --> 00:45:41.240] It gets even better than that.
[00:45:41.240 --> 00:45:50.480] The more of these occultations that you observe, the more accurate your timings and your positional data become, more so than any other method that's used alone.
[00:45:50.720 --> 00:45:59.920] So, then ultimately, then the idea here is that once you have these hyper-accurate orbits mapped out, we can then detect very subtle gravitational anomalies, right?
[00:45:59.920 --> 00:46:14.960] If we know down to down to the third, fourth, fifth, sixth decimal point when this star should be blocked by an asteroid or a trans-Neptunian object, if we know where that, and it doesn't happen, then you've then there's some, you had you have an anomaly.
[00:46:14.960 --> 00:46:20.320] You have a gravitational orbital anomaly, and that's something that can be investigated.
[00:46:20.320 --> 00:46:34.320] So, we may discover, for example, through these anomalies that various asteroids are moving, or trans-Neptunian objects, are moving in a way that points to an unknown large gravitational source in a very specific orbital location.
[00:46:34.320 --> 00:46:35.920] In other words, planet 9.
[00:46:35.920 --> 00:46:42.880] So, that's the hope that this hyper-accurate information can actually say there's got to be something over here.
[00:46:42.880 --> 00:46:50.160] Multiple asteroids, multiple trans-Neptunian objects are telling us that there's some mass in this specific area.
[00:46:50.640 --> 00:46:56.240] It seems like it's got, say, two Earth masses, and it's in this orbit this distance from the sun.
[00:46:56.240 --> 00:46:58.640] It could potentially be that specific.
[00:46:58.640 --> 00:47:08.320] So, all we would then have to do is just zoom in on that specific area, and we'd have a relatively, you know, very small parcel of space to investigate, and we could potentially find it.
[00:47:08.320 --> 00:47:09.600] Best case scenario.
[00:47:09.600 --> 00:47:12.320] That's how Obi-Wan Kenobi discovered Carino.
[00:47:12.320 --> 00:47:12.960] Yeah, exactly.
[00:47:12.960 --> 00:47:13.360] Red Law.
[00:47:13.520 --> 00:47:16.000] Oh, that's right, because something was missing.
[00:47:16.000 --> 00:47:18.160] Yeah, there was a gravitational source that was missing.
[00:47:18.160 --> 00:47:21.520] But there was no, right, but there was no body assigned to it.
[00:47:21.520 --> 00:47:24.640] It was a dead spot in space, but there had to have been something there.
[00:47:24.640 --> 00:47:25.040] Yeah.
[00:47:25.040 --> 00:47:25.760] All right.
[00:47:26.080 --> 00:47:27.200] So even if.
[00:47:27.440 --> 00:47:28.960] Bob, we're talking science here.
[00:47:29.280 --> 00:47:29.800] Yes.
[00:47:29.520 --> 00:47:30.680] Science.
[00:47:31.000 --> 00:47:38.040] So even if, even if Planet 9, though, is a bust, a server like this could be incredibly informative about our outer solar system, right?
[00:47:38.040 --> 00:47:41.080] There's still so much to learn even without Planet 9.
[00:47:41.320 --> 00:47:54.600] They believe, the researchers believe that a 10-year survey could find 1,800 new trans-Neptunian objects and reveal details about their properties, their orbital dynamics, their surfaces, so many different things.
[00:47:55.160 --> 00:47:59.880] It could refine also our understanding of the boundary of our solar system and how it evolved.
[00:47:59.880 --> 00:48:04.440] And Bob, a lot of the objects that it discovers could be dwarf planets, even if they're not full planets.
[00:48:04.440 --> 00:48:04.760] Oh, yeah.
[00:48:04.760 --> 00:48:15.240] Well, yeah, well, I didn't say, but yeah, if it's not clear, Kuiper Belt, dwarf planets basically are all in the Kuiper Belt, but the one exception, I think, is Cirrus in the main asteroid belt.
[00:48:15.720 --> 00:48:17.400] Because pretty much all the other ones are in.
[00:48:17.480 --> 00:48:20.120] So, yeah, that's where we find dwarf planets.
[00:48:20.360 --> 00:48:34.520] Favorite long shot, though, is the outside chance that it could reveal information about primordial black holes, black holes that existed from the early, early small ones that existed early in the universe, soon after the Big Bang.
[00:48:34.520 --> 00:48:42.760] So if a primordial back hole was to pass in front of a star, we could detect that by the microlensing event that would happen to the stars, right?
[00:48:42.760 --> 00:48:46.840] It would just bend the light, and we could say, oh, crap, there's a super dense mass right there.
[00:48:46.840 --> 00:48:48.920] It could be a primordial black hole.
[00:48:48.920 --> 00:48:51.400] Long shot, I know, but that would be cool.
[00:48:51.400 --> 00:48:56.760] But it would be truly amazing finding a true planet in the Kuiper Belt.
[00:48:57.000 --> 00:48:57.960] How epic would that be?
[00:48:57.960 --> 00:49:01.800] That would be the astronomical discovery of the millennia, really.
[00:49:01.800 --> 00:49:06.440] I mean, another planet, potentially, you know, multiple Earth masses.
[00:49:06.680 --> 00:49:10.600] You know, the number would go, you know, back from eight back up to nine.
[00:49:10.600 --> 00:49:19.680] I think it would make Pluto feel a lot better, you know, since one of its Kuiper Belt buddies was recognized as a true planet, even if even if Pluto could never reattain that.
[00:49:20.000 --> 00:49:26.000] And at this project is estimated to cost only 15 million USD.
[00:49:26.960 --> 00:49:30.400] That really, that's smaller than I would have anticipated.
[00:49:30.880 --> 00:49:32.000] That really is, I mean, sure.
[00:49:32.000 --> 00:49:32.880] I mean, I'd like to have that much.
[00:49:32.960 --> 00:49:33.680] It's a grounding error.
[00:49:33.680 --> 00:49:34.080] It's pretty small.
[00:49:34.240 --> 00:49:34.640] Yeah, that's just a trendy trend.
[00:49:34.800 --> 00:49:35.760] Yeah, it really is.
[00:49:35.760 --> 00:49:36.720] It's so tiny.
[00:49:36.720 --> 00:49:38.400] It sounds like an amazing deal.
[00:49:38.640 --> 00:49:40.560] It seems like a no-brainer deal to me.
[00:49:40.560 --> 00:49:48.640] Of course, once this is truly vetted by other scientists and astronomers and make sure their numbers look good, I mean, this sounds like a great idea.
[00:49:48.640 --> 00:49:49.520] I hope they do it.
[00:49:49.520 --> 00:49:49.840] All right.
[00:49:49.840 --> 00:49:50.720] Thanks, Bob.
[00:49:50.720 --> 00:49:51.360] Sure.
[00:49:51.360 --> 00:49:54.720] Evan, tell us about stress and paranormal belief.
[00:49:54.960 --> 00:49:56.640] It stresses me out.
[00:49:57.920 --> 00:49:58.960] Does it, though?
[00:49:59.280 --> 00:50:00.640] Mel, it depends.
[00:50:00.640 --> 00:50:07.440] Did you know there's something called the Revised Paranormal Belief Scale, the RPBS?
[00:50:07.760 --> 00:50:11.920] This is a tool I had not heard of before.
[00:50:12.560 --> 00:50:13.680] And shame on me.
[00:50:13.680 --> 00:50:16.880] I probably should have read about this before at some point.
[00:50:17.200 --> 00:50:23.440] It is a 26-item survey that measures a person's belief in paranormal phenomena.
[00:50:23.440 --> 00:50:35.360] It's a widely used tool in parapsychological research and was developed by Jerome Tabasic and published in the International Journal of Transpersonal Studies in 2004.
[00:50:35.360 --> 00:50:40.480] So this has been around a while, and I believe there are even references to this prior to that.
[00:50:40.480 --> 00:50:43.760] So yeah, for many decades, this tool has been there.
[00:50:43.760 --> 00:50:50.320] Basically, you have the participants respond on a Likert scale.
[00:50:50.320 --> 00:50:54.080] One, strongly disagree, to seven, strongly agree.
[00:50:54.080 --> 00:50:58.720] And they ask you questions about: well, how do you feel about witchcraft?
[00:50:58.720 --> 00:51:04.040] How do you feel about superstition or spiritualism or extraordinary life forms?
[00:51:04.200 --> 00:51:06.840] And down the list it goes.
[00:51:06.840 --> 00:51:11.640] So much prior research has relied on this, using this tool.
[00:51:11.640 --> 00:51:19.080] And the results have suggested that paranormal belief in general is not linked to poorer psychological well-being.
[00:51:19.080 --> 00:51:26.680] However, certain facets of paranormal belief, such as superstition, could be linked with stress vulnerability.
[00:51:27.000 --> 00:51:28.840] You're saying that's the claim of these authors.
[00:51:29.800 --> 00:51:32.440] Well, no, this is, again, this is the prior research.
[00:51:32.440 --> 00:51:34.120] I haven't even gotten to the current study.
[00:51:35.160 --> 00:51:35.960] That's what I've read as well.
[00:51:35.960 --> 00:51:43.640] It's like superstition is the one that gets triggered by feeling lack of control, feeling under stress, et cetera, depression.
[00:51:43.960 --> 00:51:53.720] Until along came a revised version of this tool called the Rash Purified Revised Paranormal Belief Scale, R-P-P-B-S for short.
[00:51:53.720 --> 00:52:17.960] Yes, another tool to measure people's beliefs in paranormal phenomena, but it's considered a better statistical method that they use with improved validity and reliability compared to the original one, enhanced ability to compare results across different populations, and a more robust measurement of paranormal belief as a unidimensional construct.
[00:52:17.960 --> 00:52:28.120] I also wanted to look up, I said, you know, because I'm unfamiliar with these tools, I did a little more research into it to figure out, is this legitimate?
[00:52:28.120 --> 00:52:30.120] Is this considered scientific?
[00:52:30.120 --> 00:52:37.000] Or is this kind of just, you know, something that experimental researchers are kind of throwing together on their own?
[00:52:37.000 --> 00:52:38.200] But no, they say it is.
[00:52:38.200 --> 00:52:41.080] They say this is legitimate.
[00:52:41.080 --> 00:52:43.560] It's part of psychological research.
[00:52:43.560 --> 00:52:48.480] It could also be used for a parapsychological bogus research, but it is part of just psychological research.
[00:52:44.920 --> 00:52:51.360] It's considered accepted as a reliable tool.
[00:52:51.600 --> 00:52:56.640] Yeah, like psychologists studying conspiracy theories doesn't mean that they believe in the conspiracy theories, they're studying them.
[00:52:56.640 --> 00:52:57.280] It's the same thing.
[00:52:57.280 --> 00:52:58.960] They're studying paranormal beliefs.
[00:52:58.960 --> 00:53:04.160] And that means, yeah, so that gets us now to the news item this week, which we can now better understand.
[00:53:04.160 --> 00:53:16.160] There was a new study that was published in PLUS1, P-L-O-S One, titled Re-evaluation of the Relationship Between Paranormal Belief and Perceived Stress Using Statistical Modeling.
[00:53:16.160 --> 00:53:21.600] The authors are Kenneth Drinkwater, Andrew Dinovian, and Neil Dagnall.
[00:53:21.600 --> 00:53:43.360] Drinkwater and his colleagues had 3,084 people complete the RASH model survey, which is the more refined survey, alongside a questionnaire evaluating different facets of perceived stress called the perceived stress scale, of course, to help deepen their understanding of potential links between paranormal belief and stress.
[00:53:43.360 --> 00:54:00.640] Here are some quotes of what the researchers found: finding support that the notion that traditional paranormal belief is associated with external control, specifically the notion that unknown supernatural forces and powers influence existence.
[00:54:00.640 --> 00:54:08.480] Feelings of distress and reduced ability to cope with stress were associated with these traditional paranormal beliefs.
[00:54:08.480 --> 00:54:14.480] So superstition is one of those considered traditional paranormal beliefs, right?
[00:54:14.480 --> 00:54:22.720] Sort of belief in witchcraft, ghosts, these sorts of things, external forces that you don't have control over.
[00:54:22.720 --> 00:54:32.920] And on the other side of this coin are the New Age philosophy, sort of paranormal beliefs, dealing with psy, spiritualism, precognition.
[00:54:33.240 --> 00:54:41.800] They could not find it to be statistically linked to any tendencies regarding distress or coping for these New Age philosophy.
[00:54:41.800 --> 00:54:44.280] And these, this is what was expected.
[00:54:44.440 --> 00:54:49.560] in their findings, in line with the idea that traditional paranormal belief reflects anxiety.
[00:54:49.560 --> 00:54:53.880] And again, it's about that lack of control over those external forces.
[00:54:53.880 --> 00:55:00.600] They do admit the study was exploratory and does not support any cause-effect relationship.
[00:55:00.920 --> 00:55:01.880] Why is it important?
[00:55:01.880 --> 00:55:06.040] So why did the authors, you know, why are they bothering with this?
[00:55:06.040 --> 00:55:12.120] And I thought they summed it up nicely in the abstract of the paper, which I will read to you now, this part of it.
[00:55:12.120 --> 00:55:22.600] Research into paranormal belief is important because supernatural credence persists within contemporary society and potentially influences everyday attitudes and behavior.
[00:55:22.600 --> 00:55:31.960] For instance, investigators report that paranormal belief is associated with lower levels of trust in science and higher anti-science attitudes.
[00:55:31.960 --> 00:55:40.520] These are notions not based upon reasoned or reliable evidence which conflict with prevailing conceptions of the world.
[00:55:40.520 --> 00:55:49.320] Specific examples allied to belief in the paranormal are endorsement of alternative medicine, anti-vaccination, and conspiracies.
[00:55:49.320 --> 00:56:01.080] Evidence suggests that paranormal belief is a form of non-clinical delusion arising from an over-reliance on emotional content and the failure to rigorously evaluate the validity of information.
[00:56:01.080 --> 00:56:12.520] And what is it we talk about on this show for the past 20 years and the 10 years prior to that that we've been a skeptical organization and all the shoulders of the giants that we've stood upon that came even before that.
[00:56:12.520 --> 00:56:14.600] It all boils down to this.
[00:56:14.960 --> 00:56:15.280] Right?
[00:56:15.280 --> 00:56:22.720] But I do think there's multiple moving parts here, you know, having followed this literature somewhat over the years, you know, as a skeptic.
[00:56:22.720 --> 00:56:31.040] There's also other studies which show that there's a correlation with intuitive thinking style versus an analytical thinking style, which makes perfect sense.
[00:56:31.440 --> 00:56:47.280] And there's also this question of, which I don't see as much in the literature, the fantasy-prone personality type, which I think is just an extreme version of this tendency to believe in the paranormal or maybe intuitive thinking.
[00:56:47.520 --> 00:56:49.840] But also, this is very context-dependent.
[00:56:49.840 --> 00:56:50.400] You know what I mean?
[00:56:50.400 --> 00:56:56.560] It's like what culture did you grow up in, and what is or how culturally acceptable are the beliefs that you're talking about?
[00:56:56.880 --> 00:57:12.080] And that may be where the real divide here is between traditional paranormal beliefs and New Age paranormal beliefs, in that the New Age one seems to be more of a, like you get into that subculture and that worldview.
[00:57:12.080 --> 00:57:24.960] And I think it tends to attract people who have, again, the sort of a fantasy-prone or intuitive thinking style, whereas like the more traditional ones may come about because of stress or whatever.
[00:57:24.960 --> 00:57:26.480] It makes sense that they would not be.
[00:57:26.480 --> 00:57:27.760] It's not all one phenomenon.
[00:57:27.760 --> 00:57:28.640] It's not monolithic.
[00:57:28.640 --> 00:57:40.080] I think there's also that conspiracy beliefs are its own phenomenon, you know, like the tendency to believe in conspiracies, while there's a ton of overlap, that is an entity unto itself.
[00:57:40.080 --> 00:57:40.880] Yeah, it's interesting.
[00:57:41.760 --> 00:57:50.040] I would imagine there would be, and of course more research is needed, correlations of stress and conspiracy theory.
[00:57:50.920 --> 00:57:52.080] Conspiracy thinking.
[00:57:53.040 --> 00:57:59.360] Yeah, maybe, but again, conspiracy thinking comes in two flavors: opportunistic conspiracy thinking and dedicated conspiracy thinking.
[00:57:59.360 --> 00:58:06.440] People who are conspiracy theorists, they believe in all conspiracies, and people who only believe in ones that support their worldview.
[00:58:06.920 --> 00:58:10.840] Opportunistically, those comforting ones won't cause the stress.
[00:58:11.080 --> 00:58:11.640] Right, probably.
[00:58:12.200 --> 00:58:14.920] Would not, probably not cause stress for them.
[00:58:15.240 --> 00:58:16.840] All right, it's an interesting area.
[00:58:16.840 --> 00:58:21.560] You know, again, I tend to follow this because it's pretty much in our sweet spot of what we do.
[00:58:21.560 --> 00:58:25.480] And it's fascinating to look at this as a psychological research project.
[00:58:25.800 --> 00:58:30.520] Well, everyone, we're going to take a quick break from our show to talk about our sponsor this week, Aura Frames.
[00:58:30.520 --> 00:58:31.560] Yeah, Aura Frames.
[00:58:31.560 --> 00:58:39.080] These are digital picture frames that allow users to display photos and videos directly from their smartphones or other devices.
[00:58:39.080 --> 00:58:45.240] These frames are Wi-Fi enabled, facilitating seamless photo sharing and display.
[00:58:45.240 --> 00:58:52.840] There's no memory cards or USBs required, and there's a reason Wirecutter named it the number one best digital photo frame.
[00:58:53.160 --> 00:58:58.360] I got my father-in-law, I got my mother-in-law, I got a couple of other relatives that we bought these frames for.
[00:58:58.360 --> 00:59:01.560] They're super easy to use, and they report back to me.
[00:59:01.560 --> 00:59:04.440] I have one too, but I'm just telling you what the people in my life thought.
[00:59:04.440 --> 00:59:05.720] They absolutely love it.
[00:59:05.720 --> 00:59:07.240] It's a fantastic present.
[00:59:07.240 --> 00:59:15.720] For a limited time, visit auraframes.com and get $45 off Aura's best-selling Carver Matte frames by using promo code skeptics at checkout.
[00:59:15.720 --> 00:59:20.040] That's A-U-R-A-Frames.com, promo code Skeptics.
[00:59:20.040 --> 00:59:24.920] This exclusive Black Friday Cyber Monday deal is their best of the year, so don't miss out.
[00:59:24.920 --> 00:59:26.360] Terms and conditions apply.
[00:59:26.360 --> 00:59:28.440] All right, guys, let's get back to the show.
[00:59:28.440 --> 00:59:30.760] All right, Jay, it's who's that noisy time?
[00:59:30.760 --> 00:59:33.720] All right, guys, last week I played this noisy.
[00:59:52.560 --> 00:59:54.000] In its entirety.
[00:59:54.000 --> 00:59:55.760] So, guys, have any guesses?
[00:59:55.760 --> 00:59:56.400] I have a guess.
[00:59:56.400 --> 00:59:59.760] That was Jay's first attempt at playing the Digi Dude.
[01:00:00.400 --> 01:00:02.800] It sounds like a super old recording.
[01:00:03.200 --> 01:00:04.160] That's a nice guess.
[01:00:04.160 --> 01:00:04.800] Anybody else?
[01:00:04.800 --> 01:00:06.880] It just sounded like an insect to me.
[01:00:06.880 --> 01:00:08.400] It has an insect-like quality.
[01:00:08.400 --> 01:00:08.720] Absolutely.
[01:00:08.880 --> 01:00:10.320] I thought I heard help me in there.
[01:00:10.800 --> 01:00:15.680] There was a listener, many that wrote in, one of them named Benjamin Davolt.
[01:00:15.920 --> 01:00:18.480] Ben here, the Frenchie from Japan.
[01:00:18.720 --> 01:00:20.000] I think this is a drone.
[01:00:20.000 --> 01:00:27.200] Quadcopter equipped with ultra-low noise propellers, probably the asymmetrical type with a counterweight on the side opposed to the blade.
[01:00:27.440 --> 01:00:30.400] That is, oh, and he says my name is pronounced Davu.
[01:00:30.400 --> 01:00:32.240] So Benjamin Davu.
[01:00:32.240 --> 01:00:35.600] So that's an interesting and very specific guess.
[01:00:35.600 --> 01:00:43.040] And, you know, quadcopters do make that kind of, you know, that, what would you call it, a buzzing sound, like almost like a droning noise?
[01:00:43.200 --> 01:00:44.240] A droning noise.
[01:00:44.560 --> 01:00:45.120] Yeah.
[01:00:45.440 --> 01:00:47.680] That is not correct, though, and we will continue on.
[01:00:47.680 --> 01:00:51.280] So a listener named Shane Hillier wrote in and said it's a murder hornet.
[01:00:51.280 --> 01:00:55.200] So yeah, Kara, somebody else agreed with you about the insect-like noise.
[01:00:55.600 --> 01:00:56.560] Murder hornets.
[01:00:56.640 --> 01:00:58.480] Listener named Stephen Walker wrote in.
[01:00:58.480 --> 01:01:05.360] He said, hi, our guess is a bee, a honeybee, doing its waggle dance to tell its buddies which direction to find the good stuff.
[01:01:05.360 --> 01:01:08.320] So yeah, there was a murder hornet and a bee.
[01:01:08.320 --> 01:01:10.800] So definitely people are hearing that kind of sound.
[01:01:10.800 --> 01:01:18.160] And I always find it fascinating that bees, you know, they talk with pheromones, which is basically, you know, smells.
[01:01:18.160 --> 01:01:20.800] It's pretty freaking cool that they can communicate with that.
[01:01:20.800 --> 01:01:23.040] So now I'm going to click right over to the winner.
[01:01:23.040 --> 01:01:26.720] We had multiple winners, but I'll play, I'll read one of them here.
[01:01:26.720 --> 01:01:34.520] So Frederick Neant was the first person to guess right, and he guessed this as the first known recording of a human voice.
[01:01:34.520 --> 01:01:34.920] Oh, wow.
[01:01:35.160 --> 01:01:38.440] I think at some point I've played this previously.
[01:01:38.760 --> 01:01:45.480] I had a different recording, and now this is like an update because something pretty remarkable happened where they were able to.
[01:01:47.080 --> 01:01:50.520] So let me give you another listener's answer.
[01:01:50.840 --> 01:01:52.760] This is Joshua Banta's answer.
[01:01:52.840 --> 01:01:55.320] Very, very nice description here.
[01:01:55.320 --> 01:02:03.480] He said it's an actual recording from 1860 of Eduard Leone Scott de Martinville singing a Claire de Lune.
[01:02:03.720 --> 01:02:08.360] This is something, it's played on something called a phonodogram.
[01:02:08.360 --> 01:02:09.880] But let me give you some more specifics here.
[01:02:09.880 --> 01:02:12.200] So James, he said James Buchanan was the U.S.
[01:02:12.200 --> 01:02:15.160] president at the time, pre-Civil War, pre-Abraham Lincoln.
[01:02:15.160 --> 01:02:30.040] And he said the Martinville invented a device called the phonodograph that collects sound by using a horn connected to a diaphragm, which caused a rigid bristle to vibrate and inscribe a visual representation on a hand-craked cylinder.
[01:02:30.040 --> 01:02:32.760] But this was never intended for playback, by the way.
[01:02:33.080 --> 01:02:36.920] It only produced visual images to show you what sound looked like.
[01:02:36.920 --> 01:02:38.760] It's just squiggles on paper.
[01:02:38.760 --> 01:02:42.920] But there was absolutely no capacity for there to be any playback.
[01:02:42.920 --> 01:02:52.520] Now we click forward to 2008 and the recording was transformed into a playable digital audio file by scientists at the Lawrence Berkeley National Laboratory in Berkeley, California.
[01:02:52.520 --> 01:02:53.640] And it worked really well.
[01:02:53.640 --> 01:03:00.120] So it was inferred to be played back at a 10-second speed, which caused the voice to sound like a woman or even a child.
[01:03:00.120 --> 01:03:03.000] But later, the scientists realized that this was the wrong speed.
[01:03:03.000 --> 01:03:08.440] And when they played it back at slower speeds, they found the one that they thought sounded the most correct, and it is of a man.
[01:03:08.440 --> 01:03:13.160] And they think it is Martinville himself singing Claire DeLune.
[01:03:13.160 --> 01:03:14.040] So I'll play it again.
[01:03:14.040 --> 01:03:18.720] Now, keep in mind, you know, this is the lowest fidelity recording you'll probably ever hear.
[01:03:33.840 --> 01:03:35.760] So that's pretty cool, guys.
[01:03:35.760 --> 01:03:36.160] Weird.
[01:03:37.200 --> 01:03:38.720] Unexpected past.
[01:03:38.720 --> 01:03:40.560] So thank you for sending that in.
[01:03:40.560 --> 01:03:46.640] I have a new noisy for you guys this week, and this was sent in by a listener named John Karabayk.
[01:03:46.640 --> 01:03:50.160] Thank you for sending in the phonetic pronunciation of your name.
[01:03:50.160 --> 01:03:52.720] And I'm going to play the sound now.
[01:04:01.280 --> 01:04:09.280] If you guys think you know what this week's noisy is, or you heard something cool, email us at wtn at the skepticsguide.org.
[01:04:09.280 --> 01:04:10.880] Steve, it's not too late.
[01:04:11.200 --> 01:04:14.720] It's actually, it is, it can be too late, but it isn't now.
[01:04:14.720 --> 01:04:26.160] If you're hearing this, and it's and it's basically like what, the 20th or the 21st or soon thereafter in November, you could buy tickets to, we have two shows going on in Washington, D.C.
[01:04:26.240 --> 01:04:40.320] We have a private show that is a live recording of the SGU, limited audience size, and we record the show and then for an hour, all of us, including George Hobb, will do something fun and unique that has never been done before in a live audience.
[01:04:40.320 --> 01:04:43.760] So if you want to have some fun, you can join us at our private show.
[01:04:43.760 --> 01:04:46.480] Or you can also go to the extravaganza.
[01:04:46.480 --> 01:04:50.160] This is the skeptical extravaganza stage show that we have.
[01:04:50.160 --> 01:04:51.520] It's going to be in D.C.
[01:04:51.520 --> 01:04:53.600] as well, and it's going to be that night, right?
[01:04:53.600 --> 01:04:56.000] The private show will be starting at 11:30 a.m.
[01:04:56.160 --> 01:04:59.520] and the extravaganza starts at 8 p.m.
[01:04:59.600 --> 01:05:01.400] Please get there around 7.
[01:04:59.760 --> 01:05:05.560] You can go to www.theskepticsguy.org.
[01:05:05.800 --> 01:05:10.920] We have buttons on there that link to tickets, which means you can buy them and you can come see us, and we'd love to have you.
[01:05:10.920 --> 01:05:16.280] And, Jay, we should tell people all of the social media stuff that we are on as well.
[01:05:16.280 --> 01:05:21.880] So, first of all, we do a live stream most Wednesdays starting at 1 p.m.
[01:05:22.200 --> 01:05:25.160] and most Fridays starting at 5 p.m.
[01:05:25.160 --> 01:05:26.360] Eastern.
[01:05:26.360 --> 01:05:28.520] We have a Facebook page.
[01:05:28.520 --> 01:05:33.400] Two blogs are affiliated with us: Neurological and Science-Based Medicine.
[01:05:33.720 --> 01:05:45.960] And we are still on X, but we are also now on Blue Sky, and we are on Instagram, and we post TikTok videos, two or three TikTok videos every week.
[01:05:46.280 --> 01:05:49.560] My most popular TikTok videos, what do you guys think?
[01:05:49.560 --> 01:05:50.360] What's it up to?
[01:05:50.680 --> 01:05:52.360] 4.5 million.
[01:05:52.920 --> 01:05:54.280] 5.7 million.
[01:05:54.520 --> 01:05:55.400] Oh, no.
[01:05:55.400 --> 01:05:56.600] 5.7 million?
[01:05:56.920 --> 01:05:57.800] Climbing.
[01:05:57.800 --> 01:05:58.840] Damn, man.
[01:05:59.160 --> 01:06:01.160] How is it really slowing down?
[01:06:01.400 --> 01:06:02.440] No, it's still ticking along.
[01:06:02.760 --> 01:06:03.320] It's still going strong.
[01:06:03.640 --> 01:06:04.040] Still going on.
[01:06:04.120 --> 01:06:04.840] Still tickety time.
[01:06:05.240 --> 01:06:08.440] The interesting thing is, like, there is just no rhyme or reason.
[01:06:09.240 --> 01:06:10.440] It just went viral.
[01:06:10.760 --> 01:06:18.920] So, sure, we could have slipped into TikTok's algorithm or whatever, but like, we make a ton of these videos, and it's all, you know, all revolving around the same theme.
[01:06:18.920 --> 01:06:23.080] Steve watches something on TikTok, and then he'll explain why that person is wrong or whatever.
[01:06:23.080 --> 01:06:31.720] Like, you know, we just kind of go into some of the more wilder things that people are talking about and bend the skeptical eye at it.
[01:06:31.720 --> 01:06:35.720] But this one, we, you know, me, Steve, and Ian were like, what happened?
[01:06:36.440 --> 01:06:37.400] I wish we knew the formula.
[01:06:37.400 --> 01:06:38.120] It just happened.
[01:06:38.120 --> 01:06:40.440] We're happy about it, but it is what it is.
[01:06:40.440 --> 01:06:41.720] Oh, and Steve, I can't forget.
[01:06:41.720 --> 01:06:42.200] Hold on.
[01:06:42.680 --> 01:06:46.000] These events in DC are fantastic, and I do hope that you can make it.
[01:06:46.000 --> 01:06:48.080] But my God, you got to go to Natacon.
[01:06:44.760 --> 01:06:50.800] This is our socializing conference.
[01:06:50.960 --> 01:06:53.680] This is a conference where people make connections with each other.
[01:06:53.680 --> 01:06:54.960] Tons of socializing.
[01:06:54.960 --> 01:06:56.640] We have a ton of entertainment that we do.
[01:06:56.640 --> 01:07:01.600] George Robb, Brian Wecht, and Andrea Jones Roy will join all of us here at the SGU.
[01:07:01.600 --> 01:07:06.640] It's 2.2 days of a lot of fun, and we really hope that you can join us.
[01:07:06.640 --> 01:07:09.840] You can go to nataconcon.com, right?
[01:07:09.840 --> 01:07:14.080] That's natacon con, c-o-n-c-on-dot c-o-m.com.
[01:07:14.080 --> 01:07:15.760] Bob, are you understanding what I'm saying?
[01:07:16.320 --> 01:07:16.960] Pretty sad.
[01:07:17.200 --> 01:07:20.080] It's nataconcon.com.
[01:07:20.080 --> 01:07:21.360] Ian, I swear to God.
[01:07:21.360 --> 01:07:22.720] All right, so anyway, guys.
[01:07:23.040 --> 01:07:25.360] Did you say 2.2 days?
[01:07:25.360 --> 01:07:26.160] Yes.
[01:07:28.080 --> 01:07:32.160] Roughly, yeah, because it's like you don't need to explain that, Jay.
[01:07:32.800 --> 01:07:34.640] Anyway, 210 days.
[01:07:34.640 --> 01:07:35.760] You'll see, because you'll be there.
[01:07:35.760 --> 01:07:37.600] So, anyway, please try to join us for that.
[01:07:37.600 --> 01:07:38.960] It's going to be a wonderful thing.
[01:07:38.960 --> 01:07:42.240] You know, we need some happiness in the world, and their happiness will be there.
[01:07:42.240 --> 01:07:43.440] So please join us.
[01:07:43.440 --> 01:07:43.760] All right.
[01:07:43.760 --> 01:07:44.720] Thanks, Jay.
[01:07:44.720 --> 01:07:46.240] We're going to do one quick email.
[01:07:46.240 --> 01:07:53.680] This comes from Mike Hampton, who writes, On Friday's live stream, you began talking about phrases like way anchor and such.
[01:07:53.680 --> 01:07:58.160] It reminded me of a phrase that I think wins the award for the dumbest phrase.
[01:07:58.720 --> 01:08:02.080] That is, you've got your work cut out for you.
[01:08:02.080 --> 01:08:09.520] I lived most of my life thinking this phrase meant you have an easy road ahead, which is what it should mean.
[01:08:09.520 --> 01:08:20.560] Any project you do that involves cutting, whether that be carpentry, papercrafts, sewing, et cetera, at least a quarter to a third of the project is cutting the material to the sizes and patterns you need.
[01:08:20.560 --> 01:08:30.600] So, if someone has prepared the material by cutting it out for you, the project is suddenly that much easier and going to make and going to take less time.
[01:08:29.840 --> 01:08:32.760] I was shocked to learn, shocked, shocked.
[01:08:33.720 --> 01:08:40.760] It actually means the opposite, which shot that phrase to the dumbest phrase in the English language as far as I'm concerned.
[01:08:41.080 --> 01:08:46.280] So, I looked into it because I love the etymology, especially of these kind of phrases.
[01:08:47.000 --> 01:08:47.640] What's your guess?
[01:08:47.640 --> 01:08:49.560] Where does that phrase come from?
[01:08:49.560 --> 01:08:50.920] What's the origin of you?
[01:08:51.080 --> 01:08:52.760] You've got your work cut out for you.
[01:08:52.760 --> 01:08:54.840] Oh, gosh, I'd only be guessing.
[01:08:54.840 --> 01:08:56.200] I mean, yeah, who knows?
[01:08:56.200 --> 01:08:57.640] Farming, something with farming?
[01:08:57.640 --> 01:08:59.000] Yeah, that tailoring.
[01:08:59.000 --> 01:09:02.920] It comes from tailoring, which was on his list, sewing.
[01:09:02.920 --> 01:09:09.560] And usually, the way a professional tailor would work is they would have an assistant.
[01:09:09.560 --> 01:09:20.600] The assistant would cut out all the patterns, and they would, for a dress or whatever, anything that they were going to make, and they would do all of that ahead of time.
[01:09:20.600 --> 01:09:24.840] And the primary reason for that was to make sure that they had everything, right?
[01:09:24.840 --> 01:09:36.600] So, you cut out all of the pieces, you make sure that every piece is there, and then the tailor would sew them all together into the final piece, the final dress or whatever.
[01:09:36.600 --> 01:09:47.480] Now, the sewing was the that's where all you know, most of the skill, the artistry, the artistry, and the, and that was very exacting and complicated work.
[01:09:47.480 --> 01:09:50.600] Whereas the cutting was you, you know, you're cutting it out of a pattern.
[01:09:50.600 --> 01:09:56.280] Not that that, you know, wasn't a lot of work, but that wasn't the tailor's work.
[01:09:56.280 --> 01:10:06.280] So, as a tailor, if you have your work cut out for you, that means you have a lot of work ahead of you, all that intricate sewing ahead of you.
[01:10:06.280 --> 01:10:07.080] I see.
[01:10:07.080 --> 01:10:07.560] Somebody else.
[01:10:07.720 --> 01:10:12.840] But I get what the dude is saying, because if you don't have your work cut out for you, there's even more work to write.
[01:10:13.080 --> 01:10:14.040] I hear what he's saying.
[01:10:14.800 --> 01:10:22.960] But because there was built into the origin of this phrase, this division of labor, not for the tailor, you have nothing to do because your assistant hasn't cut out the patterns yet.
[01:10:23.120 --> 01:10:27.440] Right, so you get to rest right now, but once your work is cut out for you, then your work begins.
[01:10:27.440 --> 01:10:32.240] So basically, this is when your work begins, when the work is quote unquote cut out for you.
[01:10:32.560 --> 01:10:34.400] But that's not really how we use the phrase.
[01:10:34.720 --> 01:10:36.640] No, we've twisted it a bit.
[01:10:37.040 --> 01:10:39.520] Well, it now means you have a long road ahead of you.
[01:10:39.520 --> 01:10:41.920] Yeah, it's going to get rough now from here on out.
[01:10:41.920 --> 01:10:42.160] Yeah.
[01:10:42.400 --> 01:10:48.160] Yeah, but it basically means you have a lot of work or difficult work or like, yeah, like you have a job to do.
[01:10:48.160 --> 01:10:50.400] Like, this is your job and you've got to do it.
[01:10:50.640 --> 01:10:57.440] Yeah, the uses evolve over time, but it does make sense in the context of its origin.
[01:10:57.440 --> 01:11:02.480] But yeah, some phrases do end up meaning the opposite of what they originally meant.
[01:11:02.480 --> 01:11:05.680] Like, for example, blood is thicker than water.
[01:11:05.680 --> 01:11:07.680] We've talked about this on the show before.
[01:11:07.760 --> 01:11:08.400] It means the opposite.
[01:11:08.640 --> 01:11:10.960] That's because it doesn't lob off the last part of the story.
[01:11:11.040 --> 01:11:14.080] Yeah, it's the blood of the, I forget now.
[01:11:14.080 --> 01:11:18.640] It was like the blood of the Christ is thicker than the water of the womb or something.
[01:11:18.640 --> 01:11:22.160] And it means the opposite of what people use it to mean now.
[01:11:22.800 --> 01:11:31.440] It means that your dedication to your religion is stronger than your familial ties, where people now use it to mean that your familial ties are the strongest.
[01:11:31.440 --> 01:11:32.800] The blood is thicker than water.
[01:11:33.600 --> 01:11:35.200] It was flipped in its meaning.
[01:11:35.200 --> 01:11:36.400] Yeah, interesting.
[01:11:36.800 --> 01:11:37.760] But it was always fascinating.
[01:11:38.080 --> 01:11:45.440] We do that a lot where we shorten the phrase, and the phrase, I've seen a few other examples of this where there's a long phrase with a moral at the end.
[01:11:45.440 --> 01:11:49.840] But when we shorten it, we only focus on the first sentence, which is actually the opposite of the point.
[01:11:49.840 --> 01:11:50.720] Like the proof is in the pudding.
[01:11:50.800 --> 01:11:51.520] The proof is in the pudding.
[01:11:51.840 --> 01:11:54.320] I mean, the proof of the pudding is in the taste.
[01:11:54.320 --> 01:11:54.800] It's the tasting.
[01:11:55.280 --> 01:11:57.360] Or this is one of my big P's.
[01:11:57.360 --> 01:11:59.120] When people say, I could care less.
[01:11:59.440 --> 01:11:59.680] Yes.
[01:11:59.680 --> 01:11:59.960] Oh, well.
[01:12:01.160 --> 01:12:01.640] You can?
[01:12:01.640 --> 01:12:01.880] Yeah.
[01:12:02.360 --> 01:12:03.080] You care more.
[01:12:03.080 --> 01:12:04.440] So you cared more.
[01:11:59.840 --> 01:12:06.600] I couldn't care less.
[01:12:07.000 --> 01:12:09.960] I care so little, I couldn't possibly care even less.
[01:12:09.960 --> 01:12:14.440] But people, they shorten it because we tend to shorten things, but that it re-flips the meaning when you shorten it.
[01:12:14.600 --> 01:12:15.080] That's annoying.
[01:12:15.080 --> 01:12:15.800] That's annoying.
[01:12:16.760 --> 01:12:18.920] Everyone out there, just stop that one.
[01:12:18.920 --> 01:12:20.200] Say, I couldn't.
[01:12:20.200 --> 01:12:21.240] Add the int.
[01:12:21.480 --> 01:12:23.960] I couldn't care less, please.
[01:12:27.400 --> 01:12:32.840] Guys, we have a great interview coming up with Kevin Folta, so let's go on with that interview now.
[01:12:44.920 --> 01:12:47.240] Well, we are joined now by Kevin Fulta.
[01:12:47.240 --> 01:12:48.760] Kevin, welcome back to the SGU.
[01:12:48.760 --> 01:12:49.320] Yeah, thank you.
[01:12:49.320 --> 01:12:50.680] It's really nice to be here again.
[01:12:50.680 --> 01:12:57.560] Yeah, it's been a while, and it's always good to interview people in person, you know, so we could look face-to-face and have a discussion rather than over the interwebbies.
[01:12:57.640 --> 01:13:07.720] So I've been dying to talk to you about a topic, about a new technique that we talked about just very, very briefly, about that plant biologists are using to make new cultivars.
[01:13:07.720 --> 01:13:08.680] You know what I'm talking about.
[01:13:08.680 --> 01:13:10.280] Well, yeah, this was the work that was done.
[01:13:10.600 --> 01:13:12.840] We mentioned this briefly with Judd Ward and the folks.
[01:13:12.840 --> 01:13:14.760] I can't remember the name of the company now.
[01:13:15.080 --> 01:13:22.120] But the newest, it's a new technique that is involving doubling the genetic material inside of a cell.
[01:13:22.120 --> 01:13:29.640] So basically creating a not just the old polyploids, but actually creating hybrids from hybrids.
[01:13:29.640 --> 01:13:32.600] So allowing complete genetic sets being to passed down.
[01:13:32.600 --> 01:13:35.240] So you're not getting genetic mixing in each generation.
[01:13:35.240 --> 01:13:40.280] So, yeah, so if you make a hybrid, then you can have those hybrid traits breed through to subsequent generations.
[01:13:40.280 --> 01:13:40.600] That's right.
[01:13:40.600 --> 01:13:42.520] You have to breed through going forward.
[01:13:42.520 --> 01:13:45.120] Yeah, because right now you can't do that with a hybrid.
[01:13:45.120 --> 01:13:45.680] That's right.
[01:13:44.840 --> 01:13:49.200] This is something that I know we bring up when we're talking about GMOs.
[01:13:49.280 --> 01:13:52.880] And we should probably remind our audience: you are a plant biologist.
[01:13:52.880 --> 01:13:53.840] And tell us, tell Kevin.
[01:13:54.080 --> 01:13:56.080] Well, I'm a molecular biologist by training.
[01:13:56.080 --> 01:14:07.280] I ended up working in plants, and we do a lot of work in genomics, mostly around flavors and aromas, but other major plant traits that are involved in traits that are important for farmers.
[01:14:07.280 --> 01:14:08.640] You're still a strawberry guy, though?
[01:14:08.880 --> 01:14:09.440] Not so much.
[01:14:09.440 --> 01:14:10.480] I'm out of strawberries now.
[01:14:10.880 --> 01:14:11.440] You don't have strawberries now?
[01:14:11.520 --> 01:14:11.840] No, no, no.
[01:14:11.920 --> 01:14:14.400] I wanted to get new strawberries from you from the corner.
[01:14:15.200 --> 01:14:17.280] I tried those strawberries on camera once.
[01:14:17.360 --> 01:14:17.760] That was amazing.
[01:14:18.240 --> 01:14:18.720] That's right.
[01:14:18.720 --> 01:14:19.120] That's right.
[01:14:19.440 --> 01:14:19.840] We did.
[01:14:20.160 --> 01:14:22.960] And all those strawberries went in the autoclave, unfortunately.
[01:14:22.960 --> 01:14:28.000] We created a fungus-resistant strawberry, but the industry wouldn't use it, so it's gone.
[01:14:28.480 --> 01:14:32.480] So, Kevin, you selectively bred these strawberries, right?
[01:14:32.880 --> 01:14:34.320] Just correct me when I'm wrong.
[01:14:34.320 --> 01:14:35.280] Well, you're wrong.
[01:14:36.240 --> 01:14:37.120] That's why I asked you.
[01:14:37.200 --> 01:14:37.680] Go ahead.
[01:14:37.680 --> 01:14:43.840] Yeah, these were a variety that already existed that we added a gene that would prime its immune system.
[01:14:43.840 --> 01:14:47.840] So even before a pathogen came along, it was ready to confront the pathogen.
[01:14:48.400 --> 01:14:50.800] And it was awesome because they didn't get as sick.
[01:14:51.120 --> 01:14:52.560] They would recover from disease.
[01:14:52.560 --> 01:14:53.360] It was great.
[01:14:53.360 --> 01:14:57.600] And as we know, fungicides or strawberries are fungicide-dependent crops.
[01:14:57.920 --> 01:15:06.480] And so it allowed us to potentially make something that would help the industry farm with fewer fungicides, which is great for the environment, great for farmers.
[01:15:06.480 --> 01:15:10.000] But there was a lukewarm feeling in the industry about it.
[01:15:10.480 --> 01:15:11.520] Because it was GM?
[01:15:11.520 --> 01:15:12.720] Because it was genetically homicide.
[01:15:15.200 --> 01:15:16.560] Was it a synthetic insertion?
[01:15:16.560 --> 01:15:18.880] Was it something from another organism?
[01:15:18.880 --> 01:15:20.160] It was a plant gene in a plant.
[01:15:20.160 --> 01:15:22.240] So it was a plant, it was from a rabbidopsis.
[01:15:22.560 --> 01:15:22.880] Cotton?
[01:15:23.120 --> 01:15:23.680] No, that's not cotton.
[01:15:23.760 --> 01:15:28.400] No, no, no rabidopsis is the little white lab mouse, a plants.
[01:15:26.440 --> 01:15:29.400] Yes, the little lab of plants, right?
[01:15:29.720 --> 01:15:31.880] And so we put that in, and it seemed to prime.
[01:15:31.880 --> 01:15:32.520] It did great.
[01:15:32.520 --> 01:15:34.120] The strawberry did wonderfully.
[01:15:29.280 --> 01:15:35.720] But it did have a yield hit.
[01:15:35.960 --> 01:15:40.360] So, in other words, you were resistant to disease, but you had a slight dip in yield.
[01:15:40.360 --> 01:15:43.960] And between that and the genetic engineering trait, they weren't so excited about that.
[01:15:44.200 --> 01:15:49.640] That's so interesting because a slight dip in yield, yes, I could see on paper that would be frightening to a farmer.
[01:15:49.640 --> 01:15:55.080] But when all of your crop gets taken by fungus, that's a big dip in yield.
[01:15:55.080 --> 01:15:56.120] So I guess they're willing to roll it up.
[01:15:56.440 --> 01:15:57.960] Long term, it might be positive.
[01:15:58.840 --> 01:16:00.120] But wait, way more importantly.
[01:16:00.440 --> 01:16:02.040] They were delicious.
[01:16:02.040 --> 01:16:07.960] And people need to, you know, people, I think, would be very responsive to it if they knew how good they were.
[01:16:07.960 --> 01:16:10.440] Yeah, but the base, that comes from the base hybrid.
[01:16:10.440 --> 01:16:14.600] The basic strawberry was so good that, you know, you had one more gene, you couldn't taste it.
[01:16:14.920 --> 01:16:16.600] It just purely was in the management.
[01:16:16.600 --> 01:16:20.840] It was allowing farmers to use less chemistry, which is really expensive to apply.
[01:16:20.840 --> 01:16:27.960] Right, so why you might not know this answer, but why would we still get these supermarket strawberries that don't have a lot of flavor?
[01:16:27.960 --> 01:16:31.800] Like, why aren't they just starting to grow these, even without the gene editing?
[01:16:31.800 --> 01:16:32.680] But that's changing.
[01:16:32.680 --> 01:16:40.200] And it's because over the last decade, my lab spent a lot of time identifying the genes that control the traits that consumers really like.
[01:16:40.200 --> 01:16:47.240] So we identified, we interviewed consumers, having them taste strawberries, and they tasted hundreds of different kinds of strawberries.
[01:16:47.240 --> 01:16:49.720] And then we went through and did principal component analysis.
[01:16:49.720 --> 01:16:56.200] So we took a strawberry, exploded it into its chemistry, and then said which ones always line up with consumer liking.
[01:16:56.200 --> 01:16:59.640] And there was always a list of 12 that consumers really liked.
[01:16:59.640 --> 01:17:04.200] You know, when the consumers liked it, one of those 12 or multiples of those 12 was available.
[01:17:04.200 --> 01:17:09.080] So then we found the genes that underlie those 12 volatile aroma traits.
[01:17:09.080 --> 01:17:19.440] And then once we had those genes nailed down, identified markers associated with them, so DNA signatures, that would then, as they were inherited, would allow us to get all of them in one place.
[01:17:20.160 --> 01:17:22.960] So that work is still ongoing by the strawberry breeding program.
[01:17:23.200 --> 01:17:24.080] And that's conventional breeding.
[01:17:24.400 --> 01:17:25.360] That's conventional breeding.
[01:17:25.360 --> 01:17:27.280] Yeah, you're not actually turning those genes on.
[01:17:27.280 --> 01:17:30.320] You're just finding cultivars that already have them and breeding them together.
[01:17:30.320 --> 01:17:30.640] That's right.
[01:17:30.960 --> 01:17:33.680] And the cool part is, this is all done in a seedling.
[01:17:33.680 --> 01:17:40.720] In the old days, we used to have to put out 10 acres of strawberries and taste them and run them through gas chromatography to find the ones that had it.
[01:17:40.720 --> 01:17:48.960] Now you take 384 seedlings in a petri dish, basically, do the assay, and then throw away the ones that don't have the markers.
[01:17:48.960 --> 01:17:51.520] So this work is still ongoing at the University of Florida.
[01:17:51.600 --> 01:17:53.840] And I'm a little bit separate from it these days.
[01:17:54.640 --> 01:17:55.600] Can I ask a quick question?
[01:17:55.600 --> 01:18:01.600] Like with this example, now that we have that kind of in our minds of how that works, could you just like CRISPR those things on?
[01:18:01.920 --> 01:18:03.600] Now that's more possible, yes.
[01:18:03.600 --> 01:18:05.040] And would that then be considered?
[01:18:05.040 --> 01:18:10.240] Obviously, it is genetic engineering, but would it be considered a genetically modified organism?
[01:18:10.240 --> 01:18:14.880] That would be only if we left around the hardware that did the edit.
[01:18:14.880 --> 01:18:23.040] So if you had Cas9, if you had the enzyme that does the little scissors trick, if that enzyme was engineered in to do the work, then it would be one thing.
[01:18:23.040 --> 01:18:31.760] But there's a lot of plants where you can engineer a single cell with CRISPR and do it in a single cell with the protein itself rather than have to install the genes.
[01:18:31.760 --> 01:18:35.920] So in other words, you just put in the hardware rather than have the cell make the hardware.
[01:18:35.920 --> 01:18:36.720] Oh my goodness.
[01:18:36.720 --> 01:18:38.800] This is so annoying that you have to worry about.
[01:18:38.800 --> 01:18:39.760] Oh yeah, totally.
[01:18:39.760 --> 01:18:46.160] But then you have that one cell and sometimes you can get that into two cells, five cells, ten cells, whatever, and then eventually having a whole plant from that one cell.
[01:18:46.160 --> 01:18:50.640] That one's foreign DNA-free, yet it contains the edit you're looking for.
[01:18:50.960 --> 01:18:53.120] That one is not a regulated article by the U.S.
[01:18:53.680 --> 01:18:57.280] genetically engineered, but not genetically modified in the U.S., is that correct?
[01:18:57.280 --> 01:19:00.920] Well, it's not a, as the USDA says, it's not a regulated article.
[01:18:59.920 --> 01:19:02.360] Yeah, it's such a bottom line.
[01:19:02.600 --> 01:19:11.560] Yeah, it just shows, it just goes to show that the fundamental choices that are made in legislation around this are so divorced from the science.
[01:19:11.560 --> 01:19:12.040] Yeah.
[01:19:12.280 --> 01:19:20.920] I mean, people, you know, have this perception that if they eat a genetically modified organism, that's going to do something to them, to their DNA, to their DNA and everything.
[01:19:20.920 --> 01:19:22.280] And again, it's misinformation.
[01:19:22.280 --> 01:19:23.240] It's disinformation.
[01:19:23.240 --> 01:19:24.200] It's big industry.
[01:19:24.680 --> 01:19:26.520] There's motivations behind all this stuff.
[01:19:26.680 --> 01:19:28.200] I want to get back to the hybrids.
[01:19:28.680 --> 01:19:39.240] So just as we were saying, that when you hybrid two plants together, the daughter plants have a certain mix of genes that, whatever, is a good crop.
[01:19:39.240 --> 01:19:45.320] But if you take those seeds and breed them together, you end up with a mix of genes that might not be what you want.
[01:19:45.320 --> 01:19:50.600] So, I know, before GMOs came out, like 95, 98% of crop seeds were hybrids.
[01:19:50.920 --> 01:19:51.640] Oh, cellar.
[01:19:51.960 --> 01:19:52.520] Cellar.
[01:19:52.520 --> 01:19:53.240] Yeah, it's still on.
[01:19:54.360 --> 01:19:58.360] You can't save your seeds and replant them, which is like an annoying thing about the whole anti-GMO thing.
[01:19:58.440 --> 01:19:59.240] You can't save seeds.
[01:19:59.400 --> 01:19:59.960] You never could.
[01:19:59.960 --> 01:20:01.640] You could never do that with the hybrids.
[01:20:01.640 --> 01:20:08.600] But now what you're saying is you can make a cultivar where the hybrid traits do go through to subsequent generations because...
[01:20:08.600 --> 01:20:12.600] So is that like in the seeds now, or is that you still have to do it every generation?
[01:20:12.920 --> 01:20:18.280] It is a, once you have the first plant, there's a process that you basically break meiosis, right?
[01:20:18.440 --> 01:20:25.240] So the segregation of alleles during, or of genetic complements during segregation of gametes.
[01:20:25.400 --> 01:20:26.760] I know I got to straighten that out.
[01:20:27.000 --> 01:20:27.880] He's talking about sex, yeah.
[01:20:28.200 --> 01:20:34.520] Basically, when you're making the pollen and the egg cells, you make sure that that doesn't reduce its gametes.
[01:20:34.520 --> 01:20:36.680] It doesn't go down with half the genetic complements.
[01:20:36.840 --> 01:20:37.240] Oh, interesting.
[01:20:37.480 --> 01:20:39.720] It passes the whole thing in one of the two.
[01:20:39.720 --> 01:20:41.720] So you're turning meiosis into mitosis.
[01:20:41.720 --> 01:20:43.880] You're basically turning meiosis into mitosis.
[01:20:43.880 --> 01:20:44.360] Interesting.
[01:20:44.360 --> 01:20:46.960] And that's passing down its genetic material.
[01:20:46.960 --> 01:20:47.200] Right.
[01:20:44.760 --> 01:20:50.720] So instead of reduction division, you're just dividing the cells.
[01:20:44.840 --> 01:20:51.120] Yeah.
[01:20:51.680 --> 01:20:55.760] And one of them gets the entire complement of the hybrid, the other one gets nothing.
[01:20:56.080 --> 01:20:58.160] And so, so, quick question, just to interject.
[01:20:58.400 --> 01:21:06.720] When a hybrid, in the before times, like you were talking about, which is still the now times, when a hybrid would breed with a hybrid, you might, it's like Punnett Squares, right?
[01:21:06.720 --> 01:21:09.680] Then you might get traits that were undesirable.
[01:21:09.680 --> 01:21:12.000] But are they always even viable?
[01:21:12.000 --> 01:21:15.120] Like, because I'm thinking about animal biology, which is where I come from.
[01:21:15.120 --> 01:21:17.600] And sometimes hybrids can't breed.
[01:21:17.600 --> 01:21:18.240] Right, that's true.
[01:21:18.240 --> 01:21:19.440] Does that happen in plants as well?
[01:21:19.440 --> 01:21:22.960] Yeah, there are examples where plants produce infertile offspring.
[01:21:22.960 --> 01:21:23.440] Interesting.
[01:21:23.440 --> 01:21:25.360] But that's usually a function of polyploidy.
[01:21:25.440 --> 01:21:37.680] So if you can, so like a seedless watermelon is a combination of one that has four times the genetic material bred against one that has the normal complement, and it comes out with 3x, and so that's where you get seedlessness.
[01:21:37.680 --> 01:21:39.200] Yeah, that's deliberate to make it seedless.
[01:21:39.200 --> 01:21:39.920] That's deliberate, yeah.
[01:21:39.920 --> 01:21:43.120] So you have to have the right female flowers in the field and the right male flowers.
[01:21:43.120 --> 01:21:43.600] Yeah, yeah.
[01:21:43.600 --> 01:21:44.480] Yeah, cool stuff.
[01:21:44.480 --> 01:21:48.320] So Kevin, there's a question that I've been wanting to ask you for a long time.
[01:21:48.320 --> 01:21:55.840] And I've been saying on the show a lot, you know, like everything we eat, all the food, all the meats, all this stuff, it's all been selectively bred, right?
[01:21:55.840 --> 01:21:58.000] And some of it has been genetically engineered.
[01:21:58.000 --> 01:22:03.040] So we have this limited spectrum of different types of fruits and vegetables and meats and everything.
[01:22:03.040 --> 01:22:07.840] But I'm curious to know, like, how broad could this flavor profile be?
[01:22:07.840 --> 01:22:14.480] Like, could there be a million more brand new fruits that don't taste like anything we've ever tasted before that just haven't been created yet?
[01:22:14.480 --> 01:22:15.200] Absolutely.
[01:22:15.200 --> 01:22:18.000] Yeah, there's so much volatile diversity out there.
[01:22:18.000 --> 01:22:21.360] And what you're looking at is a combination of two different things.
[01:22:21.360 --> 01:22:34.520] There's the volatiles that are out there, which are the major things that shape flavor along with acids and sugars and other aspects, where the tongue and the olfactory system collide in the brain, right, with all the different things it's sensing.
[01:22:34.840 --> 01:22:39.320] But there's so many unusual volatiles that are out there, and we find things all the time.
[01:22:39.320 --> 01:22:42.200] But also things that disrupt your ability to perceive them.
[01:22:42.200 --> 01:22:43.880] So things like miracle fruit.
[01:22:43.880 --> 01:22:48.360] You can eat a miracle fruit and then not be able to, and then when you taste acid, it tastes sweet.
[01:22:48.360 --> 01:22:52.040] So it's a weird sensory combination that's very interesting.
[01:22:52.040 --> 01:23:02.440] So to visualize it, this might be a hard thing to answer, but like we have like these little, you know, along the horizon, we have like a pineapple and watermelon, blah, blah, blah.
[01:23:02.440 --> 01:23:07.080] But literally, there could be like millions of fruit flavors that we've never tasted before.
[01:23:07.560 --> 01:23:09.880] Well, maybe you've tasted them, but you don't know you did.
[01:23:09.880 --> 01:23:17.000] Because what's happening in a strawberry, there are very small hints of the major flavor of peach and the major flavor of grape.
[01:23:17.000 --> 01:23:19.080] And you have to think of it like an orchestra.
[01:23:19.080 --> 01:23:24.760] All these things contribute just a tiny little bit that's barely perceptible unless you really know what you're looking for.
[01:23:24.760 --> 01:23:30.760] But we can use gas chromatography and other analytical methods, analytical chemistry methods, to be able to detect them.
[01:23:30.760 --> 01:23:34.840] And consumers notice when they're not there, even if they don't know why.
[01:23:35.640 --> 01:23:37.640] So maybe we can think about this in two ways.
[01:23:37.640 --> 01:23:47.320] One is, if you have all the fruity flavors that we could taste, right, then a fruit is some combination of those flavors, some subset, and some are more powerful than others.
[01:23:47.320 --> 01:23:50.840] So yes, strawberry has peach in it, but a peach has a lot of peach in it.
[01:23:50.840 --> 01:23:51.400] That's right.
[01:23:51.880 --> 01:24:05.720] But is there also potentially, like, you talk about the chemical space, you know, like, is there a space of flavors that nature hasn't necessarily fully explored all of those flavors, and maybe there is more room for even new flavors that we've never tasted before?
[01:24:05.880 --> 01:24:06.520] Can we even know?
[01:24:07.000 --> 01:24:07.720] Yeah, can we even know about that?
[01:24:07.880 --> 01:24:08.440] That's interesting.
[01:24:08.440 --> 01:24:09.480] I think there probably are.
[01:24:09.480 --> 01:24:29.840] I think if you start going into especially floral aromas where you're attracting pollinators and unusual pollinators, where you attract the enemies of herbivores, so you have plants that are being fed or are being fed upon, and then the plant will exude volatiles to attract the predator of the thing that's feeding on it.
[01:24:30.080 --> 01:24:31.680] It kind of calls in the troops.
[01:24:31.680 --> 01:24:38.560] But all these things may have some sort of flavor response in us, but we just haven't integrated them in the fruits and vegetables.
[01:24:38.560 --> 01:24:41.840] You know, there's the volatile ohm is expensive.
[01:24:42.480 --> 01:24:43.280] Yeah, yeah, right, exactly.
[01:24:43.440 --> 01:24:44.560] So remind me.
[01:24:44.880 --> 01:24:51.360] There may be flavors in plants that are delicious, but the plants are poisonous to humans.
[01:24:51.360 --> 01:24:51.840] Could be.
[01:24:51.840 --> 01:24:57.600] So we could make a non-poisonous version of them or get their genes into stuff we can't eat.
[01:24:57.600 --> 01:24:58.080] Absolutely.
[01:24:58.240 --> 01:25:02.320] And all kinds of other interesting stuff, not just flavors and aromas.
[01:25:02.320 --> 01:25:05.680] Look at like Sichuan peppercorns that have that numbing effect.
[01:25:05.920 --> 01:25:09.360] There's so many interesting sensory aspects of plant biology.
[01:25:09.360 --> 01:25:11.120] And I wish I still played in that field.
[01:25:11.280 --> 01:25:15.680] We really kind of moved out of it a bunch just because now it's in the hands of the breeders.
[01:25:15.680 --> 01:25:17.040] Now they've got to put it all together.
[01:25:17.040 --> 01:25:25.440] It reminds me, I was reading an article about medieval food and what it tasted like and the flavor profiles that they enjoyed.
[01:25:25.440 --> 01:25:28.320] If modern people had that, they would be like, what did I just eat?
[01:25:28.320 --> 01:25:29.280] That was horrible.
[01:25:29.280 --> 01:25:45.680] But it is making me think in the future that people of our future, our descendants might think, might feel bad for us because then, like, can you imagine they had so few things to eat, so few fruits, and they're enjoying, they have this cornucopia of tastes and flavors and volatiles that we just can't even imagine right now.
[01:25:45.680 --> 01:25:48.120] And you're just like, they would think of us.
[01:25:48.200 --> 01:25:50.560] Poor people that had horrible food.
[01:25:50.880 --> 01:25:51.520] Tainted.
[01:25:51.520 --> 01:25:54.640] People can be tainted by these artificial flavors that are out there.
[01:25:54.640 --> 01:25:57.920] They think they know what a strawberry tastes like because they taste enough artificial strawberry.
[01:25:57.920 --> 01:26:01.160] They don't really know what a strawberry really is supposed to taste like.
[01:26:01.800 --> 01:26:08.840] But isn't so much a food science understanding the basic interests of the consumer.
[01:26:08.840 --> 01:26:11.480] And consumers like fat, they like sugar.
[01:26:11.480 --> 01:26:16.120] They like things that historically, evolutionarily, were rare.
[01:26:16.120 --> 01:26:21.160] And if we can make healthy food taste that way, then we can have the best of both worlds.
[01:26:21.160 --> 01:26:22.360] And I'm down for that.
[01:26:22.360 --> 01:26:27.080] Yeah, well, that was one of the big issues: could you find volatiles that replace sweetness?
[01:26:27.400 --> 01:26:28.600] And you can.
[01:26:28.760 --> 01:26:34.440] That's been some work by Linda Bardaschuk in our institution, a guy named Thomas Calhoun.
[01:26:34.440 --> 01:26:40.040] They were looking at the flavor volatiles that made people sense sweetness in the absence of sugar.
[01:26:40.040 --> 01:26:40.280] Wow.
[01:26:40.520 --> 01:26:44.120] And so you would take two glasses of water, put in a tablespoon of sugar, and mix it.
[01:26:44.120 --> 01:26:49.400] In one of them, you would add some of these fruity volatiles, and people would say the one with the volatiles was sweeter.
[01:26:49.400 --> 01:26:49.800] Smarter.
[01:26:50.040 --> 01:26:51.640] Even if they couldn't perceive the volatiles.
[01:26:51.960 --> 01:26:57.720] But it's incredible when you think of like, you know, someone who enjoys cooking a lot of Italian food, right?
[01:26:57.720 --> 01:27:04.040] You know, onions and garlic, these are like the bases of so many different things in that, right?
[01:27:04.040 --> 01:27:11.560] And then, like, you can go to India now, and like the basis of their flavors, there's a lot of curries, which are constructed flavors anyway, right?
[01:27:11.560 --> 01:27:28.760] Like, the complexity is so broad that could you imagine if there was like another 50 culture versions of food that we just have no idea what it tastes like, but it's partly because we don't have the vegetables and the big beginnings to start those all those completely different types of foods that could be out there, you know?
[01:27:29.320 --> 01:27:34.040] I just love that because I just think, like Bob said, in the future, people might have access to this stuff.
[01:27:34.040 --> 01:27:41.080] They might be able to genetically engineer something like just by talking to an AI and then grow it and be like, oh my God, I made a new freaking fruit.
[01:27:41.480 --> 01:27:42.680] No one's ever tasted this before.
[01:27:43.080 --> 01:27:43.480] That's cool.
[01:27:43.880 --> 01:27:44.960] Are they using AI in?
[01:27:45.920 --> 01:27:46.800] Yeah, they are using this.
[01:27:46.800 --> 01:27:49.360] And mostly that's happening with the breeders now.
[01:27:49.360 --> 01:27:50.000] That's not anything.
[01:27:50.320 --> 01:27:52.480] My lab is working on the opposite of AI.
[01:27:53.040 --> 01:27:54.800] We decided everyone's going to AI.
[01:27:54.800 --> 01:27:58.640] We're just going to go to kind of ignorant randomness.
[01:27:58.640 --> 01:28:01.360] And so, what we decided to do, this is the coolest new science.
[01:28:01.360 --> 01:28:07.840] We were taking, we always wanted to take an organism and just put random DNA in it and see what comes from it.
[01:28:08.080 --> 01:28:09.280] And what do you mean by random DNA?
[01:28:09.440 --> 01:28:09.840] Random DNA.
[01:28:10.000 --> 01:28:12.080] So random genes, or like literally random sequence?
[01:28:12.240 --> 01:28:13.120] A random sequence.
[01:28:13.120 --> 01:28:14.000] So it's not even a gene.
[01:28:14.160 --> 01:28:15.040] It's not even a gene.
[01:28:16.160 --> 01:28:21.040] It encodes a protein product, a peptide, that has never existed in the universe before.
[01:28:21.200 --> 01:28:28.480] So instead of sort of having a hypothesis and going top-down theoretically, you're just like going bottom up and going, what happens if we just mix this stuff up?
[01:28:28.800 --> 01:28:30.080] It's just throwing a whole bunch of shit against a skin.
[01:28:30.160 --> 01:28:30.720] Yeah, see what sticks.
[01:28:31.920 --> 01:28:34.640] I always liken it to throwing monkey wrenches into the machine.
[01:28:34.880 --> 01:28:38.160] That you're standing next to this elaborate machine throwing monkey wrenches in.
[01:28:38.160 --> 01:28:39.440] Most of the time they don't do anything.
[01:28:39.760 --> 01:28:42.160] But once in a while, one sticks in the gears.
[01:28:42.160 --> 01:28:51.760] And so this is the coolest thing that we've been doing because we've been showing in bacteria as well as plants that we can identify new vulnerabilities for lethality by using random peptides.
[01:28:51.760 --> 01:29:01.840] And the random peptides stick in either in different places or the random RNAs suppress RNA that's required for certain developmental transitions.
[01:29:01.840 --> 01:29:10.400] And so why this is so cool is because we're not going to create a new herbicide or a new antibiotic by creating a peptide or a mimic of that peptide.
[01:29:10.400 --> 01:29:14.320] But what we will expose is a new vulnerability we didn't know about before.
[01:29:14.640 --> 01:29:19.280] And then have smart people who design molecules make something that fits that vulnerability.
[01:29:19.280 --> 01:29:24.280] Is this like a new kind of mutation farming where again you're just trying to make random shit happen to see if anything good comes of it?
[01:29:24.000 --> 01:29:24.840] Is it similar?
[01:29:25.120 --> 01:29:25.920] I think it's even weirder.
[01:29:26.400 --> 01:29:27.040] Even weirder.
[01:29:27.040 --> 01:29:32.280] Yeah, this is just, this is pure, just, and you have to have huge populations to be able to do this.
[01:29:32.600 --> 01:29:38.360] But still, with large populations, we find that lethality in about 5%, which is pretty amazing.
[01:29:38.360 --> 01:29:40.200] It's much higher than we would have predicted.
[01:29:40.200 --> 01:29:41.560] What plants are you doing this in?
[01:29:41.560 --> 01:29:44.840] We're doing this in a rabbidopsis, which is the laboratory plant.
[01:29:44.840 --> 01:29:52.600] But it also works in bacteria that we can disrupt very well-characterized bacterial processes with randomness.
[01:29:53.560 --> 01:29:57.160] How do you feel the GMO attitudes are out there in the country?
[01:29:57.800 --> 01:29:58.840] Have we made any progress?
[01:29:58.840 --> 01:29:59.880] Are things getting better?
[01:30:00.840 --> 01:30:01.960] Do we just stop talking about it?
[01:30:01.960 --> 01:30:02.600] What's going on?
[01:30:02.600 --> 01:30:04.040] We have absolutely made progress.
[01:30:04.520 --> 01:30:10.120] And I teach classes now on, I teach a class on critical thinking in agriculture and medicine.
[01:30:10.200 --> 01:30:12.760] I designed this course, and it is so much fun.
[01:30:12.760 --> 01:30:21.400] And we talk about all the different ways in which we are deceived, deceived ourselves, cognitive bias, statistical deception, all the things like that that you guys talk about every week.
[01:30:21.400 --> 01:30:22.360] It's fantastic.
[01:30:22.360 --> 01:30:23.880] We talk about alternative medicine.
[01:30:23.880 --> 01:30:25.800] Then we talk about genetic engineering.
[01:30:25.800 --> 01:30:28.440] And now when I talk about GMOs, everybody kind of glazes over.
[01:30:28.600 --> 01:30:29.480] Nobody cares.
[01:30:30.120 --> 01:30:32.920] None of the students are thinking it's a threat or a problem.
[01:30:32.920 --> 01:30:35.560] There's no problem to solve in that room.
[01:30:35.560 --> 01:30:39.640] But then it turns into how do we solve the problem that's still out there in the public?
[01:30:39.960 --> 01:30:45.720] And how do I deputize this room of 30 students to engage their skeptical friends?
[01:30:45.720 --> 01:30:48.200] And how do I get them to engage online?
[01:30:48.200 --> 01:30:50.840] We're in social media where this stuff runs rampant.
[01:30:50.840 --> 01:30:52.520] And how do we do it effectively?
[01:30:52.520 --> 01:31:02.520] And so that's where I've really just taken such a turn away from biology science into much more sociology and psychology and understand how we can be better persuaders.
[01:31:03.240 --> 01:31:06.600] And then I think that's been just the magic in the last 10 years.
[01:31:06.920 --> 01:31:07.640] I mean, absolutely.
[01:31:07.640 --> 01:31:08.520] Obviously, this is what we do.
[01:31:09.000 --> 01:31:17.760] Thinking very carefully about how do we persuade the j people individually, the general population, the people who matter in terms of regulators and whatnot.
[01:31:14.680 --> 01:31:19.760] And it's just tricky, and each topic is different.
[01:31:19.920 --> 01:31:24.480] But I think, yeah, the GMOs might, I also think that we've been moving the needle on that.
[01:31:24.480 --> 01:31:28.640] I do think because this topic is amenable to information.
[01:31:28.640 --> 01:31:32.720] You know, there was just, we are just combating misinformation and it's very correctable.
[01:31:32.720 --> 01:31:47.040] I also think, and this is just now just my gut feeling, is that a lot of the pushback against the GMOs, the anti-GMO attitude, was just an unfamiliarity with the technology and just a disgust kind of reaction to it.
[01:31:47.040 --> 01:31:56.000] I think, this is my hope, my thought and my hope, that much like IVF in vitro fertilization, remember the test two babies and all the protests and everything?
[01:31:56.000 --> 01:31:57.840] And now nobody cares, right?
[01:31:57.840 --> 01:31:59.440] I'm just hoping that the same thing is going to happen.
[01:31:59.440 --> 01:32:01.520] Like genetically ensuring everything is genetically engineered.
[01:32:01.520 --> 01:32:02.640] Who cares, right?
[01:32:02.640 --> 01:32:04.160] Do you think that we're heading in that direction?
[01:32:04.480 --> 01:32:05.440] I agree 100%.
[01:32:05.440 --> 01:32:15.040] I also think there's a lot of disaster fatigue that we've been told, well, you're going to get lumpy and things are going to fall off and all the problems that we're going to have never materialized.
[01:32:15.200 --> 01:32:15.680] Never happened.
[01:32:16.000 --> 01:32:18.720] And I think that has a big role in that too.
[01:32:18.720 --> 01:32:30.720] And so now when we can remind people of what they said, of what the opponents said, and then we can show the progress of where it's going and we can show all the beautiful things that we could be doing, that people do change their minds.
[01:32:31.040 --> 01:32:32.720] And you can kind of persuade.
[01:32:33.040 --> 01:32:38.160] There still is a rather vigorous anti-GMO movement out there that if you look for it, you can find it.
[01:32:38.160 --> 01:32:39.120] Oh, yeah, no question.
[01:32:39.120 --> 01:32:41.120] But it is changing.
[01:32:41.120 --> 01:32:48.400] And the big problem is that so many of the really good innovations that we have still haven't hit the road.
[01:32:48.640 --> 01:32:50.800] And that rubber hasn't hit the road.
[01:32:51.280 --> 01:32:54.000] And so all the, you know, we don't have golden rice yet.
[01:32:54.040 --> 01:32:59.520] You know, we don't have the bananas, the soybeans, all the other stuff that could solve vitamin A deficiency.
[01:33:01.000 --> 01:33:15.080] But also, it's like when the solutions feel like they're happening in the background to solve ever-increasing problem problems, like, oh, here's a solution to solve a blight that you never would have even known was there because we got out in front of it before it devastated the crop.
[01:33:15.080 --> 01:33:18.040] People don't recognize all of the progress.
[01:33:18.040 --> 01:33:22.280] They only recognize the progress when there's like a fundamental or radical change.
[01:33:22.280 --> 01:33:30.040] And so I think that's always a problem: is that in science, so much of what we have to do is to try to prevent devastation.
[01:33:30.040 --> 01:33:32.600] And then people don't recognize that we prevented the devastation.
[01:33:32.840 --> 01:33:34.520] If we stopped COVID from happening, nobody would know.
[01:33:34.920 --> 01:33:35.160] Exactly.
[01:33:35.720 --> 01:33:37.240] That's my job as a producer.
[01:33:37.240 --> 01:33:37.480] Right.
[01:33:37.720 --> 01:33:41.080] If you did your job well, nobody knows that you did it or that it was hard.
[01:33:41.720 --> 01:33:44.280] So I think that's crazy.
[01:33:44.840 --> 01:33:45.880] And I agree with you.
[01:33:45.880 --> 01:33:51.240] And I think we were in a situation where we fight hard, many fronts.
[01:33:51.240 --> 01:33:56.200] And GMOs, I think it's ironic when there was, what was it, Steve?
[01:33:56.200 --> 01:34:01.240] Was it papaya or was it when Hawaii had a papaya blight?
[01:34:01.240 --> 01:34:04.440] They just were like, we're going to plant the GMOs and we're just not going to be able to do it.
[01:34:04.600 --> 01:34:08.040] We're going to finally ignore the papaya because otherwise the papaya industry should go away.
[01:34:08.600 --> 01:34:09.560] I like that example.
[01:34:09.560 --> 01:34:10.920] I also use the cheese example.
[01:34:10.920 --> 01:34:14.760] It's like, there would be no cheese industry without GMO rennet.
[01:34:15.240 --> 01:34:15.720] Forget about it.
[01:34:16.200 --> 01:34:17.160] But you know what I think, though?
[01:34:17.160 --> 01:34:19.480] I think the big one is going to be chocolate.
[01:34:19.720 --> 01:34:21.800] Because there is a child blood problem.
[01:34:22.520 --> 01:34:22.840] You know this?
[01:34:23.160 --> 01:34:25.400] If GMOs save chocolate, then we've won.
[01:34:25.640 --> 01:34:26.360] No, but that's what I'm saying.
[01:34:26.440 --> 01:34:26.920] No, I know.
[01:34:27.240 --> 01:34:27.720] It's funny.
[01:34:27.720 --> 01:34:31.240] It's funny, but I really do think, imagine if chocolate went away for a little while.
[01:34:31.320 --> 01:34:32.520] They'll go, we could bring it back.
[01:34:32.520 --> 01:34:33.320] We have GMO.
[01:34:33.320 --> 01:34:33.960] We'll bring it back.
[01:34:33.960 --> 01:34:34.920] You'll have it in six months.
[01:34:34.920 --> 01:34:35.240] People are going to be able to do it.
[01:34:35.400 --> 01:34:37.040] That's like the American chestnut tree.
[01:34:37.000 --> 01:34:38.160] Yeah, that's right.
[01:34:38.600 --> 01:34:39.480] But yeah, I agree.
[01:34:39.640 --> 01:35:05.680] I do think the anti-GMO crowd, panicking a little bit about golden rice and some of these applications, because I think they know that if we solve vitamin A deficiency or make a huge improvement with a GMO food that breaks all of their propaganda, it's not patented, it's not controlled by Monsanto or some big corporation, it's free to farmers, they can plant their own seeds, whatever.
[01:35:06.000 --> 01:35:10.080] None of your boogeymen is true, and it's going to save in blankets.
[01:35:10.320 --> 01:35:16.960] It's like a perfect PR, superstorm pro-GMO against the whole anti-German.
[01:35:17.040 --> 01:35:18.240] I've never thought of it that way, Steve.
[01:35:18.240 --> 01:35:18.640] You're right.
[01:35:18.880 --> 01:35:19.680] They're panicking about it.
[01:35:19.760 --> 01:35:22.160] That's why they're so desperate for it not to come to market.
[01:35:22.480 --> 01:35:33.760] It would also help, I think, a lot of the people who are on the fence or who are confused understand the difference between genetically modified organisms and corporate practices.
[01:35:33.760 --> 01:35:34.080] Right.
[01:35:34.080 --> 01:35:35.840] Because very often they conflate the two.
[01:35:36.320 --> 01:35:36.960] 100%.
[01:35:37.360 --> 01:35:40.400] Almost always when I'm talking to a skeptic who's anti-GMO, it's that sort of thing.
[01:35:40.480 --> 01:35:41.440] Yeah, it's corporate stuff that they're doing.
[01:35:41.600 --> 01:35:44.160] It's like it's really, I don't think corporations should have that much power.
[01:35:44.160 --> 01:35:45.040] Well, they already do.
[01:35:45.040 --> 01:35:46.800] Yeah, that's not what we're talking about.
[01:35:46.800 --> 01:35:47.760] Yeah, that's what we're talking about.
[01:35:47.760 --> 01:35:50.160] Is the anti-GMO lobby waning?
[01:35:50.480 --> 01:35:54.720] Yeah, I think so, especially because gene editing has been so democratizing.
[01:35:54.720 --> 01:36:05.600] So going back to this CRISPR Cas9, the ability to change a letter or two, that technology has been very in the hands of universities and small companies.
[01:36:05.600 --> 01:36:07.360] And it's just a different feel.
[01:36:07.360 --> 01:36:08.800] Small governments can do it.
[01:36:08.800 --> 01:36:09.920] Everybody can do it.
[01:36:09.920 --> 01:36:16.800] And with that kind of ability, it really changes the dynamic of who can bring a product to market.
[01:36:16.800 --> 01:36:29.120] The traditional transgenic approach, what we usually think about as genetically modified, the fact that it was so regulated and that people pushed for more regulation meant that only a couple of companies had the ability to do it.
[01:36:29.200 --> 01:36:31.880] It was like three or four companies, right, that were putting out all the GMOs.
[01:36:29.520 --> 01:36:34.120] And those three or four companies said, you know what?
[01:36:34.120 --> 01:36:35.640] Make the process harder.
[01:36:29.840 --> 01:36:37.000] Make it harder.
[01:36:37.160 --> 01:36:41.880] Because if you make that process harder, it gives us exclusivity in that space.
[01:36:41.880 --> 01:36:48.440] And so the anti-GMO folks who were out there who said we hate these companies were doing nothing more than empowering the companies they hated.
[01:36:48.760 --> 01:36:49.640] Hilarious.
[01:36:50.120 --> 01:36:52.280] Yeah, that these companies were like, yeah, keep it up, guys.
[01:36:53.080 --> 01:36:54.360] And they were protests.
[01:36:54.600 --> 01:36:56.440] They were actually really doing the devil's thing.
[01:36:57.160 --> 01:36:58.600] Yeah, regulate the hell out of it, right?
[01:36:58.600 --> 01:37:00.600] Because then nobody can compete with us, right?
[01:37:00.600 --> 01:37:01.080] Yeah.
[01:37:01.080 --> 01:37:03.080] Anything else you want us to talk about that we didn't get to?
[01:37:03.080 --> 01:37:04.280] No, I think that's pretty much it.
[01:37:04.600 --> 01:37:12.040] The one thing that it may be to mention is people do need to be participating in these conversations still, and that it's not a dead issue.
[01:37:12.040 --> 01:37:13.000] We can't get complacent.
[01:37:13.800 --> 01:37:14.680] It'll come right back.
[01:37:15.000 --> 01:37:17.320] It has the possibility to come back.
[01:37:17.320 --> 01:37:24.520] And now they're fighting things like appeal, you know, this coding that you put on fruit to make it last longer, saying that it's Bill Gates poison, you know.
[01:37:25.000 --> 01:37:25.560] Poor Bill Gates.
[01:37:26.680 --> 01:37:28.440] I never give away billions of dollars.
[01:37:30.760 --> 01:37:33.800] But all these things are still being discussed.
[01:37:33.800 --> 01:37:39.560] But the bottom line is, they limit how technology can reach the poorest people on the planet.
[01:37:39.560 --> 01:37:41.560] The affluent are not missing meals.
[01:37:41.960 --> 01:37:50.680] And so being active in this and remembering who we're really trying to help here and making food last longer, make it taste better, all that stuff, that's what we have to be doing.
[01:37:50.680 --> 01:37:51.560] We've got to keep on it.
[01:37:51.880 --> 01:37:53.240] Well, thank you for your service.
[01:37:53.880 --> 01:37:55.240] I appreciate it, Kevin.
[01:37:55.400 --> 01:37:56.840] Doing the good work.
[01:37:59.720 --> 01:38:04.760] It's time for science or fiction.
[01:38:09.560 --> 01:38:14.360] Each week I come up with three science news items or facts: two real and one fake.
[01:38:14.440 --> 01:38:19.200] Then I challenge my panel of skeptics to tell me which one is the fake.
[01:38:14.840 --> 01:38:20.720] We have a theme this week.
[01:38:21.040 --> 01:38:23.440] The theme is U.S.
[01:38:23.440 --> 01:38:24.400] trivia.
[01:38:24.480 --> 01:38:27.520] That's randomoid facts about the U.S.
[01:38:27.760 --> 01:38:28.400] Random.
[01:38:28.400 --> 01:38:29.360] All right, here we go.
[01:38:30.000 --> 01:38:40.480] Kansas is not only the flattest state in the U.S., it is literally flatter than a pancake with a flatness score of 0.9997.
[01:38:40.800 --> 01:38:48.720] I number two, the coastline of Alaska is longer than the coastlines of all the other 49 states combined.
[01:38:48.720 --> 01:38:56.000] And item number three, the first telephone directory in the world was published in New Haven, Connecticut in 1878.
[01:38:56.000 --> 01:39:00.160] The names were not alphabetized and there were no phone numbers included.
[01:39:00.160 --> 01:39:01.040] Evan, go first.
[01:39:01.040 --> 01:39:06.720] Well, Kansas, yes, that's the name of a star, according to a movie I once watched.
[01:39:06.720 --> 01:39:07.920] Wizard of Oz, anyone?
[01:39:07.920 --> 01:39:08.400] No?
[01:39:09.280 --> 01:39:12.080] Literally flatter than a pancake.
[01:39:12.480 --> 01:39:14.320] Wow, that's flat.
[01:39:14.320 --> 01:39:15.920] That's pretty flat.
[01:39:15.920 --> 01:39:20.560] Although, well, I didn't even know there was this thing called a flatness score, frankly.
[01:39:20.880 --> 01:39:28.160] Even in these flat, flat states, you know, you are going to have some elevation differences, right?
[01:39:29.120 --> 01:39:32.800] So saying it's flatter than the pancake, I have no, really can't say.
[01:39:32.960 --> 01:39:38.320] The coastline of Alaska, number two, longer than the coastlines of all the other 49 states combined.
[01:39:38.320 --> 01:39:41.840] Okay, so the thing you got to remember about Alaska, Alaska's big.
[01:39:42.160 --> 01:39:45.040] It is the largest state for area.
[01:39:45.040 --> 01:39:48.880] And also, Alaska's got all these little islands among so many other things.
[01:39:48.840 --> 01:39:52.800] So and that coastline is a kind of a jagged maze.
[01:39:52.800 --> 01:39:57.040] So could it be could it be longer than the rest of all the coastlines?
[01:39:57.040 --> 01:39:58.000] I think it could be.
[01:39:58.000 --> 01:40:01.000] I have a feeling that one's going to wind up being science.
[01:40:01.320 --> 01:40:09.480] And the last one about the first telephone directory in New Haven, Connecticut, not far from here, 1878.
[01:40:09.480 --> 01:40:14.040] Names were not alphabetized and there were no phone numbers included.
[01:40:14.040 --> 01:40:16.920] I have a feeling that one is also science.
[01:40:16.920 --> 01:40:21.560] Therefore, I'm dubious about the whole pancake flatness Kansas thing.
[01:40:21.560 --> 01:40:23.400] That one, I think, is fiction.
[01:40:23.400 --> 01:40:24.200] Okay, Bob.
[01:40:24.360 --> 01:40:26.440] All right, Kansas, not the flattest state.
[01:40:26.440 --> 01:40:27.960] Oh, it is a flattest.
[01:40:28.200 --> 01:40:30.200] Yeah, I could see it being flatter than a pancake.
[01:40:30.200 --> 01:40:39.640] I mean, pancakes generally are a little domed, you know, because just the nature of how you just spread it, it just glop it on, and then it spreads, right?
[01:40:39.640 --> 01:40:45.320] It's got a viscosity, it spreads, and, you know, the middle is never going to spread quite as much as the outer edges.
[01:40:45.320 --> 01:40:48.120] It just seems to me that that's not that big of a deal.
[01:40:48.440 --> 01:40:50.040] The Alaska one, that's tough.
[01:40:50.280 --> 01:40:58.360] I assume that when comparing the coastline of Alaska and the rest of the United States, that they use the same fractal dimension for all of those measurements.
[01:40:59.480 --> 01:41:00.280] I'll make that assumption.
[01:41:00.600 --> 01:41:01.480] We will make that assumption.
[01:41:01.640 --> 01:41:02.280] You have to.
[01:41:02.280 --> 01:41:09.960] Because then you could say Florida has more coastline than every other country on the planet if you mess around with your fractal dimensions.
[01:41:09.960 --> 01:41:17.560] Thinking about it, it seems like it would be close, but I think there's kind of like lots of little ins and outs of Alaska.
[01:41:17.560 --> 01:41:20.280] So it might sneak by and actually be a little bit long.
[01:41:20.760 --> 01:41:25.960] I don't think it's by a dramatic amount, but so that would mean this last, I don't know what to make of this stupid.
[01:41:26.120 --> 01:41:29.240] So it's a phone directory, not in alphabetical order.
[01:41:29.240 --> 01:41:30.840] And oh, yeah, it doesn't have phone numbers.
[01:41:30.840 --> 01:41:31.480] What the hell?
[01:41:31.480 --> 01:41:35.400] Why, then, then it's not, then it's not a phone directory, a telephone directory.
[01:41:35.400 --> 01:41:37.880] I don't know what you would call it, but not a telephone directory.
[01:41:37.880 --> 01:41:39.640] So I don't know what the hell's going on with that.
[01:41:39.640 --> 01:41:42.040] I'm just going to say Alaska fiction.
[01:41:42.040 --> 01:41:43.000] Okay, Jay.
[01:41:43.000 --> 01:41:47.680] Yeah, so going backwards, I mean, you live close to New Haven.
[01:41:44.840 --> 01:41:48.480] You know, we both do.
[01:41:48.800 --> 01:41:57.040] I never heard of this, but this is one of those news items where I'm like, I don't see any reason why, you know, the first telephone directory wasn't published in New Haven.
[01:41:57.040 --> 01:41:57.280] Okay.
[01:41:57.280 --> 01:42:00.320] I mean, there's nothing really there that's making me go, hey, you know.
[01:42:00.320 --> 01:42:04.720] Now, let's keep in mind about item number two here: that Alaska is freaking huge.
[01:42:04.720 --> 01:42:11.520] And most of the times that you see a picture of it, it isn't relative size, not the actual size compared to the other things on the map.
[01:42:11.520 --> 01:42:12.080] That's true.
[01:42:12.080 --> 01:42:14.720] The Mercator projection will explain.
[01:42:14.800 --> 01:42:19.440] And, you know, it's got tons and tons of jigs and jags and islands and blah, blah, blah.
[01:42:19.520 --> 01:42:20.640] You know, it just like adds up.
[01:42:20.640 --> 01:42:22.960] So I, you know, I think that's probably true.
[01:42:22.960 --> 01:42:27.280] I mean, I can't imagine, I can imagine easily that it has more coastline just because of all the shapes.
[01:42:27.520 --> 01:42:30.240] You know, you look down California, you know, relatively straight.
[01:42:30.560 --> 01:42:32.480] Look down the East Coast, relatively straight.
[01:42:32.480 --> 01:42:34.720] You know, Alaska's just got so much busy going on.
[01:42:34.720 --> 01:42:37.520] So that leaves me to the first one here.
[01:42:37.520 --> 01:42:43.680] I wonder if they took into account the curvature of the earth when they said how flat Kansas is.
[01:42:43.680 --> 01:42:45.840] But it could be in like a basin, I guess.
[01:42:46.160 --> 01:42:50.160] And I have no reason to not believe that it's wicked flat.
[01:42:50.160 --> 01:42:55.520] I don't know about the number that Steve said, but I do not think that Kansas is the flattest state.
[01:42:55.520 --> 01:42:57.040] I just don't think it is.
[01:42:57.040 --> 01:43:01.200] I have reasons to believe this, and I will say that this one is a fiction.
[01:43:01.200 --> 01:43:02.160] And Kara.
[01:43:02.160 --> 01:43:06.320] Yeah, I got to go with, is that just Jay or is that Jay and Evan?
[01:43:06.320 --> 01:43:14.400] I got to go with Jay and Evan on this because I seem to remember living in Florida and Florida being really damn flat, like really flat.
[01:43:14.400 --> 01:43:18.800] And I don't know, maybe Kansas is flatter, but I bet you Florida gives it a run for its money.
[01:43:18.800 --> 01:43:20.160] I buy the Alaska one.
[01:43:20.160 --> 01:43:23.440] I think we have to remember that also a lot of the U.S.
[01:43:23.440 --> 01:43:25.280] is actually landlocked.
[01:43:25.280 --> 01:43:27.840] So, like, we're not just coastline all the way around.
[01:43:27.840 --> 01:43:31.160] I'm with Bob on the whole, like, why is it called a telephone directory?
[01:43:31.160 --> 01:43:34.680] But my assumption is 1878, did they not have phone numbers?
[01:43:34.680 --> 01:43:37.000] Was it like telegrams or something?
[01:43:37.000 --> 01:43:37.640] What was before?
[01:43:37.640 --> 01:43:38.200] Phones?
[01:43:38.200 --> 01:43:39.080] Telegraphs?
[01:43:39.080 --> 01:43:39.480] Smoke signal.
[01:43:39.640 --> 01:43:40.040] Telegraphs.
[01:43:40.360 --> 01:43:41.400] Yeah, it was a smoke signal directory.
[01:43:41.560 --> 01:43:41.880] Telegraph.
[01:43:42.120 --> 01:43:43.000] So maybe it was something like that.
[01:43:43.000 --> 01:43:44.440] And they just.
[01:43:44.440 --> 01:43:49.400] But I think I'm going to go with Eben and Jay and say that Kansas is not the flattest state in the U.S.
[01:43:49.560 --> 01:43:51.640] I have no idea if it's flatter than a pancake, though.
[01:43:51.640 --> 01:43:53.560] All right, so you all agree on the third one.
[01:43:53.560 --> 01:43:54.280] So we'll start there.
[01:43:54.280 --> 01:43:59.000] The first telephone directory in the world was published in New Haven, Connecticut in 1878.
[01:43:59.000 --> 01:44:03.080] The names were not alphabetized, and there were no phone numbers included.
[01:44:03.080 --> 01:44:08.280] You guys all think this one is science, and this one is science.
[01:44:08.280 --> 01:44:14.200] And yes, the telephone directory was a telephone directory, not some other kind of directory that they just called a telephone directory.
[01:44:14.200 --> 01:44:16.040] Do you know why there were no phone numbers listed?
[01:44:16.200 --> 01:44:16.840] Because operators.
[01:44:17.000 --> 01:44:18.600] There were only like eight names or something.
[01:44:19.240 --> 01:44:20.360] Yeah, you would just call the operator.
[01:44:20.600 --> 01:44:21.720] Phone numbers did not exist.
[01:44:21.720 --> 01:44:22.600] Yeah, you would call up the data.
[01:44:23.080 --> 01:44:26.920] You would just pick up a candle and talk to the connection.
[01:44:27.480 --> 01:44:28.600] Give me Joe Bag of Donuts.
[01:44:29.000 --> 01:44:34.680] And then the operator would know who he was and would literally just connect you to that person.
[01:44:34.680 --> 01:44:35.320] That's right.
[01:44:35.320 --> 01:44:37.560] Guess how many names were included?
[01:44:38.120 --> 01:44:38.520] 1870.
[01:44:38.680 --> 01:44:40.360] I think it was like eight or nine.
[01:44:40.920 --> 01:44:41.560] 50.
[01:44:41.640 --> 01:44:41.880] 50.
[01:44:42.200 --> 01:44:42.440] 50.
[01:44:42.680 --> 01:44:43.400] 5-0.
[01:44:43.400 --> 01:44:48.520] Yeah, so that was basically all the businesses and people who had phones in the New Haven area.
[01:44:48.840 --> 01:44:49.320] How many?
[01:44:49.320 --> 01:44:49.560] Yeah.
[01:44:49.560 --> 01:44:52.840] And you'd call the switchboards, they connect me to this person.
[01:44:53.160 --> 01:44:59.240] And then, you know, a few years later, when they had more people, they said, you know, we should probably put these names in alphabetical order.
[01:44:59.240 --> 01:45:07.480] And why don't we just assign numbers to them in case we have somebody like, you know, there's an operator who's who's covering who doesn't know everyone's name?
[01:45:07.480 --> 01:45:09.160] You know, like that was the next step.
[01:45:09.160 --> 01:45:11.640] And then that's when phone numbers were invented.
[01:45:11.640 --> 01:45:13.320] But yeah, first one in New Haven, Connecticut.
[01:45:13.320 --> 01:45:13.640] Cool.
[01:45:13.640 --> 01:45:15.280] All right, I guess we'll go backwards.
[01:45:14.680 --> 01:45:19.600] The coastline of Alaska is longer than the coastlines of all the other 49 states combined.
[01:45:19.680 --> 01:45:21.360] Bob, you think this one is the fiction?
[01:45:21.360 --> 01:45:23.280] Everyone else thinks this one is science.
[01:45:23.280 --> 01:45:26.400] And this one is science.
[01:45:26.400 --> 01:45:27.120] This is science.
[01:45:27.120 --> 01:45:27.920] Sorry, Bob.
[01:45:27.920 --> 01:45:30.000] So what about Mercator distortion?
[01:45:30.640 --> 01:45:30.960] Right?
[01:45:30.960 --> 01:45:38.240] Because if you look at a flat map, if you yeah, Alaska.
[01:45:38.800 --> 01:45:43.280] It's not as big as it looks on the Mercator projection, but it's still big.
[01:45:43.280 --> 01:45:45.920] But it is really because of what some of you said.
[01:45:45.920 --> 01:45:57.600] It's got so many jigs and jags and islands that it just, and there's like that one huge, you know, panhandle thing at the whisker thing at the bottom of Alaska.
[01:45:57.600 --> 01:46:00.560] It's just, if you look at it, it's a lot of coastline.
[01:46:00.560 --> 01:46:01.440] The rest of the U.S.
[01:46:01.440 --> 01:46:04.720] has a lot of coastline too, but it doesn't get up to quite as much as Alaska.
[01:46:05.040 --> 01:46:05.920] What's the ratio?
[01:46:05.920 --> 01:46:10.400] Well, Alaska has 34,000 miles of coastline, officially.
[01:46:11.520 --> 01:46:13.920] What's the next most coastline?
[01:46:14.160 --> 01:46:14.720] Which state?
[01:46:14.720 --> 01:46:15.280] Which state?
[01:46:15.280 --> 01:46:15.840] Oh, Hawaii?
[01:46:15.920 --> 01:46:16.640] Maybe California.
[01:46:17.920 --> 01:46:20.400] It's Florida, which has like 8,000.
[01:46:20.560 --> 01:46:21.520] It's not even close.
[01:46:22.720 --> 01:46:24.320] What's the total, though, of the states?
[01:46:24.560 --> 01:46:25.600] Less than 34,000.
[01:46:25.600 --> 01:46:25.920] I don't know.
[01:46:28.960 --> 01:46:30.400] Yeah, it's less than that.
[01:46:30.400 --> 01:46:31.600] Yeah, I think it's by a lot.
[01:46:31.600 --> 01:46:32.960] Like, it's not even close.
[01:46:32.960 --> 01:46:37.600] So, yeah, because it's because of how crazy the Alaska coastline is.
[01:46:37.600 --> 01:46:40.880] As Jay said, like, the West Coast is pretty straight.
[01:46:40.880 --> 01:46:42.960] The East Coast is mostly straight.
[01:46:43.040 --> 01:46:45.440] Then you get Florida, and then it's straight again.
[01:46:45.440 --> 01:46:46.880] Okay, let's go back number one.
[01:46:46.880 --> 01:46:55.200] Kansas is not only the flattest state in the U.S., it is literally flatter than a pancake with a flatness score of 0.9997.
[01:46:55.200 --> 01:46:57.600] That, of course, is the fiction.
[01:46:57.600 --> 01:47:00.440] The flattest state, CARA, is Florida.
[01:46:59.600 --> 01:47:11.480] Now, the other way to designate flattest, you could do a calculation of the difference between the highest point and the lowest point compared to its area, right?
[01:47:11.480 --> 01:47:15.320] I mean, Evan, you have to imagine a pancake the size of a state.
[01:47:15.320 --> 01:47:17.640] It would have a massive mountain in the middle.
[01:47:17.640 --> 01:47:18.040] You know what I mean?
[01:47:18.040 --> 01:47:21.240] Like, it would be, it would be a massive difference between the height.
[01:47:21.240 --> 01:47:26.120] But anyway, what do you think is the second flattest state after Florida?
[01:47:26.760 --> 01:47:28.520] I'm assuming not Kansas.
[01:47:28.520 --> 01:47:29.480] The second flattest.
[01:47:29.720 --> 01:47:30.760] Kentucky.
[01:47:30.760 --> 01:47:31.320] Really?
[01:47:31.320 --> 01:47:42.840] Then Delaware, Louisiana, Mississippi, Rhode Island, Indiana, Illinois, Ohio, Iowa, Wisconsin, Michigan, Missouri, Minnesota, New Jersey, Connecticut, Alabama, Arkansas, North Dakota, Pennsylvania, and then Kansas.
[01:47:42.840 --> 01:47:44.440] It's like right in the middle of the pack as well.
[01:47:44.920 --> 01:47:45.560] Close to the flat point.
[01:47:45.640 --> 01:47:47.080] It's the middle of the country, middle of the.
[01:47:47.960 --> 01:47:49.800] All you need is one hill, and then that screws it up.
[01:47:49.800 --> 01:47:50.440] It screws you over.
[01:47:50.600 --> 01:47:55.240] So the other way to designate it is just the difference between the high point and the low point, right?
[01:47:55.240 --> 01:47:57.640] So for Florida, it's 345 feet.
[01:47:57.640 --> 01:47:58.120] That's it.
[01:47:58.120 --> 01:47:58.680] That's the difference.
[01:47:58.760 --> 01:47:59.080] I knew that.
[01:48:00.360 --> 01:48:01.960] And the lowest point in Florida.
[01:48:01.960 --> 01:48:03.960] Kentucky's close, 388.
[01:48:03.960 --> 01:48:05.480] Delaware's 450.
[01:48:05.480 --> 01:48:07.080] Delaware's pretty flat as well.
[01:48:07.080 --> 01:48:08.200] I knew that fact about Florida.
[01:48:08.200 --> 01:48:10.040] I would not have guessed Kentucky.
[01:48:10.040 --> 01:48:13.160] Kansas is 3,363.
[01:48:13.160 --> 01:48:18.600] It's not anywhere near as flat as Florida, but it just has that reputation of being just like a big cornfield.
[01:48:18.600 --> 01:48:20.200] You know, it's just a flat state.
[01:48:20.840 --> 01:48:22.680] But it is flatter than a pancake.
[01:48:22.680 --> 01:48:23.560] That part is true.
[01:48:23.880 --> 01:48:24.440] All right.
[01:48:24.440 --> 01:48:25.880] Well, good job, guys.
[01:48:25.880 --> 01:48:27.000] Evan, give us a quote.
[01:48:27.000 --> 01:48:35.160] There is only one thing worse than coming home from the lab to a sink full of dirty dishes, and that's not going to the lab at all.
[01:48:35.160 --> 01:48:40.840] That's a wonderful quote by Shane Shwing Wu, experimental physicist.
[01:48:40.840 --> 01:48:49.840] She is considered the first, she was considered the first lady of physics, the Chinese Madame Curie, and the queen of nuclear research.
[01:48:44.520 --> 01:48:50.240] Awesome.
[01:48:50.560 --> 01:48:53.520] A very impressive resume she has.
[01:48:53.520 --> 01:48:56.720] She should have won a Nobel Prize in Physics.
[01:48:56.720 --> 01:49:08.080] She was part of a team that did win the 1957 Nobel Prize in Physics with two other Chinese researchers, but the men got the credit and she did not.
[01:49:09.120 --> 01:49:15.120] So, again, another injustice that should have been, should not have gone down that way.
[01:49:15.600 --> 01:49:17.040] Should not have happened that way.
[01:49:17.040 --> 01:49:17.520] All right.
[01:49:17.520 --> 01:49:18.800] Well, thank you, Evan.
[01:49:18.800 --> 01:49:19.280] Yep.
[01:49:19.280 --> 01:49:21.120] And thank you all for joining me this week.
[01:49:21.120 --> 01:49:21.440] Thanks, Steve.
[01:49:22.400 --> 01:49:23.440] Thank you, Steve.
[01:49:23.440 --> 01:49:25.120] And happy Thanksgiving, everyone.
[01:49:25.680 --> 01:49:26.400] Happy Thanksgiving.
[01:49:26.560 --> 01:49:28.080] Be safe out there, please.
[01:49:28.080 --> 01:49:33.840] Next episode will come out after Thanksgiving, and that'll be the episode we recorded while we were at Saikon.
[01:49:33.840 --> 01:49:34.480] Right.
[01:49:34.480 --> 01:49:37.520] Yep, and then we'll be back with a new episode in two weeks.
[01:49:37.520 --> 01:49:42.400] And until next week, this is your Skeptic's Guide to the Universe.
[01:49:44.960 --> 01:49:51.680] Skeptic's Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking.
[01:49:51.680 --> 01:49:56.320] For more information, visit us at theskepticsguide.org.
[01:49:56.320 --> 01:50:00.240] Send your questions to info at the skepticsguide.org.
[01:50:00.240 --> 01:50:10.880] And if you would like to support the show and all the work that we do, go to patreon.com/slash skepticsguide and consider becoming a patron and becoming part of the SGU community.
[01:50:10.880 --> 01:50:14.240] Our listeners and supporters are what make SGU possible.