Debug Information
Processing Details
- VTT File: skepticast2024-07-20.vtt
- Processing Time: September 11, 2025 at 03:41 PM
- Total Chunks: 2
- Transcript Length: 138,526 characters
- Caption Count: 1,323 captions
Prompts Used
Prompt 1: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 1 of 2 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
[00:00:00.800 --> 00:00:06.880] 10 years from today, Lisa Schneider will trade in her office job to become the leader of a pack of dogs.
[00:00:06.880 --> 00:00:09.600] As the owner of her own dog rescue, that is.
[00:00:09.600 --> 00:00:17.280] A second act made possible by the reskilling courses Lisa's taking now with AARP to help make sure her income lives as long as she does.
[00:00:17.280 --> 00:00:22.240] And she can finally run with the big dogs and the small dogs, who just think they're big dogs.
[00:00:22.240 --> 00:00:25.920] That's why the younger you are, the more you need AARP.
[00:00:25.920 --> 00:00:29.680] Learn more at aarp.org/slash skills.
[00:00:33.200 --> 00:00:36.480] You're listening to the Skeptic's Guide to the Universe.
[00:00:36.480 --> 00:00:39.600] Your escape to reality.
[00:00:40.240 --> 00:00:43.200] Hello, and welcome to The Skeptic's Guide to the Universe.
[00:00:43.200 --> 00:00:48.320] Today is Wednesday, July 17th, 2024, and this is your host, Stephen Novella.
[00:00:48.320 --> 00:00:50.080] Joining me this week are Bob Novella.
[00:00:50.080 --> 00:00:50.720] Hey, everybody.
[00:00:50.720 --> 00:00:52.080] Kara Santa Maria.
[00:00:52.080 --> 00:00:52.800] Howdy.
[00:00:52.800 --> 00:00:54.160] And Evan Bernstein.
[00:00:54.160 --> 00:00:55.760] Hey, everywhere's Jay.
[00:00:55.760 --> 00:00:56.960] Jay is off this week.
[00:00:56.960 --> 00:00:57.440] Oh.
[00:00:57.440 --> 00:00:58.560] Yep, he's just on vacation.
[00:00:59.920 --> 00:01:02.160] Jay is both off and on vacation.
[00:01:02.160 --> 00:01:03.840] He just goes off.
[00:01:03.840 --> 00:01:04.480] That's all.
[00:01:04.480 --> 00:01:08.080] Wherever you are, Jay, I hope you're having an excellent time.
[00:01:08.080 --> 00:01:09.840] Jay is in Maine, actually.
[00:01:09.840 --> 00:01:10.720] Uh-huh.
[00:01:11.040 --> 00:01:11.760] Horseback.
[00:01:11.760 --> 00:01:12.800] Girl like that.
[00:01:12.800 --> 00:01:13.520] All right.
[00:01:14.080 --> 00:01:15.200] Maine, you say.
[00:01:15.200 --> 00:01:18.720] So, have either of you guys watched either The Bear or Shogun?
[00:01:18.720 --> 00:01:19.840] I've watched The Bear.
[00:01:19.840 --> 00:01:20.640] I did not watch The Shogun.
[00:01:20.800 --> 00:01:22.400] I've watched Cocaine Bear.
[00:01:22.880 --> 00:01:26.960] I watched Shogun when it was a series back in the 1980s?
[00:01:26.960 --> 00:01:28.000] Early 80s?
[00:01:28.000 --> 00:01:30.480] Kara, what did you think of this season of The Bear?
[00:01:30.800 --> 00:01:32.400] I can't do any spoilers yet, right?
[00:01:32.400 --> 00:01:33.440] Because it's still pretty, pretty.
[00:01:33.600 --> 00:01:35.760] Yeah, you can just give an overview with no spoilers.
[00:01:35.760 --> 00:01:38.720] All right, let me think about how to say the impression with that.
[00:01:38.720 --> 00:01:41.440] I really like how atmospheric it is.
[00:01:41.440 --> 00:01:42.880] I think the writing is really good.
[00:01:42.880 --> 00:01:45.760] I think that it's really visually interesting.
[00:01:45.760 --> 00:01:49.920] I think that I am really annoyed with the main character.
[00:01:49.920 --> 00:01:51.360] I don't think he's complex.
[00:01:51.360 --> 00:01:53.840] I think he needs to get over himself.
[00:01:54.160 --> 00:01:56.880] Is that the way the character is designed?
[00:01:56.880 --> 00:02:00.120] 100% he's designed to be complex, but he's not.
[00:02:00.120 --> 00:02:01.320] Yeah, well, he's damaged.
[00:02:01.320 --> 00:02:02.120] Yeah, which is fine.
[00:01:58.880 --> 00:02:03.720] But, like, and they explore that a little bit.
[00:01:59.200 --> 00:02:04.760] But he's not complex.
[00:02:05.000 --> 00:02:13.000] He's just selfish and privileged and all the things that many women deal with in the dating world right now.
[00:02:13.000 --> 00:02:17.000] Yeah, I mean, my overall take, I mean, this season wasn't as good as the last two seasons.
[00:02:17.160 --> 00:02:19.640] I had to feel like they were trying too hard.
[00:02:19.640 --> 00:02:24.680] You know, like, a lot of episodes, like, yeah, okay, I could see like artistically where they were going with it.
[00:02:24.680 --> 00:02:27.080] It just wasn't that enjoyable an episode.
[00:02:27.080 --> 00:02:29.080] Yeah, not a lot happens this season.
[00:02:29.080 --> 00:02:30.200] I think it's fair to say that.
[00:02:30.200 --> 00:02:34.040] There's not a whole lot of activity going on this season.
[00:02:34.600 --> 00:02:35.400] It's classic.
[00:02:35.400 --> 00:02:37.480] They forgot to have shit happen.
[00:02:37.480 --> 00:02:37.960] Yeah.
[00:02:37.960 --> 00:02:38.920] Whoops.
[00:02:38.920 --> 00:02:39.560] Whoops.
[00:02:39.560 --> 00:02:41.800] Shogun, on the other hand, was excellent.
[00:02:41.800 --> 00:02:42.840] Really awesome.
[00:02:42.840 --> 00:02:43.400] Good to hear.
[00:02:43.560 --> 00:02:48.120] Leading in Emmy nominations and there's very, very good.
[00:02:48.120 --> 00:02:49.400] Yeah, I think you would love it.
[00:02:49.560 --> 00:02:50.360] Really, I would love it.
[00:02:50.360 --> 00:02:53.640] I'm not a big fan of like fighty shows.
[00:02:53.960 --> 00:02:56.200] It's not really a fighting show.
[00:02:56.200 --> 00:02:59.080] I mean, there's fighting that happens, but that's not good.
[00:02:59.400 --> 00:03:01.000] There's fighting, but it's not fighty.
[00:03:01.160 --> 00:03:02.040] Okay, cool.
[00:03:02.040 --> 00:03:02.360] Okay.
[00:03:02.360 --> 00:03:03.400] Good to know.
[00:03:03.400 --> 00:03:06.520] I mean, but it takes place, what, in the 1600s, right?
[00:03:06.840 --> 00:03:10.840] And so what wasn't fighting in the 1600s, right?
[00:03:10.920 --> 00:03:12.760] It's sort of, it's the background.
[00:03:12.760 --> 00:03:13.880] It's the environment.
[00:03:14.200 --> 00:03:20.360] Yeah, it's more that, like, I'm just not, like, entertained by action sequences.
[00:03:21.160 --> 00:03:21.800] You know what I mean?
[00:03:21.800 --> 00:03:24.120] That's not enough to hold my attention.
[00:03:24.120 --> 00:03:29.800] It's very much driven by excellent characters, and the best character, in my opinion, is the lead female.
[00:03:29.800 --> 00:03:30.760] She's awesome.
[00:03:31.400 --> 00:03:32.600] That's good.
[00:03:32.600 --> 00:03:42.760] Well, we haven't spoken yet about the big thing that happened since the last episode, the failed assassination attempt on Donald Trump.
[00:03:43.400 --> 00:03:44.040] That thing.
[00:03:44.040 --> 00:03:47.280] Well, I'm not going to talk about it politically, but the political aspect of it.
[00:03:47.520 --> 00:04:00.160] What I want to talk about is the fact that instantly, within seconds, minutes of this event happening, the internet was abuzz with all sorts of conspiracy theories.
[00:04:00.160 --> 00:04:02.000] Like, that's the go-to explanation.
[00:04:02.000 --> 00:04:03.360] It's a conspiracy.
[00:04:03.360 --> 00:04:06.320] It was impressive in its own right, how fast it is.
[00:04:06.560 --> 00:04:11.360] But I also got to ask before Steve, you get into the brass tacks of it all.
[00:04:11.360 --> 00:04:14.720] I mean, did you have a moment?
[00:04:14.960 --> 00:04:15.760] No, I really didn't.
[00:04:15.760 --> 00:04:17.840] No, really, not even a single moment.
[00:04:17.840 --> 00:04:21.040] No, you know, because, again, it's conspiracy theory.
[00:04:21.360 --> 00:04:30.640] I had sort of an immediate skeptical reaction, although a lot of people in my social circle had that immediate conspiracy theory instinct.
[00:04:30.640 --> 00:04:37.840] I think the difficulty was, yes, how could the Secret Service have failed so miserably?
[00:04:37.840 --> 00:04:43.760] I think that was, and like people are trying to answer that question, and they're looking for an explanation.
[00:04:43.760 --> 00:04:48.320] Well, yeah, the simpler explanation is always that is incompetence, right?
[00:04:48.320 --> 00:04:49.600] Never attributes to malice.
[00:04:50.080 --> 00:04:51.200] Incompetence, right?
[00:04:51.200 --> 00:04:52.160] User error.
[00:04:53.440 --> 00:04:55.920] So, you know, that was the idea that, and both sides did this.
[00:04:55.920 --> 00:05:04.480] Both sides used the fact that the Secret Service failed to prevent this from happening as evidence that or as a reason to think that it might have been a conspiracy.
[00:05:04.480 --> 00:05:14.400] Even like within my group of friends who are generally skeptical, smart people, that was sort of their instinct, like, oh, this has to be staged or whatever.
[00:05:14.880 --> 00:05:23.920] And then they would search for reasons to support what they want to believe based upon their ideological outlook, right?
[00:05:23.920 --> 00:05:31.000] Rather than asking about how plausible it is that something like this could have been pulled off.
[00:05:29.840 --> 00:05:35.320] Also, we have to consider, like, this would be a really stupid thing for either side to do.
[00:05:35.400 --> 00:05:55.160] The risk of being found out massively outweighs any incremental benefit they may get from the politics of either like staging it or you know, the outcome would not necessarily have been good, you know, if the assassination were successful.
[00:05:55.160 --> 00:06:07.720] So it's meanwhile, if their campaign was discovered to have been involved with a conspiracy, that would be the end, the absolute end of their campaign, regardless of the outcome of this event.
[00:06:07.720 --> 00:06:08.200] That's right.
[00:06:08.360 --> 00:06:08.760] Exactly.
[00:06:08.760 --> 00:06:09.800] Yeah, that's the big thing.
[00:06:09.880 --> 00:06:11.160] Grand conspiracy.
[00:06:11.560 --> 00:06:19.080] I'm less convinced by an argument that certain actors are doing a lot of things based on logic.
[00:06:19.080 --> 00:06:24.840] So the argument was, was it that it was staged or that it was orchestrated?
[00:06:24.840 --> 00:06:27.400] Because I think that's two different claims.
[00:06:27.400 --> 00:06:30.120] So it could be staged as in it wasn't real.
[00:06:30.120 --> 00:06:31.080] You know what I mean?
[00:06:31.080 --> 00:06:33.400] Like, like as in he didn't actually shoot him.
[00:06:33.400 --> 00:06:34.440] It was like staged.
[00:06:34.440 --> 00:06:36.600] It was like it was magic.
[00:06:36.600 --> 00:06:37.400] Yeah, exactly.
[00:06:37.720 --> 00:06:47.000] Versus it was orchestrated, meaning that they intentionally had a guy shoot at Donald Trump, knowing that he would miss by a hair.
[00:06:47.160 --> 00:06:48.040] That's the other part.
[00:06:48.040 --> 00:06:48.360] Yeah.
[00:06:48.360 --> 00:06:49.400] Yeah, one of those two.
[00:06:49.400 --> 00:06:49.960] One of those two.
[00:06:49.960 --> 00:06:51.640] Okay, so there's different variations.
[00:06:51.640 --> 00:06:52.680] I think people just conflate it.
[00:06:52.680 --> 00:06:55.960] Just like something about this was a false flag operation.
[00:06:56.520 --> 00:06:59.640] Oh, God, I hate that term so much.
[00:06:59.960 --> 00:07:06.840] So and then on the other side, of course, it was the fact that the Democrats were just they, that they did this, right?
[00:07:06.840 --> 00:07:13.160] So, by the way, good skeptical rule of thumb: do not use a vague reference to they, right?
[00:07:13.160 --> 00:07:19.520] Because that you're whitewashing over a lot of important details, and you're almost assuming a conspiracy when you do that.
[00:07:19.680 --> 00:07:36.720] So, it's saying, like, they tried to impeach him, and then they tried to prosecute him, and then they tried to take away his wealth, and now they tried to assassinate him as if like this is all the same group or the same group of people.
[00:07:36.720 --> 00:07:44.240] And some of them are saying it explicitly because they're always just referring to George Soros.
[00:07:44.560 --> 00:07:46.080] There is no they here.
[00:07:46.160 --> 00:07:47.920] This is like this is one person.
[00:07:48.240 --> 00:08:06.800] The only thing the FBI has been able to say so far is this guy was definitely acting alone, and he fits the total profile of a lone wolf, a massive shooter, you know, in terms of his age, gender, race, you know, a little bit of on the outside, the fringe socially, you know, enamored of guns.
[00:08:06.800 --> 00:08:08.000] I mean, it's perfect.
[00:08:08.000 --> 00:08:11.360] But generally speaking, in mass shootings, because that's something I think we have to remember, too.
[00:08:11.360 --> 00:08:13.360] This was a mass shooting event.
[00:08:14.000 --> 00:08:18.160] Yes, it was an assassination attempt, but let's not also forget, right?
[00:08:18.160 --> 00:08:18.720] Yeah.
[00:08:19.120 --> 00:08:30.640] And in mass shootings, very often, especially when you're dealing with like the young white male who's like somewhat intelligent, somewhat socially withdrawn, we often will see a manifesto.
[00:08:30.640 --> 00:08:37.600] We'll see some sort of something written on social media, clues leading up to it about their ideological leanings.
[00:08:37.600 --> 00:08:40.880] Yeah, there's none of that, which is fascinating.
[00:08:41.440 --> 00:08:50.800] Or, like, I don't know if you guys, did you watch the, we've talked about it before, Manhunt, I think, which was the series about Lincoln and John Wilkes Booth.
[00:08:50.960 --> 00:09:00.680] And yes, John Wilkes Booth was ideologically motivated, as in he was a Confederate and he really like believed in those causes, but really, he just wanted to be famous.
[00:09:00.680 --> 00:09:02.280] Like, that was a huge part of it.
[00:08:59.840 --> 00:09:03.800] He wanted to be the guy.
[00:09:04.600 --> 00:09:08.600] All right, Bob, tell us about caves on the moon.
[00:09:08.600 --> 00:09:10.200] Yes, I've been waiting for this.
[00:09:10.200 --> 00:09:11.560] I knew it was going to come.
[00:09:11.560 --> 00:09:24.680] So, for the first time, we now have solid evidence that the moon does indeed have large underground tunnels or caves, and these researchers think they would be a great place to hang out and just be safe on the moon.
[00:09:25.320 --> 00:09:28.440] What did it finally take for scientists to agree with me?
[00:09:28.440 --> 00:09:33.800] And what does this mean for my dream of a moon-based alpha before the heat death of the universe?
[00:09:34.040 --> 00:09:38.440] So, this is from the typical international team of scientists, which I love.
[00:09:38.680 --> 00:09:43.080] In this case, led by the scientists at the University of Trento in Italy.
[00:09:43.080 --> 00:09:48.920] This was published mid-July 2024, just recently, in the journal Nature Astronomy.
[00:09:48.920 --> 00:09:56.680] The title of the paper is Radar Evidence of an Accessible Cave Conduit on the Moon Below the Mare Tranquilitatus Pit.
[00:09:56.680 --> 00:09:59.320] Okay, so this makes me so happy.
[00:09:59.800 --> 00:10:01.800] You probably figured that out.
[00:10:01.800 --> 00:10:02.520] Why, though?
[00:10:02.520 --> 00:10:03.080] Why?
[00:10:03.080 --> 00:10:18.040] Because I don't like the idea of astronauts on the moon for extended periods of time, either dying horribly, painfully prolonged deaths, or very quick deaths, or I don't even like the idea of them being incredibly annoyed by moon dust.
[00:10:18.040 --> 00:10:29.560] And the utility of this, of an already existing huge underground cave or tunnel on the moon is primarily underscored by the fact that the moon surface really, really kind of sucks.
[00:10:29.560 --> 00:10:31.080] It's a horrible place.
[00:10:31.400 --> 00:10:33.480] You think about the moon, you see the videos, right?
[00:10:33.480 --> 00:10:35.080] It seems like a fun place, right?
[00:10:35.080 --> 00:10:40.040] All bouncy and happy, but it's really quite hellish.
[00:10:40.360 --> 00:10:43.640] And for a surprisingly number of deadly reasons.
[00:10:43.640 --> 00:10:46.960] First off, there's the temperature variations, which are nasty.
[00:10:44.920 --> 00:10:52.080] On the bright side of the moon, we're talking 127 degrees Celsius, 261 Fahrenheit.
[00:10:52.400 --> 00:10:56.080] On the unilluminated side, side of the whatever.
[00:10:56.320 --> 00:10:57.360] Ah, nice, Bob.
[00:10:57.680 --> 00:11:03.360] It can drop to minus 173 Celsius or minus 280 Fahrenheit.
[00:11:03.360 --> 00:11:04.320] God, that's cold.
[00:11:04.320 --> 00:11:05.840] So there's no Goldilocks zone on the moon.
[00:11:07.280 --> 00:11:09.680] Well, hold on to that thought, Evan.
[00:11:09.680 --> 00:11:10.640] Hold on to that thought.
[00:11:10.960 --> 00:11:13.200] Do I even have to say more about those temperature swings?
[00:11:13.200 --> 00:11:14.480] They're just, wow.
[00:11:14.480 --> 00:11:17.040] Next is the radiation on the surface of the moon.
[00:11:17.040 --> 00:11:23.360] There's galactic cosmic rays, which are high-energy particles from things like supernova, supernovae.
[00:11:23.520 --> 00:11:29.680] They can kill you over time by essentially increasing the risk of cancer by all those high-energy particles hitting you.
[00:11:29.840 --> 00:11:32.480] The sun radiation, though, is even worse.
[00:11:32.480 --> 00:11:41.280] They call it solar particle events, SPEs, but they're sudden and non-predictable, which makes them especially nasty.
[00:11:41.600 --> 00:11:42.880] Particles or the events?
[00:11:43.600 --> 00:11:50.400] Well, the events, these solar particle events are kind of sudden and they're not very predictable at all.
[00:11:50.400 --> 00:11:55.200] They can expose astronauts on the surface to literally life-threatening doses.
[00:11:56.160 --> 00:12:04.000] Just generally speaking, the moon varies then 200 to 2,000 times the radiation dose that we receive here on Earth.
[00:12:04.000 --> 00:12:07.920] But you've got to remember, because if you go to different websites, you may find different ranges.
[00:12:07.920 --> 00:12:12.320] And that's because we're really not sure how bad the radiation is yet.
[00:12:12.480 --> 00:12:14.960] It hasn't been studied as fully as it needs to be studied.
[00:12:15.440 --> 00:12:18.000] We don't have a radiation detector sitting on the moon somewhere.
[00:12:18.240 --> 00:12:18.640] What's that?
[00:12:18.960 --> 00:12:20.560] There's no radiation detector on the side.
[00:12:20.800 --> 00:12:21.920] But what kind is it detecting?
[00:12:22.240 --> 00:12:23.840] Is it detecting solar radiation?
[00:12:23.840 --> 00:12:25.520] But what about X-rays and gamma rays?
[00:12:25.520 --> 00:12:34.760] I mean, I'm not aware of any full full spectrum test that fully assesses the radiation doses that you're receiving on the moon.
[00:12:29.600 --> 00:12:36.440] Clearly, though, it's not good.
[00:12:36.760 --> 00:12:39.320] So, all right, then let's talk about a worst-case scenario.
[00:12:39.320 --> 00:12:42.200] And that happened on October 20th, 1989.
[00:12:42.200 --> 00:12:44.840] There was an X-class solar flare.
[00:12:44.840 --> 00:12:45.880] That's X-class.
[00:12:46.120 --> 00:12:47.320] There is no Y-class.
[00:12:47.320 --> 00:12:48.600] It ends with the X-class.
[00:12:48.600 --> 00:12:50.920] They are the nastiest solar flares.
[00:12:51.160 --> 00:12:56.440] That essentially caused a geomagnetic storm that bathed the moon in radiation.
[00:12:56.440 --> 00:13:02.520] And that radiation was more than eight times the radiation received by plant workers during Chernobyl.
[00:13:02.520 --> 00:13:04.120] So that's what you would have received.
[00:13:04.680 --> 00:13:11.800] If you were an astronaut on the moon, you would have received over a brief period of time eight times the dose that Chernobyl workers received.
[00:13:11.800 --> 00:13:19.240] From what I could tell, if you were an astronaut on the moon on October 20th, 1989, you probably would have died within hours.
[00:13:19.240 --> 00:13:21.400] So, I mean, that's how deadly we're talking.
[00:13:21.800 --> 00:13:24.120] You know, that's hours, that's pretty fast.
[00:13:24.120 --> 00:13:30.520] The only way you're going to die quicker on the moon is if you're hit with micrometeorite or meteorological.
[00:13:31.080 --> 00:13:32.360] That's just nasty.
[00:13:32.600 --> 00:13:36.600] And that's my next one here: micrometeorite impacts.
[00:13:36.920 --> 00:13:39.960] These are constantly bombarding the moon.
[00:13:39.960 --> 00:13:51.320] They can be very tiny particles or they could be up to a few centimeters, and they travel potentially up to 70 kilometers per second with the average impact velocity of about 20 kilometers per second.
[00:13:51.320 --> 00:13:54.360] That's a lot of kinetic energy there, even for something tiny.
[00:13:55.000 --> 00:13:59.240] The damage to structures or your head would be catastrophic.
[00:13:59.240 --> 00:14:01.000] And it's not even a direct hit.
[00:14:01.000 --> 00:14:08.440] You could get hit by the particles that are kicked up after it hits the ground on the moon, the ejecta.
[00:14:08.440 --> 00:14:09.800] Even that can be deadly.
[00:14:10.040 --> 00:14:11.800] So that's yet another one.
[00:14:11.800 --> 00:14:16.640] And then the final one on my list here is the dreaded moon dust, the regolith.
[00:14:14.920 --> 00:14:21.040] This is probably the most hated thing for Apollo astronauts on the moon.
[00:14:21.120 --> 00:14:23.120] They really, really did not like it.
[00:14:23.360 --> 00:14:26.640] Lunar soil is this dust is fine like powder.
[00:14:26.640 --> 00:14:31.520] It's even finer than I thought it was, but it's abrasive and sharp like glass.
[00:14:31.520 --> 00:14:36.160] Now, this comes from mechanical weathering on the surface of the moon.
[00:14:36.160 --> 00:14:47.200] The rocks have been fractured by meteors and micrometeorites over billions and billions of years, and it makes them really, really tiny, but they stay sharp because there's no wind and there's no water erosion.
[00:14:47.200 --> 00:14:50.000] So they stay basically nasty forever.
[00:14:50.240 --> 00:14:54.160] Anakin Skywalker would hate this far more than sand.
[00:14:54.480 --> 00:14:59.760] It not only gets everywhere, it eventually damages whatever it comes in contact with.
[00:14:59.760 --> 00:15:04.240] It even causes what the Apollo 17 astronauts called lunar hay fever.
[00:15:04.240 --> 00:15:05.280] Ever hear of that?
[00:15:05.920 --> 00:15:11.680] Every of the 12 men that walked on the moon, every one of them got this lunar hay fever.
[00:15:11.680 --> 00:15:16.320] I mean, and this was from the moon dust, the moon sand, the regolith.
[00:15:16.320 --> 00:15:22.800] It was sneezing and nasal congestion, and sometimes it took many days for it to even fade, but they all got it.
[00:15:22.800 --> 00:15:24.480] And that's just over a weekend.
[00:15:24.480 --> 00:15:27.840] They were there from like just like a day or three.
[00:15:27.840 --> 00:15:29.920] It was really, they weren't there very long.
[00:15:30.560 --> 00:15:33.440] And get this, they've done experiments with analogs.
[00:15:33.440 --> 00:15:40.720] They created this analog for the regolith, and they showed that long-term exposure would likely destroy lung and brain cells.
[00:15:41.040 --> 00:15:42.160] That would be bad.
[00:15:42.160 --> 00:15:43.120] And then he paused.
[00:15:43.360 --> 00:15:43.920] And then he paused.
[00:15:44.080 --> 00:15:45.840] Yeah, destroying lung and brain cells.
[00:15:45.840 --> 00:15:46.400] Not good.
[00:15:46.400 --> 00:15:49.120] I'm not sure which I would rather lose first.
[00:15:49.280 --> 00:15:50.400] Maybe brain cells.
[00:15:50.400 --> 00:15:53.520] Okay, so moon dust doesn't even lay nicely on the ground.
[00:15:53.520 --> 00:15:54.160] Did you know that?
[00:15:54.160 --> 00:16:04.280] It's electrostatically charged from the sun, so it actually floats a little bit above the surface, and that makes it even easier to get everywhere-to breathe it in or to get to get in your equipment.
[00:16:04.600 --> 00:16:11.560] The bottom line then is that being on the surface of the moon, like the Apollo astronauts, just for a weekend, is basically okay.
[00:16:11.560 --> 00:16:22.120] You know, it's you know, it's fairly safe, but it can be annoying, specifically the moon dust was the most annoying because none of the other big players came into play, right?
[00:16:23.160 --> 00:16:28.760] There were no micrometeoroids or micrometeorites or radiation or any of that hit them.
[00:16:28.760 --> 00:16:29.880] So, it was fairly safe.
[00:16:29.880 --> 00:16:34.680] But if you go beyond that, though, you go beyond just a weekend, like is what we're planning, right?
[00:16:34.680 --> 00:16:40.120] We're trying to make much more permanent stays on the moon, it just gets increasingly and increasingly deadly.
[00:16:40.120 --> 00:16:46.280] All right, so that's my that's my story, that's my background on why it's so nasty on the surface of the moon.
[00:16:46.280 --> 00:16:53.000] This latest news item starts with an open pit in the Mare Tranquilitatis, the Sea of Tranquility.
[00:16:53.000 --> 00:16:54.840] That's such always a beautiful name.
[00:16:54.840 --> 00:17:01.560] So, now the Sea of Tranquility, it looked like a sea, right, to early moon observers, but it's really just an ancient lava plane.
[00:17:01.560 --> 00:17:07.640] And it's also where the Apollo 11 astronauts, right, Neil Armstrong and Buzz Aldrin, first set foot on the moon.
[00:17:07.640 --> 00:17:09.400] But I mean, this is a lava plane, right?
[00:17:09.400 --> 00:17:10.760] Lava was flowing through here.
[00:17:10.760 --> 00:17:14.840] You know, there was like basaltic lava all over there many billions of years ago.
[00:17:15.080 --> 00:17:21.800] Now, these lunar pits were identified in 2009, which is actually a little bit later than I thought they were.
[00:17:21.800 --> 00:17:25.320] But in 2009, they were first really identified.
[00:17:25.320 --> 00:17:28.840] And that's probably because they look like normal craters from a distance.
[00:17:28.840 --> 00:17:32.760] But if you look closer, you see, ah, that's not an impact crater.
[00:17:32.760 --> 00:17:38.200] It looks more like a collapsed roof or a skylight, if you will, rather than that impact.
[00:17:38.200 --> 00:17:46.240] Now, by now, hundreds of these pits have been found, and the speculation is that many of these are lava tubes, billions of years old.
[00:17:46.560 --> 00:17:51.680] Basaltic lava was flowing through, was all through that area, through these mares.
[00:17:51.680 --> 00:17:56.000] They created these lava tubes, and eventually they drained away, leaving the empty tubes.
[00:17:56.000 --> 00:17:58.640] And we've got plenty of these on the Earth.
[00:17:58.640 --> 00:18:03.120] On the moon, they could be potentially even bigger because of the low gravity.
[00:18:03.120 --> 00:18:09.520] Now, the specific pit in the Sea of Tranquility is 100 meters in diameter and 150 meters deep.
[00:18:09.520 --> 00:18:10.720] So, this is kind of big.
[00:18:10.720 --> 00:18:23.760] The difference, though, this pit is special because this pit was overflown by NASA's Lunar Reconnaissance Orbiter, and it was done even more importantly, at a relatively oblique angle.
[00:18:23.760 --> 00:18:30.560] So, that relatively low angle allowed the radar to enter the tunnel at, say, 45 degrees instead of straight up and down.
[00:18:30.560 --> 00:18:39.120] And that allowed the radar to actually get under that overhang the pit walls, you know, the pit sides going down.
[00:18:39.120 --> 00:18:45.440] And so, it kind of went under, and then the radar kind of bounced around a little bit before coming back out.
[00:18:45.440 --> 00:18:52.160] And this is what showed that the underground area under the pit extended at least 170 meters.
[00:18:52.160 --> 00:18:56.640] So, far, you know, wider than wider than the actual hole going in.
[00:18:56.640 --> 00:18:58.880] So, this was at least 170 meters.
[00:18:58.880 --> 00:19:12.160] And the researchers thought this was extraordinary, and it is because clearly there is some sort of area underneath this pit that's bigger than you might see, might imagine just from looking at the pit opening itself.
[00:19:12.320 --> 00:19:13.840] And so, they thought this was extraordinary.
[00:19:13.840 --> 00:19:19.840] And, like good scientists, they figured, well, let's validate this because this is kind of amazing.
[00:19:19.840 --> 00:19:22.800] So, let's see if we can validate this in some other way.
[00:19:22.800 --> 00:19:39.800] And so, what the direction they decided to take was to use a 3D, to create models, a 3D computer model of the pit, matching the visible geometry, the known geometry of the pit from images and using basically like 3D stereoscopic images to creating this 3D image.
[00:19:39.800 --> 00:19:44.520] So they had the geometric layout of the pit and the surrounding lunar surface.
[00:19:44.520 --> 00:19:47.240] So they used that as kind of like the base of their novel.
[00:19:47.240 --> 00:19:50.200] Here's what we know from what we can see of that area.
[00:19:50.440 --> 00:19:55.160] The model then simulated that oblique radar beam that came off of the orbiter.
[00:19:55.160 --> 00:20:03.560] And they went through many iterations of possible geometries until the simulation matched the real observational data from the moon.
[00:20:03.560 --> 00:20:04.280] So you get that?
[00:20:04.280 --> 00:20:10.600] They tweaked and tweaked the model until the model was basically doing exactly what reality was doing.
[00:20:10.600 --> 00:20:19.560] So that way they have a pretty solid, high confidence that, all right, so whatever the model is saying now, whatever it's concluding, could be potentially true.
[00:20:19.560 --> 00:20:26.440] So the model made a couple of different solutions, and there was only one solution that was geologically plausible.
[00:20:26.440 --> 00:20:35.560] And that solution contained a big cave conduit that was up to 170 meters long, but could be even bigger, they say.
[00:20:35.560 --> 00:20:36.680] So that was their conclusion.
[00:20:36.680 --> 00:20:44.520] So according to these researchers, there's very likely to be a sizable subsurface cavern or tunnel on the moon.
[00:20:44.520 --> 00:20:48.040] And in their mind, it seemed like this is basically a done deal.
[00:20:48.840 --> 00:20:51.240] Their confidence levels are very, very high.
[00:20:51.400 --> 00:20:53.880] And that's awesome from my point of view, obviously.
[00:20:53.880 --> 00:21:06.840] But it's also, at the same time, I feel like, yeah, it's about time we confirmed this, because it seemed, you know, looking at these pictures, it seemed pretty obvious that there was some sort of space underneath these pits bigger than you would think.
[00:21:06.840 --> 00:21:13.080] So I'm just very kind of happy and relieved that they finally are really accepting this.
[00:21:13.400 --> 00:21:15.000] All right, so what's the next step here?
[00:21:15.600 --> 00:21:19.440] The next step is to determine how big this is.
[00:21:19.440 --> 00:21:20.400] Because think about it.
[00:21:20.640 --> 00:21:24.720] We have the radar going straight down in one direction.
[00:21:24.720 --> 00:21:30.880] So we know that this is kind of an extended kind of tunnel-like 175 meters or more.
[00:21:30.880 --> 00:21:35.440] But what they need to do is they need to do more flybys, but from different angles.
[00:21:35.440 --> 00:21:40.320] So when you hit it from different angles, you're looking at different areas of this subsurface cavern.
[00:21:40.320 --> 00:21:44.000] You know, is it very narrow, making it a tube, or not?
[00:21:45.200 --> 00:21:57.040] So they say right now that even though they have really no idea how wide it is, it's probably almost certainly 55 to 60 meters wide, which would mean it's probably a lava tube.
[00:21:57.040 --> 00:22:03.120] But they say that it could potentially be hundreds of meters wide, which would make it more cave-like than tube-like.
[00:22:03.120 --> 00:22:09.600] So, you know, it could be a lava tube, it could be a bigger, it could be a gargantuan lava tube, or perhaps more of a cave-like system.
[00:22:09.840 --> 00:22:15.440] They're not sure, and they say that the only way to do it is to do more flybys, which I hope we really do.
[00:22:15.760 --> 00:22:27.120] Okay, so the low-gravity elephant in this pit is the idea that if it really is roomy down there, then it would make a great location for a moon base alpha.
[00:22:27.280 --> 00:22:28.960] And the scientists actually say this.
[00:22:28.960 --> 00:22:42.160] They say in their paper, this discovery suggests that the MTP, the pit, basically, is a promising site for a lunar base as it offers shelter from the harsh surface environment and could support long-term human exploration of the moon.
[00:22:42.160 --> 00:22:55.280] And in my mind, it's not only fun to think of colonies in these lunar caves, and of course, the protection they would offer would be really dramatic, and that's why I went into some detail about how dangerous the surface is.
[00:22:55.280 --> 00:22:57.280] So it would be so much safer down there.
[00:22:57.920 --> 00:23:02.520] It seems like a no-brainer in many ways since this cave is already there.
[00:23:03.160 --> 00:23:09.560] Because once you're in this cave system, the radiation, the micrometeorites, all that stuff goes away.
[00:23:10.040 --> 00:23:13.400] And get this, the temperature difference goes away as well.
[00:23:13.400 --> 00:23:19.080] Because I found a study that looked into what the temperature could potentially be in these pits.
[00:23:19.080 --> 00:23:25.720] And some researchers are saying it could be like consistently 63 degrees Fahrenheit.
[00:23:25.720 --> 00:23:28.360] I don't know exactly what that is in Celsius, but that's nice.
[00:23:28.360 --> 00:23:29.480] That's like nice weather.
[00:23:29.480 --> 00:23:31.240] That's t-shirt weather.
[00:23:31.240 --> 00:23:32.120] I'm not sure how that works.
[00:23:32.600 --> 00:23:34.280] You wear a t-shirt in 63 degrees?
[00:23:34.760 --> 00:23:36.040] We are from different parts of the country.
[00:23:36.280 --> 00:23:37.960] All right, maybe it was cold to me.
[00:23:37.960 --> 00:23:39.640] Yeah, okay, light jacket, hoodie weather.
[00:23:39.640 --> 00:23:40.600] Very light, very light.
[00:23:40.600 --> 00:23:42.520] But that's, to me, that's amazing.
[00:23:42.520 --> 00:23:55.480] I didn't do a deep dive on that paper, but even if it's, even if that's not correct, even if it's much higher or even much lower, but a consistent temperature, like around that temperature, would be amazing.
[00:23:55.480 --> 00:23:56.520] Totally amazing.
[00:23:56.520 --> 00:24:01.240] Now, of course, it all seems pretty pie in the sky with modern technology, right?
[00:24:01.720 --> 00:24:12.600] Getting all the industrial equipment and people up there and working out how to build a moon base in such an environment as the moon is obviously going to be ridiculously hard.
[00:24:13.080 --> 00:24:14.840] We cannot do that right now.
[00:24:14.840 --> 00:24:25.400] And I think it's before we see anything substantial on the moon, even in these pre-made caverns under the moon's surface, I think it's going to take a hell of a long time.
[00:24:25.400 --> 00:24:32.280] Steve, if you had to make a prediction, 100 years, 80 years, it depends on something like that.
[00:24:32.280 --> 00:24:35.560] I mean, it all depends on how many resources you want to put into it.
[00:24:35.800 --> 00:24:37.320] You know, we are going back to the moon.
[00:24:37.320 --> 00:24:39.800] We are going to try to have a sustained presence on the moon.
[00:24:39.800 --> 00:24:43.960] If we want to build a base like this, it would be a huge engineering effort.
[00:24:43.960 --> 00:24:47.920] I mean, as you said, think of all the equipment we have to bring down to the surface of the moon.
[00:24:47.920 --> 00:24:54.320] It would take decades to do this kind of construction, even once we are permanently on the moon.
[00:24:54.320 --> 00:24:56.960] But if we want to do it, we can do it.
[00:24:56.960 --> 00:24:59.120] We can do this with our current technology.
[00:24:59.120 --> 00:25:00.400] It's not a technology issue.
[00:25:00.400 --> 00:25:03.280] It's just an effort and resource issue.
[00:25:03.520 --> 00:25:04.240] It really is.
[00:25:04.240 --> 00:25:08.880] And I think they're going to take how dangerous the surface is.
[00:25:08.880 --> 00:25:10.240] They're going to take it seriously.
[00:25:10.240 --> 00:25:16.160] And they're not going to immediately, of course, try to go into these caverns.
[00:25:16.160 --> 00:25:18.000] And by the way, I am waiting.
[00:25:18.000 --> 00:25:26.160] I hope I live long enough to see the first images from a lander that's actually cruising around in one of these tunnels.
[00:25:26.160 --> 00:25:28.240] That would be an amazing moment.
[00:25:28.240 --> 00:25:29.440] And I think they will.
[00:25:29.440 --> 00:25:30.560] They're going to take this seriously.
[00:25:30.560 --> 00:25:45.360] So whatever they construct on the moon, they're going to make sure that you pile up enough regolith, you create enough of a shield to protect you not only from radiation, but for some of the nastier, maybe some of the smaller micrometeorites.
[00:25:45.680 --> 00:25:47.920] They'll take protection seriously.
[00:25:47.920 --> 00:25:58.000] Yeah, if you made a protective structure on a moon base that had two to three feet of moon crete on the outside, that would go a long way to be protecting from radiation.
[00:25:58.000 --> 00:25:58.880] Oh, absolutely.
[00:25:58.880 --> 00:26:01.520] That's basically like a given.
[00:26:01.520 --> 00:26:03.680] It's got to be, they've got to do something like that.
[00:26:03.680 --> 00:26:11.120] Otherwise, I mean, it's like, oh, yeah, we just lost all of our astronauts on the moon because they weren't protected enough from this solar event.
[00:26:11.120 --> 00:26:21.680] So, yeah, I hope they take it very seriously and realize that, yeah, it's going to be very difficult to create a large and very safe structure on the moon.
[00:26:21.680 --> 00:26:22.800] It'd be just so easy.
[00:26:22.800 --> 00:26:24.240] Just go underground, man.
[00:26:24.240 --> 00:26:25.440] It's just right there.
[00:26:25.440 --> 00:26:38.040] And Steve, and also, Steve, I know you mentioned in your blog that the cave walls might be sharp, but I don't think they would be because the cave walls, this is just lava.
[00:26:38.040 --> 00:26:44.600] This is just lava that it wasn't mechanically weathered by being hit by micrometeorites over billions of years.
[00:26:44.760 --> 00:26:49.560] I think the surface of the tunnel itself would be fairly safe.
[00:26:49.560 --> 00:26:50.360] Yeah, that would be nice.
[00:26:50.360 --> 00:26:50.840] Yeah, it depends.
[00:26:50.840 --> 00:26:53.960] I mean, some lava tubes on the Earth, many are smooth.
[00:26:53.960 --> 00:26:54.840] Some are rough, though.
[00:26:55.000 --> 00:26:57.240] There are none that I think of that are sharp.
[00:26:57.240 --> 00:26:59.800] So hopefully that will be the same on the moon.
[00:26:59.800 --> 00:27:01.480] It just depends on what the conditions are there.
[00:27:02.040 --> 00:27:05.320] And that's the other huge thing that I didn't probably stress enough.
[00:27:05.320 --> 00:27:21.160] The fact that we have a huge, very deep tunnel on the moon right now could do amazing things for just scientific discovery and learning more about the moon because you've got this pristine lunar material that has not been exposed to the sun and galactic cosmic rays.
[00:27:21.160 --> 00:27:23.960] And who knows what we'll discover about the moon once we get down there?
[00:27:23.960 --> 00:27:30.440] Doesn't sound like the moon's ever going to be a tourist attraction or kind of this recreational spot for people to go.
[00:27:30.440 --> 00:27:30.840] Yeah.
[00:27:31.640 --> 00:27:32.520] Sure, eventually.
[00:27:32.680 --> 00:27:34.600] I mean, who would think of that trip, though?
[00:27:34.600 --> 00:27:43.400] I mean, even better than like, say, low Earth orbit, going to the moon for a week would be, you know, once it was safe, I think it would be an amazing adventure.
[00:27:43.400 --> 00:27:51.960] Once, I mean, if it ever, if it ever gets as routine as like, say, traveling across the planet, I think there could be lots of people that would go.
[00:27:51.960 --> 00:27:54.280] Who knows how it's going to happen?
[00:27:54.280 --> 00:27:54.760] All right.
[00:27:54.760 --> 00:27:55.560] Thanks, Bob.
[00:27:55.560 --> 00:27:59.320] Kara, tell us about AI Love.
[00:27:59.320 --> 00:28:00.760] AI Love.
[00:28:01.240 --> 00:28:02.120] Who's that?
[00:28:02.120 --> 00:28:10.040] Okay, so before I dive into this story, basically, which was published, I read The Conversation a lot.
[00:28:10.040 --> 00:28:11.640] I know that we've talked about it on this show.
[00:28:11.640 --> 00:28:20.960] The Conversation is a website that has lots of different verticals, and the authors of the pieces on The Conversation are academics.
[00:28:20.960 --> 00:28:25.040] So it's sort of a from the horse's mouth format.
[00:28:25.360 --> 00:28:34.480] And there's an article that came out recently called Computer Love: AI-Powered Chatbots Are Changing How We Understand Romantic and Sexual Well-Being.
[00:28:34.480 --> 00:28:48.560] And it's by three different authors from the University of Quebec at Montreal, and I said that very American because I can't pronounce it in the French, and a researcher at the Kinsey Institute at Indiana University.
[00:28:48.560 --> 00:28:49.680] So these are psychologists.
[00:28:50.240 --> 00:28:51.600] As an Alfred Kinsey?
[00:28:51.600 --> 00:28:52.000] Yeah, yeah, yeah.
[00:28:52.160 --> 00:28:53.360] As an Alfred Kinsey, yeah.
[00:28:54.160 --> 00:28:58.800] So this is these are researchers in psychology and sexology departments.
[00:28:59.200 --> 00:29:01.120] When you say it's like, there you go.
[00:29:01.360 --> 00:29:02.560] I mean, what else?
[00:29:02.560 --> 00:29:11.600] The first thing I want to know, kind of from you all, is when is the last time or do you regularly interact with chatbots?
[00:29:11.600 --> 00:29:19.120] Like, I'm thinking I have interacted with chatbots when I need to like contact IT or customer service.
[00:29:19.120 --> 00:29:19.600] But I can't.
[00:29:19.840 --> 00:29:20.160] I see.
[00:29:20.160 --> 00:29:21.040] Yeah, right, right.
[00:29:21.040 --> 00:29:24.880] I can't think of other times when I regularly interact with chatbots.
[00:29:25.040 --> 00:29:27.840] Would you call chat GPT a chatbot?
[00:29:27.840 --> 00:29:28.960] I don't think so.
[00:29:28.960 --> 00:29:29.360] Okay.
[00:29:29.360 --> 00:29:33.360] Yeah, because I think it's something where you're having a back and forth conversation.
[00:29:33.360 --> 00:29:39.040] And so, you know, there are digital and AI-powered assistants like Siri and Alexa.
[00:29:39.040 --> 00:29:43.840] And then we're starting to see more and more chatbots on the rise for a lot of different applications.
[00:29:43.840 --> 00:29:49.920] So I think my exposure to these chatbots really generally is just customer service, which means I hate them.
[00:29:50.240 --> 00:29:51.920] I hate them with a burning passion.
[00:29:52.720 --> 00:29:54.320] Speak to a person, please.
[00:29:54.320 --> 00:29:55.040] Exactly.
[00:29:55.040 --> 00:30:04.120] But there is a growing industry of chatbots for kind of all manner of services, one of which is romantic companions.
[00:30:04.120 --> 00:30:12.680] Apparently, there are over a hundred AI-powered apps that offer romantic and sexual engagement.
[00:30:13.240 --> 00:30:14.280] Only 100?
[00:30:14.280 --> 00:30:15.240] Yeah, over 100.
[00:30:15.560 --> 00:30:18.600] And the people know that it's a chatbot when they're doing it.
[00:30:18.840 --> 00:30:19.400] 100%.
[00:30:19.400 --> 00:30:20.520] 100%.
[00:30:20.520 --> 00:30:27.720] So some of the ones that they listed on here are Myanima.ai, Eva AI, KnowMe.ai.
[00:30:28.520 --> 00:30:29.720] Myanima.
[00:30:30.040 --> 00:30:31.080] M-Y-A-N-A.
[00:30:32.440 --> 00:30:34.280] Myanima.ai.
[00:30:34.280 --> 00:30:37.320] Eva AI, KnowMe.ai, and Replica.
[00:30:38.200 --> 00:30:47.800] And these are different apps, I guess, that you download to your phone where because they're AI-powered, these chatbots evolve the longer you talk to them.
[00:30:47.800 --> 00:30:49.640] They understand what you're interested in.
[00:30:49.640 --> 00:30:57.160] They understand, you know, turns of phrase that you like to use, shortcuts, how much you emote, you know, sort of your affective stance.
[00:30:57.480 --> 00:30:59.800] Are they chat GPT-based?
[00:30:59.800 --> 00:31:02.440] I think they're all different, but probably some of them are.
[00:31:02.760 --> 00:31:03.480] I would assume, right?
[00:31:03.720 --> 00:31:04.600] Yeah, I would assume so.
[00:31:04.840 --> 00:31:06.120] They'd have to be at this point.
[00:31:06.360 --> 00:31:10.920] But yeah, I'm not sure what the, like, what AI platform they're being built upon.
[00:31:10.920 --> 00:31:11.720] Right, right.
[00:31:11.720 --> 00:31:13.480] What's the target audience?
[00:31:13.480 --> 00:31:14.520] Anyone, I would think.
[00:31:14.520 --> 00:31:15.320] Anyone who's interested.
[00:31:15.560 --> 00:31:17.000] Anyone with a sex drive?
[00:31:17.000 --> 00:31:18.840] Yeah, so yeah, but who is interested?
[00:31:18.840 --> 00:31:20.840] And so that is the question, right?
[00:31:20.840 --> 00:31:36.600] And I think it's important for us to kind of approach this with an open mind and to start asking some important questions because there is actually a growing body of scientific data on these topics.
[00:31:36.600 --> 00:31:47.520] There are a lot of studies across multiple disciplines asking questions like: Can people feel something for a chat bot?
[00:31:48.080 --> 00:31:50.560] And the answer seems to be across the board, yes.
[00:31:50.560 --> 00:31:52.160] Yeah, people have known that for decades.
[00:31:52.640 --> 00:32:01.120] Emotional bonds, some people self-identify as having fallen in love with a chatbot, knowing that it's a chat bot.
[00:32:01.120 --> 00:32:33.120] And interestingly, there was one study that was cited in this coverage that showed that when everyday people are engaging with either a potential romantic partner who is human or an AI version, a chat bot, which is a potential romantic partner, that on average, people tend to choose a more responsive AI over a less responsive human being, even though they know that it's an AI.
[00:32:33.120 --> 00:32:38.080] Is that because they feel they can manipulate the conversation more to their liking with an AI?
[00:32:38.080 --> 00:32:38.720] I don't know.
[00:32:38.720 --> 00:32:41.360] Well, first of all, I don't know if anybody can answer that question.
[00:32:41.360 --> 00:32:44.960] So, like, I think that's a sort of a rhetorical question.
[00:32:44.960 --> 00:32:47.200] It's probably different for different people.
[00:32:47.200 --> 00:32:49.360] But I think that that may be one reason.
[00:32:49.360 --> 00:32:51.920] It's not the first reason I would jump to.
[00:32:51.920 --> 00:32:54.400] I would think it's because they are responsive.
[00:32:54.400 --> 00:32:55.760] They're engaged with you.
[00:32:55.760 --> 00:33:11.840] But not only that, from what you've said, they're kind of like Zealigs where they adapt themselves to you, which you wouldn't really want somebody to do to a large extent in normal conversation with human to human because then it's just like weird.
[00:33:12.080 --> 00:33:12.480] Really?
[00:33:12.480 --> 00:33:13.760] Really, Bob Jigzo?
[00:33:14.080 --> 00:33:28.240] I can almost guarantee you, there are probably hundreds of studies out there that show that people feel most heard, people feel most connected when you mirror their behaviors, when you respond in ways that are similar to how they talk.
[00:33:29.040 --> 00:33:37.320] But it seems like this system, from how you described it, maybe I'm making assumptions here, would do it to a much more dramatic degree.
[00:33:37.640 --> 00:33:45.720] Possibly, there's some conscious, there's some conscious mirroring for sure, and that's just kind of instinctive, and you're not maybe not even aware that you're doing it.
[00:33:46.200 --> 00:33:59.400] And that's fine, but I wouldn't, it just made me think of interacting with somebody whose whole purpose is to not even be themselves, but to make themselves an extension of me.
[00:33:59.640 --> 00:34:03.400] And that's just, I don't think that's necessarily healthy, right?
[00:34:03.400 --> 00:34:12.280] I think that there are some assumptions being made in that statement that are not necessarily reflective of how most people are.
[00:34:12.280 --> 00:34:19.320] I think that if you, when's the last time that, I mean, you guys don't have to answer this if you don't want to, but have any of you ever been on dating apps?
[00:34:19.320 --> 00:34:19.720] Yeah, sure.
[00:34:19.880 --> 00:34:21.480] I have never been on a dating app.
[00:34:21.480 --> 00:34:22.760] I didn't think so.
[00:34:23.240 --> 00:34:25.800] I'm like, I'm talking to a bunch of people readmen.
[00:34:26.440 --> 00:34:30.600] But on a dating app, very often you connect with somebody for the first time.
[00:34:30.600 --> 00:34:36.360] You know nothing about them except for this over-the-top representation that they are trying to present to you.
[00:34:36.360 --> 00:34:42.760] And then when you start engaging, you start to recognize things like, oh, they don't know the difference between your, your, and your.
[00:34:43.320 --> 00:34:45.400] It's a resume, then the interview, right?
[00:34:45.400 --> 00:34:45.640] Right.
[00:34:45.640 --> 00:34:49.160] It's like, okay, and does this person have any emotional intelligence whatsoever?
[00:34:49.160 --> 00:34:50.200] Are they listening to me?
[00:34:50.200 --> 00:34:51.640] Are they asking me questions?
[00:34:51.640 --> 00:35:07.960] And Bob, I would venture to guess that an individual who connects with somebody on a dating app where the person or the chatbot that they connect with is saying all the things you'd hope they would say is going to be somebody that you fall in love with more quickly.
[00:35:07.960 --> 00:35:09.400] Yeah, for sure.
[00:35:09.400 --> 00:35:17.200] I remember to this day, I remember one of the most engaging back and forth I had with somebody on a dating app, and it was incredible.
[00:35:14.920 --> 00:35:20.320] We had so much in common, it really was a joy.
[00:35:21.360 --> 00:35:31.120] But again, that's one having things in common is one thing, but having someone adapt to you on the fly over time just strikes me.
[00:35:31.120 --> 00:35:42.640] It reminds me of the metamorph from the Next Generation episode where the woman actually attuned everything about herself to her mate so that she became the perfect mate for that person.
[00:35:42.640 --> 00:35:44.960] And it was like, that's just not right.
[00:35:45.840 --> 00:35:48.000] Bob, but what's your point with all this?
[00:35:48.000 --> 00:35:55.280] My point is that a natural, two people that have natural, many things naturally in common is fantastic.
[00:35:55.280 --> 00:36:02.160] But having somebody who adapts to you on purpose just to get along, to me, that crosses a line.
[00:36:02.400 --> 00:36:06.400] You may think it crosses a line, but the question is, how will people respond to that?
[00:36:06.400 --> 00:36:07.440] Yeah, and they'll love it.
[00:36:08.400 --> 00:36:10.160] They're going to love that shit.
[00:36:10.160 --> 00:36:11.440] They're going to love that shit.
[00:36:11.840 --> 00:36:14.080] I'm just giving you my take on it.
[00:36:14.080 --> 00:36:14.400] I would.
[00:36:14.720 --> 00:36:16.480] And I want to get into those implications.
[00:36:16.480 --> 00:36:20.960] And I think that sort of a takeaway from this is: yes, there could be a point where it was creepy, right?
[00:36:20.960 --> 00:36:28.880] Where somebody, where your potential romantic chatbot partner felt like too sycophantic and too inflexible.
[00:36:29.040 --> 00:36:30.000] I could see that.
[00:36:30.000 --> 00:36:33.360] But I think most people would love it, as you just said.
[00:36:33.680 --> 00:36:34.160] I don't deny that.
[00:36:35.280 --> 00:36:36.720] Humans are humans, after all.
[00:36:36.720 --> 00:36:37.200] Exactly.
[00:36:37.520 --> 00:36:39.440] I'm just kind of meta-human.
[00:36:39.760 --> 00:36:42.320] And so sure you are, Bob.
[00:36:43.360 --> 00:36:46.720] Can we figure out how to do this experiment and not tell Bob we're doing it?
[00:36:46.800 --> 00:36:48.880] I want to see how he actually responds to it.
[00:36:48.880 --> 00:36:53.760] Oh, if it's a good algorithm that does it seamlessly, of course.
[00:36:54.080 --> 00:36:58.320] I am, there's lots of lots, there's a lot of parts of me that are human, after all.
[00:36:58.320 --> 00:37:02.440] So I think I am absolutely can be swayed by that.
[00:36:59.840 --> 00:37:06.440] But it's just the way it was presented, that it's adapting to you over time.
[00:37:06.760 --> 00:37:08.120] Yeah, that's what AI does.
[00:37:08.120 --> 00:37:08.920] It adapts.
[00:37:09.240 --> 00:37:09.800] Yeah, it adapts.
[00:37:09.880 --> 00:37:12.120] It's like definitionally what it does, right?
[00:37:12.120 --> 00:37:30.600] And so the question here is: aside from the ick factor that Bob has flagged for himself personally, like his personal proclivities, what are some of the legitimate moral, ethical, you know, what are the actual potential problems?
[00:37:31.080 --> 00:37:36.280] It's like there's a lot, I think, for creating unhealthy relationships.
[00:37:36.280 --> 00:38:03.560] It's like the way advertising markets men and women that are basically weaponized beauty and people with people that are that are amazingly good looking right out of the gate and then they add the makeup and then they add the Photoshop tweaks, making people that are just like completely unrealistic and giving people very unrealistic goals or goals to achieve.
[00:38:03.720 --> 00:38:05.000] I want to be that pretty.
[00:38:05.000 --> 00:38:09.880] I want to wear all this makeup and have cosmetic surgery so I can be that pretty.
[00:38:09.880 --> 00:38:25.960] So when you create a relationship based on that, you're creating a relationship with somebody who's unrealistic because they're so attuned to you that I think you would be unsatisfied with almost anybody else because they wouldn't be as attuned to you as this AI person.
[00:38:26.280 --> 00:38:41.640] So the outcome that you are identifying in this scenario is that you, as the consumer, are now going to be unsatisfied in real relationships, or in, I should say, in analog relationships.
[00:38:41.640 --> 00:38:42.040] Right.
[00:38:42.040 --> 00:39:01.920] Well, I think the worst case scenario here in terms of the effects on people are that, you know, would these AI girlfriend or whatever apps, significant other apps, create an arms race to create the most addictive, the most appealing, the most everything that an AI could be.
[00:39:01.920 --> 00:39:10.720] Would that create completely unrealistic expectations of people in terms of relationships that no living person could ever keep up with?
[00:39:10.720 --> 00:39:19.600] But at the same time, it could create the pressure for people to feel like they have to be now as good as the AI, and that could be extremely unhealthy.
[00:39:19.600 --> 00:39:23.040] Yeah, and I think that that social isolation concern, right?
[00:39:23.040 --> 00:39:26.160] Because the eventual outcome of that would be social isolation.
[00:39:26.160 --> 00:39:29.440] It would be the lack of engagement.
[00:39:29.440 --> 00:39:31.840] I think that that is a legitimate concern.
[00:39:31.840 --> 00:39:43.200] And to me, that's sort of, I don't want to say it's the best case scenario, but I think an even more pernicious outcome is a lack of growth.
[00:39:43.200 --> 00:39:43.920] It's a lack.
[00:39:44.080 --> 00:39:50.160] So the consumer, the end user, is now not learning about things like empathy.
[00:39:50.160 --> 00:39:54.080] They're not learning skills in relationships like compromise.
[00:39:54.080 --> 00:39:55.920] They're not learning rejection either.
[00:39:55.920 --> 00:40:01.680] Yeah, they're not learning how to cope with resiliency when they are rejected.
[00:40:02.400 --> 00:40:12.160] I would argue that the best AI chatbot people would be ones that can potentially push you to be a better person.
[00:40:13.040 --> 00:40:23.760] That would be something that would be interesting as hell to have a relationship with an AI that would actually be, that could actually make you a better person from many different angles.
[00:40:24.240 --> 00:40:24.560] It would.
[00:40:24.560 --> 00:40:28.000] And researchers are working on developing that for that very purpose.
[00:40:28.160 --> 00:40:29.040] That is a cool idea.
[00:40:29.280 --> 00:40:30.200] So, think about one more thing.
[00:40:32.200 --> 00:40:49.800] Just to kind of recap what was just said: if you're the end user, there is a potential outcome in which you become more and more socially isolated because you start to develop more and more unrealistic expectations of a partner, which, as you mentioned, Bob, it's 100% already happening.
[00:40:49.800 --> 00:40:56.920] We see this with a lot of, like, you know, there's the whole incel movement, the involuntarily celibate movement.
[00:40:57.160 --> 00:41:26.520] We see this a lot when individuals have unrealistic expectations of what actual partnership looks like, when there's a sort of privileged or a self-centered perspective that my partner is there to serve me, to give me the things that I require and that I deserve in this world, as opposed to my partner is a human being, and this is a relationship where we are egalitarian in nature and we are compromising with one another.
[00:41:26.520 --> 00:41:36.440] And so, yes, the first negative outcome is I am now alone because I had these expectations of people and then people kept failing me because they weren't as good as my chat bot.
[00:41:36.440 --> 00:41:47.800] But the second, which I believe is a more pernicious, is now I'm sort of running away on a negative feedback loop of training and I start to treat people the way I treat a chat bot.
[00:41:47.800 --> 00:41:55.960] And this comes back to the conversation we had, was it just last week about robot engaging with robots in our natural environment?
[00:41:55.960 --> 00:42:04.040] And if I know that the robot doesn't have feelings and I can treat it in a very particular way, is that going to affect how I then treat people?
[00:42:04.440 --> 00:42:29.120] Now we're talking about one level more, which is an emulation of a person, in one of the most vulnerable and intimate ways that you can engage with a person, where the psychological flexibility, the emotional maturity, having done the work on yourself is so fundamentally important to be able to have a healthy relationship and have healthy companionship.
[00:42:29.120 --> 00:42:36.160] And would you do any of that if you grew up only engaging or regularly engaging with AI chatbots?
[00:42:36.160 --> 00:42:46.720] I think about the comparison that comes to my mind, and I'm curious what you guys think about this, because this wasn't in the article, but it popped up while I was reading the article, is we talk about driverless cars a lot on the show.
[00:42:46.720 --> 00:42:49.520] And we talk about are they safer, are they more dangerous?
[00:42:49.520 --> 00:42:57.200] It's the nuanced gray area of when there are driverless cars on the road and human drivers on the road that it's the most dangerous.
[00:42:57.200 --> 00:43:03.440] Because the way that they engage, if it was all just driverless cars, they would probably communicate with each other well and there wouldn't be as much danger.
[00:43:03.440 --> 00:43:17.200] But because there's a mix, and that's what I worry about here: individuals dipping their toe into AI companions and then attempting, I don't know, analog human relationships, how do they play off of each other?
[00:43:17.200 --> 00:43:19.760] How do they affect our humanity, really?
[00:43:20.080 --> 00:43:26.160] There's a whole other thing that they talk about in the article about just security, like basically just surveillance.
[00:43:26.160 --> 00:43:32.640] We know that most of these apps are collecting and selling personal user data, you know, for marketing purposes.
[00:43:32.640 --> 00:43:42.560] Imagine the intimacy, the intimate nature of that data, and just how, you know, potentially dangerous that could be exactly.
[00:43:42.560 --> 00:43:51.520] But on the flip side, as you mentioned, the researchers are actively doing a study right now where they are assessing the use of chatbots.
[00:43:51.520 --> 00:43:58.560] This is directly from the article: quote, to help involuntary celibates improve their romantic skills and cope with rejection.
[00:43:58.560 --> 00:44:05.880] So, most of the chatbots that are training chatbots on the market right now tend to be used for sexual health education.
[00:44:05.880 --> 00:44:15.800] So, like helping understand, I don't know, consent or helping understand, maybe they're not even that sophisticated, helping understand STI risks and things like that, reproductive, you know, health.
[00:44:15.800 --> 00:44:26.440] But development of chatbots to help individuals learn interpersonal skills, to help them learn vulnerability, to help them learn things like consent.
[00:44:26.440 --> 00:44:30.840] I think that there's a really amazing opportunity there.
[00:44:30.840 --> 00:44:39.560] So, I think, you know, the big takeaway from the researchers is: I like this sentence here: quote, they raise, okay, however, blah, blah, blah, they raise privacy issues, ethical concerns.
[00:44:39.560 --> 00:44:52.200] They underscore the need, all of these issues and concerns, underscore the need for, quote, an educated, research-informed, and well-regulated approach for positive integration into our romantic lives.
[00:44:52.200 --> 00:44:56.280] But current trends indicate that AI companions are here to stay.
[00:44:56.600 --> 00:44:58.360] Like, this is the reality, right?
[00:44:58.360 --> 00:45:12.040] So, how do we ensure that this reality is safe, that this reality is ethical, and that this reality is utilized for harm reduction, not for increasing harm?
[00:45:12.040 --> 00:45:24.360] And when we talk about harm, I mean physical, psychological, financial, all of it, because all of those things are at risk when we're talking about intimate relationships with basically black box AI.
[00:45:24.360 --> 00:45:25.720] All of those things are at risk.
[00:45:25.720 --> 00:45:26.200] Yeah, I agree.
[00:45:26.200 --> 00:45:27.560] That would be like the best case scenario.
[00:45:27.560 --> 00:45:40.600] That would be awesome to have AI companions or whatever, teachers, significant others that are programmed to make you your best self, to challenge you, to work on your personality, your skills, all of that.
[00:45:40.600 --> 00:45:41.240] That would be great.
[00:45:41.240 --> 00:45:45.280] But you could also see this instantly becoming part of the culture wars.
[00:45:45.360 --> 00:45:46.080] It's like, what?
[00:45:46.080 --> 00:45:48.480] Now we've got to be nice to these AI robots.
[00:45:48.480 --> 00:45:50.880] I mean, can I just have my robot slave and be done with it?
[00:45:44.680 --> 00:45:52.080] You got to shame me about it.
[00:45:52.400 --> 00:45:53.280] It's so sad.
[00:45:53.280 --> 00:45:56.080] Like, what does that say about you that you want a robot slave?
[00:45:56.080 --> 00:45:57.280] You know what I mean?
[00:45:58.960 --> 00:46:01.360] Let's self-reflect on that a little bit.
[00:46:01.600 --> 00:46:02.720] It's worth a shot.
[00:46:02.720 --> 00:46:03.680] Let's put it that way.
[00:46:03.680 --> 00:46:05.840] It's worth a shot, but it's not worth a shot in the dark.
[00:46:05.840 --> 00:46:08.960] It's worth a shot done very safely and cautiously.
[00:46:09.440 --> 00:46:15.040] Right, but won't there be bad actors out there who will just throw something together and always.
[00:46:15.040 --> 00:46:16.640] It's probably already happening.
[00:46:16.640 --> 00:46:22.960] I mean, apparently, one of these companies, they were saying we never wanted it, we never intended this to be sexual.
[00:46:22.960 --> 00:46:26.320] It was supposed to be like a friend, right?
[00:46:26.640 --> 00:46:30.400] Well, one of the companies, one of this many, were like, okay, this is like your AI friend.
[00:46:30.400 --> 00:46:35.040] And then people started having sex with them, you know, having cybersex with them.
[00:46:35.040 --> 00:46:39.120] And they started having all of these intense relationships and said they fell in love.
[00:46:39.120 --> 00:46:48.160] And when the developers realized it's being used in this way that we didn't intend and there's some risks there, they cut out that functionality.
[00:46:48.160 --> 00:46:51.520] And that completely changed the AI's algorithm.
[00:46:51.520 --> 00:46:58.400] And all of a sudden, all of these people's friends or companions started to act really differently than they had before.
[00:46:58.400 --> 00:47:01.840] And people had psychological distress.
[00:47:01.840 --> 00:47:04.000] Like there were Reddit threads opening up.
[00:47:04.000 --> 00:47:05.680] There were all of these different conversations.
[00:47:05.680 --> 00:47:07.520] Like, I feel rejected.
[00:47:07.760 --> 00:47:09.840] My girlfriend broke up with me.
[00:47:09.840 --> 00:47:11.760] She suddenly doesn't want me anymore.
[00:47:11.760 --> 00:47:15.200] And it was as if they were dumped by a human being.
[00:47:15.440 --> 00:47:23.120] And so they actually, under so much pressure, reinstated the functionality because it was so traumatic for their end users.
[00:47:23.120 --> 00:47:25.600] So, like, these are real life examples.
[00:47:25.600 --> 00:47:26.240] Oh, my God, wow.
[00:47:26.480 --> 00:47:27.240] Yeah, of the fact that they're talking about that.
[00:47:27.360 --> 00:47:29.480] We're talking about some fragile people.
[00:47:30.040 --> 00:47:33.480] Well, I mean, I don't know if that's a fair thing to say.
[00:47:34.040 --> 00:47:34.520] Have you ever?
[00:47:29.120 --> 00:47:35.000] I don't know.
[00:47:36.680 --> 00:47:39.400] Have you ever had a terrible breakup?
[00:47:39.640 --> 00:47:41.000] Yes, I did have a terrible breakup.
[00:47:41.160 --> 00:47:43.080] Were you a fragile person at that time?
[00:47:43.400 --> 00:47:45.320] At the time, I probably was.
[00:47:45.320 --> 00:47:46.520] Or maybe you were just human.
[00:47:47.400 --> 00:47:49.080] Well, sure.
[00:47:49.080 --> 00:47:52.920] I mean, but I didn't mean to say that there are fragile and non-fragile people.
[00:47:52.920 --> 00:47:55.080] I think everybody has some fragility to them.
[00:47:55.080 --> 00:47:55.400] Right.
[00:47:55.400 --> 00:48:00.360] I think that this is just a very vulnerable topic and a vulnerable experience.
[00:48:00.360 --> 00:48:15.000] When you open yourself up and you really are, you know, your true authentic self, whether it's to an AI or to a human being, when you're sharing your deepest, darkest vulnerabilities with them, that is, I think, actually a form of strength.
[00:48:15.400 --> 00:48:19.000] But we are talking about a group of people who otherwise can't find this among humans.
[00:48:19.560 --> 00:48:20.440] I don't think that's true.
[00:48:20.440 --> 00:48:21.880] I don't think that's a fair assumption.
[00:48:21.880 --> 00:48:22.200] You don't?
[00:48:22.200 --> 00:48:27.400] No, you think they're going after people who are capable of socializing?
[00:48:27.720 --> 00:48:29.560] I don't think anybody's going after anybody.
[00:48:29.560 --> 00:48:31.000] I think these are apps available in the app.
[00:48:31.080 --> 00:48:36.360] Yeah, I think there are people who are more or less vulnerable to this sort of thing, but you don't have to be vulnerable.
[00:48:36.360 --> 00:48:38.200] I think this is just a human condition.
[00:48:38.200 --> 00:48:38.440] Exactly.
[00:48:38.520 --> 00:48:42.600] You know, just like anybody can get addicted to a video game, for example.
[00:48:42.600 --> 00:48:43.640] Right, but who was the first?
[00:48:43.640 --> 00:48:44.600] What was the first question we asked?
[00:48:44.600 --> 00:48:45.880] Who's the end user here?
[00:48:45.880 --> 00:48:47.080] I think it's anybody and everybody.
[00:48:47.560 --> 00:48:47.720] Yeah.
[00:48:48.120 --> 00:48:57.560] And so I guess when I use the word vulnerable, and this is me putting my psychologist hat on here, vulnerability is a form of strength.
[00:48:57.560 --> 00:49:05.000] To be ultimately vulnerable in a trusting relationship is to be very, very brave.
[00:49:05.320 --> 00:49:16.320] And when people are brave in that way, when they open themselves up and they really put themselves out there and they are vulnerable, the bravery comes in the ability to be hurt.
[00:49:14.760 --> 00:49:20.720] And being rejected when you are vulnerable is psychic pain.
[00:49:21.040 --> 00:49:25.120] And I have seen people become suicidal over that kind of pain.
[00:49:25.120 --> 00:49:30.960] I have seen people have incredibly intense psychological reactions to that kind of pain.
[00:49:30.960 --> 00:49:33.520] People who otherwise did not have mental illness.
[00:49:33.520 --> 00:49:43.840] So I think it's, it's, I'm only saying this, Evan, because I think it's unfair to assume that there's something fundamentally different about the types of people or the individuals using this.
[00:49:43.840 --> 00:49:46.720] I think anybody could find themselves in that position.
[00:49:46.720 --> 00:49:50.960] Yeah, the instinct of, well, this couldn't happen to me, I think is naive.
[00:49:50.960 --> 00:49:56.880] Yeah, because if a person, if we've all been through it with people, and that's an assumption.
[00:49:56.880 --> 00:50:06.160] Not everybody listening to this podcast has had their heart broken, but many people have had their hearts broken and they felt crazy in those moments.
[00:50:06.160 --> 00:50:06.720] Oh my God.
[00:50:06.720 --> 00:50:06.880] Yeah.
[00:50:06.880 --> 00:50:08.960] You are not yourself.
[00:50:08.960 --> 00:50:11.520] Yeah, and there's no reason to say that wouldn't happen with a chat bot.
[00:50:11.520 --> 00:50:16.320] Let's end what for me is the bottom line, psychologically, neurologically.
[00:50:16.320 --> 00:50:23.520] Our brains function in a way that we do not distinguish between things that act alive and things that are alive.
[00:50:23.520 --> 00:50:30.000] If something acts alive, we treat it as an agent, as a living thing, emotionally, mentally.
[00:50:30.000 --> 00:50:32.720] That with all that comes along with that.
[00:50:32.720 --> 00:50:33.520] 100%.
[00:50:33.520 --> 00:50:38.640] And then you take, and then that agent gives you something you are craving, you're in.
[00:50:38.640 --> 00:50:39.440] You are in.
[00:50:39.440 --> 00:50:43.520] Eczema isn't always obvious, but it's real.
[00:50:43.520 --> 00:50:46.480] And so is the relief from EBGLIS.
[00:50:46.480 --> 00:50:53.280] After an initial dosing phase, about 4 in 10 people taking EBGLIS achieve itch relief and clear or almost clear skin at 16 weeks.
[00:50:53.280 --> 00:50:57.520] And most of those people maintain skin that's still more clear at one year with monthly dosing.
[00:50:57.520 --> 00:51:09.000] EBGLIS, Liberty Kissy Map, LBKZ, a 250 milligram per 2 milliliter injection, is a prescription medicine used to treat adults and children 12 years of age and older who weigh at least 88 pounds or 40 kilograms with moderate to severe eczema.
[00:51:09.000 --> 00:51:15.320] Also called atopic dermatitis that is not well controlled with prescription therapies used on the skin or topicals or who cannot use topical therapies.
[00:51:15.320 --> 00:51:18.280] EBGLIS can be used with or without topical corticosteroids.
[00:51:18.280 --> 00:51:19.960] Don't use if you're allergic to EBGLIS.
[00:51:19.960 --> 00:51:22.040] Allergic reactions can occur that can be severe.
[00:51:22.040 --> 00:51:22.920] Eye problems can occur.
[00:51:23.080 --> 00:51:25.400] Tell your doctor if you have new or worsening eye problems.
[00:51:25.400 --> 00:51:27.960] You should not receive a live vaccine when treated with EBGLIS.
[00:51:27.960 --> 00:51:31.080] Before starting EBGLIS, tell your doctor if you have a parasitic infection.
[00:51:31.080 --> 00:51:32.120] Searching for real relief?
[00:51:32.120 --> 00:51:39.400] Ask your doctor about EBGLIS and visit ebglis.lily.com or call 1-800-LILLIERX or 1-800-545-5979.
[00:51:39.720 --> 00:51:44.120] You probably think it's too soon to join AARP, right?
[00:51:44.120 --> 00:51:46.360] Well, let's take a minute to talk about it.
[00:51:46.360 --> 00:51:49.080] Where do you see yourself in 15 years?
[00:51:49.080 --> 00:51:53.560] More specifically, your career, your health, your social life.
[00:51:53.560 --> 00:51:56.200] What are you doing now to help you get there?
[00:51:56.200 --> 00:52:01.640] There are tons of ways for you to start preparing today for your future with AARP.
[00:52:01.640 --> 00:52:03.640] That dream job you've dreamt about?
[00:52:03.640 --> 00:52:07.800] Sign up for AARP reskilling courses to help make it a reality.
[00:52:07.800 --> 00:52:12.600] How about that active lifestyle you've only spoken about from the couch?
[00:52:12.600 --> 00:52:17.640] AARP has health tips and wellness tools to keep you moving for years to come.
[00:52:17.640 --> 00:52:21.560] But none of these experiences are without making friends along the way.
[00:52:21.560 --> 00:52:25.560] Connect with your community through AARP volunteer events.
[00:52:25.560 --> 00:52:29.960] So it's safe to say it's never too soon to join AARP.
[00:52:29.960 --> 00:52:34.120] They're here to help your money, health, and happiness live as long as you do.
[00:52:34.120 --> 00:52:38.360] That's why the younger you are, the more you need AARP.
[00:52:38.360 --> 00:52:42.600] Learn more at AARP.org/slash wise friend.
[00:52:42.600 --> 00:52:45.000] This is where projects come to life.
[00:52:45.040 --> 00:52:55.200] Our showrooms are designed to inspire with the latest products from top brands, curated in an inviting, hands-on environment, and a team of industry experts to support your project.
[00:52:55.200 --> 00:53:00.720] We'll be there to make sure everything goes as planned: from product selection to delivery coordination.
[00:53:00.720 --> 00:53:05.920] At Ferguson Bath Kitchen and Lighting Gallery, your project is our priority.
[00:53:05.920 --> 00:53:15.600] Discover great brands like Kohler at your local Ferguson showroom at the Institute for Advanced Reconstruction.
[00:53:15.600 --> 00:53:17.600] We're redefining what's possible.
[00:53:17.600 --> 00:53:24.720] From complex nerve injuries to transformative procedures, we help patients restore movement, strength, and confidence.
[00:53:24.720 --> 00:53:27.920] Learn more at advanced reconstruction.com.
[00:53:27.920 --> 00:53:33.520] All right, these last two news items I call AI scams and solar clams.
[00:53:35.360 --> 00:53:36.080] Wow.
[00:53:36.400 --> 00:53:38.240] Evan, tell us about those AI scams.
[00:53:38.400 --> 00:53:46.320] AI scams, AI-driven scam ads, deep fake tech used to peddle bogus health products.
[00:53:46.320 --> 00:53:49.840] That was the headline, and that is what caught my eye.
[00:53:49.840 --> 00:53:53.360] This was at a place called Hackread.com.
[00:53:53.360 --> 00:53:57.120] Had not heard of it before, but still I stumbled upon it.
[00:53:57.120 --> 00:53:59.600] And the author's name is Habiba Rashid.
[00:53:59.760 --> 00:54:10.080] She writes that scammers are leveraging deep fake technology to create convincing health and celebrity-endorsed ads on social media, targeting millions of people.
[00:54:10.080 --> 00:54:14.880] Here's how to spot and avoid these deceitful scams.
[00:54:14.880 --> 00:54:16.880] Okay, that's good advice.
[00:54:17.200 --> 00:54:18.400] I'm intrigued.
[00:54:18.560 --> 00:54:20.160] Social media, she continues.
[00:54:20.160 --> 00:54:23.600] Social media has always been a hotspot for scam advertisements.
[00:54:23.600 --> 00:54:24.480] Yes.
[00:54:24.480 --> 00:54:37.320] Still, recently, cyber criminals have been creating especially deceitful ads using deep fake technology and the allure of celebrity endorsements to exploit unsuspecting individuals.
[00:54:37.320 --> 00:54:48.040] And a recent investigation by BitFinder Labs highlights a surge in health-related scam ads on major social media platforms like Facebook, Messenger, and Instagram.
[00:54:48.040 --> 00:54:53.240] Okay, so she links to the investigation material in this article.
[00:54:53.240 --> 00:54:55.320] And I, of course, clicked right on there.
[00:54:55.320 --> 00:54:56.840] I head over there.
[00:54:56.840 --> 00:54:58.920] And I found it to be both what?
[00:54:58.920 --> 00:54:59.960] How do I describe it?
[00:54:59.960 --> 00:55:04.600] Informative and a little bit strange, which I will get to.
[00:55:04.600 --> 00:55:07.640] Yeah, the link goes to BitDefender Labs.
[00:55:07.640 --> 00:55:09.800] BitDefender is a product.
[00:55:09.800 --> 00:55:11.320] You may have heard of it.
[00:55:11.720 --> 00:55:15.240] They consider themselves a global leader in cybersecurity.
[00:55:15.400 --> 00:55:19.560] I think they've been around since 2001, so they have a pretty good footprint.
[00:55:19.560 --> 00:55:30.200] Bit Defender provides cybersecurity solutions with leading security efficacy, performance, and ease of use to small and medium businesses, mid-market enterprises, and consumers.
[00:55:30.200 --> 00:55:44.120] Okay, well, despite the fact that this is a product, basically, that they've linked to, their website does have a lot of information on it, and they published an article on their website, and they have a section called Scam Research.
[00:55:44.120 --> 00:55:47.160] So that was the section in which this article appeared.
[00:55:47.160 --> 00:55:55.560] And it says, a deep dive on supplement scams, how AI drives miracle cures and sponsored health-related scams on social media.
[00:55:55.560 --> 00:55:58.760] So this is the source material for that original article.
[00:55:59.000 --> 00:56:04.760] There are four authors here, all with names that I would definitely be mispronouncing, I'm certain.
[00:56:05.080 --> 00:56:06.600] But they are Romanian.
[00:56:06.600 --> 00:56:07.960] I looked up a couple of the names.
[00:56:07.960 --> 00:56:11.000] They appear to all be Romanian, four Romanian authors here.
[00:56:11.000 --> 00:56:16.000] And I think we'll link to this so you can go ahead and give the article a read for yourself.
[00:56:16.320 --> 00:56:24.080] I think this was translated from Romanian to English, and when you go and you read it, you're going to, you know, it just feels a little off in a way.
[00:56:24.080 --> 00:56:24.800] You know, I don't know.
[00:56:24.800 --> 00:56:27.040] Tell me if you feel the same about that when you read it.
[00:56:27.040 --> 00:56:28.240] It felt a little odd to me.
[00:56:28.240 --> 00:56:30.240] But in any case, here's their deep dive.
[00:56:30.240 --> 00:56:35.120] They start by talking about how sponsored social media content is on the rise.
[00:56:35.120 --> 00:56:36.640] Okay, that's no surprise.
[00:56:36.640 --> 00:56:41.920] But hand in hand has been the rise of scams in the form of phony ads on social media.
[00:56:41.920 --> 00:56:53.520] And by phony, I mean that the faces and the voices that often accompany the product being sold are either outright A
Prompt 2: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 3: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
Prompt 5: Context Setup
You are an expert data extractor tasked with analyzing a podcast transcript.
I will provide you with part 2 of 2 from a podcast transcript.
I will then ask you to extract different types of information from this content in subsequent messages. Please confirm you have received and understood the transcript content.
Transcript section:
t enterprises, and consumers.
[00:55:30.200 --> 00:55:44.120] Okay, well, despite the fact that this is a product, basically, that they've linked to, their website does have a lot of information on it, and they published an article on their website, and they have a section called Scam Research.
[00:55:44.120 --> 00:55:47.160] So that was the section in which this article appeared.
[00:55:47.160 --> 00:55:55.560] And it says, a deep dive on supplement scams, how AI drives miracle cures and sponsored health-related scams on social media.
[00:55:55.560 --> 00:55:58.760] So this is the source material for that original article.
[00:55:59.000 --> 00:56:04.760] There are four authors here, all with names that I would definitely be mispronouncing, I'm certain.
[00:56:05.080 --> 00:56:06.600] But they are Romanian.
[00:56:06.600 --> 00:56:07.960] I looked up a couple of the names.
[00:56:07.960 --> 00:56:11.000] They appear to all be Romanian, four Romanian authors here.
[00:56:11.000 --> 00:56:16.000] And I think we'll link to this so you can go ahead and give the article a read for yourself.
[00:56:16.320 --> 00:56:24.080] I think this was translated from Romanian to English, and when you go and you read it, you're going to, you know, it just feels a little off in a way.
[00:56:24.080 --> 00:56:24.800] You know, I don't know.
[00:56:24.800 --> 00:56:27.040] Tell me if you feel the same about that when you read it.
[00:56:27.040 --> 00:56:28.240] It felt a little odd to me.
[00:56:28.240 --> 00:56:30.240] But in any case, here's their deep dive.
[00:56:30.240 --> 00:56:35.120] They start by talking about how sponsored social media content is on the rise.
[00:56:35.120 --> 00:56:36.640] Okay, that's no surprise.
[00:56:36.640 --> 00:56:41.920] But hand in hand has been the rise of scams in the form of phony ads on social media.
[00:56:41.920 --> 00:56:53.520] And by phony, I mean that the faces and the voices that often accompany the product being sold are either outright AI fabrications or they're AI aversions of people who really exist.
[00:56:53.520 --> 00:56:56.960] And they're basically deep faking those consumers.
[00:56:57.840 --> 00:56:58.960] Here's what the article says.
[00:56:58.960 --> 00:57:09.760] Researchers at BitDefender Labs collected and analyzed health-related scams across the globe over a three-month period from March through May of 2024, so very recently.
[00:57:09.760 --> 00:57:12.320] And here were their key findings.
[00:57:12.320 --> 00:57:26.320] Number one, a market increase of health-related fraudulent ads leveraging AI-generated images, videos, and audio promoting various supplements, especially on Meta's social platforms, Facebook, Messenger, and Instagram.
[00:57:26.320 --> 00:57:26.720] Okay.
[00:57:27.680 --> 00:57:39.120] Number two, the highest number of followers, a compromise/slash fake page that promoted false advertisements was over 350,000 followers.
[00:57:39.120 --> 00:57:40.880] That's not insignificant.
[00:57:40.880 --> 00:57:46.160] Scammers used over a thousand different deep fake videos across all communities.
[00:57:46.160 --> 00:57:51.840] They discovered that there were over 40 medical supplement ads that were promoted among these.
[00:57:51.840 --> 00:58:15.240] Most of the ads catered to targeted geographical regions with tailored content using the names of celebrities, politicians, TV presenters, doctors, and other healthcare professionals in order to bait those consumers, including people like, well, Brad Pitt or Cristiano Ronaldo, soccer player, football player, George Clooney, we certainly know who that is.
[00:58:15.560 --> 00:58:15.960] Dr.
[00:58:15.960 --> 00:58:18.680] Ben Carson, I think most of us know who that is.
[00:58:18.680 --> 00:58:20.200] Bill Maher, sorry.
[00:58:20.440 --> 00:58:23.320] Denzel Washington, someone named Dr.
[00:58:23.320 --> 00:58:33.880] Heinz Luscher, and a bunch of other doctors that are apparently either in Romania or somewhere in Eastern Europe, they have some sort of celebrity footprint to them.
[00:58:33.880 --> 00:58:34.360] Okay.
[00:58:34.680 --> 00:58:40.680] The campaigns targeted millions of recipients across the globe, including Europe, North America, the Middle East, and Australia.
[00:58:40.680 --> 00:58:42.440] So basically, you know, practically everywhere.
[00:58:42.440 --> 00:58:43.800] Oh, Asia as well.
[00:58:44.040 --> 00:58:49.960] These are highly convincing messages that are grammatically correct and in the context with the ads.
[00:58:49.960 --> 00:58:53.400] In other words, not so easy to spot, right?
[00:58:53.400 --> 00:59:06.840] I mean, we've been able to look at some things that have been faked, and we can pull out some irregularities about them that would denote them as fake, but they said, no, for the most part, things here are pretty good.
[00:59:06.840 --> 00:59:10.280] They said most of the videos show clear signs of tampering, though.
[00:59:10.600 --> 00:59:18.600] If you're an expert and you know what to look for, but they also found instances that were very difficult to put into the deep fake category.
[00:59:18.600 --> 00:59:22.680] So it's becoming more and more sophisticated, is basically what they're saying.
[00:59:22.680 --> 00:59:32.120] The scammers exploit individuals who are desperate in finding a solution or treatment that will help them ease their symptoms or even cure chronic underlying diseases, they say.
[00:59:32.120 --> 00:59:36.200] And they said some of the most observed scenarios are depicted in these examples.
[00:59:36.200 --> 00:59:40.360] Number one, advertisements are described as alternatives to conventional medicine.
[00:59:40.360 --> 00:59:42.120] Where have we heard about that before?
[00:59:42.120 --> 00:59:51.360] The decline in trust in conventional medicine, aggravated by many scandals within the pharmaceutical industry, is used to prompt consumers into seeking alternative solutions.
[00:59:51.360 --> 00:59:55.920] That we definitely know from being on that side of this fight.
[00:59:55.920 --> 00:59:57.600] So we're no strangers to that.
[00:59:57.600 --> 01:00:00.240] Number two, the scientists say so.
[01:00:00.240 --> 01:00:09.360] Yeah, to boost credibility, scammers use deepfake technology to create videos of these individuals who are giving scientific explanations for the effectiveness of the product.
[01:00:09.360 --> 01:00:12.320] And apparently they come off as very convincing.
[01:00:12.320 --> 01:00:32.240] And right, why would if you did have, say, a doctor who has some sort of either notoriety, celebrity, whatever, and you're able to use the AI to make that image say whatever it is you want it to say, that definitely is going to have an impact on how people see the particular product.
[01:00:32.240 --> 01:00:35.680] Here's where I thought it started to get a little bit interesting and a little bit weird.
[01:00:35.680 --> 01:00:42.400] They talked about the anatomy of a supplement scam campaign, and they basically used it as their example.
[01:00:42.560 --> 01:00:47.760] Starts with fraudsters crafting social media pages to spread misleading advertisements.
[01:00:47.760 --> 01:00:53.440] They spotted thousands of these pages that promote cures for common ailments or health problems.
[01:00:53.440 --> 01:00:53.840] Yeah.
[01:00:53.840 --> 01:01:03.520] And they say that selling the metaphorical snake oil, while not uncommon, gains a huge audience, perhaps even credibility in the context of these modern social media campaigns.
[01:01:03.520 --> 01:01:11.120] And while conducting the research, they observe thousands of social media pages and websites serving these supplement scams specifically.
[01:01:11.120 --> 01:01:18.640] And likely the number exceeds 10,000, perhaps tens of thousands of social media pages with these.
[01:01:18.640 --> 01:01:20.480] So it's basically growing.
[01:01:20.480 --> 01:01:26.480] It's interesting, though, because in their example of this, they point to a deep fake of someone named Dr.
[01:01:26.480 --> 01:01:38.840] Heinz Luscher, L-U with umlautz over it, L-U-S-C-H-E-R, who is apparently well known, not in America, but in parts of Europe, perhaps Romania and some other places.
[01:01:38.840 --> 01:01:46.600] And they've used a deep fake of him, okay, and you know, basically promoting whatever it is, a supplement of some kind.
[01:01:46.600 --> 01:01:59.960] But then I went and I actually looked up this doctor online, and he's, you know, basically integrative medicine and, you know, complementary medicine and does all the, you know, the other things that we talk about.
[01:01:59.960 --> 01:02:02.680] So that's legitimately who this guy is.
[01:02:02.680 --> 01:02:15.480] But they're talking about faking the fact that here's this same, a fake version of this person talking about something else that he normally doesn't talk about, whether it's a supplement or whatever.
[01:02:15.800 --> 01:02:25.080] So it's kind of a scam of another scam trying to trick people, covering up another scam, right?
[01:02:25.640 --> 01:02:35.960] That's where it kind of got a little weird for me that they use this particular person as their example of how one of these campaigns go.
[01:02:35.960 --> 01:02:41.080] Because if you look at the truth of this guy, what he's doing is kind of a scam anyways to begin with.
[01:02:41.080 --> 01:02:43.480] So don't fall for the scam of the scam.
[01:02:44.120 --> 01:02:45.160] It's weird.
[01:02:45.160 --> 01:02:48.120] Yeah, well, it's frightening look at what we're in store for.
[01:02:48.440 --> 01:03:05.720] Yeah, like it's very easy now for people, individuals, small corporations, whatever, to mass produce the fake reality, you know, fake endorsements, fake claims, fake scientific education, fake news articles, whatever they want, and it's just going to be overwhelming.
[01:03:05.720 --> 01:03:12.440] And, you know, the only real solution I see to this is very careful and rigorously enforced regulations.
[01:03:12.440 --> 01:03:15.000] There's really no bottom-up solution to this.
[01:03:15.360 --> 01:03:27.200] Yeah, you can't expect the consumer to have a deep level of sophistication and understanding the nuances of things, the AI deep fakes that are going on to be able to see.
[01:03:27.280 --> 01:03:34.880] Yeah, you can't expect everybody to be constantly filtering out massive amounts of fraud every day of their life.
[01:03:34.880 --> 01:03:35.840] It's not practical.
[01:03:35.840 --> 01:03:37.360] I mean, who wants to live like that?
[01:03:37.360 --> 01:03:38.880] No, not practical.
[01:03:38.880 --> 01:03:40.880] It's not just a minimal protection.
[01:03:42.480 --> 01:03:45.600] So kudos to them to kind of bring this to everyone's attention.
[01:03:45.600 --> 01:03:48.880] At the same time, I think they could have used some better examples.
[01:03:48.880 --> 01:03:49.360] All right.
[01:03:49.360 --> 01:03:50.400] Thanks, Evan.
[01:03:50.400 --> 01:03:51.040] Thanks.
[01:03:51.040 --> 01:03:53.600] Okay, so solar clams.
[01:03:53.600 --> 01:03:55.520] Solar clams.
[01:03:55.520 --> 01:03:55.920] Yeah.
[01:03:56.240 --> 01:03:59.440] It sounds like a sci-fi movie from the 50s.
[01:03:59.440 --> 01:04:01.360] So what's up with these guys, right?
[01:04:01.360 --> 01:04:03.120] So this is an interesting study.
[01:04:03.760 --> 01:04:06.320] You will file this one under the biomimicry, right?
[01:04:06.320 --> 01:04:20.080] Like we like when technology is inspired by the millions, hundreds of millions or whatever years of evolutionary tinkering that living things have done to perfect anatomical solutions to problems.
[01:04:20.080 --> 01:04:26.640] And then we piggyback on that evolutionary tinkering to get inspiration for our own technology.
[01:04:26.640 --> 01:04:26.960] All right.
[01:04:26.960 --> 01:04:30.000] So in this case, we're looking at clams.
[01:04:30.240 --> 01:04:36.160] These giant clams are photosymbiotic animals, right?
[01:04:36.160 --> 01:04:42.720] So they have a symbiotic relationship with photosynthetic algae.
[01:04:43.120 --> 01:04:47.120] The algae exist, these are single-celled algae creatures.
[01:04:47.120 --> 01:04:53.680] They exist in these vertical columns on the surface of the clams.
[01:04:53.680 --> 01:04:58.560] And they use light for photosynthesis to create food.
[01:04:58.560 --> 01:05:00.520] And some of that food gets eaten by the clams.
[01:05:01.720 --> 01:05:11.320] So, what the researchers were looking at, this is Allison Sweeney, who is an associate professor of physics and of ecology and evolutionary biology at Yale.
[01:05:11.480 --> 01:05:23.080] What she and her colleagues were looking at for is the anatomy of these photosynthetic structures on the clams and how that relates to their quantum efficiency.
[01:05:23.080 --> 01:05:27.080] Bob, have you ever heard that term quantum efficiency before?
[01:05:27.800 --> 01:05:31.480] I don't think I've heard that term, quantum efficiency?
[01:05:31.560 --> 01:05:33.240] Sounds chokering.
[01:05:33.480 --> 01:05:37.720] Not as complicated as it sounds.
[01:05:37.720 --> 01:05:45.000] Quantum efficiency is the measure of the effectiveness of an imaging device to convert incident photons into electrons.
[01:05:45.720 --> 01:05:47.640] But they really need quantum in that term.
[01:05:47.880 --> 01:06:05.160] So, like, for example, if you have a 100% quantum efficiency, a system that's exposed to 100 photons would produce 100 electrons, or in the case of photosynthetic creatures, 100 electrons would produce 100 reactions of photosynthesis.
[01:06:05.160 --> 01:06:09.000] So, you're using all of the photons, basically.
[01:06:09.320 --> 01:06:16.520] So, what they wanted to find out was what was the quantum efficiency of the photosynthetic algae in these clams.
[01:06:16.840 --> 01:06:21.000] And what they found was that they're quite high.
[01:06:21.000 --> 01:06:22.600] They had a quantum efficiency.
[01:06:22.600 --> 01:06:25.160] So, what they did was they modeled the quantum efficiency.
[01:06:25.160 --> 01:06:33.320] They just said, okay, we're going to make a model of just the anatomical structure of these clams and how the algae is organized in these vertical columns.
[01:06:33.320 --> 01:06:37.000] And they found that the quantum efficiency was 42%.
[01:06:37.320 --> 01:06:47.440] However, we know from direct measurements that these you know, these photosymbionts have a higher efficiency than that.
[01:06:44.760 --> 01:06:50.480] So, what they figured out, so there's something missing from the model.
[01:06:50.800 --> 01:06:58.400] So, then they included new information having to do with the dynamic movement of the clams, because the clams will open and close their mouth.
[01:06:58.400 --> 01:07:08.400] And when they do this, this stretches the vertical columns so they become sort of wider but shorter, and then they could stretch them back and make them narrower and longer.
[01:07:08.400 --> 01:07:14.240] And that the clams will do this in response to the amount of light falling upon them.
[01:07:14.240 --> 01:07:14.640] Right?
[01:07:14.640 --> 01:07:22.960] So, when you include this dynamic movement of the clams, the quantum efficiency jumps to 67%.
[01:07:22.960 --> 01:07:33.040] Now, to put things into context, just a tree in the same part of the world, like a tropical environment, would have a quantum efficiency of about 14%.
[01:07:33.040 --> 01:07:34.320] That's a lot less.
[01:07:34.320 --> 01:07:43.200] So, these clams are incredibly efficient in terms of just their three-dimensional structure in terms of their quantum efficiency.
[01:07:43.200 --> 01:07:48.800] And they're in fact the most efficient structures that we've ever seen.
[01:07:48.800 --> 01:08:03.520] But, interestingly, trees like arboreal forests in the northern hemisphere, in northern latitudes that are far away from the equator, they have similar levels, although not quite as much, but similar levels of quantum efficiency.
[01:08:03.520 --> 01:08:04.960] Again, make sense.
[01:08:05.280 --> 01:08:13.120] They use sort of the vertical structure in order to maximize their efficiency because they don't have as much light.
[01:08:13.120 --> 01:08:15.120] They've got to make the most of the light that they get.
[01:08:15.120 --> 01:08:30.120] There's another aspect to this as well, in terms of the anatomy, and that is that the top layer of cells over the columns that hold the algae scatters light.
[01:08:29.360 --> 01:08:33.080] It's a light scattering layer of cells.
[01:08:33.320 --> 01:08:40.200] And so that light scattering also, these are the eridocytes, is the name of the cells.
[01:08:40.200 --> 01:08:47.160] The eridocytes, they scatter the light, which maximizes the absorption of photons as well, right?
[01:08:47.160 --> 01:08:50.920] Because the light's bouncing around, it has multiple opportunities to be absorbed.
[01:08:50.920 --> 01:08:54.360] So these are the main takeaways from this.
[01:08:54.360 --> 01:09:09.560] You have a light scattering layer, you have vertical columns, and you have some kind of dynamic adaptation to the amount of light and the, I guess, the angle of the light, et cetera, to maximize the quantum efficiency.
[01:09:09.560 --> 01:09:14.680] And you can get up into the 60s, 67% in their model.
[01:09:14.680 --> 01:09:23.080] The obvious implications of this is that we want to use this knowledge in order to design more efficient solar panels, right?
[01:09:23.080 --> 01:09:24.360] Photovoltaics.
[01:09:25.000 --> 01:09:32.040] Some of these things are already being incorporated in design, like scattering light using multiple layers, using sort of vertical structures.
[01:09:32.040 --> 01:09:39.080] But obviously, this information could be very, very useful for attempts at improving those designs.
[01:09:39.080 --> 01:09:47.240] Right now, for context, again, like a commercially available silicon solar panel is about 22, 23% efficient, which is really good.
[01:09:47.480 --> 01:09:54.840] When we started following this tech 20 years ago, it was maybe 10, 12% efficient, so it's almost doubled since then.
[01:09:54.840 --> 01:10:02.320] Maybe there's the potential, I don't know if we can get all the way up to 67%, but even if we get up to like 40% or 45% from where we are now.
[01:10:02.320 --> 01:10:07.240] Imagine twice the electricity production from the same area.
[01:10:08.200 --> 01:10:09.080] That's huge.
[01:10:09.480 --> 01:10:15.440] Anything that makes solar panels more cost-effective is great, of course.
[01:10:15.760 --> 01:10:30.800] Now, you have also the organic solar panels, which are the best ones now are getting up to like 15% efficient, which is not quite as good as the silicon, but they are soft, flexible, durable, cheap.
[01:10:31.360 --> 01:10:38.000] So they're getting close to the point where they're like really commercially viable for a lot for more and more applications.
[01:10:38.000 --> 01:10:56.320] Now, if we could apply this kind of technology to some combination of either perovskite, silicon, organic, or whatever, some combination of these solar panels, if we get to the point where it's going to be so cheap and easy to add solar panels that they're just going to be everywhere, right?
[01:10:56.320 --> 01:10:59.360] It's going to be, why not put them on everything?
[01:10:59.360 --> 01:11:00.800] So that would be nice.
[01:11:00.800 --> 01:11:02.800] Yeah, that'd be nice if we get to that point.
[01:11:02.800 --> 01:11:17.680] So this is just one more study adding to the pile of these sort of basic signs and incremental advantages in this photovoltaic tech that is the reason why solar is getting so much better, so you know, so much cheaper.
[01:11:17.680 --> 01:11:28.160] And it's just good to see that the potential here for efficiencies like north of 40, 50% is just incredible.
[01:11:28.160 --> 01:11:28.800] Nice.
[01:11:28.800 --> 01:11:30.640] Looking forward to that day.
[01:11:30.640 --> 01:11:31.120] All right.
[01:11:31.120 --> 01:11:33.920] So there's no who's that noisy this week.
[01:11:33.920 --> 01:11:36.480] Jay, we'll just pick that up next week.
[01:11:36.480 --> 01:11:43.680] But I do have a TikTok from TikTok segment for this week to make up for it.
[01:11:44.000 --> 01:12:00.760] And so every Wednesday, usually starting around noon, we do some live streaming to various social media to TikTok, of course, to Facebook, to Twitter, to whatever else Ian streams to.
[01:12:00.760 --> 01:12:01.960] MySpace.
[01:11:59.920 --> 01:12:04.360] Yep, to MySpace, Refrigerator.
[01:12:07.640 --> 01:12:10.360] If you need to find us, just use Ask Jeeves.
[01:12:10.360 --> 01:12:17.320] All right, so one of the videos I covered this week is by a TikToker called Amoon Loops.
[01:12:17.320 --> 01:12:23.640] And she was telling the story of, this is like real food babe territory, just to warn you, right?
[01:12:23.640 --> 01:12:30.040] So she took her Autistic Child to a doctor to get some blood tests.
[01:12:30.040 --> 01:12:34.760] And they tested, you know, among the screening they did, they tested for heavy metals.
[01:12:34.760 --> 01:12:41.640] And she reports that the antimony level was in the 97th percentile.
[01:12:41.960 --> 01:12:45.800] But she had no idea where antimony could be coming from, right?
[01:12:46.120 --> 01:13:03.000] So she did an investigation, you know, and found that the power cord of the air fryer that she's been using to make her child's food every day for the last couple of years has antimony in it.
[01:13:03.000 --> 01:13:04.680] Yeah, how would it get in the food?
[01:13:04.920 --> 01:13:05.320] Right.
[01:13:05.720 --> 01:13:07.640] Well, that's the question, isn't it?
[01:13:07.640 --> 01:13:08.120] Right?
[01:13:08.120 --> 01:13:12.360] How can antimony get from the power cord into the food?
[01:13:12.360 --> 01:13:15.320] Well, she doesn't really do any scientific investigation.
[01:13:15.320 --> 01:13:17.480] She doesn't close the loop on this evidence.
[01:13:17.480 --> 01:13:19.080] So just that was it.
[01:13:19.080 --> 01:13:20.680] Made a massive leap.
[01:13:20.680 --> 01:13:22.680] It must be the air fryer.
[01:13:22.680 --> 01:13:24.280] So she threw out her air fryer.
[01:13:24.280 --> 01:13:28.040] She's telling everybody to throw out their air fryers because they're bad for you.
[01:13:28.040 --> 01:13:29.240] They're toxic.
[01:13:29.240 --> 01:13:33.000] All right, so let's back up a little bit and deconstruct this.
[01:13:33.000 --> 01:13:40.840] So, first of all, yes, antimony is a heavy metal, and you can't get heavy metal toxicity from it.
[01:13:40.840 --> 01:13:42.840] It's similar to arsenic.
[01:13:43.800 --> 01:13:48.320] There are safety limits that the FDA and the EPA sets for antimony.
[01:13:44.840 --> 01:13:52.240] So, one question I have is: first of all, I don't know what kind of doctors she took the trial to.
[01:13:52.320 --> 01:13:58.800] There are a lot of obviously fringe doctors out there, fringe labs, and why would they have tested them for antimony and all things?
[01:13:58.800 --> 01:14:00.400] So, that's curious.
[01:14:00.400 --> 01:14:09.520] Saying it was the 97th percentile doesn't really tell us much either, because what I want to know is the absolute number, and is it in or outside of the toxic range?
[01:14:09.520 --> 01:14:11.440] Like, is it in the safety range or not?
[01:14:11.440 --> 01:14:14.320] So, just saying 97th percentile doesn't tell us much.
[01:14:14.320 --> 01:14:20.320] Maybe 98% of people are in the safety range, you know, are within the safety limits, which is probably true.
[01:14:20.320 --> 01:14:26.560] So, you know, that again doesn't mean that it was necessarily that it was too high, even though it sounds high.
[01:14:26.560 --> 01:14:39.360] Also, if you do have a high antimony level, you're supposed to do a follow-up test with antimony-free test tubes because you can get an artificially high level from the testing equipment itself.
[01:14:39.360 --> 01:14:46.640] So, that first test is considered just a screen, and without the follow-up test, to verify it, you don't know if it's real or not.
[01:14:46.800 --> 01:14:49.440] No indication that that was done.
[01:14:49.760 --> 01:14:53.200] Now, what about the antimony in the air fryer?
[01:14:53.200 --> 01:14:57.120] So, antimony is a common alloy used in electronics.
[01:14:57.360 --> 01:15:04.560] As an alloy, it tends to strengthen the other metals, right, that it's combined with.
[01:15:04.560 --> 01:15:12.640] And the use of antimony is actually increasing because it's also been recently discovered that it could increase some of the properties, desirable properties of lithium-ion batteries.
[01:15:12.640 --> 01:15:19.280] So, if anything, our use of antimony in electronics and battery technology is going to be increasing.
[01:15:19.280 --> 01:15:28.240] There are, I found, over a thousand household electronics that have antimony in their electronics, in their power cord or whatever.
[01:15:28.240 --> 01:15:29.200] So, that's not a comment.
[01:15:29.200 --> 01:15:30.760] Why focus on the air fryer?
[01:15:30.840 --> 01:15:33.160] You know, again, makes no real sense.
[01:15:29.680 --> 01:15:44.120] Couldn't, again, the big thing, the big hole, she didn't in any way demonstrate that the antimony that her son was exposed to, if it's real, was coming from the air fryer.
[01:15:44.120 --> 01:15:48.040] And it's not really plausible that it would get from the power cord into the food.
[01:15:48.040 --> 01:15:50.440] I mean, I have air fryers, I use air fryers.
[01:15:50.440 --> 01:15:52.600] The food goes in a basket, right?
[01:15:52.600 --> 01:15:56.360] There's no antimony in the basket that you're putting the food into.
[01:15:56.360 --> 01:16:01.080] So there's really no plausible way that it should leach into the food.
[01:16:01.080 --> 01:16:09.480] You can't really argue that it's like being evaporated or anything because the melting point of antimony is like over a thousand degrees Fahrenheit.
[01:16:09.480 --> 01:16:12.520] And you'd have to heat it up even more to turn it into a gas.
[01:16:12.520 --> 01:16:16.200] So we're not getting anywhere near those temperatures.
[01:16:16.200 --> 01:16:19.640] So it's just not plausible, not a plausible source of antimony.
[01:16:19.640 --> 01:16:24.600] Again, if it's even real in this case, which wasn't proven.
[01:16:24.600 --> 01:16:28.600] And so there are more plausible routes of exposure.
[01:16:28.600 --> 01:16:31.960] Antimony is used in the preparation of PET plastic.
[01:16:31.960 --> 01:16:38.760] It's not in the plastic, but it could be a residue that's still left behind from the manufacturing process.
[01:16:38.760 --> 01:16:46.760] And water stored in like single-use PET plastic bottles could get a little bit of antimony that leaches into them.
[01:16:46.760 --> 01:16:50.840] And that's probably one of the most common exposures residentially.
[01:16:50.840 --> 01:16:58.040] Obviously, there's always the potential for exposure in the workplace if you're working in a company that uses antimony in its manufacturing process.
[01:16:58.040 --> 01:17:01.400] Although, apparently, from the research that I did, that's not a big problem.
[01:17:01.400 --> 01:17:06.840] It's just antimony is not something that people generally get exposed to, even industrially.
[01:17:06.840 --> 01:17:10.280] But residentially, it's not coming from your power cord.
[01:17:10.280 --> 01:17:17.280] If somehow you're getting exposed to antimony, you know, in your environment, that's not where I would be looking for the exposure.
[01:17:14.840 --> 01:17:20.000] It's probably from PET plastics.
[01:17:20.320 --> 01:17:25.120] That would be a much more plausible culprit there.
[01:17:25.440 --> 01:17:28.720] So, you know, this is the culture of TikTok.
[01:17:28.720 --> 01:18:01.120] Somebody who doesn't know what they're talking about, making huge leaps, huge assumptions, not doing anything even remotely scientific, not, you know, doing any serious investigation, just completely superficial, and then making massive leaps of logic and going right to the fear-mongering and then just telling their followers to throw out this perfectly safe appliance, which actually is good for you in that it cooks with less oil than other types of cooking.
[01:18:01.120 --> 01:18:03.680] Yeah, I mean, like, there's nothing magical about an air fryer.
[01:18:03.680 --> 01:18:04.720] It's just a small oven.
[01:18:04.720 --> 01:18:06.960] Yeah, they're tabletop or countertop ovens.
[01:18:07.040 --> 01:18:10.960] It's just the air fryers are efficient because the space is very small.
[01:18:10.960 --> 01:18:13.520] The heats are a lot less energy.
[01:18:14.720 --> 01:18:16.480] The food cooks a lot more quickly.
[01:18:16.480 --> 01:18:18.480] It's just an efficient design.
[01:18:18.480 --> 01:18:23.760] But what I'm finding actually is that the air fryers are the new microwaves.
[01:18:23.760 --> 01:18:30.800] And what I mean by that is that, like, since microwaves have been around, there's been all these conspiracy theories surrounding microwaves because people are afraid of it.
[01:18:30.800 --> 01:18:33.120] You know, it's like this, it's high technology.
[01:18:33.120 --> 01:18:39.840] So that people get anxious about that and they invent issues.
[01:18:39.840 --> 01:18:43.920] So there's been conspiracy theories about microwaves swirling around for decades.
[01:18:43.920 --> 01:18:47.760] And now we're seeing the same thing with air fryers just because they're new.
[01:18:47.760 --> 01:18:50.480] But again, it's like they're just small ovens.
[01:18:50.480 --> 01:18:52.800] There's nothing magical about them.
[01:18:52.800 --> 01:18:54.800] The air fryer is great for frozen food.
[01:18:54.800 --> 01:18:57.520] I mean, it's great for a lot of things, but it's really great for frozen food.
[01:18:57.520 --> 01:18:59.120] Reheating pizza.
[01:18:59.120 --> 01:19:00.200] Reheating anything.
[01:19:00.200 --> 01:19:02.200] Yeah, it's really good for reheating stuff, too.
[01:19:02.200 --> 01:19:03.480] I should get one.
[01:19:03.800 --> 01:19:05.800] Okay, we got one email.
[01:19:06.040 --> 01:19:09.240] This one comes from Daniel Kay from LA.
[01:19:09.240 --> 01:19:10.760] Another rhyme.
[01:19:10.760 --> 01:19:14.200] He writes: I'm a longtime listener and fan of the SGU.
[01:19:14.200 --> 01:19:17.880] I have been reading more about climate change scientists and came across Dr.
[01:19:17.880 --> 01:19:26.200] Judith Curry and her testimony on the subject that sounds straight out of the SGU critical thinking and following the data approach to skepticism.
[01:19:26.200 --> 01:19:29.080] What is your take and shouldn't this be open for discussion?
[01:19:29.080 --> 01:19:32.200] Then he gives a link to her testimony.
[01:19:32.520 --> 01:19:34.200] So, yeah, so Dr.
[01:19:34.200 --> 01:19:39.720] Judith Curry is a well-known climate change denier.
[01:19:39.720 --> 01:19:41.960] But she's also a climatologist.
[01:19:41.960 --> 01:19:42.440] Yes.
[01:19:42.440 --> 01:19:44.760] Yeah, which is climate scientists.
[01:19:45.240 --> 01:19:45.800] She's one of the clients.
[01:19:46.040 --> 01:19:47.320] It makes it more complicated.
[01:19:47.320 --> 01:19:47.720] Exactly.
[01:19:48.120 --> 01:19:49.000] The one person.
[01:19:49.800 --> 01:19:50.280] Right.
[01:19:50.280 --> 01:19:53.960] So she is clearly an outlier.
[01:19:54.440 --> 01:20:04.520] She has opinions about the science behind anthropogenic global warming that is out of the mainstream, right?
[01:20:04.520 --> 01:20:10.760] So she disagrees with the 98% or whatever of her colleagues who interpret the evidence differently.
[01:20:10.760 --> 01:20:17.480] And she is known as a contrarian, and she's had this contradictory opinion for decades.
[01:20:17.480 --> 01:20:20.040] I don't know how this really, really all happened.
[01:20:20.040 --> 01:20:26.920] I don't know if this is, you know, maybe she's just not a very good climate scientist or she is just a contrarian generally.
[01:20:26.920 --> 01:20:33.480] Or maybe early on she was not as convinced by the evidence or saw some problems with the evidence.
[01:20:33.480 --> 01:20:45.280] And then once she got into the position of being like the skeptic of climate change, she felt like she had to defend that position and couldn't, you know, get out of it.
[01:20:45.280 --> 01:20:46.800] And then like doubled, tripled down.
[01:20:44.760 --> 01:20:48.480] I don't know what the process was.
[01:20:48.800 --> 01:20:54.560] What I do know is that her opinions on climate change have not held up well over time.
[01:20:54.880 --> 01:21:02.000] But I can imagine that in a vacuum, you know, without somebody standing next to her fact-checking her, she's using all the right lingo.
[01:21:02.000 --> 01:21:06.160] She sounds like she knows what she's talking about, and she has all the right credentials.
[01:21:06.160 --> 01:21:08.080] And that can be really confusing.
[01:21:08.080 --> 01:21:14.640] Her big thing is that she says that we're more, the data is more uncertain than her colleagues are saying.
[01:21:14.800 --> 01:21:18.080] She actually kind of agrees that, yes, the Earth is warming.
[01:21:18.080 --> 01:21:23.840] Yes, it's due in part to human-generated greenhouse gases, including carbon dioxide.
[01:21:23.840 --> 01:21:27.760] Yes, this could lead to potentially catastrophic consequences.
[01:21:27.760 --> 01:21:39.440] But just that there's way more uncertainty than what the scientific community and the international panel, intergovernmental panel on climate change is saying.
[01:21:39.440 --> 01:21:44.160] That's kind of been the drum that she's been beating.
[01:21:44.480 --> 01:21:54.000] But the thing is, when you get down to it, when you look at her specific opinions, they're not that far off of sort of mainstream climate change denial.
[01:21:54.000 --> 01:21:56.880] So for example, let's go over some of the things that she said.
[01:21:57.120 --> 01:22:02.080] She said that global warming stopped in 1995.
[01:22:02.080 --> 01:22:10.000] And she said it again in about 1998 and 2002 and 2007 and 2010.
[01:22:10.240 --> 01:22:15.600] So there's fluctuations in the background temperature.
[01:22:15.600 --> 01:22:19.120] And this has been a ploy of, you know, again, climate change deniers for a very long time.
[01:22:19.120 --> 01:22:22.960] Every time the curve turns down, they say, oh, look, climate change has stopped.
[01:22:22.960 --> 01:22:24.800] It's reverting to the mean or whatever.
[01:22:24.800 --> 01:22:29.080] But of course, the back, the long-term trend has not changed.
[01:22:29.400 --> 01:22:30.840] We're still warming.
[01:22:30.840 --> 01:22:35.480] So she was wrong every time she said, you know, that global warming has stopped.
[01:22:35.480 --> 01:22:46.520] She also bought into the whole scientist, tried to, quote-unquote, hide the decline as sort of some kind of conspiracy to hide, I guess, the uncertainty, which has been completely debunked.
[01:22:46.520 --> 01:22:58.520] She's characterized the IPCC as alarmist, even though their predictions underestimated climate warming since they've been underestimating it.
[01:22:58.920 --> 01:23:01.480] But yet she's calling them alarmist.
[01:23:01.480 --> 01:23:11.800] She also argued at one point that there is no consensus, you know, despite the fact that 97%, now more, of climate experts agree in anthropogenic global warming.
[01:23:11.800 --> 01:23:13.960] So she's just, she's a contrarian.
[01:23:13.960 --> 01:23:15.880] She's on the outside of the mainstream.
[01:23:15.880 --> 01:23:18.840] You know, she doesn't represent the mainstream opinion.
[01:23:19.160 --> 01:23:24.360] So just because she's a climate scientist doesn't mean that she's correct, right?
[01:23:24.840 --> 01:23:28.440] And this is a good general lesson about the argument from authority.
[01:23:28.440 --> 01:23:37.960] You know, reliable authority lies in a strong, hard-earned consensus of multiple scientists and experts, not one person's opinion.
[01:23:37.960 --> 01:23:41.320] It never rests on one person because one person could be quirky.
[01:23:41.320 --> 01:23:42.840] They could be a contrarian.
[01:23:42.840 --> 01:23:44.760] They could just be wrong.
[01:23:44.760 --> 01:23:51.960] Now, having said all of that, I do think that the best way to respond to somebody like Dr.
[01:23:51.960 --> 01:23:57.320] Curry is to just focus on their claims and debunk them, right?
[01:23:57.320 --> 01:24:09.000] Or just analyze them, see if they have any merit, and defend whatever opinion that they're criticizing, you know, with logic and evidence.
[01:24:09.000 --> 01:24:11.160] I just, that's the best way to deal with it.
[01:24:11.160 --> 01:24:13.000] It's actually not a bad thing.
[01:24:13.000 --> 01:24:21.120] You know, I think every science should have the contrarians on the fringe who are saying, but wait a minute, how do we know this is really true?
[01:24:21.120 --> 01:24:25.920] And whatever, just to keep the whole process honest, I think that's fine.
[01:24:26.240 --> 01:24:28.320] It actually, I think, helps the process.
[01:24:28.320 --> 01:24:30.960] The problem here, though, is a couple of things.
[01:24:30.960 --> 01:24:38.000] One is that there is a campaign of denial that is funded by the fossil fuel industry and that has been taken up by a political party.
[01:24:38.000 --> 01:24:46.000] So we're not dealing with a good faith context here, a good faith community.
[01:24:46.000 --> 01:24:50.800] Whether or not she is deliberately part of that or not is almost irrelevant.
[01:24:50.800 --> 01:25:03.120] The problem is that even good faith, devil playing devil's advocate kind of science, then gets used by denialist, politically motivated, ideologically motivated campaigns.
[01:25:03.120 --> 01:25:14.080] The second thing is that there are massive, important policy decisions resting upon what the scientific community says about climate science.
[01:25:14.080 --> 01:25:16.320] And so this is always tricky.
[01:25:16.320 --> 01:25:40.480] You know, we're having scientific discussions in the literature among experts, and that's fine, but that then gets exploited and used by, again, people who are not acting in good faith, who are then trying to mine all of that for the purpose of political denial.
[01:25:40.800 --> 01:25:44.960] So that complicates the whole situation, right?
[01:25:44.960 --> 01:25:48.160] And whether intentional or not, Dr.
[01:25:48.160 --> 01:26:08.120] Curry has lent a tremendous amount of aid and comfort to the climate change denial community who are not acting in good faith and have really hampered the world's response to what is a very serious and time-sensitive situation.
[01:26:08.440 --> 01:26:09.080] Right?
[01:26:09.080 --> 01:26:17.800] So, again, it's complicated, but when you drill down on her claims, they just don't hold water.
[01:26:18.760 --> 01:26:26.360] They have been pretty much utterly trounced by her climate expert colleagues.
[01:26:26.680 --> 01:26:30.920] Eczema isn't always obvious, but it's real.
[01:26:30.920 --> 01:26:33.880] And so is the relief from EBGLIS.
[01:26:33.880 --> 01:26:40.600] After an initial dosing phase, about four in ten people taking EBGLIS achieved itch relief and clear or almost clear skin at 16 weeks.
[01:26:40.600 --> 01:26:44.920] And most of those people maintain skin that's still more clear at one year with monthly dosing.
[01:26:44.920 --> 01:26:56.360] EBGLIS Libricizumab LBKZ, a 250 milligram per 2 milliliter injection, is a prescription medicine used to treat adults and children 12 years of age and older who weigh at least 88 pounds or 40 kilograms with moderate to severe eczema.
[01:26:56.360 --> 01:27:02.600] Also called atopic dermatitis that is not well controlled with prescription therapies used on the skin or topicals or who cannot use topical therapies.
[01:27:02.680 --> 01:27:05.640] EBGLIS can be used with or without topical corticosteroids.
[01:27:05.640 --> 01:27:07.320] Don't use if you're allergic to EBGLIS.
[01:27:07.320 --> 01:27:09.320] Allergic reactions can occur that can be severe.
[01:27:09.320 --> 01:27:10.440] Eye problems can occur.
[01:27:10.440 --> 01:27:12.760] Tell your doctor if you have new or worsening eye problems.
[01:27:12.760 --> 01:27:15.240] You should not receive a live vaccine when treated with EBGLIS.
[01:27:15.240 --> 01:27:18.440] Before starting EBGLIS, tell your doctor if you have a parasitic infection.
[01:27:18.440 --> 01:27:19.480] Searching for real relief?
[01:27:19.480 --> 01:27:27.000] Ask your doctor about EBGLIS and visit ebglis.lily.com or call 1-800-LILLIERX or 1-800-545-5979.
[01:27:27.000 --> 01:27:31.480] You probably think it's too soon to join AARP, right?
[01:27:31.480 --> 01:27:33.720] Well, let's take a minute to talk about it.
[01:27:33.720 --> 01:27:36.440] Where do you see yourself in 15 years?
[01:27:36.440 --> 01:27:40.920] More specifically, your career, your health, your social life.
[01:27:40.920 --> 01:27:43.560] What are you doing now to help you get there?
[01:27:43.560 --> 01:27:48.640] There are tons of ways for you to start preparing today for your future with AARP.
[01:27:48.960 --> 01:27:51.040] That dream job you've dreamt about?
[01:27:51.040 --> 01:27:55.120] Sign up for AARP reskilling courses to help make it a reality.
[01:27:55.120 --> 01:27:59.920] How about that active lifestyle you've only spoken about from the couch?
[01:27:59.920 --> 01:28:04.960] AARP has health tips and wellness tools to keep you moving for years to come.
[01:28:04.960 --> 01:28:08.880] But none of these experiences are without making friends along the way.
[01:28:08.880 --> 01:28:12.880] Connect with your community through AARP volunteer events.
[01:28:12.880 --> 01:28:17.280] So it's safe to say it's never too soon to join AARP.
[01:28:17.280 --> 01:28:21.440] They're here to help your money, health, and happiness live as long as you do.
[01:28:21.440 --> 01:28:25.680] That's why the younger you are, the more you need AARP.
[01:28:25.680 --> 01:28:30.000] Learn more at AARP.org/slash wisefriend.
[01:28:30.000 --> 01:28:32.320] This is where projects come to life.
[01:28:32.320 --> 01:28:42.560] Our showrooms are designed to inspire with the latest products from top brands, curated in an inviting, hands-on environment and a team of industry experts to support your project.
[01:28:42.560 --> 01:28:48.000] We'll be there to make sure everything goes as planned: from product selection to delivery coordination.
[01:28:48.000 --> 01:28:53.200] At Ferguson Bath Kitchen and Lighting Gallery, your project is our priority.
[01:28:53.200 --> 01:28:58.240] Discover great brands like Kohler at your local Ferguson showroom.
[01:29:00.800 --> 01:29:11.600] At the Institute for Advanced Reconstruction, we treat challenging nerve and reconstructive conditions and push the boundaries of medicine to restore movement, function, and hope.
[01:29:11.600 --> 01:29:16.520] From intricate phrenic nerve repairs to life-changing brachial plexus reconstructions.
[01:29:16.520 --> 01:29:22.880] Our world-class specialists take on the most complex cases, providing expertise where it's needed most.
[01:29:22.880 --> 01:29:29.800] When your patient needs options beyond the ordinary, refer with confidence at advancedreconstruction.com.
[01:29:29.440 --> 01:29:33.800] All right, let's move on with science or fiction.
[01:29:36.360 --> 01:29:41.480] It's time for science or fiction.
[01:29:45.960 --> 01:29:50.600] Each week, I come up with three science news items or facts, two real and one fake.
[01:29:50.600 --> 01:29:55.160] Then I challenge my panel of skeptics to tell me which one is the fake.
[01:29:55.400 --> 01:29:57.880] We have three regular news items this week.
[01:29:57.880 --> 01:29:59.720] Not two because Jay's not here.
[01:29:59.720 --> 01:30:00.680] You didn't go over it.
[01:30:01.000 --> 01:30:02.680] There is sort of a theme here.
[01:30:02.680 --> 01:30:03.880] There's a week kind of theme.
[01:30:03.880 --> 01:30:07.400] They're regular news items, but there's a theme of good or bad.
[01:30:07.400 --> 01:30:10.840] Some of these are either really good or really bad.
[01:30:11.160 --> 01:30:12.440] Like most weeks.
[01:30:12.440 --> 01:30:14.680] Well, that means two of them will be good or two of them.
[01:30:14.920 --> 01:30:15.480] All right.
[01:30:15.480 --> 01:30:16.840] Here we go.
[01:30:16.840 --> 01:30:25.480] Item number one: a new study finds that risk of long COVID decreased over the course of the pandemic, and this was mostly due to vaccines.
[01:30:25.480 --> 01:30:26.280] Good.
[01:30:27.240 --> 01:30:28.200] Thanks, Kara.
[01:30:28.200 --> 01:30:38.760] Item number two: a recent analysis of primate genomes finds that shared viral inclusions reduce overall cancer risk by stabilizing the genome.
[01:30:39.240 --> 01:30:49.080] And item number three, researchers find that global sea ice has decreased its cooling effect on the climate by 14% since 1980.
[01:30:49.400 --> 01:30:52.840] Kara, you seem very eager, so why don't you go first?
[01:30:53.160 --> 01:31:00.840] Okay, so the risk of long COVID decreased over the course of the pandemic, and this was mostly due to vaccination.
[01:31:01.160 --> 01:31:04.760] I could see this happening one of two ways.
[01:31:04.760 --> 01:31:15.360] Definitely, if you decrease the risk of getting COVID, then you decrease the risk of then having long COVID symptoms after COVID infection.
[01:31:14.840 --> 01:31:17.280] This is not overall in the population.
[01:31:17.440 --> 01:31:23.840] What this is saying is that for people who got COVID, the risk of developing long COVID was reduced.
[01:31:24.160 --> 01:31:24.800] Right.
[01:31:24.800 --> 01:31:38.400] But even still, I'm wondering if that reasoning stands because for people who got COVID, the longer the pandemic went on and the more they were vaccinated or the more immunity they developed, the weaker their COVID infections were.
[01:31:38.400 --> 01:31:50.880] But then I wonder if there's like an almost equal and opposite way to look at this one, where like for some people, long COVID appears to be some sort of like autoimmune or like excessive immune response.
[01:31:50.880 --> 01:31:58.240] And if that's the case, yes, more COVID infection, bad, but also maybe accumulation of vaccine.
[01:31:58.240 --> 01:31:58.880] I don't really know.
[01:31:58.880 --> 01:32:01.920] But I think, I don't know, that one is feeling like science to me.
[01:32:01.920 --> 01:32:11.760] A recent analysis of primate genomes finds that shared viral inclusions, so reduce overall cancer risk by stabilizing the genome.
[01:32:11.760 --> 01:32:14.960] Shared viral inclusion, shared by whom?
[01:32:14.960 --> 01:32:16.240] Primates.
[01:32:16.480 --> 01:32:20.800] These are viral inclusions that are found in all primates within the primate clinic.
[01:32:20.960 --> 01:32:21.360] Oh, I see.
[01:32:21.360 --> 01:32:23.200] So among different, yeah, okay, gotcha.
[01:32:23.440 --> 01:32:24.880] Different species within this.
[01:32:24.880 --> 01:32:25.360] Okay.
[01:32:25.360 --> 01:32:30.720] So if there are viral inclusions there, that would reduce overall cancer risk by stabilizing the genome.
[01:32:31.200 --> 01:32:37.200] So these are these are viral genomic snippets that have incorporated themselves into our genes.
[01:32:37.440 --> 01:32:38.880] Across multiple species.
[01:32:38.880 --> 01:32:39.440] Yeah.
[01:32:39.440 --> 01:32:41.360] And so, but like, I don't know.
[01:32:41.360 --> 01:32:51.120] I mean, yes, cancer is like very largely genetic, but you can have a relatively stable genome and still have like messed up oncogenes and messed up tumor suppressors.
[01:32:51.120 --> 01:33:00.360] Researchers find that global sea ice has decreased its cooling effect on the climate by 14% since 1980.
[01:32:59.840 --> 01:33:00.920] We'll see.
[01:33:01.160 --> 01:33:05.080] Do we even have much of that left?
[01:33:05.080 --> 01:33:08.680] I think it's the cancer one that is the fiction.
[01:33:08.680 --> 01:33:10.840] I'm not exactly sure why.
[01:33:11.480 --> 01:33:12.600] All right, Bob?
[01:33:12.920 --> 01:33:20.600] This first one makes sense with the minimization of long COVID with the introduction of vaccinations.
[01:33:20.600 --> 01:33:31.000] Yeah, it just makes sense that the more people that have the attenuated COVID would be more likely to not even experience long COVID.
[01:33:31.000 --> 01:33:33.560] It just seems very reasonable.
[01:33:33.880 --> 01:33:35.480] Let me go to the third one.
[01:33:35.880 --> 01:33:39.000] Global sea ice has decreased its cooling effect.
[01:33:39.000 --> 01:33:43.880] So, yeah, I'm trying to figure out what the cooling effect would actually have been for sea ice.
[01:33:43.880 --> 01:33:53.800] And I don't think it's necessarily dramatic, but that's irrelevant because it's whatever cooling effect it has, it's decreased by 14%.
[01:33:53.800 --> 01:33:55.800] I wonder how they would have calculated that.
[01:33:55.800 --> 01:33:57.960] That seems somewhat reasonable.
[01:33:58.440 --> 01:34:04.760] More reasonable than the second one, having these viral inclusions stabilizing the genome for cancer.
[01:34:05.880 --> 01:34:17.080] Yeah, just that just seems, sure, it seems, it's not like ridiculously impossible, and that would be great if it were true, but it just seems less likely than the other ones, definitely.
[01:34:17.080 --> 01:34:18.760] So I'll say that's fiction as well.
[01:34:18.760 --> 01:34:19.800] And Evan.
[01:34:19.800 --> 01:34:22.360] Yeah, I don't want to be alone on this one.
[01:34:23.000 --> 01:34:33.560] Well, I mean, mostly because from the get-go, I was really thinking of the three, the one I understand the least is the one about the primate genomes, you know.
[01:34:33.560 --> 01:34:37.000] Whereas the other two, I kind of have at least some kind of sense for.
[01:34:37.000 --> 01:34:41.240] Obviously, I know what long COVID is, decreased over the course of the pandemic.
[01:34:41.240 --> 01:34:45.760] The only reason I think that one might be the fiction or could have been the fiction is because there have been these peaks.
[01:34:44.520 --> 01:34:49.040] What, during the course of the pandemic, there were peaks, right?
[01:34:44.680 --> 01:34:51.360] And then it went back down, but then it peaked again.
[01:34:51.360 --> 01:35:00.960] And I don't know if that had anything to do with how the numbers would have played out as far as determining the long COVID and if the vaccination effect, how the vaccination had an effect on that.
[01:35:00.960 --> 01:35:03.680] But you did say mostly due, not exclusively due.
[01:35:03.680 --> 01:35:05.440] So that's why I think that one's science.
[01:35:05.440 --> 01:35:15.840] And then, yeah, for the same reasons Bob said about the sea ice, 14%, but you know, how much overall are we really talking about and the whole grouping of things that go into that?
[01:35:16.720 --> 01:35:20.800] So therefore, that only leaves me with genomes as fiction.
[01:35:20.800 --> 01:35:22.880] All right, I guess I'll take these in order.
[01:35:22.880 --> 01:35:24.000] We'll start with number one.
[01:35:24.000 --> 01:35:31.360] A new study finds that the risk of long COVID decreased over the course of the pandemic, and this was mostly due to vaccination.
[01:35:31.360 --> 01:35:34.240] You guys all think this one is science.
[01:35:34.240 --> 01:35:47.440] Well, let me tell you first that the risk of long COVID has been decreasing over the course of the pandemic, and it's due to two things.
[01:35:47.440 --> 01:35:50.640] One is the change of the variants, right?
[01:35:50.640 --> 01:35:54.400] Pre-Delta to Delta to Omicron.
[01:35:54.720 --> 01:35:59.600] The later variants had less long COVID, but it's also due to vaccination.
[01:35:59.600 --> 01:36:03.280] So the question is, which one had more of an effect?
[01:36:03.440 --> 01:36:03.840] There you go.
[01:36:04.640 --> 01:36:05.200] Mostly.
[01:36:05.200 --> 01:36:05.520] Mostly.
[01:36:06.000 --> 01:36:07.600] Is science.
[01:36:07.600 --> 01:36:08.800] Yep, this is science.
[01:36:08.800 --> 01:36:20.880] Yep, the researchers found that vaccination was due to about 75% of the reduction in the risk of getting long COVID following a COVID infection.
[01:36:20.880 --> 01:36:24.720] So, yeah, it is one more great thing about the vaccines.
[01:36:24.720 --> 01:36:27.360] They reduce your risk of getting the COVID.
[01:36:27.360 --> 01:36:31.400] They reduce the severity of the COVID, and they reduce your risk of long COVID.
[01:36:29.840 --> 01:36:33.720] So, yep, that was mostly due to the vaccines.
[01:36:33.960 --> 01:36:47.880] The later variants were overall sort of less virulent, although they spread more easily, they didn't cause as bad as a disease, which is something that does typically happen during pandemics.
[01:36:47.880 --> 01:36:50.040] Okay, let's go on to number two.
[01:36:50.040 --> 01:36:56.920] A recent analysis of primate genomes finds that shared viral inclusions reduce overall cancer risk by stabilizing the genome.
[01:36:56.920 --> 01:37:02.520] You guys all think this one is the fiction, and this one is the fiction.
[01:37:02.920 --> 01:37:07.240] Yeah, maybe it's going first, it's so stressful.
[01:37:07.240 --> 01:37:07.800] It is, right?
[01:37:08.040 --> 01:37:19.160] Because it turns out that these viral inclusions, these shared viral inclusions, increase the overall risk of destabilizing the genome.
[01:37:19.480 --> 01:37:20.520] Right, that makes sense.
[01:37:20.520 --> 01:37:21.080] That makes more sense.
[01:37:22.120 --> 01:37:23.160] Let's get rid of them.
[01:37:23.160 --> 01:37:29.640] Yeah, so the author's right that they found that these viral inclusions cause transcriptional dysregulation.
[01:37:29.640 --> 01:37:38.040] And essentially, what that means is that it's more likely for there to be the kinds of mutations that do lead to cancer, right?
[01:37:38.040 --> 01:37:41.480] Mutations that cause the cells to reproduce out of control or whatever.
[01:37:41.480 --> 01:37:44.440] So, yep, it increases the risk of cancer.
[01:37:44.440 --> 01:37:58.120] Now, these viral inclusions are very interesting from a basic science evolutionary point of view because they are an independent, powerful molecular evidence for evolution, right?
[01:37:58.440 --> 01:38:14.600] Because essentially, when you know, you remember like some viruses have reverse transcriptase, they basically insert their DNA into cells, and sometimes it gets into the germline, and you have sort of this permanent addition of a bit of viral DNA in the genome.
[01:38:14.720 --> 01:38:18.720] When that happens, every descendant inherits it, right?
[01:38:18.720 --> 01:38:19.920] So there you go.
[01:38:19.920 --> 01:38:25.280] Any future speciation, whatever, it carries through throughout all the descendants that pick it up.
[01:38:25.280 --> 01:38:38.320] Of course, sometimes it could be lost, but basically, you know, you could see these patterns, these nestled hierarchies of viral inclusions that are following the evolutionary tree of relationships.
[01:38:38.320 --> 01:38:46.960] And therefore, it's really an awesomely powerful, independent evidence for the historical fact of evolution.
[01:38:46.960 --> 01:38:49.840] There's really just no way around it.
[01:38:49.840 --> 01:38:57.040] There is no non-evolutionary explanation for the pattern of inclusions that we see in nature.
[01:38:57.360 --> 01:39:01.200] Yeah, the other side of that coin, could they make you a superhero?
[01:39:02.080 --> 01:39:06.080] They could ask for a friend.
[01:39:06.080 --> 01:39:06.880] Asking for a superhero.
[01:39:06.960 --> 01:39:09.520] So you're not saying it's impossible.
[01:39:11.120 --> 01:39:12.560] All right, let's go on number three.
[01:39:12.560 --> 01:39:19.120] Researchers find that global sea ice has decreased its cooling effect on the climate by 14% since 1980.
[01:39:19.120 --> 01:39:21.200] This one, of course, is science.
[01:39:21.200 --> 01:39:28.080] Yeah, this is one of those bad positive feedback loops in climate change.
[01:39:28.320 --> 01:39:35.680] As sea ice melts, it reduces the amount of radiation that it reflects back into space, which has a cooling effect.
[01:39:35.680 --> 01:39:42.560] Therefore, the cooling effect is reduced, leads to more warming, more ice melting, less cooling, more warming, etc.
[01:39:42.800 --> 01:39:44.480] So that's bad.
[01:39:44.800 --> 01:39:49.600] Because the sea ice has a pretty high albedo, it reflects a lot of energy.
[01:39:49.600 --> 01:39:53.440] And the sea water, right, the oceans are very dark.
[01:39:53.440 --> 01:39:55.520] They have a very low albedo.
[01:39:55.520 --> 01:39:57.920] They reflect very little light.
[01:39:57.920 --> 01:40:03.800] So there is a dramatic difference between the ice and the non-ice-covered ocean.
[01:40:04.120 --> 01:40:10.680] So loss of sea ice can be a massive positive feedback effect on climate change.
[01:40:10.680 --> 01:40:27.000] One other interesting wrinkle to this was that the scientists found that this reduction in that 14% reduction in the cooling effect is greater than just the reduction in the surface area of sea ice, the average surface area of the sea ice over the course of the year.
[01:40:27.640 --> 01:40:31.640] The decrease in the cooling effect is greater than the decrease in the surface area.
[01:40:31.640 --> 01:40:32.840] So it's not linear.
[01:40:32.840 --> 01:40:36.840] Yeah, so it's not a strictly linear relationship, which is interesting.
[01:40:36.840 --> 01:40:37.400] We won.
[01:40:37.400 --> 01:40:38.040] Woohoo!
[01:40:38.040 --> 01:40:38.680] We win.
[01:40:38.680 --> 01:40:39.480] Yes, winner.
[01:40:39.480 --> 01:40:40.040] What do we win?
[01:40:40.040 --> 01:40:40.680] What do you get?
[01:40:40.680 --> 01:40:43.800] Well, you all get to hear Evan's quote.
[01:40:43.800 --> 01:40:45.720] Evan, let us have the quote.
[01:40:45.720 --> 01:40:47.000] Woohoo!
[01:40:47.640 --> 01:40:49.720] Which would have heard anyways, I suppose.
[01:40:49.720 --> 01:40:52.520] But let's not ruin the parade right now.
[01:40:52.520 --> 01:40:53.080] Yeah.
[01:40:53.720 --> 01:40:58.280] In effect, we're all guinea pigs for the dietary supplement industry.
[01:40:58.280 --> 01:41:03.480] The vast majority of these supplements don't deliver on their promises.
[01:41:03.480 --> 01:41:03.880] Dr.
[01:41:03.880 --> 01:41:14.040] Nick Tiller, who's the author of the book called The Skeptic's Guide to Sports Science: Confronting Myths of the Health and Fitness Industry.
[01:41:14.040 --> 01:41:14.680] Hey.
[01:41:15.000 --> 01:41:19.400] Yeah, if you're anyone else but Douglas Adams, you borrowed that from us.
[01:41:21.320 --> 01:41:22.200] And Dr.
[01:41:22.200 --> 01:41:30.920] Tiller will be happy to say hello to you when we are at SciCon in October as part of that conference, which is going to be cool.
[01:41:30.920 --> 01:41:35.640] And I'm featuring a bunch of quotes from people who are going to be at that conference specifically.
[01:41:35.640 --> 01:41:36.520] Oh, that's fun.
[01:41:36.520 --> 01:41:37.000] All right.
[01:41:37.000 --> 01:41:38.280] Thank you, Evan.
[01:41:38.280 --> 01:41:40.840] And thank you all for joining me this week.
[01:41:40.840 --> 01:41:41.240] Sure, Ben.
[01:41:41.320 --> 01:41:41.800] Thank you, Steve.
[01:41:41.800 --> 01:41:42.680] Thank you, Steve.
[01:41:42.680 --> 01:41:47.440] And until next week, this is your Skeptics Guide to the Universe.
[01:41:49.760 --> 01:41:56.400] Skeptics Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking.
[01:41:56.400 --> 01:42:01.040] For more information, visit us at the skepticsguide.org.
[01:42:01.040 --> 01:42:04.960] Send your questions to info at the skepticsguide.org.
[01:42:04.960 --> 01:42:15.680] And if you would like to support the show and all the work that we do, go to patreon.com/slash skepticsguide and consider becoming a patron and becoming part of the SGU community.
[01:42:15.680 --> 01:42:19.120] Our listeners and supporters are what make SGU possible.
Prompt 6: Key Takeaways
Now please extract the key takeaways from the transcript content I provided.
Extract the most important key takeaways from this part of the conversation. Use a single sentence statement (the key takeaway) rather than milquetoast descriptions like "the hosts discuss...".
Limit the key takeaways to a maximum of 3. The key takeaways should be insightful and knowledge-additive.
IMPORTANT: Return ONLY valid JSON, no explanations or markdown. Ensure:
- All strings are properly quoted and escaped
- No trailing commas
- All braces and brackets are balanced
Format: {"key_takeaways": ["takeaway 1", "takeaway 2"]}
Prompt 7: Segments
Now identify 2-4 distinct topical segments from this part of the conversation.
For each segment, identify:
- Descriptive title (3-6 words)
- START timestamp when this topic begins (HH:MM:SS format)
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Most important Key takeaway from that segment. Key takeaway must be specific and knowledge-additive.
- Brief summary of the discussion
IMPORTANT: The timestamp should mark when the topic/segment STARTS, not a range. Look for topic transitions and conversation shifts.
Return ONLY valid JSON. Ensure all strings are properly quoted, no trailing commas:
{
"segments": [
{
"segment_title": "Topic Discussion",
"timestamp": "01:15:30",
"key_takeaway": "main point from this segment",
"segment_summary": "brief description of what was discussed"
}
]
}
Timestamp format: HH:MM:SS (e.g., 00:05:30, 01:22:45) marking the START of each segment.
Now scan the transcript content I provided for ACTUAL mentions of specific media titles:
Find explicit mentions of:
- Books (with specific titles)
- Movies (with specific titles)
- TV Shows (with specific titles)
- Music/Songs (with specific titles)
DO NOT include:
- Websites, URLs, or web services
- Other podcasts or podcast names
IMPORTANT:
- Only include items explicitly mentioned by name. Do not invent titles.
- Valid categories are: "Book", "Movie", "TV Show", "Music"
- Include the exact phrase where each item was mentioned
- Find the nearest proximate timestamp where it appears in the conversation
- THE TIMESTAMP OF THE MEDIA MENTION IS IMPORTANT - DO NOT INVENT TIMESTAMPS AND DO NOT MISATTRIBUTE TIMESTAMPS
- Double check that the timestamp is accurate - a timestamp will NEVER be greater than the total length of the audio
- Timestamps are given as ranges, e.g. 01:13:42.520 --> 01:13:46.720. Use the EARLIER of the 2 timestamps in the range.
Return ONLY valid JSON. Ensure all strings are properly quoted and escaped, no trailing commas:
{
"media_mentions": [
{
"title": "Exact Title as Mentioned",
"category": "Book",
"author_artist": "N/A",
"context": "Brief context of why it was mentioned",
"context_phrase": "The exact sentence or phrase where it was mentioned",
"timestamp": "estimated time like 01:15:30"
}
]
}
If no media is mentioned, return: {"media_mentions": []}
Full Transcript
[00:00:00.800 --> 00:00:06.880] 10 years from today, Lisa Schneider will trade in her office job to become the leader of a pack of dogs.
[00:00:06.880 --> 00:00:09.600] As the owner of her own dog rescue, that is.
[00:00:09.600 --> 00:00:17.280] A second act made possible by the reskilling courses Lisa's taking now with AARP to help make sure her income lives as long as she does.
[00:00:17.280 --> 00:00:22.240] And she can finally run with the big dogs and the small dogs, who just think they're big dogs.
[00:00:22.240 --> 00:00:25.920] That's why the younger you are, the more you need AARP.
[00:00:25.920 --> 00:00:29.680] Learn more at aarp.org/slash skills.
[00:00:33.200 --> 00:00:36.480] You're listening to the Skeptic's Guide to the Universe.
[00:00:36.480 --> 00:00:39.600] Your escape to reality.
[00:00:40.240 --> 00:00:43.200] Hello, and welcome to The Skeptic's Guide to the Universe.
[00:00:43.200 --> 00:00:48.320] Today is Wednesday, July 17th, 2024, and this is your host, Stephen Novella.
[00:00:48.320 --> 00:00:50.080] Joining me this week are Bob Novella.
[00:00:50.080 --> 00:00:50.720] Hey, everybody.
[00:00:50.720 --> 00:00:52.080] Kara Santa Maria.
[00:00:52.080 --> 00:00:52.800] Howdy.
[00:00:52.800 --> 00:00:54.160] And Evan Bernstein.
[00:00:54.160 --> 00:00:55.760] Hey, everywhere's Jay.
[00:00:55.760 --> 00:00:56.960] Jay is off this week.
[00:00:56.960 --> 00:00:57.440] Oh.
[00:00:57.440 --> 00:00:58.560] Yep, he's just on vacation.
[00:00:59.920 --> 00:01:02.160] Jay is both off and on vacation.
[00:01:02.160 --> 00:01:03.840] He just goes off.
[00:01:03.840 --> 00:01:04.480] That's all.
[00:01:04.480 --> 00:01:08.080] Wherever you are, Jay, I hope you're having an excellent time.
[00:01:08.080 --> 00:01:09.840] Jay is in Maine, actually.
[00:01:09.840 --> 00:01:10.720] Uh-huh.
[00:01:11.040 --> 00:01:11.760] Horseback.
[00:01:11.760 --> 00:01:12.800] Girl like that.
[00:01:12.800 --> 00:01:13.520] All right.
[00:01:14.080 --> 00:01:15.200] Maine, you say.
[00:01:15.200 --> 00:01:18.720] So, have either of you guys watched either The Bear or Shogun?
[00:01:18.720 --> 00:01:19.840] I've watched The Bear.
[00:01:19.840 --> 00:01:20.640] I did not watch The Shogun.
[00:01:20.800 --> 00:01:22.400] I've watched Cocaine Bear.
[00:01:22.880 --> 00:01:26.960] I watched Shogun when it was a series back in the 1980s?
[00:01:26.960 --> 00:01:28.000] Early 80s?
[00:01:28.000 --> 00:01:30.480] Kara, what did you think of this season of The Bear?
[00:01:30.800 --> 00:01:32.400] I can't do any spoilers yet, right?
[00:01:32.400 --> 00:01:33.440] Because it's still pretty, pretty.
[00:01:33.600 --> 00:01:35.760] Yeah, you can just give an overview with no spoilers.
[00:01:35.760 --> 00:01:38.720] All right, let me think about how to say the impression with that.
[00:01:38.720 --> 00:01:41.440] I really like how atmospheric it is.
[00:01:41.440 --> 00:01:42.880] I think the writing is really good.
[00:01:42.880 --> 00:01:45.760] I think that it's really visually interesting.
[00:01:45.760 --> 00:01:49.920] I think that I am really annoyed with the main character.
[00:01:49.920 --> 00:01:51.360] I don't think he's complex.
[00:01:51.360 --> 00:01:53.840] I think he needs to get over himself.
[00:01:54.160 --> 00:01:56.880] Is that the way the character is designed?
[00:01:56.880 --> 00:02:00.120] 100% he's designed to be complex, but he's not.
[00:02:00.120 --> 00:02:01.320] Yeah, well, he's damaged.
[00:02:01.320 --> 00:02:02.120] Yeah, which is fine.
[00:01:58.880 --> 00:02:03.720] But, like, and they explore that a little bit.
[00:01:59.200 --> 00:02:04.760] But he's not complex.
[00:02:05.000 --> 00:02:13.000] He's just selfish and privileged and all the things that many women deal with in the dating world right now.
[00:02:13.000 --> 00:02:17.000] Yeah, I mean, my overall take, I mean, this season wasn't as good as the last two seasons.
[00:02:17.160 --> 00:02:19.640] I had to feel like they were trying too hard.
[00:02:19.640 --> 00:02:24.680] You know, like, a lot of episodes, like, yeah, okay, I could see like artistically where they were going with it.
[00:02:24.680 --> 00:02:27.080] It just wasn't that enjoyable an episode.
[00:02:27.080 --> 00:02:29.080] Yeah, not a lot happens this season.
[00:02:29.080 --> 00:02:30.200] I think it's fair to say that.
[00:02:30.200 --> 00:02:34.040] There's not a whole lot of activity going on this season.
[00:02:34.600 --> 00:02:35.400] It's classic.
[00:02:35.400 --> 00:02:37.480] They forgot to have shit happen.
[00:02:37.480 --> 00:02:37.960] Yeah.
[00:02:37.960 --> 00:02:38.920] Whoops.
[00:02:38.920 --> 00:02:39.560] Whoops.
[00:02:39.560 --> 00:02:41.800] Shogun, on the other hand, was excellent.
[00:02:41.800 --> 00:02:42.840] Really awesome.
[00:02:42.840 --> 00:02:43.400] Good to hear.
[00:02:43.560 --> 00:02:48.120] Leading in Emmy nominations and there's very, very good.
[00:02:48.120 --> 00:02:49.400] Yeah, I think you would love it.
[00:02:49.560 --> 00:02:50.360] Really, I would love it.
[00:02:50.360 --> 00:02:53.640] I'm not a big fan of like fighty shows.
[00:02:53.960 --> 00:02:56.200] It's not really a fighting show.
[00:02:56.200 --> 00:02:59.080] I mean, there's fighting that happens, but that's not good.
[00:02:59.400 --> 00:03:01.000] There's fighting, but it's not fighty.
[00:03:01.160 --> 00:03:02.040] Okay, cool.
[00:03:02.040 --> 00:03:02.360] Okay.
[00:03:02.360 --> 00:03:03.400] Good to know.
[00:03:03.400 --> 00:03:06.520] I mean, but it takes place, what, in the 1600s, right?
[00:03:06.840 --> 00:03:10.840] And so what wasn't fighting in the 1600s, right?
[00:03:10.920 --> 00:03:12.760] It's sort of, it's the background.
[00:03:12.760 --> 00:03:13.880] It's the environment.
[00:03:14.200 --> 00:03:20.360] Yeah, it's more that, like, I'm just not, like, entertained by action sequences.
[00:03:21.160 --> 00:03:21.800] You know what I mean?
[00:03:21.800 --> 00:03:24.120] That's not enough to hold my attention.
[00:03:24.120 --> 00:03:29.800] It's very much driven by excellent characters, and the best character, in my opinion, is the lead female.
[00:03:29.800 --> 00:03:30.760] She's awesome.
[00:03:31.400 --> 00:03:32.600] That's good.
[00:03:32.600 --> 00:03:42.760] Well, we haven't spoken yet about the big thing that happened since the last episode, the failed assassination attempt on Donald Trump.
[00:03:43.400 --> 00:03:44.040] That thing.
[00:03:44.040 --> 00:03:47.280] Well, I'm not going to talk about it politically, but the political aspect of it.
[00:03:47.520 --> 00:04:00.160] What I want to talk about is the fact that instantly, within seconds, minutes of this event happening, the internet was abuzz with all sorts of conspiracy theories.
[00:04:00.160 --> 00:04:02.000] Like, that's the go-to explanation.
[00:04:02.000 --> 00:04:03.360] It's a conspiracy.
[00:04:03.360 --> 00:04:06.320] It was impressive in its own right, how fast it is.
[00:04:06.560 --> 00:04:11.360] But I also got to ask before Steve, you get into the brass tacks of it all.
[00:04:11.360 --> 00:04:14.720] I mean, did you have a moment?
[00:04:14.960 --> 00:04:15.760] No, I really didn't.
[00:04:15.760 --> 00:04:17.840] No, really, not even a single moment.
[00:04:17.840 --> 00:04:21.040] No, you know, because, again, it's conspiracy theory.
[00:04:21.360 --> 00:04:30.640] I had sort of an immediate skeptical reaction, although a lot of people in my social circle had that immediate conspiracy theory instinct.
[00:04:30.640 --> 00:04:37.840] I think the difficulty was, yes, how could the Secret Service have failed so miserably?
[00:04:37.840 --> 00:04:43.760] I think that was, and like people are trying to answer that question, and they're looking for an explanation.
[00:04:43.760 --> 00:04:48.320] Well, yeah, the simpler explanation is always that is incompetence, right?
[00:04:48.320 --> 00:04:49.600] Never attributes to malice.
[00:04:50.080 --> 00:04:51.200] Incompetence, right?
[00:04:51.200 --> 00:04:52.160] User error.
[00:04:53.440 --> 00:04:55.920] So, you know, that was the idea that, and both sides did this.
[00:04:55.920 --> 00:05:04.480] Both sides used the fact that the Secret Service failed to prevent this from happening as evidence that or as a reason to think that it might have been a conspiracy.
[00:05:04.480 --> 00:05:14.400] Even like within my group of friends who are generally skeptical, smart people, that was sort of their instinct, like, oh, this has to be staged or whatever.
[00:05:14.880 --> 00:05:23.920] And then they would search for reasons to support what they want to believe based upon their ideological outlook, right?
[00:05:23.920 --> 00:05:31.000] Rather than asking about how plausible it is that something like this could have been pulled off.
[00:05:29.840 --> 00:05:35.320] Also, we have to consider, like, this would be a really stupid thing for either side to do.
[00:05:35.400 --> 00:05:55.160] The risk of being found out massively outweighs any incremental benefit they may get from the politics of either like staging it or you know, the outcome would not necessarily have been good, you know, if the assassination were successful.
[00:05:55.160 --> 00:06:07.720] So it's meanwhile, if their campaign was discovered to have been involved with a conspiracy, that would be the end, the absolute end of their campaign, regardless of the outcome of this event.
[00:06:07.720 --> 00:06:08.200] That's right.
[00:06:08.360 --> 00:06:08.760] Exactly.
[00:06:08.760 --> 00:06:09.800] Yeah, that's the big thing.
[00:06:09.880 --> 00:06:11.160] Grand conspiracy.
[00:06:11.560 --> 00:06:19.080] I'm less convinced by an argument that certain actors are doing a lot of things based on logic.
[00:06:19.080 --> 00:06:24.840] So the argument was, was it that it was staged or that it was orchestrated?
[00:06:24.840 --> 00:06:27.400] Because I think that's two different claims.
[00:06:27.400 --> 00:06:30.120] So it could be staged as in it wasn't real.
[00:06:30.120 --> 00:06:31.080] You know what I mean?
[00:06:31.080 --> 00:06:33.400] Like, like as in he didn't actually shoot him.
[00:06:33.400 --> 00:06:34.440] It was like staged.
[00:06:34.440 --> 00:06:36.600] It was like it was magic.
[00:06:36.600 --> 00:06:37.400] Yeah, exactly.
[00:06:37.720 --> 00:06:47.000] Versus it was orchestrated, meaning that they intentionally had a guy shoot at Donald Trump, knowing that he would miss by a hair.
[00:06:47.160 --> 00:06:48.040] That's the other part.
[00:06:48.040 --> 00:06:48.360] Yeah.
[00:06:48.360 --> 00:06:49.400] Yeah, one of those two.
[00:06:49.400 --> 00:06:49.960] One of those two.
[00:06:49.960 --> 00:06:51.640] Okay, so there's different variations.
[00:06:51.640 --> 00:06:52.680] I think people just conflate it.
[00:06:52.680 --> 00:06:55.960] Just like something about this was a false flag operation.
[00:06:56.520 --> 00:06:59.640] Oh, God, I hate that term so much.
[00:06:59.960 --> 00:07:06.840] So and then on the other side, of course, it was the fact that the Democrats were just they, that they did this, right?
[00:07:06.840 --> 00:07:13.160] So, by the way, good skeptical rule of thumb: do not use a vague reference to they, right?
[00:07:13.160 --> 00:07:19.520] Because that you're whitewashing over a lot of important details, and you're almost assuming a conspiracy when you do that.
[00:07:19.680 --> 00:07:36.720] So, it's saying, like, they tried to impeach him, and then they tried to prosecute him, and then they tried to take away his wealth, and now they tried to assassinate him as if like this is all the same group or the same group of people.
[00:07:36.720 --> 00:07:44.240] And some of them are saying it explicitly because they're always just referring to George Soros.
[00:07:44.560 --> 00:07:46.080] There is no they here.
[00:07:46.160 --> 00:07:47.920] This is like this is one person.
[00:07:48.240 --> 00:08:06.800] The only thing the FBI has been able to say so far is this guy was definitely acting alone, and he fits the total profile of a lone wolf, a massive shooter, you know, in terms of his age, gender, race, you know, a little bit of on the outside, the fringe socially, you know, enamored of guns.
[00:08:06.800 --> 00:08:08.000] I mean, it's perfect.
[00:08:08.000 --> 00:08:11.360] But generally speaking, in mass shootings, because that's something I think we have to remember, too.
[00:08:11.360 --> 00:08:13.360] This was a mass shooting event.
[00:08:14.000 --> 00:08:18.160] Yes, it was an assassination attempt, but let's not also forget, right?
[00:08:18.160 --> 00:08:18.720] Yeah.
[00:08:19.120 --> 00:08:30.640] And in mass shootings, very often, especially when you're dealing with like the young white male who's like somewhat intelligent, somewhat socially withdrawn, we often will see a manifesto.
[00:08:30.640 --> 00:08:37.600] We'll see some sort of something written on social media, clues leading up to it about their ideological leanings.
[00:08:37.600 --> 00:08:40.880] Yeah, there's none of that, which is fascinating.
[00:08:41.440 --> 00:08:50.800] Or, like, I don't know if you guys, did you watch the, we've talked about it before, Manhunt, I think, which was the series about Lincoln and John Wilkes Booth.
[00:08:50.960 --> 00:09:00.680] And yes, John Wilkes Booth was ideologically motivated, as in he was a Confederate and he really like believed in those causes, but really, he just wanted to be famous.
[00:09:00.680 --> 00:09:02.280] Like, that was a huge part of it.
[00:08:59.840 --> 00:09:03.800] He wanted to be the guy.
[00:09:04.600 --> 00:09:08.600] All right, Bob, tell us about caves on the moon.
[00:09:08.600 --> 00:09:10.200] Yes, I've been waiting for this.
[00:09:10.200 --> 00:09:11.560] I knew it was going to come.
[00:09:11.560 --> 00:09:24.680] So, for the first time, we now have solid evidence that the moon does indeed have large underground tunnels or caves, and these researchers think they would be a great place to hang out and just be safe on the moon.
[00:09:25.320 --> 00:09:28.440] What did it finally take for scientists to agree with me?
[00:09:28.440 --> 00:09:33.800] And what does this mean for my dream of a moon-based alpha before the heat death of the universe?
[00:09:34.040 --> 00:09:38.440] So, this is from the typical international team of scientists, which I love.
[00:09:38.680 --> 00:09:43.080] In this case, led by the scientists at the University of Trento in Italy.
[00:09:43.080 --> 00:09:48.920] This was published mid-July 2024, just recently, in the journal Nature Astronomy.
[00:09:48.920 --> 00:09:56.680] The title of the paper is Radar Evidence of an Accessible Cave Conduit on the Moon Below the Mare Tranquilitatus Pit.
[00:09:56.680 --> 00:09:59.320] Okay, so this makes me so happy.
[00:09:59.800 --> 00:10:01.800] You probably figured that out.
[00:10:01.800 --> 00:10:02.520] Why, though?
[00:10:02.520 --> 00:10:03.080] Why?
[00:10:03.080 --> 00:10:18.040] Because I don't like the idea of astronauts on the moon for extended periods of time, either dying horribly, painfully prolonged deaths, or very quick deaths, or I don't even like the idea of them being incredibly annoyed by moon dust.
[00:10:18.040 --> 00:10:29.560] And the utility of this, of an already existing huge underground cave or tunnel on the moon is primarily underscored by the fact that the moon surface really, really kind of sucks.
[00:10:29.560 --> 00:10:31.080] It's a horrible place.
[00:10:31.400 --> 00:10:33.480] You think about the moon, you see the videos, right?
[00:10:33.480 --> 00:10:35.080] It seems like a fun place, right?
[00:10:35.080 --> 00:10:40.040] All bouncy and happy, but it's really quite hellish.
[00:10:40.360 --> 00:10:43.640] And for a surprisingly number of deadly reasons.
[00:10:43.640 --> 00:10:46.960] First off, there's the temperature variations, which are nasty.
[00:10:44.920 --> 00:10:52.080] On the bright side of the moon, we're talking 127 degrees Celsius, 261 Fahrenheit.
[00:10:52.400 --> 00:10:56.080] On the unilluminated side, side of the whatever.
[00:10:56.320 --> 00:10:57.360] Ah, nice, Bob.
[00:10:57.680 --> 00:11:03.360] It can drop to minus 173 Celsius or minus 280 Fahrenheit.
[00:11:03.360 --> 00:11:04.320] God, that's cold.
[00:11:04.320 --> 00:11:05.840] So there's no Goldilocks zone on the moon.
[00:11:07.280 --> 00:11:09.680] Well, hold on to that thought, Evan.
[00:11:09.680 --> 00:11:10.640] Hold on to that thought.
[00:11:10.960 --> 00:11:13.200] Do I even have to say more about those temperature swings?
[00:11:13.200 --> 00:11:14.480] They're just, wow.
[00:11:14.480 --> 00:11:17.040] Next is the radiation on the surface of the moon.
[00:11:17.040 --> 00:11:23.360] There's galactic cosmic rays, which are high-energy particles from things like supernova, supernovae.
[00:11:23.520 --> 00:11:29.680] They can kill you over time by essentially increasing the risk of cancer by all those high-energy particles hitting you.
[00:11:29.840 --> 00:11:32.480] The sun radiation, though, is even worse.
[00:11:32.480 --> 00:11:41.280] They call it solar particle events, SPEs, but they're sudden and non-predictable, which makes them especially nasty.
[00:11:41.600 --> 00:11:42.880] Particles or the events?
[00:11:43.600 --> 00:11:50.400] Well, the events, these solar particle events are kind of sudden and they're not very predictable at all.
[00:11:50.400 --> 00:11:55.200] They can expose astronauts on the surface to literally life-threatening doses.
[00:11:56.160 --> 00:12:04.000] Just generally speaking, the moon varies then 200 to 2,000 times the radiation dose that we receive here on Earth.
[00:12:04.000 --> 00:12:07.920] But you've got to remember, because if you go to different websites, you may find different ranges.
[00:12:07.920 --> 00:12:12.320] And that's because we're really not sure how bad the radiation is yet.
[00:12:12.480 --> 00:12:14.960] It hasn't been studied as fully as it needs to be studied.
[00:12:15.440 --> 00:12:18.000] We don't have a radiation detector sitting on the moon somewhere.
[00:12:18.240 --> 00:12:18.640] What's that?
[00:12:18.960 --> 00:12:20.560] There's no radiation detector on the side.
[00:12:20.800 --> 00:12:21.920] But what kind is it detecting?
[00:12:22.240 --> 00:12:23.840] Is it detecting solar radiation?
[00:12:23.840 --> 00:12:25.520] But what about X-rays and gamma rays?
[00:12:25.520 --> 00:12:34.760] I mean, I'm not aware of any full full spectrum test that fully assesses the radiation doses that you're receiving on the moon.
[00:12:29.600 --> 00:12:36.440] Clearly, though, it's not good.
[00:12:36.760 --> 00:12:39.320] So, all right, then let's talk about a worst-case scenario.
[00:12:39.320 --> 00:12:42.200] And that happened on October 20th, 1989.
[00:12:42.200 --> 00:12:44.840] There was an X-class solar flare.
[00:12:44.840 --> 00:12:45.880] That's X-class.
[00:12:46.120 --> 00:12:47.320] There is no Y-class.
[00:12:47.320 --> 00:12:48.600] It ends with the X-class.
[00:12:48.600 --> 00:12:50.920] They are the nastiest solar flares.
[00:12:51.160 --> 00:12:56.440] That essentially caused a geomagnetic storm that bathed the moon in radiation.
[00:12:56.440 --> 00:13:02.520] And that radiation was more than eight times the radiation received by plant workers during Chernobyl.
[00:13:02.520 --> 00:13:04.120] So that's what you would have received.
[00:13:04.680 --> 00:13:11.800] If you were an astronaut on the moon, you would have received over a brief period of time eight times the dose that Chernobyl workers received.
[00:13:11.800 --> 00:13:19.240] From what I could tell, if you were an astronaut on the moon on October 20th, 1989, you probably would have died within hours.
[00:13:19.240 --> 00:13:21.400] So, I mean, that's how deadly we're talking.
[00:13:21.800 --> 00:13:24.120] You know, that's hours, that's pretty fast.
[00:13:24.120 --> 00:13:30.520] The only way you're going to die quicker on the moon is if you're hit with micrometeorite or meteorological.
[00:13:31.080 --> 00:13:32.360] That's just nasty.
[00:13:32.600 --> 00:13:36.600] And that's my next one here: micrometeorite impacts.
[00:13:36.920 --> 00:13:39.960] These are constantly bombarding the moon.
[00:13:39.960 --> 00:13:51.320] They can be very tiny particles or they could be up to a few centimeters, and they travel potentially up to 70 kilometers per second with the average impact velocity of about 20 kilometers per second.
[00:13:51.320 --> 00:13:54.360] That's a lot of kinetic energy there, even for something tiny.
[00:13:55.000 --> 00:13:59.240] The damage to structures or your head would be catastrophic.
[00:13:59.240 --> 00:14:01.000] And it's not even a direct hit.
[00:14:01.000 --> 00:14:08.440] You could get hit by the particles that are kicked up after it hits the ground on the moon, the ejecta.
[00:14:08.440 --> 00:14:09.800] Even that can be deadly.
[00:14:10.040 --> 00:14:11.800] So that's yet another one.
[00:14:11.800 --> 00:14:16.640] And then the final one on my list here is the dreaded moon dust, the regolith.
[00:14:14.920 --> 00:14:21.040] This is probably the most hated thing for Apollo astronauts on the moon.
[00:14:21.120 --> 00:14:23.120] They really, really did not like it.
[00:14:23.360 --> 00:14:26.640] Lunar soil is this dust is fine like powder.
[00:14:26.640 --> 00:14:31.520] It's even finer than I thought it was, but it's abrasive and sharp like glass.
[00:14:31.520 --> 00:14:36.160] Now, this comes from mechanical weathering on the surface of the moon.
[00:14:36.160 --> 00:14:47.200] The rocks have been fractured by meteors and micrometeorites over billions and billions of years, and it makes them really, really tiny, but they stay sharp because there's no wind and there's no water erosion.
[00:14:47.200 --> 00:14:50.000] So they stay basically nasty forever.
[00:14:50.240 --> 00:14:54.160] Anakin Skywalker would hate this far more than sand.
[00:14:54.480 --> 00:14:59.760] It not only gets everywhere, it eventually damages whatever it comes in contact with.
[00:14:59.760 --> 00:15:04.240] It even causes what the Apollo 17 astronauts called lunar hay fever.
[00:15:04.240 --> 00:15:05.280] Ever hear of that?
[00:15:05.920 --> 00:15:11.680] Every of the 12 men that walked on the moon, every one of them got this lunar hay fever.
[00:15:11.680 --> 00:15:16.320] I mean, and this was from the moon dust, the moon sand, the regolith.
[00:15:16.320 --> 00:15:22.800] It was sneezing and nasal congestion, and sometimes it took many days for it to even fade, but they all got it.
[00:15:22.800 --> 00:15:24.480] And that's just over a weekend.
[00:15:24.480 --> 00:15:27.840] They were there from like just like a day or three.
[00:15:27.840 --> 00:15:29.920] It was really, they weren't there very long.
[00:15:30.560 --> 00:15:33.440] And get this, they've done experiments with analogs.
[00:15:33.440 --> 00:15:40.720] They created this analog for the regolith, and they showed that long-term exposure would likely destroy lung and brain cells.
[00:15:41.040 --> 00:15:42.160] That would be bad.
[00:15:42.160 --> 00:15:43.120] And then he paused.
[00:15:43.360 --> 00:15:43.920] And then he paused.
[00:15:44.080 --> 00:15:45.840] Yeah, destroying lung and brain cells.
[00:15:45.840 --> 00:15:46.400] Not good.
[00:15:46.400 --> 00:15:49.120] I'm not sure which I would rather lose first.
[00:15:49.280 --> 00:15:50.400] Maybe brain cells.
[00:15:50.400 --> 00:15:53.520] Okay, so moon dust doesn't even lay nicely on the ground.
[00:15:53.520 --> 00:15:54.160] Did you know that?
[00:15:54.160 --> 00:16:04.280] It's electrostatically charged from the sun, so it actually floats a little bit above the surface, and that makes it even easier to get everywhere-to breathe it in or to get to get in your equipment.
[00:16:04.600 --> 00:16:11.560] The bottom line then is that being on the surface of the moon, like the Apollo astronauts, just for a weekend, is basically okay.
[00:16:11.560 --> 00:16:22.120] You know, it's you know, it's fairly safe, but it can be annoying, specifically the moon dust was the most annoying because none of the other big players came into play, right?
[00:16:23.160 --> 00:16:28.760] There were no micrometeoroids or micrometeorites or radiation or any of that hit them.
[00:16:28.760 --> 00:16:29.880] So, it was fairly safe.
[00:16:29.880 --> 00:16:34.680] But if you go beyond that, though, you go beyond just a weekend, like is what we're planning, right?
[00:16:34.680 --> 00:16:40.120] We're trying to make much more permanent stays on the moon, it just gets increasingly and increasingly deadly.
[00:16:40.120 --> 00:16:46.280] All right, so that's my that's my story, that's my background on why it's so nasty on the surface of the moon.
[00:16:46.280 --> 00:16:53.000] This latest news item starts with an open pit in the Mare Tranquilitatis, the Sea of Tranquility.
[00:16:53.000 --> 00:16:54.840] That's such always a beautiful name.
[00:16:54.840 --> 00:17:01.560] So, now the Sea of Tranquility, it looked like a sea, right, to early moon observers, but it's really just an ancient lava plane.
[00:17:01.560 --> 00:17:07.640] And it's also where the Apollo 11 astronauts, right, Neil Armstrong and Buzz Aldrin, first set foot on the moon.
[00:17:07.640 --> 00:17:09.400] But I mean, this is a lava plane, right?
[00:17:09.400 --> 00:17:10.760] Lava was flowing through here.
[00:17:10.760 --> 00:17:14.840] You know, there was like basaltic lava all over there many billions of years ago.
[00:17:15.080 --> 00:17:21.800] Now, these lunar pits were identified in 2009, which is actually a little bit later than I thought they were.
[00:17:21.800 --> 00:17:25.320] But in 2009, they were first really identified.
[00:17:25.320 --> 00:17:28.840] And that's probably because they look like normal craters from a distance.
[00:17:28.840 --> 00:17:32.760] But if you look closer, you see, ah, that's not an impact crater.
[00:17:32.760 --> 00:17:38.200] It looks more like a collapsed roof or a skylight, if you will, rather than that impact.
[00:17:38.200 --> 00:17:46.240] Now, by now, hundreds of these pits have been found, and the speculation is that many of these are lava tubes, billions of years old.
[00:17:46.560 --> 00:17:51.680] Basaltic lava was flowing through, was all through that area, through these mares.
[00:17:51.680 --> 00:17:56.000] They created these lava tubes, and eventually they drained away, leaving the empty tubes.
[00:17:56.000 --> 00:17:58.640] And we've got plenty of these on the Earth.
[00:17:58.640 --> 00:18:03.120] On the moon, they could be potentially even bigger because of the low gravity.
[00:18:03.120 --> 00:18:09.520] Now, the specific pit in the Sea of Tranquility is 100 meters in diameter and 150 meters deep.
[00:18:09.520 --> 00:18:10.720] So, this is kind of big.
[00:18:10.720 --> 00:18:23.760] The difference, though, this pit is special because this pit was overflown by NASA's Lunar Reconnaissance Orbiter, and it was done even more importantly, at a relatively oblique angle.
[00:18:23.760 --> 00:18:30.560] So, that relatively low angle allowed the radar to enter the tunnel at, say, 45 degrees instead of straight up and down.
[00:18:30.560 --> 00:18:39.120] And that allowed the radar to actually get under that overhang the pit walls, you know, the pit sides going down.
[00:18:39.120 --> 00:18:45.440] And so, it kind of went under, and then the radar kind of bounced around a little bit before coming back out.
[00:18:45.440 --> 00:18:52.160] And this is what showed that the underground area under the pit extended at least 170 meters.
[00:18:52.160 --> 00:18:56.640] So, far, you know, wider than wider than the actual hole going in.
[00:18:56.640 --> 00:18:58.880] So, this was at least 170 meters.
[00:18:58.880 --> 00:19:12.160] And the researchers thought this was extraordinary, and it is because clearly there is some sort of area underneath this pit that's bigger than you might see, might imagine just from looking at the pit opening itself.
[00:19:12.320 --> 00:19:13.840] And so, they thought this was extraordinary.
[00:19:13.840 --> 00:19:19.840] And, like good scientists, they figured, well, let's validate this because this is kind of amazing.
[00:19:19.840 --> 00:19:22.800] So, let's see if we can validate this in some other way.
[00:19:22.800 --> 00:19:39.800] And so, what the direction they decided to take was to use a 3D, to create models, a 3D computer model of the pit, matching the visible geometry, the known geometry of the pit from images and using basically like 3D stereoscopic images to creating this 3D image.
[00:19:39.800 --> 00:19:44.520] So they had the geometric layout of the pit and the surrounding lunar surface.
[00:19:44.520 --> 00:19:47.240] So they used that as kind of like the base of their novel.
[00:19:47.240 --> 00:19:50.200] Here's what we know from what we can see of that area.
[00:19:50.440 --> 00:19:55.160] The model then simulated that oblique radar beam that came off of the orbiter.
[00:19:55.160 --> 00:20:03.560] And they went through many iterations of possible geometries until the simulation matched the real observational data from the moon.
[00:20:03.560 --> 00:20:04.280] So you get that?
[00:20:04.280 --> 00:20:10.600] They tweaked and tweaked the model until the model was basically doing exactly what reality was doing.
[00:20:10.600 --> 00:20:19.560] So that way they have a pretty solid, high confidence that, all right, so whatever the model is saying now, whatever it's concluding, could be potentially true.
[00:20:19.560 --> 00:20:26.440] So the model made a couple of different solutions, and there was only one solution that was geologically plausible.
[00:20:26.440 --> 00:20:35.560] And that solution contained a big cave conduit that was up to 170 meters long, but could be even bigger, they say.
[00:20:35.560 --> 00:20:36.680] So that was their conclusion.
[00:20:36.680 --> 00:20:44.520] So according to these researchers, there's very likely to be a sizable subsurface cavern or tunnel on the moon.
[00:20:44.520 --> 00:20:48.040] And in their mind, it seemed like this is basically a done deal.
[00:20:48.840 --> 00:20:51.240] Their confidence levels are very, very high.
[00:20:51.400 --> 00:20:53.880] And that's awesome from my point of view, obviously.
[00:20:53.880 --> 00:21:06.840] But it's also, at the same time, I feel like, yeah, it's about time we confirmed this, because it seemed, you know, looking at these pictures, it seemed pretty obvious that there was some sort of space underneath these pits bigger than you would think.
[00:21:06.840 --> 00:21:13.080] So I'm just very kind of happy and relieved that they finally are really accepting this.
[00:21:13.400 --> 00:21:15.000] All right, so what's the next step here?
[00:21:15.600 --> 00:21:19.440] The next step is to determine how big this is.
[00:21:19.440 --> 00:21:20.400] Because think about it.
[00:21:20.640 --> 00:21:24.720] We have the radar going straight down in one direction.
[00:21:24.720 --> 00:21:30.880] So we know that this is kind of an extended kind of tunnel-like 175 meters or more.
[00:21:30.880 --> 00:21:35.440] But what they need to do is they need to do more flybys, but from different angles.
[00:21:35.440 --> 00:21:40.320] So when you hit it from different angles, you're looking at different areas of this subsurface cavern.
[00:21:40.320 --> 00:21:44.000] You know, is it very narrow, making it a tube, or not?
[00:21:45.200 --> 00:21:57.040] So they say right now that even though they have really no idea how wide it is, it's probably almost certainly 55 to 60 meters wide, which would mean it's probably a lava tube.
[00:21:57.040 --> 00:22:03.120] But they say that it could potentially be hundreds of meters wide, which would make it more cave-like than tube-like.
[00:22:03.120 --> 00:22:09.600] So, you know, it could be a lava tube, it could be a bigger, it could be a gargantuan lava tube, or perhaps more of a cave-like system.
[00:22:09.840 --> 00:22:15.440] They're not sure, and they say that the only way to do it is to do more flybys, which I hope we really do.
[00:22:15.760 --> 00:22:27.120] Okay, so the low-gravity elephant in this pit is the idea that if it really is roomy down there, then it would make a great location for a moon base alpha.
[00:22:27.280 --> 00:22:28.960] And the scientists actually say this.
[00:22:28.960 --> 00:22:42.160] They say in their paper, this discovery suggests that the MTP, the pit, basically, is a promising site for a lunar base as it offers shelter from the harsh surface environment and could support long-term human exploration of the moon.
[00:22:42.160 --> 00:22:55.280] And in my mind, it's not only fun to think of colonies in these lunar caves, and of course, the protection they would offer would be really dramatic, and that's why I went into some detail about how dangerous the surface is.
[00:22:55.280 --> 00:22:57.280] So it would be so much safer down there.
[00:22:57.920 --> 00:23:02.520] It seems like a no-brainer in many ways since this cave is already there.
[00:23:03.160 --> 00:23:09.560] Because once you're in this cave system, the radiation, the micrometeorites, all that stuff goes away.
[00:23:10.040 --> 00:23:13.400] And get this, the temperature difference goes away as well.
[00:23:13.400 --> 00:23:19.080] Because I found a study that looked into what the temperature could potentially be in these pits.
[00:23:19.080 --> 00:23:25.720] And some researchers are saying it could be like consistently 63 degrees Fahrenheit.
[00:23:25.720 --> 00:23:28.360] I don't know exactly what that is in Celsius, but that's nice.
[00:23:28.360 --> 00:23:29.480] That's like nice weather.
[00:23:29.480 --> 00:23:31.240] That's t-shirt weather.
[00:23:31.240 --> 00:23:32.120] I'm not sure how that works.
[00:23:32.600 --> 00:23:34.280] You wear a t-shirt in 63 degrees?
[00:23:34.760 --> 00:23:36.040] We are from different parts of the country.
[00:23:36.280 --> 00:23:37.960] All right, maybe it was cold to me.
[00:23:37.960 --> 00:23:39.640] Yeah, okay, light jacket, hoodie weather.
[00:23:39.640 --> 00:23:40.600] Very light, very light.
[00:23:40.600 --> 00:23:42.520] But that's, to me, that's amazing.
[00:23:42.520 --> 00:23:55.480] I didn't do a deep dive on that paper, but even if it's, even if that's not correct, even if it's much higher or even much lower, but a consistent temperature, like around that temperature, would be amazing.
[00:23:55.480 --> 00:23:56.520] Totally amazing.
[00:23:56.520 --> 00:24:01.240] Now, of course, it all seems pretty pie in the sky with modern technology, right?
[00:24:01.720 --> 00:24:12.600] Getting all the industrial equipment and people up there and working out how to build a moon base in such an environment as the moon is obviously going to be ridiculously hard.
[00:24:13.080 --> 00:24:14.840] We cannot do that right now.
[00:24:14.840 --> 00:24:25.400] And I think it's before we see anything substantial on the moon, even in these pre-made caverns under the moon's surface, I think it's going to take a hell of a long time.
[00:24:25.400 --> 00:24:32.280] Steve, if you had to make a prediction, 100 years, 80 years, it depends on something like that.
[00:24:32.280 --> 00:24:35.560] I mean, it all depends on how many resources you want to put into it.
[00:24:35.800 --> 00:24:37.320] You know, we are going back to the moon.
[00:24:37.320 --> 00:24:39.800] We are going to try to have a sustained presence on the moon.
[00:24:39.800 --> 00:24:43.960] If we want to build a base like this, it would be a huge engineering effort.
[00:24:43.960 --> 00:24:47.920] I mean, as you said, think of all the equipment we have to bring down to the surface of the moon.
[00:24:47.920 --> 00:24:54.320] It would take decades to do this kind of construction, even once we are permanently on the moon.
[00:24:54.320 --> 00:24:56.960] But if we want to do it, we can do it.
[00:24:56.960 --> 00:24:59.120] We can do this with our current technology.
[00:24:59.120 --> 00:25:00.400] It's not a technology issue.
[00:25:00.400 --> 00:25:03.280] It's just an effort and resource issue.
[00:25:03.520 --> 00:25:04.240] It really is.
[00:25:04.240 --> 00:25:08.880] And I think they're going to take how dangerous the surface is.
[00:25:08.880 --> 00:25:10.240] They're going to take it seriously.
[00:25:10.240 --> 00:25:16.160] And they're not going to immediately, of course, try to go into these caverns.
[00:25:16.160 --> 00:25:18.000] And by the way, I am waiting.
[00:25:18.000 --> 00:25:26.160] I hope I live long enough to see the first images from a lander that's actually cruising around in one of these tunnels.
[00:25:26.160 --> 00:25:28.240] That would be an amazing moment.
[00:25:28.240 --> 00:25:29.440] And I think they will.
[00:25:29.440 --> 00:25:30.560] They're going to take this seriously.
[00:25:30.560 --> 00:25:45.360] So whatever they construct on the moon, they're going to make sure that you pile up enough regolith, you create enough of a shield to protect you not only from radiation, but for some of the nastier, maybe some of the smaller micrometeorites.
[00:25:45.680 --> 00:25:47.920] They'll take protection seriously.
[00:25:47.920 --> 00:25:58.000] Yeah, if you made a protective structure on a moon base that had two to three feet of moon crete on the outside, that would go a long way to be protecting from radiation.
[00:25:58.000 --> 00:25:58.880] Oh, absolutely.
[00:25:58.880 --> 00:26:01.520] That's basically like a given.
[00:26:01.520 --> 00:26:03.680] It's got to be, they've got to do something like that.
[00:26:03.680 --> 00:26:11.120] Otherwise, I mean, it's like, oh, yeah, we just lost all of our astronauts on the moon because they weren't protected enough from this solar event.
[00:26:11.120 --> 00:26:21.680] So, yeah, I hope they take it very seriously and realize that, yeah, it's going to be very difficult to create a large and very safe structure on the moon.
[00:26:21.680 --> 00:26:22.800] It'd be just so easy.
[00:26:22.800 --> 00:26:24.240] Just go underground, man.
[00:26:24.240 --> 00:26:25.440] It's just right there.
[00:26:25.440 --> 00:26:38.040] And Steve, and also, Steve, I know you mentioned in your blog that the cave walls might be sharp, but I don't think they would be because the cave walls, this is just lava.
[00:26:38.040 --> 00:26:44.600] This is just lava that it wasn't mechanically weathered by being hit by micrometeorites over billions of years.
[00:26:44.760 --> 00:26:49.560] I think the surface of the tunnel itself would be fairly safe.
[00:26:49.560 --> 00:26:50.360] Yeah, that would be nice.
[00:26:50.360 --> 00:26:50.840] Yeah, it depends.
[00:26:50.840 --> 00:26:53.960] I mean, some lava tubes on the Earth, many are smooth.
[00:26:53.960 --> 00:26:54.840] Some are rough, though.
[00:26:55.000 --> 00:26:57.240] There are none that I think of that are sharp.
[00:26:57.240 --> 00:26:59.800] So hopefully that will be the same on the moon.
[00:26:59.800 --> 00:27:01.480] It just depends on what the conditions are there.
[00:27:02.040 --> 00:27:05.320] And that's the other huge thing that I didn't probably stress enough.
[00:27:05.320 --> 00:27:21.160] The fact that we have a huge, very deep tunnel on the moon right now could do amazing things for just scientific discovery and learning more about the moon because you've got this pristine lunar material that has not been exposed to the sun and galactic cosmic rays.
[00:27:21.160 --> 00:27:23.960] And who knows what we'll discover about the moon once we get down there?
[00:27:23.960 --> 00:27:30.440] Doesn't sound like the moon's ever going to be a tourist attraction or kind of this recreational spot for people to go.
[00:27:30.440 --> 00:27:30.840] Yeah.
[00:27:31.640 --> 00:27:32.520] Sure, eventually.
[00:27:32.680 --> 00:27:34.600] I mean, who would think of that trip, though?
[00:27:34.600 --> 00:27:43.400] I mean, even better than like, say, low Earth orbit, going to the moon for a week would be, you know, once it was safe, I think it would be an amazing adventure.
[00:27:43.400 --> 00:27:51.960] Once, I mean, if it ever, if it ever gets as routine as like, say, traveling across the planet, I think there could be lots of people that would go.
[00:27:51.960 --> 00:27:54.280] Who knows how it's going to happen?
[00:27:54.280 --> 00:27:54.760] All right.
[00:27:54.760 --> 00:27:55.560] Thanks, Bob.
[00:27:55.560 --> 00:27:59.320] Kara, tell us about AI Love.
[00:27:59.320 --> 00:28:00.760] AI Love.
[00:28:01.240 --> 00:28:02.120] Who's that?
[00:28:02.120 --> 00:28:10.040] Okay, so before I dive into this story, basically, which was published, I read The Conversation a lot.
[00:28:10.040 --> 00:28:11.640] I know that we've talked about it on this show.
[00:28:11.640 --> 00:28:20.960] The Conversation is a website that has lots of different verticals, and the authors of the pieces on The Conversation are academics.
[00:28:20.960 --> 00:28:25.040] So it's sort of a from the horse's mouth format.
[00:28:25.360 --> 00:28:34.480] And there's an article that came out recently called Computer Love: AI-Powered Chatbots Are Changing How We Understand Romantic and Sexual Well-Being.
[00:28:34.480 --> 00:28:48.560] And it's by three different authors from the University of Quebec at Montreal, and I said that very American because I can't pronounce it in the French, and a researcher at the Kinsey Institute at Indiana University.
[00:28:48.560 --> 00:28:49.680] So these are psychologists.
[00:28:50.240 --> 00:28:51.600] As an Alfred Kinsey?
[00:28:51.600 --> 00:28:52.000] Yeah, yeah, yeah.
[00:28:52.160 --> 00:28:53.360] As an Alfred Kinsey, yeah.
[00:28:54.160 --> 00:28:58.800] So this is these are researchers in psychology and sexology departments.
[00:28:59.200 --> 00:29:01.120] When you say it's like, there you go.
[00:29:01.360 --> 00:29:02.560] I mean, what else?
[00:29:02.560 --> 00:29:11.600] The first thing I want to know, kind of from you all, is when is the last time or do you regularly interact with chatbots?
[00:29:11.600 --> 00:29:19.120] Like, I'm thinking I have interacted with chatbots when I need to like contact IT or customer service.
[00:29:19.120 --> 00:29:19.600] But I can't.
[00:29:19.840 --> 00:29:20.160] I see.
[00:29:20.160 --> 00:29:21.040] Yeah, right, right.
[00:29:21.040 --> 00:29:24.880] I can't think of other times when I regularly interact with chatbots.
[00:29:25.040 --> 00:29:27.840] Would you call chat GPT a chatbot?
[00:29:27.840 --> 00:29:28.960] I don't think so.
[00:29:28.960 --> 00:29:29.360] Okay.
[00:29:29.360 --> 00:29:33.360] Yeah, because I think it's something where you're having a back and forth conversation.
[00:29:33.360 --> 00:29:39.040] And so, you know, there are digital and AI-powered assistants like Siri and Alexa.
[00:29:39.040 --> 00:29:43.840] And then we're starting to see more and more chatbots on the rise for a lot of different applications.
[00:29:43.840 --> 00:29:49.920] So I think my exposure to these chatbots really generally is just customer service, which means I hate them.
[00:29:50.240 --> 00:29:51.920] I hate them with a burning passion.
[00:29:52.720 --> 00:29:54.320] Speak to a person, please.
[00:29:54.320 --> 00:29:55.040] Exactly.
[00:29:55.040 --> 00:30:04.120] But there is a growing industry of chatbots for kind of all manner of services, one of which is romantic companions.
[00:30:04.120 --> 00:30:12.680] Apparently, there are over a hundred AI-powered apps that offer romantic and sexual engagement.
[00:30:13.240 --> 00:30:14.280] Only 100?
[00:30:14.280 --> 00:30:15.240] Yeah, over 100.
[00:30:15.560 --> 00:30:18.600] And the people know that it's a chatbot when they're doing it.
[00:30:18.840 --> 00:30:19.400] 100%.
[00:30:19.400 --> 00:30:20.520] 100%.
[00:30:20.520 --> 00:30:27.720] So some of the ones that they listed on here are Myanima.ai, Eva AI, KnowMe.ai.
[00:30:28.520 --> 00:30:29.720] Myanima.
[00:30:30.040 --> 00:30:31.080] M-Y-A-N-A.
[00:30:32.440 --> 00:30:34.280] Myanima.ai.
[00:30:34.280 --> 00:30:37.320] Eva AI, KnowMe.ai, and Replica.
[00:30:38.200 --> 00:30:47.800] And these are different apps, I guess, that you download to your phone where because they're AI-powered, these chatbots evolve the longer you talk to them.
[00:30:47.800 --> 00:30:49.640] They understand what you're interested in.
[00:30:49.640 --> 00:30:57.160] They understand, you know, turns of phrase that you like to use, shortcuts, how much you emote, you know, sort of your affective stance.
[00:30:57.480 --> 00:30:59.800] Are they chat GPT-based?
[00:30:59.800 --> 00:31:02.440] I think they're all different, but probably some of them are.
[00:31:02.760 --> 00:31:03.480] I would assume, right?
[00:31:03.720 --> 00:31:04.600] Yeah, I would assume so.
[00:31:04.840 --> 00:31:06.120] They'd have to be at this point.
[00:31:06.360 --> 00:31:10.920] But yeah, I'm not sure what the, like, what AI platform they're being built upon.
[00:31:10.920 --> 00:31:11.720] Right, right.
[00:31:11.720 --> 00:31:13.480] What's the target audience?
[00:31:13.480 --> 00:31:14.520] Anyone, I would think.
[00:31:14.520 --> 00:31:15.320] Anyone who's interested.
[00:31:15.560 --> 00:31:17.000] Anyone with a sex drive?
[00:31:17.000 --> 00:31:18.840] Yeah, so yeah, but who is interested?
[00:31:18.840 --> 00:31:20.840] And so that is the question, right?
[00:31:20.840 --> 00:31:36.600] And I think it's important for us to kind of approach this with an open mind and to start asking some important questions because there is actually a growing body of scientific data on these topics.
[00:31:36.600 --> 00:31:47.520] There are a lot of studies across multiple disciplines asking questions like: Can people feel something for a chat bot?
[00:31:48.080 --> 00:31:50.560] And the answer seems to be across the board, yes.
[00:31:50.560 --> 00:31:52.160] Yeah, people have known that for decades.
[00:31:52.640 --> 00:32:01.120] Emotional bonds, some people self-identify as having fallen in love with a chatbot, knowing that it's a chat bot.
[00:32:01.120 --> 00:32:33.120] And interestingly, there was one study that was cited in this coverage that showed that when everyday people are engaging with either a potential romantic partner who is human or an AI version, a chat bot, which is a potential romantic partner, that on average, people tend to choose a more responsive AI over a less responsive human being, even though they know that it's an AI.
[00:32:33.120 --> 00:32:38.080] Is that because they feel they can manipulate the conversation more to their liking with an AI?
[00:32:38.080 --> 00:32:38.720] I don't know.
[00:32:38.720 --> 00:32:41.360] Well, first of all, I don't know if anybody can answer that question.
[00:32:41.360 --> 00:32:44.960] So, like, I think that's a sort of a rhetorical question.
[00:32:44.960 --> 00:32:47.200] It's probably different for different people.
[00:32:47.200 --> 00:32:49.360] But I think that that may be one reason.
[00:32:49.360 --> 00:32:51.920] It's not the first reason I would jump to.
[00:32:51.920 --> 00:32:54.400] I would think it's because they are responsive.
[00:32:54.400 --> 00:32:55.760] They're engaged with you.
[00:32:55.760 --> 00:33:11.840] But not only that, from what you've said, they're kind of like Zealigs where they adapt themselves to you, which you wouldn't really want somebody to do to a large extent in normal conversation with human to human because then it's just like weird.
[00:33:12.080 --> 00:33:12.480] Really?
[00:33:12.480 --> 00:33:13.760] Really, Bob Jigzo?
[00:33:14.080 --> 00:33:28.240] I can almost guarantee you, there are probably hundreds of studies out there that show that people feel most heard, people feel most connected when you mirror their behaviors, when you respond in ways that are similar to how they talk.
[00:33:29.040 --> 00:33:37.320] But it seems like this system, from how you described it, maybe I'm making assumptions here, would do it to a much more dramatic degree.
[00:33:37.640 --> 00:33:45.720] Possibly, there's some conscious, there's some conscious mirroring for sure, and that's just kind of instinctive, and you're not maybe not even aware that you're doing it.
[00:33:46.200 --> 00:33:59.400] And that's fine, but I wouldn't, it just made me think of interacting with somebody whose whole purpose is to not even be themselves, but to make themselves an extension of me.
[00:33:59.640 --> 00:34:03.400] And that's just, I don't think that's necessarily healthy, right?
[00:34:03.400 --> 00:34:12.280] I think that there are some assumptions being made in that statement that are not necessarily reflective of how most people are.
[00:34:12.280 --> 00:34:19.320] I think that if you, when's the last time that, I mean, you guys don't have to answer this if you don't want to, but have any of you ever been on dating apps?
[00:34:19.320 --> 00:34:19.720] Yeah, sure.
[00:34:19.880 --> 00:34:21.480] I have never been on a dating app.
[00:34:21.480 --> 00:34:22.760] I didn't think so.
[00:34:23.240 --> 00:34:25.800] I'm like, I'm talking to a bunch of people readmen.
[00:34:26.440 --> 00:34:30.600] But on a dating app, very often you connect with somebody for the first time.
[00:34:30.600 --> 00:34:36.360] You know nothing about them except for this over-the-top representation that they are trying to present to you.
[00:34:36.360 --> 00:34:42.760] And then when you start engaging, you start to recognize things like, oh, they don't know the difference between your, your, and your.
[00:34:43.320 --> 00:34:45.400] It's a resume, then the interview, right?
[00:34:45.400 --> 00:34:45.640] Right.
[00:34:45.640 --> 00:34:49.160] It's like, okay, and does this person have any emotional intelligence whatsoever?
[00:34:49.160 --> 00:34:50.200] Are they listening to me?
[00:34:50.200 --> 00:34:51.640] Are they asking me questions?
[00:34:51.640 --> 00:35:07.960] And Bob, I would venture to guess that an individual who connects with somebody on a dating app where the person or the chatbot that they connect with is saying all the things you'd hope they would say is going to be somebody that you fall in love with more quickly.
[00:35:07.960 --> 00:35:09.400] Yeah, for sure.
[00:35:09.400 --> 00:35:17.200] I remember to this day, I remember one of the most engaging back and forth I had with somebody on a dating app, and it was incredible.
[00:35:14.920 --> 00:35:20.320] We had so much in common, it really was a joy.
[00:35:21.360 --> 00:35:31.120] But again, that's one having things in common is one thing, but having someone adapt to you on the fly over time just strikes me.
[00:35:31.120 --> 00:35:42.640] It reminds me of the metamorph from the Next Generation episode where the woman actually attuned everything about herself to her mate so that she became the perfect mate for that person.
[00:35:42.640 --> 00:35:44.960] And it was like, that's just not right.
[00:35:45.840 --> 00:35:48.000] Bob, but what's your point with all this?
[00:35:48.000 --> 00:35:55.280] My point is that a natural, two people that have natural, many things naturally in common is fantastic.
[00:35:55.280 --> 00:36:02.160] But having somebody who adapts to you on purpose just to get along, to me, that crosses a line.
[00:36:02.400 --> 00:36:06.400] You may think it crosses a line, but the question is, how will people respond to that?
[00:36:06.400 --> 00:36:07.440] Yeah, and they'll love it.
[00:36:08.400 --> 00:36:10.160] They're going to love that shit.
[00:36:10.160 --> 00:36:11.440] They're going to love that shit.
[00:36:11.840 --> 00:36:14.080] I'm just giving you my take on it.
[00:36:14.080 --> 00:36:14.400] I would.
[00:36:14.720 --> 00:36:16.480] And I want to get into those implications.
[00:36:16.480 --> 00:36:20.960] And I think that sort of a takeaway from this is: yes, there could be a point where it was creepy, right?
[00:36:20.960 --> 00:36:28.880] Where somebody, where your potential romantic chatbot partner felt like too sycophantic and too inflexible.
[00:36:29.040 --> 00:36:30.000] I could see that.
[00:36:30.000 --> 00:36:33.360] But I think most people would love it, as you just said.
[00:36:33.680 --> 00:36:34.160] I don't deny that.
[00:36:35.280 --> 00:36:36.720] Humans are humans, after all.
[00:36:36.720 --> 00:36:37.200] Exactly.
[00:36:37.520 --> 00:36:39.440] I'm just kind of meta-human.
[00:36:39.760 --> 00:36:42.320] And so sure you are, Bob.
[00:36:43.360 --> 00:36:46.720] Can we figure out how to do this experiment and not tell Bob we're doing it?
[00:36:46.800 --> 00:36:48.880] I want to see how he actually responds to it.
[00:36:48.880 --> 00:36:53.760] Oh, if it's a good algorithm that does it seamlessly, of course.
[00:36:54.080 --> 00:36:58.320] I am, there's lots of lots, there's a lot of parts of me that are human, after all.
[00:36:58.320 --> 00:37:02.440] So I think I am absolutely can be swayed by that.
[00:36:59.840 --> 00:37:06.440] But it's just the way it was presented, that it's adapting to you over time.
[00:37:06.760 --> 00:37:08.120] Yeah, that's what AI does.
[00:37:08.120 --> 00:37:08.920] It adapts.
[00:37:09.240 --> 00:37:09.800] Yeah, it adapts.
[00:37:09.880 --> 00:37:12.120] It's like definitionally what it does, right?
[00:37:12.120 --> 00:37:30.600] And so the question here is: aside from the ick factor that Bob has flagged for himself personally, like his personal proclivities, what are some of the legitimate moral, ethical, you know, what are the actual potential problems?
[00:37:31.080 --> 00:37:36.280] It's like there's a lot, I think, for creating unhealthy relationships.
[00:37:36.280 --> 00:38:03.560] It's like the way advertising markets men and women that are basically weaponized beauty and people with people that are that are amazingly good looking right out of the gate and then they add the makeup and then they add the Photoshop tweaks, making people that are just like completely unrealistic and giving people very unrealistic goals or goals to achieve.
[00:38:03.720 --> 00:38:05.000] I want to be that pretty.
[00:38:05.000 --> 00:38:09.880] I want to wear all this makeup and have cosmetic surgery so I can be that pretty.
[00:38:09.880 --> 00:38:25.960] So when you create a relationship based on that, you're creating a relationship with somebody who's unrealistic because they're so attuned to you that I think you would be unsatisfied with almost anybody else because they wouldn't be as attuned to you as this AI person.
[00:38:26.280 --> 00:38:41.640] So the outcome that you are identifying in this scenario is that you, as the consumer, are now going to be unsatisfied in real relationships, or in, I should say, in analog relationships.
[00:38:41.640 --> 00:38:42.040] Right.
[00:38:42.040 --> 00:39:01.920] Well, I think the worst case scenario here in terms of the effects on people are that, you know, would these AI girlfriend or whatever apps, significant other apps, create an arms race to create the most addictive, the most appealing, the most everything that an AI could be.
[00:39:01.920 --> 00:39:10.720] Would that create completely unrealistic expectations of people in terms of relationships that no living person could ever keep up with?
[00:39:10.720 --> 00:39:19.600] But at the same time, it could create the pressure for people to feel like they have to be now as good as the AI, and that could be extremely unhealthy.
[00:39:19.600 --> 00:39:23.040] Yeah, and I think that that social isolation concern, right?
[00:39:23.040 --> 00:39:26.160] Because the eventual outcome of that would be social isolation.
[00:39:26.160 --> 00:39:29.440] It would be the lack of engagement.
[00:39:29.440 --> 00:39:31.840] I think that that is a legitimate concern.
[00:39:31.840 --> 00:39:43.200] And to me, that's sort of, I don't want to say it's the best case scenario, but I think an even more pernicious outcome is a lack of growth.
[00:39:43.200 --> 00:39:43.920] It's a lack.
[00:39:44.080 --> 00:39:50.160] So the consumer, the end user, is now not learning about things like empathy.
[00:39:50.160 --> 00:39:54.080] They're not learning skills in relationships like compromise.
[00:39:54.080 --> 00:39:55.920] They're not learning rejection either.
[00:39:55.920 --> 00:40:01.680] Yeah, they're not learning how to cope with resiliency when they are rejected.
[00:40:02.400 --> 00:40:12.160] I would argue that the best AI chatbot people would be ones that can potentially push you to be a better person.
[00:40:13.040 --> 00:40:23.760] That would be something that would be interesting as hell to have a relationship with an AI that would actually be, that could actually make you a better person from many different angles.
[00:40:24.240 --> 00:40:24.560] It would.
[00:40:24.560 --> 00:40:28.000] And researchers are working on developing that for that very purpose.
[00:40:28.160 --> 00:40:29.040] That is a cool idea.
[00:40:29.280 --> 00:40:30.200] So, think about one more thing.
[00:40:32.200 --> 00:40:49.800] Just to kind of recap what was just said: if you're the end user, there is a potential outcome in which you become more and more socially isolated because you start to develop more and more unrealistic expectations of a partner, which, as you mentioned, Bob, it's 100% already happening.
[00:40:49.800 --> 00:40:56.920] We see this with a lot of, like, you know, there's the whole incel movement, the involuntarily celibate movement.
[00:40:57.160 --> 00:41:26.520] We see this a lot when individuals have unrealistic expectations of what actual partnership looks like, when there's a sort of privileged or a self-centered perspective that my partner is there to serve me, to give me the things that I require and that I deserve in this world, as opposed to my partner is a human being, and this is a relationship where we are egalitarian in nature and we are compromising with one another.
[00:41:26.520 --> 00:41:36.440] And so, yes, the first negative outcome is I am now alone because I had these expectations of people and then people kept failing me because they weren't as good as my chat bot.
[00:41:36.440 --> 00:41:47.800] But the second, which I believe is a more pernicious, is now I'm sort of running away on a negative feedback loop of training and I start to treat people the way I treat a chat bot.
[00:41:47.800 --> 00:41:55.960] And this comes back to the conversation we had, was it just last week about robot engaging with robots in our natural environment?
[00:41:55.960 --> 00:42:04.040] And if I know that the robot doesn't have feelings and I can treat it in a very particular way, is that going to affect how I then treat people?
[00:42:04.440 --> 00:42:29.120] Now we're talking about one level more, which is an emulation of a person, in one of the most vulnerable and intimate ways that you can engage with a person, where the psychological flexibility, the emotional maturity, having done the work on yourself is so fundamentally important to be able to have a healthy relationship and have healthy companionship.
[00:42:29.120 --> 00:42:36.160] And would you do any of that if you grew up only engaging or regularly engaging with AI chatbots?
[00:42:36.160 --> 00:42:46.720] I think about the comparison that comes to my mind, and I'm curious what you guys think about this, because this wasn't in the article, but it popped up while I was reading the article, is we talk about driverless cars a lot on the show.
[00:42:46.720 --> 00:42:49.520] And we talk about are they safer, are they more dangerous?
[00:42:49.520 --> 00:42:57.200] It's the nuanced gray area of when there are driverless cars on the road and human drivers on the road that it's the most dangerous.
[00:42:57.200 --> 00:43:03.440] Because the way that they engage, if it was all just driverless cars, they would probably communicate with each other well and there wouldn't be as much danger.
[00:43:03.440 --> 00:43:17.200] But because there's a mix, and that's what I worry about here: individuals dipping their toe into AI companions and then attempting, I don't know, analog human relationships, how do they play off of each other?
[00:43:17.200 --> 00:43:19.760] How do they affect our humanity, really?
[00:43:20.080 --> 00:43:26.160] There's a whole other thing that they talk about in the article about just security, like basically just surveillance.
[00:43:26.160 --> 00:43:32.640] We know that most of these apps are collecting and selling personal user data, you know, for marketing purposes.
[00:43:32.640 --> 00:43:42.560] Imagine the intimacy, the intimate nature of that data, and just how, you know, potentially dangerous that could be exactly.
[00:43:42.560 --> 00:43:51.520] But on the flip side, as you mentioned, the researchers are actively doing a study right now where they are assessing the use of chatbots.
[00:43:51.520 --> 00:43:58.560] This is directly from the article: quote, to help involuntary celibates improve their romantic skills and cope with rejection.
[00:43:58.560 --> 00:44:05.880] So, most of the chatbots that are training chatbots on the market right now tend to be used for sexual health education.
[00:44:05.880 --> 00:44:15.800] So, like helping understand, I don't know, consent or helping understand, maybe they're not even that sophisticated, helping understand STI risks and things like that, reproductive, you know, health.
[00:44:15.800 --> 00:44:26.440] But development of chatbots to help individuals learn interpersonal skills, to help them learn vulnerability, to help them learn things like consent.
[00:44:26.440 --> 00:44:30.840] I think that there's a really amazing opportunity there.
[00:44:30.840 --> 00:44:39.560] So, I think, you know, the big takeaway from the researchers is: I like this sentence here: quote, they raise, okay, however, blah, blah, blah, they raise privacy issues, ethical concerns.
[00:44:39.560 --> 00:44:52.200] They underscore the need, all of these issues and concerns, underscore the need for, quote, an educated, research-informed, and well-regulated approach for positive integration into our romantic lives.
[00:44:52.200 --> 00:44:56.280] But current trends indicate that AI companions are here to stay.
[00:44:56.600 --> 00:44:58.360] Like, this is the reality, right?
[00:44:58.360 --> 00:45:12.040] So, how do we ensure that this reality is safe, that this reality is ethical, and that this reality is utilized for harm reduction, not for increasing harm?
[00:45:12.040 --> 00:45:24.360] And when we talk about harm, I mean physical, psychological, financial, all of it, because all of those things are at risk when we're talking about intimate relationships with basically black box AI.
[00:45:24.360 --> 00:45:25.720] All of those things are at risk.
[00:45:25.720 --> 00:45:26.200] Yeah, I agree.
[00:45:26.200 --> 00:45:27.560] That would be like the best case scenario.
[00:45:27.560 --> 00:45:40.600] That would be awesome to have AI companions or whatever, teachers, significant others that are programmed to make you your best self, to challenge you, to work on your personality, your skills, all of that.
[00:45:40.600 --> 00:45:41.240] That would be great.
[00:45:41.240 --> 00:45:45.280] But you could also see this instantly becoming part of the culture wars.
[00:45:45.360 --> 00:45:46.080] It's like, what?
[00:45:46.080 --> 00:45:48.480] Now we've got to be nice to these AI robots.
[00:45:48.480 --> 00:45:50.880] I mean, can I just have my robot slave and be done with it?
[00:45:44.680 --> 00:45:52.080] You got to shame me about it.
[00:45:52.400 --> 00:45:53.280] It's so sad.
[00:45:53.280 --> 00:45:56.080] Like, what does that say about you that you want a robot slave?
[00:45:56.080 --> 00:45:57.280] You know what I mean?
[00:45:58.960 --> 00:46:01.360] Let's self-reflect on that a little bit.
[00:46:01.600 --> 00:46:02.720] It's worth a shot.
[00:46:02.720 --> 00:46:03.680] Let's put it that way.
[00:46:03.680 --> 00:46:05.840] It's worth a shot, but it's not worth a shot in the dark.
[00:46:05.840 --> 00:46:08.960] It's worth a shot done very safely and cautiously.
[00:46:09.440 --> 00:46:15.040] Right, but won't there be bad actors out there who will just throw something together and always.
[00:46:15.040 --> 00:46:16.640] It's probably already happening.
[00:46:16.640 --> 00:46:22.960] I mean, apparently, one of these companies, they were saying we never wanted it, we never intended this to be sexual.
[00:46:22.960 --> 00:46:26.320] It was supposed to be like a friend, right?
[00:46:26.640 --> 00:46:30.400] Well, one of the companies, one of this many, were like, okay, this is like your AI friend.
[00:46:30.400 --> 00:46:35.040] And then people started having sex with them, you know, having cybersex with them.
[00:46:35.040 --> 00:46:39.120] And they started having all of these intense relationships and said they fell in love.
[00:46:39.120 --> 00:46:48.160] And when the developers realized it's being used in this way that we didn't intend and there's some risks there, they cut out that functionality.
[00:46:48.160 --> 00:46:51.520] And that completely changed the AI's algorithm.
[00:46:51.520 --> 00:46:58.400] And all of a sudden, all of these people's friends or companions started to act really differently than they had before.
[00:46:58.400 --> 00:47:01.840] And people had psychological distress.
[00:47:01.840 --> 00:47:04.000] Like there were Reddit threads opening up.
[00:47:04.000 --> 00:47:05.680] There were all of these different conversations.
[00:47:05.680 --> 00:47:07.520] Like, I feel rejected.
[00:47:07.760 --> 00:47:09.840] My girlfriend broke up with me.
[00:47:09.840 --> 00:47:11.760] She suddenly doesn't want me anymore.
[00:47:11.760 --> 00:47:15.200] And it was as if they were dumped by a human being.
[00:47:15.440 --> 00:47:23.120] And so they actually, under so much pressure, reinstated the functionality because it was so traumatic for their end users.
[00:47:23.120 --> 00:47:25.600] So, like, these are real life examples.
[00:47:25.600 --> 00:47:26.240] Oh, my God, wow.
[00:47:26.480 --> 00:47:27.240] Yeah, of the fact that they're talking about that.
[00:47:27.360 --> 00:47:29.480] We're talking about some fragile people.
[00:47:30.040 --> 00:47:33.480] Well, I mean, I don't know if that's a fair thing to say.
[00:47:34.040 --> 00:47:34.520] Have you ever?
[00:47:29.120 --> 00:47:35.000] I don't know.
[00:47:36.680 --> 00:47:39.400] Have you ever had a terrible breakup?
[00:47:39.640 --> 00:47:41.000] Yes, I did have a terrible breakup.
[00:47:41.160 --> 00:47:43.080] Were you a fragile person at that time?
[00:47:43.400 --> 00:47:45.320] At the time, I probably was.
[00:47:45.320 --> 00:47:46.520] Or maybe you were just human.
[00:47:47.400 --> 00:47:49.080] Well, sure.
[00:47:49.080 --> 00:47:52.920] I mean, but I didn't mean to say that there are fragile and non-fragile people.
[00:47:52.920 --> 00:47:55.080] I think everybody has some fragility to them.
[00:47:55.080 --> 00:47:55.400] Right.
[00:47:55.400 --> 00:48:00.360] I think that this is just a very vulnerable topic and a vulnerable experience.
[00:48:00.360 --> 00:48:15.000] When you open yourself up and you really are, you know, your true authentic self, whether it's to an AI or to a human being, when you're sharing your deepest, darkest vulnerabilities with them, that is, I think, actually a form of strength.
[00:48:15.400 --> 00:48:19.000] But we are talking about a group of people who otherwise can't find this among humans.
[00:48:19.560 --> 00:48:20.440] I don't think that's true.
[00:48:20.440 --> 00:48:21.880] I don't think that's a fair assumption.
[00:48:21.880 --> 00:48:22.200] You don't?
[00:48:22.200 --> 00:48:27.400] No, you think they're going after people who are capable of socializing?
[00:48:27.720 --> 00:48:29.560] I don't think anybody's going after anybody.
[00:48:29.560 --> 00:48:31.000] I think these are apps available in the app.
[00:48:31.080 --> 00:48:36.360] Yeah, I think there are people who are more or less vulnerable to this sort of thing, but you don't have to be vulnerable.
[00:48:36.360 --> 00:48:38.200] I think this is just a human condition.
[00:48:38.200 --> 00:48:38.440] Exactly.
[00:48:38.520 --> 00:48:42.600] You know, just like anybody can get addicted to a video game, for example.
[00:48:42.600 --> 00:48:43.640] Right, but who was the first?
[00:48:43.640 --> 00:48:44.600] What was the first question we asked?
[00:48:44.600 --> 00:48:45.880] Who's the end user here?
[00:48:45.880 --> 00:48:47.080] I think it's anybody and everybody.
[00:48:47.560 --> 00:48:47.720] Yeah.
[00:48:48.120 --> 00:48:57.560] And so I guess when I use the word vulnerable, and this is me putting my psychologist hat on here, vulnerability is a form of strength.
[00:48:57.560 --> 00:49:05.000] To be ultimately vulnerable in a trusting relationship is to be very, very brave.
[00:49:05.320 --> 00:49:16.320] And when people are brave in that way, when they open themselves up and they really put themselves out there and they are vulnerable, the bravery comes in the ability to be hurt.
[00:49:14.760 --> 00:49:20.720] And being rejected when you are vulnerable is psychic pain.
[00:49:21.040 --> 00:49:25.120] And I have seen people become suicidal over that kind of pain.
[00:49:25.120 --> 00:49:30.960] I have seen people have incredibly intense psychological reactions to that kind of pain.
[00:49:30.960 --> 00:49:33.520] People who otherwise did not have mental illness.
[00:49:33.520 --> 00:49:43.840] So I think it's, it's, I'm only saying this, Evan, because I think it's unfair to assume that there's something fundamentally different about the types of people or the individuals using this.
[00:49:43.840 --> 00:49:46.720] I think anybody could find themselves in that position.
[00:49:46.720 --> 00:49:50.960] Yeah, the instinct of, well, this couldn't happen to me, I think is naive.
[00:49:50.960 --> 00:49:56.880] Yeah, because if a person, if we've all been through it with people, and that's an assumption.
[00:49:56.880 --> 00:50:06.160] Not everybody listening to this podcast has had their heart broken, but many people have had their hearts broken and they felt crazy in those moments.
[00:50:06.160 --> 00:50:06.720] Oh my God.
[00:50:06.720 --> 00:50:06.880] Yeah.
[00:50:06.880 --> 00:50:08.960] You are not yourself.
[00:50:08.960 --> 00:50:11.520] Yeah, and there's no reason to say that wouldn't happen with a chat bot.
[00:50:11.520 --> 00:50:16.320] Let's end what for me is the bottom line, psychologically, neurologically.
[00:50:16.320 --> 00:50:23.520] Our brains function in a way that we do not distinguish between things that act alive and things that are alive.
[00:50:23.520 --> 00:50:30.000] If something acts alive, we treat it as an agent, as a living thing, emotionally, mentally.
[00:50:30.000 --> 00:50:32.720] That with all that comes along with that.
[00:50:32.720 --> 00:50:33.520] 100%.
[00:50:33.520 --> 00:50:38.640] And then you take, and then that agent gives you something you are craving, you're in.
[00:50:38.640 --> 00:50:39.440] You are in.
[00:50:39.440 --> 00:50:43.520] Eczema isn't always obvious, but it's real.
[00:50:43.520 --> 00:50:46.480] And so is the relief from EBGLIS.
[00:50:46.480 --> 00:50:53.280] After an initial dosing phase, about 4 in 10 people taking EBGLIS achieve itch relief and clear or almost clear skin at 16 weeks.
[00:50:53.280 --> 00:50:57.520] And most of those people maintain skin that's still more clear at one year with monthly dosing.
[00:50:57.520 --> 00:51:09.000] EBGLIS, Liberty Kissy Map, LBKZ, a 250 milligram per 2 milliliter injection, is a prescription medicine used to treat adults and children 12 years of age and older who weigh at least 88 pounds or 40 kilograms with moderate to severe eczema.
[00:51:09.000 --> 00:51:15.320] Also called atopic dermatitis that is not well controlled with prescription therapies used on the skin or topicals or who cannot use topical therapies.
[00:51:15.320 --> 00:51:18.280] EBGLIS can be used with or without topical corticosteroids.
[00:51:18.280 --> 00:51:19.960] Don't use if you're allergic to EBGLIS.
[00:51:19.960 --> 00:51:22.040] Allergic reactions can occur that can be severe.
[00:51:22.040 --> 00:51:22.920] Eye problems can occur.
[00:51:23.080 --> 00:51:25.400] Tell your doctor if you have new or worsening eye problems.
[00:51:25.400 --> 00:51:27.960] You should not receive a live vaccine when treated with EBGLIS.
[00:51:27.960 --> 00:51:31.080] Before starting EBGLIS, tell your doctor if you have a parasitic infection.
[00:51:31.080 --> 00:51:32.120] Searching for real relief?
[00:51:32.120 --> 00:51:39.400] Ask your doctor about EBGLIS and visit ebglis.lily.com or call 1-800-LILLIERX or 1-800-545-5979.
[00:51:39.720 --> 00:51:44.120] You probably think it's too soon to join AARP, right?
[00:51:44.120 --> 00:51:46.360] Well, let's take a minute to talk about it.
[00:51:46.360 --> 00:51:49.080] Where do you see yourself in 15 years?
[00:51:49.080 --> 00:51:53.560] More specifically, your career, your health, your social life.
[00:51:53.560 --> 00:51:56.200] What are you doing now to help you get there?
[00:51:56.200 --> 00:52:01.640] There are tons of ways for you to start preparing today for your future with AARP.
[00:52:01.640 --> 00:52:03.640] That dream job you've dreamt about?
[00:52:03.640 --> 00:52:07.800] Sign up for AARP reskilling courses to help make it a reality.
[00:52:07.800 --> 00:52:12.600] How about that active lifestyle you've only spoken about from the couch?
[00:52:12.600 --> 00:52:17.640] AARP has health tips and wellness tools to keep you moving for years to come.
[00:52:17.640 --> 00:52:21.560] But none of these experiences are without making friends along the way.
[00:52:21.560 --> 00:52:25.560] Connect with your community through AARP volunteer events.
[00:52:25.560 --> 00:52:29.960] So it's safe to say it's never too soon to join AARP.
[00:52:29.960 --> 00:52:34.120] They're here to help your money, health, and happiness live as long as you do.
[00:52:34.120 --> 00:52:38.360] That's why the younger you are, the more you need AARP.
[00:52:38.360 --> 00:52:42.600] Learn more at AARP.org/slash wise friend.
[00:52:42.600 --> 00:52:45.000] This is where projects come to life.
[00:52:45.040 --> 00:52:55.200] Our showrooms are designed to inspire with the latest products from top brands, curated in an inviting, hands-on environment, and a team of industry experts to support your project.
[00:52:55.200 --> 00:53:00.720] We'll be there to make sure everything goes as planned: from product selection to delivery coordination.
[00:53:00.720 --> 00:53:05.920] At Ferguson Bath Kitchen and Lighting Gallery, your project is our priority.
[00:53:05.920 --> 00:53:15.600] Discover great brands like Kohler at your local Ferguson showroom at the Institute for Advanced Reconstruction.
[00:53:15.600 --> 00:53:17.600] We're redefining what's possible.
[00:53:17.600 --> 00:53:24.720] From complex nerve injuries to transformative procedures, we help patients restore movement, strength, and confidence.
[00:53:24.720 --> 00:53:27.920] Learn more at advanced reconstruction.com.
[00:53:27.920 --> 00:53:33.520] All right, these last two news items I call AI scams and solar clams.
[00:53:35.360 --> 00:53:36.080] Wow.
[00:53:36.400 --> 00:53:38.240] Evan, tell us about those AI scams.
[00:53:38.400 --> 00:53:46.320] AI scams, AI-driven scam ads, deep fake tech used to peddle bogus health products.
[00:53:46.320 --> 00:53:49.840] That was the headline, and that is what caught my eye.
[00:53:49.840 --> 00:53:53.360] This was at a place called Hackread.com.
[00:53:53.360 --> 00:53:57.120] Had not heard of it before, but still I stumbled upon it.
[00:53:57.120 --> 00:53:59.600] And the author's name is Habiba Rashid.
[00:53:59.760 --> 00:54:10.080] She writes that scammers are leveraging deep fake technology to create convincing health and celebrity-endorsed ads on social media, targeting millions of people.
[00:54:10.080 --> 00:54:14.880] Here's how to spot and avoid these deceitful scams.
[00:54:14.880 --> 00:54:16.880] Okay, that's good advice.
[00:54:17.200 --> 00:54:18.400] I'm intrigued.
[00:54:18.560 --> 00:54:20.160] Social media, she continues.
[00:54:20.160 --> 00:54:23.600] Social media has always been a hotspot for scam advertisements.
[00:54:23.600 --> 00:54:24.480] Yes.
[00:54:24.480 --> 00:54:37.320] Still, recently, cyber criminals have been creating especially deceitful ads using deep fake technology and the allure of celebrity endorsements to exploit unsuspecting individuals.
[00:54:37.320 --> 00:54:48.040] And a recent investigation by BitFinder Labs highlights a surge in health-related scam ads on major social media platforms like Facebook, Messenger, and Instagram.
[00:54:48.040 --> 00:54:53.240] Okay, so she links to the investigation material in this article.
[00:54:53.240 --> 00:54:55.320] And I, of course, clicked right on there.
[00:54:55.320 --> 00:54:56.840] I head over there.
[00:54:56.840 --> 00:54:58.920] And I found it to be both what?
[00:54:58.920 --> 00:54:59.960] How do I describe it?
[00:54:59.960 --> 00:55:04.600] Informative and a little bit strange, which I will get to.
[00:55:04.600 --> 00:55:07.640] Yeah, the link goes to BitDefender Labs.
[00:55:07.640 --> 00:55:09.800] BitDefender is a product.
[00:55:09.800 --> 00:55:11.320] You may have heard of it.
[00:55:11.720 --> 00:55:15.240] They consider themselves a global leader in cybersecurity.
[00:55:15.400 --> 00:55:19.560] I think they've been around since 2001, so they have a pretty good footprint.
[00:55:19.560 --> 00:55:30.200] Bit Defender provides cybersecurity solutions with leading security efficacy, performance, and ease of use to small and medium businesses, mid-market enterprises, and consumers.
[00:55:30.200 --> 00:55:44.120] Okay, well, despite the fact that this is a product, basically, that they've linked to, their website does have a lot of information on it, and they published an article on their website, and they have a section called Scam Research.
[00:55:44.120 --> 00:55:47.160] So that was the section in which this article appeared.
[00:55:47.160 --> 00:55:55.560] And it says, a deep dive on supplement scams, how AI drives miracle cures and sponsored health-related scams on social media.
[00:55:55.560 --> 00:55:58.760] So this is the source material for that original article.
[00:55:59.000 --> 00:56:04.760] There are four authors here, all with names that I would definitely be mispronouncing, I'm certain.
[00:56:05.080 --> 00:56:06.600] But they are Romanian.
[00:56:06.600 --> 00:56:07.960] I looked up a couple of the names.
[00:56:07.960 --> 00:56:11.000] They appear to all be Romanian, four Romanian authors here.
[00:56:11.000 --> 00:56:16.000] And I think we'll link to this so you can go ahead and give the article a read for yourself.
[00:56:16.320 --> 00:56:24.080] I think this was translated from Romanian to English, and when you go and you read it, you're going to, you know, it just feels a little off in a way.
[00:56:24.080 --> 00:56:24.800] You know, I don't know.
[00:56:24.800 --> 00:56:27.040] Tell me if you feel the same about that when you read it.
[00:56:27.040 --> 00:56:28.240] It felt a little odd to me.
[00:56:28.240 --> 00:56:30.240] But in any case, here's their deep dive.
[00:56:30.240 --> 00:56:35.120] They start by talking about how sponsored social media content is on the rise.
[00:56:35.120 --> 00:56:36.640] Okay, that's no surprise.
[00:56:36.640 --> 00:56:41.920] But hand in hand has been the rise of scams in the form of phony ads on social media.
[00:56:41.920 --> 00:56:53.520] And by phony, I mean that the faces and the voices that often accompany the product being sold are either outright AI fabrications or they're AI aversions of people who really exist.
[00:56:53.520 --> 00:56:56.960] And they're basically deep faking those consumers.
[00:56:57.840 --> 00:56:58.960] Here's what the article says.
[00:56:58.960 --> 00:57:09.760] Researchers at BitDefender Labs collected and analyzed health-related scams across the globe over a three-month period from March through May of 2024, so very recently.
[00:57:09.760 --> 00:57:12.320] And here were their key findings.
[00:57:12.320 --> 00:57:26.320] Number one, a market increase of health-related fraudulent ads leveraging AI-generated images, videos, and audio promoting various supplements, especially on Meta's social platforms, Facebook, Messenger, and Instagram.
[00:57:26.320 --> 00:57:26.720] Okay.
[00:57:27.680 --> 00:57:39.120] Number two, the highest number of followers, a compromise/slash fake page that promoted false advertisements was over 350,000 followers.
[00:57:39.120 --> 00:57:40.880] That's not insignificant.
[00:57:40.880 --> 00:57:46.160] Scammers used over a thousand different deep fake videos across all communities.
[00:57:46.160 --> 00:57:51.840] They discovered that there were over 40 medical supplement ads that were promoted among these.
[00:57:51.840 --> 00:58:15.240] Most of the ads catered to targeted geographical regions with tailored content using the names of celebrities, politicians, TV presenters, doctors, and other healthcare professionals in order to bait those consumers, including people like, well, Brad Pitt or Cristiano Ronaldo, soccer player, football player, George Clooney, we certainly know who that is.
[00:58:15.560 --> 00:58:15.960] Dr.
[00:58:15.960 --> 00:58:18.680] Ben Carson, I think most of us know who that is.
[00:58:18.680 --> 00:58:20.200] Bill Maher, sorry.
[00:58:20.440 --> 00:58:23.320] Denzel Washington, someone named Dr.
[00:58:23.320 --> 00:58:33.880] Heinz Luscher, and a bunch of other doctors that are apparently either in Romania or somewhere in Eastern Europe, they have some sort of celebrity footprint to them.
[00:58:33.880 --> 00:58:34.360] Okay.
[00:58:34.680 --> 00:58:40.680] The campaigns targeted millions of recipients across the globe, including Europe, North America, the Middle East, and Australia.
[00:58:40.680 --> 00:58:42.440] So basically, you know, practically everywhere.
[00:58:42.440 --> 00:58:43.800] Oh, Asia as well.
[00:58:44.040 --> 00:58:49.960] These are highly convincing messages that are grammatically correct and in the context with the ads.
[00:58:49.960 --> 00:58:53.400] In other words, not so easy to spot, right?
[00:58:53.400 --> 00:59:06.840] I mean, we've been able to look at some things that have been faked, and we can pull out some irregularities about them that would denote them as fake, but they said, no, for the most part, things here are pretty good.
[00:59:06.840 --> 00:59:10.280] They said most of the videos show clear signs of tampering, though.
[00:59:10.600 --> 00:59:18.600] If you're an expert and you know what to look for, but they also found instances that were very difficult to put into the deep fake category.
[00:59:18.600 --> 00:59:22.680] So it's becoming more and more sophisticated, is basically what they're saying.
[00:59:22.680 --> 00:59:32.120] The scammers exploit individuals who are desperate in finding a solution or treatment that will help them ease their symptoms or even cure chronic underlying diseases, they say.
[00:59:32.120 --> 00:59:36.200] And they said some of the most observed scenarios are depicted in these examples.
[00:59:36.200 --> 00:59:40.360] Number one, advertisements are described as alternatives to conventional medicine.
[00:59:40.360 --> 00:59:42.120] Where have we heard about that before?
[00:59:42.120 --> 00:59:51.360] The decline in trust in conventional medicine, aggravated by many scandals within the pharmaceutical industry, is used to prompt consumers into seeking alternative solutions.
[00:59:51.360 --> 00:59:55.920] That we definitely know from being on that side of this fight.
[00:59:55.920 --> 00:59:57.600] So we're no strangers to that.
[00:59:57.600 --> 01:00:00.240] Number two, the scientists say so.
[01:00:00.240 --> 01:00:09.360] Yeah, to boost credibility, scammers use deepfake technology to create videos of these individuals who are giving scientific explanations for the effectiveness of the product.
[01:00:09.360 --> 01:00:12.320] And apparently they come off as very convincing.
[01:00:12.320 --> 01:00:32.240] And right, why would if you did have, say, a doctor who has some sort of either notoriety, celebrity, whatever, and you're able to use the AI to make that image say whatever it is you want it to say, that definitely is going to have an impact on how people see the particular product.
[01:00:32.240 --> 01:00:35.680] Here's where I thought it started to get a little bit interesting and a little bit weird.
[01:00:35.680 --> 01:00:42.400] They talked about the anatomy of a supplement scam campaign, and they basically used it as their example.
[01:00:42.560 --> 01:00:47.760] Starts with fraudsters crafting social media pages to spread misleading advertisements.
[01:00:47.760 --> 01:00:53.440] They spotted thousands of these pages that promote cures for common ailments or health problems.
[01:00:53.440 --> 01:00:53.840] Yeah.
[01:00:53.840 --> 01:01:03.520] And they say that selling the metaphorical snake oil, while not uncommon, gains a huge audience, perhaps even credibility in the context of these modern social media campaigns.
[01:01:03.520 --> 01:01:11.120] And while conducting the research, they observe thousands of social media pages and websites serving these supplement scams specifically.
[01:01:11.120 --> 01:01:18.640] And likely the number exceeds 10,000, perhaps tens of thousands of social media pages with these.
[01:01:18.640 --> 01:01:20.480] So it's basically growing.
[01:01:20.480 --> 01:01:26.480] It's interesting, though, because in their example of this, they point to a deep fake of someone named Dr.
[01:01:26.480 --> 01:01:38.840] Heinz Luscher, L-U with umlautz over it, L-U-S-C-H-E-R, who is apparently well known, not in America, but in parts of Europe, perhaps Romania and some other places.
[01:01:38.840 --> 01:01:46.600] And they've used a deep fake of him, okay, and you know, basically promoting whatever it is, a supplement of some kind.
[01:01:46.600 --> 01:01:59.960] But then I went and I actually looked up this doctor online, and he's, you know, basically integrative medicine and, you know, complementary medicine and does all the, you know, the other things that we talk about.
[01:01:59.960 --> 01:02:02.680] So that's legitimately who this guy is.
[01:02:02.680 --> 01:02:15.480] But they're talking about faking the fact that here's this same, a fake version of this person talking about something else that he normally doesn't talk about, whether it's a supplement or whatever.
[01:02:15.800 --> 01:02:25.080] So it's kind of a scam of another scam trying to trick people, covering up another scam, right?
[01:02:25.640 --> 01:02:35.960] That's where it kind of got a little weird for me that they use this particular person as their example of how one of these campaigns go.
[01:02:35.960 --> 01:02:41.080] Because if you look at the truth of this guy, what he's doing is kind of a scam anyways to begin with.
[01:02:41.080 --> 01:02:43.480] So don't fall for the scam of the scam.
[01:02:44.120 --> 01:02:45.160] It's weird.
[01:02:45.160 --> 01:02:48.120] Yeah, well, it's frightening look at what we're in store for.
[01:02:48.440 --> 01:03:05.720] Yeah, like it's very easy now for people, individuals, small corporations, whatever, to mass produce the fake reality, you know, fake endorsements, fake claims, fake scientific education, fake news articles, whatever they want, and it's just going to be overwhelming.
[01:03:05.720 --> 01:03:12.440] And, you know, the only real solution I see to this is very careful and rigorously enforced regulations.
[01:03:12.440 --> 01:03:15.000] There's really no bottom-up solution to this.
[01:03:15.360 --> 01:03:27.200] Yeah, you can't expect the consumer to have a deep level of sophistication and understanding the nuances of things, the AI deep fakes that are going on to be able to see.
[01:03:27.280 --> 01:03:34.880] Yeah, you can't expect everybody to be constantly filtering out massive amounts of fraud every day of their life.
[01:03:34.880 --> 01:03:35.840] It's not practical.
[01:03:35.840 --> 01:03:37.360] I mean, who wants to live like that?
[01:03:37.360 --> 01:03:38.880] No, not practical.
[01:03:38.880 --> 01:03:40.880] It's not just a minimal protection.
[01:03:42.480 --> 01:03:45.600] So kudos to them to kind of bring this to everyone's attention.
[01:03:45.600 --> 01:03:48.880] At the same time, I think they could have used some better examples.
[01:03:48.880 --> 01:03:49.360] All right.
[01:03:49.360 --> 01:03:50.400] Thanks, Evan.
[01:03:50.400 --> 01:03:51.040] Thanks.
[01:03:51.040 --> 01:03:53.600] Okay, so solar clams.
[01:03:53.600 --> 01:03:55.520] Solar clams.
[01:03:55.520 --> 01:03:55.920] Yeah.
[01:03:56.240 --> 01:03:59.440] It sounds like a sci-fi movie from the 50s.
[01:03:59.440 --> 01:04:01.360] So what's up with these guys, right?
[01:04:01.360 --> 01:04:03.120] So this is an interesting study.
[01:04:03.760 --> 01:04:06.320] You will file this one under the biomimicry, right?
[01:04:06.320 --> 01:04:20.080] Like we like when technology is inspired by the millions, hundreds of millions or whatever years of evolutionary tinkering that living things have done to perfect anatomical solutions to problems.
[01:04:20.080 --> 01:04:26.640] And then we piggyback on that evolutionary tinkering to get inspiration for our own technology.
[01:04:26.640 --> 01:04:26.960] All right.
[01:04:26.960 --> 01:04:30.000] So in this case, we're looking at clams.
[01:04:30.240 --> 01:04:36.160] These giant clams are photosymbiotic animals, right?
[01:04:36.160 --> 01:04:42.720] So they have a symbiotic relationship with photosynthetic algae.
[01:04:43.120 --> 01:04:47.120] The algae exist, these are single-celled algae creatures.
[01:04:47.120 --> 01:04:53.680] They exist in these vertical columns on the surface of the clams.
[01:04:53.680 --> 01:04:58.560] And they use light for photosynthesis to create food.
[01:04:58.560 --> 01:05:00.520] And some of that food gets eaten by the clams.
[01:05:01.720 --> 01:05:11.320] So, what the researchers were looking at, this is Allison Sweeney, who is an associate professor of physics and of ecology and evolutionary biology at Yale.
[01:05:11.480 --> 01:05:23.080] What she and her colleagues were looking at for is the anatomy of these photosynthetic structures on the clams and how that relates to their quantum efficiency.
[01:05:23.080 --> 01:05:27.080] Bob, have you ever heard that term quantum efficiency before?
[01:05:27.800 --> 01:05:31.480] I don't think I've heard that term, quantum efficiency?
[01:05:31.560 --> 01:05:33.240] Sounds chokering.
[01:05:33.480 --> 01:05:37.720] Not as complicated as it sounds.
[01:05:37.720 --> 01:05:45.000] Quantum efficiency is the measure of the effectiveness of an imaging device to convert incident photons into electrons.
[01:05:45.720 --> 01:05:47.640] But they really need quantum in that term.
[01:05:47.880 --> 01:06:05.160] So, like, for example, if you have a 100% quantum efficiency, a system that's exposed to 100 photons would produce 100 electrons, or in the case of photosynthetic creatures, 100 electrons would produce 100 reactions of photosynthesis.
[01:06:05.160 --> 01:06:09.000] So, you're using all of the photons, basically.
[01:06:09.320 --> 01:06:16.520] So, what they wanted to find out was what was the quantum efficiency of the photosynthetic algae in these clams.
[01:06:16.840 --> 01:06:21.000] And what they found was that they're quite high.
[01:06:21.000 --> 01:06:22.600] They had a quantum efficiency.
[01:06:22.600 --> 01:06:25.160] So, what they did was they modeled the quantum efficiency.
[01:06:25.160 --> 01:06:33.320] They just said, okay, we're going to make a model of just the anatomical structure of these clams and how the algae is organized in these vertical columns.
[01:06:33.320 --> 01:06:37.000] And they found that the quantum efficiency was 42%.
[01:06:37.320 --> 01:06:47.440] However, we know from direct measurements that these you know, these photosymbionts have a higher efficiency than that.
[01:06:44.760 --> 01:06:50.480] So, what they figured out, so there's something missing from the model.
[01:06:50.800 --> 01:06:58.400] So, then they included new information having to do with the dynamic movement of the clams, because the clams will open and close their mouth.
[01:06:58.400 --> 01:07:08.400] And when they do this, this stretches the vertical columns so they become sort of wider but shorter, and then they could stretch them back and make them narrower and longer.
[01:07:08.400 --> 01:07:14.240] And that the clams will do this in response to the amount of light falling upon them.
[01:07:14.240 --> 01:07:14.640] Right?
[01:07:14.640 --> 01:07:22.960] So, when you include this dynamic movement of the clams, the quantum efficiency jumps to 67%.
[01:07:22.960 --> 01:07:33.040] Now, to put things into context, just a tree in the same part of the world, like a tropical environment, would have a quantum efficiency of about 14%.
[01:07:33.040 --> 01:07:34.320] That's a lot less.
[01:07:34.320 --> 01:07:43.200] So, these clams are incredibly efficient in terms of just their three-dimensional structure in terms of their quantum efficiency.
[01:07:43.200 --> 01:07:48.800] And they're in fact the most efficient structures that we've ever seen.
[01:07:48.800 --> 01:08:03.520] But, interestingly, trees like arboreal forests in the northern hemisphere, in northern latitudes that are far away from the equator, they have similar levels, although not quite as much, but similar levels of quantum efficiency.
[01:08:03.520 --> 01:08:04.960] Again, make sense.
[01:08:05.280 --> 01:08:13.120] They use sort of the vertical structure in order to maximize their efficiency because they don't have as much light.
[01:08:13.120 --> 01:08:15.120] They've got to make the most of the light that they get.
[01:08:15.120 --> 01:08:30.120] There's another aspect to this as well, in terms of the anatomy, and that is that the top layer of cells over the columns that hold the algae scatters light.
[01:08:29.360 --> 01:08:33.080] It's a light scattering layer of cells.
[01:08:33.320 --> 01:08:40.200] And so that light scattering also, these are the eridocytes, is the name of the cells.
[01:08:40.200 --> 01:08:47.160] The eridocytes, they scatter the light, which maximizes the absorption of photons as well, right?
[01:08:47.160 --> 01:08:50.920] Because the light's bouncing around, it has multiple opportunities to be absorbed.
[01:08:50.920 --> 01:08:54.360] So these are the main takeaways from this.
[01:08:54.360 --> 01:09:09.560] You have a light scattering layer, you have vertical columns, and you have some kind of dynamic adaptation to the amount of light and the, I guess, the angle of the light, et cetera, to maximize the quantum efficiency.
[01:09:09.560 --> 01:09:14.680] And you can get up into the 60s, 67% in their model.
[01:09:14.680 --> 01:09:23.080] The obvious implications of this is that we want to use this knowledge in order to design more efficient solar panels, right?
[01:09:23.080 --> 01:09:24.360] Photovoltaics.
[01:09:25.000 --> 01:09:32.040] Some of these things are already being incorporated in design, like scattering light using multiple layers, using sort of vertical structures.
[01:09:32.040 --> 01:09:39.080] But obviously, this information could be very, very useful for attempts at improving those designs.
[01:09:39.080 --> 01:09:47.240] Right now, for context, again, like a commercially available silicon solar panel is about 22, 23% efficient, which is really good.
[01:09:47.480 --> 01:09:54.840] When we started following this tech 20 years ago, it was maybe 10, 12% efficient, so it's almost doubled since then.
[01:09:54.840 --> 01:10:02.320] Maybe there's the potential, I don't know if we can get all the way up to 67%, but even if we get up to like 40% or 45% from where we are now.
[01:10:02.320 --> 01:10:07.240] Imagine twice the electricity production from the same area.
[01:10:08.200 --> 01:10:09.080] That's huge.
[01:10:09.480 --> 01:10:15.440] Anything that makes solar panels more cost-effective is great, of course.
[01:10:15.760 --> 01:10:30.800] Now, you have also the organic solar panels, which are the best ones now are getting up to like 15% efficient, which is not quite as good as the silicon, but they are soft, flexible, durable, cheap.
[01:10:31.360 --> 01:10:38.000] So they're getting close to the point where they're like really commercially viable for a lot for more and more applications.
[01:10:38.000 --> 01:10:56.320] Now, if we could apply this kind of technology to some combination of either perovskite, silicon, organic, or whatever, some combination of these solar panels, if we get to the point where it's going to be so cheap and easy to add solar panels that they're just going to be everywhere, right?
[01:10:56.320 --> 01:10:59.360] It's going to be, why not put them on everything?
[01:10:59.360 --> 01:11:00.800] So that would be nice.
[01:11:00.800 --> 01:11:02.800] Yeah, that'd be nice if we get to that point.
[01:11:02.800 --> 01:11:17.680] So this is just one more study adding to the pile of these sort of basic signs and incremental advantages in this photovoltaic tech that is the reason why solar is getting so much better, so you know, so much cheaper.
[01:11:17.680 --> 01:11:28.160] And it's just good to see that the potential here for efficiencies like north of 40, 50% is just incredible.
[01:11:28.160 --> 01:11:28.800] Nice.
[01:11:28.800 --> 01:11:30.640] Looking forward to that day.
[01:11:30.640 --> 01:11:31.120] All right.
[01:11:31.120 --> 01:11:33.920] So there's no who's that noisy this week.
[01:11:33.920 --> 01:11:36.480] Jay, we'll just pick that up next week.
[01:11:36.480 --> 01:11:43.680] But I do have a TikTok from TikTok segment for this week to make up for it.
[01:11:44.000 --> 01:12:00.760] And so every Wednesday, usually starting around noon, we do some live streaming to various social media to TikTok, of course, to Facebook, to Twitter, to whatever else Ian streams to.
[01:12:00.760 --> 01:12:01.960] MySpace.
[01:11:59.920 --> 01:12:04.360] Yep, to MySpace, Refrigerator.
[01:12:07.640 --> 01:12:10.360] If you need to find us, just use Ask Jeeves.
[01:12:10.360 --> 01:12:17.320] All right, so one of the videos I covered this week is by a TikToker called Amoon Loops.
[01:12:17.320 --> 01:12:23.640] And she was telling the story of, this is like real food babe territory, just to warn you, right?
[01:12:23.640 --> 01:12:30.040] So she took her Autistic Child to a doctor to get some blood tests.
[01:12:30.040 --> 01:12:34.760] And they tested, you know, among the screening they did, they tested for heavy metals.
[01:12:34.760 --> 01:12:41.640] And she reports that the antimony level was in the 97th percentile.
[01:12:41.960 --> 01:12:45.800] But she had no idea where antimony could be coming from, right?
[01:12:46.120 --> 01:13:03.000] So she did an investigation, you know, and found that the power cord of the air fryer that she's been using to make her child's food every day for the last couple of years has antimony in it.
[01:13:03.000 --> 01:13:04.680] Yeah, how would it get in the food?
[01:13:04.920 --> 01:13:05.320] Right.
[01:13:05.720 --> 01:13:07.640] Well, that's the question, isn't it?
[01:13:07.640 --> 01:13:08.120] Right?
[01:13:08.120 --> 01:13:12.360] How can antimony get from the power cord into the food?
[01:13:12.360 --> 01:13:15.320] Well, she doesn't really do any scientific investigation.
[01:13:15.320 --> 01:13:17.480] She doesn't close the loop on this evidence.
[01:13:17.480 --> 01:13:19.080] So just that was it.
[01:13:19.080 --> 01:13:20.680] Made a massive leap.
[01:13:20.680 --> 01:13:22.680] It must be the air fryer.
[01:13:22.680 --> 01:13:24.280] So she threw out her air fryer.
[01:13:24.280 --> 01:13:28.040] She's telling everybody to throw out their air fryers because they're bad for you.
[01:13:28.040 --> 01:13:29.240] They're toxic.
[01:13:29.240 --> 01:13:33.000] All right, so let's back up a little bit and deconstruct this.
[01:13:33.000 --> 01:13:40.840] So, first of all, yes, antimony is a heavy metal, and you can't get heavy metal toxicity from it.
[01:13:40.840 --> 01:13:42.840] It's similar to arsenic.
[01:13:43.800 --> 01:13:48.320] There are safety limits that the FDA and the EPA sets for antimony.
[01:13:44.840 --> 01:13:52.240] So, one question I have is: first of all, I don't know what kind of doctors she took the trial to.
[01:13:52.320 --> 01:13:58.800] There are a lot of obviously fringe doctors out there, fringe labs, and why would they have tested them for antimony and all things?
[01:13:58.800 --> 01:14:00.400] So, that's curious.
[01:14:00.400 --> 01:14:09.520] Saying it was the 97th percentile doesn't really tell us much either, because what I want to know is the absolute number, and is it in or outside of the toxic range?
[01:14:09.520 --> 01:14:11.440] Like, is it in the safety range or not?
[01:14:11.440 --> 01:14:14.320] So, just saying 97th percentile doesn't tell us much.
[01:14:14.320 --> 01:14:20.320] Maybe 98% of people are in the safety range, you know, are within the safety limits, which is probably true.
[01:14:20.320 --> 01:14:26.560] So, you know, that again doesn't mean that it was necessarily that it was too high, even though it sounds high.
[01:14:26.560 --> 01:14:39.360] Also, if you do have a high antimony level, you're supposed to do a follow-up test with antimony-free test tubes because you can get an artificially high level from the testing equipment itself.
[01:14:39.360 --> 01:14:46.640] So, that first test is considered just a screen, and without the follow-up test, to verify it, you don't know if it's real or not.
[01:14:46.800 --> 01:14:49.440] No indication that that was done.
[01:14:49.760 --> 01:14:53.200] Now, what about the antimony in the air fryer?
[01:14:53.200 --> 01:14:57.120] So, antimony is a common alloy used in electronics.
[01:14:57.360 --> 01:15:04.560] As an alloy, it tends to strengthen the other metals, right, that it's combined with.
[01:15:04.560 --> 01:15:12.640] And the use of antimony is actually increasing because it's also been recently discovered that it could increase some of the properties, desirable properties of lithium-ion batteries.
[01:15:12.640 --> 01:15:19.280] So, if anything, our use of antimony in electronics and battery technology is going to be increasing.
[01:15:19.280 --> 01:15:28.240] There are, I found, over a thousand household electronics that have antimony in their electronics, in their power cord or whatever.
[01:15:28.240 --> 01:15:29.200] So, that's not a comment.
[01:15:29.200 --> 01:15:30.760] Why focus on the air fryer?
[01:15:30.840 --> 01:15:33.160] You know, again, makes no real sense.
[01:15:29.680 --> 01:15:44.120] Couldn't, again, the big thing, the big hole, she didn't in any way demonstrate that the antimony that her son was exposed to, if it's real, was coming from the air fryer.
[01:15:44.120 --> 01:15:48.040] And it's not really plausible that it would get from the power cord into the food.
[01:15:48.040 --> 01:15:50.440] I mean, I have air fryers, I use air fryers.
[01:15:50.440 --> 01:15:52.600] The food goes in a basket, right?
[01:15:52.600 --> 01:15:56.360] There's no antimony in the basket that you're putting the food into.
[01:15:56.360 --> 01:16:01.080] So there's really no plausible way that it should leach into the food.
[01:16:01.080 --> 01:16:09.480] You can't really argue that it's like being evaporated or anything because the melting point of antimony is like over a thousand degrees Fahrenheit.
[01:16:09.480 --> 01:16:12.520] And you'd have to heat it up even more to turn it into a gas.
[01:16:12.520 --> 01:16:16.200] So we're not getting anywhere near those temperatures.
[01:16:16.200 --> 01:16:19.640] So it's just not plausible, not a plausible source of antimony.
[01:16:19.640 --> 01:16:24.600] Again, if it's even real in this case, which wasn't proven.
[01:16:24.600 --> 01:16:28.600] And so there are more plausible routes of exposure.
[01:16:28.600 --> 01:16:31.960] Antimony is used in the preparation of PET plastic.
[01:16:31.960 --> 01:16:38.760] It's not in the plastic, but it could be a residue that's still left behind from the manufacturing process.
[01:16:38.760 --> 01:16:46.760] And water stored in like single-use PET plastic bottles could get a little bit of antimony that leaches into them.
[01:16:46.760 --> 01:16:50.840] And that's probably one of the most common exposures residentially.
[01:16:50.840 --> 01:16:58.040] Obviously, there's always the potential for exposure in the workplace if you're working in a company that uses antimony in its manufacturing process.
[01:16:58.040 --> 01:17:01.400] Although, apparently, from the research that I did, that's not a big problem.
[01:17:01.400 --> 01:17:06.840] It's just antimony is not something that people generally get exposed to, even industrially.
[01:17:06.840 --> 01:17:10.280] But residentially, it's not coming from your power cord.
[01:17:10.280 --> 01:17:17.280] If somehow you're getting exposed to antimony, you know, in your environment, that's not where I would be looking for the exposure.
[01:17:14.840 --> 01:17:20.000] It's probably from PET plastics.
[01:17:20.320 --> 01:17:25.120] That would be a much more plausible culprit there.
[01:17:25.440 --> 01:17:28.720] So, you know, this is the culture of TikTok.
[01:17:28.720 --> 01:18:01.120] Somebody who doesn't know what they're talking about, making huge leaps, huge assumptions, not doing anything even remotely scientific, not, you know, doing any serious investigation, just completely superficial, and then making massive leaps of logic and going right to the fear-mongering and then just telling their followers to throw out this perfectly safe appliance, which actually is good for you in that it cooks with less oil than other types of cooking.
[01:18:01.120 --> 01:18:03.680] Yeah, I mean, like, there's nothing magical about an air fryer.
[01:18:03.680 --> 01:18:04.720] It's just a small oven.
[01:18:04.720 --> 01:18:06.960] Yeah, they're tabletop or countertop ovens.
[01:18:07.040 --> 01:18:10.960] It's just the air fryers are efficient because the space is very small.
[01:18:10.960 --> 01:18:13.520] The heats are a lot less energy.
[01:18:14.720 --> 01:18:16.480] The food cooks a lot more quickly.
[01:18:16.480 --> 01:18:18.480] It's just an efficient design.
[01:18:18.480 --> 01:18:23.760] But what I'm finding actually is that the air fryers are the new microwaves.
[01:18:23.760 --> 01:18:30.800] And what I mean by that is that, like, since microwaves have been around, there's been all these conspiracy theories surrounding microwaves because people are afraid of it.
[01:18:30.800 --> 01:18:33.120] You know, it's like this, it's high technology.
[01:18:33.120 --> 01:18:39.840] So that people get anxious about that and they invent issues.
[01:18:39.840 --> 01:18:43.920] So there's been conspiracy theories about microwaves swirling around for decades.
[01:18:43.920 --> 01:18:47.760] And now we're seeing the same thing with air fryers just because they're new.
[01:18:47.760 --> 01:18:50.480] But again, it's like they're just small ovens.
[01:18:50.480 --> 01:18:52.800] There's nothing magical about them.
[01:18:52.800 --> 01:18:54.800] The air fryer is great for frozen food.
[01:18:54.800 --> 01:18:57.520] I mean, it's great for a lot of things, but it's really great for frozen food.
[01:18:57.520 --> 01:18:59.120] Reheating pizza.
[01:18:59.120 --> 01:19:00.200] Reheating anything.
[01:19:00.200 --> 01:19:02.200] Yeah, it's really good for reheating stuff, too.
[01:19:02.200 --> 01:19:03.480] I should get one.
[01:19:03.800 --> 01:19:05.800] Okay, we got one email.
[01:19:06.040 --> 01:19:09.240] This one comes from Daniel Kay from LA.
[01:19:09.240 --> 01:19:10.760] Another rhyme.
[01:19:10.760 --> 01:19:14.200] He writes: I'm a longtime listener and fan of the SGU.
[01:19:14.200 --> 01:19:17.880] I have been reading more about climate change scientists and came across Dr.
[01:19:17.880 --> 01:19:26.200] Judith Curry and her testimony on the subject that sounds straight out of the SGU critical thinking and following the data approach to skepticism.
[01:19:26.200 --> 01:19:29.080] What is your take and shouldn't this be open for discussion?
[01:19:29.080 --> 01:19:32.200] Then he gives a link to her testimony.
[01:19:32.520 --> 01:19:34.200] So, yeah, so Dr.
[01:19:34.200 --> 01:19:39.720] Judith Curry is a well-known climate change denier.
[01:19:39.720 --> 01:19:41.960] But she's also a climatologist.
[01:19:41.960 --> 01:19:42.440] Yes.
[01:19:42.440 --> 01:19:44.760] Yeah, which is climate scientists.
[01:19:45.240 --> 01:19:45.800] She's one of the clients.
[01:19:46.040 --> 01:19:47.320] It makes it more complicated.
[01:19:47.320 --> 01:19:47.720] Exactly.
[01:19:48.120 --> 01:19:49.000] The one person.
[01:19:49.800 --> 01:19:50.280] Right.
[01:19:50.280 --> 01:19:53.960] So she is clearly an outlier.
[01:19:54.440 --> 01:20:04.520] She has opinions about the science behind anthropogenic global warming that is out of the mainstream, right?
[01:20:04.520 --> 01:20:10.760] So she disagrees with the 98% or whatever of her colleagues who interpret the evidence differently.
[01:20:10.760 --> 01:20:17.480] And she is known as a contrarian, and she's had this contradictory opinion for decades.
[01:20:17.480 --> 01:20:20.040] I don't know how this really, really all happened.
[01:20:20.040 --> 01:20:26.920] I don't know if this is, you know, maybe she's just not a very good climate scientist or she is just a contrarian generally.
[01:20:26.920 --> 01:20:33.480] Or maybe early on she was not as convinced by the evidence or saw some problems with the evidence.
[01:20:33.480 --> 01:20:45.280] And then once she got into the position of being like the skeptic of climate change, she felt like she had to defend that position and couldn't, you know, get out of it.
[01:20:45.280 --> 01:20:46.800] And then like doubled, tripled down.
[01:20:44.760 --> 01:20:48.480] I don't know what the process was.
[01:20:48.800 --> 01:20:54.560] What I do know is that her opinions on climate change have not held up well over time.
[01:20:54.880 --> 01:21:02.000] But I can imagine that in a vacuum, you know, without somebody standing next to her fact-checking her, she's using all the right lingo.
[01:21:02.000 --> 01:21:06.160] She sounds like she knows what she's talking about, and she has all the right credentials.
[01:21:06.160 --> 01:21:08.080] And that can be really confusing.
[01:21:08.080 --> 01:21:14.640] Her big thing is that she says that we're more, the data is more uncertain than her colleagues are saying.
[01:21:14.800 --> 01:21:18.080] She actually kind of agrees that, yes, the Earth is warming.
[01:21:18.080 --> 01:21:23.840] Yes, it's due in part to human-generated greenhouse gases, including carbon dioxide.
[01:21:23.840 --> 01:21:27.760] Yes, this could lead to potentially catastrophic consequences.
[01:21:27.760 --> 01:21:39.440] But just that there's way more uncertainty than what the scientific community and the international panel, intergovernmental panel on climate change is saying.
[01:21:39.440 --> 01:21:44.160] That's kind of been the drum that she's been beating.
[01:21:44.480 --> 01:21:54.000] But the thing is, when you get down to it, when you look at her specific opinions, they're not that far off of sort of mainstream climate change denial.
[01:21:54.000 --> 01:21:56.880] So for example, let's go over some of the things that she said.
[01:21:57.120 --> 01:22:02.080] She said that global warming stopped in 1995.
[01:22:02.080 --> 01:22:10.000] And she said it again in about 1998 and 2002 and 2007 and 2010.
[01:22:10.240 --> 01:22:15.600] So there's fluctuations in the background temperature.
[01:22:15.600 --> 01:22:19.120] And this has been a ploy of, you know, again, climate change deniers for a very long time.
[01:22:19.120 --> 01:22:22.960] Every time the curve turns down, they say, oh, look, climate change has stopped.
[01:22:22.960 --> 01:22:24.800] It's reverting to the mean or whatever.
[01:22:24.800 --> 01:22:29.080] But of course, the back, the long-term trend has not changed.
[01:22:29.400 --> 01:22:30.840] We're still warming.
[01:22:30.840 --> 01:22:35.480] So she was wrong every time she said, you know, that global warming has stopped.
[01:22:35.480 --> 01:22:46.520] She also bought into the whole scientist, tried to, quote-unquote, hide the decline as sort of some kind of conspiracy to hide, I guess, the uncertainty, which has been completely debunked.
[01:22:46.520 --> 01:22:58.520] She's characterized the IPCC as alarmist, even though their predictions underestimated climate warming since they've been underestimating it.
[01:22:58.920 --> 01:23:01.480] But yet she's calling them alarmist.
[01:23:01.480 --> 01:23:11.800] She also argued at one point that there is no consensus, you know, despite the fact that 97%, now more, of climate experts agree in anthropogenic global warming.
[01:23:11.800 --> 01:23:13.960] So she's just, she's a contrarian.
[01:23:13.960 --> 01:23:15.880] She's on the outside of the mainstream.
[01:23:15.880 --> 01:23:18.840] You know, she doesn't represent the mainstream opinion.
[01:23:19.160 --> 01:23:24.360] So just because she's a climate scientist doesn't mean that she's correct, right?
[01:23:24.840 --> 01:23:28.440] And this is a good general lesson about the argument from authority.
[01:23:28.440 --> 01:23:37.960] You know, reliable authority lies in a strong, hard-earned consensus of multiple scientists and experts, not one person's opinion.
[01:23:37.960 --> 01:23:41.320] It never rests on one person because one person could be quirky.
[01:23:41.320 --> 01:23:42.840] They could be a contrarian.
[01:23:42.840 --> 01:23:44.760] They could just be wrong.
[01:23:44.760 --> 01:23:51.960] Now, having said all of that, I do think that the best way to respond to somebody like Dr.
[01:23:51.960 --> 01:23:57.320] Curry is to just focus on their claims and debunk them, right?
[01:23:57.320 --> 01:24:09.000] Or just analyze them, see if they have any merit, and defend whatever opinion that they're criticizing, you know, with logic and evidence.
[01:24:09.000 --> 01:24:11.160] I just, that's the best way to deal with it.
[01:24:11.160 --> 01:24:13.000] It's actually not a bad thing.
[01:24:13.000 --> 01:24:21.120] You know, I think every science should have the contrarians on the fringe who are saying, but wait a minute, how do we know this is really true?
[01:24:21.120 --> 01:24:25.920] And whatever, just to keep the whole process honest, I think that's fine.
[01:24:26.240 --> 01:24:28.320] It actually, I think, helps the process.
[01:24:28.320 --> 01:24:30.960] The problem here, though, is a couple of things.
[01:24:30.960 --> 01:24:38.000] One is that there is a campaign of denial that is funded by the fossil fuel industry and that has been taken up by a political party.
[01:24:38.000 --> 01:24:46.000] So we're not dealing with a good faith context here, a good faith community.
[01:24:46.000 --> 01:24:50.800] Whether or not she is deliberately part of that or not is almost irrelevant.
[01:24:50.800 --> 01:25:03.120] The problem is that even good faith, devil playing devil's advocate kind of science, then gets used by denialist, politically motivated, ideologically motivated campaigns.
[01:25:03.120 --> 01:25:14.080] The second thing is that there are massive, important policy decisions resting upon what the scientific community says about climate science.
[01:25:14.080 --> 01:25:16.320] And so this is always tricky.
[01:25:16.320 --> 01:25:40.480] You know, we're having scientific discussions in the literature among experts, and that's fine, but that then gets exploited and used by, again, people who are not acting in good faith, who are then trying to mine all of that for the purpose of political denial.
[01:25:40.800 --> 01:25:44.960] So that complicates the whole situation, right?
[01:25:44.960 --> 01:25:48.160] And whether intentional or not, Dr.
[01:25:48.160 --> 01:26:08.120] Curry has lent a tremendous amount of aid and comfort to the climate change denial community who are not acting in good faith and have really hampered the world's response to what is a very serious and time-sensitive situation.
[01:26:08.440 --> 01:26:09.080] Right?
[01:26:09.080 --> 01:26:17.800] So, again, it's complicated, but when you drill down on her claims, they just don't hold water.
[01:26:18.760 --> 01:26:26.360] They have been pretty much utterly trounced by her climate expert colleagues.
[01:26:26.680 --> 01:26:30.920] Eczema isn't always obvious, but it's real.
[01:26:30.920 --> 01:26:33.880] And so is the relief from EBGLIS.
[01:26:33.880 --> 01:26:40.600] After an initial dosing phase, about four in ten people taking EBGLIS achieved itch relief and clear or almost clear skin at 16 weeks.
[01:26:40.600 --> 01:26:44.920] And most of those people maintain skin that's still more clear at one year with monthly dosing.
[01:26:44.920 --> 01:26:56.360] EBGLIS Libricizumab LBKZ, a 250 milligram per 2 milliliter injection, is a prescription medicine used to treat adults and children 12 years of age and older who weigh at least 88 pounds or 40 kilograms with moderate to severe eczema.
[01:26:56.360 --> 01:27:02.600] Also called atopic dermatitis that is not well controlled with prescription therapies used on the skin or topicals or who cannot use topical therapies.
[01:27:02.680 --> 01:27:05.640] EBGLIS can be used with or without topical corticosteroids.
[01:27:05.640 --> 01:27:07.320] Don't use if you're allergic to EBGLIS.
[01:27:07.320 --> 01:27:09.320] Allergic reactions can occur that can be severe.
[01:27:09.320 --> 01:27:10.440] Eye problems can occur.
[01:27:10.440 --> 01:27:12.760] Tell your doctor if you have new or worsening eye problems.
[01:27:12.760 --> 01:27:15.240] You should not receive a live vaccine when treated with EBGLIS.
[01:27:15.240 --> 01:27:18.440] Before starting EBGLIS, tell your doctor if you have a parasitic infection.
[01:27:18.440 --> 01:27:19.480] Searching for real relief?
[01:27:19.480 --> 01:27:27.000] Ask your doctor about EBGLIS and visit ebglis.lily.com or call 1-800-LILLIERX or 1-800-545-5979.
[01:27:27.000 --> 01:27:31.480] You probably think it's too soon to join AARP, right?
[01:27:31.480 --> 01:27:33.720] Well, let's take a minute to talk about it.
[01:27:33.720 --> 01:27:36.440] Where do you see yourself in 15 years?
[01:27:36.440 --> 01:27:40.920] More specifically, your career, your health, your social life.
[01:27:40.920 --> 01:27:43.560] What are you doing now to help you get there?
[01:27:43.560 --> 01:27:48.640] There are tons of ways for you to start preparing today for your future with AARP.
[01:27:48.960 --> 01:27:51.040] That dream job you've dreamt about?
[01:27:51.040 --> 01:27:55.120] Sign up for AARP reskilling courses to help make it a reality.
[01:27:55.120 --> 01:27:59.920] How about that active lifestyle you've only spoken about from the couch?
[01:27:59.920 --> 01:28:04.960] AARP has health tips and wellness tools to keep you moving for years to come.
[01:28:04.960 --> 01:28:08.880] But none of these experiences are without making friends along the way.
[01:28:08.880 --> 01:28:12.880] Connect with your community through AARP volunteer events.
[01:28:12.880 --> 01:28:17.280] So it's safe to say it's never too soon to join AARP.
[01:28:17.280 --> 01:28:21.440] They're here to help your money, health, and happiness live as long as you do.
[01:28:21.440 --> 01:28:25.680] That's why the younger you are, the more you need AARP.
[01:28:25.680 --> 01:28:30.000] Learn more at AARP.org/slash wisefriend.
[01:28:30.000 --> 01:28:32.320] This is where projects come to life.
[01:28:32.320 --> 01:28:42.560] Our showrooms are designed to inspire with the latest products from top brands, curated in an inviting, hands-on environment and a team of industry experts to support your project.
[01:28:42.560 --> 01:28:48.000] We'll be there to make sure everything goes as planned: from product selection to delivery coordination.
[01:28:48.000 --> 01:28:53.200] At Ferguson Bath Kitchen and Lighting Gallery, your project is our priority.
[01:28:53.200 --> 01:28:58.240] Discover great brands like Kohler at your local Ferguson showroom.
[01:29:00.800 --> 01:29:11.600] At the Institute for Advanced Reconstruction, we treat challenging nerve and reconstructive conditions and push the boundaries of medicine to restore movement, function, and hope.
[01:29:11.600 --> 01:29:16.520] From intricate phrenic nerve repairs to life-changing brachial plexus reconstructions.
[01:29:16.520 --> 01:29:22.880] Our world-class specialists take on the most complex cases, providing expertise where it's needed most.
[01:29:22.880 --> 01:29:29.800] When your patient needs options beyond the ordinary, refer with confidence at advancedreconstruction.com.
[01:29:29.440 --> 01:29:33.800] All right, let's move on with science or fiction.
[01:29:36.360 --> 01:29:41.480] It's time for science or fiction.
[01:29:45.960 --> 01:29:50.600] Each week, I come up with three science news items or facts, two real and one fake.
[01:29:50.600 --> 01:29:55.160] Then I challenge my panel of skeptics to tell me which one is the fake.
[01:29:55.400 --> 01:29:57.880] We have three regular news items this week.
[01:29:57.880 --> 01:29:59.720] Not two because Jay's not here.
[01:29:59.720 --> 01:30:00.680] You didn't go over it.
[01:30:01.000 --> 01:30:02.680] There is sort of a theme here.
[01:30:02.680 --> 01:30:03.880] There's a week kind of theme.
[01:30:03.880 --> 01:30:07.400] They're regular news items, but there's a theme of good or bad.
[01:30:07.400 --> 01:30:10.840] Some of these are either really good or really bad.
[01:30:11.160 --> 01:30:12.440] Like most weeks.
[01:30:12.440 --> 01:30:14.680] Well, that means two of them will be good or two of them.
[01:30:14.920 --> 01:30:15.480] All right.
[01:30:15.480 --> 01:30:16.840] Here we go.
[01:30:16.840 --> 01:30:25.480] Item number one: a new study finds that risk of long COVID decreased over the course of the pandemic, and this was mostly due to vaccines.
[01:30:25.480 --> 01:30:26.280] Good.
[01:30:27.240 --> 01:30:28.200] Thanks, Kara.
[01:30:28.200 --> 01:30:38.760] Item number two: a recent analysis of primate genomes finds that shared viral inclusions reduce overall cancer risk by stabilizing the genome.
[01:30:39.240 --> 01:30:49.080] And item number three, researchers find that global sea ice has decreased its cooling effect on the climate by 14% since 1980.
[01:30:49.400 --> 01:30:52.840] Kara, you seem very eager, so why don't you go first?
[01:30:53.160 --> 01:31:00.840] Okay, so the risk of long COVID decreased over the course of the pandemic, and this was mostly due to vaccination.
[01:31:01.160 --> 01:31:04.760] I could see this happening one of two ways.
[01:31:04.760 --> 01:31:15.360] Definitely, if you decrease the risk of getting COVID, then you decrease the risk of then having long COVID symptoms after COVID infection.
[01:31:14.840 --> 01:31:17.280] This is not overall in the population.
[01:31:17.440 --> 01:31:23.840] What this is saying is that for people who got COVID, the risk of developing long COVID was reduced.
[01:31:24.160 --> 01:31:24.800] Right.
[01:31:24.800 --> 01:31:38.400] But even still, I'm wondering if that reasoning stands because for people who got COVID, the longer the pandemic went on and the more they were vaccinated or the more immunity they developed, the weaker their COVID infections were.
[01:31:38.400 --> 01:31:50.880] But then I wonder if there's like an almost equal and opposite way to look at this one, where like for some people, long COVID appears to be some sort of like autoimmune or like excessive immune response.
[01:31:50.880 --> 01:31:58.240] And if that's the case, yes, more COVID infection, bad, but also maybe accumulation of vaccine.
[01:31:58.240 --> 01:31:58.880] I don't really know.
[01:31:58.880 --> 01:32:01.920] But I think, I don't know, that one is feeling like science to me.
[01:32:01.920 --> 01:32:11.760] A recent analysis of primate genomes finds that shared viral inclusions, so reduce overall cancer risk by stabilizing the genome.
[01:32:11.760 --> 01:32:14.960] Shared viral inclusion, shared by whom?
[01:32:14.960 --> 01:32:16.240] Primates.
[01:32:16.480 --> 01:32:20.800] These are viral inclusions that are found in all primates within the primate clinic.
[01:32:20.960 --> 01:32:21.360] Oh, I see.
[01:32:21.360 --> 01:32:23.200] So among different, yeah, okay, gotcha.
[01:32:23.440 --> 01:32:24.880] Different species within this.
[01:32:24.880 --> 01:32:25.360] Okay.
[01:32:25.360 --> 01:32:30.720] So if there are viral inclusions there, that would reduce overall cancer risk by stabilizing the genome.
[01:32:31.200 --> 01:32:37.200] So these are these are viral genomic snippets that have incorporated themselves into our genes.
[01:32:37.440 --> 01:32:38.880] Across multiple species.
[01:32:38.880 --> 01:32:39.440] Yeah.
[01:32:39.440 --> 01:32:41.360] And so, but like, I don't know.
[01:32:41.360 --> 01:32:51.120] I mean, yes, cancer is like very largely genetic, but you can have a relatively stable genome and still have like messed up oncogenes and messed up tumor suppressors.
[01:32:51.120 --> 01:33:00.360] Researchers find that global sea ice has decreased its cooling effect on the climate by 14% since 1980.
[01:32:59.840 --> 01:33:00.920] We'll see.
[01:33:01.160 --> 01:33:05.080] Do we even have much of that left?
[01:33:05.080 --> 01:33:08.680] I think it's the cancer one that is the fiction.
[01:33:08.680 --> 01:33:10.840] I'm not exactly sure why.
[01:33:11.480 --> 01:33:12.600] All right, Bob?
[01:33:12.920 --> 01:33:20.600] This first one makes sense with the minimization of long COVID with the introduction of vaccinations.
[01:33:20.600 --> 01:33:31.000] Yeah, it just makes sense that the more people that have the attenuated COVID would be more likely to not even experience long COVID.
[01:33:31.000 --> 01:33:33.560] It just seems very reasonable.
[01:33:33.880 --> 01:33:35.480] Let me go to the third one.
[01:33:35.880 --> 01:33:39.000] Global sea ice has decreased its cooling effect.
[01:33:39.000 --> 01:33:43.880] So, yeah, I'm trying to figure out what the cooling effect would actually have been for sea ice.
[01:33:43.880 --> 01:33:53.800] And I don't think it's necessarily dramatic, but that's irrelevant because it's whatever cooling effect it has, it's decreased by 14%.
[01:33:53.800 --> 01:33:55.800] I wonder how they would have calculated that.
[01:33:55.800 --> 01:33:57.960] That seems somewhat reasonable.
[01:33:58.440 --> 01:34:04.760] More reasonable than the second one, having these viral inclusions stabilizing the genome for cancer.
[01:34:05.880 --> 01:34:17.080] Yeah, just that just seems, sure, it seems, it's not like ridiculously impossible, and that would be great if it were true, but it just seems less likely than the other ones, definitely.
[01:34:17.080 --> 01:34:18.760] So I'll say that's fiction as well.
[01:34:18.760 --> 01:34:19.800] And Evan.
[01:34:19.800 --> 01:34:22.360] Yeah, I don't want to be alone on this one.
[01:34:23.000 --> 01:34:33.560] Well, I mean, mostly because from the get-go, I was really thinking of the three, the one I understand the least is the one about the primate genomes, you know.
[01:34:33.560 --> 01:34:37.000] Whereas the other two, I kind of have at least some kind of sense for.
[01:34:37.000 --> 01:34:41.240] Obviously, I know what long COVID is, decreased over the course of the pandemic.
[01:34:41.240 --> 01:34:45.760] The only reason I think that one might be the fiction or could have been the fiction is because there have been these peaks.
[01:34:44.520 --> 01:34:49.040] What, during the course of the pandemic, there were peaks, right?
[01:34:44.680 --> 01:34:51.360] And then it went back down, but then it peaked again.
[01:34:51.360 --> 01:35:00.960] And I don't know if that had anything to do with how the numbers would have played out as far as determining the long COVID and if the vaccination effect, how the vaccination had an effect on that.
[01:35:00.960 --> 01:35:03.680] But you did say mostly due, not exclusively due.
[01:35:03.680 --> 01:35:05.440] So that's why I think that one's science.
[01:35:05.440 --> 01:35:15.840] And then, yeah, for the same reasons Bob said about the sea ice, 14%, but you know, how much overall are we really talking about and the whole grouping of things that go into that?
[01:35:16.720 --> 01:35:20.800] So therefore, that only leaves me with genomes as fiction.
[01:35:20.800 --> 01:35:22.880] All right, I guess I'll take these in order.
[01:35:22.880 --> 01:35:24.000] We'll start with number one.
[01:35:24.000 --> 01:35:31.360] A new study finds that the risk of long COVID decreased over the course of the pandemic, and this was mostly due to vaccination.
[01:35:31.360 --> 01:35:34.240] You guys all think this one is science.
[01:35:34.240 --> 01:35:47.440] Well, let me tell you first that the risk of long COVID has been decreasing over the course of the pandemic, and it's due to two things.
[01:35:47.440 --> 01:35:50.640] One is the change of the variants, right?
[01:35:50.640 --> 01:35:54.400] Pre-Delta to Delta to Omicron.
[01:35:54.720 --> 01:35:59.600] The later variants had less long COVID, but it's also due to vaccination.
[01:35:59.600 --> 01:36:03.280] So the question is, which one had more of an effect?
[01:36:03.440 --> 01:36:03.840] There you go.
[01:36:04.640 --> 01:36:05.200] Mostly.
[01:36:05.200 --> 01:36:05.520] Mostly.
[01:36:06.000 --> 01:36:07.600] Is science.
[01:36:07.600 --> 01:36:08.800] Yep, this is science.
[01:36:08.800 --> 01:36:20.880] Yep, the researchers found that vaccination was due to about 75% of the reduction in the risk of getting long COVID following a COVID infection.
[01:36:20.880 --> 01:36:24.720] So, yeah, it is one more great thing about the vaccines.
[01:36:24.720 --> 01:36:27.360] They reduce your risk of getting the COVID.
[01:36:27.360 --> 01:36:31.400] They reduce the severity of the COVID, and they reduce your risk of long COVID.
[01:36:29.840 --> 01:36:33.720] So, yep, that was mostly due to the vaccines.
[01:36:33.960 --> 01:36:47.880] The later variants were overall sort of less virulent, although they spread more easily, they didn't cause as bad as a disease, which is something that does typically happen during pandemics.
[01:36:47.880 --> 01:36:50.040] Okay, let's go on to number two.
[01:36:50.040 --> 01:36:56.920] A recent analysis of primate genomes finds that shared viral inclusions reduce overall cancer risk by stabilizing the genome.
[01:36:56.920 --> 01:37:02.520] You guys all think this one is the fiction, and this one is the fiction.
[01:37:02.920 --> 01:37:07.240] Yeah, maybe it's going first, it's so stressful.
[01:37:07.240 --> 01:37:07.800] It is, right?
[01:37:08.040 --> 01:37:19.160] Because it turns out that these viral inclusions, these shared viral inclusions, increase the overall risk of destabilizing the genome.
[01:37:19.480 --> 01:37:20.520] Right, that makes sense.
[01:37:20.520 --> 01:37:21.080] That makes more sense.
[01:37:22.120 --> 01:37:23.160] Let's get rid of them.
[01:37:23.160 --> 01:37:29.640] Yeah, so the author's right that they found that these viral inclusions cause transcriptional dysregulation.
[01:37:29.640 --> 01:37:38.040] And essentially, what that means is that it's more likely for there to be the kinds of mutations that do lead to cancer, right?
[01:37:38.040 --> 01:37:41.480] Mutations that cause the cells to reproduce out of control or whatever.
[01:37:41.480 --> 01:37:44.440] So, yep, it increases the risk of cancer.
[01:37:44.440 --> 01:37:58.120] Now, these viral inclusions are very interesting from a basic science evolutionary point of view because they are an independent, powerful molecular evidence for evolution, right?
[01:37:58.440 --> 01:38:14.600] Because essentially, when you know, you remember like some viruses have reverse transcriptase, they basically insert their DNA into cells, and sometimes it gets into the germline, and you have sort of this permanent addition of a bit of viral DNA in the genome.
[01:38:14.720 --> 01:38:18.720] When that happens, every descendant inherits it, right?
[01:38:18.720 --> 01:38:19.920] So there you go.
[01:38:19.920 --> 01:38:25.280] Any future speciation, whatever, it carries through throughout all the descendants that pick it up.
[01:38:25.280 --> 01:38:38.320] Of course, sometimes it could be lost, but basically, you know, you could see these patterns, these nestled hierarchies of viral inclusions that are following the evolutionary tree of relationships.
[01:38:38.320 --> 01:38:46.960] And therefore, it's really an awesomely powerful, independent evidence for the historical fact of evolution.
[01:38:46.960 --> 01:38:49.840] There's really just no way around it.
[01:38:49.840 --> 01:38:57.040] There is no non-evolutionary explanation for the pattern of inclusions that we see in nature.
[01:38:57.360 --> 01:39:01.200] Yeah, the other side of that coin, could they make you a superhero?
[01:39:02.080 --> 01:39:06.080] They could ask for a friend.
[01:39:06.080 --> 01:39:06.880] Asking for a superhero.
[01:39:06.960 --> 01:39:09.520] So you're not saying it's impossible.
[01:39:11.120 --> 01:39:12.560] All right, let's go on number three.
[01:39:12.560 --> 01:39:19.120] Researchers find that global sea ice has decreased its cooling effect on the climate by 14% since 1980.
[01:39:19.120 --> 01:39:21.200] This one, of course, is science.
[01:39:21.200 --> 01:39:28.080] Yeah, this is one of those bad positive feedback loops in climate change.
[01:39:28.320 --> 01:39:35.680] As sea ice melts, it reduces the amount of radiation that it reflects back into space, which has a cooling effect.
[01:39:35.680 --> 01:39:42.560] Therefore, the cooling effect is reduced, leads to more warming, more ice melting, less cooling, more warming, etc.
[01:39:42.800 --> 01:39:44.480] So that's bad.
[01:39:44.800 --> 01:39:49.600] Because the sea ice has a pretty high albedo, it reflects a lot of energy.
[01:39:49.600 --> 01:39:53.440] And the sea water, right, the oceans are very dark.
[01:39:53.440 --> 01:39:55.520] They have a very low albedo.
[01:39:55.520 --> 01:39:57.920] They reflect very little light.
[01:39:57.920 --> 01:40:03.800] So there is a dramatic difference between the ice and the non-ice-covered ocean.
[01:40:04.120 --> 01:40:10.680] So loss of sea ice can be a massive positive feedback effect on climate change.
[01:40:10.680 --> 01:40:27.000] One other interesting wrinkle to this was that the scientists found that this reduction in that 14% reduction in the cooling effect is greater than just the reduction in the surface area of sea ice, the average surface area of the sea ice over the course of the year.
[01:40:27.640 --> 01:40:31.640] The decrease in the cooling effect is greater than the decrease in the surface area.
[01:40:31.640 --> 01:40:32.840] So it's not linear.
[01:40:32.840 --> 01:40:36.840] Yeah, so it's not a strictly linear relationship, which is interesting.
[01:40:36.840 --> 01:40:37.400] We won.
[01:40:37.400 --> 01:40:38.040] Woohoo!
[01:40:38.040 --> 01:40:38.680] We win.
[01:40:38.680 --> 01:40:39.480] Yes, winner.
[01:40:39.480 --> 01:40:40.040] What do we win?
[01:40:40.040 --> 01:40:40.680] What do you get?
[01:40:40.680 --> 01:40:43.800] Well, you all get to hear Evan's quote.
[01:40:43.800 --> 01:40:45.720] Evan, let us have the quote.
[01:40:45.720 --> 01:40:47.000] Woohoo!
[01:40:47.640 --> 01:40:49.720] Which would have heard anyways, I suppose.
[01:40:49.720 --> 01:40:52.520] But let's not ruin the parade right now.
[01:40:52.520 --> 01:40:53.080] Yeah.
[01:40:53.720 --> 01:40:58.280] In effect, we're all guinea pigs for the dietary supplement industry.
[01:40:58.280 --> 01:41:03.480] The vast majority of these supplements don't deliver on their promises.
[01:41:03.480 --> 01:41:03.880] Dr.
[01:41:03.880 --> 01:41:14.040] Nick Tiller, who's the author of the book called The Skeptic's Guide to Sports Science: Confronting Myths of the Health and Fitness Industry.
[01:41:14.040 --> 01:41:14.680] Hey.
[01:41:15.000 --> 01:41:19.400] Yeah, if you're anyone else but Douglas Adams, you borrowed that from us.
[01:41:21.320 --> 01:41:22.200] And Dr.
[01:41:22.200 --> 01:41:30.920] Tiller will be happy to say hello to you when we are at SciCon in October as part of that conference, which is going to be cool.
[01:41:30.920 --> 01:41:35.640] And I'm featuring a bunch of quotes from people who are going to be at that conference specifically.
[01:41:35.640 --> 01:41:36.520] Oh, that's fun.
[01:41:36.520 --> 01:41:37.000] All right.
[01:41:37.000 --> 01:41:38.280] Thank you, Evan.
[01:41:38.280 --> 01:41:40.840] And thank you all for joining me this week.
[01:41:40.840 --> 01:41:41.240] Sure, Ben.
[01:41:41.320 --> 01:41:41.800] Thank you, Steve.
[01:41:41.800 --> 01:41:42.680] Thank you, Steve.
[01:41:42.680 --> 01:41:47.440] And until next week, this is your Skeptics Guide to the Universe.
[01:41:49.760 --> 01:41:56.400] Skeptics Guide to the Universe is produced by SGU Productions, dedicated to promoting science and critical thinking.
[01:41:56.400 --> 01:42:01.040] For more information, visit us at the skepticsguide.org.
[01:42:01.040 --> 01:42:04.960] Send your questions to info at the skepticsguide.org.
[01:42:04.960 --> 01:42:15.680] And if you would like to support the show and all the work that we do, go to patreon.com/slash skepticsguide and consider becoming a patron and becoming part of the SGU community.
[01:42:15.680 --> 01:42:19.120] Our listeners and supporters are what make SGU possible.